H17 on Mac?

User Avatar
Member
135 posts
Joined: March 2018
Offline
I don't think so. I changed it in Edit > Preferences > Miscellaneous and restarted the app as stated in the post above. Activity Monitor's GPU History window clearly shows the right GPU being used.

I have attached a screenshot that clearly shows the usage of different GPUs in three consecutive runs, with a restart of the app between runs.
Edited by filipw - Oct. 11, 2018 09:58:30

Attachments:
Screen Shot 2018-10-11 at 15.55.38.png (1.4 MB)

User Avatar
Staff
5156 posts
Joined: July 2005
Offline
You'll only see a difference when the mesh in question is denser. Right now the small number of points in the pig isn't enough to fill all the ALU cores in even the 560 (880 points, 1024 ALUs). That's why the Vega's increased ALU count isn't helping. More complex sims with more points should see the Vega pull ahead, as it can process more of them at once. The only advantages the Vega has now are potentially clock speed and architecture enhancements, which likely don't amount to anything noticeable.
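The arithmetic here can be sketched with a naive occupancy model (just the point-count-vs-ALU-count ratio; this is an illustration, not how the Vellum solver actually schedules work — the 4096-ALU figure for the Vega 64 is its published stream-processor count):

```python
# Naive GPU-occupancy sketch: if the sim has fewer work items (points)
# than the GPU has ALUs, the extra ALUs sit idle, so a bigger GPU
# cannot help until the mesh gets denser.
def occupancy(points, alus):
    """Fraction of ALUs that can be kept busy in one pass (naive model)."""
    return min(points / alus, 1.0)

print(occupancy(880, 1024))    # Radeon Pro 560: 0.859375 -- not even full
print(occupancy(880, 4096))    # Vega 64: 0.21484375 -- mostly idle
print(occupancy(11208, 4096))  # denser mesh: 1.0 -- both GPUs saturated
```

With 880 points the Vega 64 would have roughly four fifths of its ALUs idle, which matches why it shows no advantage on the small sim.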
User Avatar
Member
135 posts
Joined: March 2018
Offline
That sounds very reasonable, and I was thinking that might be the case, but even with a coarser or subdivided mesh there is no difference, except for everything being a lot slower across the board. I guess I will have to run the same test on my PC with the same Vega 64 to see if it behaves similarly (I don't have access to it right now). Have you tested with eGPUs on Mac internally? Does it work satisfactorily for you? I just run it as an accelerator, not connected to a screen, at the moment.
User Avatar
Member
135 posts
Joined: March 2018
Offline
Some datapoints:
Did a 20-step sim of the pig head using a Vellum “balloon” sim as above. Subdivided the mesh so that there were 11,208 points.
(Info just for reference; it's the relative performance figures I'm baffled by.)

Using OpenCL on 560 Pro: total 3:16
Using OpenCL on RX Vega 64: total 2:13
Using OpenCL on CPU (4-core Mac laptop): total 0:45
User Avatar
Member
135 posts
Joined: March 2018
Offline
A friend of mine tested on Win10 using a GTX 1060 and an 8-core Ryzen 1700:
GPU: ca. 50 s
CPU: ca. 24 s
I can upload the H17 Apprentice .hip file if anyone is interested.
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
twod
Apple does support us and they are aware of our specific software needs (GL, CL, etc). We're in contact with them on a fairly regular basis about various issues.

Honestly I think that's the best piece of info from this entire thread.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
filipw
A friend of mine tested on Win10 using a GTX 1060 and an 8-core Ryzen 1700:
GPU: ca. 50 s
CPU: ca. 24 s
I can upload the H17 Apprentice .hip file if anyone is interested.

Sure, go ahead. I can run it on my 6-core and the two 1080 Tis and see what results I get.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
User Avatar
Member
135 posts
Joined: March 2018
Offline
So here is the sample file.
Edited by filipw - Oct. 11, 2018 12:54:09

Attachments:
h17_pig_sim_simple.hipnc (562.0 KB)

User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
Ok, if I'm doing this correctly (keep in mind I literally just installed H17):

GPU (single 1080 Ti, since Houdini doesn't seem to recognize multiple GPUs) result for 20 frames: 1:10

CPU (i7 8700K OC'd to 4.8 GHz, 32 GB RAM) result for 20 frames: 0:22
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
User Avatar
Member
155 posts
Joined: Nov. 2015
Offline
Midphase
twod
Apple does support us and they are aware of our specific software needs (GL, CL, etc). We're in contact with them on a fairly regular basis about various issues.

Honestly I think that's the best piece of info from this entire thread.

And that's where we are... which is kind of sad if you think about it.
I know a lot of people, me included, who would immediately jump back to a Mac Pro if Apple just released a decent headless Mac that is upgradable, and changed their behaviour on one or two things — mainly NVIDIA hardware support.

Intel f***ed their releases up, so maybe AMD has something to put into a Mac Pro? Ok, this is a long shot, but a Threadripper Mac Pro would be so awesome.
User Avatar
Member
155 posts
Joined: Nov. 2015
Offline
twod
Apple does support us and they are aware of our specific software needs (GL, CL, etc). We're in contact with them on a fairly regular basis about various issues.

This is truly great to hear!

I have a question though. Going forward, with OpenGL and OpenCL being marked as deprecated, will that throttle performance on Macs? I mean, on Windows and Linux there is (I am guessing) full support and up-to-date drivers.
My concern: if they really do release a new Mac Pro and I jump back, will performance be the same on equal hardware?

Also, please let them know pro 3D artists kind of need NVIDIA support because... well... GPU rendering.
User Avatar
Member
135 posts
Joined: March 2018
Offline
Seems reasonable. Just tested on my PC/Win10, i7 5820 @ 3.3 GHz with a GTX 970:
CPU: 35 s
GPU: 1:53

Switched to the Vega 64 on the PC (called gfx900 in the settings):
GPU: 2:05

Ok, unexpected. Same low performance as on the Mac.

After updating to the latest drivers (18.5.1) and trying again:
GPU: 1:56

Then tried an i7 7700 with a GTX 1080 that I have in the same office space:
CPU: 28 s
GPU: 1:06

Maybe this is a lousy test, but for this particular case we can at least conclude that the simulation runs at least twice as fast on a low-end CPU as on a high-end GPU, and that Nvidia cards are in general a lot faster here than AMD cards.

I wish someone with actual skills and knowledge of Houdini could make a real benchmark so we could make informed hardware investments. It seems the sanest thing to do is to invest in a beefy CPU.
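For what it's worth, the ratios in the timings reported above can be checked directly (numbers copied from this post; "m:ss" strings converted to seconds first):

```python
# Timings reported in this thread, converted to seconds for comparison.
def to_seconds(t):
    """Convert an 'm:ss' string (or a plain-seconds string) to seconds."""
    if ":" in t:
        m, s = t.split(":")
        return int(m) * 60 + int(s)
    return int(t)

results = {
    "i7 5820 (CPU)": to_seconds("35"),
    "GTX 970":       to_seconds("1:53"),
    "Vega 64":       to_seconds("1:56"),   # after 18.5.1 driver update
    "i7 7700 (CPU)": to_seconds("28"),
    "GTX 1080":      to_seconds("1:06"),
}

# GPU-vs-CPU slowdown on each machine, per this test:
print(results["GTX 970"] / results["i7 5820 (CPU)"])   # ~3.2x slower
print(results["GTX 1080"] / results["i7 7700 (CPU)"])  # ~2.4x slower
```

Both ratios are above 2x, which is consistent with the "at least twice as fast on CPU" conclusion for this particular scene.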
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
That particular test uses a pressure solver, which according to the Vellum devs happens to run better on the CPU than on a GPU. Other solver types are not quite so biased towards the CPU.

From what I understand of GL/CL's deprecation on Mac, they are still supported but no new features are being added, meaning GL 4.1 and CL 1.2 are the max versions we will see. Performance will only suffer in the sense that if we add GL features for efficiency that require GL 4.2 or up, they won't be available on Macs despite the hardware being able to support them.
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
twod
Performance will only suffer in the sense that if we add GL features for efficiency that require GL 4.2 or up, they won't be available on Macs despite the hardware being able to support them.

I guess we'll cross that bridge when we come to it, but I wouldn't put it past Apple to silently reverse course if there is enough discontent from end users. I don't believe that for Apple it's ever been about the market so much as the prestige factor. Big studios work the same way: they'll put out whatever superhero blockbuster to make their bulk income, but they still want their art films at Oscar time for the prestige. Since Houdini is so prominently featured in the iMac Pro videos, I would surmise that Apple very much cares about being able to boast about one of the most sophisticated DCC apps in the world running on their hardware.

I think the real proof of their commitment to pro users will come with the announcement of the new Mac Pro models. It will speak very loudly about whether or not they still want to inhabit this part of the market.

I can't wait until they reveal some concrete info about it, but in the meantime their silence is deafening!
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
User Avatar
Member
31 posts
Joined:
Offline
Apple has clear goals with Metal: it wants to make macOS a platform for game development. It offers an API that works well on Apple's own hardware and little else. Apple won't be able to win with that API, among other things because DirectX and OpenGL have better compatibility across hardware.

OpenGL has a lot of life left. For example, Pixar's Hydra (USD) is an OpenGL implementation. I think it will have better acceptance and support in the professional sector than Metal, which is a market maneuver.
User Avatar
Member
4189 posts
Joined: June 2012
Offline
We have a problem where wireframes can't even have thickness on macOS, and the in-development LOPs/Project Solaris is meant to compete with Katana, which now uses a Hydra viewport:

Hydra specs:
○ Fastest with OpenGL 4.5 + bindless textures + bindless buffers + direct state access
○ Graceful fallback to OpenGL 4.0

Interestingly, Hydra could in the future have a Metal backend, as OSD already does.
User Avatar
Member
8 posts
Joined: March 2012
Offline
Midphase
twod
Apple does support us and they are aware of our specific software needs (GL, CL, etc). We're in contact with them on a fairly regular basis about various issues.

Honestly I think that's the best piece of info from this entire thread.

Maybe someone else will support OpenCL and OpenGL for Mac? For example, AMD! After all, NVIDIA produces drivers for Mac now, doesn't it?
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
If anyone is interested, Apple just announced an upcoming event on the 30th:

https://www.apple.com/apple-events/ [www.apple.com]

Could this be the long-awaited official announcement (not necessarily availability) of the new Mac Pros? It would be great if they could shed some light on what to expect for the pro market.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
User Avatar
Member
4189 posts
Joined: June 2012
Offline
Anyone using Houdini with discrete AMD cards on macOS? Last time I tried (a 7950, circa 2014/15), viewport selections were all messed up. Looking at something like the ‘SAPPHIRE Radeon PULSE RX 580 8GB GDDR5’, as it's on the approved Apple list for Mojave 10.14:

https://support.apple.com/en-us/HT208898 [support.apple.com]

As Nvidia and Apple are still not speaking to each other:
https://devtalk.nvidia.com/default/topic/1043070/announcements/faq-about-macos-10-14-mojave-nvidia-drivers/ [devtalk.nvidia.com]
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
My understanding is that AMD GPUs run incredibly well on OS X; if it weren't for Redshift, I would have long since sold my Nvidia cards in favor of AMD's.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]