Houdini using OpenGL 2.1 instead of 3.2 on OSX

Member
5 posts
Joined: July 2011
Hi.

Is it normal that Houdini is using OpenGL 2.1 instead of the newer 3.2 on OS X Lion?

Is something wrong with my system or is this just the way it works?

(Basic hardware config in screenshot)

Attachments:
houdini.jpg (313.9 KB)

Member
696 posts
Joined: March 2009
It seems like OpenGL 3.2 isn't supported by Houdini on Macs yet.
Not sure if it'll ever be for Houdini 11, but 12 is on its way!
Toronto - ON
My Houdini playground [renderfarm.tumblr.com]
“As technology advances, the rendering time remains constant.”
Staff
5161 posts
Joined: July 2005
Apple's new OpenGL 3.2 driver uses the core profile, which means that all deprecated OpenGL 1.x and 2.x features were removed. Because of the sheer amount of GL code in Houdini, it'll likely take a while before all these “old” features can be migrated to core GL 3.2. I hesitate to use the word “old”, as a lot of these deprecated features are still quite useful and still hardware-accelerated, and converting most of the existing code to GL 3.2 is not likely to yield any performance improvement. It's a bit of a tough sell when both Nvidia and AMD still support the compatibility profiles on the other platforms.
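
[Editor's note: a minimal sketch of the core-profile distinction described above, assuming the open-source GLFW 3 library purely for illustration; Houdini uses its own windowing code, and this is not from the thread. Without the core-profile hints, OS X hands back a legacy 2.1 context, which is what the original poster is seeing.]

// Sketch: request an OpenGL 3.2 core-profile context (GLFW assumed, hypothetical example).
#include <GLFW/glfw3.h>
#include <cstdio>

int main()
{
    if (!glfwInit())
        return 1;

    // Ask for 3.2 core. On OS X the forward-compatible flag is required;
    // omit these hints and the driver falls back to a legacy 2.1 context.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);

    GLFWwindow* win = glfwCreateWindow(640, 480, "GL 3.2 core test", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(win);
    std::printf("GL_VERSION: %s\n", glGetString(GL_VERSION));

    // In a core context, compatibility-only calls such as glBegin()/glEnd(),
    // glMatrixMode() and the fixed-function lighting state simply don't exist,
    // so code written against them must first be reworked around vertex
    // buffers and shaders.

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}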
Member
696 posts
Joined: March 2009
Yeah, as a Mac user, it bugs the hell out of me when Apple makes these unilateral decisions that create more problems than they solve…
Member
5 posts
Joined: July 2011
Good to know, thanks for your replies.