Multiparams and menus, UI performance

Member
14 posts
Joined: Feb. 2017
Offline
I am experiencing a significant slowdown when interacting with my digital asset's UI.
I have a multiparam block (tabs) with a string field inside that gets populated with options (as a menu).
When I have just a few tabs (not that many, like 5 or 6), I start noticing the UI becoming slower and slower as I add more.
I am printing something out (just for debugging this) in my menu generation script, and I notice that with a single tab the menu gets generated six times (I see my printout in the Python Shell six times).

Is there a reason for this menu to have to generate multiple times? Is there anything obvious that I am missing?

Also, all these menus are the same under every tab. Is there a more efficient way to do this? Like generating the menu entries ‘somewhere else’ and referencing them from there?

Cheers
Member
28 posts
Joined: July 2015
Offline
The UI is sluggish overall in 16… something to do with Qt, I heard.
Member
14 posts
Joined: Feb. 2017
Offline
Just to clarify, I am seeing this on Houdini 15.5.x
Member
7732 posts
Joined: July 2005
Offline
Have you tried timing how long your menu script takes? You could always try caching the menu results in the node's userData() or something like that.
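For example, something along these lines gives a rough number (an untested sketch - it assumes the menu script is Python with kwargs["node"] available, and build_menu_items() stands in for whatever your script actually does):

    import time

    node = kwargs["node"]
    start = time.time()
    items = build_menu_items(node)   # stand-in for the real menu-building work
    print("menu script took %.4f s on %s" % (time.time() - start, node.path()))
    return items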
Member
14 posts
Joined: Feb. 2017
Offline
The scripts don't take too long; the problem is that they re-evaluate way too many times (producing the same menu output for every tab).
userData() sounds like a nice workaround for the problem, thanks for suggesting it. However, I am not sure when to use it, as in which event to hook the code to, since the menus change depending on the attributes of the geometry flowing into the node…
Member
7732 posts
Joined: July 2005
Offline
To cache things in userData(), you just make up some “key” that you want to use. The simplest thing is probably to store hou.Node.cookCount(). In your menu callback, check if the current node's cook count is the same as what's stored in userData(). If it's the same, use the cached results, otherwise recompute.

EDIT: Make sure you call cook() on the node before calling cookCount() to ensure the node is up to date.
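Something roughly like this (an untested sketch - it assumes a Python menu script with kwargs["node"] available, and build_menu_items() again stands in for the expensive attribute scan):

    import json

    node = kwargs["node"]
    node.cook()                            # make sure cookCount() is up to date
    key = str(node.cookCount())

    # Reuse the cached menu if the node hasn't cooked since it was built.
    if node.userData("menu_cache_key") == key and node.userData("menu_cache_items"):
        return json.loads(node.userData("menu_cache_items"))

    items = build_menu_items(node)         # the expensive part
    node.setUserData("menu_cache_key", key)
    node.setUserData("menu_cache_items", json.dumps(items))
    return items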
Edited by edward - March 1, 2017 09:24:55
Member
101 posts
Joined: Sept. 2015
Offline
I had a good back and forth with support on this for a couple of days. Long story short, H16 is a resource hog and won't perform well on video cards that aren't pretty hefty. Their system requirements will likely be updated. I have a compatible semi-pro card and 16 runs like crap if you attempt to use more than one screen. Support basically told me to use one monitor or go out and buy another video card. 15.5 runs beautifully for me even with other compositing apps running at the same time, so I intend to just stay there and only use H16 to produce content and export it when the need arises for H16 features.

Personally, I think it's just bad coding given the new architecture. They're not the only ones using Qt. Other companies have done this just fine and their apps perform well.

I sincerely hope SideFX changes their tune on this performance bit. I got a pretty solid brush-off from them in my support emails. It really added the first sour note to my experience with SideFX.
Member
333 posts
Joined: Oct. 2012
Offline
The H16 interface feels really slow and sluggish, I have to agree.
I am wondering why it is not a bigger topic in the forums or anywhere else.

It seems like performance issues take too long to get fixed, or get ignored entirely (hint: Unified Noise).


PS: For the viewport, you can disable all the new material stuff in the display options and it should perform better.
Member
65 posts
Joined: Sept. 2014
Offline
The last 2-3 builds (>16.0.530) seem to have way better menu performance, and so far I haven't noticed the bug I had with earlier versions, where leaving Houdini open for a couple of hours (or overnight) made it unusable in terms of GUI lag…

Tobias Steiner
… I have a compatible semi-pro card and 16 runs like crap if you attempt to use more than one screen. …
That doesn't really say much - Quadro 4000? M4000? Or perhaps a FirePro? Quite a big difference between those, for different reasons…

And how do you run Houdini dual-screen? I run the main app on monitor one and a panel, full screen, on monitor two - and so far I haven't noticed lag based on whether I use one or two monitors - but then again, I run dual GTX 1070s (though only one of them is used for drawing the desktop).
Edited by Farmfield - March 1, 2017 13:20:48

Attachments:
H16.DESKTOP.jpg (765.3 KB)

~ Messing about with pixels, vectors and voxels since 25 years [vimeo.com] ~
Member
4189 posts
Joined: June 2012
Offline
Yeah - we want to nail this performance stuff down, but make sure to test the latest version, as things like the viewport textures have been optimised post-initial release.

I've definitely noticed some UI issues on a GTX 980 / macOS, but nailing down the specific steps is the challenge.

Re: Unified Noise - what's the issue?
Member
333 posts
Joined: Oct. 2012
Offline
Unified Noise has been working really slowly for some time now. Test it yourself on a 100x100 grid. I was told it's a known issue. When I test Unified Noise in H16 it's still not usable for us in production.
Member
918 posts
Joined: March 2014
Offline
Doudini
Unified Noise has been working really slowly for some time now. Test it yourself on a 100x100 grid. I was told it's a known issue. When I test Unified Noise in H16 it's still not usable for us in production.

May I ask what the issue with Unified Noise is? Does it add 2-3 seconds per instance before a render starts?

Thanks.
Andy
Member
333 posts
Joined: Oct. 2012
Offline
Andy58
May I ask what the issue with Unified Noise is? Does it add 2-3 seconds per instance before a render starts?

Thanks.
Andy

I have to say I feel really stupid now. I did another test and Unified Noise is actually working fast now. I am not sure what was wrong with the other setup I tested.

I also installed the new daily build and it seems to have fixed a few things with the UI.
Member
65 posts
Joined: Sept. 2014
Offline
Unified Noise is amazingly slow in VOPs when tweaking and changing values, but once it's set up, if you animate it, it's fast as F in the viewport.

Edit: Running 16.0.534 and Unified Noise is still really slow when tweaking values…
Edited by Farmfield - March 1, 2017 16:26:12
~ Messing about with pixels, vectors and voxels since 25 years [vimeo.com] ~
Member
333 posts
Joined: Oct. 2012
Offline
Farmfield
Unified Noise is amazingly slow in VOPs when tweaking and changing values, but once it's set up, if you animate it, it's fast as F in the viewport.

Edit: Running 16.0.534 and Unified Noise is still really slow when tweaking values…

Well put. That seems to be exactly the issue. It's really only running fast when the parameters are at the SOP level, which would make sense if the issue comes from a $F. I'm not sure why that would happen, though, because the noise is not animated by default?
Member
4189 posts
Joined: June 2012
Offline
Ummm - you're all promoting the parms, I hope? Recompiling the VOP network is always slow, but promoted parms are fine.
Staff
6220 posts
Joined: July 2005
Offline
Unified Noise implicitly generates a *lot* of code that then gets culled during the optimization pass. Sadly, it isn't compile time that is slow, but optimization, so caching VEX code doesn't help as much as it should. You can see this even if you promote parameters: the first cook will be slow (since it has to optimize). Then it will be fast with changing geometry. But if you change a parameter, it will cook slowly (as it has to re-optimize for that parameter being changeable). After that slow cook, however, it will go back to being fast while you continue to change the parameter.

We have a Unified Noise Static VOP to work around this problem in 16.0. It makes the noise type & fractal type slow to animate, but in exchange it is significantly faster on the first run and when you are adjusting values in VOPs. The new Mountain SOP and HeightField Noise use this. The main downside is that if you are using stamp(), it will trigger a full recompile, because all parms are dirtied in that case.

We'd love to fix the root cause and just make the optimization pass fast, of course.
Member
101 posts
Joined: Sept. 2015
Offline
Farmfield
The last 2-3 builds (>16.0.530) seem to have way better menu performance, and so far I haven't noticed the bug I had with earlier versions, where leaving Houdini open for a couple of hours (or overnight) made it unusable in terms of GUI lag…

Tobias Steiner
… I have a compatible semi-pro card and 16 runs like crap if you attempt to use more than one screen. …
That doesn't really say much - Quadro 4000? M4000? Or perhaps a FirePro? Quite a big difference between those, for different reasons…

And how do you run Houdini dual-screen? I run the main app on monitor one and a panel, full screen, on monitor two - and so far I haven't noticed lag based on whether I use one or two monitors - but then again, I run dual GTX 1070s (though only one of them is used for drawing the desktop).


Yeah, your cards are much better… and much more expensive. NVIDIA 780 3GB here. Fact remains, it WAS advertised as compatible by SESI and even worked just fine with the 16.0.504.20 official release. H16 “broke” starting at 16.0.530 in terms of performance for me. Same thing in .531. Either way, it's clear that the UI design is a point of issue for many reasons, regardless of >>this guy's<< video card.

My money is still on the software dev side, to be honest. Others are doing UI work with Qt and not having these issues. Would I love to run out and drop a thousand or so on video cards? Well, hell yes, but not really in response to design work that is suspect.

If you have any suggestions on tweaks that might make this work well (as it clearly does in 16.0.504.20) then I'm all ears. :-D

I think Mr. Murphy explains it best below:

https://youtu.be/HktV2yGtLv8 [youtu.be]
Edited by Tobias Steiner - March 1, 2017 18:34:11
Member
65 posts
Joined: Sept. 2014
Offline
A GTX 780 should be well enough to drive Houdini. As for the GTX 1070s being expensive, well, I'm running Redshift, so I make that up in saved time - I could just as well have spent that cash on Gridmarkets; it's just a question of how you approach optimizing your workflow, really.

Interesting, though, that I had issues <530 and you have issues >530 - that does indicate it could actually be connected to the NVIDIA hardware and/or driver API, something something…
Edited by Farmfield - March 1, 2017 18:47:57
~ Messing about with pixels, vectors and voxels since 25 years [vimeo.com] ~
Member
4189 posts
Joined: June 2012
Offline
@Tobias did you try the Qt4 version too? BTW, Maya had/has many terrible Qt5 issues for many people.

@jlait - thanks! Hope it can be fixed.