Background Image sequence realtime
- OneBigTree
- Member
- 378 posts
- Joined: Nov. 2010
- jsmack
- Member
- 7733 posts
- Joined: Sept. 2011
- OneBigTree
jsmack
I was under the impression that it already did that when playing back.
Yes, in a way it seems to do it. It seems to cache the first 50 frames or so.
I don't see an option to assign more memory to image caching, and if it really is caching, the playback performance is abysmal. I can easily play 4K EXR sequences in other applications.
BTW, I am talking about a sequence loaded in the camera as a background.
Edited by OneBigTree - Jan. 25, 2020 11:18:34
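The "first 50 frames or so" figure above is roughly what you'd expect from back-of-envelope arithmetic, assuming the viewer decodes frames to uncompressed 32-bit float RGBA (an assumption on my part; half-float storage would halve these numbers):

```python
# Rough estimate of how many uncompressed 4K frames fit in a few
# gigabytes of cache. Assumes UHD 3840x2160 frames decoded to
# 32-bit float RGBA (an assumption; EXRs are often half-float).

width, height = 3840, 2160          # UHD "4K"
channels, bytes_per_channel = 4, 4  # RGBA, 32-bit float

frame_bytes = width * height * channels * bytes_per_channel
frame_mib = frame_bytes / 2**20

cache_gib = 6                       # hypothetical ~6 GiB cache budget
frames_in_cache = (cache_gib * 2**30) // frame_bytes

print(f"{frame_mib:.0f} MiB per frame")                  # ~127 MiB
print(f"{frames_in_cache} frames fit in {cache_gib} GiB")  # ~48 frames
```

Under those assumptions a ~6 GiB budget holds just under 50 frames, which lines up with the observed behavior.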
- anon_user_37409885
- Member
- 4189 posts
- Joined: June 2012
- druitre
- Member
- 63 posts
- Joined: May 2006
I don't see this working. Whenever I have a background image sequence (be it EXR or JPG), my viewport performance drops to less than 5 fps, no matter how often I let it run. It seems no buffering is happening. Viewport texture use says 5%, Texture Cache Size says 1638 MB. Is there another setting/preference to set for this?
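The reported 1638 MB cache would explain the lack of buffering: under the same 32-bit float RGBA assumption as above (an assumption, not confirmed by the thread), it holds only about a dozen 4K frames, so almost every frame of a longer sequence has to be re-read from disk:

```python
# Rough check of the numbers in the post: how many uncompressed
# 4K float RGBA frames fit in a 1638 MB texture cache?
# (32-bit float RGBA is an assumption; half-float would double this.)

frame_bytes = 3840 * 2160 * 4 * 4   # ~127 MiB per 4K RGBA float frame
cache_bytes = 1638 * 2**20          # the reported 1638 MB cache size

frames_cached = cache_bytes // frame_bytes
print(frames_cached)                # 12 frames
```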
- OneBigTree
I have to bump this up.
I still have trouble playing a sequence of 140 frames as a camera background (loaded in the camera parameters) or as a viewport background set in the view options. Only a few frames play in real time, then Houdini starts reading from disk again. Houdini's memory usage doesn't even increase while the timeline plays.
This is something I can easily do anywhere else. There must be a way to do it in Houdini, the most sophisticated DCC application today?
Edited by OneBigTree - Oct. 3, 2020 08:34:23
- OneBigTree
I figured it out. It is so obvious, and so absurd, that I subconsciously ignored the possibility:
The view background sequence is loaded into the OGL cache, which is, yes, the VRAM on your graphics card.
So you have a choice: fill up your 6-12 GB of VRAM with a third of your sequence, which prevents you from rendering with Redshift while the better part of your 64 GB of RAM sits empty, or switch back (again) to a software where you can easily load your view background into RAM, where it belongs, while happily rendering with your GPU renderer.
An RFE has been logged now. I wonder if they'd have noticed it once Karma went GPU…
Edited by OneBigTree - Nov. 1, 2020 19:37:28
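The "a third of your sequence" complaint checks out arithmetically. Keeping the same hedged assumption as earlier (140 frames of 4K decoded to 32-bit float RGBA; half-float would halve everything), the sequence overflows a 6-12 GB GPU but fits comfortably in 64 GB of system RAM:

```python
# Sizing the full 140-frame 4K sequence against VRAM and RAM
# budgets. Assumes 32-bit float RGBA frames (an assumption).

frame_gib = 3840 * 2160 * 4 * 4 / 2**30   # ~0.124 GiB per frame
sequence_gib = 140 * frame_gib            # ~17.3 GiB total

for budget_gib in (6, 12, 64):            # VRAM low/high, then system RAM
    fraction = min(1.0, budget_gib / sequence_gib)
    print(f"{budget_gib:>2} GiB holds {fraction:.0%} of the sequence")
```

A 6 GiB VRAM budget covers roughly a third of the sequence, exactly the ratio the post describes, while 64 GiB of RAM could hold it several times over.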