Realtime background image sequence playback

Member | 374 posts | Joined: Nov. 2010
Is there any way to load a background image sequence into memory so I can play a scene in realtime without having to flipbook it first?
Member | 6434 posts | Joined: Sept. 2011
I was under the impression that it already did that when playing back.
Member | 374 posts | Joined: Nov. 2010
jsmack
I was under the impression that it already did that when playing back.

Yes, in a way it seems to do that. It seems to cache the first 50 frames or so.
I don't see an option to assign more memory to image caching, and if it really is caching, the playback performance is abysmal. I can easily play 4K EXR sequences in other applications.

By the way, I am talking about a sequence loaded in the camera as a background.
Edited by OneBigTree - Jan. 25, 2020 11:18:34
Member | 4189 posts | Joined: June 2012
The background images are loaded into the OpenGL Texture Cache - the maximum amount can be increased in the Cache Manager.
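For a sense of scale, here is a purely illustrative sketch of how many 4K frames a fixed-size texture cache can hold. The bytes-per-channel figure is an assumption about how the viewport stores the texture (8-bit, half-float, or full float), not a confirmed Houdini internal:

```python
# Rough estimate of how many background frames fit in an OpenGL texture
# cache of a given size. Assumes uncompressed textures; the storage
# format per channel is an assumption, not a documented Houdini detail.

def frames_in_cache(width, height, channels, bytes_per_channel, cache_mb):
    """How many uncompressed frames of this size fit in a cache of cache_mb MiB."""
    frame_bytes = width * height * channels * bytes_per_channel
    return (cache_mb * 1024 * 1024) // frame_bytes

# A 4K UHD RGBA half-float frame is ~63 MiB on the GPU, so a 1638 MB
# cache holds only a couple dozen frames:
print(frames_in_cache(3840, 2160, 4, 2, 1638))  # -> 25
```

That would be consistent with playback stalling after a few dozen frames unless the cache maximum is raised well beyond the default.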
Member | 62 posts | Joined: May 2006
I don't see this working. Whenever I have a background image sequence (EXR or JPG), my viewport performance drops to less than 5 fps, no matter how often I let it run. It seems no buffering is happening. Viewport texture use says 5%, and Texture Cache Size says 1638 MB. Is there another setting/preference to set?
Member | 374 posts | Joined: Nov. 2010
I have to bump this up.
I still have trouble playing a 140-frame sequence as a camera background (loaded in the camera parameters) or as a viewport background set in the view options. Only a few frames play in real time before Houdini starts reading from disk again. Houdini's memory usage doesn't even increase while the timeline plays.

This is something I can easily do anywhere else. There must be a way to do it in Houdini, the most sophisticated DCC application today?
Edited by OneBigTree - Oct. 3, 2020 08:34:23
Member | 4189 posts | Joined: June 2012
You should submit a bug via the Support menu above, as it has always worked here on Linux.

SideFX support staff generally don't read these forums.
Edited by anon_user_37409885 - Oct. 4, 2020 19:50:29
Member | 374 posts | Joined: Nov. 2010
goat
You should submit a bug via the Support menu above, as it has always worked here on Linux.

SideFX support staff generally don't read these forums.

I will. I was hoping someone in the community might have an answer.
Member | 374 posts | Joined: Nov. 2010
I figured it out. It is so obvious - and so absurd - that I subconsciously ignored the possibility:

The view background sequence is loaded into the OGL cache, which is, yes, the VRAM on your graphics card.
So you have a choice: fill up your 6-12 GB of VRAM with a third of your sequence, which prevents you from rendering with Redshift while the better part of your 64 GB of RAM sits empty, or… switch back (again) to a software package where you can easily load your view background into RAM, where it belongs, while happily rendering with your GPU renderer.
An RFE has been logged now. I wonder if they'd have noticed it once Karma went GPU…
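The trade-off described above can be checked with back-of-envelope arithmetic. The per-frame size assumes each frame is kept on the GPU as an uncompressed RGBA half-float texture, which is an assumption, not a measured value:

```python
# Footprint of a 140-frame 4K UHD background sequence if every frame is
# resident on the GPU as an uncompressed RGBA half-float texture
# (assumed storage format, not confirmed).

frame_bytes = 3840 * 2160 * 4 * 2          # width * height * channels * bytes per channel
sequence_gb = 140 * frame_bytes / 1024**3  # whole sequence, in GiB
print(round(sequence_gb, 1))               # -> 8.7
```

Under those assumptions the full sequence needs roughly 8.7 GiB, so it cannot fit alongside a GPU renderer's working set on a 6-12 GB card, while the same data would occupy only a fraction of 64 GB of system RAM.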
Edited by OneBigTree - Nov. 1, 2020 19:37:28