Background Image sequence realtime
OneBigTree
Is there any way to load a background image sequence into memory so I can play a scene in realtime without having to flipbook it first?
jsmack
I was under the impression that it already did that when playing back.
OneBigTree
jsmack
I was under the impression that it already did that when playing back.

Yes, it does seem to do that to a degree: it appears to cache the first 50 frames or so.
I don't see an option to assign more memory to image caching, and if it really is caching, the playback performance is abysmal. I can easily play 4K EXR sequences in other applications.

Btw., I am talking about a sequence loaded in the camera as a background.
anon_user_37409885
The background images are loaded into the OpenGL Texture Cache - the maximum amount can be increased in the Cache Manager.
druitre
I don't see this working. Whenever I have a background image sequence (be it EXR or JPG), my viewport performance drops to less than 5 fps, no matter how many times I let it play through. No buffering seems to be happening. Viewport texture use says 5%, Texture Cache Size says 1638 MB. Is there another setting/preference to set?
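A quick back-of-the-envelope check of those numbers (a sketch with assumed texture formats - whether Houdini stores viewport textures at 8-bit, half, or full float is an assumption here, not something the thread confirms):

```python
def frame_bytes(width, height, channels=4, bytes_per_channel=2):
    """Uncompressed size of one frame as a GPU texture.
    The default assumes half-float RGBA storage - an assumption,
    not necessarily what Houdini uses internally."""
    return width * height * channels * bytes_per_channel

def frames_that_fit(cache_mb, width, height, **fmt):
    """How many frames of the given resolution fit in a texture cache
    of cache_mb megabytes."""
    return (cache_mb * 1024 * 1024) // frame_bytes(width, height, **fmt)

# 4K (3840x2160) half-float RGBA is ~63 MiB per frame, so a 1638 MB
# cache holds only ~25 frames; if the viewport quantizes to 8-bit RGBA
# (~32 MiB per frame) it holds ~51, close to the "first 50 frames or so"
# observed earlier in the thread.
print(frames_that_fit(1638, 3840, 2160))                       # 25
print(frames_that_fit(1638, 3840, 2160, bytes_per_channel=1))  # 51
```

Either way, a full 4K sequence wants several gigabytes of texture memory, which would explain why only part of it ever plays back from cache.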
OneBigTree
I have to bump this up.
I still have trouble playing a sequence of 140 frames as a camera background (loaded in the camera parameters) or as a viewport background set in the view options. Only a few frames play in real time, then Houdini starts reading from disk again. Houdini's memory usage doesn't even increase while the timeline plays.

This is something I can easily do anywhere else. There must be a way to do it in Houdini, the most sophisticated DCC application today?
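The kind of RAM preloading other players do can be sketched in a few lines of generic Python - to be clear, this is not a Houdini API, just an illustration of buffering a sequence into system memory up front:

```python
def preload_sequence(frame_paths):
    """Read every frame file into RAM once, so playback never hits disk.
    Returns a list of raw byte buffers, one per frame, in order."""
    frames = []
    for path in sorted(frame_paths):
        with open(path, "rb") as f:
            frames.append(f.read())
    return frames

# A player would then decode/upload frames[i] on demand while scrubbing,
# limited only by system RAM rather than VRAM.
```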
anon_user_37409885
You should submit a bug via the Support menu above as it's always worked here on Linux.

SideFx support staff in general don't read these forums.
OneBigTree
goat
You should submit a bug via the Support menu above as it's always worked here on Linux.

SideFx support staff in general don't read these forums.

I will. I was hoping someone in the community might have an answer.
OneBigTree
I figured it out. It is so obvious - and so absurd - that I subconsciously ignored the possibility:

The viewport background sequence is loaded into the OGL cache, which is - yes - the VRAM on your graphics card.
So you have a choice: fill up your 6-12 GB of VRAM with a third of your sequence, which prevents you from rendering with Redshift while the better part of your 64 GB of RAM sits empty, or… switch back (again) to a piece of software that lets you easily load your viewport background into RAM, where it belongs, while happily rendering with your GPU renderer.
An RFE has been logged now. I wonder whether they'd have noticed it once Karma went GPU…