Houdini Engine 2.0
Sessions

Introduction

The concept of sessions is new to Houdini Engine 2.x. In prior versions of Houdini Engine, the Houdini Engine dynamic library (libHAPI) linked directly to core components of Houdini, and host applications were in turn typically linked to libHAPI. This meant that the Houdini Engine implementation ran inside the host process.

This setup had two major disadvantages:

  • since libHAPI loaded core Houdini libraries in the host process and those libraries have multiple third-party dependencies, there was a high probability of them conflicting with the host application's own dependencies;
  • there was only one Houdini scene state supported per process.

Houdini Engine 2.x still supports this in-process scenario, and for simple hosts it remains a good choice. However, a new inter-process communication (IPC) mechanism is also available, making it possible for the host application to run one or more Houdini Engine sessions in its own process or in separate processes, in parallel, and even on another machine if desired.

Every such session is represented by a HAPI_Session parameter passed by pointer into most API calls.

Migration

Just like in Houdini Engine 1.x, the libHAPI dynamic library contains the actual implementation of Houdini Engine functions and interfaces directly with core Houdini components, but it doesn't contain the implementation of IPC mechanisms.

If you are not interested in using IPC and simply want to run Houdini Engine in-process, you can link your host application to libHAPI directly; no additional functions need to be called before you can successfully call HAPI_Initialize(). If you are porting an existing HAPI 1.x code base, you only need to add a HAPI_Session pointer parameter to all HAPI calls (in this case it is ignored, so you may want to set it to NULL for consistency).
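As a rough sketch, a minimal in-process host might look like the following. The cook options structure and the HAPI_Initialize() argument list mirror the source sample later on this page; check your installed HAPI.h for the exact signature in your version.

```cpp
#include <HAPI.h>
#include <cstddef>

int main()
{
    HAPI_CookOptions cook_options;
    cook_options.splitGeosByGroup = false;

    // With a direct libHAPI link the session pointer is ignored,
    // so NULL is passed for consistency.
    if ( HAPI_Initialize(
            NULL, &cook_options,
            true,   // use a separate cooking thread
            -1,     // default cooking thread stack size
            NULL, NULL, NULL, NULL ) != HAPI_RESULT_SUCCESS )
        return 1;

    // ... instantiate assets, cook, marshal geometry ...

    HAPI_Cleanup( NULL );
    return 0;
}
```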

Houdini Engine Loader

If, on the other hand, you want to take advantage of an IPC implementation and be able to easily switch between IPC and the in-process implementation with low overhead, you should link your application to libHAPIL instead. HAPIL stands for Houdini Engine API Loader; it is a "shim" library that can load libHAPI, libHARC (our IPC client implementation), or even a custom user-supplied dynamic library.

libHAPIL exports functions declared in HAPI.h. Most of these functions are redirected to a dynamically loaded implementation library such as libHAPI or libHARC but others are special functions for loading libraries and creating sessions, implemented in libHAPIL itself.

If your application is linked to libHAPIL, you MUST explicitly create a session before you can call any HAPI function that receives a session as an input parameter, including HAPI_Initialize().

For example, HAPI_CreateInProcessSession() loads libHAPI dynamically and returns a singleton in-process session in its output parameter. Please note that unless you call this function, no Houdini libraries and dependencies are loaded in the host process, which can help you solve dependency problems.

To simplify migration, it is acceptable to pass a NULL pointer to represent the singleton in-process session after creating it with HAPI_CreateInProcessSession().

If you'd like to use our Thrift-based IPC implementation, you can create a session with HAPI_CreateThriftSocketSession() (which uses TCP sockets as the transport layer) or HAPI_CreateThriftNamedPipeSession() (which uses named pipes on Windows or domain sockets on Unix-like OSes). The first time a Thrift session is created, libHAPIL loads libHARC, the client-side dynamic library implementing Thrift bindings for HAPI calls.
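For illustration, a sketch of creating a Thrift session is shown below. It assumes a HARS server is already running; the host name, port, and pipe name are placeholder values and must match the options the server was started with.

```cpp
#include <HAPI.h>
#include <cstdio>

int main()
{
    HAPI_Session session;

    // TCP socket transport:
    HAPI_Result result = HAPI_CreateThriftSocketSession(
        &session, "localhost", 9090 );

    // ...or named-pipe / domain-socket transport:
    // HAPI_Result result = HAPI_CreateThriftNamedPipeSession(
    //     &session, "hapi_pipe" );

    if ( result != HAPI_RESULT_SUCCESS )
    {
        std::fprintf( stderr, "Could not connect to HARS.\n" );
        return 1;
    }

    // ... HAPI_Initialize( &session, ... ) and the rest of the setup ...
    return 0;
}
```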

You can even implement your own custom IPC client library, bind it to one of the custom slots in the HAPI_SessionType enumeration with HAPI_BindCustomImplementation(), and create sessions specific to this implementation with HAPI_CreateCustomSession(), passing the required session creation parameters as a raw pointer.
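A sketch of this binding flow might look as follows. The library path and the options struct are hypothetical: the raw pointer passed to HAPI_CreateCustomSession() is interpreted solely by your own client library.

```cpp
#include <HAPI.h>

// Hypothetical session creation parameters for a custom IPC client.
struct MyIPCOptions
{
    int port;
};

int main()
{
    MyIPCOptions options = { 1234 };    // hypothetical values
    HAPI_Session session;

    // Bind a (hypothetical) custom client library to a custom slot...
    HAPI_Result result = HAPI_BindCustomImplementation(
        HAPI_SESSION_CUSTOM1, "libMyHAPIClient.so" );

    // ...then create a session backed by that implementation.
    if ( result == HAPI_RESULT_SUCCESS )
        result = HAPI_CreateCustomSession(
            HAPI_SESSION_CUSTOM1, &options, &session );

    return result == HAPI_RESULT_SUCCESS ? 0 : 1;
}
```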

Finally, when you're done with a Houdini Engine session, you can close it with HAPI_CloseSession(). In the libHARC implementation, this will close the client-server connection. If the session is not closed explicitly with HAPI_CloseSession() during the client's lifetime, it will be closed automatically when the client process terminates.

Note that multiple sessions can be open and multiple implementation libraries can be loaded simultaneously.

Thrift Server

When using libHARC (the Thrift-based IPC library) on the client side, you need to start the server executable HARS, which is included in the Houdini installation. HARS is a console application with simple command-line arguments:

$ HARS -h
Allowed options:
  -h [ --help ]             Produce this help message
  -s [ --socket-port ] arg  The server port, if using a TCP socket session
  -n [ --named-pipe ] arg   The name of the pipe, if using a named pipe session
  -a [ --auto-close ]       Close the server automatically when all client
                            connections are closed
  -r [ --ready-handle ] arg Event to signal when the server is ready to serve
                            (for automated server startup)

HARS links directly to libHAPI and to core Houdini libraries and their dependencies. Since Thrift IPC is cross-platform, the host process (using libHAPIL and libHARC) and the server process (HARS) can be built for and run on different platforms.

You can start a HARS process from the command line or from a pipeline script with the desired options and then establish a client connection to the server in the C++ code of your application using the respective session creation function. Alternatively, you can use one of the convenience functions HAPI_StartThriftSocketServer() and HAPI_StartThriftNamedPipeServer() to start the server from C++ code before creating a session. Both of these functions block until the server signals that it's ready to serve, so once one of them returns success, it is safe to create a session with matching parameters.
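Putting the two steps together, a sketch of automated server startup might look like this. The pipe name is illustrative, and the argument comments reflect our reading of the source sample later on this page; consult HAPI.h for the authoritative signature.

```cpp
#include <HAPI.h>

int main()
{
    HAPI_Session session;

    // Blocks until the server signals readiness, so on success a
    // matching session can be created immediately.
    HAPI_Result result = HAPI_StartThriftNamedPipeServer(
        true,           // auto-close when all client connections close
        "hapi_pipe",    // pipe name (illustrative)
        5000,           // timeout (assumed to be in milliseconds)
        NULL );         // unused here

    if ( result == HAPI_RESULT_SUCCESS )
        result = HAPI_CreateThriftNamedPipeSession(
            &session, "hapi_pipe" );

    return result == HAPI_RESULT_SUCCESS ? 0 : 1;
}
```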

Please note that the HARS server currently only supports a single client connection. If a client is connected to the server and another client tries to connect, then the second client will block until the first connection is closed.

Multithreading

Using the libHARC Thrift RPC implementation, one host application process can interact with multiple HAPI sessions, each representing its own HARS process with its own separate Houdini state. This opens up new multithreading possibilities for applications using Houdini Engine. libHARC is designed to be thread-safe when using multiple sessions from multiple threads.

Access to each session is protected by a mutex, so multiple threads can access one session safely, but it's impossible to truly parallelize operations using just one session. The client code may also need to synchronize operations on a single session when the order of operations is important. For example, when marshalling geometry, it doesn't matter in what order geometry attributes are set, but it is important that HAPI_CommitGeo() is called after all of them have been set.

With multiple sessions on the other hand, parallelization becomes possible because their Houdini contexts are completely independent. The following source sample illustrates how reading and writing volume data per tile can be parallelized between asynchronous tasks by using separate sessions for reading and writing.

#include <HAPI.h>

#include <future>
#include <sstream>
#include <string>
#include <vector>

#ifdef WIN32
#include <windows.h>
#else
#include <unistd.h>
#endif

class testharcSession : public HAPI_Session
{
public:
    testharcSession(const char* base_name, int task_id)
    {
        cook_options.splitGeosByGroup = false;
        std::ostringstream pipe_name_os;
        pipe_name_os << base_name << task_id << '_'
#ifdef WIN32
            << GetCurrentProcessId();
#else
            << getpid();
#endif
        const std::string pipe_name = pipe_name_os.str();
        HARC_TEST_SUCCESS( HAPI_StartThriftNamedPipeServer(
            true, pipe_name.c_str(), 5000, NULL
        ));
        HARC_TEST_SUCCESS( HAPI_CreateThriftNamedPipeSession(
            this, pipe_name.c_str()
        ));
        HARC_TEST_SUCCESS( HAPI_Initialize(
            this, &cook_options, true, -1,
            NULL, NULL, NULL, NULL
        ));
    }
    ~testharcSession()
    {
        HARC_TEST_SUCCESS( HAPI_IsInitialized( this ) );
        HARC_TEST_SUCCESS( HAPI_Cleanup( this ) );
        HARC_TEST_SUCCESS( HAPI_IsSessionValid( this ) );
        HARC_TEST_SUCCESS( HAPI_CloseSession( this ) );
        // After closing, the session should no longer be valid.
        HARC_TEST_ASSERT(
            HAPI_IsSessionValid( this ) != HAPI_RESULT_SUCCESS
        );
    }
private:
    HAPI_CookOptions cook_options;
};
class testharcCopyTileValues
{
public:
    testharcCopyTileValues(
        const HAPI_Session& dst_session,
        const HAPI_Session& src_session,
        HAPI_AssetId asset_id,
        HAPI_ObjectId object_id,
        HAPI_PartId part_id,
        HAPI_AssetId input_asset_id,
        const HAPI_VolumeTileInfo& tile_info,
        int tile_value_count
    )
        : myDstSession(dst_session)
        , mySrcSession(src_session)
        , myAssetId(asset_id)
        , myObjectId(object_id)
        , myPartId(part_id)
        , myInputAssetId(input_asset_id)
        , myTileInfo(tile_info)
        , myTileValueCount(tile_value_count)
    {
    }
    void operator()()
    {
        // Allocate the tile data buffer.
        std::vector<float> tile_values(
            static_cast<size_t>(myTileValueCount), -5.8f
        );
        // Get the color data.
        HARC_TEST_SUCCESS( HAPI_GetVolumeTileFloatData(
            &mySrcSession,
            myAssetId, myObjectId, 0, myPartId,
            -8.8f,
            &myTileInfo, &tile_values.front(), myTileValueCount
        ));
        // Set the color data on the input volume.
        HARC_TEST_SUCCESS( HAPI_SetVolumeTileFloatData(
            &myDstSession,
            myInputAssetId, 0, 0,
            &myTileInfo, &tile_values.front(), myTileValueCount
        ));
    }
private:
    HAPI_Session myDstSession;
    HAPI_Session mySrcSession;
    HAPI_AssetId myAssetId;
    HAPI_ObjectId myObjectId;
    HAPI_PartId myPartId;
    HAPI_AssetId myInputAssetId;
    HAPI_VolumeTileInfo myTileInfo;
    int myTileValueCount;
};
void testharcCopyVolume(size_t task_id)
{
    testharcSession src_session("testharc_src", task_id);
    // Load the library from file.
    HAPI_AssetLibraryId library_id = -1;
    HARC_TEST_SUCCESS( HAPI_LoadAssetLibraryFromFile(
        &src_session, "HAPI_Test_Volumes_HoudiniFogColor.otl",
        false, &library_id
    ));
    HARC_TEST_ASSERT( library_id >= 0 );
    // Instantiate the asset.
    HAPI_AssetId asset_id = -1;
    HARC_TEST_SUCCESS( HAPI_InstantiateAsset(
        &src_session, "Object/HAPI_Test_Volumes_HoudiniFogColor",
        true, &asset_id
    ));
    // There's only one object, so just reference it directly.
    const HAPI_ObjectId object_id = 0;
    // Get the part info for the second part, which should be the red volume.
    const HAPI_PartId part_id = 1;
    // Get the volume info.
    HAPI_VolumeInfo volume_info;
    HARC_TEST_SUCCESS( HAPI_GetVolumeInfo(
        &src_session, asset_id,
        object_id, 0, part_id, &volume_info
    ));
    HARC_TEST_ASSERT( volume_info.tupleSize == 1 );
    HARC_TEST_ASSERT( volume_info.hasTaper == false );
    testharcSession dst_session("testharc_dst", task_id);
    // Create an input asset to receive the volume.
    HAPI_AssetId input_asset_id;
    HARC_TEST_SUCCESS( HAPI_CreateInputAsset(
        &dst_session, &input_asset_id, "Input_Volume"
    ));
    // Set the volume info.
    HARC_TEST_SUCCESS( HAPI_SetVolumeInfo(
        &dst_session, input_asset_id, 0, 0, &volume_info
    ));
    // Get the first volume tile.
    HAPI_VolumeTileInfo tile_info;
    HARC_TEST_SUCCESS( HAPI_GetFirstVolumeTile(
        &src_session, asset_id, object_id, 0, part_id, &tile_info
    ));
    HARC_TEST_ASSERT( tile_info.isValid );
    std::vector<std::future<void> > futures;
    const int tile_value_count =
        volume_info.tileSize *
        volume_info.tileSize *
        volume_info.tileSize *
        volume_info.tupleSize;
    while ( tile_info.isValid )
    {
        futures.push_back(
            std::async(
                std::launch::async,
                testharcCopyTileValues(
                    dst_session, src_session,
                    asset_id, object_id, part_id, input_asset_id,
                    tile_info, tile_value_count
                )
            )
        );
        // Get the next color tile.
        HARC_TEST_SUCCESS( HAPI_GetNextVolumeTile(
            &src_session,
            asset_id, object_id, 0, part_id,
            &tile_info
        ));
    }
    for ( auto& future : futures )
    {
        future.get();
    }
    // Commit the volume inputs.
    HARC_TEST_SUCCESS( HAPI_CommitGeo(
        &dst_session, input_asset_id, 0, 0
    ));
    // Cook the asset.
    HARC_TEST_SUCCESS( HAPI_CookAsset(
        &dst_session, input_asset_id, NULL
    ));
    HARC_TEST_ASSERT( testharcVerifyInputVolume_HoudiniVolume(
        dst_session, input_asset_id, 0
    ));
}

Finally, session management functions (the session creation functions, HAPI_IsSessionValid() and HAPI_CloseSession()) share a mutex. They don't interfere with normal HAPI functions called concurrently, with one exception: HAPI_CloseSession() invalidates the session, and passing it to a HAPI function afterwards results in undefined behaviour. You can always safely call HAPI_IsSessionValid() to determine whether a session is valid.