One more possible avenue.
Looks like the shell rop has it on the scripts tab, so perhaps I could prepend an empty shell rop to my fetch rops (fetching the cop rops).
Technical Discussion » Initialize Simulation OPs on COP rop?
- pclaes
- 257 posts
- Offline
Found a bit more info here:
https://www.sidefx.com/docs/hdk/_h_d_k__s_o_h_o.html [www.sidefx.com]
So the name of the property might be 'soho_initsim'.
I wonder if I can add that to a fetch rop as well.
https://www.sidefx.com/docs/hdk/_h_d_k__s_o_h_o.html [www.sidefx.com]
int soho_initsim
If this parameter exists and evaluates to a non-zero value, the output driver will initialize all simulation OPs before invoking the soho_program.
This traverses all nodes in the scene and invokes specific methods on certain node types.
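If the property really is 'soho_initsim', adding it to a fetch rop could in principle be done from Python as a spare parameter. A rough sketch (untested; the node path is a placeholder, and whether the fetch rop actually honors the property is exactly the open question here):

```python
# Sketch: add a 'soho_initsim' spare parameter to a ROP so output drivers
# that check the property will initialize simulation OPs before rendering.
# The node path passed in is a placeholder; whether a Fetch ROP honors
# this property is untested.
try:
    import hou  # only available inside a Houdini session
except ImportError:
    hou = None

def add_initsim_property(rop_path):
    """Add an integer 'soho_initsim' spare parm set to 1. Returns True on success."""
    if hou is None:
        return False
    rop = hou.node(rop_path)
    if rop is None or rop.parm("soho_initsim") is not None:
        return False
    tmpl = hou.IntParmTemplate("soho_initsim", "Initialize Simulation OPs",
                               1, default_value=(1,))
    rop.addSpareParmTuple(tmpl)
    return True
```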
Technical Discussion » Initialize Simulation OPs on COP rop?
- pclaes
- 257 posts
- Offline
Hi,
Is there a way to add/enable 'Initialize Simulation OPs' on a cop rop?
I am running a 2d simulation and want to write the result of that simulation to disk, but directly as a png.
On a geometry rop there is an 'Initialize Simulation OPs' checkbox that is useful to make sure the simulation is reset before writing out any geo. I would like to do the same thing, but for writing out my image from cops.
I'm not sure if I can add this as a render_property (perhaps 'initsim'? -- but that is not showing up as an actual render property, it is just the name of the parameter on the geometry rop).
Alternatively, in my rop network I could trigger a pre-render script python call to perhaps do the same.
Anyone know a python command that might trigger this?
Thanks,
Peter
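One pre-render option would be to press the DOP network's 'Reset Simulation' button from a Python pre-render script on the rop. A rough sketch (the dopnet path is a placeholder for this example, and the 'resimulate' button parm name should be verified against your Houdini version):

```python
# Sketch of a pre-render script: jump to the start frame and press the
# DOP network's 'Reset Simulation' button before the COP rop writes images.
# The dopnet path default below is made up for illustration.
try:
    import hou  # only available inside a Houdini session
except ImportError:
    hou = None

def reset_sim_before_render(dopnet_path="/obj/sim2d/dopnet1"):
    """Return True if the simulation was reset, False otherwise."""
    if hou is None:
        return False
    dopnet = hou.node(dopnet_path)
    if dopnet is None:
        return False
    start = hou.playbar.frameRange()[0]  # first frame of the playback range
    hou.setFrame(start)
    resim = dopnet.parm("resimulate")    # 'Reset Simulation' button parm
    if resim is not None:
        resim.pressButton()
        return True
    return False
```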
PDG/TOPs » Best way to write out metadata together with geo?
- pclaes
- 257 posts
- Offline
Hey Tomas,
Thanks for the suggestion. If you look at what the csv exporter rop does, it calls a python definition from the pre-render script section of that shell rop. So that should work.
I am trying to 'stick with tops' and was trying to figure out if what I wanted to do would be possible with tops only.
Turns out that it is not possible with only tops (currently). So a chain of rop nodes seems like a clean solution.
Thank you tpetrick for the example file. I see the 'frame by frame' parameter on the ROP fetch TOP.
And the info of the 'Output Parm Name' makes sense as well. When you mention 'so it can evaluate that path for cache checking', is that also what is used for 'Delete This Node's Results from Disk'?
PDG/TOPs » Best way to write out metadata together with geo?
- pclaes
- 257 posts
- Offline
Thank you for your explanations!
For 1)
The clarification in regards to writing out multiple outputs during one work-item by fetching a rop network consisting of multiple rop nodes makes sense and I will try it. An example of this would be very much appreciated. Thank you.
If this also makes it so the cook of the heavy geo only happens once (per tile), then this is probably the approach I will use going forward for this task.
The analogy with simulation data sort of makes sense as well then, but makes me curious if that would A) write the sim data first (for all frames) and then write the additional output geo (for all frames - potentially recooking the sim?). Or if B) the sim data is written out frame by frame and after each sim frame finishes, the additional output geo writes (frame by frame) thereby avoiding recooking.
Basically:
work-item1: Sim all frames (rop) -> write additional output for all frames (rop)
vs
work-item1: Sim a frame (rop) -> write additional output for a frame (rop) -> go to next frame
Is this dependent on how the rops are wired perhaps? (in sequence, vs in parallel merged?) -- Normally I think rops don't tend to do frame-by-frame dependencies. They tend to run the entire frame-range before moving on to the next rop. Maybe I'm missing something here?
I see now in regards to partitions that they are behaving more like groups of work-items that are visually packed together, but under the hood still execute individually.
For 2)
This makes some sense, but also massively reduces the usefulness of the pythonscript top. I guess its purpose is more to directly manipulate the top attributes on the work-items instead of trying to pull (changing) geometry data from the scene.
PDG/TOPs » Best way to write out metadata together with geo?
- pclaes
- 257 posts
- Offline
Hi,
I am trying to find out what the best way is to write out meta-data into a json file whilst I am processing a bunch of geometry.
I have attached a file to show some of the different ways I am trying to get this meta data out.
In this example I have different polygonal tiles that I want to process (think terrain).
Ultimately each tile will end up subdivided with color data and a custom (single float) mask. This color data is then brought into COPs and written out as a texture (png).
Besides writing out the texture, I also want to write out a json file that contains information about the tile. Specifically the min and max values of the custom mask.
The goal:
To write out that metadata file at the same time when the texture from cops is cooked. -- This is important as the metadata file is 'light' to write and 'light' to process, but the cooking of the image data would take a while. So I want this to happen during the same process (during the same work-item?).
What I have tried so far:
What works:
*) I created a python sop that writes the metadata file whenever this sop cooks. If this sop is injected right before the node that goes to COPs is called, then it will correctly bake out the json metadata. -- Although this works, this seems a bit hacky as it is not really part of the tops chain. -- And the metadata will get exported whenever the python sop cooks (also whilst debugging the sop network).
In this case the dependency looks like:
TOP Fetch -> fetches COP rop -> fetches sop node (will trigger sop upstream cooking - computationally expensive step) -> triggers python sop cook (writes json - computationally cheap step) -> continues (downstream) to COPs to write image data.
*) I used the labs csv baker tool to bake the metadata as csv. In this case the python code lives on a digital asset and the definition is called through a shell rop inside of the csv baker rop. This does seem to correctly update the geometry so the metadata is updated correctly. (I can build my own json baker tool similar to the labs csv exporter tool if this would be a good approach.)
In this case the dependency looks like:
TOP Fetch -> Fetches COP rop & write image data -> fetches sop (will trigger upstream cooking - computationally expensive step)
TOP Fetch -> fetches the Csv exporter rop -> fetches sop (will trigger upstream cooking - computationally expensive step) -> continue downstream to csv exporter & write metadata (computationally cheap step).
My concern with this is that the 'computationally expensive step' is executed twice. Once for the writing of the image data and again for the writing of the metadata. Ideally I want to somehow link these two outputs to be 'run as one' or 'run after each other but during the same process/work-item'. I don't know if this is what partitioning is supposed to do? Basically grouping work items together that are supposed to run together. -- Similar to how the frames of a simulation are run as a batch. I would like the work items for each tile to run together.
I do like that the json exporting functionality would be wrapped into its own rop as that seems clean and also creates work items for each json file.
*) What does not work:
I tried using the pythonscript top. But because this is updating 'in-process' it will update the tops work-item, but it will not actually update the geometry in the scene and therefore it would write the metadata for whichever work-item was last selected. The metadata currently contains the min/max values of the custom mask - this is done using a sop attribpromote, which requires the geometry to be correctly updated.
I would almost be tempted to make a new 'python script geo' top hda that wraps a rop net with a shell rop that grabs the callback definition from a string panel. This seems a bit much and this is also when I am thinking there must be an easier/better way to have the existing pythonscript top correctly update the geometry that is triggered or sampled. Perhaps I should try to force a dirty & cook on a portion of my sop network so the data gets updated?
Any suggestions or advice as to what is the best way forward would be greatly appreciated.
The main two questions are:
1) How can I trigger the work-items from two different tops so they run during the same process? (partitions?)
2) How can/should I use the pythonscript top so it updates the geometry correctly so I can pull the metadata from the geo correctly.
Thanks!
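For the json side, the writer itself is trivial once the min/max values are in hand; the hard part is only controlling when it runs. A minimal, self-contained sketch of the per-tile metadata file (the field names are made up for illustration):

```python
import json

def write_tile_metadata(path, tile_name, mask_min, mask_max):
    """Write the per-tile metadata json next to the baked texture."""
    data = {
        "tile": tile_name,
        "mask_min": float(mask_min),
        "mask_max": float(mask_max),
    }
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
    return data
```

Inside Houdini this would be fed from the cooked geometry, e.g. a detail attribute produced by the attribpromote sop (min/max of the custom mask), so the geometry must actually be up to date when it is called.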
Houdini Engine for Maya » Houdini Engine and querying which is the host dcc
- pclaes
- 257 posts
- Offline
Hi,
I am starting to develop Houdini Engine tools that can work in multiple dccs (3dsmax, maya, ...). The houdini engine implementations treat the world transforms differently for different dccs (maya and max are different). There are also some differences with visualization of vertex colors etc.
Ideally there would be a way to find out which host dcc the plugin is running in. I can make a checkbox/dropdown where the user can specify that in the interface for now, but this feels like something I should be able to automate.
Does anyone know of a command or env variable that would allow me to query which host dcc I am running the Houdini engine plugin in?
I've tried looking in the docs for a while, but could not really find anything there.
Thanks!
Peter
Houdini for Realtime » SIDEFXLABS - VHDA Workflows
- pclaes
- 257 posts
- Offline
Nice addition!
Pretty much every studio I've been at rolled their own, because the default way of making namespaced versioned digital assets is so error-prone.
Houdini Engine for Maya » TOPs in Houdini Engine for Maya
- pclaes
- 257 posts
- Offline
Could this be mentioned somewhere in the docs?
Perhaps here:
http://www.sidefx.com/docs/maya/_maya__compatibility.html [www.sidefx.com]
I am also interested in wedging and tops use in engine for Maya.
Is there a timeline on when tops might be supported?
Thanks!
Houdini Engine for Maya » Triggering sync asset after button press?
- pclaes
- 257 posts
- Offline
Hi,
Is there a way to trigger a ‘Sync Asset’ after a button is pressed on the HDA?
Eg: I have 5 input path fields and 1 populate button.
In the first input path I point to a model file on disk.
Then I want to press the populate button and it scans that same directory for the other 4 models and puts their paths into the input path fields. I can do this in python with a callback script that is a definition that lives on the digital asset.
^^ I have all this working, but after pressing the ‘populate’ button, it does not update the information in the input path fields. Only after pressing the ‘Sync Asset’ button will it update the fields.
So is there a way to trigger a ‘Sync Asset’ after my button is pressed?
'Auto Sync Outputs' does not help
I can build an example asset if that helps.
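One avenue worth testing is to issue the sync from the button callback itself, on the Maya side, after the path fields have been set. The sketch below is hypothetical: the `houdiniAsset` command is the plugin's MEL command, but the exact sync flag name and the asset node name here are assumptions to verify against the Houdini Engine for Maya docs.

```python
# Hypothetical: trigger a sync of the asset node from Maya's Python after
# the populate button callback has filled in the path fields.
# The '-sync' flag name and the default node name are assumptions; verify
# them against the Houdini Engine for Maya documentation.
try:
    from maya import mel  # only available inside Maya
except ImportError:
    mel = None

def sync_houdini_asset(asset_node="houdiniAsset1"):
    """Return True if a sync command was issued, False when not inside Maya."""
    if mel is None:
        return False
    # Equivalent of pressing the 'Sync Asset' button in the Attribute Editor.
    mel.eval('houdiniAsset -sync "%s";' % asset_node)
    return True
```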
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
With the clean version of Maya it is loading the plug-in from the same location.
Still figuring out the other part.
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
Also I was unable to run the Debugger -> View Assets in Houdini.
It tries to load Houdini FX, but spits out lots of errors in the console (within our environment).
I'll report back if I can and when I know what has caused this.
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
Hi John,
After some further debugging, it has to do with the environment within Maya that is being set.
A clean Maya works. So now we're trying to find out what part of our Maya userSetup.py is breaking the houdini engine.
The Houdini engine is v3.3 (API 3). The plugin was installed with the installer.
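When bisecting a userSetup.py like this, dumping the environment from a clean session and from the broken one and diffing the two often narrows things down quickly (PATH and PYTHONPATH entries are the usual suspects for library conflicts). A small helper one could run in both sessions (the variable list is just a starting point):

```python
import json
import os

def dump_env(path, keys=("PATH", "PYTHONPATH", "MAYA_MODULE_PATH")):
    """Write selected environment variables, split into entries, to a json file."""
    snapshot = {k: os.environ.get(k, "").split(os.pathsep) for k in keys}
    with open(path, "w") as f:
        json.dump(snapshot, f, indent=2)
    return snapshot

def diff_env(clean, broken):
    """Return entries present in the broken snapshot but not in the clean one."""
    return {k: [e for e in broken.get(k, []) if e not in clean.get(k, [])]
            for k in broken}
```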
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
Thanks for looking into it.
It's looking more like there is something else going on with the Maya environment that is causing a library conflict.
I'm going to try to use the debugger today to see if that provides any info to help figure out what path or library might be causing this.
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
I've updated my Maya to 2018.6 and Houdini to h18.0.416.
But unfortunately the error remains (and is tricky to debug).
Houdini Engine for Maya » hda not processing geo when I have a python sop
- pclaes
- 257 posts
- Offline
Hi,
I am currently building a larger hda, but ran into an issue when I am trying to do some geometry processing and python is called.
I first noticed it with the group_expression sop.
I am able to load the hda in Maya 2018.3, Houdini Version 18.0.377, Houdini engine version: 3.3 (API: 2).
But as soon as I included the group_expression sop, the asset would throw an error:
// Error: Object (ID: 0): /obj/test_engine_simple1
Geo (ID: 14): /obj/test_engine_simple1/geo_OUT/output0
No geometry generated!
//
Also the very first time I load the asset I get:
// Error: HAPI Error: Error setting kwargs:
Traceback (most recent call last):
File "opdef:/Sop/groupexpression?PythonModule", line 1, in
NameError: name 'hou' is not defined
So I looked around a bit and this seems to happen when:
import hou
is not added to a python script.
The group expression sop internally seems to be running some python script, so that is probably why it fails.
I then tried it with a python sop to confirm that this is the issue. I picked the preset to ‘move points up’ and at the beginning of that blurb of code I added:
import hou
But that did not help.
So now I'm thinking that houdini engine is unable to find ‘hou’. So that would mean that there is something wrong in the configuration of my Houdini engine installation.
Any advice to help find out where this is going wrong? I'm on Windows 10.
I have attached my simple test otl.
It requires a polygonal input (like a poly sphere).
- it will subdivide it
- it will rotate it around centroid
- it will try to move the points with a python sop (this is where the asset fails in Maya)
Thank you!
Peter
Houdini Lounge » Houdini Growth Masterclass
- pclaes
- 257 posts
- Offline
As of right now it won't be posted online, as it is a class with some workshop portions (and it is a 7h class).
That said, considering I have developed all the material I can probably turn this into a tutorial series afterwards so more people can learn the workflows. This masterclass is limited to only 20 students.
Houdini Lounge » Houdini Growth Masterclass
- pclaes
- 257 posts
- Offline
Hey,
On June 3rd I will be teaching an advanced Houdini masterclass at the Effects America conference in Montreal. I will be covering growth systems. Should be good fun:
https://www.effects-events.com/en/master-classes/ [www.effects-events.com]
Description:
During this advanced Houdini masterclass you will learn how to create an art-directable growth system. Digitally constructing things can be as challenging as, if not more challenging than, destruction; this class will focus on the former. The class is split into two main sections. The first section of the class will dive into building the growth solver prototype tool. This covers solvers, some vector math, chaos theory, 2d & 3d growth, custom forces and tool development. The second section of the class dives into using the tools to grow a 2d and 3d pattern that is procedurally animated and prepared for rendering. This covers path finding, procedural animation, combining 2d and 3d patterns, custom attributes and aovs/render passes for comp.
Take away:
Understand the algorithm and concepts for building a growth solver. Build a user-friendly and efficient tool that can scale from small scale single growth to growing large datasets for entire vfx sequences. Understanding and making use of Houdini’s data acceleration structures. Gaining insight into the art-direction and approval process for both the grown pattern as well as the procedural animation.
The audience:
This course is intended for intermediate to advanced Houdini users. Users should have a working understanding of the Houdini interface and overall data flow (contexts, attributes, datatypes). Houdini Apprentice can be used for this class.
Hope to see you there or perhaps at the conference,
Peter
Edited by pclaes - May 27, 2019 19:01:46