Found 349 posts.
Work in Progress » alembic dop
- ragupasta
- 349 posts
- Offline
When you set Deadpool to an RBD, did you turn on “Deforming Geometry” in the RBD node in the dopnet?
Edited by ragupasta - Sept. 5, 2017 13:50:32
Work in Progress » Back On The Shading Bandwagon
hi chaps, so I've been looking at shading a little closer lately and I really want to push the Principled Shader and see what it can achieve with very limited modification from the vanilla shader.
So I am starting with a snow scene.
I found a very nice image of an ice Utah Teapot frozen in snow on Google, and it seemed like a nice place to start. So here is my rendition of a similar scene.
I'm concentrating on the ice shader first, then the snow.
I had some advice quite a long time ago from tamte regarding the internal depth of ice (not sure if it was here or on odforce), but since I have some time once again, I am returning to the fray.
Simple scene:
1. Displaced grid for the floor: a Mountain SOP for the main displacement, with shader displacements for the secondary and tertiary detail.
2. The teapot is shaded as a clear ice object with cloudy internal ice, as found in most types of ice. Displacements are shader-based, not geometry-based.
The internal cloudy ice is not a mix of different shaders. It is a copy of the teapot, reduced in size and turned into a volume via the IsoOffset node. A uniform volume shader is applied to this, in conjunction with a Volume VOP to add some noise to the density values.
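The Volume VOP step can be pictured as a per-voxel density modulation. A minimal stand-in sketch in plain Python — the noise function, the 0.5 base density and the roughness value are all invented for illustration; in Houdini this would be a noise VOP wired into the density field:

```python
import math

def cheap_noise(x, y, z):
    """Deterministic hash mapped to [0, 1): a crude stand-in for a
    real 3D noise such as VEX's noise(). Illustration only."""
    h = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return h - math.floor(h)

def cloudy_density(p, base_density=0.5, roughness=0.6):
    """Start from the uniform shader density and break it up with
    noise so the inner volume reads as cloudy rather than flat."""
    n = cheap_noise(*p)
    return max(0.0, base_density * (1.0 - roughness + roughness * n))

# Every sample stays within [0, base_density], just unevenly so.
print(cloudy_density((0.3, 1.2, -0.7)))
```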
The lighting is very simple.
1x Area light for the main lighting with the area sizes scaled up for soft shadows.
1x Environment light with the Bosch HDR that ships with Houdini, with a slight blue colour to tint the light a bit from grey.
1x spotlight. This one is only there to accentuate the internal volume; its light mask and shadow mask are set to the volume only.
Any suggestions are welcome. I should have an update in the next day or so.
Work in Progress » WIP - Personal RnD on Houdini
Hi Matteo.
Interesting thread. A lot of good stuff in here.
Just taking you back to your sci-fi corridor scene: I'm looking at your render settings and wondering why you have your pixel samples at 8x8.
I try my best to keep pixel sampling as low as I can. Yes, it makes a nice clean image, but if you tackle the noise sources individually you can make an equally clean render with much lower rendertimes than brute-force pixel sampling.
So Houdini 16 has a lot of individual sampling sliders. Please use these!
For example: if you are using global illumination you always get noise in the shadow areas, as you do with any other renderer. You can clear a lot of this with Pixel Samples, but a much better way is the “Diffuse Quality” parameter.
For subsurface scattering, try and use the SSS Quality parameter before pixel sampling.
The trouble with Pixel Samples is that every other form of sampling (for example, Reflection Quality) is multiplied by the pixel-sample count set in Mantra. That can be a lot of samples.
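To make the multiplication concrete, here is a rough back-of-the-envelope sketch — plain Python, not Mantra's actual internals, and the real sampler is adaptive (min/max ray samples, noise level), so treat this purely as intuition:

```python
def effective_rays(samples_x, samples_y, quality):
    """Approximate secondary rays per pixel for one component:
    the per-component quality multiplies the base pixel-sample
    count (a simplification of Mantra's behaviour)."""
    return samples_x * samples_y * quality

# 8x8 pixel samples with Reflection Quality 4, versus 3x3 with the
# same quality: the lower base budget is far cheaper per component.
print(effective_rays(8, 8, 4))  # 256
print(effective_rays(3, 3, 4))  # 36
```

This is why raising the base pixel samples gets expensive so fast: every per-component slider you later increase is scaled by that base.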
A better way is to use extra image planes.
Mantra –> Images –> Extra Image Planes. Choose the ones you want to inspect and render in MPlay or the render view (render view for me). Each of those image planes will be saved into the buffer, where you can switch between the rendered view (C), direct samples, indirect samples, etc. Viewing these individually lets you troubleshoot which areas need additional sampling and which don't.
With this knowledge you can improve areas of the render without resorting to huge rendertimes.
Keep up the good work, some nice stuff going on so far.
Edited by ragupasta - Sept. 4, 2017 16:25:44
Work in Progress » Interior Rendering
Additional question: Atmospheric effects such as volume lighting (aka: god rays).
The old method of a lit fog/atmosphere object is notoriously slow, so that is not going to be an effective way of doing it.
The next logical solution would be to create a fog volume encompassing the scene and render it that way. The problem is, with say a billowy smoke shader, dropping the density very low so light can actually get through also makes it very slow and noisy. That means significantly increasing the volume samples in Mantra, and possibly the stochastic samples, depending on the situation.
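For intuition on why a sparse volume goes noisy: transmission through a uniform medium follows the Beer-Lambert law, so at very low densities most rays cross the whole volume with only rare scattering events to sample. A quick sketch in plain Python — the density and distance numbers are made up for illustration:

```python
import math

def transmittance(density, distance):
    """Beer-Lambert law: fraction of light surviving a straight
    path of `distance` through a uniform medium of `density`."""
    return math.exp(-density * distance)

# Thin room fog: ~82% of the light crosses 10 units unscattered, so
# scattering events are rare and the estimate stays noisy unless the
# volume sample count goes up.
print(round(transmittance(0.02, 10.0), 2))
# Dense smoke: almost nothing gets through.
print(transmittance(1.0, 10.0))
```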
Would using VDB be a better solution? And if so what would be the best workflow?
A Volume SOP to encompass the scene, then a Convert VDB set to VDB?
Not really played much with VDBs, so it's an unfamiliar area for me.
Any suggestions?
Work in Progress » Interior Rendering
So I am having another go at Mantra with a more geometry-heavy scene. You have all seen this scene before, from the lighting challenges over at CGTalk. Files are hosted at Jeremy Birn's site, www.3drender.com
From a Mantra perspective the settings are pretty much the same as the last scene, but lighting consists of:
1x spotlight for the sun
4x portal lights for room fill
1x GI light for caching
Basic lighting and no scene shaders as of yet. Rendertime stands at 34:55, which is very acceptable in my eyes.
The more I play with lighting and Mantra, the more I like it.
The sunlight is a tad too strong at the moment, but let's see after I build some shaders.
Modeled by Alvaro Luna Bautista and Joel Andersdon.
Houdini Lounge » Marvelous Machines Winning Entries
Technical Discussion » Having problem with volume light
I used to get this a lot and never actually figured it out. When I used to get this error the volume light did not work. Can you confirm that it actually is working after the error is received?
Work in Progress » Terrain test
Houdini Lounge » Houdini in comparison to 3ds Max?
I would ask on the forums rather than a private message. Reason being if some other people get stuck on the same thing, they will be able to find the answer using the forum search function.
Work in Progress » Interior Rendering
Thanks Mark, I shall have a read. I have the direct and indirect ray sample planes activated as usual inside Mantra, as they help a lot in troubleshooting noisy areas. So yes, they still exist in H16.
Houdini Lounge » Houdini in comparison to 3ds Max?
Here are a few beginner tutorials covering geometry, custom shader building and compositing inside Houdini. Nothing is scripted, so when things don't work you see how to find the issues. The author is me, so it's a bit of a shameless plug.
https://www.youtube.com/playlist?list=PLqBA97RtpBD6efJHapb2tVeq_AWio6yOo
Houdini Learning Materials » What is the best way to make a realistic grass field in Houdini 16?
mawi
Why are you not using the fur system or the fur SOP?
Because it is fun to experiment.
Houdini Lounge » Houdini in comparison to 3ds Max?
logan_the_hamster
You said that I have to think differently — procedurally, to be accurate — but how do I do that? I've never worked procedurally, and the same goes for nodes: what each of them does, etc. I guess that's not too difficult and is just something you have to learn by heart.
Since you're a Max user, you have already experienced proceduralism at a lower level, probably without realising it. Take 3ds Max, for instance.
Add a sphere into the scene. You get a series of parameters to control the radius and the row and column counts. You set what you want, then right-click and collapse to Editable Poly, right? Now you realise you needed another few rows and columns. The problem is that collapsing the stack has removed the procedural parameters from the sphere, and you are now in polygon-modelling mode, so to speak. Yes, you can Ctrl+Z your way to success, but if you have made a lot of changes you will then have to rebuild everything you Ctrl+Z'd your way past.
Not very efficient.
Procedural workflow allows you to “go back in time” and make changes that will propagate through the network and update everything with minimal effort and time. Also you can automate tasks which will auto-update other areas at the change of a parameter.
3ds Max's modifier stack is semi-procedural in that you layer different modifiers to describe the end result. We both know that once Max's modifier stack starts to get large, it throws a wobbly and crashes, or becomes very slow indeed. Houdini doesn't have this issue even with huge numbers of nodes, and even then you can lock nodes so that the entire network doesn't have to cook, to speed things up.
Another nice thing about the node-based workflow: say you drop a grid down and displace it into a terrain of some kind, and now you need another terrain to sit behind it. You don't need to rebuild a whole new terrain; you can simply branch a new node tree from the existing one and make changes. Boom, that is two terrains generated in a very small amount of time.
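That branching idea can be sketched outside Houdini too. A hedged Python analogy — the `grid` and `displace` functions here are invented stand-ins for the Grid and Mountain SOPs, not real Houdini calls:

```python
import random

def grid(rows, cols):
    """Stand-in for a Grid SOP: a flat field of heights."""
    return [[0.0] * cols for _ in range(rows)]

def displace(heights, seed, amplitude):
    """Stand-in for a Mountain SOP: deterministic displacement.
    Re-running with new parameters regenerates the result, which
    is the 'go back in time' property of a procedural network."""
    rng = random.Random(seed)
    return [[h + rng.uniform(-amplitude, amplitude) for h in row]
            for row in heights]

base = grid(4, 4)                                  # shared upstream node
terrain_a = displace(base, seed=1, amplitude=2.0)  # first branch
terrain_b = displace(base, seed=7, amplitude=0.5)  # second branch, new look
# `base` is untouched, so both branches keep cooking from the same input.
```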
On the node side: open up a fresh Houdini scene, lay down a Geometry node and jump inside. Delete the File operator, hit the Tab key and drop down any nodes you are interested in (they do not have to be connected to anything, as you are not going to use them). Select a node and, in the Parameters pane, hit the “?” button in the top right. This brings up the help file for that specific node. Have a read through; at the bottom, most node help files come with scene files to open up and explore. These files have full explanations of what the node is doing and why.
Very cool way of learning.
I spent hours just dropping random nodes down and opening the help files and .hips embedded within.
Edited by ragupasta - March 5, 2017 04:59:40
Houdini Learning Materials » What is the best way to make a realistic grass field in Houdini 16?
Well, I thought I would have a play with this. I made a point of not looking at Almina's node network, and after completion mine looks very similar to yours. This is the copy-stamp method, and for 10,000 blades of grass I think 10 seconds of stamping time is acceptable.
This is certainly not polished or anything and could be taken a lot further, but I gave myself a 30 minute window to play with.
NURBS curves rendered with the width attribute for Mantra. The attribute is controlled by a ramp parameter for customisation. Noise is added to the curves' point positions, and colour comes from a UV attribute on the curve itself, again via a ramp parameter.
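A ramp parameter driving the width attribute is essentially piecewise-linear interpolation over (position, value) control points. A minimal sketch in plain Python — the specific control points are invented for illustration; Houdini evaluates its own ramp for you:

```python
def eval_ramp(points, u):
    """Piecewise-linear ramp: `points` are (position, value) pairs
    sorted by position; `u` is the parametric position in [0, 1]."""
    if u <= points[0][0]:
        return points[0][1]
    for (p0, v0), (p1, v1) in zip(points, points[1:]):
        if u <= p1:
            t = (u - p0) / (p1 - p0)
            return v0 + t * (v1 - v0)
    return points[-1][1]

# A blade that tapers from a wide root to a sharp tip:
width_ramp = [(0.0, 0.02), (0.7, 0.012), (1.0, 0.0)]
print(eval_ramp(width_ramp, 0.0))  # 0.02 at the root
print(eval_ramp(width_ramp, 1.0))  # 0.0 at the tip
```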
Pretty basic stuff.
Was a fun little thing.
Houdini Lounge » Houdini in comparison to 3ds Max?
All 3D apps have their hiccups, and Houdini does sometimes crash. That aside, though, Houdini's crashes are few and far between, and usually it is down to me doing something stupid, like overloading my RAM, etc. As for H16 being very stable a few weeks after release: this is because SideFX works alongside major VFX houses during development. Essentially, as development continues it is being tested at major companies at the same time, who give a constant stream of feedback to SideFX, making changes and bug squashing much faster. I don't think Autodesk uses this approach at all; they do most of it in house.
For modelling, Houdini is catching up with the other packages and although the workflow is slightly different from say 3ds max, once you get used to it, it is smooth and really quite rapid. Selecting your tools via the viewport tab menu or shelf tools means you can completely ignore the node tree if you so wish. I don't as I like to keep things organised as I go.
Now that Houdini's render engine, Mantra, is getting significantly faster, I hope to ditch 3ds Max completely and use Houdini as my main app. Everything apart from rendering and modelling is already done in Houdini, but that is about to change for me.
As for plugins, Houdini's procedural node-based workflow means you can create your own “plug-ins” in the form of Digital Assets. The beauty of this is that using VOP nodes (VEX Operators) you can create assets with no prior scripting knowledge, as these nodes contain snippets of code within them; SOPs and other nodes can all be used as well. All you do is chain a series of nodes together to describe the tool's functionality. Very powerful for non-coding people.
Very cool.
Edited by ragupasta - March 4, 2017 07:19:18
Work in Progress » Interior Rendering
Thanks for the advice and link Edward.
So I gave the portal-light method a try. I have to keep the GI Light in the scene, otherwise the portal light won't get enough light in, even with a very high intensity setting. Rendertimes seem very similar to the age-old area-light method I was using. I dropped the max ray samples to 32 as a test, and the rendertime came out at 40:14. With the max samples back at 64 I can easily see another 10 minutes of rendertime added, so performance-wise both methods seem similar.
Obviously noise has returned due to lower sampling value for the dark areas.
Quick question Edward if you can answer it.
H15 had a parameter on the Mantra node called “Indirect Ray Samples” for directly controlling noisy GI areas. Having a look around Mantra in H16, I am assuming this is now controlled by the “Diffuse Quality” slider under the Sampling tab?
Edited by ragupasta - March 4, 2017 06:02:17
Houdini Learning Materials » pop sop dop? what do these mean?
Just to add this here as well, guys: Houdini comes with a very robust help system. Drop down a node you don't understand and hit the “?” in the top-right corner of the Parameters pane. This brings up the documentation. The beauty of it is that, after reading the information, right at the bottom there are usually scene files embedded which auto-load into Houdini with a single click, and those scene files are dotted with lots of information about exactly what the node(s) are doing at any particular time. So you can read, and interactively run any animation in the file to see what it is doing and when.
A very strong way of learning. Couple that with the beginner tutorials here, and the transition from beginner to intermediate will happen faster than you think.
Work in Progress » Interior Rendering
Hi guys.
I decided to try some lighting practice with Houdini since version 16 is out. This build seems significantly faster than 15.5: rendering this in 15.5 was taking nearly 2 hours, whilst 16 takes 58 minutes per frame.
Still a long way to go but so far it's coming along nicely.
The scene is pretty simple in lighting terms: 1 spotlight for the sun, 1 area light to fill the room and a GI Light for GI/caching.
Mantra settings are:
Sampling: 3x3
Noise level: 0.005
Min/Max Samples: 1/64
Diffuse Limit: 1
Model created and supplied by: Georgio Luciano.
Work in Progress » Ground Collapse
Pretty impressive. I like the way the shattering is not uniform. The main thing that sticks out to me is the sandy-coloured ground against the dark grey smoke sim; the colours need to match. Also, the smoke only really kicks in after the shattered elements hit the floor. If an underground collapse/earthquake set this off, wouldn't the smoke emit much earlier?
Looking good.
Work in Progress » Shading Thread
Hello people.
A shading WIP thread for me, as a learning reference. The first project is still firmly in the WIP phase. All advice is welcome.
This project is based on the proto-planet Earth and its sister planet Thea, the creation of the Moon and so forth.
The shader setups for the planets are similar and quite complicated. For the Earth's shading, the colour is two different colour mixes masked by procedural noises. The cracks and debris are Worley noise multiplied by turbulent noise, which is then inverted and clamped in a Fit Range VOP, with a few additional procedural noises added on top.
The lava colour is simple: a colour parameter multiplied by another parameter to create a brightness control.
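The invert-then-fit part of that mask can be written down concretely. A hedged sketch in plain Python — the fit ranges and noise values are invented, and `fit` here just mirrors the VEX/VOP function of the same name:

```python
def fit(value, omin, omax, nmin, nmax):
    """Like Houdini's fit(): remap from [omin, omax] to
    [nmin, nmax], clamping values outside the old range."""
    t = (value - omin) / (omax - omin)
    t = max(0.0, min(1.0, t))
    return nmin + t * (nmax - nmin)

def crack_mask(worley, turbulence):
    """Conceptual shape of the crack mask described above:
    multiply the two noises, invert, then clamp/remap with fit.
    Both noise inputs are assumed to lie in [0, 1]."""
    return fit(1.0 - worley * turbulence, 0.4, 0.9, 0.0, 1.0)

print(crack_mask(0.95, 0.9))  # high product inverts low: clamps to 0.0
print(crack_mask(0.1, 0.5))   # low product inverts high: clamps to 1.0
```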
PBR renderer and shader build.
As all of the displacements that matter are built into the shader, any ideas on how to add specular highlights to just the lava and merge them back into the shader without affecting the darker, rocky areas?
Also, any ideas on how to pipe the lava into some sort of emission/glow shader? Multiplying the colour only takes the look so far.
A couple of screenies and such.