Making particles interact with displaced objects?

Member
710 posts
Joined: July 2005
Is this possible? Say I have some particles that are dropping on a surface and sticking to it. Now the surface is just a flat default grid, but it has a displacement shader applied that makes it look like terrain when rendered. The problem is that the particles will only see the original flat grid and not the displaced render-time version. Is there some way to get the particles to recognize and interact with the micro-poly geometry?

Member
196 posts
Joined: July 2005
For the dirty, takes-16-hours-to-process, completely-miss-your-deadline version:

Take your displacement map into COPs. Give your grid loads of divisions. Then use the pic() function in a Point SOP, like this:

pic("/img/img1/file", $BBX, $BBZ, A)

or use tex() if you don't need to look at it in COPs, and exchange the path for the filename on disk.
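The idea can be sketched outside Houdini as plain Python (illustrative only: `displace_points` and `height_map` are stand-ins for the Point SOP expression and the pic()/tex() lookup, not Houdini API):

```python
# Standalone sketch of what the Point SOP expression does: for each point,
# sample the displacement map at the point's normalized bounding-box
# coordinates (the $BBX/$BBZ idea) and push the point up by the map value.

def displace_points(points, height_map, amplitude=1.0):
    """points: list of [x, y, z]; height_map(u, v) -> 0..1 luminance,
    standing in for the pic()/tex() lookup."""
    xs = [p[0] for p in points]
    zs = [p[2] for p in points]
    min_x, max_x = min(xs), max(xs)
    min_z, max_z = min(zs), max(zs)
    out = []
    for x, y, z in points:
        # normalized bounding-box coordinates, like $BBX and $BBZ
        u = (x - min_x) / (max_x - min_x) if max_x > min_x else 0.0
        v = (z - min_z) / (max_z - min_z) if max_z > min_z else 0.0
        out.append([x, y + amplitude * height_map(u, v), z])
    return out
```

With a ramp map (`lambda u, v: u`), points at the far X edge of the grid rise by the full amplitude while points at the near edge stay put.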

If your displacement map is animated it might be quicker to draw it by hand!!
:shock:

good luck
Henster
Member
710 posts
Joined: July 2005
Yeah, that was the first option, but it's just not feasible in this case. I have an environment modeled, and to get the mesh detailed enough I estimate it would have to be between 12 million and 48 million polygons. :shock:
Member
321 posts
Joined: July 2005
DaJuice
Is this possible? Say I have some particles that are dropping on a surface and sticking to it. Now the surface is just a flat default grid, but it has a displacement shader applied that makes it look like terrain when rendered. The problem is that the particles will only see the original flat grid and not the displaced render-time version. Is there some way to get the particles to recognize and interact with the micro-poly geometry?

I think you could get away with a grid that's significantly lower rez than something with 12-48 million polys! You wouldn't even see that many polys on screen at any given time!

The trick, though, might be to migrate the displacement from the map to the geometry, i.e. do rough displacement in geometry, and then fine-level displacement in the shader. That's a technique that generally produces good results, and it keeps the renderer from being stretched too far.
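A minimal sketch of that split, in plain Python rather than Houdini (the `height` callable and the coarse-grid cell size are illustrative assumptions): sample the height field at a coarse grid vertex for the part you'd bake into the mesh, and leave the residual for the displacement shader.

```python
# Sketch of splitting one height field into a coarse geometry pass and a
# fine shader pass. coarse + fine reconstructs the full height exactly.

def split_displacement(height, u, v, cell=0.25):
    """Coarse part: height sampled at the nearest coarse-grid vertex
    (what you'd move the mesh points by). Fine part: the residual left
    for the render-time displacement shader."""
    cu = round(u / cell) * cell
    cv = round(v / cell) * cell
    coarse = height(cu, cv)
    fine = height(u, v) - coarse
    return coarse, fine
```

Because the two parts sum back to the original height, the rendered surface is unchanged; the mesh just starts much closer to it, so collisions against the geometry are far less wrong.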

– Antoine
Antoine Durr
Floq FX
antoine@floqfx.com
Staff
2591 posts
Joined: July 2005
Another technique I've heard of is rendering a z-depth map and using that for particle collisions. This only works if the displacement is a height field though. It also requires a lot more work since you have to write the collision detection stuff yourself, but it can be faster than geometric collision.
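The DIY collision detection that technique requires could look something like this (an illustrative Python sketch, not the renderer's actual z-depth format — `height_map` stands in for the depth-map lookup):

```python
# Sketch of height-field collision against a depth/height map: a particle
# collides when a step of integration carries it below the terrain height
# sampled at its (x, z) position.

def collide_heightfield(pos, vel, height_map, dt=1.0 / 24):
    """Advance one step; if the particle ends up under the height field,
    clamp it to the surface and zero its velocity (a 'stick' behavior).
    Returns (new_pos, new_vel, hit)."""
    x, y, z = (pos[i] + vel[i] * dt for i in range(3))
    ground = height_map(x, z)
    if y <= ground:
        return [x, ground, z], [0.0, 0.0, 0.0], True
    return [x, y, z], list(vel), False
```

As noted, this only makes sense when the displacement is a pure height field: one height per (x, z), no overhangs.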

Otherwise, I'd suggest something along Antoine's idea.
Member
710 posts
Joined: July 2005
Antoine, the problem is that I plan to copy a lot of very small geometry (think small pebbles/stones) over a very large area, and I'm afraid that much of that geometry will end up below, or floating above, the surface if the res is not high enough. Take a look at the pic: the little yellow box is supposed to be the scale of a human being; now imagine a small stone sitting next to it. Obviously the terrain would need to be subdivided A LOT more, otherwise any displacement map would alter the surface of the terrain too much for the pebbles/stones to appear where they're supposed to be.



Mark, sounds sorta interesting, but I wouldn't know where to begin.

So basically, particles and subdivision surfaces don't play together?
Member
321 posts
Joined: July 2005
DaJuice
Antoine, the problem is that I plan to copy a lot of very small geometry (think small pebbles/stones) over a very large area, and I'm afraid that much of that geometry will end up below, or floating above, the surface if the res is not high enough. Take a look at the pic: the little yellow box is supposed to be the scale of a human being; now imagine a small stone sitting next to it. Obviously the terrain would need to be subdivided A LOT more, otherwise any displacement map would alter the surface of the terrain too much for the pebbles/stones to appear where they're supposed to be.

Thanks for providing a picture; I can now give a couple more concrete suggestions.

1) Group the polys around your human, and subdivide only those (subdivide heavily, as you said!). I'm assuming the stones are only around the human; if not, subdivide wherever you're planning to put stones.

OR:

2) Add an attribute to each stone that contains the UV of where the stone hit the surface (doable via the Collision POP, though with polys you might have to color the geometry according to UVs, then extract the hit color and stuff it into uv). Now that each stone has a UV, add a displacement shader to the stone that uses the same map as the ground. Since each whole stone will have the same UVs all around (inherited from the hituv/hit color), the stone will get displaced in Y by the amount that's meaningful wherever it landed. In other words, use displacement mapping to reposition the *entire* stone according to the height field.
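The rigid-lift part of option 2 can be sketched in plain Python (illustrative only; the `stones` structure with `points` and `hituv` is an assumption standing in for the Collision POP attributes):

```python
# Sketch of option 2: every point of a stone shares the single UV it
# inherited from the hit location, so sampling the terrain's height map
# once per stone lifts the *whole* stone rigidly onto the displaced surface.

def displace_stones(stones, height_map, amplitude=1.0):
    """stones: list of dicts with 'points' (list of [x, y, z]) and
    'hituv' ([u, v], as recorded at collision time)."""
    for stone in stones:
        u, v = stone["hituv"]
        dy = amplitude * height_map(u, v)  # one height sample per stone
        stone["points"] = [[x, y + dy, z] for x, y, z in stone["points"]]
    return stones
```

The key point is that the offset is constant across each stone's points, so the stone moves as a unit instead of being warped by the map.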


– Antoine
Antoine Durr
Floq FX
antoine@floqfx.com
Member
710 posts
Joined: July 2005
Thanks Antoine, the second option sounds pretty intriguing! I'll have a shot at it and see how far I can get before getting stuck.
Member
941 posts
Joined: July 2005
Antoine Durr
… Now that each stone has a UV, add a displacement shader to the stone, that uses the same map as the ground, but since each whole stone will have the same uv's all around

May I suggest a slight variation on the theme?

Why not displace the points after projection?
What I mean is: displace the points that got projected, not the grid they get projected onto.

During the projection step, get the surface position P, surface normal N, and any other attributes you may need to drive the displacement. Then pipe these projected points to an exact duplicate of your displacement shader, except implemented as a VexSOP (instead of a VexSHOP) – likely, this just involves copying VOPs from one context to another.
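A small Python sketch of that variation (illustrative, not Houdini API; `noise` stands in for whatever the displacement shader computes, and the points/normals would come from the Ray SOP or Collision POP):

```python
# Sketch of displacing the projected points themselves, along the surface
# normal captured at projection time, using the same function the
# displacement shader would evaluate.

def displace_projected(points, normals, noise, amplitude=1.0):
    """points[i]: rayed-down position; normals[i]: surface normal there."""
    out = []
    for (px, py, pz), (nx, ny, nz) in zip(points, normals):
        d = amplitude * noise(px, py, pz)
        out.append([px + nx * d, py + ny * d, pz + nz * d])
    return out
```

Since the same function drives both the SOP-level points and the render-time surface, the projected points land on the displaced terrain by construction.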

It's really just a tiny variation on Antoine's second suggestion, but I thought I'd mention it anyway.

Good luck!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.
Staff
2591 posts
Joined: July 2005
Another idea might be to displace the collided particles using the same displacement shader. However, instead of displacing each shade point on the particles, displace all the pebble's points by the same amount.

If you can transfer the attributes you use in the displacement shader to primitive attributes, you might be able to simply run the same displacement shader on the pebbles and have the renderer move them to the “right” place.
Member
710 posts
Joined: July 2005
Thank you for all the excellent suggestions, but I'm really confused as to how to implement the whole thing.
I'm using the Collision POP to get hitpos, hituv, and hituvpos; I'm not sure which ones I'll actually need. But I don't know their usage in VOPs: can I just import them with the Parameter VOP? They seem to react like a normal Parameter VOP when I plug them in here and there, so I'm doubtful the attributes have actually been successfully imported. Could this have something to do with the fact that those attributes only have values after a certain frame (once the particles hit the collision object)? And assuming I have the necessary attribute in VOPs, where in the network would it be used? In place of regular uv coords?
Another thing: for my instance geometry I have an Add SOP going into a Copy SOP, with the POP Merge going into the template input. This just yields a grid of points, which I later plan to instance the actual stone models to. For now, these points are what I actually want to displace with a VEX geo SOP, as it seems like the easiest method. I noticed that some VEX SOPs don't work on points alone, including a test SOP I made. For example, the standard VEX Mountain SOP has no effect, but if I PolyKnit some polygons on the grid, those areas suddenly become affected. The VEX Ripple SOP, on the other hand, works fine on points alone.
I have uploaded the test scene, if anyone cares to take a look at it.
Thank you.

http://www.geocities.com/stehrani3d/3dfiles/particle_displace.zip [geocities.com] (Right Click>Save Target) made in 6.1.149 btw
Member
941 posts
Joined: July 2005
Hey DaJuice,

Here's your hip, modified to use the method I suggested – very simple: a couple of copy/pastes and a few parameter drag'n'drops.

The only slight complication came from the fact that the “bumpnoise” vop you were using was originally restricted (for no good reason) to the displace and surface contexts only. So I just copied its contents and created a new vop-type without those restrictions. It's embedded as “MultiCtxtBump”.

I also rayed the points because it was faster to prototype, but that doesn't mean you can't stick to the POP system you set up.

Everything is tied to the shop Terrain. To drive the locations of the spheres, just change the parameters of that shop – everything else changes accordingly (including the visualization geo). So there's no need to touch anything else.

Here's your modified hipfile [members.rogers.com].

And here's a pic:



Cheers!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.
Member
710 posts
Joined: July 2005
Cheers Mario, you rock.
I updated the file slightly for anyone else who wants to check it out. Just deleted the old stuff from my previous attempt, plugged the particles back in, and used randomized geometry for the instances (props to aracid over at odforce for showing me how it works).

http://www.geocities.com/stehrani3d/3dfiles/particle_displace_updated.zip [geocities.com] (Right Click>Save Target)

Member
412 posts
Joined: July 2005
this has been an awesome thread you guys.. thanks

a couple questions though:

1. when the terrain grid is merged in to the pebbles obj, it gets triangulated.. is there a specific reason for this?

2. you transfer the uv attrib from the terrain grid to the rayed points.. why exactly does this need to be done?

3. Mario - i get a little confused with the dispSOP you created. basically i can't get my head around the imported rest attrib and the switch between that and the global point position and how that all interacts with your created MultiContextBump VOP. just having a hard time getting exactly everything going on with that and why you did all of it.. would love an explanation when you get the time. thanks.

everything else looks pretty sweet.. but i must ask one last thing:

this is great and all and works based off of the bumpnoise vop but what if i wanted to have this working based on my own displacement shader or even using an image for displacement? how could i convert that displacement information into geometry displacement (sop) so i could move points based off of that?

any chance an otl could be created so that you could throw down a “displace” sop and tell it to point to any displacement shader (procedural or not)? that would be sooo cool. :wink:

thanks again,
dave
Dave Quirus
Member
941 posts
Joined: July 2005
Hi Dave,

Most of the answers to your questions have to do with me ripping through the hipfile as fast as possible before leaving work that night… I was just curious to see if it would work – not trying to be neat at all. So I unfortunately left a lot of meaningless debris all over the place :shock:

deecue
1. when the terrain grid is merged in to the pebbles obj, it gets triangulated.. is there a specific reason for this?

DaJuice's original projection was done using a POP system. And since I was in a rush, I didn't want to deal with POPs, so I yanked out the POPs and rayed the points instead. And in my experience, the Ray SOP behaves a lot better with poly-triangles than with any other geometry type – so now I kind'a triangulate instinctively. Although it's *hardly* necessary for a flat grid… just old habit… disregard…

deecue
2. you transfer the uv attrib from the terrain grid to the rayed points.. why exactly does this need to be done?

As a prototype for a general method of transferring any attribute that might be needed for displacement in the final application. Not because uv is being used for anything at all (it isn't) – just a test. Also, I hadn't had the chance to test the AttributeTransfer SOP yet, so I was curious to see what it would do… very nice SOP! But this would be the wrong application for it, since we require exact (read: interpolated) values in this case, and the proximity method just wouldn't give you enough granularity, I think – awesome SOP nonetheless, but I'd probably end up using POPs like DaJuice to soak up the attributes… I didn't look too deep into that side of things though. (Displacement shaders don't typically need *that* many attributes to make them go…)

deecue
3. Mario - i get a little confused with the dispSOP you created. basically i can't get my head around the imported rest attrib and the switch between that and the global point position and how that all interacts with your created MultiContextBump VOP.

The original SHOP version had that functionality, so I just duplicated it in the SOP version, that's all. If “rest” is bound then it is used, otherwise it falls back to the global P. Again, nothing deeply significant here, just blind duplication of functionality (in case DaJuice had intended to use “rest” at some future point). The rest attribute is actually not being used at all in that test (nor do I think it should ever need to be – just pass the rest points directly as P and avoid the whole attribute issue, I'd say… we are in SOPs after all)… more debris; ignore.

deecue
just having a hard time getting exactly everything going on with that and why you did all of it.. would love an explanation when you get the time. thanks.

My fault. You're just getting confused with my sloppy editing job, that's all… apologies :oops:

deecue
this is great and all and works based off of the bumpnoise vop but what if i wanted to have this working based on my own displacement shader or even using an image for displacement? how could i convert that displacement information into geometry displacement (sop) so i could move points based off of that?

<EDIT>

In the same way that it was done for this little test: duplicate the functionality of your vexSHOP inside a vexSOP (assuming you have access to the code of the original SHOP, of course). Mechanically, there's no difference between pushing surface points along normals in a shader and pushing them in a SOP. You just have to make sure the spaces match, that's all.

The only tiny problem I can think of is that the displacement shader will likely (hopefully!) be filtered (antialiased), and the tools to do this are not available in the SOP context. But you could always get around it by modulating the amplitude of the displacement based on distance to the render camera and some constant fudge factor – that is, if this actually becomes an issue, which I doubt (i.e. height discrepancies should be hard to see by the time the filter kicks in at full). On the other hand, all solutions will likely have some complications in some situations.
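That distance fade could be sketched like this (illustrative Python; the function name and the reciprocal falloff shape are my own assumptions, not anything from the hipfile):

```python
# Sketch of fading SOP-level displacement amplitude with distance to the
# render camera, so far-away height discrepancies shrink roughly the way
# the renderer's filtering would smooth them out.

def faded_amplitude(point, cam, base_amp, falloff=0.01):
    """point, cam: [x, y, z]; falloff is the constant fudge factor."""
    dx = point[0] - cam[0]
    dy = point[1] - cam[1]
    dz = point[2] - cam[2]
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return base_amp / (1.0 + falloff * dist)
```

Points at the camera get the full amplitude; the fudge factor controls how quickly it rolls off with distance.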

</EDIT>

deecue
any chance an otl could be created so that you could throw down a “displace” sop and tell it to point to any displacement shader (procedural or not)? that would be sooo cool. :wink:


Hehehe

I sense a mighty od Effects Challenge™ in the making!!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.