Solaris Camera to Obj Camera: Getting the same view

User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
I've come across a potential RFE which I'm not sure would really be useful or desired, except maybe in specific cases.

In my case I was playing around with lighting and materials in LOPs, at a certain ‘stage’ (non-USD meaning) of my project.

But for additional objects and their layout, it's preferable to work at the OBJ level to get the new elements where I want them.

The issue is that I want the same camera settings I had in LOPs (how the image is ‘framed’) to carry over to the OBJ level.

With the LOP camera one can re-use the same parameter values of ‘perspective’ and ‘focal length’. But that's where the similarities end.

LOP cameras have a Horizontal Aperture and a Vertical Aperture. OBJ cameras have a resolution and a single Aperture value.

One can't really map the LOP Aperture settings to the OBJ resolution because the OBJ resolution only takes whole integer values.

Plus, even if they were floats, the question would be: which LOP aperture to use? Horizontal, Vertical, or some interpolated value?

Seems like it's not possible.

The short answer I guess would be - if one is going to render through Solaris/LOPs, just do the layout there.

It would be good to have some option for direct camera mapping between the two contexts, so one could work in either context and still get the same ‘framed’ result for the final render.
Edited by BabaJ - Feb 9, 2020 12:39:58

Attachments:
Solaris Camera to Obj Camera.jpg (158.9 KB)

User Avatar
Member
7805 posts
Joined: Sep 2011
Offline
I don't understand. The translation is relatively trivial. There are even nodes for converting one to the other.

LOPs to OBJ:

Aperture = Horizontal Aperture
ResY = ResX * Vertical Aperture ÷ Horizontal Aperture

OBJ to LOPs:

Horizontal Aperture = Aperture
Vertical Aperture = Horizontal Aperture * ResY ÷ ResX
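
For anyone who wants to drop this into a quick script, here is a minimal Python sketch of the same arithmetic (the function names are just for illustration; ResX is whatever render width you choose, and a square pixel aspect is assumed):

def lop_to_obj(h_aperture, v_aperture, res_x=1920):
    # OBJ aperture is the horizontal aperture; ResY follows the aperture aspect ratio.
    aperture = h_aperture
    res_y = round(res_x * v_aperture / h_aperture)
    return aperture, res_x, res_y

def obj_to_lop(aperture, res_x, res_y):
    # Horizontal aperture carries over; vertical aperture follows the resolution aspect ratio.
    h_aperture = aperture
    v_aperture = h_aperture * res_y / res_x
    return h_aperture, v_aperture

# Example: a 20.955 x 11.787 mm (roughly 16:9) LOP filmback, rendered 1920 wide
print(lop_to_obj(20.955, 11.787, 1920))   # -> (20.955, 1920, 1080)
print(obj_to_lop(20.955, 1920, 1080))     # -> (20.955, 11.787...)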
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
Thanks jsmack.

Yes the application of what you've given is trivial.

Could be done slightly differently - for LOPs to OBJ:

Just use the corresponding Horizontal Aperture mapped to Resolution X, and the Vertical Aperture mapped to Resolution Y.

Then whichever of the Horizontal/Vertical Apertures is larger, use that value for the OBJ Aperture.

However, arriving at this in the first place is not trivial, other than through trial and error to get an understanding of the relationship between the aperture parameters (either LOPs or OBJ).

Coming from a photography background, the aperture settings in Houdini have no correspondence to what happens when you adjust the aperture on a real camera. This is what confuses me and makes it hard to understand at face value what the parameter really represents (Aperture in OBJ, or Horizontal/Vertical Aperture in LOPs).

But again, thanks for the tip in giving me a working framework to go from.

Edit: What are these nodes you speak of that do the conversion of one camera to another (if that's what you meant)?
Edited by BabaJ - Feb 9, 2020 14:57:11
User Avatar
Member
7805 posts
Joined: Sep 2011
Offline
BabaJ
Coming from a photography background, the aperture settings in Houdini have no correspondence to what happens when you adjust the aperture on a real camera. This is what confuses me and makes it hard to understand at face value what the parameter really represents (Aperture in OBJ, or Horizontal/Vertical Aperture in LOPs).

The term aperture is a cinema term; it refers to the size of the metal window in front of the film. Changing the size of the aperture is the same as changing the film format. The ‘fstop’, as it's called in Houdini, refers to the lens aperture, which is maybe what you are thinking of.

BabaJ
Just use the corresponding Horizontal Aperture mapped to Resolution X, and the Vertical Aperture mapped to Resolution Y.

I suppose you could do that, but then the resolution of the render will probably be very, very small, and it will also most likely truncate your precision, resulting in a different aspect ratio. I would use a real resolution, such as HD or a resolution corresponding to the plates, if any, and select a real film back size (aperture) from a physical camera.

BabaJ
Edit: What are these nodes you speak of that do the conversion of one camera to another (if that's what you meant)?

Scene Import (Cameras) (obj -> lop)
LOP Import Camera (lop -> obj)
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
The term aperture is a cinema term; it refers to the size of the metal window in front of the film. Changing the size of the aperture is the same as changing the film format. The ‘fstop’, as it's called in Houdini, refers to the lens aperture, which is maybe what you are thinking of.

You got it right up to ‘size of the metal window in front of the film’; except I'm not sure what you mean by film format.

However, in real life a camera's aperture is controlled by the ‘f-stop’; together with the ‘shutter speed’, both contribute towards how much exposure the film receives, and each has its own additional contribution. The aperture's f-stop also determines depth of field, which can be used as a blurring technique or, conversely, to control how much of the scene is kept in focus. The shutter speed can be used as a blurring technique as well, but of a different kind: blur caused by motion.

So in ‘real life’ the aperture is controlled by the ‘f-stop’; there is no ‘aperture’ control per se, other than the f-stop and shutter speed. In other words, having a parameter called ‘Aperture’ (in Houdini) is nonsensical in terms of representing what it does and how it is controlled (in real life).

That's because the f-stop in Houdini is taken out of the camera and put in lights to contribute towards exposure, and doesn't contribute towards the Depth Of Field.

The Aperture in Houdini controls neither exposure nor depth of field (as far as I know, since I haven't tried to get depth-of-field effects like a real-life camera in Houdini).

So in real life the aperture opening has the inseparable effect of contributing towards both exposure and depth of field.

Lower f-stops (more open) result in less depth of field, and the converse.

In either case, the resulting image size does not change.

Yet in Houdini, this Aperture parameter actually changes the size of the image, acting like a zoom/cropping tool. Zoom is not a good term, because a zoom on a camera lens (that is, a zoom lens that has the capacity for different focal lengths) is essentially only changing the focal length, the net result being a change in image composition.

And yet in Houdini we already have a focal length parameter, so the size of the image can be set in two ways: the focal length parameter and the Aperture parameter, leaving aside the resolution settings (which also change the image size).

Cropping is not a good analogy either, since that implies less than the original, but the Aperture parameter can make the image area larger as well.

So this Aperture parameter does this zoom/cropping and, as useful as it is, has no real-world counterpart to relate to.

I suppose you could do that, but then the resolution of the render will probably be very, very small, and it will also most likely truncate your precision, resulting in a different aspect ratio.

Then with the ‘expression’ you gave me, yours would have the same result. I am simply doing what you said manually; your expression gives the same results, so long as I choose the larger of the two values (Horizontal vs. Vertical LOP aperture) for the OBJ Aperture. The only exception being that you didn't mention what to use for the initial ‘ResX’ of your expression, for which I assumed the LOP Horizontal. And again, all that gets switched out if the other is larger.

and select a real film back size (aperture) from a physical camera.

As I mentioned before, there is no ‘real film back size’. In real life the size of an aperture's opening is controlled by f-stops, and the end result does not change the size of the image.

Thanks for the tip on the nodes. The LOP Import Camera node works well.
Edited by BabaJ - Feb 10, 2020 12:25:20
User Avatar
Member
711 posts
Joined: Jul 2005
Offline
The Aperture setting does not equate to the lens aperture (as in iris blades that stop up/down). It refers to the size of the film back/sensor. The film/sensor size determines the crop factor. For example, a 50mm lens will have a different field of view when projecting onto a full-frame sensor (36×24mm) compared to a crop sensor (23.6×15.6mm). The smaller your film/sensor size is, the greater the “zoom” or crop factor applied to the image for a given focal length.
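
For reference, that “crop” falls straight out of the standard pinhole angle-of-view relation, fov = 2 * atan(aperture / (2 * focal)). A small Python sketch of it, using the sensor widths mentioned above (the function name is just for illustration):

import math

def horizontal_fov(focal_mm, aperture_mm):
    # Horizontal angle of view for a given focal length and filmback/sensor width.
    return math.degrees(2 * math.atan(aperture_mm / (2 * focal_mm)))

print(horizontal_fov(50, 36.0))    # ~39.6 degrees on a full-frame width
print(horizontal_fov(50, 23.6))    # ~26.6 degrees on the crop-sensor width (the "zoom" effect)
print(50 * 36.0 / 23.6)            # ~76mm full-frame-equivalent focal length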
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
The smaller your film/sensor size is, the greater the “zoom” or crop factor applied to the image for a given focal length.

Yes it's ‘good’ that you put zoom in quotes, because doing what you describe is not an actual zoom in terms of real camera/film results.

The better term, which you use, is crop factor. Maybe up-cropping/down-cropping? Or whatever other term.

Otherwise, in addition to aperture, the term zoom doesn't have the same meaning and effects either (in Houdini or most other software packages vs. zoom with a real lens).

With a real camera/film, zooming also has the effect, in addition to the ‘apparent’ change of cropping factor, of compressing or expanding object sizes relative to their distances from each other. I.e.:

If you take a shot of the moon that also has some buildings in the foreground with a 1000mm lens, you will get a much larger moon relative to the buildings. However, if you take the same shot with a 50mm lens, no matter how much you ‘tinker’ to try and get approximately the same final image with ‘cropping’ (changing your image/film plate size), you will never get the same image. You will never get the same object sizes relative to each other (in this example the moon compared to the buildings, comparing the two contexts: one with the 50mm lens, the other with the 1000mm lens).
User Avatar
Member
4 posts
Joined: Jan 2018
Offline
BabaJ
Yes it's ‘good’ that you put zoom in quotes, because doing what you describe is not an actual zoom in terms of real camera/film results.

No that's exactly what it is. Zoom is changing the size of the projection. It is equivalent to changing the size of the imaging device. It is expressed in Houdini in terms of focal length and aperture, since these are the values that correspond to the real world. A zoom value would be just as valid; however, since it is unitless, it would make relating it to physical camera and lens sizes impossible.


BabaJ
With a real camera/film, zooming also has the effect, in addition to the ‘apparent’ change of cropping factor, of compressing or expanding object sizes relative to their distances from each other. I.e.:

This is not the effect of the lens, but the effect of distance to subject, and the relative distances of subject to subject. Changing the lens or film size only changes the angle of view. The captured image is invariant to focal length or cropping, the same as cropping an image in Photoshop.
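
For what it's worth, a quick pinhole-projection sketch of that point (the sizes and distances below are only rough, illustrative numbers):

def projected_height_mm(object_height, distance, focal_mm):
    # Pinhole projection: image height on the filmback scales with focal * height / distance.
    return focal_mm * object_height / distance

MOON_DIAMETER, MOON_DISTANCE = 3_474_000, 384_400_000   # metres, approximate

# A 30 m building at 100 m vs. the moon, for two different lenses:
for focal in (50, 1000):
    ratio = (projected_height_mm(MOON_DIAMETER, MOON_DISTANCE, focal)
             / projected_height_mm(30, 100, focal))
    print(focal, round(ratio, 4))   # ~0.0301 for both focal lengths

# Back the camera up to 1000 m from the building and the moon becomes
# ten times larger relative to it: camera position, not the lens, changed the ratio.
print(round(projected_height_mm(MOON_DIAMETER, MOON_DISTANCE, 50)
            / projected_height_mm(30, 1000, 50), 4))    # ~0.3012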


To get a better understanding, read up on the basics of field of view and imaging.

https://en.wikipedia.org/wiki/Angle_of_view [en.wikipedia.org]
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
jonathan.mack

Sorry jonathan, you don't understand what I am talking about. Maybe this link will get you started in the right direction.

https://petapixel.com/2018/07/17/is-lens-compression-fact-or-fiction/ [petapixel.com]
User Avatar
Member
4 posts
Joined: Jan 2018
Offline
The article in the link is in agreement with my statements and illustrates my point well.

Are we on the same page yet?
User Avatar
Member
8595 posts
Joined: Jul 2007
Offline
zoom, crop, aperture, focal length are more or less ultimately changing the same thing, FOV

but more important question is:
is jsmack and jonathan.mack the same person?
Tomas Slancik
FX Supervisor
Method Studios, NY
User Avatar
Member
7805 posts
Joined: Sep 2011
Offline
tamte
but more important question is:
is jsmack and jonathan.mack the same person?

Ludicrous!
User Avatar
Member
8595 posts
Joined: Jul 2007
Offline
jsmack
tamte
but more important question is:
is jsmack and jonathan.mack the same person?

Ludicrous!
That's what I thought
Tomas Slancik
FX Supervisor
Method Studios, NY
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
No that's exactly what it is. Zoom is changing the size of the projection. It is equivalent to changing the size of the imaging device.

The article in the link is in agreement with my statements and illustrates my point well.

Zoom indeed does change the size of the projection. But my point is that with a real-life lens that is not the only factor at play (in what you can and cannot get as the final image).

But to add for clarity, as you still don't understand what I am saying:

Take a scene shot with a camera using a 35mm lens, then another shot using a 1000mm lens (the same could be done with one zoom lens if it had such a wide range capability).

Now try, with whatever means you want, either an actual physical alteration of the film/sensor plane medium or software afterwards (e.g. Photoshop, Houdini's ‘Aperture’ parameter, etc.):

Try and crop that 35mm shot to have the same image as the 1000mm shot.

What you will find is that, although you can accomplish the same ‘field of view’ through the ‘cropping’, the resulting image will never be the same, due to compression, as the link I provided illustrates.

So although your link does indeed illustrate your point well, you and your link are incomplete, since the factor I am talking about has been left out of the equation. That factor is important to what I was suggesting earlier: that cropping an image is not the same as zooming with a real camera lens. Both methods may affect the field of view, but they do not give the same result, as they do not share the factor of ‘distortion’ (as per the link).

Are we on the same page yet?

Yes good question, are we?
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
zoom, crop, aperture, focal length are more or less ultimately changing the same thing, FOV

Aperture in Houdini indeed does change FOV. However, with a real camera, changing the aperture does not change the FOV whatsoever.

It only changes the depth of field, i.e. what is and what is not in focus (blurring, not to be confused with blurring caused by motion), plus it contributes towards exposure as well.
User Avatar
Member
274 posts
Joined: Nov 2013
Online
BabaJ, it's unfortunate, but particularly on CG cameras it is common for the term aperture to be overloaded. You are referring to the lens aperture, but aperture in this context relates to the filmback/sensor/screenwindow.
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
BabaJ, it's unfortunate, but particularly on CG cameras it is common for the term aperture to be overloaded. You are referring to the lens aperture, but aperture in this context relates to the filmback/sensor/screenwindow.

Yes, somewhere along the line that's what I was thinking has happened.

What I mean is that CG software like Houdini has closer ties to cinematic filmmaking than to still photography.

And I'm sure a long time ago the term ‘aperture’ morphed/entered the lingo on production sets to mean what you say. Thanks for that confirmation.
Edited by BabaJ - Feb 12, 2020 12:15:04
User Avatar
Member
274 posts
Joined: Nov 2013
Online
Btw, if you look up the definition of aperture, it is not specific to photography and simply means ‘an opening, hole, or gap’. Therefore the filmback/sensor on a camera (real or CG) qualifies, as it is the size of the rectangular opening/hole capable of storing the incoming signal.
User Avatar
Member
8595 posts
Joined: Jul 2007
Offline
BabaJ
Aperture in Houdini indeed does change FOV. However, with a real camera, changing the aperture does not change the FOV whatsoever.

right, same word different meaning, no relation
I agree that it's confusing, but that's what the sensor size currently is called
Edited by tamte - Feb 12, 2020 12:17:24
Tomas Slancik
FX Supervisor
Method Studios, NY
User Avatar
Member
2042 posts
Joined: Sep 2015
Offline
Btw, if you look up the definition of aperture, it is not specific to photography and simply means ‘an opening, hole, or gap’. Therefore the filmback/sensor on a camera (real or CG) qualifies, as it is the size of the rectangular opening/hole capable of storing the incoming signal.

Ah, but that hole/opening sits ‘off’ the image plane, and its opening size does not affect the size of the image that results from that opening, whether on a still or a cinematic camera. So it really can't technically ‘qualify’ as being the same thing in terms of results.

Just to reiterate, changing the aperture (the hole opening size in this case) does not change the ‘cropping’ or ‘zoom’ or final ‘size’ of the image.
Edited by BabaJ - Feb 12, 2020 12:21:41