Simulating the human eye


trygvewastvedt

I've been looking around for info on simulating the human eye (instead of a camera lens) in Maxwell but haven't found much. Of course it is to some extent subjective. Right now I'm trying to figure out appropriate settings for Simulens. Does anyone know the real meaning of the number in the sliders for scattering and diffraction? Or does the appropriate number here change based on the light? Any other thoughts on human eye simulation would be welcome. Thanks.


  • 1 month later...

I always wondered about this.

 

To me the biggest question seemed to be how to align the dynamic range of the eye to that of a camera, and then to 3D. According to Wikipedia, the human eye can achieve a dynamic range of 90 dB (I'd never expect that unit used for optics, but that's what's written there), which the supplied table equates to roughly 6.5 stops - seemingly less than some top CMOS sensors in current cameras.
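For what it's worth, the dB-to-stops conversion depends on which convention the dB figure uses. A minimal sketch assuming the 20·log10 amplitude convention common for sensor specs (one stop, i.e. a doubling of luminance, is about 6.02 dB) - notably this gives about 15 stops for 90 dB, so the table I read may be using a different convention:

```python
import math

def db_to_stops(db):
    # Assumes the 20*log10 convention for signal ratios, so one stop
    # (a doubling of luminance) corresponds to about 6.02 dB.
    ratio = 10 ** (db / 20.0)   # linear luminance ratio
    return math.log2(ratio)     # stops = log2 of that ratio

print(round(db_to_stops(90.0), 1))
```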

Now, how do you naturally tone-map a linear result generated in CGI to mimic human sight? Honestly, I don't know - I don't even know how to convert linear footage into a camera response correctly so that it matches 1:1 what cameras output. Some renderers let you load LUTs/color profiles of these response curves directly, but those just seem to be applied on top like a filter instead of remapping the linear curve to the camera's actual curve. The same LUT is also used for color grading real camera footage (from a digital CMOS), which is not linear, so the result is obviously not the same.
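Just to illustrate what a curve-based tone map does to linear values - this is only a sketch using the generic global Reinhard operator, not a model of any camera response or of human vision:

```python
def reinhard(linear, exposure=1.0):
    # Global Reinhard operator: compresses linear HDR values into [0, 1).
    # Purely illustrative; real camera response curves differ.
    scaled = linear * exposure
    return scaled / (1.0 + scaled)

for lum in (0.18, 1.0, 10.0, 100.0):  # linear radiance samples
    print(f"{lum:7.2f} -> {reinhard(lum):.3f}")
```

Notice how midtones stay roughly proportional while highlights are compressed toward 1.0 - that rolloff is the part a straight linear-to-display conversion misses.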

 

I would wager that only professional CGI researchers like Paul Debevec and his peers can clearly articulate an answer to this.


There are only so many parallels we can draw between human vision, optical cameras and virtual ones.

 

The human eye does not work like a CMOS sensor: it does not have a linear response across its surface (the retina), and there is no separating the eye's output stream from the processing the brain does with it. We do not see with our eyes; they are just sensors. We see with our minds.

 

I would also like to know about the spectral response of the retina. Remember film? Film came in a multitude of flavors, each with its own chemical tone mapping via its unique spectral characteristics.

 

I do think it is useful to look at the equivalent focal length and aperture of human vision when considering virtual imaging, but it only goes so far. As with our eyes, it's all in how the image is processed and interpreted. "Photoreal" isn't.
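On the equivalent-focal-length idea, the standard rectilinear relation between field of view and 35mm-equivalent focal length can be sketched like this - the ~55° horizontal field used below is just an assumed example for attentive vision, not a measured constant:

```python
import math

def equiv_focal_length_mm(hfov_deg, frame_width_mm=36.0):
    # f = w / (2 * tan(fov / 2)) for a rectilinear lens on a 36mm-wide
    # (full-frame) sensor. The chosen field of view is an assumption.
    half = math.radians(hfov_deg) / 2.0
    return frame_width_mm / (2.0 * math.tan(half))

print(round(equiv_focal_length_mm(55.0), 1))
```

Of course the eye is nothing like a rectilinear lens at the periphery, which is exactly why the parallel only goes so far.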


  • 2 weeks later...

Thanks for all the great info! This is all very helpful. I think Chris's last point is key: as far as field of view and distortion are concerned, the main problem is that you're trying to represent a 360° environment wrapping around your eyes in a flat image that typically occupies only a small portion of that environment. So, I need a 360° panorama viewed inside Oculus Rift goggles? Sounds good to me!
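A 360° panorama for an HMD viewer is usually stored as an equirectangular (lat-long) image; here is a minimal sketch of the direction-to-pixel mapping, with the caveat that the axis conventions below (+y up, -z forward) are an assumption and renderers differ:

```python
import math

def dir_to_equirect_uv(x, y, z):
    # Map a unit direction vector to equirectangular UV in [0, 1].
    # Convention assumed: +y up, -z forward; u wraps horizontally,
    # v runs from top (y = +1) to bottom (y = -1).
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = 0.5 - math.asin(y) / math.pi
    return u, v

print(dir_to_equirect_uv(0.0, 0.0, -1.0))  # straight ahead -> image center
```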

 

@Chris: yes, that formula would be great if you can find it.

 

Any Maxwell users out there have thoughts about accurate Simulens settings for the human eye?

