HDR Displays at Siggraph


Originally posted by Timmatron:

At the risk of sounding ignorant...what good will these do when there is no such thing as HDR paper to print on?

 

And is this going to kill your eyes? Do you need sunglasses to work?

You are thinking like an architect... which is fine, but soon, paper will not be all that important.

 

This display, which I saw, was indeed very exciting, and surprisingly simple technology.

 

Your monitor today can only display 8 bits per channel

 

Film is equivalent to 12 bits per channel

 

This display was equivalent to 16 bits per channel.
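To put those three figures side by side: n bits per channel gives 2**n distinct levels per channel, so the jump from monitor to film to this display is bigger than it sounds.

```python
# Distinct levels per color channel at each bit depth: 2**bits
levels = {bits: 2**bits for bits in (8, 12, 16)}
print(levels)  # {8: 256, 12: 4096, 16: 65536}
```

Each extra bit doubles the number of levels, so 16 bits gives 256 times as many steps as 8 bits.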

 

Imagine showing your client an "animation" of their new building, not only in High Definition, but also in High Dynamic Range... WOW!!! How cool would THAT be? It would be like Greg said... walking outside and seeing your building. And no, it will not kill your eyes...

 

BTW, you can't "print" an animation.


quote:
...what good will these do when there is no such thing as HDR paper to print on?

Your monitor today can only display 8 bits per channel

Film is equivalent to 12 bits per channel

This display was equivalent to 16 bits per channel.

When Photoshop 7 came out and I--and others--started hounding them about WHY there was such limited support for 16-bit/channel editing, one of their developers actually wrote in a public newsgroup "well it's not like there are any 16 bit output devices". There are still dents in my wall from beating my head against it.
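The case for 16-bit editing even when the output device is 8-bit can be demonstrated with a toy round trip (a sketch, not anyone's actual workflow): darken a channel to 25% and re-brighten it 4x, rounding the intermediate value at different precisions, and count how many of the 256 output levels survive.

```python
def edit_levels(edit_bits):
    """Darken an 8-bit channel to 25% and re-brighten it 4x, rounding
    the intermediate value to edit_bits of precision; count how many
    of the 256 possible output levels survive the round trip."""
    scale = (2**edit_bits - 1) / 255          # map 8-bit range to edit range
    survivors = set()
    for v in range(256):
        x = round(v * scale * 0.25)           # darken, stored at edit_bits
        y = min(round(x / scale * 4), 255)    # brighten, back to 8-bit
        survivors.add(y)
    return len(survivors)

print("8-bit editing keeps", edit_levels(8), "of 256 levels")
print("16-bit editing keeps", edit_levels(16), "of 256 levels")
```

Editing at 8 bits posterizes the image (most levels are destroyed by the intermediate rounding), while the 16-bit intermediate preserves all 256 output levels.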

 

First off, I am using a grayscale printing system that CAN squeeze more resolution out of the luminance data than just 256 levels. And if these monitors can do +/- 16 bits, it would be REALLY NICE if Photoshop could be used for editing images to display.

 

Film has about 4K levels of luminance per light primary? Would that be grain-based or dye-based (KodaColor vs. KodaChrome)? CG work for film transfers is often done in 12-bit to 16-bit color to get the most out of the medium.

 

Now, to say our monitors are 8-bit suggests that they are digital devices, which they are not (CRTs anyway), but the graphics cards are--that's why they have DACs on 'em. I wonder if it's possible to drive a 'regular' monitor with a better signal from a 12-bit DAC?

 

As usual I am pushing the edge of my technical knowledge, and I know some of you exceed me, so correct if I need correcting.


Yeah I'm obviously missing something here. All this talk of a brighter monitor, more bits etc. sounds great. But I don't really think I could understand it unless I saw it.

 

Sure wish I was there at Siggraph!

 

Another question though. Let's say you do have this wonderful animation in HDR, and the client is speechless - then what happens when they ask you to put it on disk to show all of their people? Would there be a huge loss? I mean, they would have to stand at your desk to see it correctly, right?

 

Is this so huge that everyone will have one soon? Like the cassette to CD? Or is it something that's nice, but will take a super long time to catch on - like... I dunno... XM radio?


This leap would be more like vinyl to DVD.

 

Yes, showing work on something other than an HDR display (if the work was of the same data rate as HDR) would degrade the quality.

 

How much? It's easy to give numbers, but since nobody really has HDR displays, or has even worked with them... who knows.
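One way to picture the degradation (a hypothetical illustration, not the pipeline anyone in the thread actually used): squeezing HDR luminance down to an 8-bit display means compressing an unbounded range into 256 steps, for instance with a simple Reinhard-style tone map.

```python
def tonemap_to_8bit(luminance):
    """Compress unbounded HDR luminance into [0, 1) with the global
    Reinhard operator, then quantize to one of 256 display levels."""
    compressed = luminance / (1.0 + luminance)  # Reinhard: x / (1 + x)
    return int(compressed * 255 + 0.5)          # round to an 8-bit level

# A 100:1 difference in scene luminance collapses into a narrow band
# of display levels near the top of the 8-bit range.
print(tonemap_to_8bit(0.5), tonemap_to_8bit(50.0))
```

Everything bright ends up crowded into the last few of the 256 levels, which is exactly the detail an HDR display would keep.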

