Re: Information re. display gamma
- From: Graeme Gill <graeme2 argyllcms com>
- To: gnome-color-manager-list gnome org
- Subject: Re: Information re. display gamma
- Date: Sat, 06 Feb 2010 13:10:48 +1100
Milan Knížek wrote:
> Gamma is adjusted to affect the color of the attached monitor.
> Traditionally Linux has used a gamma value of 1.0, but this makes
> monitors look washed out compared to Windows XP or OS X. Apple
Sounds bogus. Unix machines have "traditionally" used RGB encoded
for display on a typical native CRT gamma. Linux is no different -
"traditionally" raw RGB numbers have been sent to the monitor, and
you get the native CRT gamma, or an emulation of it if it's an LCD
monitor, so the encoding is notionally set for a gamma of 2.2. But
there are exceptions - SGIs had a different scheme, and people
creating synthetic images (CGI) have often left their images in
linear light, expecting the display process to counteract the
display's natural gamma, even though this introduces quantization
problems if they use 8 bits.
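To illustrate that quantization point (a rough sketch, not figures from the
original discussion): with 8-bit linear-light encoding only a handful of code
values cover the deepest shadows, while a gamma 2.2 encoding spreads the codes
far more evenly over perceived lightness.

    # Rough illustration (assumed 1% threshold): count how many 8-bit codes
    # fall in the darkest 1% of linear light for each encoding.
    codes = range(256)
    linear_dark = sum(1 for c in codes if c / 255.0 <= 0.01)
    gamma_dark = sum(1 for c in codes if (c / 255.0) ** 2.2 <= 0.01)
    print(linear_dark, gamma_dark)   # roughly 3 vs. 32 code values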
> traditionally used a value of 1.8 for a long time, but now use the same
> value as Microsoft. Microsoft has always used a value of 2.2.
Note a couple of things:
The latest OS X (10.6 "Snow Leopard") has a default of 2.2, but
it seems that the "1.8" of the Mac prior to this is actually a misnomer:
the encoding is closer to 1/1.72, which is corrected by a framebuffer
ramp of 1/1.45 for display on a CRT with a gamma of 2.5, giving an overall
system gamma of 1.0.
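As a quick check of those figures (illustrative only, since the exact numbers
depend on things like the black level), the chain multiplies out to roughly
unity:

    # Overall system gamma of the old Mac chain described above.
    encoding = 1.0 / 1.72   # effective encoding gamma
    ramp = 1.0 / 1.45       # framebuffer correction ramp
    crt = 2.5               # CRT native gamma
    print(encoding * ramp * crt)   # ~1.0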
Real CRTs have a gamma nearer 2.4 - 2.5 (although such numbers
are very sensitive to the setting of the black level).
"Traditionally", from the TV world, material is broadcast with a gamma
of 0.45 (i.e. 1/2.22) with the expectation that it will be displayed
on a monitor that has a gamma of 2.5; the discrepancy applies a viewing
conditions transform that accounts for the different appearance
between the bright TV studio lighting and the usually dim viewing
conditions used with a CRT monitor. For all the details please refer
to Charles Poynton's book "Digital Video and HDTV".
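Numerically that broadcast chain comes out a little above 1.0, which is the
dim-surround boost just described (a sketch using the round figures above):

    # End-to-end gamma of the broadcast chain: camera encoding * CRT decoding.
    camera_encoding = 0.45   # i.e. roughly 1/2.22
    crt_gamma = 2.5          # a typical real CRT
    print(camera_encoding * crt_gamma)   # ~1.125 overall system gamma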
> Linux tools - like xgamma and gcm - instead give the number which is
> applied to the LUT only. Note that the final viewing gamma is equal to
> the monitor's native gamma divided by the LUT gamma.
Right, and the final overall gamma should be 1.0, or (allowing for
the viewing condition transform) about 1.1 - 1.2.
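A minimal sketch of that relationship, assuming the usual convention where
asking a tool like xgamma for gamma G loads a 1/G correction curve into the
video card LUT (the specific numbers are purely illustrative):

    # The LUT holds x**(1/G); the monitor then applies x**native, so the
    # viewer sees x**(native/G), i.e. native gamma divided by LUT gamma.
    native_gamma = 2.5   # what the monitor (or its LCD emulation) actually does
    lut_gamma = 1.1      # the number handed to a tool like xgamma
    x = 0.5              # an arbitrary input level
    through_chain = (x ** (1.0 / lut_gamma)) ** native_gamma
    print(through_chain, x ** (native_gamma / lut_gamma))   # identical: ~0.207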
> I would propose to change the text so that it is clear that the user can
> correct the monitor's native gamma and that it is not the final viewing
> gamma.
> Or am I wrong?
Most people are unfamiliar with the concept of overall system gamma. They
might easily confuse the encoding gamma, display gamma and system gamma.
The distinction between encoding gamma and display gamma is probably
lost on most people too (strictly speaking, the encoding gamma for
"2.2" is 1/2.2, which is about 0.45, and the overall gamma is the
product of the encoding gamma and the display gamma).
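Or, as a worked instance of that product (the 2.4 figure is just the
dim-environment example used further down):

    # Overall system gamma = encoding gamma * display gamma.
    encoding_gamma = 1.0 / 2.2   # what "2.2" material is actually encoded with (~0.45)
    display_gamma = 2.4          # e.g. a CRT calibrated for a dim environment
    print(encoding_gamma * display_gamma)   # ~1.09 overall system gamma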
I think that if this is a display calibration application, then
really it should talk about what the display is being set to, and
could indicate how this corresponds to the encoding expectation.
So typically a CRT display in a dim viewing environment should
be calibrated to have a gamma of about 2.4, with the expectation
that it is being fed material that has been encoded with a gamma
of 1/2.2. If the display is brighter and in a brighter environment,
then a calibration gamma of < 2.4 might be desirable, for the same
2.2 gamma encoded material.
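Running the same product the other way gives the calibration target for a
given environment (the target system gammas here are only the rough figures
mentioned earlier, not recommendations):

    # Solve for the display calibration gamma, given gamma 2.2 encoded
    # material and a target overall system gamma for the viewing environment.
    encoding_gamma = 1.0 / 2.2
    for surround, target_system_gamma in [("dim", 1.1), ("bright", 1.0)]:
        display_gamma = target_system_gamma / encoding_gamma
        print(surround, round(display_gamma, 2))   # dim ~2.42, bright ~2.2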
Graeme Gill.