Understanding Gamma

Display gamma is a numerical expression that describes the relationship between the signal input and the light output of a display device. As you increase signal intensity, displays do not produce linear increases in light output; the relationship between signal input and light output is non-linear. To correct for this, a reciprocal non-linearity is applied at the production stage, typically referred to as camera gamma. The combination of these two opposing non-linear luminance curves (camera gamma at the production end and display gamma at the device end) results in a linear system gamma of 1.0, which is what we want. However, when viewing material in a dim environment, it is generally considered desirable to have a system gamma that is slightly higher; somewhere between 1.1 and 1.2 is the most often quoted figure. Assuming a Rec. 709 camera encode gamma of 0.51, this means that display gamma should be in the 2.2-2.35 range, since system gamma is the product of camera gamma and display gamma (0.51 x 2.2 is roughly 1.1, and 0.51 x 2.35 is roughly 1.2). But what does this mean in practice?

For an idea of what different display gammas provide, see the chart below. These numbers are calculated using a simple power law: the output at any given video level, expressed as a fraction of the luminance at 100% video, is the input level (as a fraction of full scale) raised to the power of the gamma. For example, assuming a 2.2 gamma, an 80% input produces 0.8^2.2 = 0.6121, or 61.21% of peak luminance. So to calculate the desired output at any video level, all you need to know is the measured luminance at 100% and the desired gamma.
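As an illustration, the short Python sketch below (not part of the original article; the function name and the particular gamma values are only examples) reproduces the kind of numbers shown in the chart by applying the power law just described:

```python
# Power-law gamma: relative output = input ** gamma,
# where input is the video level as a fraction of full scale (0.0 - 1.0).

def power_law_output(input_level: float, gamma: float, peak_luminance: float = 100.0) -> float:
    """Expected luminance at a given input level for a pure power-law gamma."""
    return (input_level ** gamma) * peak_luminance

# Tabulate relative output (as % of peak) for a few gammas at 10% steps.
gammas = [2.0, 2.2, 2.4, 2.8]
print("Input   " + "   ".join(f"g={g}" for g in gammas))
for step in range(0, 11):
    level = step / 10
    row = "  ".join(f"{power_law_output(level, g):6.2f}" for g in gammas)
    print(f"{level:5.0%}  {row}")

# e.g. at 80% input with gamma 2.2: 0.8 ** 2.2 = 0.6121, i.e. 61.21% of peak.
```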
As you can see, the various gammas all begin and end with a one-to-one relationship between input and output. Zero input produces zero output (actually, because of the display's residual black level, it is really just the minimum amount of light the display produces, not literally zero) and maximum input produces maximum output. This is as you would expect. However, the precise relationship between input and output as you gradually increase the input above 0% is not linear, and it varies depending on the gamma. A display with a lower gamma increases its light output more quickly as you increase the signal input. If you look at the 10% input, you will see that a 2.8 gamma produces only about 16% of the light output of a 2.0 gamma (0.1^2.8 divided by 0.1^2.0 is roughly 0.16). This is an enormous difference. The difference becomes less and less significant as the input rises, so that at 80% input a display with a 2.8 gamma produces nearly 84% of the output of a display with a 2.0 gamma, which would be barely noticeable. For this reason, the primary effect of gamma on image quality is at the low end of the video scale, where it most obviously affects shadow detail and black levels. If the gamma is too low, you will get great shadow detail, but black levels will be noticeably elevated and contrast will suffer. If you raise gamma too high, you will get deep, dark blacks, but with compromised shadow detail.

Setting gamma correctly

There are several myths about how to properly set gamma. For example, Rec. 709, the high-definition standard, includes an encoding specification but no decoding specification, so it provides no help in setting display gamma properly. sRGB, the standard for computer monitors (which incidentally uses exactly the same gamut and white point as Rec. 709), does have a complete gamma specification. It is sometimes claimed that sRGB recommends a display gamma of 2.2. This is not quite correct. Although the sRGB display gamma averages out near 2.2, it actually specifies a higher gamma at the top end of the video range and a much lower gamma at the bottom. In general this is a good approach, but sRGB is intended for viewing conditions considerably brighter than the typical home theater environment, where lighting tends to be low or the room is completely dark.

Others argue that one should use a display gamma of 2.4, especially on a high-contrast display. This stems partly from the fact that Rec. 709 implies a 2.4 display gamma, and that many professional studio environments reportedly use 2.4 when mastering content for Blu-ray release. This is also not quite right: calibrating a display to a straight power curve of 2.4 only results in substantially reduced shadow detail and an unnaturally "contrasty" quality to the image.

BT.1886 Gamma

The correct approach is suggested by the sRGB standard and has fairly recently been codified in a newer gamma specification, BT.1886, which uses 2.4 as a starting point but adjusts the overall response curve depending on the black level and white level of the display. Like sRGB, BT.1886 recommends a gamma response that is higher at the top end of the range than at the low end. A straight power curve of 2.4 is correct only if the display has a zero black level and an infinite contrast ratio, which no real-world display has. The full BT.1886 specification is complex, and its precise recommendations vary depending upon the white level, and especially the black level, of the display. However, if you don't want to bother with a precise BT.1886 calculation, white/black values of 120/0.03 cd/m² serve as a good rule of thumb. This results in a gamma response of roughly 2.3-2.4 at the top end and 2.1-2.2 at the low end.
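To show how the measured white and black levels feed into the curve, here is a small sketch of the BT.1886 EOTF (my own illustration, not from the article; the function name is arbitrary, and the 120/0.03 cd/m² defaults are simply the rule-of-thumb values mentioned above):

```python
import math

# BT.1886 EOTF: L = a * max(V + b, 0) ** 2.4,
# where a and b are derived from the display's measured white (lw) and black (lb) levels.

def bt1886_luminance(v: float, lw: float = 120.0, lb: float = 0.03, gamma: float = 2.4) -> float:
    """Luminance in cd/m^2 for a normalized video signal v in [0, 1]."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# Effective "point gamma" at a few stimulus levels, relative to peak white.
for v in (0.1, 0.2, 0.5, 0.8, 0.9):
    relative = bt1886_luminance(v) / bt1886_luminance(1.0)
    print(f"{v:.0%}: point gamma = {math.log(relative) / math.log(v):.2f}")
```

With those white/black values the printed point gammas run from roughly 2.1 near black to about 2.3 near white, consistent with the rule of thumb above.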
Gamma and degamma

To develop specifications for a video calibration system, you need to know how to calculate not only gamma, but degamma as well. Degamma is simply the process of removing gamma, and there are different formulas for it. For a straight power-law gamma, degamma is just the reciprocal of the gamma: 1/gamma. BT.1886 degamma assumes the reciprocal of 2.4. sRGB uses a more complicated formula for removing gamma, but in practice it ends up being almost identical to the reciprocal of 2.2.

Degamma is important for two reasons. First, the colors we typically start with are specified in xyY, which has gamma by definition. We want to know the linear RGB equivalent of that xyY color. To calculate it, we convert the xyY color to R'G'B', which is gamma-weighted RGB, and then remove the gamma to get linear RGB. Unlike most color spaces, RGB has a non-linear version that includes gamma and a linear version that does not. Linear RGB is important for a number of reasons, but the most obvious is that it forms the basis of test patterns. To know the 8-bit (16-235) or 10-bit (64-940) test pattern for a specific xyY color, you have to calculate the linear RGB value of that color, from which the RGB triplet test pattern is derived.
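The sketch below illustrates one version of that pipeline. It is not taken from the article: the XYZ-to-RGB matrix assumes Rec. 709/sRGB primaries with a D65 white point, the encode step assumes a simple 1/2.4 power, and the function names are my own, so treat it as an example rather than a reference implementation.

```python
# xyY -> XYZ -> linear RGB -> gamma-weighted R'G'B' -> 8-bit video-level triplet (16-235).

def xyY_to_XYZ(x: float, y: float, Y: float):
    """Convert CIE xyY to XYZ (Y is relative luminance, normally 0 - 1)."""
    X = x * Y / y
    Z = (1 - x - y) * Y / y
    return X, Y, Z

def XYZ_to_linear_rgb(X: float, Y: float, Z: float):
    """XYZ to linear RGB, assuming Rec. 709/sRGB primaries and a D65 white point."""
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return r, g, b

def to_video_triplet(rgb_linear, encode_gamma: float = 1 / 2.4):
    """Apply the (assumed) encode gamma and scale to 8-bit video levels, 16-235."""
    triplet = []
    for c in rgb_linear:
        c = max(0.0, min(1.0, c))       # clip out-of-range values
        c_prime = c ** encode_gamma     # re-apply gamma weighting to get R'G'B'
        triplet.append(round(16 + c_prime * (235 - 16)))
    return tuple(triplet)

# Example: a 75% luminance D65 white (x = 0.3127, y = 0.3290) -> roughly (210, 210, 210).
X, Y, Z = xyY_to_XYZ(0.3127, 0.3290, 0.75)
print(to_video_triplet(XYZ_to_linear_rgb(X, Y, Z)))
```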
HDR (High Dynamic Range) Gamma

HDR gamma works very differently from all of the other gamma systems discussed above. HDR10, which is currently the dominant standard (though several others are waiting in the wings), assumes that 100% output is an absolute value of 10,000 cd/m². This is unlike any other gamma system, in which 100% is a relative value that changes depending upon the capabilities of the display. The absolute specification of 10,000 cd/m² causes all sorts of problems for HDR10, not the least of which is that it becomes impossible to calibrate gamma above roughly 60%-75% video input. Current displays are simply not capable of outputting 10,000 cd/m², so all the user can do is measure and calibrate up to the maximum video level the display can actually reproduce, which for current displays is generally no more than 75%. Above that you just have to let the display clip. Some people talk about tone mapping, which would gradually roll off the response instead of abruptly clipping at the limit of the display's ability, but there is currently no standard for this. There are other HDR standards that are almost certainly superior to HDR10, but they have yet to gain much traction (all current UHD discs use HDR10). Dolby Vision, one HDR approach that many tout as technically superior, requires a licensing fee and is thus at a competitive disadvantage to HDR10, which is an open, royalty-free standard.

There are several points to keep in mind about gamma: