With the ever-changing world of video and audio, I thought that an introduction to video and audio formats on plastic – CD, SACD, DVD, DVDA and Blu-ray – may be in order to bring readers up to speed. This is not an in-depth technical review of all the new and upcoming standards; there are plenty of articles, books and standards documents that do that. Rather, it is to help the average consumer understand where the industry was and where it is now taking us all. It provides an insight into most of the common video and audio formats and what you might want to understand before you buy yet more equipment that will be redundant before you even unbox it.
There will be three posts to review the formats:
- High Definition Video Formats on DVD and Blu-Ray
- The Dreaded HDMI & HDCP
- High Definition Sound Formats on CD, SACD, DVDA and Blu-Ray
Some technical liberties are taken in order to simplify the technical information.
High Definition Video Formats on DVD and Blu-Ray
First let’s review where we were and where we are heading.
A Potted History of TV – from analog to digital
The Original (American) Analog TV System
So what was wrong with the old analog NTSC system, often referred to in engineering circles as Never Twice the Same Color? This analog video distribution system provided countless years of great service to the viewing public but was very limited: it carried only mono audio and had poor resolution, being just 525 interlaced lines. Interlacing (I) is the creation of a full frame of video by taking two fields of video, one containing the even lines and one the odd lines, and then relying on the cathode ray tube (CRT) display and your eyes’ persistence of vision to add them together into a whole frame. It worked really well, and when processed using high quality full bandwidth studio signals it actually looked very good; much better than what you saw at home. (Note: modern video techniques now use progressive (P) video, where the entire frame is displayed all at once, similar to film; this has numerous advantages for motion and removes “interlacing artifacts”.) This technology has numerous limitations for both video and audio, so a whole new approach was required in order to upgrade the entire broadcast chain; enter digital processing and compression.
A New Digital Era has dawned
Digital techniques and processing speeds reached a point where they could now be applied to video in a cost effective way, as opposed to just audio. In 1996/1997 the first standard definition DVD players were released and in 1998 digital TV was unleashed upon the American public.
Transmitting high quality video in an analog format was never going to be possible due to the bandwidth (frequency width of the signal) required to support each TV channel. This is where digital techniques came in: high speed sampling of the video signal, combined with a technique known as compression, allowed what was an extremely high bit rate, up to 1.5Gb/s, to be reduced by a factor of up to 1500 prior to transmission; all the way down to 19.39Mb/s for OTA (Over The Air) broadcast, as low as 1Mb/s for satellite and cable, and up to 36Mb/s for Blu-ray. (Note: 1Gb = 1,000,000,000 bits of binary data and 1Mb = 1,000,000 bits; a bit of binary data is just a 0 or 1.) Nothing is free; there are very definite quality hits taken when using compression for both video AND audio. Anybody who understands what they are looking or listening for can easily spot or hear them, even with today’s very sophisticated compression formats and algorithms. The current compression technique used for broadcast is ATSC 1.0 using MPEG-2 compression. Cable and web channels have the ability to use other formats and standards as they provide their own closed distribution systems.
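To put those numbers in perspective, here is the compression arithmetic as a minimal sketch, using the approximate bit rates quoted above (the figures are rough and vary by channel and encoder):

```python
# Rough compression-factor arithmetic using the approximate bit rates quoted above.
# All figures are illustrative, not exact encoder specifications.

SOURCE_BITRATE_BPS = 1.5e9  # ~1.5 Gb/s uncompressed HD studio signal (approximate)

delivery_bitrates_bps = {
    "ATSC OTA broadcast": 19.39e6,
    "Satellite/cable (low end)": 1e6,
    "Blu-ray (peak video)": 36e6,
}

for channel, bitrate in delivery_bitrates_bps.items():
    ratio = SOURCE_BITRATE_BPS / bitrate
    print(f"{channel}: {bitrate / 1e6:.2f} Mb/s -> compression factor ~{ratio:.0f}:1")
```

Running this gives roughly 77:1 for broadcast, 1500:1 for the lowest satellite/cable rates and about 42:1 for Blu-ray, which is why disc playback generally looks better than broadcast.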
So the initial high definition digital TV system supported standard definition at 480I60, and high definition at 720P60 and 1080I60. The ATSC specification also supported multi-channel surround sound using Dolby AC-3 5.1. This technology was soon to be overtaken by the industry’s need to sell more sets and add more “bells and whistles” in order to generate more revenue. After all, once you’ve bought your new digital TV what else are you going to buy; OK, a DVD player? I never once heard the general public (or a broadcaster) crying out for 4K and even more channels of audio…maybe I was just “hard of hearing”!
So here we are now, in all its glory: 4K ultra high definition (UHD) TV, together with a new, upcoming, not backward compatible ATSC 3.0 OTA broadcast standard with even more “bells and whistles”, and the whole upgrade and buying process starts all over again! BUT wait, we are not done yet; 8K is just around the corner…why? I have no idea; more about this later.
Enter Ultra High Definition (UHD) Video
The general term UHD is used to describe all video formats from Quad HD video and up.
The above diagram shows the relative screen size changes for the increased resolutions if the pixel size (see below for a pixel explanation) remained the same. As you can see, the higher resolutions provide more pixels for a given image and therefore the ability to show smaller and subtler changes to that image; hence the term Ultra High Definition.
The technical demands of this standard are quite exceptional and push the current boundaries of cost effective technical solutions. It uses an extensive array of very powerful and clever compression algorithms (HEVC – High Efficiency Video Coding, or H.265) in order to get 12Gb/s into a 57Mb/s ATSC 3.0 OTA broadcast channel or 108Mb/s from a UHD disc; a bit rate reduction of over 200 to 1. Cable, satellite and web providers can use different compression schemes and lower bit rates in order to meet their available delivery bandwidths and business models. Obviously you can’t get something for nothing. Even with these clever compression techniques there are very visible differences between the original content and what you see at home. These differences only really become apparent on large screen displays and projection systems that have been correctly calibrated. Even then the viewer needs to know what they are looking for.
There are a lot of new terms being thrown around now that refer to the upcoming new video and audio standards. The video terms can be summarized as follows:
- UHD – Ultra High Definition Video – refers to a video display system that supports more than 1920×1080 pixels (see below) in the picture; currently 4K (3840×2160); native DCI 4K is 4096×2160.
- HDR – High Dynamic Range – refers to the difference in brightness between the darkest and brightest parts of the picture. There are several “standards”.
- HDR-10 – refers to the default UHD standard used to encode 4K UHD Blu-rays; it supports both HDR and WCG at 10 bits.
- DV – Dolby Vision – refers to a new proprietary video encoding system by Dolby that provides both HDR and WCG at up to 12 bits – optionally included on UHD Blu-rays.
- HFR – High Frame Rate – refers to the number of pictures or frames displayed per second.
- GAMUT (WCG – Wide Color Gamut) – refers to the range of visible colors that can be seen by the eye and that can be accurately shown by the display device
- GAMMA (gamma curve) – refers to the way that the luminance (brightness) of a signal is converted to and from an electrical signal. It was originally developed to compensate for the old TV cathode ray display (CRT) display characteristics. However, the “gamma curve” is still a very necessary part of the new UHD system and is used in order to maximize the final image quality.
- BITS – refers to the number of brightness levels (luminance and chrominance) that the electronics/display can show. Currently 8 bit= 256 levels, 10 bit = 1024 levels and 12 bit = 4096 levels.
- PIXELS – refers to the smallest addressable area of a display that can be illuminated and that makes up the display. A UHD TV has 8,294,400 pixels (remember that each pixel is composed of three display elements, one for each of the three primary colors: red, green and blue).
- COLOR SPACE – refers to the method of specifying the color information using three numbers, e.g. 4:4:4. Each number refers to the sampling of one component: Y (luminance), Cb and Cr (or red, green and blue in an RGB system).
- CHROMA SUBSAMPLING – refers to a method of sending the color (chrominance) information in a video signal with a smaller number of bits than was in the original signal.
- ASPECT RATIO – refers to the ratio of the width to the height of an image. Most HD content is displayed at 1.85:1 or 1.78:1 (16:9). Most movies are shot in widescreen 2.35:1.
For those of you wanting a little more technical information on the above terms get a cup of tea or coffee and read on.
HDR
HDR or high dynamic range refers to the difference in brightness or contrast between the darkest and brightest parts of an image. The human eye has a far wider dynamic range than any available display technology can show today.
Up until recently, achieving absolute black levels in conjunction with very bright peak whites was not cost effective for home displays or projectors. Notably, OLED displays and D-ILA projectors can achieve outstanding black levels while at the same time achieving respectably high peak white levels. OLED displays also achieve some of the widest color gamuts available in today’s displays. It should be noted here that unless your viewing conditions are ideal – a blacked out room with NO LIGHTS WHATSOEVER – most display/projector contrast ratios are very much lower than those quoted by the manufacturer, or than those required to fully realize HDR.
Care should be taken with some flat panel display specifications, as the panels are not capable of providing their quoted peak levels over the entire screen; only over a small percentage of it. See later comments.
The current implementation of HDR is causing significant viewing issues for many of us due to the non-standardization of the production process and its associated metadata (see later comments), the gamma, and the fact that no affordable home projector can achieve anything close to the peak brightness required to meet the UHD or DV specifications.
If we quickly look at how brightness is measured and the way gamma is used, a little more insight may be had into the viewing challenges.
Luminance or brightness is measured in candela per square meter (cd/m2), also known as nits. Most analog TVs, broadcast monitors and older projectors could only produce about 200 nits of light. Modern 4K displays can create up to 2000 nits, while the latest 4K projectors can provide up to 400 nits.
(Note: the luminance of projectors is usually quoted in lumens, as a projector is a point source of light made to cover a varying range of screen areas, which affects the perceived brightness per square foot of screen surface. For projection systems the luminance term foot-lambert (ftL) is often used, and there are minimum standards set for its value in cinemas (16ftL) and home use. 1 nit = 0.29 foot-lamberts = 0.093 lumens/square foot/steradian – see later for an explanation of steradian.)
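To make those units a little more concrete, here is a minimal sketch of the conversions, assuming the approximate factor quoted above (1 nit ≈ 0.29 ftL) and the common simplification that a projector’s on-screen foot-lamberts are roughly its lumen output divided by the screen area in square feet (times screen gain); real installations also lose light in the lens, the iris and through bulb aging:

```python
# Rough brightness unit conversions (illustrative only).
NIT_TO_FTL = 0.29  # approximate factor from the note above

def nits_to_footlamberts(nits: float) -> float:
    """Convert a flat panel's peak luminance in nits to foot-lamberts."""
    return nits * NIT_TO_FTL

def projector_footlamberts(lumens: float, screen_area_sqft: float, gain: float = 1.0) -> float:
    """Very rough on-screen luminance estimate: lumens spread over the screen area."""
    return lumens * gain / screen_area_sqft

print(f"1000 nit panel ~ {nits_to_footlamberts(1000):.0f} ftL")
print(f"1500 lumens on a 120-inch 16:9 screen (~43 sq ft) ~ {projector_footlamberts(1500, 43):.0f} ftL")
```

The sketch makes the gap obvious: a 1000-nit panel is in the region of 290 ftL, while a typical home projector setup lands somewhere around 35 ftL.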
It should also be noted that the gamma relationships are different for HD (REC 709), 4K (HDR10) and 4K Dolby Vision (DV), adding still more complexity to obtaining the correct display average brightness, blacks and peak whites.
The peak nit specification for HDR10 is 4000 (current target 1000) while for DV it is 10,000 (current target 4000). The target levels are currently set lower because current displays cannot get anywhere close to the levels that the specifications set out. While values up to 2000 nits can be affordably created on small to medium flat panels, doing the same for a large projection screen would require bulbs with the same light output as a plane-spotting arc lamp…obviously totally impractical; not to mention the fact that you would require sunglasses and sunblock 2000 to watch your movie on a large screen. Even commercial cinemas do not achieve more than about 55 nits with a contrast ratio of about 1000:1. So while “mapping” HDR10 or DV luminance levels to a flat panel is quite doable, the same cannot be said for a projector, especially for DV. This requires careful manipulation of the brightness and contrast controls and a special gamma in order to get a respectable picture.
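That “mapping” is essentially tone mapping. As a deliberately over-simplified sketch (not any manufacturer’s actual algorithm), the idea is to pass most of the picture through untouched and roll off only the highlights that exceed what the display can do:

```python
# A deliberately simplified illustration of the luminance "mapping" mentioned above:
# content mastered to a high peak (e.g. 1000 or 4000 nits) is squeezed into a display
# that can only reach a few hundred nits. Real displays use far more sophisticated
# tone curves; this only shows the idea.

def simple_tone_map(scene_nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Pass levels below the knee straight through, roll off everything above it."""
    knee_nits = display_peak * knee
    if scene_nits <= knee_nits:
        return scene_nits
    # Compress the remaining highlights into the headroom above the knee.
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for nits in (100, 500, 1000, 4000):
    print(nits, "->", round(simple_tone_map(nits, display_peak=400), 1))  # e.g. a 400-nit projector
```

With a 400-nit projector, everything up to about 300 nits is reproduced as intended and the 500, 1000 and 4000-nit highlights are squeezed into the last 100 nits, which is why very bright HDR scenes can look flat on a projector.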
So why have these very high brightness levels and different gamma functions? Most of an image sits very much below its peak level, at what is often referred to as the average picture level (APL). The peak levels provide that extra “pop”, for example for bright stars in the night sky or strong reflections of light, providing a high level of visual realism and “punch” to the image. So there is no need to reach that peak brightness over the whole screen, unless you want to go blind! Explanations of the various gamma curves are quite complex and I shall not address them here. Just remember that each standard requires a different gamma curve in order to optimize the display’s luminance performance, and for some devices (e.g. projectors) that gamma curve may be unique to the display device.
4K HDR Video Standards
HDR10 – Blu-ray UHD Discs.
The HDR10 Media Profile, more commonly known as HDR10, uses the wide-gamut Rec. 2020 color space and a digital video sample word length of 10 bits. It uses static metadata to send the color calibration data of the mastering display, the Maximum Frame Average Light Level and the Maximum Content Light Level values to your display in an attempt to correctly configure it for the incoming video. HDR10 is an open standard supported by all UHD Blu-ray manufacturers and producers, and it is the base standard required to be supported on ALL 4K UHD discs.
HDR10+ – Blu-ray UHD Discs
This is a new open standard (no licensing fees) developed by Samsung and Amazon, apparently in opposition to the Dolby DV format, and is a newcomer to the UHD scene. It provides dynamic metadata, similar to Dolby Vision (DV), in order to allow scene-by-scene or even frame-by-frame brightness changes. This lets the display be driven optimally to show the dynamic range that the producer wants the viewer to experience. However, it does not support the wider 12-bit color words and 10,000-nit brightness that DV does. Many Samsung TVs can be upgraded to support this new “standard” via a simple firmware update, indicating that its support may not be hardware dependent. If this is the case then it may allow other display/TV systems to be upgraded to support the standard. More on this as the standard emerges.
Dolby Vision – Blu-ray UHD Discs
Most existing equipment cannot support this format. A very few devices can be upgraded with firmware if their hardware will support it. Currently no home projectors support this format, nor do I see that support arriving in the near future.
Dolby Vision is an OPTIONAL, proprietary HDR format from Dolby Laboratories that can be supported by Ultra HD Blu-ray discs and streaming video services. It supports up to 4K resolution and the Rec. 2020 wide-gamut color space. The two main differences from HDR10 are that Dolby Vision has a 12-bit digital video word length and dynamic metadata. The format allows up to 10,000-nit maximum brightness (typically mastered to 4,000 nits in practice). It can encode mastering display colorimetry information using static metadata and also provide dynamic metadata for each scene. This dynamic metadata gives producers a lot more creative freedom to use difficult lighting in scenes yet still display them with great visual impact, as your display will react to this metadata and optimize its performance for the different lighting conditions.
More details can be found on the Dolby Vision page.
Hybrid Log-Gamma (HLG) – TV and Web use.
Hybrid Log-Gamma (HLG) is an HDR standard that was jointly developed by the BBC and NHK. The HLG standard is royalty-free and is compatible with SDR displays. HLG is supported by HDMI 2.0b, HEVC, and VP9. HLG is also supported by a number of video services such as YouTube.
SL-HDR1 – Distribution and dynamic range compatibility between SD and HD.
SL-HDR1 was jointly developed by STMicroelectronics, Philips International B.V., CableLabs, and Technicolor R&D France. It provides direct backwards compatibility by using metadata to reconstruct an HDR signal from an SDR video stream, which can be delivered using the SDR distribution networks and services already in place. SL-HDR1 allows for HDR rendering on HDR devices and SDR rendering on SDR devices.
MPEG-H – UHDTV broadcast and streaming.
This new standard is currently under development and is designed primarily as a broadcast format to support UHDTV and streaming with video resolutions of 3840×2160 (4K) and 7680×4320 (8K).
MPEG-H is a group of standards being created by the Moving Picture Experts Group (MPEG) covering a digital container format, video and audio compression, and conformance testing; formally known as ISO/IEC 23008 – High efficiency coding and media delivery in heterogeneous environments.
MPEG-H consists of 13 parts; the most relevant to this overview is Part 2.
MPEG-H Part 2: High Efficiency Video Coding (under joint development with the ITU-T Video Coding Experts Group) – A video compression standard that doubles the data compression ratio compared to H.264/MPEG-4 AVC and can support resolutions up to 8192×4320.
More on this standard as and when it arrives and is applied to everyday distribution.
HFR
Many movies (film and video) are produced at 24 frames per second (fps), with some recently having been captured at 48 fps; UHD supports frame rates as high as 60fps. TV currently supports up to 60 fps (1080P60), but ATSC 3.0 will support 4K, HDR using HLG, and REC 2020, all at frame rates up to 120fps. Higher frame rates generally provide smoother motion. Displays and projectors that support frame interpolation, which artificially creates higher frame rates in order to “smooth” out motion, can soften or defocus the image if it is not correctly implemented (a crude example of the idea is sketched below).
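As a crude illustration of why badly implemented frame interpolation can soften the image, here is the most naive form of the idea – simply blending two adjacent frames. Real displays use far more sophisticated motion-compensated interpolation; this sketch only shows why averaging rather than tracking moving detail smears it:

```python
import numpy as np

# The crudest possible form of frame interpolation: blend two adjacent frames to
# synthesise an in-between frame. Moving edges are averaged rather than tracked,
# which is the softening effect mentioned above.

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linearly blend two frames; t=0.5 gives the midpoint 'interpolated' frame."""
    mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mixed.astype(np.uint8)

# Two tiny dummy 8-bit frames (2x2 pixels, single channel) just to show the call.
f0 = np.array([[0, 0], [255, 255]], dtype=np.uint8)
f1 = np.array([[255, 255], [0, 0]], dtype=np.uint8)
print(blend_frames(f0, f1))  # every pixel lands at ~127: detail is smeared, not reconstructed
```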
GAMUT
This is the range of visible colors, out of all those the eye can see, that can be accurately shown by the display device. There are many color gamut standards that define what a system or display can handle and reproduce. Your eyes have the widest color gamut, and it is this gamut that displays are trying to reproduce. See the graph below for a selection of gamuts.
The Rec. 2020 color gamut is the goal for UHD TV. The DCI-P3 gamut is currently used for cinema theater presentations and may become an intermediate UHD gamut. The Rec. 709 (BT.709) color gamut, with much less saturated colors, is the current HDTV standard.
Cost effective modern display and projector light sources cannot cover the full gamut that the human eye can detect. However, new bulb, laser and OLED technologies are continually getting closer to the full visible range; see REC 2020. Achieving the full eye gamut will probably be too costly for the benefits that may be realized beyond achieving REC 2020; but technology marches on and you can never say never.
GAMMA
A detailed explanation of gamma will not be provided here. Suffice it to say, a system’s gamma refers to the way that the luminance (brightness) of a signal is converted to and from an electrical signal. It was originally developed to compensate for the brightness characteristics of the old TV cathode ray tube (CRT) display. However, the “gamma curve”, which in today’s technical language is referred to as the Optical Electrical Transfer Function (OETF) (or its inverse, the EOTF), is still a very necessary part of the new UHD system. It is used to maximize the final image quality based upon the response of the eye, the display and the electronics. Below are shown several examples of these curves.
CRT gamma curve. The red curve is that of the display, the blue curve is that of the video electronics. When combined they produce a straight line, giving a linear relationship between signal level and brightness.
Shown above is a comparison of the very different gamma curves used for different video systems. Clearly each type of system will require the display to create the inverse performance in order to achieve the desired brightness representation of the signal. (OETF – Optical Electrical Transfer Function; the relationship between the electrical signal and the optical device)
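As a simple sketch of the classic power-law relationship shown in the CRT curve above (roughly the gamma 2.2 used for SDR; the HDR PQ and HLG curves are more complicated functions and are not reproduced here):

```python
# Classic power-law gamma: the encoder applies the inverse of the display curve so
# that the end-to-end (OETF followed by EOTF) response is linear.

GAMMA = 2.2

def oetf(linear_light: float) -> float:
    """Camera/encoder side: linear scene light (0..1) -> electrical signal (0..1)."""
    return linear_light ** (1.0 / GAMMA)

def eotf(signal: float) -> float:
    """Display side: electrical signal (0..1) -> linear light out (0..1)."""
    return signal ** GAMMA

for light in (0.0, 0.1, 0.5, 1.0):
    signal = oetf(light)
    print(f"light {light:.2f} -> signal {signal:.3f} -> display {eotf(signal):.2f}")
```

The output light always matches the input light, which is exactly the straight-line result shown in the CRT figure above; change either curve without changing the other and that linearity is lost.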
Selection of the correct gamma curve, and its adjustment for a specific type of display, is of paramount importance in order to get an optimal picture. This is particularly true for projectors, as their conditions of operation are not as predictable as those of flat panel displays. Fortunately the HDMI connection carries the metadata that should tell your display which gamma curve to select. If the display cannot fully support the indicated curve, it should remap the luminance values to those that it can handle. In many cases this re-mapping is not done effectively and the user is left with the task of adjusting the contrast and brightness controls (in the display or the UHD disc player) in order to get a satisfactory image.
BITS
In all modern TVs and projectors the main signal processing is digital. In these systems the original analog signal is represented by digital words (8, 10 or 12 bits long; a bit being a 0 or 1). These bits are used to describe the signal’s luminance (brightness) and chrominance (color) at an instant in time, and for just one pixel at a time. Each color – red, green and blue – is assigned a word that is 8, 10 or 12 bits long (only one word length is used by a given system). The number of bits determines the number of different levels that each component can have, i.e. 8 bit = 256 levels, 10 bit = 1024 levels and 12 bit = 4096 levels. Clearly, the more bits that are assigned, the smaller the change in color or brightness that can be represented. In UHD systems that are poorly implemented, or have poor signal processing, low numbers of levels can cause color banding or posterizing in areas of gradual color shading such as skies.
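The bit-depth arithmetic is simple enough to show directly (each extra bit doubles the number of levels per color component):

```python
# Levels per component for the common video bit depths quoted above.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit video: {levels} levels per component")
```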
PIXELS
A pixel is the smallest addressable area of a display that can be illuminated and that makes up the display. A 4K UHD TV has 8,294,400 pixels (remember that each pixel is composed of three display elements, one for each of the three primary colors: red, green and blue).
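For the curious, the pixel count quoted above comes straight from the panel dimensions:

```python
# Pixel arithmetic for a 3840x2160 UHD panel, with three sub-pixels (red, green
# and blue) per pixel.
width, height = 3840, 2160
pixels = width * height
subpixels = pixels * 3
print(f"{pixels:,} pixels, {subpixels:,} sub-pixels")  # 8,294,400 pixels, 24,883,200 sub-pixels
```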
If you sit too close to a display, even when watching 4K, you will see the individual pixels. As you move further from the screen the apparent density of the pixels increases and the image improves. This effect is due to the eye’s angular resolution, i.e. the number of pixels per degree of viewing angle. The farther away from the screen you are, the more pixels fall within each degree.
Your visual acuity determines how close to the screen you can sit before the pixel structure becomes visible; beyond some distance your eyes will simply not be able to distinguish the fine detail. It has been shown that somebody with 20/20 vision can distinguish details that are about 1/60 of a degree apart. For a 1080 HD display that means an angle of approximately 31 degrees across the screen width, or 60 pixels per degree. (Single pixels may be seen from greater distances if they are part of a high contrast area.)
This information shows that the closest you can sit to a 1080 HD TV while still maintaining the eye’s maximum angular resolution is about 1.6 times the display’s diagonal. The following tables give the minimum distance for different sized displays at different resolutions while still maintaining the eye’s maximum angular resolution.
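The arithmetic behind those tables can be sketched in a few lines, assuming the ~1/60 degree (60 pixels per degree) acuity figure quoted above:

```python
import math

# Back-of-the-envelope version of the viewing-distance argument above, assuming
# 20/20 vision resolves roughly 1/60 of a degree (about 60 pixels per degree).

ARC_MINUTE_RAD = math.radians(1.0 / 60.0)

def min_viewing_distance_in(diagonal_in: float, horizontal_pixels: int, aspect: float = 16 / 9) -> float:
    """Closest distance at which adjacent pixels still sit at or below ~1 arc-minute apart."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / horizontal_pixels
    return pixel_pitch_in / ARC_MINUTE_RAD  # small-angle approximation

diag = 60  # a 60-inch display as an example
for pixels, label in ((1920, "1080 HD"), (3840, "4K UHD")):
    d = min_viewing_distance_in(diag, pixels)
    print(f"{diag}-inch {label}: ~{d / 12:.1f} ft minimum ({d / diag:.1f} x the diagonal)")
```

For a 60" screen this gives roughly 7.8 ft (about 1.6× the diagonal) for 1080 HD and roughly 3.9 ft (about 0.8× the diagonal) for 4K, which is exactly why 4K only pays off if you sit unusually close or buy an unusually large screen.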
It can clearly be seen (pun intended) that to get optimal viewing resolution from high resolution screens, you either need a very large display or you need to sit very close to a smaller one. Neither of these options works well in the typical household. So what is the benefit of 4K? It’s not really the 4K resolution itself (and yet 8K is now being pushed; why? I will explain later). In a nutshell, it is the display’s HDR and WCG that provide the real “bang for the buck”, and many viewers would have been just as impressed if these new parameters had been applied to 1080. For those of us who are lucky enough to own 80”+ displays or large projection screens, that extra resolution is definitely of benefit.
COLOR SPACE
Put simply, color space refers to the triangular area within the gamut that defines what colors can be displayed by the system (it is actually a three dimensional space, but we shall ignore that for now). The data used to define those colors is carried as three components, described by three numbers, e.g. 4:4:4: the first relates to the luminance (Y) and the second two to the color difference signals Cb and Cr. (The exact method by which these color difference signals are electronically created shall not be discussed here, other than to say an electronic matrix is used to process the luminance, red, green and blue signals to create the difference signals.) Color difference signals can be used because the eye is much less sensitive to the exact position and motion of color. The numbers refer to the sample or sub-sample rate at which each video component is digitized. As the chrominance (color) and luminance values must be specified at every instant in time, this gives rise to a great deal of data and requires very wide bandwidths and high data speeds to accommodate it.
CHROMA SUBSAMPLING
This is a form of data compression that allows the color information to be recreated from a reduced number of chrominance (color) samples. It takes advantage of the eye’s much lower acuity for color resolution. For example, all DVDs – standard definition, high definition and ultra high definition – use 4:2:0 sub sampling. Clever signal processing and matrixing can then re-constitute an approximation of the original 4:4:4 color information. Common values are 4:4:4, 4:2:2 and 4:2:0. 4:2:2 reduces the data rate by 1/3rd, while 4:2:0 reduces the data rate by ½. Note that 4:2:0 chroma sub-sampling reduces the color resolution by as much as ½ in each direction (leaving only ¼ of the original chroma samples), but as stated the eye’s acuity is not very sensitive to this loss of resolution.
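The data-rate figures above can be checked by counting samples over a 2×2 block of pixels (a minimal sketch):

```python
# Sample counts per 2x2 pixel block for the common chroma subsampling schemes,
# showing the data-rate savings quoted above.
schemes = {
    "4:4:4": 4 + 4 + 4,  # full chroma: 4 Y, 4 Cb, 4 Cr samples per 2x2 block
    "4:2:2": 4 + 2 + 2,  # chroma halved horizontally
    "4:2:0": 4 + 1 + 1,  # chroma halved horizontally and vertically
}

full = schemes["4:4:4"]
for name, samples in schemes.items():
    saving = 1 - samples / full
    print(f"{name}: {samples} samples per block, data reduced by {saving:.0%}")
```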
ASPECT RATIO
The aspect ratio of an image is the ratio of its width to its height. Most HD content is displayed at 1.85:1 or 1.78:1 (16:9). Most movies are shot in widescreen 2.35:1.
Currently there are very few flat panels that are native 2.35:1, nor are there likely to be. However, 2.35:1 content can easily be shown on displays and projection screens in several ways:
- By underscanning the 16:9 screen and having black bars at the top and bottom. This has the unfortunate side effect of significantly reducing both the vertical resolution and the image brightness (see the sketch after this list).
- By underscanning onto a 2.35:1 projection screen, creating black bars at the top and bottom of the projected image, and then zooming the image out so that the picture fills the screen and the black bars fall off the top and bottom of it. This also has the unfortunate side effect of significantly reducing both the vertical resolution and the image brightness.
- Using an anamorphic lens. The 2.35:1 image is first scaled to fill the full 16:9 panel height (a horizontally compressed image) and the lens then stretches the width of the picture back out to 2.35:1. This maintains the picture’s resolution and also increases the image brightness by approximately 30%. Unfortunately, high quality lenses of this type are quite expensive; see Panamorph, Prismasonic and Navitar.
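As a quick sketch of the resolution penalty in the underscan option above (the first bullet), letterboxing a 2.35:1 movie into a 16:9 frame leaves only about three quarters of the panel’s lines active:

```python
# Vertical-resolution penalty of letterboxing a 2.35:1 movie inside a 16:9 frame.
FRAME_ASPECT = 16 / 9  # ~1.78:1 display
MOVIE_ASPECT = 2.35    # typical widescreen movie

def active_lines(total_lines: int) -> int:
    """Lines actually used by the picture; the rest become black bars."""
    return round(total_lines * FRAME_ASPECT / MOVIE_ASPECT)

for lines, label in ((1080, "1080 HD"), (2160, "4K UHD")):
    used = active_lines(lines)
    print(f"{label}: {used} of {lines} lines active (~{1 - used / lines:.0%} lost to black bars)")
```

On a 4K panel that is roughly 1634 of the 2160 lines carrying picture, with about 24% of the screen (and its light output) spent on black bars.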
For 16:9 flat panel displays the only method of watching widescreen 2.35:1 movies is option 1. That’s a pity, as you have just lost some of the resolution benefit of running 4K, plus some image brightness!
UHD Test Material & Additional Reading
There is still a very limited amount of material for setting up and testing 4K UHD HDR10 systems. As of today’s date, November 2017, this is what is available:
- Samsung HDR10 Test Disc.
- Joe Kane’s UHD Digital Video Essentials DVE V0.9 – thumb drive only
- Diversified Video Solutions.
- OPPO Digital – OEM Diversified Video Solutions HDR10 Test Patterns Suite UHD Blu-ray Disc.
- SpectraCal Pattern Generators
See these sites for more information:
- SpectraCal – HDR Demystified – requires sign-up to download this excellent, free and easy to read e-book.
- Lightillusion
So where to now?
So it’s great to have standards that have plenty of headroom for future growth, but currently most of this headroom is just wasted. Very few displays, and even fewer HT projectors (actually none), can get anywhere near the lofty goals these standards have set. So guess what: we now have a great way of “getting you”, the public, to continually upgrade as each year goes by. Who benefits? Well, certainly the manufacturers! Will you benefit? Well, I am just not too sure how much more color and brightness is going to make that news program more appealing; come to think of it, unless you are all buying 80”+ displays or sitting three to four feet away from your current 60” display, I do not even know how much more resolution you are going to see.
Don’t forget that no broadcaster is even close (in 2018) to upgrading to 4K, HDR or WCG, and I mean nobody; yes, I know there have been a few broadcast tests. However, in order to support the new ATSC 3.0 standard for 4K broadcasts, not only do broadcasters need new broadcast hardware, and LOTS of it, but all of us “off the air” viewers will need new equipment too. Cable viewers are at the mercy, as always, of their providers. So while these standards will undoubtedly benefit film producers, there is absolutely no financial benefit for a broadcaster, and many are still digging themselves out from the last HD 1080 upgrade.
Having been heavily involved in the design and building of many of the current “state of the art” 4K facilities, I can honestly say that there are very few. Most are 4K sports arenas and trucks, and I have yet to see any broadcaster ask for a 4K upgrade to their facility!! Yes, I realize that some cable companies are now providing 4K UHD programming and some even support DV. That is easy, as it only requires relatively low cost infrastructure changes, assuming that their cable distribution system has the required bandwidth. The cable consumer also has to pay for this feature; not so with OTA broadcasters. End-to-end 4K at broadcasters is a long way off. Such an upgrade requires the replacement of the entire existing SDI infrastructure with an all-IP infrastructure, a complete replacement of all the cameras, switchers, routers, facility signal processing and editing systems, plus an upgrade to their TX (transmitter) systems. Yes, they could “cheat”, and many will, by just up-converting their 1080/720 signal prior to transmission; but that IS NOT 4K UHD.
So now there is a move to 8K. I ask myself why? Especially as we have seen that the resolution benefits of 4K in a typical household viewing environment are marginal at best. I realize that some of us have very large screens, but even then such high (8K) resolutions barely benefit the viewer at typical viewing distances. Not to mention all the technical issues of getting projector panel pixel convergence to be accurate over the entire image area and of creating cost effective lenses.
8K does have benefits and a lot of uses in fields such as medicine and sports. In sports, for instance, it allows producers to potentially use far fewer cameras, “wide angling” just a few of them in order to capture all the action in one shot. This allows the creation of several HD “virtual” cameras from just one 4K/8K camera scene: the producer can electronically pan and zoom into areas of the 4K/8K image and “cut” several HD images out of it, as if each image came from its own HD camera. This can significantly improve the TV viewer’s experience of the game, providing seamless shots of the field action. How 8K will be used to improve the viewer’s experience of TV or film productions in the home is yet to be seen.
I also need to say that, in my case, while it is impressive to take all these measurements and examine test patterns, at the end of the day we are all watching and listening to a film or a show, NOT an engineering test sequence. If you (well, me actually) are deeply engaged in the video and audio production, noticing these levels of fine detail enhancement is unlikely to affect your level of enjoyment; it certainly doesn’t affect mine! It brings to mind the expressions “content is king” – it doesn’t matter what you view it on – and, when selling any house, “location, location, location” – it’s not the house but its location.
The second post of this series of three posts will look at the Dreaded HDMI and HDCP. See the third post, High Definition Audio Formats on CD, SACD, DVDA and Blu-Ray here.