Anatomy of a TV: Everything You Wanted to Ask About Modern TV Technology

Choosing a TV in 2020 is not as simple as it sounds. Of course, you can point at the first model by the store entrance, or at the first item in an online catalogue. You can rely on intuition or on your significant other's opinion. But a conscious choice still requires some understanding of the technologies used in modern televisions, all those cryptic letter combinations and intimidating terms. So if you are even a little curious about how OLED differs from QLED, or UHD from HDR, welcome to this guide. Experienced techies, please don't grumble about the simplicity of the presentation; just quietly hand this text to your grandmother and thank the authors for the nerves saved explaining simple truths.
TV Image


Let's start with simple questions.

What's on the outside: which connectors and interfaces are in use these days?

  • HDMI is now the most common connector for hooking a TV up to signal sources: computer graphics cards, laptops, or media players. It can also carry additional technologies, for example:
    • HDCP 2.2 (High-bandwidth Digital Content Protection) is not a connector, but a protocol for protecting digital content transmitted over HDMI, which is relevant for modern 4K TVs.
    • HDMI-CEC (aka Easy Link or Anynet) is a technology that allows you to control multiple devices connected via HDMI cables from one remote control.
  • USB - just like on a computer, it is used to connect flash drives or external hard drives in order to watch video files from them on a TV.
  • MHL (Mobile High-Definition Link) is an interface for connecting a smartphone and showing its picture on the big screen. The technology combines the HDMI and USB interfaces.
  • DVI is a classic interface for connecting a computer to a monitor or TV.
  • SCART is a large universal connector for signal sources. It is famous for the sheer number of adapters that exist for it, to and from almost anything.
  • RJ-45 (aka LAN or Ethernet) is a port for connecting a TV to a computer network, today it can be replaced by a built-in Wi-Fi module.
  • DLNA is often found where there is a LAN. It is not a connector but a standard indicating that the TV is ready to play video files from compatible devices on your home network.
  • CI (Common Interface) or CI+ is used to connect decoding modules, roughly speaking, miniature cable tuners inserted into the TV.
  • Composite, A/V input (the RCA "tulip" plugs) is one of the oldest standards, transmitting sound and video over separate cables.
  • Component input looks similar to composite from the outside, but uses three cables for the video signal alone, which gives higher quality.
  • VESA is not a connector for any kind of cable, but a standard for mounting the TV on all sorts of stands and brackets.

Okay, now let's see what's inside. What is the TV screen resolution?

First of all, don't confuse two different concepts: an extension belongs to a file, a resolution belongs to a screen. So, screen resolution is the number of dots horizontally and vertically. The image on a TV screen, just like on a computer monitor or a smartphone, is made up of dots (pixels), and the more of them fit on the screen, the sharper the picture looks. But resolution applies not only to the screen itself but also to the video shown on it. If you watch a low-resolution video on a high-resolution screen, the result will only be as good as the video itself. And vice versa: there is no point in watching 4K video on an HD Ready TV.

How is HD Ready different from Full HD and 4K?

In the number of dots that make up the image. The letter combination HD itself comes from the words High Definition. Once upon a time screen resolutions were low, and marketers used these letters to single out new models with more dots. But resolution kept growing in newer devices, and now you can find many variations on the HD theme.
  • The term HD Ready (that is, literally "ready for HD") - refers to the most inexpensive TVs with a resolution of 1280x720, 1366x768, 1400x900, or 1680x1050 pixels.     
  • FullHD (or 1080p) - screens with a resolution of 1920x1080; to this day it is perhaps the most popular format, a kind of golden mean.
  • UHD or UltraHD is the most recent resolution format. It comes in two types: UHD 4K (3840x2160, or 2160p) and UHD 8K (7680x4320, or 4320p). The second type is, as of 2020, still new, very expensive, and not yet particularly practical: there are few TVs of this format on the market and even less video to watch on them. Then again, not long ago the same could be said about 4K, and today even mid-priced smartphones can shoot it.

What's the difference between 1080p and 1080i? 

In the type of scan. Look: the number indicates how many pixels there are vertically. In the 1920x1080 format there are 1080 of them, and in 1280x720 there are 720, which is why the latter is also called 720p. The letter "p" comes from the word "progressive" and means progressive scan: all the lines, that is, the horizontal rows of pixels on the screen, are refreshed simultaneously with each new frame. That sounds logical, and in modern TVs it is always the case. But old cathode-ray TVs used a different type of scan: interlaced. With that method, the even and odd rows of dots that make up the image are refreshed in turn, and video in this format has an "i" in its name.
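To make the difference tangible, here is a tiny illustrative sketch in Python (a toy model of which rows get refreshed on each frame, not code from any real TV):

```python
def lines_refreshed(frame_index: int, interlaced: bool, total_lines: int = 1080) -> list:
    """Return the screen rows (0-based) that are redrawn on a given frame."""
    if not interlaced:
        # Progressive ("p"): every row is redrawn on every frame.
        return list(range(total_lines))
    # Interlaced ("i"): even rows on one pass, odd rows on the next.
    first_row = frame_index % 2
    return list(range(first_row, total_lines, 2))

print(len(lines_refreshed(0, interlaced=False)))  # 1080 - a full 1080p frame
print(lines_refreshed(0, interlaced=True)[:4])    # [0, 2, 4, 6] - even rows of a 1080i field
print(lines_refreshed(1, interlaced=True)[:4])    # [1, 3, 5, 7] - odd rows on the next pass
```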

Then what is PPI? 

That is something else entirely: the letters look similar, but the meaning is different. PPI stands for pixels per inch, the number of pixels per inch. It is an important characteristic that shows how densely the TV screen is packed with pixels, because besides a resolution, the screen also has a physical size. A 32-inch FullHD TV has a pixel density of about 69 PPI, while a 40-inch FullHD TV has a lower density (about 55 PPI), because the same number of pixels is stretched over a larger screen area.
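If you like checking the numbers yourself, pixel density follows from the Pythagorean theorem: divide the screen diagonal measured in pixels by the diagonal in inches. A minimal Python sketch (the 69 and 55 PPI figures above come from exactly this calculation):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: the diagonal in pixels divided by the diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(width^2 + height^2)
    return diagonal_px / diagonal_inches

print(round(ppi(1920, 1080, 32)))  # 69 - a 32-inch FullHD TV
print(round(ppi(1920, 1080, 40)))  # 55 - a 40-inch FullHD TV
```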

How much PPI is needed on a TV?

Simply put, the more the better. But TVs with a high PPI are more expensive, and perhaps it makes no sense for you to pay for extra pixels. Here's a simple rule: the farther the viewer sits from the TV, the lower the pixel density can be without hurting the perceived quality of the picture. From a distance of one meter, an image on a 90 PPI screen looks about the same as a mere 9 PPI image does from ten meters away. For a distance of two meters, a value of around 40 PPI is sufficient, which roughly corresponds to a 52-inch FullHD TV. This is, of course, a rough estimate; it all depends on your expectations and your eyesight. There are special PPI calculators on the Internet for this: enter the resolution and the diagonal, and you get the result without fiddling with formulas.
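If you would rather not hunt for an online calculator, the same rule of thumb is easy to script yourself. This is only a rough sketch built on the approximate figures above (90 PPI at one meter, 40 PPI at two), not any official standard:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density of a screen with the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Rough figures from the rule of thumb above: the density that looks "good enough"
# falls as the viewer moves farther away.
rough_ppi_needed = {1.0: 90, 2.0: 40}  # viewing distance in meters -> PPI

density = ppi(1920, 1080, 52)            # a 52-inch FullHD TV
print(round(density))                    # ~42 PPI
print(density >= rough_ppi_needed[2.0])  # True - fine from two meters away
print(density >= rough_ppi_needed[1.0])  # False - too coarse from one meter
```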

PPI, DPI, what's the difference? 

Don't let it confuse you: DPI stands for dots per inch. Dots are what a printer puts on paper, pixels are what a screen shows. DPI is irrelevant for TVs.

Okay, but what brightness should the TV have?

Advertising has long taught us that when it comes to brightness, the more the better. And if anything, you can always turn it down in the settings. Brightness is measured in candelas per square meter (cd/m²); sometimes the older name "nit" is used, which means the same thing. The maximum brightness depends on the type of screen: ordinary liquid crystal panels typically fall in the range of 300-600 cd/m², while premium LED-backlit screens can reach figures on the order of 1500 cd/m² and even higher.

Is it the same story with contrast?

More or less, and the concept of contrast is closely tied to brightness. A contrast value such as 3000:1 (read "three thousand to one") means that the brightest pixel on this TV is 3000 times brighter than the darkest one. Of course, it can be the same pixel, which is three thousand times brighter when displaying white than when displaying black. In OLED screens, black pixels do not glow at all, that is, they are perfectly black, so their contrast can be considered infinitely high.
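To make that ratio tangible, here is a tiny worked example; the brightness figures are invented for illustration, not measurements of any particular panel:

```python
# Contrast ratio is simply the brightest white divided by the darkest black.
white_luminance = 300.0  # cd/m2 - example peak brightness of an LCD pixel showing white
black_luminance = 0.1    # cd/m2 - example residual glow of the same pixel showing "black"

print(f"{white_luminance / black_luminance:.0f}:1")  # 3000:1

# On an OLED a black pixel emits nothing at all (0 cd/m2),
# so the division has no finite result - hence the "infinite" contrast.
```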

OLED, LED, QLED - it's easy to get confused about how they differ!

It is indeed! So first you need to understand how TV screens work in general. Don't worry, we'll be quick. We are not talking about old tube TVs here, only about the modern "flat" ones. They can be divided into three large groups.
  • Liquid crystal (LCD, liquid crystal display). The principle of operation is as follows: light from a backlight passes through a matrix of liquid crystals, and the electronics control that matrix, letting through more or less light at each point. This is how the image on the screen is formed. It is the most popular type of screen and comes in many varieties.
    • LED TVs. This is the name given to LCD TVs that use light-emitting diodes (LEDs) as the backlight rather than lamps. This class includes almost all modern LCD TVs, including the different types described below. Once again: almost any TV nowadays is an LED TV (not to be confused with OLED, which we will get to shortly). On top of that, it may carry some other letter combination. For example:
    • Direct LED is a backlighting method in which LEDs are placed over the entire screen area. Allows you to flexibly control the backlight in different areas of the screen. 
    • Edge LED - in such TVs, LED backlighting is located only on the sides of the screen. This allows the TVs to be very thin but reduces the uniformity of the backlighting. 
    • QLED (quantum-dot LED) - screens with "quantum dots", a technology from Samsung. LG has a similar technology called NanoCell.
    • ULED is the name of the Hisense brand technology suite. This is not a separate type of matrix, but a set of software (algorithms) and hardware (image processor) solutions to improve the image quality. 
  • OLED (organic light-emitting diode). Here everything is different: there is no backlight, the dots themselves, made of organic LEDs, emit light. Such TVs are superior to LCD TVs in almost every respect, but so far they are significantly more expensive.
  • Plasma panels. Once upon a time they were the only way to get a high-quality image with a high refresh rate and a wide color gamut on a large flat screen. Today it is an obsolete type, although you can still hear any large flat-panel TV being called a "plasma".
Well, have you got the hang of all these LEDs?

Yes, but there are also IPS, PVA, MVA - what are those?

That is a slightly different story: here we are talking about different types of matrices used in LCD screens and the ways they are controlled and manufactured.
  • TN and TN+Film are the old "passive" types of matrices with narrow viewing angles; they are not used in modern TVs.
    • TFT - "active" TN-matrix, where each pixel is controlled by a separate transistor. 
    • SFT, IPS, PLS - a further development of this technology with improved color reproduction, brightness, and viewing angles.  
  • VA is an alternative technology (unlike TN, such matrices are opaque when turned off).
    • MVA, PVA - modern technologies based on VA. 
In short, the modern choice often comes down to IPS versus MVA, and for most consumers, there may not be a radical difference between them.

So what is HDR?

These are also three letters, but from a completely different area. HDR (High Dynamic Range) is a technology for increasing dynamic range, or more precisely, the general name for all technologies of this kind. The human eye sees the world around us with a multitude of shades and nuances of light and shadow; the TV screen flattens all of that, and the picture on it is much poorer. Roughly speaking, compared to "normal" displays, HDR displays show dark subjects darker, light ones lighter, and colors as close as possible to how the camera operator's eye saw them during shooting, or to how the director imagined them during editing. Manufacturers have taken this problem seriously, and it has spawned many technologies that widen the dynamic range. A detailed description of how HDR works could fill more than one separate article, so for simplicity, remember what HDR gives video in general:
  • more gradations of shades (technically this is called "color depth"; it is measured in bits, and the more the better, see the quick calculation right after this list);
  • more colors overall (a "wider color space", as they say);
  • higher maximum brightness (we have already talked about nits and candelas).
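The link between bits and shades is simple arithmetic: each extra bit per color channel doubles the number of gradations, and the three channels (red, green, blue) multiply together. A quick sketch that reproduces the figures quoted for the HDR formats below:

```python
def total_shades(bits_per_channel: int) -> int:
    """How many colors a screen can show for a given bit depth per RGB channel."""
    levels = 2 ** bits_per_channel  # gradations of one channel (red, green or blue)
    return levels ** 3              # all combinations of the three channels

print(total_shades(8))   # 16_777_216     - about 16.8 million, ordinary SDR video
print(total_shades(10))  # 1_073_741_824  - about 1.07 billion, HDR10
print(total_shades(12))  # 68_719_476_736 - about 68.7 billion, Dolby Vision
```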

What types of HDR are there?

There are many of them and it is easy to get confused, although by and large they all solve the same problem. Fortunately, one TV can often handle several types of HDR content at once, which is convenient. So, here are the letter combinations you may find on TV boxes:
  • HDR10 is the most popular and most widely adopted standard. It provides 10-bit color, which corresponds to about 1.07 billion shades.
  • HDR10+ is an updated version of HDR10, supported on Samsung and Panasonic TVs and featuring dynamic metadata (that is, additional information inside a video file indicating how to display it correctly).
  • Dolby Vision is an HDR technology from Dolby, a company renowned for quality sound. They do just as well with video: the technology supports not only 10-bit but also 12-bit color (that's more than 68 billion shades), a maximum brightness of up to 10,000 nits, and dynamic metadata.
    This technology can be found in TVs from Sharp, Philips, Hisense, Vizio, and other brands.
  • HLG and HLG10 have nothing to do with the LG brand; these standards were developed by the broadcasters BBC and NHK. Video of this standard contains no metadata and is compatible with a wide range of devices.
  • Advanced HDR by Technicolor is the generic name for a group of technologies used in Philips TVs. In essence, it is that company's own take on the standards described above.
These are the basic HDR technologies, the "pillars", so to speak. They underlie the branded solutions of various TV manufacturers, which are often purely marketing in nature. Here is a far from exhaustive list of examples:
  • Samsung: QHDR and Quantum HDR, 
  • LG: Cinema HDR and HDR Pro,
  • Hisense: HDR Supreme,
  • Philips: HDR Plus, Perfect, and Premium.

So with such a TV, will any video be in HDR?

No, the video itself must also be "special", that is, in an appropriate HDR format. And the TV must be able to display that format correctly; only then will the result differ from "normal" video.

So where to watch HDR video?

Many streaming services and online cinemas have learned to show such video: Netflix, Amazon Prime, Megogo, Ivi, Okko. Plus, you can play HDR games on an HDR TV with a PS4 or an Xbox One S.

Okay, what other technologies are there to improve video and sound on TVs?

There are many of them, and new ones appear all the time. Here are a few examples:
  • Wide Color Gamut (WCG) is a kind of predecessor of HDR in TVs: a technology for expanding the range of colors in the picture.
  • Local Dimming is a local screen-dimming function that makes blacks deeper in certain areas of the image.
  • Depth Enhancer is a technology for automatically adjusting dynamic contrast.
  • UHD Upscaling is smart stretching (that is, upscaling) of HD video to fill the screen of a UHD TV.
  • Dolby Atmos is a multi-channel surround sound technology. Allows you to create the effect of sound emanating from different sources around the listener, an important feature for creating an advanced home theater.
  • NICAM is also about sound, but only stereo. This coding technology allows you to get high-quality sound even with a weak broadcast signal.
