We’ve reached the end of television. Since the invention of the technology in the 1920s, TV screens have gradually grown larger, pictures clearer, and sets cheaper, until now: For all intents and purposes, we’re at the end of the road. This “nothing special” 65-inch Samsung unit is, for most people, as good as a TV ever needs to be. It displays an image more highly detailed than most viewers can perceive from a couch-length viewing distance, its screen is as big as the average American living room can handle, and it costs less than $500. For 100 years, manufacturers and consumers have been chasing screen size and image clarity, so what happens now that the dog has caught the mail truck, and just about everyone can afford a TV that’s essentially perfect?
A brief history of big-screen TVs
Television has come a long way. If we traveled back in time to 1986 with the equivalent of $500 to buy a TV, we would only be able to afford the cheapest set from that year’s Sears catalog. For $159.99, the same relative cost as a 65-inch Samsung today, you could snag a set featuring a 13-inch, 4:3 screen with an equivalent resolution of around 480i. (CRT televisions didn’t have pixels, but their screens displayed roughly 330–480 lines of usable detail, depending on the signal.) By comparison, the Samsung has a 65-inch, 16:9 screen with a 3840×2160 resolution.
Those CRTs displayed images by firing electrons at a phosphor-coated screen inside a vacuum-sealed glass tube. The cathode ray tube (hence CRT) had to be deep enough for the electron beams to accelerate, with glass thick enough to safely contain them. The result: heavy, deep, fragile machines that couldn’t practically support screens much larger than 40 inches without becoming prohibitively expensive and unwieldy. The 1981 Sony KV-3000R, a 30-inch model that cost $10,000 ($36,500 in today’s money) and weighed over 500 pounds, was at the top of the big CRT consumer market. It was technically possible to go bigger (Sony built a 45-inch Trinitron CRT in 1989 that sold in Japan for $40,000), but these were not the kind of screens you’d find in anyone’s living room.
The projection TVs that followed achieved their unheard-of screen sizes by using internal projectors and mirrors to cast the cathode ray image onto a translucent screen, but this came with significant drawbacks. The sets were massive and could weigh up to 500 pounds, and the projected image was blurrier and dimmer than a typical CRT’s already standard-definition picture. Viewing angles were limited (you basically had to sit directly in front of the set to see anything clearly), and projector bulbs had a short lifespan and were expensive to replace.
The limitations and cost of rear-projection TVs didn’t dissuade people from adopting the technology, especially as prices came down. By the 1990s, improvements in rear-projection optics, CRT projectors, and production efficiency had turned big-screen rear-projection TVs into a status symbol, and 50-, 60-, and even 70-inch behemoths began appearing in suburban living rooms. They were still heavy, fuzzy, and crazy expensive (a 61-inch Magnavox rear-projection television cost $2,999.99 in 1993), but everything changed in the late ’90s with the release of the first plasma TVs.
The flat screen revolution
Plasma and LCD TVs weren’t just better ways of displaying images; they worked on entirely different principles. In a plasma TV, each pixel is a tiny gas-filled cell that emits ultraviolet light when charged with electricity, which then excites phosphors on the display to create visible colors that resolve into an episode of Friends. LCD TVs use liquid crystals to control the passage of light from a backlight behind the panel. Each pixel contains a liquid crystal layer that can twist to pass or block light, allowing precise control over color and brightness, and thus a much more detailed look at Rachel’s hair. Both technologies supported far brighter and sharper images than rear-projection TVs, all without weighing 400 pounds, making big-screen, high-definition displays attainable for average consumers.
Both LCD and plasma TVs had advantages and drawbacks: Plasmas had faster response times (how quickly a pixel can adjust) and darker blacks than LCDs, but LCD TVs lasted longer (around 50,000 hours vs. 30,000 hours), used less power, worked better in bright rooms, and weren’t as prone to “burn-in” as plasma and older CRT screens. Ultimately, LCD won out, and plasma TVs became a thing of the past by 2014.
In 2004, Sony introduced the first LED TVs. Where older LCD TVs used cold cathode fluorescent lamps for backlighting, LED TVs use light-emitting diodes instead. They’re much more energy efficient and produce a brighter image, more accurate colors, and greater contrast than either fluorescent-backlit LCD or plasma displays. LED backlighting and other technical improvements also solved problems like narrow viewing angles, motion blur, and uneven backlighting that plagued earlier generations of flat screens.
Flat-panel displays were expensive at first, but prices fell rapidly. A 42-inch plasma ran around $20,000 in 1997 but less than $1,000 a decade later. As prices fell, resolution rose, from 720p (1,280 pixels wide by 720 pixels tall) to 1080p (1,920 pixels wide by 1,080 pixels tall) to 4K (3,840 pixels wide by 2,160 pixels tall), making it feasible for anyone to mount a giant TV on their living room wall and enjoy a level of realism and image quality previously only available in movie theaters.
Fine-tuning your television: All about backlighting
As screen size and resolution improved, so too did the qualitative aspects of TV images: contrast, color accuracy, and brightness. Older LCD TVs use fluorescent lamps to shine light through liquid crystals, but the crystals can’t block all of the light, so no pixel is ever truly black. That’s why you can tell whether an older LCD TV is on, even if there is no picture. Many LED displays are built with local dimming: backlights that can brighten or dim zones of the screen as needed. The result is less light leaking through the pixels, and thus darker blacks. Mini-LED displays have many more backlighting “zones,” sometimes thousands, for even finer control over which parts of the image go dark. QLED displays add a film of “quantum dots” between the LED backlight and the LCD panel; the dots are nanocrystals that re-emit the backlight as purer colors, improving color saturation and brightness.
Organic light-emitting diode (OLED) TVs take it even further. OLED televisions don’t have a backlight at all. Instead, each pixel in the display contains an organic material that lights up individually when electricity is applied. So when a pixel is black, it’s off, which means it’s totally black. OLED televisions aren’t perfect (they tend to be less bright than LED or mini-LED displays), and the emerging technology of microLED promises to solve that problem, but current six-figure price tags make microLED TVs prohibitively expensive.
We may have achieved peak television
The difference between a color image and a black-and-white one was immediately obvious when the first color TVs hit the market in the 1950s, as was the difference between high definition and standard definition in the late 1990s, but the distinction between an OLED and a QLED display is subtle enough to be almost invisible to the average consumer. I’m sure some people are passionately devoted to OLED over mini-LED, or feel you haven’t really experienced Breaking Bad if you haven’t seen it on a $100,000 microLED TV, but for the rest of us, midrange TVs are so close to “as good as they can possibly be” that granular technological improvements are meaningless.
Now, no technology is perfect for everyone. CRT TVs, for instance, are better than the best LED TVs for old-school gaming, and a 4K TV might not be detailed enough for some technical uses, but if you’re talking about the needs and desires of standard, living-room-dwelling viewers, current TV technology is all but perfect. Here are some reasons why:
The limits of vision
A standard 65-inch 4K television delivers a resolution of 3,840 x 2,160 pixels, a density high enough that individual pixels are invisible to a typical viewer sitting at a reasonable distance from the screen. You can buy an 8K TV (7,680 pixels wide by 4,320 pixels tall), but those extra pixels won’t make the picture look clearer or more highly defined in any practical way; they’ll only add detail that you physically can’t see from your couch. At normal viewing distances, even 4K screens are arguably overkill.
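If you want to sanity-check that claim, the math is simple. Here’s a rough back-of-the-envelope calculation (a sketch, not an optometry textbook), assuming the common rule of thumb that 20/20 vision resolves detail down to about one arcminute:

```python
import math

# Back-of-the-envelope check: at what distance do the pixels of a
# 65-inch 4K panel shrink below one arcminute of vision?
diagonal_in = 65
horizontal_px = 3840

# Width of a 16:9 screen from its diagonal: w = d * 16 / sqrt(16^2 + 9^2)
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~56.7 inches
pixel_pitch_in = width_in / horizontal_px          # ~0.015 inches per pixel

# Distance at which one pixel subtends less than one arcminute (1/60 degree)
one_arcmin = math.radians(1 / 60)
distance_in = pixel_pitch_in / math.tan(one_arcmin)

print(f"Pixels blend together beyond ~{distance_in / 12:.1f} feet")
```

By this estimate, the pixels merge into a seamless image a little beyond four feet; sit any farther back, and 4K already delivers more detail than your eyes can use.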
Then, there’s the question of size. TVs could always get bigger, but there’s a point where extra inches stop adding value to the experience of watching. The Society of Motion Picture and Television Engineers has determined that the best viewing experience for most people comes from sitting at a distance where the screen fills about 30 degrees of your field of view. That’s about 8.5 feet away for a 65-inch TV, more than adequate for most living rooms, and even if it isn’t, commercially available televisions go up to 115 inches, which is big enough for all but a cathedral-sized rec room.
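That number is easy to check with the same kind of quick geometry, assuming a flat 16:9 panel (a simplification, not SMPTE’s official formula):

```python
import math

# Rough take on the SMPTE guideline: sit where a 65-inch, 16:9 screen
# fills about 30 degrees of your horizontal field of view.
diagonal_in = 65
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~56.7 inches wide

# The screen spans 30 degrees when half its width subtends 15 degrees
viewing_distance_in = (width_in / 2) / math.tan(math.radians(15))

print(f"Ideal viewing distance: ~{viewing_distance_in / 12:.1f} feet")
```

That works out to roughly 8.8 feet; slightly different simplifications of the same guideline land on the ~8.5 feet cited above.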
The limits of light, color, and comfort
Contrast, the difference in brightness between the darkest blacks and the brightest whites a screen can display, helps determine how vivid and detailed an image looks. OLED TVs have an effectively infinite contrast ratio: Each pixel in an OLED TV is its own light source, so when a pixel is told to be black, it is literally off, and it doesn’t get blacker than that. In terms of color, modern OLED TVs can reproduce 98 to 100% of the color gamut movies and TV shows are mastered in, so what you see on screen is essentially all the color there is in the source material. While other display types don’t have OLED’s infinite contrast, they get pretty close: Some mini-LED TVs have a contrast ratio as high as 10,000,000:1.
TVs are also brighter than ever. Displays designed for outdoor use are bright enough to be watchable in full sunlight, and their peak HDR brightness of 1,400 or so nits is far brighter than the 250 nits of a typical screen viewed indoors, which is already more than bright enough to be comfortable in your living room.
The limits of content
As for what we watch on TV, if you define the perfect TV as “the ability to watch anything I want, whenever I want,” we’re practically there. Viewers used to have a scarcity problem: You’d watch whatever happened to be on one of three channels, and you’d like it. Our problem now is abundance. We’re overwhelmed with content; there are millions of instantly available things to stream on your TV, from shows to movies to YouTube videos. Programming spread over thousands of channels and dozens of paid and free streaming services is messy, but almost every film or TV show ever produced is available somewhere, although it might take a little work (and monthly subscription fees) to find it.
What’s next for TV?
Consumer demand for bigger screens and higher-quality displays has essentially driven the industry for the last 80 years, so what happens now that the race is almost over and we can all watch whatever we want on an all-but-perfect TV? A marketing person might answer that TV makers will create reasons for people to want new TVs by expanding what a TV actually is. You can see this happening with things like Samsung’s The Wall or Sony’s Crystal LED: systems that let you cover an entire wall with seamless TV panels (if you have a spare $100,000 sitting around).
But do people really want a TV wall enough to buy one, assuming they become more affordable? Some would, sure, but a wall-sized screen wouldn’t really make sitting on the couch watching TV better for most of us. A more down-to-earth potential future for TVs is represented by Samsung’s The Frame, a “lifestyle TV” designed to turn your screen into a gallery of digital art when you’re not watching Netflix. It’s cool, but if it doesn’t improve the experience of watching Pluribus, I’m not rushing out to replace my TV.
When “big TV” tries to create a desire for TVs that do something other than just work like TVs, the results haven’t always panned out. Back in 2010, perhaps sensing the need for a “gotta have it” feature, the industry rolled out the first 3D TVs. Despite years of hyping the technology as the next big thing, consumers didn’t bite, and by 2017, 3D TV was dead. It was cool, but not cool enough to justify buying a new TV when people just wanted to watch Game of Thrones. Another example: the “screenless screen” represented by AR/VR devices like the Apple Vision Pro or Meta Quest 3. It’s too early to say for sure, but these much-hyped devices seem to be meeting with a lukewarm consumer response as well.
The one way your TV isn’t perfect
Don’t get too smug about your perfect TV, though, as it’s probably going to break soon. The profitability of the TV industry depends on a lot of people buying new TVs every few years, so your 65-inch Samsung isn’t designed to last as long as the clunky CRTs of yore. Older sets were fairly simple machines that could run for decades (if Elvis didn’t shoot them), but modern flat panels are packed with LEDs that dim and LCD panels that flicker out. Maybe more importantly, almost all new TVs are smart TVs, which introduces new paths to obsolescence: Manufacturers could stop updating your TV’s operating system, and streaming services could drop support, too. Even if the display still works, you might find navigating your TV such a slow, cumbersome, frustrating experience that you’ll go out and pick up a new one far earlier than you otherwise would.
There’s also the matter of privacy: These TVs are constantly watching what we do, collecting our data whenever they’re connected to the internet. It’s part of why TVs don’t cost as much up front: You’re subsidizing the price with your data. Disconnecting a smart TV from the internet helps, but many streaming devices aren’t much better, so you need to choose wisely. Choosing the right one, however, can extend the life of an old, otherwise functioning TV, at least until the hardware gives out.
The TVs we have today are brilliant, cheap, and enormous, but they’re also designed for a world where replacing your screen every five to seven years is normal, even if a “better” set doesn’t necessarily exist.
