Tech Defined: UHD vs 4K


Hey everyone. So we are into 2014, and as I mentioned coming into the holidays, 4K and UHD were going to be the big buzzwords heading into CES 2014 in Las Vegas. With the trade show in full effect for its last day today, more than 300 devices sporting 4K displays have been on exhibition. From TVs to monitors to laptop displays, 4K and UHD (Ultra HD, or Ultra High Definition) is everywhere. Unfortunately, with so many devices and articles about 4K and UHD out there, the two terms have become so conflated that many times, even tech writers who know the difference have begun using them interchangeably. The problem is, UHD and 4K are NOT THE SAME THING. So here is some knowledge to help you understand the difference.

Before I break down the differences, keep in mind that, as a consumer, there is no PRACTICAL difference between 4K and UHD. That said, the term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4096×2160, exactly double the previous standard for digital editing and projection (2K, or 2048×1080) in each dimension. As you can see, 4K refers to the fact that the horizontal resolution (4096 pixels) is just over four thousand. The 4K standard is not just a resolution, either: it also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG 2000, can have a bitrate of up to 250 megabits per second (Mbps), and uses 12-bit 4:4:4 color.
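If you like seeing the math, here's a quick Python sketch of that doubling (just arithmetic on the resolutions above, nothing official):

```python
# DCI cinema resolutions: 4K doubles 2K in each dimension,
# which works out to four times as many total pixels.
dci_2k = (2048, 1080)
dci_4k = (4096, 2160)

assert dci_4k[0] == 2 * dci_2k[0]  # width doubles
assert dci_4k[1] == 2 * dci_2k[1]  # height doubles

pixels_2k = dci_2k[0] * dci_2k[1]
pixels_4k = dci_4k[0] * dci_4k[1]
print(pixels_4k // pixels_2k)  # prints 4: four times the pixels
```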


Ultra High Definition, or UHD for short, is the next step up from Full HD — the official name (that no one uses) for the display resolution of 1920×1080. UHD doubles that resolution to 3840×2160. It does not take a genius to see that 3840 is actually quite far away from four thousand. Almost every TV or monitor you see advertised as “4K” is actually UHD. There are some panels out there that are 4096×2160 (aspect ratio 1.9:1), but the vast majority are 3840×2160 (1.78:1). If you displayed true 4K content on one of these “4K” displays, you would get letterboxing: black bars along the top and bottom of the screen, because the 4K image is wider than the panel. There isn't yet a specification for how UHD content is encoded (which is one of the reasons there's almost no UHD content in existence), but it's unlikely to match the quality of DCI 4K.
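For the curious, here's where those aspect ratios and black bars come from (a Python sketch of standard scale-to-fit arithmetic, using the resolutions above):

```python
uhd = (3840, 2160)     # what most "4K" TVs actually are, ~1.78:1
dci_4k = (4096, 2160)  # true DCI 4K, ~1.9:1

print(round(dci_4k[0] / dci_4k[1], 2))  # 1.9
print(round(uhd[0] / uhd[1], 2))        # 1.78

# Fit true 4K content onto a UHD panel: shrink it to match the
# panel's width, then see how much height the picture fills.
scale = uhd[0] / dci_4k[0]          # 0.9375
scaled_height = dci_4k[1] * scale   # 2025 of the panel's 2160 rows
bar = (uhd[1] - scaled_height) / 2  # black bar top and bottom
print(bar)  # 67.5 pixels of letterbox on each edge
```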

So there you have it. UHD and 4K are just not the same thing. It's interesting knowledge, but not essential. The truth is, the TV industry has decided 4K sounds great, so that's what it's using. And in all honesty, 4K is just a name. It communicates what the industry wants: a much higher-end display than the current generation of 1080p displays. But at least now, when that salesman at the local electronics shop or Best Buy tries to talk to you about a new 4K TV, you can set him right and let him know you aren't the average consumer: you are a Rethink Associates customer.

Jumping Into HD: The "Soap Opera" Effect


Do movies look weird on your new TV? Does everything have a hyper-real, ultra-smooth motion to it? Are you sure something is going on with the TV's image that you don't like, but you can't figure out what? Everything looks really, really vivid and sharp, but almost fake? Like you're watching National Geographic or something. Or the BBC?

Chances are, what you're seeing is called the "Soap Opera Effect," about as descriptive a moniker as we get in tech, in that this feature makes everything on your TV look like a cheap soap opera. It's actually a feature of most new TVs called "Motion Smoothing" or "Motion Interpolation."

That effect that unnerves you is a common issue with 120Hz and 240Hz HDTVs. Sets with refresh rates higher than 60Hz are designed to make action movies and fast-moving sports programming look much smoother, and sports watchers were one of the key, most influential target groups HDTV was developed for. 120Hz sets did an amazing job with that effect. The leap to 240Hz is supposedly a further improvement, but the trade-off is that it can make the picture look fake, or simply too good to be real, because our brains have been trained to expect movies to look different from TV shows.


The impact of the effect also varies depending on the content you're watching. Movies tend to be produced at 24 frames per second, while TV shows are shot and produced at 30 or 60 frames per second. That may not seem like much of a difference, but you start to notice it once you ramp up the refresh rate on your television. A 240Hz television creates extra frames to smooth out the action, but it's only taking an educated guess, and sometimes the results look worse rather than better. You don't tend to experience this problem with plasma TVs. Another thing to note, though, is that some movies and shows are now intentionally shot in HFR, or High Frame Rate. The Hobbit: An Unexpected Journey, for example, was filmed and projected at 48 frames per second as opposed to the traditional 24.
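To put rough numbers on that educated guesswork, here's a little Python sketch (assuming the set simply fills the gaps between real frames, which is a simplification of what interpolation actually does):

```python
panel_hz = 240  # the panel redraws 240 times per second

for source_fps in (24, 30, 60):
    refreshes_per_frame = panel_hz // source_fps
    invented = refreshes_per_frame - 1  # frames the TV has to make up
    print(f"{source_fps} fps source: {invented} invented frames per real frame")
# 24 fps film leaves the TV 9 frames to invent for every real one,
# while 60 fps video only needs 3, which is why film suffers most.
```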

If you're experiencing the effect on your television, try going into your video settings and dialing down or shutting off the motion interpolation. Every TV manufacturer has a different name for it: Sony calls it MotionFlow, Samsung calls it Motion Plus, and LG calls it TruMotion. If you're watching a show or movie and you start to notice the unreal effect, try changing these settings. Most televisions should remember individual picture settings for each HDMI input, so you can change it for your game console or Blu-ray player without affecting your cable DVR, or vice versa.

Hope that helps, folks!