4K is just a pipeline; HDR is what matters, along with Dolby Vision (DV) and HDR10 (although HDR10 will likely go extinct in favor of DV). True, if you don't have a 4K feed you can't have HDR or DV, but picture quality is better measured by HDR or DV support, since 4K by itself doesn't really define image quality. Bit depth, HDR10's 10-bit vs. DV's 12-bit, determines the colors presented on your TV screen: DV can address about 68 billion colors vs. HDR10's roughly 1 billion, and the bandwidth required is much greater (HDMI 2.0 maxes out at 18 Gbps; HDMI 2.1 raises that to 48 Gbps). Peak luminance, measured in nits, also climbs: DV is mastered on a curve that goes up to an incredible 10,000 nits, though content today is mastered at a 4,000-nit max, and no consumer TV can produce much beyond 4,000 nits anyway.

Without delving too deep: OLEDs are still the best possible picture available, and they accomplish it at a lower peak level (~1,000 nits); HDR content can work at as little as 500 nits. Overall brightness is a bit of a farce anyway. If you've ever looked at light reflecting off water, it's often too bright to look at, which is why you wear polarized sunglasses! Imagine a TV image so bright you had to wear sunglasses to watch it; that's what too high a nit level would get you.
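If you want to see where those color counts and the 10,000-nit ceiling come from, here's a quick Python sketch of my own (nothing official, just back-of-envelope): the color math is levels-per-channel cubed, and both HDR10 and DV ride on the SMPTE ST 2084 "PQ" transfer curve, which is defined up to exactly 10,000 nits.

```python
# Back-of-envelope math for bit depth and PQ peak luminance.

def addressable_colors(bits_per_channel: int) -> int:
    """Total colors = (levels per channel) ^ 3, one factor each for R, G, B."""
    return (2 ** bits_per_channel) ** 3

print(f"10-bit (HDR10):        {addressable_colors(10):,} colors")   # ~1.07 billion
print(f"12-bit (Dolby Vision): {addressable_colors(12):,} colors")   # ~68.7 billion

# SMPTE ST 2084 ("PQ") EOTF constants -- the transfer function both
# HDR10 and Dolby Vision use to map signal values to absolute light.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf_nits(e: float) -> float:
    """Map a normalized PQ code value e in [0, 1] to luminance in nits."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for e in (0.5, 0.75, 1.0):
    print(f"PQ signal {e:4.2f} -> {pq_eotf_nits(e):7.1f} nits")
```

A fun detail that falls out of the curve: a 50% signal is only about 92 nits, roughly SDR reference white, so the entire top half of the PQ scale is reserved for highlights. That's also why an OLED at ~1,000 nits (about a 75% PQ signal) still covers the range most content is actually graded to.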
Decent article on HDR10 vs Dolby Vision and what it all means:
www.pocket-lint.com
Why does the bit rate matter? If you are a geek like me, it matters a lot! Going from a 4K feed that carries HDR within HDMI 2.0's 18 Gbps (the usable payload is actually lower) to 12-bit Dolby Vision on a 48 Gbps HDMI 2.1 link can result in a greatly improved image; the rough math is below.
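Here's the back-of-envelope version (my own sketch; it ignores blanking intervals and link-encoding overhead, so real line rates run higher than these figures):

```python
# Rough uncompressed bandwidth for a 4K60 RGB signal at various bit depths.
# Real HDMI links also carry blanking intervals plus encoding overhead
# (TMDS on HDMI 2.0, FRL on 2.1), so treat these as ballpark numbers.

def video_gbps(width: int, height: int, fps: int, bits_per_channel: int) -> float:
    """Active pixel data rate in Gbps for an RGB (3-channel) signal."""
    return width * height * fps * bits_per_channel * 3 / 1e9

for bits in (8, 10, 12):
    print(f"4K60 RGB @ {bits:2d}-bit: ~{video_gbps(3840, 2160, 60, bits):4.1f} Gbps")
# 8-bit:  ~11.9 Gbps
# 10-bit: ~14.9 Gbps
# 12-bit: ~17.9 Gbps
```

So 12-bit 4K60 needs ~18 Gbps for the pixel data alone, and HDMI 2.0's 18 Gbps figure includes TMDS 8b/10b encoding overhead (the usable payload is about 14.4 Gbps), so it simply doesn't fit. That headroom is what HDMI 2.1's 48 Gbps buys you.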
Since I'm already geeking out, read this article on your HDMI cables and where your setup should be.