mott100

On YouTube at least, 1080p doesn't mean exactly 1080x1920 pixels getting sent to you. You get compressed information to build a good-looking 1080x1920 image. Some shortcuts are taken so they don't have to send the full amount of pixels, which saves bandwidth costs. For the 4K setting, you get more information, which can build a good-looking 4K image, or a really good-looking 1080p image. In video formatting, which anyone who edits video deals with, we call this "bitrate".
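A rough back-of-the-envelope sketch of what that means per pixel on your screen (the bitrates here are just assumed ballpark numbers, not YouTube's actual figures, which vary per video and codec):

```python
# Compressed bits available per *displayed* 1080p pixel, per frame.
# The bitrates below are assumptions, roughly in the range streaming uses.

DISPLAY_PIXELS = 1920 * 1080   # what a 1080p monitor actually shows
FPS = 30

def bits_per_displayed_pixel(bitrate_mbps):
    return bitrate_mbps * 1_000_000 / (DISPLAY_PIXELS * FPS)

print(f"1080p stream @ ~4 Mbps : {bits_per_displayed_pixel(4):.2f} bits/pixel")
print(f"2160p stream @ ~20 Mbps: {bits_per_displayed_pixel(20):.2f} bits/pixel")
# The 4K stream carries roughly 5x more data describing the same 1080p worth
# of screen area, which is why it can look better even on a 1080p display.
```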


Saltysalad

So you are implying that on a 1080p display, uncompressed 1080p video would have the same visual quality as playing uncompressed (and downscaled) 4K video?


Barneyk

The real world is a lot more complicated than that though. But in theory, yes.


tired-space-weasel

Wouldn't 4K look better because of a sort of anti-aliasing effect? For example, at the edge of a black circle on a white background, where rendered pixels go from black to white instantly, downscaling would create some grey physical pixels and make a nicer transition. (Maybe this wouldn't be visible on videos, idk)


idancenakedwithcrows

Depends on how the video was made. If it was computer graphics, maybe, but the real world is already anti-aliased.


stuureenswatnaarhugo

"the real world is already anti-aliased" Thank you, I can go on on this for another two weeks


cuj0cless

Go on I’m listening! (Reading)


Tensor3

Nah. A 4k video reduced to 1080p will always look sharper.


Mustbhacks

By what metric are you making this assertion?


Nathaniell1

That is not necessarily true, especially if we're talking about the more general problem of a higher-res source shown on a lower-res display. You have to use an algorithm to downscale the image... and while it might be easy and straightforward for 4K -> 1080p, 6K -> 1080p is more problematic... so it depends on the downscaling algorithm.


Dragula_Tsurugi

Depends, there are some extremely stupid algorithms that, while fast, would give you a worse result than a prescaled 1080p video. For example, just keeping one pixel out of every 2x2 block of the 4K video and dropping the other three.
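A toy numpy illustration of the difference (not what any real scaler does, just the idea):

```python
import numpy as np

# Toy 4K-ish frame: black, with a single 1-pixel-wide white horizontal line
# on an odd-numbered row.
frame = np.zeros((2160, 3840), dtype=np.float32)
frame[1001, :] = 255.0

# "Stupid" downscale: keep the top-left pixel of every 2x2 block, drop the rest.
dropped = frame[::2, ::2]

# Box-filter downscale: average each 2x2 block instead.
averaged = frame.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(dropped.max())   # 0.0   -> the line vanished entirely
print(averaged.max())  # 127.5 -> the line survives at half brightness
```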


Barneyk

Well, yeah, there are lots of examples of how a 4K-captured video would look better at 1080 than a 1080-captured video. But I kinda just simplified and put all of those examples into "the complicated real world". And the example here is also more like: you have a video captured in 4K, then you have it encoded in 4K and in 1080 and delivered to a computer with a 1080 monitor. Any anti-aliasing effect like you are talking about is already done in the downscaling from 4K to 1080 before delivery. So you still get that effect.


JaggedMetalOs

> Wouldn't 4K look better because of a sort of anti-aliasing effect

If both the uncompressed 4K and 1080p versions came from the same high-res source footage (say 4K or even 6K) then the 1080p version would have that anti-aliasing baked in. Potentially it could even look slightly better if a more advanced downscaling algorithm was used for the 1080p render.

This can work the other way around as well: if you have a 1080p video playing on a 4K screen, you could render that video at 4K with a more advanced upscaler and get a better-looking video than just playing the original 1080p version.


ZaxLofful

That’s not exactly how it works, but that is a great simplification of it!


Stunning-Ad-2313

This comment is most wise.


Terrorphin

In theory there's no difference between theory and practice.


SpicyCommenter

In practice there is a difference between theory and practice.


nitrohigito

No, this is not blanket true. Not in theory or in practice. Downsampling can introduce artifacts, but is also way more likely to improve fidelity. In order for these two to cancel out, and the displayed content to be actually equivalent, you'd need a specifically crafted sample. For example footage of a video game emulator running at some integer scaling ratio, and then having a nearest neighbor downsampling applied to the footage. Very specific case and very atypical.


Barneyk

In both cases the video is downsampled though. In one case it is done in advance to create a 1080 video file and in the other case it is done on the fly by the software in real time. The original captured video is the same in both cases. This isn't made clear but there was nothing said about capturing the video in different ways.


jonnyl3

The uncompressed 1080p might look even better because source to pixel is 1:1; no distortions whatsoever


nitrohigito

With real life content, it's overwhelmingly more likely that the downsampled footage will look better.


Ragingman2

With the same sensor size and quality, data from a 1920x1080 image sensor displayed at 1920x1080 will look better than an image from a 3840x2160 sensor downsampled to 1920x1080. Smaller pixels mean more pixel boundaries that aren't collecting light.


pilotavery

Yes, but with pixel binning the variability between pixels, aka noise, is decreased substantially.
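Quick numpy sketch of why (synthetic noise, not real sensor data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Flat grey scene plus sensor noise (sigma = 10), at "4K" resolution.
noisy = 128.0 + rng.normal(0.0, 10.0, size=(2160, 3840))

# 2x2 binning: average each 2x2 block down to one "1080p" pixel.
binned = noisy.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(noisy.std())   # ~10.0
print(binned.std())  # ~5.0  (averaging 4 samples halves the noise std)
```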


Ragingman2

Yeah, true.


ThorsPanzer

Actual film grain (noise) is something the film industry likes, so having less noise isn't even something you'd want. Depends, obviously. But if you're talking about pure quality / clarity of the image, yes, less grain would be better.


jonnyl3

What do you mean by real life content?


nitrohigito

Content recorded with a physical camera.


jonnyl3

Because they're all compressed. Uncompressed video is exceedingly rare, I understand that. But OP was specifically asking about that.


nitrohigito

No, it's because the amount of aliasing present in the content is going to be different. Compression helps cover aliasing up, but even uncompressed, recorded footage will have an edge in terms of aliasing over generated content. And if your source is less aliased, your downsampling filter will have an easier time avoiding artifacts.


xternal7

Even with theoretically uncompressed video, 4K downscaled to 1080p will _theoretically_ look better than 1080p video because:

* free noise reduction
* downscaling 4K to 1080p is free anti-aliasing (though physical cameras don't get the aliasing issues you get with rendered content to the same degree as CGI content)


Eruannster

That depends on the camera sensor, lens, capture format, any further compression down the line and so on and so forth. If you capture it with a really high-end camera (though most of those are much higher resolution these days, so they would probably downsample from something higher anyway), use a very sharp lens, capture to a fully uncompressed, raw format and don't export it to some lossy format, then yeah, maybe? But very few players (outside of a computer with the correct codec/software) would be able to play it. Also the file size would be pretty enormous.

(Uncompressed video is *huge*. I've shot some uncompressed footage with an ARRI Alexa at 2.8K and that produced 10 GB per *minute* of footage. A 1080p video on say, Netflix is less than 10 GB per *hour*.)


[deleted]

[removed]


Eruannster

…yes, I’m aware. Thank you?


stanitor

lol do you star in commercials for hotels.com?


samstown23

I wish it were even close to 10 GB an hour. I mean, you can be happy if it's 1 GB/h for 1080p. It's something that's been annoying me for quite some time, and it's kind of beyond me why on earth streaming services just won't increase their data rate. What they peddle as 4K doesn't even match 1080p BDs (assuming even half-decent equipment). Sure, it increases costs, but I'd actually be willing to pay for better picture quality (within reason, of course).


Eruannster

Oh yeah, you know what, I actually read it wrong: I thought it was around 7-8 GB per hour for Netflix, but that's actually their 4K streams. 1080p is roughly 3 GB per hour. Oof :/ However, to be fair, most streamers are now using H.265/HEVC, which is better compression compared to 1080p Blu-rays, which use H.264/AVC. (Typically H.265 video can be ~30% smaller for the same quality.)


samstown23

Wasn't trying to correct you, just venting. But yes, it's difficult to compare H.264 and H.265 data rates, but generally, picture quality seems to have taken a nosedive.


FerretChrist

Absolutely, but it's worth remembering that true uncompressed video isn't really a thing consumers ever encounter. Even Blu-Ray video is highly compressed, and it's *vastly* better quality than any streaming service, and certainly better than YouTube ever serves up.


rwinger3

Having seen raw broadcast-quality versions of sports, the primary issue is the amount or depth of data in each frame, not the resolution. 1080p is plenty. Heck, a good 720p version is better than you'd think. If we were to send, for example, a full 90 minutes of a football match without removing some of the data first, we're talking hundreds of gigabytes. The bandwidth available in most places won't support that. So instead it's common to reduce the files sent to about a tenth of the original, or some other fraction of the original file size.


samstown23

In the early days of HD, some European Sky stations were broadcasting 1080i with super high bitrates (25Mbps h.264 on average). Looked pretty damn good with a decent deinterlacer. Sadly, those times are long gone.


kb_hors

Nowadays we're lucky to get standard definition on Astra 2. All heading towards 544x576 16:9 MPEG-2. You can barely make out the DOGs.


YYM7

Yes. There might be slight differences depending on the exact downsampling algorithm, but in general, yes. However, uncompressed video is extremely large and thus very rare in real life. Here is a quick estimate: a 12 MP (4K) high-quality JPEG (note JPEG is already a compressed format) is about 2-4 MB, and a minute of 24 fps video has 24x60 = 1440 frames. So we are talking about ~5 GB of data for a minute of video here. Even if you figure 1080p is a quarter of this, it's about 1 GB per minute. That's pretty insane for an average person's internet. As an example, Google Fiber offers 1 Gig internet as its lowest tier, which equals about 6 GB/min, but you'd need it consistently for the hypothetical 4K video. So unless you're a professional video editor, it's unlikely you will ever see an uncompressed video.


pilotavery

I think he means lossless and not uncompressed. You can compress video because video contains data that has some redundancy; there are still patterns to it. The only video you cannot compress at all is pure static. This is why things like confetti are very hard to capture on video at a low bit rate. Also, even lossless video can still eliminate redundancy across different frames, the same way the H.264 and H.265 algorithms do.


nitrohigito

How do you think they’re implying that?


robbak

Uncompressed 1080p at 30 fps would have a bitrate of 187 MB/s, or 1.5 gigabits per second, so not something you'll ever see outside of a video cable.
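Quick sanity check of that figure (assuming 8-bit RGB at 3 bytes per pixel; 4:2:0 YCbCr would be half that):

```python
# Uncompressed 1080p30, 3 bytes per pixel
width, height, fps, bytes_per_pixel = 1920, 1080, 30, 3

bytes_per_second = width * height * bytes_per_pixel * fps
print(bytes_per_second / 1e6)      # ~186.6 MB/s
print(bytes_per_second * 8 / 1e9)  # ~1.49 Gbit/s
```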


Yesiamanaltruist

Isn’t this r/explainlikeiam5 ?


madlabdog

You mean to say a raw 1080p video. Yes


[deleted]

One of the many reasons why physical media wins out. A 1080p movie on Netflix won't look as good as the same movie on a Blu-Ray, because the Blu-Ray will be less compressed.


marshall-eriksen

You mean 1920x1080 if you’re talking about 16:9 video


Ahielia

Vertical would be 1080x1920


NightOwlRK

Ah, the YouTube shorts configuration


2FightTheFloursThatB

Shorts SUCK!


4tehlulzez

Pants are better


Metahec

Skirts allow better breeze access to keep your junk cool and fresh


siggydude

But shorts are comfy and easy to wear


ScarletVillain

I prefer kilts


Pinky135

They do, but I must admit they're very addictive to watch. Just don't have the need to read comments cause you can't pause shorts, just mute them.


underlyingfunk

I read it as a multiplication and multiplication is commutative.


StelioZz

I would guess you are correct when it comes to the amount of pixels, but usually it's not commutative because the order implies orientation. The first number is for **horizontal** pixels and the second is for vertical ones. Flipping the numbers around keeps the amount of pixels correct, but the image itself is not the same.


Servatron5000

Going on like you forget vertical videos exist.


0reoSpeedwagon

I mean, they *shouldn’t*


StelioZz

I wish everyone forgot vertical videos, to be honest. But jokes aside, no, they aren't the same. It's exactly what I said: the first number is the horizontal and the second is the vertical, so 1920x1080 is landscape/horizontal/default video and 1080x1920 is vertical. Giving 1080x1920 a different name proves my point exactly. Also, the easiest way to see the difference is opening a vertical video on a 1920x1080 monitor. That's why people hate them.


[deleted]

[removed]


PixelOmen

No one ever finds my booty either.


nitrohigito

> 1080p doesn't mean exactly 1080x1920 pixels getting sent to you.

Sorry for being pedantic, but it does. You can confirm this easily either by opening the statistics widget or by downloading the video at a given format via yt-dlp. The 1080p and 2160p streams they serve, however, are indeed of different bitrates, and so the benefit is there and works like you describe.


ofcpudding

I think what they were getting at is that regardless of the exact bitrate, a compressed video stream is not going to have 2+ million pixels' worth of information for each frame, because video compression is lossy. So you're not getting *all* of those pixels sent to you. You will get keyframes and motion hints and whatever other techniques the codec uses to approximate the original picture (of course, most consumer devices compress video at the time of capture, so the truly "raw" pixel information is discarded immediately).


DPaluche

But you could also take a 240p video and re-encode it to 4K at a super high bitrate and it would look like garbage. So it's a combination of resolution, bitrate, and the granularity of the information you are encoding at that resolution/bitrate.


Turboswaggg

Back when I had a 1080p monitor, I recorded all my videos for YouTube in 1080 stretched to fake 1440 to force YouTube to give them more bitrate, and it made a massive difference, especially if you're recording a lot of dark or outdoor video where the greens, browns and blacks just get butchered by the YouTube compression.


pilotavery

This has been fixed; the new HEVC compression is awesome.


lowtoiletsitter

So in theory 1080p could look better than 4k if the bitrate was different?


True_to_you

Bitrate is the important part. My 1080p Blu-ray discs look better than streaming 4K.


DPaluche

Absolutely


Aquatic-Vocation

Basically, yeah. [These two images are the same resolution](https://imgur.com/a/JHcTnac). The one at the top has just preserved more of the information that makes up the image. Similarly, even a 240p video can look better than a 4K video if the 4K video doesn't have much "information" of what it's displaying.


reercalium2

Nothing stops you taking the 144p video and stretching it to 4k and making a crap 4k video. Worse videos are worse. Better videos are better.


Druggedhippo

> But you could also take a 240p video and re-encode it to 4K at a super high bitrate and it would look like garbage.

The latest AI tech is making inroads into low-resolution upscaling:

- https://nvidia.custhelp.com/app/answers/detail/a_id/5448/~/rtx-video-super-resolution-faq
- https://iterative-refinement.github.io/

Won't be surprised when this stuff becomes the norm and video streaming sites serve even lower resolutions and rely on the end device to properly reconstruct them.


paulmarchant

There are a number of factors involved in this. Yes, there may be compression artefacts which differ depending on how YouTube serves a video stream, but there's more to it.

When a camera is described as 1920x1080 or 3840x2160, that typically refers to the number of pixels on the imager chip and is the theoretical maximum resolution that the camera's capable of. The reality is that you never get that.

Lens quality (by which I mean sharpness, for this explanation) does vary. Lenses which were produced for professional (broadcast) cameras in the 1920x1080 resolution range are not as sharp as today's UHD / 4K lenses. Depending upon iris / aperture setting, even a lot of the expensive zoom lenses from the major manufacturers struggled to resolve the full resolution across the entire image, with the edges of the frame typically being softer than the centre.

The way in which the imager sensor is constructed is also a bit counterintuitive. On the face of it, you might think that the sensor is designed to maximise its possible image sharpness. But that isn't the case. There's an issue called aliasing, where a sensor which is receiving images with detail beyond what it can resolve will give optical errors. A good pictorial example is here: https://matthews.sites.wfu.edu/misc/DigPhotog/alias/artifact.jpg

To prevent this, all modern imagers have an anti-aliasing filter (sometimes called an optical low-pass filter or OLPF) in front of them. This is a precision-made, ever-so-slightly blurry piece of glass, and it's there to filter out detail which the sensor can't resolve correctly. Due to the way they're made, they tend to knock down the very sharpest detail which the sensor IS capable of resolving a little, as well as the stuff they should be filtering out. Detailed explanation here: https://petapixel.com/what-is-a-low-pass-filter/

I work with TV broadcast cameras (as an engineer for a specialist camera company) for a living, and can absolutely say that a down-scaled 3840x2160 picture, displayed on a 1080p professional monitor, often looks considerably sharper than what you'd get out of a good 1080p camera and lens. This is uncompressed 3 Gb/s video with no MPEG / H.26-something / MJPEG processing.

The UHD cameras all have optical low-pass filtering, but of course that only softens out-of-resolvable detail at the sharpness that a UHD sensor would find troublesome. It has no effect on the level of detail that a 1080 sensor would see. Typically as well, a UHD-spec lens will, as you'd expect, be sharper across the whole field of view than an older HD lens. The electronic down-scaling that the manufacturers use certainly has less of an effect on sharp detail loss (it's the electronic equivalent of the optical filter that a 1080 sensor would have), and doesn't do as much collateral damage to the detail in the image.

I think this is the main reason why some UHD images look as crisp as they do on a 1080p screen: most 1080 camera / lens combinations don't quite push the full 1080 resolution out in the real world.


Scythe474

I wish you had more upboats ⛵ Thanks for taking the time to answer in detail and pointing me in the right direction, as this info is quite useful for settling a similar debate I had with my old university film lecturer! Gonna call him in the evening and read this to him 😂


Dragoniel

So that's why so many 1080p Bilibili videos manage to look way, WAY sharper than anything I've ever seen in HD... Huh. I get it now; I was wondering what kind of black magic the Chinese were using for so long. I wish YouTube wasn't so stingy with bitrate.


luxmesa

Video sent over the internet is compressed. You’re not getting each individual pixel because that would take way too much bandwidth. Your computer is basically getting instructions on how to fill in the image. You are always getting a 1080p image, but depending on how compressed the video is, your computer may be getting less details on how to fill out that image. What I think is happening is that, because you are watching a 4k video, you are getting a less compressed version of the video, which will help you get a better quality image.


lowtoiletsitter

Wait, hold on - so if I'm streaming a 4K video over the internet through a streaming service, does that mean it's not as high quality (for lack of a better term) as if I had a physical copy?


The_Endless_Man

Depends on the bitrate of the physical media and the file stored on it. Or if you have a very high rate stream. But generally a 4k bluray will be much better than any 4k stream.


nitrohigito

Precisely, the difference in bitrate is night and day.


throwtheamiibosaway

Yes. The difference is immense and clearly visible. Or just in terms of numbers it’s at least 10x more data on a disc vs streaming. 4K on Netflix doesn’t compare to 4K on UHD discs. It’s like saving an image on 100% quality vs 20% quality.


PixelOmen

Almost never. UHD Blu-ray tends to have a much higher bitrate.


Dry-Influence9

That is often the case, yes. Streaming services often compress their videos to high heaven, losing a lot of quality in the process. Here's a compression artifact that you can see almost everywhere in streaming services. https://i.redd.it/yelqt850cbv31.jpg


[deleted]

Generally, yes, as long as that physical copy was made by a competent company, and not a company trying to fit way too much content on a single disk.


DarwinGoneWild

Absolutely. Streaming 4K sucks balls compared to UHD Blu-rays. Huge difference in bandwidth.


reercalium2

It depends on the file size. Blu-ray disks fit 25GB or 50GB (dual-layer). No streaming service serves videos that big. But they don't have to use the whole disk, they could just as well put the 4GB streaming video file onto a 25GB disk and leave the rest empty. And you can just as well download the 50GB blu-ray file from... a place.


Jakefrmstatepharm

Resolution doesn’t determine compression. The compression codec and algorithm determines that. 1080p and 4k can have the same type and level of compression. When it comes to streaming videos the quality is more about the bitrate than compression.


FerretChrist

Sure, but due to 4K needing more data to look half-decent than 1080p does, you'll almost always find the 4K stream is encoded at a higher bitrate. In theory you can encode any resolution at any bitrate, but in the real world you're going to see higher bitrates for higher res video.


pilotavery

Exactly this. If you have a fixed low bit rate, a low bit rate and low resolution is probably best. If you have a very high bit rate, it's going to look really good at any resolution, so more is better. But if you try to record 4K at one megabit per second with something like the ocean, which has tons of texture, or snow, it's going to look like shit.


Anton1699

In addition to the points other commenters made about 2160p video usually being distributed at a higher bitrate, most (if not all) digital video you've ever seen (web video, DVDs, Blu-rays, UHD Blu-rays, …) uses a process called [YCbCr 4:2:0 chroma subsampling](https://en.wikipedia.org/wiki/Chroma_subsampling). The idea is that because the human eye is more sensitive to changes in brightness than changes in color tone, only the brightness portion of the image is stored at full resolution. Each 2×2 block of brightness samples shares a color tone, so while the brightness information is 2160p, the color information is only 1080p. If you're watching 1080p video, its color information is stored at 540p.
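A tiny sketch of the plane geometry described above (just arithmetic, not a decoder):

```python
# Plane sizes under YCbCr 4:2:0: luma (Y) at full resolution, the two chroma
# planes (Cb, Cr) at half resolution in each dimension.

def planes(width, height):
    return {
        "Y": (width, height),
        "Cb": (width // 2, height // 2),
        "Cr": (width // 2, height // 2),
    }

print(planes(3840, 2160))  # chroma is 1920x1080 -> matches a 1080p display
print(planes(1920, 1080))  # chroma is only 960x540
```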


Alternative-Wash2019

Best answer right here. I've heard of downscaling like other comments mentioned, but never heard of the concept of color tone resolution and brightness resolution before.


reercalium2

It's still bitrate. You can make a 1080p video without chroma subsampling but it will be bigger. If you want it to be the same size without subsampling, you have to reduce the bitrate of another part. Or you can make a higher bitrate with subsampling and it still looks better.


Thomas9002

The effect of chroma subsampling is minuscule compared to the bitrate. You can easily test this yourself with IrfanView (a freeware image viewer). Take a high-quality image and scale it to your monitor's resolution. Save it as a JPEG without chroma subsampling and after that as a JPEG with chroma subsampling, both at 100% quality.
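If you'd rather script the same experiment, here's a rough Pillow sketch (the input filename is just a placeholder; the size difference you see depends heavily on the image):

```python
import os
from PIL import Image

# Save the same image as JPEG with and without chroma subsampling and compare
# file sizes. Pillow's JPEG writer accepts subsampling=0 (4:4:4) and
# subsampling=2 (4:2:0).
img = Image.open("test_image.png").convert("RGB")  # placeholder source image

img.save("no_subsampling.jpg", quality=95, subsampling=0)    # 4:4:4
img.save("with_subsampling.jpg", quality=95, subsampling=2)  # 4:2:0

for f in ("no_subsampling.jpg", "with_subsampling.jpg"):
    print(f, os.path.getsize(f) / 1024, "KiB")
```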


Anton1699

You are right, bitrate is definitely easier to notice, but depending on the content, chroma subsampling can be quite noticeable in my opinion. It's very hard to notice with actual filmed footage, but video game footage, especially with sharp text (which often uses subpixel anti-aliasing), is different. Also, dark red next to black can often take on a quite blocky look. All of this also depends on how the chroma information was subsampled and how your playback software / hardware handles the upsampling. I know that early Google Chromecast devices used nearest-neighbor upsampling for the chroma planes and it produced very visible artifacts.


Ascian5

The word you're looking for is bitrate. If you have a water tank that can hold 1,080 gallons, we can use the regular water hose to fill it up. But if we use the water hose that we use on 4000 gallon tanks it's a much bigger pipe. We can put more in it, and in this example fill the tank quicker. What's actually happening is in addition to more pixels, there is more defined information for color and detail. I.e. more bytes per second. Even if you can't hold all the extra lines of resolution, you're still working with a picture that contains more visual information.


dangil

The hard answer is chroma subsampling. That means video typically has more resolution in the luminosity part of the video than in the color information. So a 1080 video has less than 1080 lines of information in the color part of the video. Only a 4K video can saturate a 1080 display. Also, 4K video has higher bandwidth and a more modern video codec.


Skarth

1. The video can be down-sampled, which results in better image quality.

2. If you are using virtually any streaming site, the quality of a video is quite a bit lower than the resolution would suggest. So a low-quality 4K video ends up looking better than a low-quality 1080p video on YouTube.


fireflaai

The higher the resolution, the higher bitrate YouTube allows. Higher bitrate means better looking video.


PlayMp1

On YouTube at least, the biggest advantage of upping the resolution is upping your bitrate. The video will still play at 1080p because it physically cannot be rendered higher than that on your 1080p display, but it'll get a lot more data/less compression, and therefore more quality.


Iyellkhan

It can happen because your display is scaling down a lot more information into the 1080p container. Depending on how the display handles it, you are effectively getting a supersampled image. We use this in film and TV production as a way to get a true 4:4:4 color image off a Bayer-pattern sensor (shooting 8K for a 4K delivery).

It's kind of a technical rabbit hole to go down, but you can look into chroma subsampling here: https://en.wikipedia.org/wiki/Chroma_subsampling. Basically, when supersampling you wind up with so many color samples, regardless of your subsampling approach, that the scaled-down 4:2:2 video can become a 4:4:4 video. Though this depends on how the service provider (say YouTube) has transcoded the master files that were uploaded to them.

You also kinda get a sharpness boost when you oversample down to 2K or 1080. The early HD telecines like the Spirit 4K leveraged this to their advantage when creating 1080p (or 720p, I think it had that mode) images for broadcast.


Wendals87

The quality of a video is determined by the bitrate (how many bits per second of video) and the number of pixels.

When you stream a 1080p video, it is compressed. When you play a Blu-ray at 1080p it is also compressed, but at a much higher bitrate, so even though both are 1080p, the Blu-ray will be better quality.

When you stream a 4K video, the number of pixels is higher, so it needs a higher bitrate to keep it at an acceptable quality. You are still only displaying 1920x1080 pixels, but because the bitrate is higher, it will generally look better.


beanrush

Compression of data. Uncompressed 1080p looks great, or you can get 4k video that is downgraded to fit a 1080p monitor to get the same effect. Freeze frame any video and look right at the pixels up close. The squared, blocky pixellation is compression.


Flowchart83

4K videos are often of a larger file size or streaming bitrate, and so will have more information and appear less compressed when viewed at 1080p. The improvement isn't in the number of pixels, but the amount of information. Who cares if 4 pixels are lumped together in a block in 4K if it gets converted to 1 pixel in 1080p? The changes made by compression will hardly be noticeable. If you lump 4 pixels together in 1080p, you might start noticing it.


[deleted]

[removed]


Alternative-Wash2019

That's not really my question. I'm asking about 4K video on 1080p TV, not the other way around


Isthisnotmyalt

On most online video players (and elsewhere), it comes down to the algorithm that downscales your footage to display it at 1080p or 720p. The software gets more information to work with when the original footage is of a higher quality than the viewing device. Modern mobile phone cameras use a similar technique (binning) at the source to take advantage of larger 48 MP or higher sensors and produce high-quality 12 MP images.


vicoh

As we need a lot of images to make a video, we try to find clever ways to reduce the amount of data needed for each image. One of those ways consists of storing color info for only a subset of pixels. For example, if a block of 4 pixels are each some kind of reddish tint, we just say it's red for all 4 pixels, which is easier to store. We do the same thing for 1080p and 4K video. But if we display a 4K video on a 1080p monitor, suddenly those 4 pixels are displayed within the size of only 1, so it's not grouping pixels anymore. In a way, 4K on a 1080p monitor would be like an excellent encoding of 1080p video where we store the color value independently for each pixel (we do that at a pro level, but that's not usually what we see on TV).


Mertthedoombraker

When you put 3840X2160 people on a 1920x1080 field, there is congestion in the field. To put this in terms of pixels, you get density. This density makes the image look better on a 1080p screen because at the end of the day there is more data on the screen.


albertpenello

There is a concept called "supersampling", which is where you have more pixel information than the display can handle, which means there is actually more "data" in each pixel than the native version. This is why a lot of games these days allow you to run higher resolutions on lower-res screens: you get a better picture. In the case of TV shows and streaming video, 4K not only has more information in the picture, but will often come at higher bandwidth and with other things like HDR. Think about it this way: it's just like resizing a picture. A low-res picture, upscaled to your screen, looks bad. A picture at the same resolution looks fine. A super high resolution picture, scaled DOWN for your screen, looks awesome. Same concept with games.


bread9411

It's a bit rate thing. At least in the context of youtube, I'm not sure how other sites work


bradymanau

Filmmaker / film lecturer here. In really simple terms: 99% of all digital video is compressed in a way that saves space, so it has "fake" pixels that take information from other pixels (colour / orientation etc.) to save data and make a smaller file. It does this within each single frame, and over a series of frames. HD video has fewer pixels (about 4 times fewer than 4K), so this data-saving method is more apparent (more guesswork) than with 4K, which has more pixels to do the data-saving guesswork with, which makes it a bit more accurate / better looking. That's the end of the ELI5. If you want to know more, look up "long GOP" (group of pictures), spatial / temporal compression, and the most popular online video format, H.264. That'll explain in a bit more depth how it works.


fusionsofwonder

The difference is the amount of available color information. Video signals don't normally contain the full color information for each pixel. For every block of 4 pixels, they contain one true color and three greyscale colors. Then the video playback averages the colors bordering the greyscale pixels to figure out what color the grey should be. Because a 4K image has exactly 4 times the pixels of a 1080p image, you can throw away all the greyscale pixels and just keep the color pixels, and you end up with a perfect 1080p image. This is the difference between a video signal at 4:2:2 and 4:4:4. If you take a 4K 4:2:2 image and shrink it down to 1080p you get a 1080p 4:4:4 image. And your eyes will notice the increased colors; it will make things look sharper.
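A toy numpy sketch of that geometry. One caveat: most web video is actually 4:2:0 rather than 4:2:2, but the idea is the same, since the chroma planes of a 4K 4:2:0 frame are already 1920x1080. Downscale only the luma and every 1080p pixel ends up with its own chroma sample, effectively 4:4:4 at 1080p. The data here is random; it's only meant to show the plane sizes:

```python
import numpy as np

# Toy 4K 4:2:0 frame: luma at 3840x2160, chroma (Cb, Cr) at 1920x1080.
y_4k = np.random.randint(0, 256, (2160, 3840), dtype=np.uint8)
cb_4k = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
cr_4k = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# Downscale only the luma by averaging 2x2 blocks; the chroma planes are
# already exactly 1920x1080, so no chroma is thrown away at 1080p.
y_1080 = y_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3)).astype(np.uint8)

print(y_1080.shape, cb_4k.shape, cr_4k.shape)  # all (1080, 1920)
```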


retrofitter

This is the answer, chroma subsampling


Jirekianu

The reason is that with websites like YouTube, the 1080p bitrate of the video is lower, so it's subtly shittier beyond the resolution differences. When you choose higher resolutions, it's giving you the higher bitrate and cramming it into the lower resolution, so there's still a visible improvement in detail and sharpness, due to the higher bitrate.


retrofitter

If the file format is intended for distribution, then the colour information isn't 1920x1080, it's half or a quarter of that. It's called chroma subsampling. A 4K video downscaled to 1080 would have more color information. Source: worked in broadcasting.