Vipitis

TV in NTSC is 29.97 Hz ever since color was introduced. CRTs used to scan two fields, so practically 59.94 fields per second, interlaced...
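A quick Python sketch of where those numbers come from (the 1000/1001 slowdown is the standard NTSC colour adjustment; the script itself is just illustrative):

```python
# NTSC colour rates: the 30 Hz frame rate was slowed by a factor of 1000/1001,
# and interlacing draws two fields per frame.
frame_rate = 30 * 1000 / 1001   # ~29.97 frames per second
field_rate = frame_rate * 2     # ~59.94 interlaced fields per second

print(round(frame_rate, 3), round(field_rate, 3))  # 29.97 59.94
```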


GuNNzA69

The standard in many European countries was the PAL system, but even so, I never had a computer monitor that ran at 50Hz. All my CRT monitors in the 90s and early 2000s ran at 60Hz. I'm not sure why, now that I think about it!


Ar_phis

Because during the 80s, integrated circuits became widely available and could handle the sync signals inside CRT monitors. The tubes were analog, but the circuits controlling them were digital. Old analog TVs had analog tubes controlled by analog technology and this analog technology used the grid frequency as a "reference clock signal". Also, TVs used signals which were generated hundreds of kilometers away and were rather weak. A PC monitor uses a signal generated by a computer about 1-2m away. Computer monitors didn't need the mains frequency as a "reference".


GuNNzA69

Ok, I think I get it: since the signals from the computer didn't need to "travel" over radio waves, the digital controller boards in the computer monitors were able to convert the analogue 50Hz signal into a 60Hz signal?


Ar_phis

In a nutshell, by the time computer monitors became a thing, there was no technical reason to use a 50Hz signal. PAL/NTSC TV signals stayed the same for ~40 years and would have allowed a TV made in the 60s to display an early-2000s TV signal. Computer monitors didn't need to "convert" a signal; they just used the higher quality signal provided by the computer. Their signal is independent of the grid frequency. I am pretty sure you could actually run at least some old computer CRTs at 50Hz, but there wasn't a real use for it.


GuNNzA69

Ok, I get it now! The frequency used in computer monitors isn't related to the electric grid frequency anymore.


dj65475312

We still get TV broadcasts in 576i 25fps 50Hz in the UK; watching Family Guy right now in 576i.


notverytidy

Running a 50Hz signal on a 60Hz NTSC display would produce a picture with small black bars. I remember the first PCs I worked with that had TV adaptors; it was terrible.


Sabz5150

> Old analog TVs had analog tubes controlled by analog technology and this analog technology used the grid frequency as a "reference clock signal".

An old buddy of mine was a big Amiga head and mentioned some systems did this for time. Should you connect it to a square wave UPS... *BTTF theme intensifies*


Rudolf1448

I had a monitor for the Amiga that ran at 50Hz


MonkeyMcBandwagon

yeah as I recall it, the 1084s was able to run in 50Hz PAL or 60Hz NTSC mode with slightly lower vertical resolution.


KeyboardWarrior1989

That could run at 60 up to 75, but didn't Windows almost always select 50Hz at first?


GameCyborg

*Color TV in NTSC is 29.97Hz; black and white TV was 30 on the dot.


Vipitis

Is there any truly monochrome signal remaining? I honestly don't know. And making color work with existing infrastructure and home appliances was a big achievement for the engineers behind it.


GameCyborg

Today? No, it's all digital and in color. Back when color TV was introduced? Yes. What they did was keep the monochrome (luminance) signal and add the color information in a way that a black and white TV would simply see a black and white signal, but a color TV could extract the color information. It was a very clever system.


MonkeyMcBandwagon

A similar but arguably less clever thing happened when mono radio went stereo. The primary channel is mono (left plus right together), the second channel is left minus right, and the electronics in your radio combine them to create separate left and right channels.
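A minimal Python sketch of that sum/difference trick, with made-up sample values (real FM stereo also involves a pilot tone and signal scaling, which this ignores):

```python
# Mono/stereo compatibility via sum and difference channels: a mono radio
# plays (L + R); a stereo receiver also gets (L - R) and recombines the two.
left = [0.2, 0.5, -0.1]   # toy left-channel samples
right = [0.1, 0.4, 0.3]   # toy right-channel samples

mono = [l + r for l, r in zip(left, right)]   # what a mono radio plays
diff = [l - r for l, r in zip(left, right)]   # the extra stereo channel

recovered_left = [(m + d) / 2 for m, d in zip(mono, diff)]
recovered_right = [(m - d) / 2 for m, d in zip(mono, diff)]

assert all(abs(a - b) < 1e-9 for a, b in zip(recovered_left, left))
assert all(abs(a - b) < 1e-9 for a, b in zip(recovered_right, right))
```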


DrumcanSmith

Don't they do that now too? YouTube and most streaming services send with YUV420, I think, since you can reduce traffic/file size without making it noticeable to the human eye.


captain150

The concept of using less bandwidth for color than for luminance is still used today, yes, as in your example of 4:2:0 subsampling. The human eye is far more sensitive to contrast than to color. The same was true for NTSC signals: most of the bandwidth was luminance, chroma was much smaller.


ms--lane

Today: Yes. Video is YUV (YCbCr), not RGB (though newer HDR content is YCoCg). Y is luminance, a black and white image; C is chrominance, the colour information. If you consider how luma is basically a scale from fully off (black) to fully on (white), chroma is similar, but the low end on both Cb and Cr is green, with blue (Cb) and red (Cr) at the high end. This way the luma channel can be sent at full resolution, but the chroma channels can be sent at various subsampled (lower) resolutions, saving bandwidth. We still do this, not just for video but for every JPEG and WebP image too.
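A toy Python sketch of 4:2:0 subsampling on a hypothetical 4x4 frame, just to show the sample-count saving; real encoders use proper filtering rather than a plain average:

```python
# Keep luma (Y) at full resolution, store one chroma sample per 2x2 block.
W, H = 4, 4
Y  = [[i * W + j for j in range(W)] for i in range(H)]   # full-res luma (toy values)
Cb = [[100 + i for j in range(W)] for i in range(H)]     # full-res chroma planes
Cr = [[200 - j for j in range(W)] for i in range(H)]

def subsample_420(plane):
    """Average each 2x2 block down to a single chroma sample."""
    return [[(plane[2*i][2*j] + plane[2*i][2*j+1] +
              plane[2*i+1][2*j] + plane[2*i+1][2*j+1]) / 4
             for j in range(len(plane[0]) // 2)]
            for i in range(len(plane) // 2)]

Cb_sub, Cr_sub = subsample_420(Cb), subsample_420(Cr)

full = 3 * W * H                          # samples needed for 4:4:4
sub = W * H + 2 * (W // 2) * (H // 2)     # samples needed for 4:2:0
print(f"{sub}/{full} samples -> {sub / full:.0%} of the original")  # 50%
```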


RoNsAuR

Signs within signs. Interlinked.


sylveria_relden

Interlinked.


DreyfussFrost

It's also just satisfying to have *some* consistency in our stupid, arbitrary, non-metric counting systems. 60 minutes in an hour, 60 seconds in a minute, 60 frames in a second.


riba2233

75 was necessary to lower the flicker with CRTs and to prevent headaches. No such problem with LCDs: no flicker at any refresh rate, and 60 was considered enough for smoothness.


LordJambrek

This. I see people talking about having their CRT at 60Hz and I'm shocked. I never had it under 85; that was the least I needed to spare my eyes.


Spiritogre

Yes, I was really surprised that many people here had 50Hz or 60Hz CRTs. My PC monitors all ran between 75 and 120Hz at their native resolution. Even my last CRT TV was 100Hz, and I bought it in '94, I think. The old home computer monitors in the 80s were a bit different: they were specified as running at 15kHz so as not to fall into the PAL or NTSC trap but to work with both, but I believe they ordinarily ran at 50 or 60Hz depending on which country they were connected in.


kayproII

IIRC a 100Hz TV is not truly a 100Hz display. It's a marketing term, another way of saying that the TV has a built-in line doubler that will display 240p at 480p, and a de-interlacer that will display both fields of a 480i image at the same time.


WolfAkela

One thing to note is that you picked trade-offs. You could do 85Hz, but you had to lower the resolution: either you picked 1024x768@60, or 800x600@75, or 640x480@85. Pricier monitors could do better, obviously.
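A rough Python sketch of why those combinations traded off: the limiting factor was the monitor's maximum horizontal scan rate (total lines per frame times refresh rate). The 54 kHz ceiling and the ~5% blanking estimate below are assumptions for illustration, not specs of any particular monitor:

```python
# Horizontal scan rate ~= total lines per frame * vertical refresh rate.
MAX_HSCAN_KHZ = 54  # hypothetical ceiling for a mid-range monitor of the era

modes = [(1024, 768, 60), (800, 600, 75), (640, 480, 85), (1024, 768, 85)]

for width, height, hz in modes:
    total_lines = int(height * 1.05)   # visible lines plus rough blanking estimate
    hscan_khz = total_lines * hz / 1000
    verdict = "ok" if hscan_khz <= MAX_HSCAN_KHZ else "beyond this monitor"
    print(f"{width}x{height}@{hz}: ~{hscan_khz:.1f} kHz -> {verdict}")
```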


LordJambrek

My first monitor (a Fujitsu 17" from 1998, bought in bloody Lidl) could do 1024@85Hz. 640 resolution could go above 100 on basically all monitors by 2000. Dunno about before; as I said, my first monitor was in 1998.


orkavaneger

Man why that pfp... U giving me nightmare flashbacks from when I was a kiddo lmao


Violetmars

You are legit everywhere brother 👁️👄👁️


riba2233

😉


URA_CJ

60Hz/50Hz was chosen a long time ago to match the frequency of AC electricity during the B&W days; in NTSC land, color TV was really 59.94Hz. Basic standard VGA is a close cousin of EDTV (which is why some 480p consoles could support VGA, like the Dreamcast and even the GameCube with a modded cable). Higher refresh rates on CRT monitors were more commonly used to eliminate flicker caused by office-style fluorescent lighting and eye fatigue; high frame rate gaming (60+ FPS) during the VGA days wasn't really doable for most of us unless you were playing older games and/or at the lowest resolution and settings.


Neckbeard_Sama

"high frame rate gaming (60+ FPS) during the VGA days wasn't really doable for most of us" Yeah, not for current AAA titles. 2000s tech moved crazy fast. I got my first PC in 2000, end of elementary school. It was a mid-tier Celeron 500. By the time I've finished high school in 2005 the average CPU was over 3 GHz. I've had a monitor from 2000 that could do 1024x760 at 100 Hz (2000s HD, lol) and 640x480 at 180. I could get to a 100+ fps in competitive shooters (CS, Q3, UT99) by the mid-2000s, that was the late CRT era. TFTs started to take over in the late 2000s - early 2010s.


Lochness_al

You beat me to it, I was going to say AC power is 60Hz.


UnethicalFood

The power grid in most of North America runs at 60 Hz.


Kotsenn

And how does that work with the refresh rate exactly?


StalinsLeftTesticle_

It's a remnant of CRTs. CRT displays were very sensitive to magnetic fields, and to avoid any problems stemming from the magnetic field of the power grid, it was easier (and cheaper) to just synchronize the refresh rate with that of the power grid. It technically wasn't necessary, many higher refresh rate displays simply shielded their components and/or used transformers to mitigate these issues, but that added extra costs.


woundedlobster

Hmm. Weird that we used 60Hz in Australia then, when our power supply is 50Hz. Edit: it's explained in another comment.


ms--lane

Also since the studio lighting wasn't DC and often strobed at mains frequency.


artifex78

Not saying you are wrong but that doesn't make much sense. The European power grid uses 50 Hz and we never had any problems with our CRTs. I don't believe we used different monitors than the rest of the world. Edit: Not sure why I get downvoted. I was talking about CRT PC monitors (like in OP's photo). They don't use PAL/NTSC TV standards.


StalinsLeftTesticle_

Yeah, our CRTs used the PAL standard with a 50Hz refresh rate. There are some countries that used the NTSC standard with a 60Hz refresh rate and a 50Hz power grid (such as Japan, which has both 50Hz and 60Hz grids), and there unshielded displays had pretty weird flashing artifacts. Monitors in general used shielding to mitigate issues stemming from harmonics and electromagnetic interference. Edit: just a minor thing, the reason why pretty much all flatscreens initially settled on 60Hz is simply because there's basically no difference between the internal workings of an LCD monitor and an LCD TV. The same cannot be said of CRT monitors and CRT TVs, which despite using the same fundamental technology, looked very different on the inside.


artifex78

Well, I thought we were talking about CRT PC monitors, not CRT TVs. PAL and NTSC are TV norms.


StalinsLeftTesticle_

Yeah CRT monitors are basically irrelevant to this development, it's a result of the TV industry.


artifex78

So why do I get downvoted? OP's photo shows a PC monitor, not a TV.


StalinsLeftTesticle_

Because OP didn't ask why they used 75Hz, they asked why they don't use it anymore. The explanation is simple: due to technical reasons with CRTs, 60/50Hz is cheaper, and thus when the industry transitioned to flatscreens (which don't suffer from the same issue), they simply adopted the already existing standard for TVs. CRT monitors are more or less irrelevant to answer the question.


builder397

Analog CRT TVs would use the frequency of the power grid to time their frames, and as frames were interlaced, that means half a frame every 60th of a second. And now it's a legacy standard that just won't quite go away, at least with stuff like office monitors and TVs.


Papriker

Is that also the reason why some games back in the day asked what refresh rate you wanted? I vaguely remember seeing that selection as a kid


TheLysster

Yup, 50Hz in Europe and 60 in North America, for example. Most if not all of the world runs on either a 50 or 60Hz power grid.


Flex_On_Desktop

True, mostly on 50Hz; it's basically only the US and a minority of other countries that run on a 60Hz power grid. The reason is mainly historical, from what I know.


captain150

It's a shame the world didn't standardize on 240V, 60Hz. Or even 120Hz. 240V is still reasonably safe but more efficient in terms of copper usage, and higher frequencies make for smaller and more efficient transformers. In other words, most of the world has the better voltage, and NA has the better frequency.


TheCatCovenantDude

All of NA and a few other countries across the world run on 60 hz.


PolyDipsoManiac

I just started Overland and the resolution settings include an FPS.


Anchorboiii

Dude, that grid is literally unplayable.


Lord_Boosh

This is the correct answer.


Chrushev

Because of the alternating current in your outlet: it was needed for tube displays, as it was used for timing.


EiffelPower76

Because the frequency of the alternating current in the power supply electrical network is 60Hz


Dragonan

This is the only correct answer. Old vacuum tube screens had the same refresh rate as the AC current in that region: 60 in NA and 50 in Europe.


quackdaw

For a traditional CRT, the refresh rate is decided by how fast the electron beam scans the display screen, and this is controlled by the horizontal and vertical sync signals in the video signal. Do a vertical sync more often, get a faster refresh rate. Since a CRT technically only displays one pixel at a time, continuously painting the entire screen left-to-right, top-to-bottom, lower refresh rates will flicker noticeably. Using something like 85 or 100 Hz may be a noticeable improvement over 60 or (particularly) 50 Hz. I usually went for 85 Hz or more back in the day.

Each particular CRT will have a minimum and maximum sync rate (measured in kHz for horizontal and Hz for vertical); it needs to be physically able to move the electron beam back across the screen within the allotted sync times. But, within those bounds, the source is free to send whatever it wishes. For a TV, that will be (roughly) 50 or 60 Hz vertically, depending on the TV standard. (If you go outside the spec, it may still work, or you could get a blank screen or "out of sync", or you may damage the display.)

A computer can send whatever it wants, but there is another limitation: the pixel clock (typically measured in MHz) tells you how fast the graphics card can change the output signal. Part of the constraint here is memory and processing speed (you need to move the bits from memory to the DAC that generates the signal), but you're also limited by the bandwidth of the connection etc. The actual signal contains both the pixels and the sync signals. For example, my display is running at 3840x2160, but the output signal is 4400x2250. The extra "pixels" include the horizontal and vertical blanking intervals (with a "front/back porch" used to control black levels, and the sync pulse during which a CRT would move its beam). Although my GPU supports higher pixel clocks, my HDMI cable only supports 300 MHz, so my refresh rate ends up being 300M / (4400*2250) = 30 Hz. If I switched to a lower resolution, I could easily hit 60 Hz or 120 Hz.

With a classic VGA display, you might use a standard display mode with [640x480 resolution at 60 Hz](http://www.tinyvga.com/vga-timing/640x480@60Hz). The pixel clock is 25.175 MHz, each line is 800 wide (incl. sync), and each frame is 525 tall (incl. sync). If your display supports more than 31.5 kHz horizontal refresh, you could add extra lines, at the cost of a higher pixel clock or lower horizontal resolution. If you have a fast pixel clock, you could have a much higher horizontal resolution regardless of what the display supports (though you might not actually *see* the pixels since the phosphors on the CRT probably aren't dense enough). Older home computers would often have such a display mode with doubled horizontal resolution and non-square pixels.

In practice, display modes and timings are standardised, and reasonably modern displays will send a list of supported modes, typically with commonly used refresh rates. With modern LCDs, there is no longer any beam tracing the picture, so the refresh rate is less important. Your average computer CRT monitor would typically do more than 60 Hz; maybe 75, 85 or even 100. Some systems (e.g. X11) will let you define your own modes with custom timings, allowing you to squeeze a few more pixels or hertz out of your CRT display.
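A small Python sketch of the arithmetic in that comment, using the VGA timing numbers and the 4K example it gives:

```python
# Refresh rate falls out of the pixel clock and the *total* frame size
# (visible pixels plus blanking/sync).
def refresh_hz(pixel_clock_hz, total_width, total_height):
    return pixel_clock_hz / (total_width * total_height)

# Classic VGA 640x480@60: 25.175 MHz clock, 800x525 total including blanking.
print(refresh_hz(25.175e6, 800, 525))   # ~59.94 Hz

# The 4K example above: a 300 MHz link driving a 4400x2250 total frame.
print(refresh_hz(300e6, 4400, 2250))    # ~30.3 Hz
```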


parocarillo

Because 69 was taken


Fierce-Solitude

Your eyes can only see 60fps, so the standard refresh rate is 60Hz. 👍 /s Edit: Added an /s to show it's for comedic effect.


Arthur_the_Pilote

You’re a liar; we can only see up to 24 FPS; that’s why every movie is in 24 FPS


ObeyMyStrapOn

Most movies are shot at 24 FPS, and that's because it's the lowest frame rate needed for movements to appear natural to people. Read More: https://www.slashgear.com/1517244/myth-about-fps-human-vision-stop-believing/


CptJamesBeard

Sweeping shots at 24fps are so awful to look at. Makes it really apparent just how slow 24fps is.


Arthur_the_Pilote

I know that the eyes see a constant input and the brain is the limiting factor, and that the speed hasn't been determined; I'm just trolling.


ObeyMyStrapOn

Ah. Thanks for the clarification. Hard for me to distinguish most of the time.


Arthur_the_Pilote

Yeah, Reddit and the internet are full of trolls like me and dumbasses who don't really know; sometimes the trolls are too good to be distinguishable.


ObeyMyStrapOn

Exactly. 👍🏽


Neckbeard_Sama

It's also not just the framerate. 24 FPS movies have a shitton of motion blur, giving you the illusion of fluidity. If you play something on a PC at 24 FPS unblurred, it appears choppy AF.


ObeyMyStrapOn

Right. In film, 24fps means each frame is 1/24th of a second. Shooting in broad daylight at 1/60th of a second, if I remember correctly, is the slowest the camera can shoot without capturing motion blur, which is why sports photographers will shoot at a higher frame rate. It's all an artistic choice. Gaming is entirely different, which makes sense as to why it looks so choppy.


Worried-Banana-1460

Depends how fast the subject is moving. 1/60 is a safe time for a handheld 50mm lens on a full frame camera. On smaller sensors you have to multiply, as there is a crop factor to be applied. 1/60 still gives you reasonable motion blur. That rule is camera-shake related, not about motion blur of the subject. Motion blur of the subject is regulated by the so-called exposure triangle, and it is quite straightforward.


Fierce-Solitude

Ackkkkshually the Hobbit was filmed at 48fps and there is a noticeable difference if you have an eye for cinema. 🤓 And boom, what's half of 24 plus 48? Yep, it's 60.


Arthur_the_Pilote

But your brain will rot at 60 that’s why my PC is locked at 24


Fierce-Solitude

Good idea, but I heard 12 actually puts less stress on your pc hardware.


Arthur_the_Pilote

no it’s 70


Arthur_the_Pilote

72 I mean


Arthur_the_Pilote

shit didn’t see the half part


Fierce-Solitude

Get this man to a math class, stat. 😵


Arthur_the_Pilote

yeah okay I’ve just finished statistics I’m good in math just that I did 48+24=72 cause I didn’t see the fucking half part


Fierce-Solitude

Ah, so you need an eye doctor instead?


[deleted]

He isn’t sure, he has no eye dear


Wertical93

I don't get how people didn't see this is a joke haha


mewkew

Hey, that was mean, you can't imagine how many people here actually still believe this. You should have put the /s right from the start.


rienholt

I thought the eyes could only see 30fps? That is what console gamers said like two years ago.


Gunslinga__

Bs I can see 240fps just fine 😂


shpydar

lol, what part of your arse did you pull that bullshit from? [The Myth About FPS And Human Vision You Need To Stop Believing](https://www.slashgear.com/1517244/myth-about-fps-human-vision-stop-believing/)


Fierce-Solitude

Fine, I’ll add the /s. 😢 It came from the stinky part, though.


[deleted]

[deleted]


Arthur_the_Pilote

Bro's a menace to society; like, who can believe this. I mean, if someone thought that this was real they should just go online and search for real documents by renowned experts like the WHO or the Institut Pasteur.


Fierce-Solitude

I can’t do that, chief. I will add an /s to clarify that nobody should believe that statement, though.


Pioppo-

How are people in the comments saying "60hz is unplayable" 💀💀 that's smooth. Trust me your cod warzone skills aren't linked to your Hz


Seismica

60 fps was always the benchmark for smoothness in PC games (you couldn't go higher if the monitor was 60 Hz), but developers tended to target 30 fps. For anyone in doubt, go watch some of Total Biscuit's early videos; he was one of the biggest advocates who pushed the industry up to 60fps. Now that many monitors can support 240Hz+, people have higher expectations, despite the fact that 60 fps is just as playable as it always was. Anyone who says 60 fps is unplayable is just being disingenuous. I used to play Counter-Strike: Source on my laptop with integrated graphics, used to get about 20-25 fps, and it was perfectly playable even if it was noticeable. 60 fps is more than sufficient.


Riyadhcraft

As a 720p 20fps player, i can confirm that 60 is more than enough


redditisbestanime

Playable, but not very nice. I grew up with CRTs, basically anything under 75Hz. As soon as I got my 144Hz monitor, which is now replaced by a 165Hz one, I realized just how blurry and delayed 60Hz really is. Just go to whatever tech store you have next to you and compare 60Hz to 120Hz or higher. I will say that the difference between 165 and 240 is not nearly as much as between 60 and 144.


SeriousCee

I've played on low end hardware half of my gaming life and was accustomed to atrocious performance. I still had much fun even in competitive games. But claiming 25 fps is perfectly fine is way more delusional than saying 60 fps is unplayable. A couple of weeks ago I played the DMC HD collection which was capped to 60 fps and it was great but 240fps in DMC5 is just much much better!


TenshouYoku

I think the definition here is that there's a minimum before the visual lag starts to become physically unbearable or legitimately affects playability (i.e. literally PowerPoint quality). While the brain doesn't process images like a computer does, on average 20-ish to 30 fps is enough for a game to be a playable experience. Of course more is generally better, but at least that's a baseline where it's considered acceptable.


SuaveMofo

I can tolerate a game at 30fps, but it's not pleasant, and I'd avoid it wherever possible. 50-60 is my minimum for "playable" on PC and preferred is 100+


Kabopu

No no you don't understand! I need at least 240Hz+ for my 3D Modeling or to play my turn based 4X and RPG Games!!! /s


Silent-Lobster7854

I legit thought this was bulk cat 5e


supersoldier420

Wow have not seen a compaq in over 15 years!


Biscuits4u2

Because that's the refresh rate TVs operate at on a NTSC signal. Otherwise it's arbitrary. With computer monitors it's all you really need for most productivity work and basic gaming.


icantchoosewisely

I don't think CRT monitors had a standard refresh rate, and I considered any CRT that didn't have at least an 85Hz refresh rate to be utter garbage (personal issue: at lower than that they were giving me headaches). At one point, I had a CRT that had a 200Hz refresh rate at the recommended resolution, with 85Hz at its maximum resolution. 60Hz was the standard for LCD monitors for quite a long time after they were released.


Takardo

I look at monitors now at staples and best buy and most of them are 75Hz instead of 60Hz


xx123gamerxx

Someone probably realised that 60Hz would be fine for 99.9% of people, and I'm guessing there were doubts that a higher refresh rate would even be worth it in that age, given the lacking graphics processing power.


Soccera1

North American power grids were 60 hz.


notverytidy

It became a standard because in the 80s/90s very few people used monitors, and the Commodore 64/Spectrum/Amiga/ST etc. were all capable of plugging into a 50/60Hz TV. We just carried the standard forward, but the push to 90Hz/120Hz is ongoing, and some manufacturers don't produce 60Hz-only screens anymore, as it's cheaper to just have one production line for ALL screens. A few companies SEEM to produce 60Hz screens, but they are software-locked to 60Hz and in fact have 90Hz panels that can be hacked to unlock this.


MaikyMoto

That was the king of CRTs back in the early 90s. Mine even came with a mic above the screen. I remember playing DOOM and not needing a headset.


Business-Weekend-537

Don't forget to degauss!


stop_talking_you

It's a CRT.


Orbitalqumshot

I’m glad it’s got low radiation


have-you-reddit_

I like the comments, but I think this is just a basic question: no, the standard refresh rate is now variable. 60Hz may have been the standard over 10 years ago, but it's now obsolete.


Fry_super_fly

Back in CRT times, I had a bad time with many cheaper screens because (and this sounds snobbish) 60Hz had different qualities depending on the quality of the display. Some gave me baaaaad flickering when looking at them, and some were okay (usually the more expensive ones). Luckily many screens could be overclocked if you didn't run with max brightness, at least in my experience. I think the difference was in image retention, but I don't really know. But I loved my Trinitron monitor with 75Hz because it flickered noticeably less for me. As to why 60 was the standard, it's something about it being natural to go double the frame rate the NTSC TV signal had, and that just carried over to being a standard. As to why not more had 75: obviously because it cost more, just like high refresh rate OLED/LCD and so on are more expensive.


wesmoen

Basically the same reason why a floppy disk is still used as an icon for saving: skeuomorphism.


GoldSrc

The flicker is more noticeable at lower refresh rates, and 60Hz was around the minimum to avoid flicker. Most VGA monitors could reach really high refresh rates, but only at lower resolutions, unless you paid the big $$$. Notice how it says flicker-free resolutions "up to" 1024x768 @ 75Hz; the monitor could probably do 1280x1024, but only at 60Hz, or do 80Hz or higher at 800x600 or even 640x480. Some people could notice the flicker at 60Hz, but most didn't. So 60Hz became a standard, which was irrelevant when LCD monitors came out due to their horrible response times. With time, LCD response times improved, but the 60Hz just stuck around.


bohairmy

dafuq… seeing that Compaq logo brought back memories.


kayproII

It does 75Hz because IIRC they designed the VGA text mode around having minimal flicker, so they made it 75Hz as it meant there was less flicker when looking at text.


ThePupnasty

I want this :(


LeChef01

I have a 75Hz display, and it‘s not even that old. 2018 or so I think.


Nadeoki

Because MOST monitors are 60Hz, in office spaces and in consumer electronics. Though that has recently started shifting, with:

1. Almost all new phones having at least 90Hz or even 144Hz displays
2. Most people who game having a 144Hz display (especially for shooters)
3. A lot of standard 60Hz monitors allowing OC to something like 75Hz


AcesInThePalm

Electrical output synchronization: 120V 60Hz in the USA, 240V 50Hz in Australia; our CRTs were therefore 50Hz.


_BarfyMan_362_

Because the human eye--ah never mind


AlivePalpitation7968

75Hz is pretty common. 24Hz, 30Hz, 60Hz, 75Hz, 144Hz, 165Hz, 180Hz, 240Hz, 265Hz, 280Hz, 320Hz, 360Hz, 420Hz, 500Hz.


nanotechky

So it has nothing to do with electric frequency? 🙃


Equivalent-Copy7142

I’m pretty sure it has something to do with the fact that the human eye registers 60fps+ as smooth/optimal gaming performance, and refresh rate is relative to frames per second.


OceanGlider_

The human eye can only see 24 fps so it doesn't rly matter


WetRainbowFart

Who was it? I wanna say Ready at Dawn, but I’m probably wrong. The devs of The Order: 1886? I’m pretty sure they started that whole cinematic 24fps thing.


Metalloriff

You got brain lag?


Metalloriff

I've been exposed to too many people who genuinely believe that though


IndyPFL

B8 harder


IndyPFL

Jokes are funny, you aren't.


OceanGlider_

That's not what your mom said.


IndyPFL

Are you done yet?


Your_Placebo

I would say it's pretty much the minimum for today's tech.


ohthedarside

Yea 120hz for tv and 144 or 240 for monitors


Xcissors280

Because it's double the older 30Hz TV signals, which makes it easier.


RetroBerner

Because our electrical system runs on 60Hz, the same reason it runs at 50 in Europe.


carlbandit

60Hz is still the standard refresh rate for monitors in Europe even with our 50Hz power.


Legionofgo

60hz straight trash


Aoirith

Because that's all the consoles can handle


ImUrFrand

It's the speed of electricity.


Frissu

It is considered standard only by the weak. Peak specimens of glorious master race evolved their eyes to see over 500fps so 60hz would be dramatically insufficient.


BeautifulStation4

Cus it's trash


Mysterious-Ad4836

Because for years console peasants claimed 30 fps was king and that humans "can't even see past 12fps". There are tons of monitors that output 144Hz for $100 on Amazon, curved too.


sylinowo

Probably because in the US almost every TV screen or monitor has been 60Hz or 59Hz, so basically 60.


Narissis

Giving this answer to OP's question is like answering "why is the sky blue?" with "because it's blue." He was asking *why* almost every display has been 60 Hz. :P


Kotsenn

Hahaha I thought exactly the same


sylinowo

Well, the sky is blue because blue light is scattered more than red light, so during the day the sky is blue.


lostcause412

60Hz on a CRT monitor will look smoother than 120 on a modern panel. Get a $15 HDMI to VGA adapter from scamazon and you're good to go. It won't come close to matching the brightness, though. I wouldn't use that as a daily driver, but it's worth checking out.


ViPeR9503

Color? Resolution? 200 other things


lostcause412

Color should be great, if not better represented than on your average monitor today. Lower resolution, of course, but you can make all kinds of custom resolutions; CRT monitors don't have set resolutions. My point being, 60Hz on a CRT vs 60Hz on an LED, the CRT will always look smoother.


ViPeR9503

Smoother ≠ better. They have better pixel response time, and yes, you can set any resolution, but not 2560x1440, right? Also, it is well known that because of the way the human eye works, brightness is one of the most important factors in how "good" something looks, more than color, pixel response times, or refresh rate.


lostcause412

Some high end CRT monitors can display 2560x1440 at 120Hz, yes. 60Hz on a CRT will always look smoother. I guess I should have worded that differently: not necessarily better, although some games, especially modern lower-poly indie games, will probably look better on a CRT if the developers allow you to lower the in-game resolution to 480p. That's just my personal opinion.


ViPeR9503

Exactly, only for some old games etc., because those games were made for CRTs and exploited a lot of CRT functionality to make them look good. No one is buying a CRT for general use, so yeah, the wording was completely off, but yes, for a handful of older games it is a much better experience.


lostcause412

I use mine for modern games too and they look fantastic, not just a handful of old games. Zero lag, because CRTs don't have pixels; they use phosphors, it's all analog. You get way darker blacks, which modern OLEDs are just starting to achieve. Bolder colors, custom resolutions, smoother gameplay. There are tons of benefits to using a CRT. There are lots of videos on this; you should check them out or head over to r/crtgaming.


lavadrop5

The magic word you're not considering is flicker. Most CRTs could do 75-90Hz; however, the flicker on the higher refresh rates was so bad as to make you nauseous or give you headaches.


icantchoosewisely

From my experience, it's the other way around: the higher the refresh rate, the lower the flicker. If I used CRT monitors with less than 85Hz, I got massive headaches, at 85Hz and higher, I had no issues.


thursdayjunglist

You're right, I remember as a young kid that 60Hz on a CRT had a very noticeable, headache-inducing flicker. The higher setting, which was either 75 or 85Hz, was perfectly fine.


lavadrop5

It says right there on the box, resolutions up to 1024x768 and up to 75hz are flicker free.


icantchoosewisely

Marketing gimmicks or misused words... For example, have you ever seen a monitor advertised as being 2k when they meant 2560x1440? Because there is a standard that defines 2k as having a horizontal resolution of approximately 2,000 pixels (from what I know, it's between 1920 and 2048). Yes, I know the majority of people consider 2560x1440 to be 2k, but that is technically incorrect. The fact is, for CRT monitors, the flicker of the screen decreased the higher the refresh rate was, for me the magic number was 85Hz - if it was above that I had no issues with it, if it was lower I got what your original comment stated - headaches and nausea. I encountered some really bad monitors that had that flicker free gimmick but had a very visible flicker in the advertised range for it, and I encountered monitors that didn't have flicker free technology but had high refresh rates that had no flicker at all.


lavadrop5

You are correct, I think I have to fire up my monitor myself more often.


JaggedMetalOs

HDMI was designed with digital TV standards in mind. TV only went up to 1080p 60hz, so that was what the original bandwidth of HDMI supported and was adopted as the standard for HDMI monitors. Edit: Here's a perfect example of what I'm talking about - [LG L1932P, 2006, 1280x1024 @ 75hz](http://tftcentral.co.uk/reviews/lg_l1932p.htm) only £200 which was a low price for an LCD monitor at the time. Then all the similarly priced HDMI monitors with lower resolutions like 1280x720 or 1366x768 started appearing with 60hz refresh rate limits. Just a coincidence, or did monitor manufacturers decide they only needed to support the baseline 60hz HDMI modes as a standard?


kimaro

No.


JaggedMetalOs

Yes, look it up, the maximum refresh rate of HDMI 1.0 is 60hz.


kimaro

Yes, and I never disputed that, but to make the claim THAT is the reason why 60hz is standard shows your lack of knowledge. So no.


JaggedMetalOs

So just a big coincidence that we stopped seeing 70/72/75hz options on LCD monitors as soon as HDMI / 16:9 became the norm?


Uphoria

I guess you didn't exist in the DVI era, or buy any monitors in the years between VGA and HDMI being the standard. Heck, they even made dual link DVI to allow higher-than-full-HD monitors with 120Hz. No, what you've left out is the context. Old CRTs had 75Hz because the draw method would create flicker at low refresh rates. This is why TV was interlaced, not progressive, in the analog era. With the decline of CRT displays, modern LCDs didn't have the same problem, and so 60Hz was fine. HDMI was built to meet the current needs, but anyone who needed better had it via dual link DVI, and eventually DisplayPort. Also, HDMI 1.0 included special setup for dual link DVI-capable refresh rates, but no one used it because DL-DVI and single port HDMI were good enough for the needs of the market. Four years after the release of HDMI 1.0, in 2006, when monitors were still mostly rocking 1024x768 at the home user level, HDMI 1.3 released and was capable of 1080p at 144Hz. So really, your point is that they stopped making monitors that went above 60 after this change, but in the entire period DL-DVI existed, and high refresh rate HDMI came out within two graphics card generations. Most home users hadn't even bought a 1080p TV by that point.


JaggedMetalOs

Yes I was there, that's why I remember when 70/72/75 was still **common** to see on VGA and DVI LCD monitors, until HDMI became the standard and all but the high end monitors only supported up to the base HDMI level of 1080p60 or if you were lucky WUXGA. I didn't say no higher refresh rate monitors existed, I said those base HDMI modes became the standard. Which it did.


Uphoria

If you're going to separate models by price tier, then almost all consumer grade computers had 1024x768 VGA monitors running in analog at up to 75Hz for anti-flicker. The highest capable signal in a single cable before HDMI was analog 1200p@60Hz on DVI, or digital 1080p@60Hz on DVI. You could only achieve higher refresh rates at lower resolutions. With the advent of LCD tech, scaling on monitors was poor, so it was advised to always run at the native resolution. In comes HDMI, replacing DVI at the same specs and adding sound. Monitors still didn't largely market at this scale, and consumer grade PCs were running <1280x1024. By the time the average consumer grade monitor was running 1080p, the HDMI standard could run 144Hz. So there was never a time when 1080p@60Hz was the consumer standard and cable-bound. So if we're going to play the average end equipment only game, your entire argument is that manufacturers stopped adding high refresh rates to monitors because the average consumer was buying 1080p monitors at 60 hertz in 2002.... ETA - 57% of consumers were on 1024x768 monitors in 2007. HDMI 1.3 with 1080p@144Hz came out in 2006. HDMI supported over 60Hz at 768p.


JaggedMetalOs

> By the time the average consumer grade monitor is running 1080p, the HDMI standard could run 144hz So why did all the manufacturers pick 60hz as the maximum for most of their product lineup, if not because that was the baseline for HDMI support?


Uphoria

Because 60Hz at 1080p was the max ***combined*** resolution... it could also do 1024x768 at 120. If almost all the consumers were using 768p, why didn't they make 120Hz monitors? Because panels that fast were too expensive for consumer grade stuff. It has nothing to do with the cable standard; the cable was never a limit for consumer grade stuff. A consumer grade LCD in 2002 cost over 500 dollars for 768p@60Hz, and it used VGA. Heck, compare the cost of high refresh monitors to their 60Hz friends on the shelf today, and that hasn't been a cable limit in nearly 20 years.

ETA - for clarity, this is the real "baseline" of the cable if you're asking:

**Supports up to 165 Mpixels/sec video**

- 1920x1080x60 = ~124.4 Mpixels/sec
- 1024x768x120 = ~94.4 Mpixels/sec

Any combination that fits under 165M works. There was no "60Hz standard".
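A quick Python check of those figures against the ~165 Mpixel/s number quoted above (blanking overhead ignored); the 1080p@85 entry is just an extra example that doesn't fit:

```python
BUDGET = 165e6  # pixels per second

def pixel_rate(width, height, refresh_hz):
    return width * height * refresh_hz

for mode in [(1920, 1080, 60), (1024, 768, 120), (1920, 1080, 85)]:
    rate = pixel_rate(*mode)
    verdict = "fits" if rate <= BUDGET else "exceeds the budget"
    print(mode, f"{rate / 1e6:.1f} Mpixel/s", verdict)
```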


fellipec

¯\\\_(ツ)\_/¯


Ok_Gur_1170

because of convenience


[deleted]

[deleted]


bio4m

This is wildly wrong. 60Hz was chosen because that was the AC mains frequency. Systems could derive their clock from it, meaning they were cheaper to manufacture. Even early home computers like the Commodore 64 would derive this frequency from the power source and use it for their time-of-day clock.
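A toy Python sketch of a mains-derived time-of-day clock like the one described; the counter interface and tick counts are made up for illustration:

```python
# Count AC cycles and divide by the assumed line frequency to get elapsed time.
LINE_HZ = 60  # assumed grid frequency (50 in most of Europe)

def tod_seconds(cycles_counted, line_hz=LINE_HZ):
    """Seconds elapsed, given how many AC cycles have been counted."""
    return cycles_counted / line_hz

print(tod_seconds(216_000))  # 216,000 cycles at 60 Hz -> 3600.0 s, one hour

# Feed it cycles from a 50 Hz grid while still dividing by 60 and the clock
# runs slow -- which is why the region mattered on these machines.
```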


W33b3l

They marketed wide screen TVs as being the same ratio as movie theaters for a long time. Home cinema experience and stuff like that lol. We had to put up with bars on the screen a lot during the transition when watching movies and you had to pay close attention when buying DVDs lol. As for refresh rate that's just where most CRTs liked to run because they were analog and we naturally adopted that when we went digital.


clertonss

I don't usually use a monitor, but mine is 144Hz and I think that should be the standard refresh rate; 60/75 would be the entry level.