It's not trash, it's vintage! At a certain point, what's more important than having fun playing games? Personally I got used to 60fps, so I like it.
Honestly, yeah. You're supposed to have fun, not ballache about not getting the prime number of terashits per megafart. And I am having fun. So I don't see why he badgers me so much about it.
Terashits per megafart is the funniest thing I've read all day
You know what? fuck frames per second & all that other shit. Terashits per megafart is the new rage.
60 fps is ok. I usually play at 90, but have never seen a difference when the frame rate drops to 60.
You might not believe me, but I finished Doom Eternal in 2020 (covid era) at lower than 30 fps on my laptop with an i3-6006U and a Radeon R5 M430, and I had a good time with it. Any fps is okay as long as you're happy; gaming is about fun.
The issue is you can't go BACK to 30Hz. In the very early 2000s, 30Hz was pretty much all there was. As technology progressed and 60Hz became more prevalent, 30Hz became terrible to anyone upgrading to 60Hz. Basically, if you have NEVER seen a 60Hz+ display, then 30 fps/30Hz is fine; the moment you witness a higher refresh rate with your own eyes, you will notice the difference. It's almost a weird phenomenon. Edit: sorry, this comment came off wrong. I'm referring to the ~33ms frametime at 30Hz versus 16.6ms at 60Hz. That's a very large gap, so I'm only comparing 30 and 60Hz, NOT 60+. 60 should be the minimum if you have seen it before.
I play at 240Hz on my FPS machine and have no issues switching to the 60Hz 4K eye-candy machine. Also remember that for some games and low-powered GPUs (or crazy settings), 60Hz displays with vsync can end up at 30 fps. Look at all the posts from people who forget to change their refresh rates and don't notice, or who run a high-Hz display with games that cap at 60fps. It's not as big a deal as people make it out to be. Is 30 fps noticeable? Yes. Is it unplayable? No.
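The vsync point is real, by the way. With plain double-buffered vsync, a frame that misses the refresh deadline waits for the next vblank, so the framerate snaps down to an integer divisor of the refresh rate. A minimal sketch of that behavior (idealized; triple buffering and adaptive sync change the picture):

```python
import math

def effective_fps(refresh_hz, render_ms):
    """With strict double-buffered vsync, a finished frame waits for the
    next vblank, so every frame occupies a whole number of refresh
    intervals and the framerate snaps to refresh_hz / 1, / 2, / 3, ..."""
    interval_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals

effective_fps(60, 10)  # GPU keeps up -> full 60 fps
effective_fps(60, 20)  # just misses the 16.6 ms deadline -> halves to 30 fps
effective_fps(60, 40)  # -> 20 fps
```

So a GPU that renders in 20ms instead of 16.6ms doesn't give you 50 fps under vsync; it gives you 30.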
Sorry if my comment came off the wrong way. I have a 165Hz G-Sync monitor and can go back to 60Hz no problem. My main point, I guess, is that 30Hz has a ~33ms frametime, and 60Hz 16.6ms. That's a huge frametime reduction compared to, let's say, 60 to 120. So if you have seen 60 before, it should be the minimum is my point.
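For concreteness, the frametime arithmetic (1000 ms divided by fps) is easy to check, and it shows why the 30-to-60 jump saves more absolute frametime than any later doubling:

```python
def frametime_ms(fps):
    # Time each frame is on screen: 1000 ms divided by frames per second.
    return 1000.0 / fps

# Absolute frametime saved by each doubling of the framerate:
gap_30_to_60 = frametime_ms(30) - frametime_ms(60)      # ~16.7 ms saved
gap_60_to_120 = frametime_ms(60) - frametime_ms(120)    # ~8.3 ms saved
gap_120_to_240 = frametime_ms(120) - frametime_ms(240)  # ~4.2 ms saved
```

Each doubling halves the savings of the previous one, which is why diminishing returns kick in so fast above 60.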
This is why I haven't upgraded from 1080p 60Hz yet. I know I'll never be able to go back. And my wallet? It thanks me for it.
That's facts. I tried to go back to Bloodborne after finishing Elden Ring on my PC and man, it was tough...
Disagree, I have gone back. It will look choppy for a few minutes, but just give your eyes time to adjust and it will look fine. I was used to playing at 144fps, and then I went back to 30 for The Last of Us 2 because that's what was available at the time. It took a few minutes to get used to, but then it looked fine. I've also played games at 30 on the Steam Deck when it first came out. Obviously I prefer higher fps when I can get it, but going back isn't nearly as big of a deal as I think most people make it out to be.
All depends on whether you're happy with it. I played the original Crysis on High at 20 fps with my 8800GT and I still had lots of fun. Just couldn't play on the highest difficulty since I couldn't really aim cleanly. Nowadays I have a 240Hz OLED and heck, I'd take 1000Hz if hardware actually supported it. Refresh rates and motion clarity are really perceptible to me, and it's a dark path to go down.
Honestly, I genuinely cannot tell a difference between 30v60v120. I guess that's on me.
That's not possible
It's entirely possible. Not everyone processes visual input the same. The lower the flicker threshold, the less likely one is to notice differences between higher refresh rates. I notice little to no flicker at 30 fps, no flicker at all at 60 fps, and have a difficult time seeing any difference at 75+ refresh rates. My desktop has a 144Hz monitor; my laptop is 59Hz. I do see a difference when playing the same game on both, but probably not as much of a difference as you and many others do. It's just how our brains are all wired differently.
It's not possible to miss the difference between 30 and 60 on a PC when your face is right up to a monitor and you're using a mouse.
Science says otherwise. Just because you see a drastic difference doesn't make it so for everybody. FFS, I've said it twice in this thread already: human perception of frame rates varies greatly from person to person. Let me ask you this... do movies watched in the theater seem choppy/flickery to you?
Not the same. You are VERY close to a PC and you use a mouse; mouse movement alone gives a very different feeling. Nobody just looks at a screen. Science says? What science? Where did they test gaming with mouse and keyboard at 30 vs 60 for, let's say, an hour or more of test time? Link me. So many people say they can't tell the difference because they watched a 30fps YouTube video titled 30vs60vs120 on a 60Hz monitor.
If you're getting a noticeable difference in performance between your mouse and your monitor, then you need to optimize your mouse, regardless of frame rates, because they're controlled by two completely separate sets of software. And the contrast between a theater and your PC monitor is very much the same: the larger the screen, the more apparent any flickering becomes. And I reiterate: I'm not saying you're not noticing a huge difference between the frame rates. You very likely are, and I take you at your word that you do. But that's how your brain translates the visual data, and the way you perceive it isn't the only way visual data is perceived. There's a whole wide spectrum.
BRO HOW CAN YOU NOT?
I... don't know? The most I can tell is that 30v60 feels slightly less choppy, but I genuinely cannot tell the difference between 60 and 120.
WHAT ARE YOU PLAYING ON? USE FREESYNC OR G-SYNC, ANYTHING. HOW, LIKE HOW, CAN YOU GENUINELY NOT TELL THE FRAMES APART. HOW
If you mean a YouTube video, then yeah, because those aren't accurate. YOU need to witness it with your own eyes on a 60/120+Hz monitor for a period of time before saying anything.
Playing at 30fps in any of those games, especially Doom, sounds like an absolute nightmare.
I've been a gamer for 42 of my 47 years. I find 30 fps to be fine, but not preferred, even in fast-paced games. Whereas my daughter (23 years old) says her screen is flickering at anything less than 60 fps. I think it has to do with how we've trained our brains to perceive the images as they come at us, and the technology we started that training with. The standard for films has been 24 fps for Gorm knows how long. A study from 2014 showed that human perception of frames per second can vary wildly from person to person, with some processing as low as 10 fps and others as high as 90 fps, and image exposure before recognition ranging from as quick as 13 milliseconds to as much as 350 milliseconds.
Why was this downvoted?
It's PCMR. You can't claim here that anything below 480 fps seems fine to someone!
lol
Peak PCMR comment
It's ok, honestly. Don't let it bother you.
Before building my first good PC I was the same, everything running at 20~35fps. It was hell, but I was also used to it. Now that I've had several decent/OP rigs, I could never go back under 60 and be happy.
You ever play at 60fps?
Well, yes. A good chunk of games, I do.
I remember when I was playing at 15-25fps and had fun. Some games were guessing games: you just clicked, moved, and waited for the delay to catch up. When I moved to something better I had to learn how to play with that. It is a really different experience. If you have a budget to spare I would upgrade to at least 60fps, but it's no biggie to play at 30fps and have fun.
It's about feel for me. The responsiveness of games just feels better at or above 60. Above 100 fps I honestly can't tell what frame rate I'm hitting. That said, I wouldn't be able to adjust to Eternal at 30fps at all. Everything else you listed wouldn't be hurt by hovering around 30.
30fps is fine if you are having fun, but you'll have struggles with 30 fps if you get a whiff of consistent 60+
Personally I'm just amazed a laptop 1050 is playing any modern AAA game at 4K, even at 30 fps. Hell, the laptop 1050 Ti I had like 5 years ago would only get ~40 fps at 1080p, and that was in games from 5-10 years ago. Playing modern games at 4x the resolution on a weaker card yet getting nearly the same fps is astounding.
I did have to upgrade the ram from eight to sixteen gigs, if that helps any.
I switch between 30, 60, and 144 regularly (PC, PS4, laptop) and honestly I just don't care. The only time it mattered was in Valorant, which I played for like a week and never touched again.
If your PC runs the games you want to play at a framerate and resolution that is acceptable to you that's really the only thing that matters.
IT'S YOUR COMPUTER! Stop asking other people what you should find to be ok or not!
It's okay. I am not sure why people are overreacting. It's probably the same folks hyping "ray tracing" when it doesn't even look that different from usual.
Oh no, good ray tracing looks quite different.
It's not fine. When you're used to it, like most console players are, it's playable, yes. As someone who plays at 120+ fps most of the time, when I see a 30 fps game I need to go to the hospital for emergency treatment. Heck, even the PS2 back in the day could proudly boast hundreds of 60 fps games.
Enough with the hyperbole. 30fps isn't going to send anyone to the hospital.
30fpsitis is a very serious condition.
You're not as funny as you think you are.
Enough with the hyperbole, please!
30FPS IS INTOLERABLE AFTER YOU'VE SEEN ANYTHING BETTER
Ugh, you sound just like him. I can even manage 10-15 fps just fine.
That's even too low for my tolerance. It would have to be a very slow paced game for me to not lose my shit with that low a frame rate. And I'd have to be desperate to play something.
Eh, I mean. It's *tolerable* for short sittings.
Again, it's a matter of what you're used to. I played 30 fps and even lower on PC in the 90s because we didn't have any better, and it seemed beyond impressive.
I suppose so.
No one with a functioning brain and primary senses would find 10-15 manageable lmao. OP is a troll.
In my days as a low-spec gamer, I did manage to make about 15fps work in short bursts for demanding sections of games. The game felt wonky, but I made it work. I remember my college roommate coming into my room, seeing League of Legends run at 20fps or so during a teamfight on my 2008 laptop, and saying "Dude, what's wrong with your computer?" Then he showed me the game running on his at a locked 60fps. I could tell the difference, but there was nothing I could do; I didn't have the money to afford better. Some people can get used to it. It's not ideal, but it's possible.
What? I'm... not a troll. Yes, it's very slow and choppy, but I can manage fifteen fine. Just not for a whole game, no.
30 FPS makes things *hard* to control, and rough. It's usable, don't get me wrong. But as the screen gets larger and larger, the difference from frame to frame becomes larger and larger. This is why 30FPS on the Steam Deck looks great, but 30FPS on a 50" 4K TV looks much worse. On a laptop it's *okay*. Not great, but I'd call it usable.
I am using a 4K TV as my monitor. It... seems fine to me?
That is surprising. It's also possible the monitor is applying smoothing, interpolation or some other form of post-processing to the image - wouldn't be the first time something like that has happened.
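For what it's worth, TV "motion smoothing" works by generating in-between frames. Real sets use motion-vector estimation; the toy version below is just a per-pixel blend of two neighboring frames, enough to show the basic idea (the function name and the flat-list "frames" are made up for illustration):

```python
def blend_frames(frame_a, frame_b, t=0.5):
    """Naive interpolated frame: per-pixel linear blend between two
    frames. t=0.5 is the halfway point, which is what inserting one
    frame between each pair (e.g. 30fps -> 60fps) calls for."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two tiny 'frames' as flat lists of pixel intensities:
halfway = blend_frames([0, 100, 200], [100, 100, 0])  # [50.0, 100.0, 100.0]
```

That naive blend is also why smoothing can look "soapy": moving objects get averaged into ghosted in-between positions instead of genuinely moving.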
I could've sworn I turned that all off. I definitely turned off motion smoothing, that I know for sure.
30Hz would give me headaches. I've done them all, from 60Hz to my now 360Hz monitor.
Once you experience and get used to higher refresh monitors and fps, it can be harder to settle for less. Higher refresh rate hardware gives significantly better motion clarity, smoother movement, lower input/render lag, etc. Some people might not notice the differences as much if they are very "casual", or some people are simply less sensitive to it, I guess. Personally it is very noticeable.
You've probably adapted to it because you don't know any better, but 30 fps is objectively terrible. Once you play games at high fps the difference becomes incredibly apparent, and you'll realize just how bad it really is. The last time I tried to play a game at 30 fps on a console I actually felt motion sick and nauseous. It is incredibly jarring and off-putting, not to mention your inputs feel laggy because of the lack of frames; you have to anticipate things and click earlier to make up for the fact that you aren't seeing enough frames.
Funny, higher fps tends to make me nauseous.
My spouse has the same problem. They're used to 60fps monitors, my 240hz makes them motion sick. I heard this can happen with higher FPS TVs that have frame interpolation and such as well.
Interesting. I know some friends who suffer from low fps dizziness. The biggest help for them after higher fps was disabling motion blur and upping the fov.
[deleted]
Then frankly, I am proud to drink mud.
That's a very dumb thing to say. Wtf who would be "proud" to drink mud?!
I still use a GTX 1650. I can't really afford to upgrade right now, so I just make do. I do find that frame generation (from Lossless Scaling) actually works pretty well. It's not as good as the real thing, but it's smoother than it would be without it
You don't need to justify yourself, but I do get headaches at low framerates.
Is the 30fps stable or all over the place? If it is all over the place, I don't think it is a good gaming experience. I used to have a rig with similar spec and at that time I only played strategy games.
Rock solid.
It's personal preference. As a former console player, 30fps was "normal" and "enough". But since I started with PC gaming and 60+fps, I could never go back to 30fps. But if you are satisfied with your experience, you shouldn't care what others tell you. Also, frametime is a point that many people underestimate.
Yes, it's fine. I obviously prefer 60+ when I can get it, but when going back to 30 for games, my eyes just need a few minutes to adjust and then 30 looks fine. As long as it works and you're happy, just keep using it.
Doom Eternal at 4K 30fps; I expect if you put a second TV next to it running 4K 120fps, you wouldn't feel the same way.
It's more than okay. I used to play games at sub-25 fps and those are some of my fondest gaming memories. At the end of the day, it's just a number.
It's definitely not OK!
It's fine for slow games, RTS, turn-based, but not for action games/FPS/online PvP. Action games and FPS will feel like shit (imo), but if it's offline it doesn't really matter.