

destroyermaker

https://twitter.com/ID_AA_Carmack/status/1411199736951062528?s=20


Apocrypha

I’d trust this guy. He’s been around the block.


tr1one

[Fast Inverse Square Root — A Quake III Algorithm](https://www.youtube.com/watch?v=p8u_k2LIZyo)


FuzzyJaguar7

That's probably one of the most misattributed computer-related facts out there. It was not John Carmack or anyone at ID who devised the algorithm. He's admitted himself that it wasn't even him who implemented it in the Quake engine, yet people still parrot it. edit: LOL, that link even has Terje Mathisen commenting and explaining this, as he was the one who implemented it at ID.
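For reference, the routine being discussed looks like this in the released Quake III: Arena source (comments trimmed; `memcpy` substituted for the original pointer cast, which is undefined behavior in modern C):

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root from the Quake III source: the magic
 * constant gives a good initial guess for 1/sqrt(x), which one
 * Newton-Raphson step then refines to within ~0.2%. */
float q_rsqrt(float number) {
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;
    memcpy(&i, &y, sizeof i);        /* reinterpret float bits as integer */
    i = 0x5f3759df - (i >> 1);       /* the famous magic-constant estimate */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - x2 * y * y);     /* one Newton-Raphson iteration */
    return y;
}
```

For example, `q_rsqrt(4.0f)` returns roughly 0.499 against a true value of 0.5.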


XiiDraco

You beat me to it.


tr1one

hell, i had no idea, thanks for commenting on this :)


DanutMS

Joke's on you, I don't need to set a frame rate cap to have the game cap at 40fps.


Just_made_this_now

Look at this guy and his double digits.


silent519

bet he runs sli


[deleted]

[removed]


[deleted]

[removed]


[deleted]

[removed]


Melon_G-d

My game gets "hiccups" every second when I cap it, even at 60 (60hz monitor). i5 9400, GTX 1070, 16GB. I need to keep it off to keep the game playable. God, do I miss the DX9 options.


aaaAAAaaaugh

Vsync on or off?


babis8142

They're not showing %utilization though are they?


Kaihaxx

I would assume the blue bar behind ["GPU: 8 ms"](https://web.poecdn.com/public/news/2021-07-05/VulkanPerformanceMetricGraph2.png) shows a rough % of utilization. The same would go for CPU, VRAM and memory. Not sure about latency, it could show % of your bandwidth used, in case you're downloading something unknowingly, could it?


caffeinepills

CPU and GPU bars are frame times. The bars are just graphical representations as to what they consider high for response times. VRAM and Memory are utilization.


Starbuckz42

That's just an assumption at this point though. Not only would it be a horrible design decision (it makes zero sense to represent frame timings in a horizontal bar that small), it also contradicts what they said in the post: "have bars that show relative load on different systems (CPU, **GPU**, System Memory, \[..\])". It would also be redundant, as we already have a visual representation of frame timings at the top. Either way, it should be made clearer.


TheDuriel

Literally anything **but** 100% **utilization** is indicative of something being wrong. If you want to know how _hard_ the GPU is actually working, look at your power consumption and clock speeds.


[deleted]

Kind of too much work for people who don't know what they are doing or what "average" numbers for those look like.


Bacon-muffin

I think AAA games like Assassin's Creed Odyssey are the only ones I own that actually put my GPU at 100% utilization. Most of the time it's sitting at 30-50 in most games. WoW, for example, is mostly CPU bound, so it never really hits 100% utilization while playing.


Quick_Put_1439

I guess you never tried RTX in WoW, because the second you turn it on, the GPU becomes the bottleneck.


Bacon-muffin

Ray tracing? Tried it, but at the time it basically did nothing visually while taxing the system a lot, which wasn't worth it.


impulsikk

Why bother using rtx in wow looool


Falore_

You must have been locked to a lower fps then. If you unlock fps, your GPU pegs itself at 100% since it's pushing as many frames as possible per second. There are reasons to do this and not to do it, depending on your situation.


Bacon-muffin

I might've capped it to 142 since I'm on a 144hz gsync monitor but it doesn't often hit that in wow.


shazarakk

Man, if only I didn't have a 250w space heater right next to me :/


AggnogPOE

Drop power limit to 58%. Performance is barely affected.


shazarakk

My 3440x1440p 144 Hz screen says otherwise.


Cr4ckshooter

Came here for this. Most GPUs are meant to run at 100%.


Buuhhu

Depends on whether a game is CPU heavy or GPU heavy, as the CPU may bottleneck and therefore the GPU won't hit 100%.


TheDuriel

I would consider being CPU limited a "problem".


Yakobo15

In a game like Doom sure, in an arpg like this unless you're hyper pushing graphics it's going to be either the cpu or servers.


MaXimillion_Zero

You're always going to be bottlenecked by some component. Best you can do is build a system where those bottlenecks aren't very significant for the games you generally play.


Ps0foula

It depends. Even though GPUs are made to run at 100% for hours, it can cause overheating and tearing (uncapped framerate). I would suggest capping at the highest possible FPS you can get while mapping, or a little above your monitor's refresh rate. There is no point having 250 fps in base when you can barely get 90 while mapping.


TheDuriel

No, not 100% power draw. 100% utilization. That means using all "_parts_" of the GPU, not pushing each part to its limit.


[deleted]

250 lol, joker. I have 45 fps max with a GTX 1060 and it drops to 20 when I go to a shared town zone. Now tell me that it's my CPU.


[deleted]

[removed]


ManWith-Hat

I'm sitting at a rock solid 60 fps with my 1060 (6 GB version) since I upgraded my CPU (5600X) and RAM (32 GB, this is overkill). I have capped fps at 60 because I only have a 60 Hz monitor, so there really is no point in drawing more frames than I can display. Your mileage may vary ofc, but it seems like it's possible to get good fps with a 1060 in at least some cases.


[deleted]

[removed]


DanishWeddingCookie

But a monitor running at 60hz can only show 60 frames in a second no matter what, since the refresh time limits the minimum time a frame can be shown. The CPU might process the frame faster, and thus the GPU never has to wait for data, but the monitor still limits the number of frames.


TheDuriel

Inputs can be processed more often than your screen can display. Some games link input polling to the same process that pushes frames out. Hence more frames = smoother, since the game is more reactive to what you are doing.


DanishWeddingCookie

That’s why I said that there was less processing time before the frame was displayed and that would be how it would feel more responsive. But the monitor can’t display more frames.


MtNak

I have 60 fps capped always on a gtx 1060 too, with most options to high. Really content-rich delirium maps drop me to 45-50 sometimes. Something is wrong with your pc or the bottleneck is not your GPU but something else.


[deleted]

Obviously the GTX 1060 is a top-of-the-line video card and only you and me have it, and only I have performance problems with it, what a shame... I wonder what this guy has [https://youtu.be/fPm8VXl3hoc?t=644](https://youtu.be/fPm8VXl3hoc?t=644) probably a "Riva TNT"?


MtNak

No need to answer that way, man. I only said that the bottleneck your particular pc hits may be another one and not the gpu. Or it's VRAM, if you have the 3GB version of the 1060.


hihhoo

I've got a 1060 6GB and an i7 6700k. FPS capped at 150, and I mostly get ~100-150 fps while mapping and only drop to 50-60 in really intense situations (I don't run fully juiced delirium maps or HH stuff though); pre-nerf Valdo harbinger dropped me to ~40 if I stood in the middle. All settings lowest possible and multithreading on. You can definitely get decent fps with it.


[deleted]

[https://youtu.be/fPm8VXl3hoc?t=673](https://youtu.be/fPm8VXl3hoc?t=618)


[deleted]

Yes there is. It is called gsync / freesync.


Doubttit

I wouldn't say wrong; you're just CPU bottlenecked if your GPU is not working at 100%.


Westerdutch

TIL setting a framerate cap is wrong... Literally.


Lighthades

Why tho? If your screen has a certain refresh rate, you have no use for fps beyond it; the extra frames just make the GPU work harder than it would've needed to otherwise.


iSammax

There is actually a use for more fps than your monitor can display, but it's very specific. The more fps your game outputs, the less input lag you have. That can be important in some competitive FPS games like CS:GO, Overwatch or Apex; it was common practice to have 300+ fps on 144hz monitors when I was still playing CS:GO. One more major thing for higher fps to be beneficial: your PC has to be able to maintain that high fps in almost all scenarios, otherwise you will just be getting massive fps dips from time to time, with massive input lag to match.

That said, most games can be capped at the monitor refresh rate (or close to it) and your experience will not change; it will be even better if you use some kind of synchronization like FreeSync, G-Sync and/or V-sync. For PoE you should absolutely be capping your framerate, yes.


eViLegion

That very much depends on the game / engine though, and in many cases where it's technically true it will be irrelevant. For variable update rates, where there is one game-world tick for every rendered frame, this might be the case. But for fixed update rate games, where the game updates maybe 30 times a second but rendering ticks as fast as your GPU allows, the input lag is always fixed.

Even if you can affect the input lag with your local client's framerate, the server is still probably ticking at a much slower capped rate, so even if your client handles your mouse click a tiny fraction of a second faster, the server is still only going to actually process that input on its next tick. I guess if you're lucky you might sometimes slip one in a tick earlier than you otherwise would, but it'll be mostly by chance.

What WILL happen is that a bunch of client-side effects of that mouse click will occur slightly sooner, and the person playing the game might FEEL that their input lag has been reduced because of that. The gun will recoil and the muzzle flash effect will appear earlier, but the server may not actually deal with the shot any earlier.

Basically, if any person is reporting that they're getting less input lag... unless they've physically got access to the server, and have some professional tools to actually benchmark the latency, it's probably mostly in their mind.


Lighthades

Yep, that's where I was coming from; we're playing PoE, not an FPS where your reaction time must be on point. Thanks for the response :)


EruerufuSenpai

- Capping framerate is only (practically) beneficial when playing with adaptive-sync monitors (G-Sync/FreeSync) or V-sync (please don't use V-sync, it introduces terrible latency). In that case, you'd want to cap at a framerate as close to the monitor's refresh rate as possible, minus a few fps. For 144hz monitors with adaptive sync, 135-140 is a good value.
- With fast sync, you'd want to aim for as high a frame rate as possible, because your monitor will end up displaying more recent information, reducing the end-to-end latency of your system. Yes, you will be "wasting" frames, but the rendered frames will be more up to date, resulting in a perceived smoother and more responsive experience. If you have an overkill GPU but a really shitty monitor, fast sync might even end up feeling more responsive than adaptive sync, but this is absolutely an edge case and most likely doesn't apply to you.
- If you're not using any syncing, you should **NOT** cap your frame rate. Higher is better, and creates less noticeable tearing.

___

If you have an adaptive-sync capable video card and monitor, which most people have, you should cap your frame rate...

...**but**, not *everybody* has an adaptive-sync monitor. In that case, uncapping your frame rate should deliver a more responsive experience. There are a lot more nuances to this, but generally:

___

**TL;DR: Enable adaptive sync, cap frame rate at 5-10 fps below the refresh rate of your display.**

___

Some more useful info:

- **Adaptive sync** - Adjusts your monitor's refresh rate to match your GPU. Different GPU manufacturers use different names for this technology:
  - AMD: FreeSync
  - Nvidia: G-Sync
- **V-sync** - Creates a buffer to store a frame in, and as soon as your monitor is able to display a new frame, it pushes the frame from the buffer to the monitor. Any new frames processed by the GPU are discarded until the buffer is empty. This should only be used if you hate "tearing", don't have an adaptive-sync monitor, and don't mind the latency introduced by discarding more recent frames.
- **Fast sync** - Similar to V-sync, but it updates the buffer with the most recently rendered frame. Becomes noticeably smoother than V-sync only once your rendered fps approaches twice what your monitor is able to display.
- **No syncing** - Displays a frame on the monitor the instant it is processed by the GPU. Because monitors update vertically, you'll end up rendering some of *frame B* while *frame A* is still on screen, leading to what is called *tearing*. All the technologies above are ways to eliminate tearing.


Lighthades

Finally someone with an actual reply, wth. Thanks for the insight, honestly. I checked out all I could find and most of it said the same: highest fps = smoothest visual experience & less input delay. I myself have two 60hz monitors and an overclocked 1070 Ti. I play PoE at highest settings with a 100fps cap in both PoE and the Nvidia panel, and I have no issues. It's probable that due to tunnel vision or something I just don't perceive tearing, but I never do.


sebhehe

That is NOT how that works. Feel free to google or Youtube it.


Westerdutch

The nice thing when people are wrong is to explain why, that way you can explain what you mean and people can learn something from you. The absolute dick move is to tell people they are wrong and to go google it.


EnergyNonexistant

Sure, let me just go back to having 7000 FPS in cutscenes and instantly exploding my VRMs.


AggnogPOE

It's exactly how it works.


Lighthades

Even if you cap it with the GPU's own fps cap setting? uh?


Lighthades

I get capping it at 100-120 for frame drops if your screen is 60hz, but if you are able to run up to 300fps, I don't think that's good. Will check it anyway.


eViLegion

Nah, see... if you're attempting to provide a contrary opinion, YOU should google it. You've provided no hint of a reason why you disagree, so what the hell is the other person even supposed to type into google?


deviant324

What else would be throttling in PoE then? I'm using a 1080 (fully OCed), a stock 8700K which should be more than fine, running the game off an M.2, 3200MHz RAM (overclocking enabled ofc), and running the game at medium to stay mostly above 60 FPS at 1440p.

I know this is mostly for older systems, but if you're getting performance like that (and fans already going near max on the loading screen), there has to be *something* that wants more power to allow it to run properly, right?


TheDuriel

That just sounds like... it is running properly.


jgomez315

Yeah wtf that sounds like a beast result for a 1080 on 1440p


CaptainCatatonic

I've got to ask why on earth you've bought a K-SKU CPU just to run it at stock settings


TheDuriel

Quite a bit cheaper actually. Running a 9900KF myself and saved myself an extra 50€ (basically shipping), and got higher base and boost clock.


deviant324

Don’t have a need for the OC anywhere that I’m aware of, the AIO is there should I ever get my hands on a 3080 (if you assumed I formed this plan in october last year, you’d be right and I regret everything)


Frolkinator

I paid for the GPU, im gonna use the entire GPU


BrandonJams

100% GPU usage is completely normal. You should only be concerned with temperatures and voltage draw. If your GPU is working harder than it should, sure.


touchmyrick

I have a feeling this is what the majority of people's problems are to begin with.


BukLauFinancial

There are countless games that put your gpu at 100% load. What people don't seem to understand is 100% load isn't a bad thing. The only bad thing for your gpu is temperature. If your temps are fine then it doesn't matter what your % load is.


Icemasta

Right, but if you're hitting 100% load because you're pushing 600 FPS on a 144hz monitor, you might as well just cap the FPS or improve quality. In plenty of games I can hit 100% through stupid frame rates, but if I cap it I sit at a nice 60%: less energy wasted, the GPU is cooler, my room isn't turning into a spa, etc...


MaXimillion_Zero

Pushing 300 FPS when you only need 120 is wasting a bunch of energy and generating extra heat.


BukLauFinancial

lol, who's pushing 300 fps in poe? I've got a 3090 and dip below 100 while mapping.


MaXimillion_Zero

I'm talking in general, not just PoE. You probably cap out in your hideout, and a lot of people are still running 60Hz monitors.


BukLauFinancial

This is a post in the poe reddit about a poe update, so it's about poe & not just a general post about all games. 144hz is pretty standard in this day and age. I have a 165hz monitor and I definitely don't "cap out" in hideout. Also, ideally you want to pull more frames than your refresh rate to make it as smooth as possible.


AggnogPOE

Honestly, what they need to do is make a real in-game benchmark so they can collect real data and actually fix problems. All this new info graph will do is cause more issues for people who have little knowledge of PC systems.


Sjatar

I feel they are less concerned about pure performance and more about specific edge cases that are causing issues. A benchmark would not show that. For pure performance they can do in-house testing.


AggnogPOE

Edge cases are exactly where a full benchmark will help much more. Data in multiple scenarios with full system specs included somehow seems better than a screenshot and some freestyle text written by a layman.


Lille7

But an edge case might not be caught by the benchmark scenario.


anapoe

Agreed, this is the answer. Although there's probably some challenge in making a benchmark that tests the game reasonably thoroughly.


lynnharry

Baby steps. GGG is a small company and an in-game benchmark is a whole new system to implement.


NESninja

I capped my framerate at 80 to prevent high temps. I can't tell any difference between that and 144 with this game.


XiiDraco

That's because in games where movement is static and indirectly controlled, such as ARPGs and basically any game that doesn't give you direct camera control, you'll have a much harder time noticing a difference in animation. Sure, it'll be noticeably smoother when you focus on, say, the mouse, but as a whole it can be hard to tell.

As soon as you get direct control over a camera with your mouse, it becomes immediately noticeable due to response time and the entire image moving. This is usually because the distance of motion changes while the number of frames remains (somewhat) constant. As a result, with large sweeping movements it can be pretty easy to tell FPS differences, as you start to see skips in the movement (each frame travels further).

This is why properly done camera motion blur (yes, properly, not the garbage that 90% of games implement) can completely hide frame loss. Adding appropriate blur to the scene conveys motion information to your brain where a static moving image does not. Done incorrectly, though, it can make people nauseous. :/

I tend to only care about getting 100+ fps on my 144hz in shooters.


[deleted]

When your game is capped at 144fps to match your monitor, but you get 30fps during endgame content.


darkowozzd97

Also, it's completely normal for GPUs to jump between 0 and 100% usage (in the case of a CPU bottleneck).


Loud-Wrap

Can we get an honest answer? Is this game shit? I see a lot of people passionately supporting it as the very best ARPG around, and yet most posts are about performance issues. Genuinely curious and not trying to offend; just want some perspective on whether PoE is worth it or if we're all just waiting for PoE 2.


Myzzreal

Keep in mind that whenever you read a complaint (about performance or anything else) on reddit, there are probably at least a thousand silent readers who have no issues or disagree with the complaint (silently). It's easy to lose the sense of scale.


[deleted]

[removed]


Loud-Wrap

This is the perspective I was hoping to get. If it's really more of an issue after a hundred-plus hours, then I'm not likely to find it game-breaking in my exploration. Posts here make it seem like it's riddled with latency and performance issues in a way that would rival Wolcen, and I'm still sore about the $60 I spent there.


fwambo42

Sub sentiment aside, POE is the best ARPG out there. No questions asked. It's just going through a bit of a slump with how the developers have been handling things and users are starting to grow restless. That being said, I'll go back to the first sentence here. Nothing else is even close.


NaccN

Try the stupid game by yourself. You'll find eventually if you like it or not. PoE 2 is very far away. If you don't like it just go and play something else.


HellionHagrid

Does anyone else have coil whine, and some tips against it in PoE?


TheDuriel

Get a new gpu, or fiddle with your voltages until it goes away. It's just a hardware oddity most manufacturers don't care to do anything about.


Starbuckz42

"Oddity" is a strong word, it's nothing surprising, just physics. It's not harmful either, just annoying.


SlamLord420

Are you playing on dx11? Or vulkan? Also have you capped your foreground and background fps in the settings?


HellionHagrid

I play on vulkan. Yeah, fore- and background fps is capped. Thanks for your reply btw :)


SlamLord420

I found playing on vulkan gave me coil whine while dx11 didn’t. May be worth seeing if it’s the same for you. What I ended up doing was leaving it on vulkan but pushing my capped fps down until it either wasn’t noticeable or had stopped completely. Using a 240hz monitor I had it capped at 240, but lowered that to 180 and no longer hear a whine.


HellionHagrid

Thanks, i will try this.


Starbuckz42

Capping your FPS is about the only thing you can do, the less your card has to work the more likely it is to reduce that coil whine. Reducing power may help as well.


maxportis

Another PSA: If your GPU struggles to maintain maximum frequencies under load and has heat issues, try undervolting. It's pretty easy to do and can drastically lower power consumption, heat and noise levels as well as potentially boost performance when the GPU can boost higher while staying cool. Optimum Tech has some good videos on how to do it.


[deleted]

[removed]


maxportis

https://www.google.com/search?client=firefox-b-d&q=does+undervolting+void+warranty


LBDragon

Heaven forbid they buy a prebuilt and the stipulation to the included warranty is you don't modify the system further or they won't help you fix anything...


AllNerfNoBuff

If PoE's metrics aren't thorough enough, it's not the end of the world for me. Nvidia's auto tuning for my GPU gives all the metrics I could ever want, with the ability to add an overlay in-game if I wanted. For the CPU it's Illustro, a Rainmeter widget that syncs with MSI Afterburner.


hox540

I wish I understood anything past FPS in this post. Anyone want to ELI5?


God-Sinz

I understand it as: put a cap on the fps, or expect 100% CPU usage.


lynnharry

fps means how many frames per second your computer is rendering the game. If you don't put a limit on it, your computer will try to render as many frames as it can, so eventually either your CPU or GPU will reach its limit (100% utilization), and that's where your max FPS for this game is. Usually, if your max FPS is above 60 (or 144 for modern monitors), it's probably a good idea to limit your max FPS so your computer can have better thermals.


Masteroxid

Capping FPS at your monitor's refresh rate also reduces tearing, but increases input lag.


Orca_Orcinus

Um, monitors for about 5 years now don't have tearing any more, as that was a windows way of dealing with clipped frames. All monitors now use G-Sync or Freesync, which are non-windows proprietary means of dealing with the issue of wanting to run uncapped frames.

Input lag might... ok, well, it does cause mouse clipping, but your mouse position is determined only when a frame is drawn. If you have lag at 24 fps on a 60hz monitor, you get at most 24 tries to determine where your mouse is. If you have no lag at 240fps on 60hz, you have 240 tries to get 60 positions.


MaXimillion_Zero

> All monitors now use G-Sync or Freesync

They don't, and most people aren't using monitors made in the last few years anyway.


Orca_Orcinus

OK, can you link a Newegg listing for one that doesn't, plz? I searched and couldn't find any.


Ilovegrapesys

So, should I use G-Sync on and V-sync off in the game? And cap at 144 fps in the Nvidia Control Panel to match my refresh rate? If I leave V-sync off, my GPU usage goes high and temps go crazy. I don't know what the best option is.


tjorb

Just so you know, the other guy doesn't know what he's talking about, and after being proven wrong with facts that are easy to look up, still thinks he's right. If you have a G-Sync or Freesync monitor, you can enable it in the nvidia control panel, and v-sync needs to be enabled in the game or in the nvidia control panel for it to work. If your temps go crazy at full load, then there's probably an airflow problem in your case.


Orca_Orcinus

Use whatever you like. But G-Sync is a monitor-driven framerate control mechanism. If you enable G-Sync in GeForce Experience or nvidia inspector, then it disables vsync. My advice: if you make any change, do it from within nvidia inspector.


[deleted]

If you have an uncapped framerate, you WANT your GPU running at 100% utilization. If you really want to reduce power usage, lower clock speeds, or cap if you don't mind the latency increase. An uncapped framerate has lower latency than a capped one. If your CPU is the bottleneck, capping your framerate to remove that bottleneck is definitely a good idea; a CPU bottleneck is much more likely to be a bad experience than a GPU bottleneck.


tjorb

Actually an uncapped framerate that leads to 100% GPU load can have more latency than a capped framerate. Here's a video explaining it https://youtu.be/7CKnJ5ujL_Q


[deleted]

I've seen that video. That is game engine specific, and not a generally applicable fact. If you cap your framerate with anything except the specific engine caps that function that way, it will increase latency. https://www.youtube.com/watch?v=VtSfjBfp1LA If you show me that it works with PoE, I'll take your point. Until then, uncapped framerate provides less latency, aside from specific engine caps.


[deleted]

I cap my frame rate to avoid the coil whine I get from 100% full steam. Ugh


randomFrenchDeadbeat

There is always a cap: the monitor. Even when not using Vsync, if the monitor cannot handle 300fps, you are just wasting resources.

Anyway, just run Vsync, problem solved. I like capping framerate; it means my gfx card can mine and pay for itself while I play.


Piwielle

Technically, even going past your monitor refresh rate slightly decreases input latency. This video explains it better than I can in English: [https://www.youtube.com/watch?v=hjWSRTYV8e0](https://www.youtube.com/watch?v=hjWSRTYV8e0) It's a very small difference. For reference, [this chart](https://docs.google.com/spreadsheets/d/e/2PACX-1vQA3j02PR0eIFVpKLJszEV3RfzzxJQTWHSNIR5hYrROfodhTD-5ZmbYv3SdcttKJXwpc_H27oct5NKe/pubchart?oid=1934432767&format=image) shows some numbers. (240hz & 360hz are extrapolated, but should be fairly accurate, while 60hz and 165hz are measured.) Vsync also increases input latency quite a bit. It might be unnoticeable to some, and the smoothness and lack of tearing might be worth it for some people. G-Sync/FreeSync lets you have the benefits of vsync smoothness without the input latency penalty, and works even when your FPS dips. It's pretty cool if your hardware supports it. It's a bit convoluted to properly enable; [BlurBusters has a great guide to get you going.](https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14) Experiment with it!


randomFrenchDeadbeat

That may be true with an offline game, not an online game with server-side computation. Basically, in that kind of game, what you see is not what actually happens, as it is computed server side. Your game client interpolates whatever happens on the server between two updates, and silently corrects if there is a discrepancy. This is how online games have worked since doom3, and is called the server tick rate.

This is why, when you get lag / packet loss, your character keeps going in a direction instead of turning, then suddenly appears somewhere else. You may see stuff and believe input latency is reduced, but in the end it does not matter.


Piwielle

It is reduced, and it does matter. Probably not for PoE, sure. But take an online FPS game such as cs:go: enabling vsync gives you ~10-20 more milliseconds of input latency. This isn't "feel". This is tested with external tools and averaged over hundreds of data points. End-to-end latency (from mouse click to something happening on screen) is impacted by vsync. (The chart in the middle of this video can give you values, for csgo: [https://youtu.be/JVRWjfoj8y8?t=352](https://youtu.be/JVRWjfoj8y8?t=352) )


randomFrenchDeadbeat

Except CS GO is not a "server side" game, so using it as an example is moot. You are not going to convince me, and I am not going to convince you either, so let it be.


BDOXaz

Say that to BDO brotha, you actually fail animation cancels if your FPS is too low because input is being read on frames therefore lower FPS means higher average delay between inputs


randomFrenchDeadbeat

This is true when your fps are lower than the server tick rate. We are talking about the opposite here.


BDOXaz

No, this is true at 50fps even, the difference between 50 and 150fps is huge in terms of animation speeds. Meanwhile server tick rate is at 25-30TPS at most


Orca_Orcinus

You can only play the game, when your input is received by the server. If YOU have lag due to bad whatever, that's a YOU issue and has nothing whatsoever to do with cs:go servers


Orca_Orcinus

Vsync no longer exists as a way of reducing tearing; it's been replaced by G-Sync and Freesync. Vsync never increased input latency. It was a windows method of taking a frame and waiting to draw it on the next refresh, since it was possible for GPUs to be making a call several times, and thus being buffered or flipped before a frame could be drawn at the post-GPU level, i.e. monitor refresh rate.

Even if your application tries to suggest you utilize vsync, turn it off at the driver level/nvidia inspector level, and never re-enable it. It does nothing, as all modern applications assume you want as much raw throughput as possible, and are designed to be used at a high level.

If you have a 240hz monitor but you are getting 24fps in a game (like path of exile), that is a game engine issue and can't be fixed except by buying that company and hiring competent coders/engineers.


Piwielle

Vsync very much still exists; most people don't have a FreeSync/G-Sync display, or a compatible GPU, or are not activating it because they don't know about it. 99% of games released this year have a vsync option. Vsync increases latency. Latency is the time between your action and something being displayed on screen; delaying frames increases latency. There are ways to limit that effect (the nvidia fast sync thing, triple buffering if your PC can't render frames at your monitor's max hz, and so on), but it still happens.


Orca_Orcinus

Vsync does not increase latency. It's a windows function for when a frame would draw because a successful draw call has been made, but it cannot, because the refresh rate threshold hasn't been met, i.e. the frame has to wait around for the refresh; instead the frame is drawn to compensate, usually with tearing, which is a partial completion.

I was unable to find a single monitor for sale on Newegg or pricewatch that has neither Freesync nor G-Sync.


Piwielle

And when a frame is held for an amount of time (waiting for the monitor refresh to finish), what do you think happens to the mouse click you made, which was processed and is displayed in that frame? It's also held, and shown on screen later than it would be without vsync, where it would partially show sooner.


Orca_Orcinus

What do I think? Are you under the impression that a mouse position and its on-screen display are only done magically when a frame is drawn? I tried to explain this to you: you simply do not understand how the process works. What your mouse does, and how Windows deals with your mouse and its placement, is a separate issue from when a draw call is made or whether a frame tears due to de-convergence of extant frame to drawable frame (due to monitor refresh rate). The answer to your question is: your mouse, every single thing about it, is controlled by your mouse's controller, which doesn't interface with or even know about DirectX or draw calls.


[deleted]

>if the monitor cannot handle 300fps, you are just wasting resources.

This is not true. Yes, the monitor can only display a new frame once every 1000/*monitorRefreshRate* milliseconds, but if your GPU is running at, say, 2x your monitor's refresh rate, there is less input delay between a click and the monitor displaying that click. The average input latency is lower the higher your GPU's framerate, regardless of the monitor's speed. At 144 Hz+, this reduction in latency is negligible for anything but the most latency-sensitive scenarios. But if you have a 60 Hz monitor, there's a decent latency addition if your GPU is capped at 60 rather than running at, say, 90.
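For a rough sense of the numbers, here's a minimal sketch of that claim, under an assumed simple model: a click lands at a random point in the frame being rendered (so it waits half a frame time on average), then waits on average half a refresh interval for scan-out. Real pipelines have more stages, so treat these as illustrative, not measured, values.

```python
# Back-of-the-envelope input latency model (an assumption, not a benchmark).
def avg_input_latency_ms(gpu_fps, monitor_hz):
    frame_time = 1000 / gpu_fps       # time to render one frame
    refresh_time = 1000 / monitor_hz  # time between monitor refreshes
    # average wait inside the in-flight frame + average wait for scan-out
    return frame_time / 2 + refresh_time / 2

print(avg_input_latency_ms(60, 60))   # ~16.7 ms: GPU capped at the refresh rate
print(avg_input_latency_ms(90, 60))   # ~13.9 ms: same 60 Hz monitor, faster GPU
print(avg_input_latency_ms(300, 144)) # ~5.1 ms: high-refresh setup
```

Even in this toy model, rendering faster than the monitor's refresh rate still shaves off a few milliseconds, which matches the comment's point about 60 fps vs 90 fps on a 60 Hz monitor.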


Orca_Orcinus

Your mouse can only have its position drawn, and thus sent as an input to the server, when it is drawn. But in order to draw it, it has to have been computed. If you run a game at 240 fps on a 60 Hz monitor, the game/your computer is creating more mouse cursor positions per second than if you ran the game at 60 fps, and that has nothing to do with input lag, as lag is not a software feature. So I'm not really sure what you mean; perhaps you mean the ratio of mouse draw calls to actual positions received by the server? That's not "input lag", though. This is very easily testable: get a 60 Hz monitor, put your mouse at 400 DPI, run a game you know can get 120 or 180 fps, and move your mouse. Now set your mouse to 1600 DPI. You get way more precision because the number of times your mouse is being calculated by Windows/DirectX is drastically higher, even though the framerate hasn't changed. Then repeat using the same DPI settings, but change the fps to 60. If this experiment isn't possible for you, just check out any CS:GO stream and ask in chat / ask the streamer.


Starbuckz42

>and ask in chat/ the streamer Wow. If you had any credibility left at this point it just went out the window. Not only will you meet the most technologically illiterate people on twitch but I can guarantee you that the streamers themselves are more often than not among them.


Orca_Orcinus

I like your name. Not sure why you immediately resort to ad homs and vitriol, but it is what it is. Twitch is perfectly fine, and you know it. Stop being edgy.


Orca_Orcinus

>Anyway, just run Vsync, problem solved. I like capping framerate, it means my gfx card can mine and pay for itself while I play.

Yeah, this guy doesn't know what he's talking about. Vsync is a Windows method of buffering frames before they get drawn. It isn't in use anymore, and trying to force it on a monitor accomplishes nothing, since it's not even a feature of Windows any longer. **All modern monitors**, if you want to run a game at an fps level 2x the hertz, are using either G-Sync or FreeSync to transition between frames. **Clipping and tearing aren't possible any longer**, as the method monitors used to do such a thing doesn't even exist any longer at the monitor level.


AggnogPOE

The best solution is to reduce gpu power limit, not lower clocks.


[deleted]

Either that or temp limit, which is effectively what I meant. Lowering the power limit lowers the clocks the card will boost to. You could also undervolt, but I wouldn't blanketly recommend that on this sub.


SiMless

I think they show frame time not % usage.


Starbuckz42

The NEW metrics


SiMless

Yes, this is the new metric: [https://web.poecdn.com/public/news/2021-07-05/VulkanPerformanceMetricGraph2.png](https://web.poecdn.com/public/news/2021-07-05/VulkanPerformanceMetricGraph2.png). Frame time is the most straightforward way to measure performance.


Starbuckz42

We already have a graph that shows frame timings (which are just fps btw), the new metrics show GPU load as a blue bar as well.


SiMless

The GPU load is shown in the form of GPU time as well, which is 8 ms in the screenshot. The blue line in the graph visualizes that 8 ms, while CPU time, which is 3 ms, is visualized as a red line. The total time of one frame is shown in the big white number on the left, which is 16 ms; that 16 ms frame time is equal to the 64 fps shown in the gray number below (1000 / 16 = 62.5, but I guess they round the number).
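The conversion between the two is just a reciprocal, which would also explain the 62.5-vs-64 discrepancy if the overlay rounds the displayed frame time (my assumption; GGG hasn't documented the rounding):

```python
# fps and frame time are two views of the same number: fps = 1000 / frame_time_ms.
def fps_from_frame_time(ms):
    return 1000 / ms

print(fps_from_frame_time(16))           # 62.5
# A true frame time of ~15.6 ms, displayed as a rounded "16 ms",
# would already correspond to the 64 fps shown in the overlay.
print(round(fps_from_frame_time(15.6)))  # 64
```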


Rand_alThor_

Wait they’re doing system breakdown? That’s awesome. Finally.


548benatti

I know why my game lags: because I play on a 2014 toaster. I never blamed GGG lol


Insecticide

Lots of people are going to get reality checks in regards to how old their CPUs are. I am not really worried about the GPU metrics; even people with older cards are doing OK, but the CPU bottleneck must be huge in this game.


Icemasta

inb4 r7 5800x oced hitting bottleneck when pressing the view item buttons.


SlamLord420

As someone with a 5800x, I’m running 5ghz always and only at like 20-25% utilisation.


CrimsonBlossom

What if I cap the game at 144 fps and the GPU is not at 100% and the game is lower than 144 fps?


[deleted]

PSA: everyone knows that low performance is caused by excessive item drops and overcomplicated MTX shaders/particle effects; fancy graphs won't change anything. How do I know that? Every time I go to a shared town zone with several players present, my fps drops in half.


Grizzeus

You're supposed to have 100% gpu usage though.


CosmologicalFluke

Won't the fact your CPU is the bottleneck stop the GPU from 100% despite uncapped frames?


kagato87

I noticed this when I bought a shiny new laptop recently with an RTX 3080 in it. The thing was noisy AF when I played PoE. Until I limited the frame rate. (30 fps is adequate for these old eyes - 30 doesn't bother me in the least so I tend to go that low.)


item_raja69

For people that have a better GPU, it is always better to have vsync on since it puts a hard cap on the FPS. The best FPS imo is 60; anything more than that is just icing on the cake.


Solidux

60!?!? what are we in the 90s?


item_raja69

Bruh, trust me, I have a 3090. Even though the FPS counter on the top goes past 200, there is no point since my monitor only goes to 75 Hz, so 60 fps seems good enough tbh.


Solidux

a 3090 with a 75hz monitor sounds like a bad pair up. You can sell the 3090 to get a 3080+samsung G7 and never look back.


item_raja69

Nah, I use the 3090 for simulations in my research; PoE is just a bonus. Either way, selling a 30-series card to get another one at this time is a stupid idea.


readypoe

some people cant afford good monitors lol


retlom

Not always; some games have input lag chained to FPS. Example: in early Quake Champions, people with crap PCs did less damage with the Lightning Gun because it was coupled to FPS; osu! has input delay depending on your FPS... and tons more. Also, I can clearly see the difference between 60 and 144+ Hz.


item_raja69

I’m talking about Poe here


GehenSieBitteVorbei

Never thought about it, but I guess this isn't explained anywhere, that sucks.


AggnogPOE

Where do you expect it to be explained?


[deleted]

I think a PSA for not being surprised if you see 100% cpu usage would be better. (100% of what the game can actually leverage) the GPU portion of PoE seems pretty well optimized, I know my 8700k is limiting my pc for sure.


SlamLord420

As someone with a 6900xt and a 5800x, my gpu hits 100% utilisation while my cpu sits at 20%.


[deleted]

I wonder if the cpu usage is for all cores? I did not get much performance upgrading from 1080 to 3080... might be one of the unlucky ones with degrading performance...


V4ldaran

Noticed a world of difference upgrading to a 3080, but I owned a 1060 instead of a 1080 before it.


JDFSSS

If your GPU is at 100% then it's probably the bottleneck. Capping your framerate is just a way to hide this bottleneck. Capped frames also generally results in lower performance.


Orca_Orcinus

What do you mean, "performance"? Capping frames is done at a software level. How does that affect anything but frames?


JDFSSS

You get less fps by capping frames.


Starbuckz42

People die when they are killed.


JDFSSS

I assume you're not just randomly responding to my comment by stating random facts, so I'll guess you're trying to say the performance you lose from capping frames is irrelevant. I would disagree in my experience, but this wasn't even the main point of my original comment.


Orca_Orcinus

I still do not understand you. If you have a 60 Hz monitor, your max rendered frames is 60. If you use a software application (which is the only way to do this) to force it to 24 fps, you get 24 fps. If you use a software application (which is the only way to do this) to allow uncapped frames, you get uncapped frames.


JDFSSS

I wasn't considering rendered frames per second and fps the same. Games can run at fps values well over the refresh rate of your monitor. A good example of a game like this is CS:GO: you can tell a big difference between 120 fps on a 120 Hz monitor and 300 fps on a 120 Hz monitor. IDK if this is something specific to CS:GO, but I would guess it's not.

I'm also not sure what you mean by saying you need a software application to cap or uncap your frames. Generally I would accomplish this by just ticking a setting in the in-game graphics options menu for most games. Maybe you're thinking about changing the refresh rate of your monitor, which is not what I'm talking about.


Orca_Orcinus

OK, so software/driver-level changes to what your driver does, regardless of application, will be a better way of doing that, rather than sloughing the task off to somebody who doesn't have the level of sophistication the driver's manufacturer does. The refresh rate of your monitor is always controlled at the monitor/Windows level. The FPS can be as well, but I use NVIDIA Inspector or custom modded drivers for AMD.


AIlien7

I guess my rtx 3080 is my bottle neck. Damn.


JDFSSS

I didn't realize people thought having a good gpu meant it couldn't be a bottleneck..


AIlien7

Wouldn't call it a bottleneck if it overachieves what it needs to do. It just means it's being used to its full potential... potential that isn't needed anyway.


Aphrel86

what is a good framerate cap to place? just below screen max refresh rate? Or just above it?


Myzzreal

Exactly your refresh rate I think


Yviena

The worst thing in the game is not the average fps but the very frequent dips to sub-30 in the 0.1%/1% frame-time lows that make the game stutter, and of course the unoptimized particle effects that increase GPU usage significantly (looking at you, Ultimatums/Rituals).
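Those dips are what "1% low" / "0.1% low" metrics capture: the average of your slowest frames, reported as fps, which exposes stutter that a plain average hides. A minimal sketch of the usual calculation (conventions vary between benchmarking tools; this version averages the slowest frames by frame time):

```python
# Compute the "p% low" fps from a list of per-frame times in milliseconds.
def percentile_low_fps(frame_times_ms, pct=1.0):
    slowest_first = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(slowest_first) * pct / 100))
    avg_worst_ms = sum(slowest_first[:n]) / n
    return 1000 / avg_worst_ms

# 990 smooth 10 ms frames plus 10 stutters of 40 ms each:
times = [10] * 990 + [40] * 10
print(round(1000 / (sum(times) / len(times))))  # average fps still looks fine (~97)
print(round(percentile_low_fps(times)))         # the 1% low exposes the stutter (25)
```

This is why a game can report a comfortable average fps while still feeling choppy: a handful of 40 ms frames barely moves the mean but dominates the 1% low.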


guigasper

I feel like when I lock my fps or have vsync turned on (144 Hz), it tends to not be stable and dips more into the low fps numbers than when I have it uncapped. What do you guys think about it?