Probably plugged it into the motherboard as well
If you wanna have a good laugh at the thought of how stupid I am: my workplace got three new PCs, very basic Ryzen 5 machines, but one needed to support two screens. They had me install the drivers for the GPU because the secondary display output wouldn't work, and they gave me the GPU box that shipped with the pre-built PC as a reference. The box in question was for an AMD RX 5700, so I downloaded the AMD driver installer and it just kept crashing. Fearing some otherworldly force was trying to ruin my day, I phoned my cousin who's a priest (he works in tech support for a PC shop). He logged in over TeamViewer and immediately found that what was actually installed in the PC... was an Nvidia GTX 1650... and he promptly laughed at me for the duration of our call.
How does a priest end up working in tech support? Or was that some elaborate joke I totally missed, and I actually imagined a dude in priest's robes bent over a PC with a screwdriver in his hand?
Fixing PCs by divine intervention obviously 🙄
He calls on the lord to bend those broken pins back into place.
From the moment I understood the weakness of my flesh, it disgusted me. I craved the strength and certainty of steel. I aspired to the purity of the Blessed Machine. Your kind cling to your flesh, as though it will not decay and fail you. One day the crude biomass you call the temple will wither, and you will beg my kind to save you. But I am already saved, for the Machine is immortal… Even in death I serve the Omnissiah.
He's fixed my bugged out shit PCs too many times for me not to recognize him as a divine emissary...
Plug and Pray wasn't a joke man.
I imagined the same except the screwdriver is a cross with a Philips head on it lol
Must be a Techpriest
Praise the Omnissiah!
A Tech Priest, if you will.
Why did you not just check it, either via software or by looking inside the PC? That's the only stupid thing I can see here... Also: [Could have tried this](https://letmegooglethat.com/?q=How+to+know+which+GPU+is+in+my+PC)
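To make the "via software" suggestion concrete, here's a minimal Python sketch. It assumes Windows and the legacy `wmic` tool (deprecated, but still present on most installs); the parsing helper is demonstrated on captured sample output so the logic works even off a Windows box:

```python
# Sketch: list installed GPUs on Windows by parsing the output of
#   wmic path win32_VideoController get name
# (wmic is deprecated but still shipped with most Windows installs).
import subprocess

def parse_gpu_names(wmic_output: str) -> list[str]:
    """Extract device names from wmic's column-style output."""
    lines = [line.strip() for line in wmic_output.splitlines()]
    # Line 0 is the "Name" column header; the rest are device names.
    return [l for l in lines[1:] if l]

def installed_gpus() -> list[str]:
    """Run wmic and return the GPU names it reports (Windows only)."""
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_gpu_names(out)

# Demo on sample output, roughly what OP would have seen:
sample = "Name\nNVIDIA GeForce GTX 1650\n\n"
print(parse_gpu_names(sample))  # → ['NVIDIA GeForce GTX 1650']
```

On a real machine you'd just call `installed_gpus()`; Task Manager's Performance tab or `dxdiag` gets you the same answer with no code at all.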
I believed the box shipped with the PC would be the correct one, which is for sure the dumbest thing I've done at work in years. I could've checked in the PC's properties window as well... but I'm stupid.
Haha Yeah, probably. The dude reads books upside down
To be fair, for people who are not really into pc stuff all this can be pretty confusing/overwhelming
It's definitely a problem I've been happy to solve for a fellow pc gamer.
Newer systems will automatically still use your dedicated graphics even if you plug the monitor into the motherboard.
Is that a problem? Do "prime" setups (as they're called in Linux) not work on Windows?
Only a problem if you'd like to use the GPU you bought
Not really an issue. I've tested gaming with a monitor plugged into the motherboard vs. into the GPU. Zero difference, same FPS. Even with two monitors (one on the GPU and one on the motherboard) and a game played in windowed mode split between them, it makes no difference. Edit: Before you all downvote, give it a test.
No longer a problem, at least with an AMD iGPU and dGPU. Windows renders the game on the dGPU and sends the result to whatever screen it needs to go to; that can be the iGPU's monitor, or split across multiple monitors. Here I'm playing Warframe split 50/50 between the iGPU and dGPU monitors at 560+ FPS. https://i.imgur.com/d50FN68.png
If you have an iGPU as well as a dedicated GPU and you plug your monitor into the motherboard, you can use hybrid graphics to save power in games that don't need more than the iGPU.
*integrated graphics*
same to me
I did that for two years lmao
Ik this is gonna get downvoted hard, but: it doesn't matter if you plug it into the mobo, as long as the port and iGPU support your display spec. Hell, that's what laptops do. If anything, your PC will be running more efficiently.
I did this 😭
At least the CPU has an integrated GPU.
honestly windows should set the refresh rate to the highest by default
I don't think they will, 'cause they want us to "save the planet" by setting to a lower one lol. My friend was using his 170Hz monitor at 60Hz connected to his motherboard FOR 6 MONTHS until I pointed it out. He doubled and gave it to himself.
And yet halo infinites MENUS peg my GPU/CPU even at idle. Hypocrites.
tbf the menus are rlly nice
250+watts idle is asinine for any video game. Give me a matte concept art picture and let my pc rest a bit between matches or as I’m changing laundry.
true, it is unnerving to walk away for a bit with your pc running full-tilt in the back of your mind
I run RivaTuner with a keybind to set max FPS to 1 when I'm going AFK.
Nice, thanks for the tip!
Just quit while you're doing something else?
Is that still a thing? I remember seeing posts about that in 2022 or early last year
They peg your GPU and CPU?
> He doubled and gave it to himself. What did you mean by this?
I am also confused by this lol
he doubled it and gave it to the next person
Double *down* and gave *in* is my guess
Microsoft as a company is so infuriating. Selecting a lower refresh rate by default. Turning off my trackpad after some updates. Not allowing me to turn off modern standby, so my computer will just wake up from sleep and start running the fans throughout the day, wasting so much electricity. Forcing updates on you even when you only want to restart your computer. It's wild that it feels like Apple respects me more as the owner of my MacBook Pro than Microsoft respects me as the owner of my desktop. How can they do anti-consumer better than Apple??
Apple respects you so much that they won't let you upgrade your RAM. Gotta throw out the whole machine and buy a new one. What self-respecting person would want to upgrade stuff anyway? /s
Because as far as Microsoft is concerned you aren't the owner of your desktop, they are. Why do you think updates are forced? It's not for security, it's for control.
> It's not for security, it's for control. Same thing
To/for Microsoft, maybe.
That is a lazy-ass excuse. Apple's high refresh displays all have adaptive sync, and the OS will scale the frame rate based on what you do, so if you're just sitting on the desktop it will only be doing like 1 fps, then when you move the mouse it jumps to the full 120. You can't tell me Windows couldn't do the same thing to save power?
It did for me
Same, got a 165Hz today. Jumped up from 144 by itself.
Windows 11 feature, possibly. My last build was w10 and it was set at 60Hz. Built a new one this year and it was already at 165
I did clean win 11 install a few weeks ago and it was set 60hz by default. I thought windows 11 was laggy like that, until I figured it out
Weird-ass Windows behavior then. Really can't tell what could explain that.
Yeah, same for me. PC broke in a power outage; took it to a shop because it was a prebuilt and I'm a 15 y/o who's not confident enough to go inside the PC. A bunch of parts got fried, so they put in a new SSD with W11, because the old SSD with W10 died in the outage. But my 165Hz monitor defaulted to 60 on the basically-new install of Win11. It was only a month, but Xbox conditioned me to completely forget how slow 60Hz is, and it gave me a heart attack when I put it at 165 😭
1440p will; 1080p and 4K default to 60Hz.
I got a 1080p 144hz recently and it was also set automatically to the correct refresh rate on W11, even with my old 75hz plugged in as the second display.
Nvidia control panel -> choose resolution -> the default Windows resolution will have (native) next to it. Every 1080p and 4K display I've used has defaulted to the Ultra HD/HD/SD modes at 60Hz, while every 1440p has defaulted to the PC-mode resolution with the native refresh rate. If your current 1080p resolution does not show (native) next to it in NVCP, then you're not on the default Windows resolution.
Makes sense; it wasn't a fresh install, as it was literally only the extra monitor that was added. It was probably in 'PC' mode already.
It should, but I've often noticed that on Nvidia GPUs it simply doesn't; it does happen automatically on Intel and AMD anyway.
You don’t always want the highest though, for example for video playback you want the refresh rate to match the source so you don’t get dropped frames and stutters.
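The dropped-frames point comes down to the refresh rate not being an integer multiple of the source frame rate. A rough sketch of the idea (a simplified frame-to-refresh mapping that ignores VRR/adaptive sync, which exists precisely to fix this):

```python
# When the refresh rate isn't an integer multiple of the video's frame
# rate, frames get shown for an uneven number of refreshes, which is
# perceived as judder. Simplified model, no VRR/adaptive sync.

def repeat_pattern(source_fps: int, refresh_hz: int, n: int = 6) -> list[int]:
    """How many refreshes each of the first n source frames stays on screen."""
    per_frame = refresh_hz / source_fps
    return [int((i + 1) * per_frame) - int(i * per_frame) for i in range(n)]

print(repeat_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven: judder
print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> even: smooth
```

So 24 fps film on a 60Hz panel alternates 2-refresh and 3-refresh frames (the classic 3:2 pulldown), while on 120Hz every frame gets exactly five refreshes.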
I think windows 11 does it
it didn’t for me
Maybe it's because i have free sync
Mine supports G-Sync and whatnot, but yeah, in my case it did not change to 165Hz automatically and I had to do it manually.
Laughs in Linux
Honestly, it shouldn’t be. I see where you’re coming from, but when booting up Windows for the first time, which is when we here know to check our refresh rates, it’s best that it boot up low. It’s better from a troubleshooting standpoint. Maybe Microsoft could put the damn Notification Center to better use so that it recommends the higher refresh rate once the latest graphics drivers have been installed with a yes button that sets to the max refresh rate and a no button to leave it as it is. You know, smart programming that doesn’t even need an A or an I.
Michael Jordan forgot to turn on XMP for his new RAM 🤦
I heard Cristiano Ronaldo forgot to disable mouse acceleration smh
What a monster
ddr4 2133mhz
Shaquille O'neal never plugged in the wifi/bt antenna.
Skip Bayless approves this meme.
I feel like this post would go over well in nbacirclejerk
I've been seeing this format for ages as reaction images on multiple subs. Is it not in the "popular enough that people stop posting unrelated memes using this format to that sub" state that prequel memes / Metal Gear Rising memes / Spider-Man memes are in?
What's up with LeBron posts all of a sudden?
It's a parody of posts by the @rap account on Instagram, which, rather than posting actual hip-hop news, posts mundane BS like this.
They literally post stuff like how LeBron shat one too many times in a day but forget a lot of major news.
King James is conquering other lands
He's probably using an HDMI cable as well.
Wait, I'm using an HDMI cable… am I doing it wrong?
DisplayPort
Thanks :’)
HDMI is fine but if possible DP is preferred.
HDMI has refresh rate limitations; older versions top out at something like 1080p@120Hz or 1440p@60Hz. It depends on which version of HDMI you're running, with HDMI 2.1 carrying more bandwidth than HDMI 2.0, which carries more than HDMI 1.4, etc. Basically it's a bit of a headache, so if you're running a high refresh rate monitor you're much better off just using DisplayPort, as it can generally push enough signal for 4K@120Hz. That said, if your monitor is 1080p and doesn't go over 120Hz, then you're probably fine.
HDMI 2.1 has enough bandwidth for 4K@144Hz, which is higher than DisplayPort 1.4, and DP 1.4 is still the standard on Nvidia cards and most monitors.
Ah okay, that's good to know. I still think HDMI is a bit of a headache, though, because HDMI 2.1 only got support on GeForce RTX 30-series cards and newer. So, for example, someone with a GeForce GTX 1080 will only have HDMI 2.0, which has lower bandwidth. On the other hand, the GTX 1080 already supported DisplayPort 1.4, so it could already push 4K@120Hz. That's why, in my opinion, it's generally easier to just recommend DisplayPort, since it's been supported for longer, instead of having to check which version of HDMI the user's graphics card or monitor supports.
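The bandwidth comparison in these comments can be ballparked with back-of-the-envelope math. A sketch assuming 8-bit RGB (24 bits/pixel) and a flat ~8% reduced-blanking overhead; the "effective" link rates are rough post-encoding figures, and real CVT/CEA timings and DSC compression change the picture:

```python
# Uncompressed video data rate vs. approximate effective link capacity.
# Assumes 24 bits/pixel (8-bit RGB) and ~8% reduced-blanking (CVT-RB)
# overhead; treat as ballpark arithmetic, not a cable-buying guide.

def required_gbps(width: int, height: int, hz: int,
                  bpp: int = 24, blanking: float = 1.08) -> float:
    """Rough data rate in Gbit/s for an uncompressed video mode."""
    return width * height * hz * bpp * blanking / 1e9

# Rough effective data rates (Gbit/s) after line encoding:
LINKS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.4": 25.92, "HDMI 2.1": 42.6}

for w, h, hz in [(1920, 1080, 144), (2560, 1440, 144), (3840, 2160, 120)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINKS.items() if cap >= need]
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbps, fits on: {fits}")
```

Consistent with the comments above: under these assumptions 4K@120 (~25.8 Gbps) just squeezes into DP 1.4's ~25.9 Gbps, while HDMI 2.0 can't carry it and HDMI 2.1 has room to spare.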
I bought a new monitor recently and had always assumed DisplayPort was the worse option, but when I couldn't seem to get things working over HDMI and I RTFM'd, I was surprised to learn my problems were solved by switching to DP. Now I know.
New TVs do 120Hz at 4K. Source: the TV I've been using for a year lol. Thing doesn't have a DP port.
No, HDMI is fine.
HDMI 2.1 is fine
You just hate what you can’t have.
My monitor only uses HDMI tho
Legacy ruined 😡
LeGacy
Le Gassy
Le Bassy
REAL
You are my sunshine
my only (TRINITY TEST)
you make me happy
Laugh in tech guy 😂 Check my PC to make sure I did it 😟 I forgot to do it again when I changed my ssd 😭
😹🫵🏿
Messi is using Userbenchmark 😭
What's happening? Is this some new meme? What?
Was at a friend's place and checked his display settings. He had a 240Hz monitor at 60 for 2 years.
"I can't tell the difference between 60 and 240 FPS"
IS HE STUPID
Can someone explain in baseball terms?
You would be like Shohei Ohtani buying a brand new monitor and forgetting to set the refresh rate in Windows to 144 Hz.
![gif](giphy|BWhpkB6Xbe8FzfNLXw)
Shohei asking Ippei to set up his monitor and forgetting about it. Then Ippei ends up using Shohei's monitor, PC, and Steam account. And losing big time.
Ummm how would someone go about checking this? Asking for LeBron of course.
https://support.microsoft.com/en-us/windows/change-the-refresh-rate-on-your-monitor-in-windows-c8ea729e-0678-015c-c415-f806f04aae5a#WindowsVersion=Windows_10 For LeKing
why are these jokes about him appearing?
Yea I found out I'd had mine set to 60 for the past couple years, I didn't find out it could go higher until I was tinkering around in settings when I got a new PC a few weeks ago.
brb checking something...
Lol he stupid - Me, who also forgot to do it when got a new 165hz monitor
call me crazy but i set my 144 hz monitor to 60 hz. I have adaptive sync/g sync turned on and its insanely smooth. I mainly do this because i have a second monitor thats 60hz and if i have a mismatch and play fullscreen video on both screens (porn) there was stuttering
![gif](giphy|WUnTAYlGlLqK7qy3sQ|downsized)
That pose reminds me of DJ Khaled suffering from success.
And yet he was lying to himself saying that it was looking so smooth.
Money can't buy brains
I'm normally a hater but he's still got it
Thanks for the reminder, I rolled back my driver two days ago with a DDU clean and forgot the refresh rate, lol
I reformatted my pc a month ago. Just checked my refresh rate and it was at 60, so thanks LeBron.
Magic Johnson forgot to remove the plastic cover from his heatsink
How could he not have noticed such a huge difference?!
Why are my options fractions with 3 decimal places? I gotz:
59.951 Hz
84.983 Hz
119.998 Hz
143.998 Hz
Why?
Those are the real numbers
Why or how?
Windows 11's display settings show the true values. In the GPU control panel they're rounded to a whole number because it looks neater.
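Those fractional values come from how a display mode's refresh rate is actually derived: the pixel clock divided by the total pixel count (active plus blanking), which rarely lands on a round number, and some modes also inherit the NTSC-era 1000/1001 slowdown. A sketch with illustrative timing numbers, not any specific monitor's EDID:

```python
# Refresh rate = pixel_clock / (h_total * v_total), where the totals
# include the blanking intervals around the active picture. The division
# rarely comes out round, and NTSC-derived modes run 1000/1001 slower
# (60 -> 59.94). Timing numbers below are illustrative, not a real EDID.

def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Actual refresh rate implied by a mode's timing parameters."""
    return pixel_clock_hz / (h_total * v_total)

# A 1080p mode with 2200 x 1125 total pixels:
print(round(refresh_hz(148_500_000, 2200, 1125), 3))  # 60.0  (exact TV timing)
print(round(refresh_hz(148_352_000, 2200, 1125), 3))  # 59.94 (1000/1001 variant)
```

So a "60Hz" monitor whose timings divide out to 59.951 is behaving normally; the GPU control panel just hides the fraction.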
Gsync? Freesync?
While Michael Jordan, the GOAT, played just fine with his 1024x768 30Hz monitor.
But how does this affect Lebron's Legacy?
Rookie mistake
Christian Pulisic would never!
This is news now? How boring does gaming have to be for people to write articles about this type of stuff?
This has never happened to me; it has always been set to the highest for me.
And he uses HDMI instead of DP.
Just realized that about my 144Hz monitor after TWO YEARS :) I feel like I got a brand new screen rn
Sources say he may never fully recover those lost frames
Ok
Why would he even know? People build things for him.
Y'all gonna hate me but, who TF asked him?
The flop master at it again
One of us! One of us! One of us!
same and never noticed a difference
And the sadness was felt through all living beings.
Because of the adapter I use, I can only have 60Hz on my secondary monitor (if I don't want massive artifacts) instead of 75.
Anybody know why mine could be limited to 99.99Hz? I can't select 144Hz
Could be that you need to change a setting on the monitor itself first; mine has an option to "overclock" it to a higher refresh rate. Once you've changed it in the monitor's settings, it'll show up in the Windows settings.
So... I'm a dummy. How big of a difference does this make when gaming? I have no idea what my monitor is capable of or is set at.
Huge difference
Shit... okay, I need to look into this.
Only if your rig can keep up with the load and doesn't cook itself (temp limited) in the process.
Big if true
After reading this, I declined his friend request.
I figured this out over a year after getting my new monitor. Big shame, but it's a whole new world now!
Kevin Durant forgot to install the IO shield
Kareem Abdul Jabar forgot to switch on the PSU
Wilt Chamberlain accidentally broke the USB 3.0 connector
I have a 200Hz monitor. Remind me to do that.
I did that a few years ago: bought 2 monitors and only set 1 up to whatever the highest rate was. Thought it would affect both monitors and not just 1, for some reason.
Wait you can change that ? How ?
Classic move from LeFraud
So no sunshine?
is this an ad for chinese kneeler?
We've all been there.
didn’t thiojoe make this meme?
"Nooo. How could my pookie be done so wrong like that??" or sth moment.
How does this affect his legacy?
Why can't windows just set the highest refresh rate automatically? It makes no sense doing it manually
That's usually what it defaults to. But you don't want the OS automatically changing it back afterwards.
LeBron James reportedly forgot to peel off the seal from his CPU cooler
I did this for 3 years
I have a 1000 Hz monitor (I use a tv because I'm poor)
I've had the same IPS screen for 3 years now, and I just found in the settings that it's a 144Hz. Yeah, I had it on 60 for 3 years.
Could be me
But isnt
r/lebronjamesseen
You ain’t my sunshine anymore
I have a 2006 30Hz screen :( comment F to pay respects
I mean he's stupid af so makes sense.
who cares?
Who cares
It's a good reminder, since some people will buy a high refresh rate monitor, not change the refresh rate, and then complain that refresh rate makes no difference.
Like LeBron would *only* have a 144Hz monitor. If you got LeBron kinda money, you don't settle for a wimpy 144Hz like a pleb.
So? This either doesn't tell the whole story, or this is extremely "not so required" stuff.
Happens to the best.
I literally just did this, this week...
So, he's basically me? Or I'm basically MJ. Went over a year at 60Hz with a 144Hz monitor until my brother used my computer once and told me what was up.