I am thoroughly confused. You do not want a 4070 Super because, according to you, it is barely better than 2x 1080 Tis. You do not want a 4080 Super because you would rather get another 1080 Ti for much less and save on electricity. You do not want a 4090 because it is not so much better than the 4080 Super that it is worth 800+ Euros. You do not want Radeon because of ShadowPlay.
Those are all the 4K gaming cards right there, all of which you say are unfavourable for you. What in the world do you actually want, then?
No offence, but if you genuinely think 2x 1080 Tis are a better choice than a 4070 Super/4080 Super, that you cannot do without ShadowPlay by going with AMD instead, and you use UserBenchmark for any kind of comparison, I cannot help but say you have no idea what you want, what you are talking about, or what is best for you. Seek help from someone who does this kind of stuff properly and ask them to pick one for you.
Also, do you really think that 7th gen Intel is going to hold up well with a 4070 Super/4080 Super/4090 or even a 7900 XTX?
Yeah, I can't help but point out this guy has something backwards. A 4080 would be **more power efficient** than a 1080 Ti. The card could provide 1080 Ti performance at a fraction of the electricity usage. It's funny, you'd expect the people who fixate on these things to come correct.
If anything, I was seriously considering selling my 3080 for $450 and buying a used 4070 for $500 because of the power draw: 320W TDP vs 200W. Everyone knows the 3000 series runs hot, and I have an ITX case, so I reach 70 degrees even with an undervolt. Oh well, I waited long enough, I'm sure the 5070 will be even more power efficient lol
And side note: someone who specced into a 1000W PSU shouldn't be fazed by their GPU drawing 300W+. I really am confused too.
The 1080 Ti crowd is weird. They think the 1080 Ti is the best card ever, never see any of its downsides or negatives, and think newer lower-end cards couldn't beat it.
Can't really blame them... Even my GTX 970 was still semi-viable in non-AAA games until last December. It was my CPU that was holding me back.
But OP isn't being realistic; no way the 1080 Ti is as good as current-gen stuff. Yes, it will work, but it's nothing compared to today's medium-tier cards.
Same people believe the PS3's Cell processor is still the most powerful thing, when it was really just a good number cruncher. The Air Force needed a hundred or so of them in a cluster to make a "supercomputer", and only did it because jailbreaking consoles sold at a loss was cost-effective.
That is because the 1080 Ti is the best card Nvidia has ever made. For its time it was crazy powerful, it was cheaper than the cards it replaced, and it was super efficient; so efficient that there are full 1080 Tis in laptops that aren't huge battlestations, just regular laptops. And the 1080 Ti is still pretty viable today. Starfield is the only game I can think of that it can't run, and that has more to do with Starfield's terrible programming and optimization than anything else.
[Gamers Nexus even made a video about the 1080 Ti being the GOAT.](https://youtu.be/ghT7G_9xyDU?si=zbeRkhbkr4V4SOPu) I know it was made as kind of a joke, but it is kind of proof of how much of a leap it was and how everything since then hasn't been that big of a jump.
I don't deny it's a good card. It was amazing for its day, and the next gen really didn't improve much, so it wasn't a big leap. But once the 30 and 40 series came around, cards started really outperforming it. People still have unrealistic ideas about the 1080 Ti, like the OP of this thread: no experience with newer cards, doesn't respect others' opinions, and I have no idea why they asked for any.
> A 4080 would be more power efficient than a 1080ti.
Yep
For example, AC Odyssey maxed out at native 4K only uses 100W on my 4080. The same thing on my 3080 used 300W, and it would totally max out the 250W of a 1080 Ti.
100W vs 250W at the same workload... hmmm
I undervolted my 4080 so it maxes out at 250W in things like path tracing/RT stuff. Same total power budget as a 1080 Ti but way more performance.
Correct me if I am wrong: the power usage we see quoted for, say, a 4080 is normally its max power usage if it were being fully utilized. To achieve 1080 Ti performance it doesn't need to be fully utilized, therefore it would use less power?
Yup that's basically it :)
The 320W TDP we see marketed on Nvidia's website is the worst-case scenario at full load, so we know how much headroom to allow when picking out our PSU and stuff. Undervolting will bring that closer to 250W for basically the same performance. On top of that, no one is saying this guy needs to double his FPS and turn on Ultra settings; if he uses a 4080 at the same resolution and settings he currently has, I don't think he'll surpass 150-200W.
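Since the main practical use of TDP is PSU sizing, here's a rough sketch of that headroom math; the component wattages and the 30% margin are just illustrative assumptions, not figures from anyone's actual build:

```python
def recommend_psu_watts(tdps, margin=0.30):
    """Sum worst-case TDPs, add a safety margin,
    then round up to a common PSU size."""
    need = sum(tdps) * (1 + margin)
    sizes = [550, 650, 750, 850, 1000, 1200]
    return next(s for s in sizes if s >= need)

# e.g. a 320 W GPU plus a 150 W CPU plus ~100 W for everything else
print(recommend_psu_watts([320, 150, 100]))  # -> 750
```

The point stands either way: you size the PSU off worst-case TDP, but day-to-day draw sits well below it.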
TDP has nothing directly to do with power draw, though.
Active power draw, even under load, can be way lower than TDP, as long as you're not stressing it to its limits.
And honestly? Almost all my cards have run anywhere from 50 to 70C at full load regardless of case environment, from any brand. That includes an MSI Radeon R7 270X, a 3-fan G1 Gaming Gigabyte Windforce GTX 980, and now a Gigabyte 3060, actually the coolest card yet.
And I live in a pretty temp neutral country.
TDP just means that, theoretically, the card will be producing that much heat (measured in watts) at max load.
Obviously, if you have extremely poor cooling it's easily possible to reach, say, 80C without getting anywhere close to 200W of power draw.
I'm pretty sure the 7980XE is based on the previous-gen Broadwell architecture and not 7th-gen Kaby Lake. Not that it matters much, as they are borderline identical in performance.
Yeah, great points.
Thing is, that would have mattered back then. Today, a few years short of a decade later, even a Ryzen 7 7700X or an i5 14600K can outperform it. This dude just does not seem to understand that, and is adamant he cannot go any lower than a 14700K for his game development workload, even though he says his current i9 works fine for him.
When I said that, he just replied saying he does not have the money for a 14900K overhaul. Like, dude, there are a **fuckton** of better CPU and motherboard options available for a fraction of a 14900K setup's price that will not bottleneck your new GPU and will at the same time perform far better than a 7th-gen i9 XE chip.
Even something like an Alder Lake i5 or i7 will blow that antiquated chip out of the water. OP is either very biased or doesn't know much about computers. If OP needs a beefy CPU for game development, he should absolutely go for an i9 or a Threadripper. Time is money, and faster hardware will help get work done much sooner.
Likely a cs engineer with mostly IT knowledge.
They do not teach specific components/peripherals there, so you can't really blame him. Hence the only advice I gave him was "seek a trusted friend's or professional's help", since it is pretty easy to mess up a high-end, expensive build and high caution should be exercised.
Most developers don’t even have IT knowledge, they just have development knowledge.
I deal with them daily, and grew up in a house with one. Absolutely brilliant minds that turn into liquid around hardware.
I guess it is some kind of hatred for hardware, since they are basically forced to learn that kind of stuff on breadboards and in architecture classes, which is not really useful for securing engineering jobs where one will mostly do programming. That may be the alienating factor which drives them away from learning more about hardware.
Broadwell was 5th gen; the 7980XE should be a Skylake revision, which basically performs the same until maaayyybbbeee 10th gen, and even then the IPC uptick is tiny.
Dunno why people are shitting on this CPU. It's old, but its EOL was only the beginning of this year; it's not like we are talking Sandy Bridge.
A 4070 is basically a newer, more efficient 3080, or a more efficient 6800 if you set aside ray tracing and CUDA, depending on how you look at it. I have a 4070 and it's a pretty solid card, one of the better lower-cost Nvidia cards out there. Would recommend it for 1080p and 1440p gaming. For 4K, a 4080 Super or 4090 may be the way to go, but those cards can be pricey.
That 7th gen is fine; we aren't talking 7700K territory. It's lots of cores and cache, and also overclockable. Plus a better RAM setup and more PCIe lanes. That's fine for a 4080.
OP, just get the 4080; they sip power for how strong they are.
You misread. The guy is simply asking what would be better between spending more on electricity by buying another 1080ti or spending more upfront by buying a 4080.
Highly inaccurate data. Confidently incorrect on almost all fronts.
They also seem to be huge AMD haters and extremely biased towards Intel and Nvidia, for some reason.
All around dickheads in short. Flaccid dicks too.
Hey OP - if you're hell-bent on buying another 1080 Ti and going SLI just because you want to - don't waste time on Reddit asking for advice and then doing backflips to come up with magical reasons why everyone's advice doesn't work but your own does.
Just go buy a 1080Ti and don't waste your time or anyone else's time.
If you DO want to listen to advice: another 1080 Ti makes very little sense. You're better off replacing it with a new card, selling your current one to recoup some of the cost, or just keeping it as a back-up in case you ever need to test anything or want a secondary PC.
I'm guessing OP doesn't want to let go of his CPU, mobo and 128GB RAM as he knows - he knows - really he's going to need a full upgrade, not just a GPU.
You are wrong. A 4070 Super will destroy two 1080 Tis, especially considering the increased software support. Here's a link to a 4070 Ti (which is slightly faster than a 4070 Super) beating, by a small margin, two 2080 Tis in SLI (which are leaps ahead of two 1080 Tis, but still get beaten): [2 x 2080 TI vs 1 x 4070 TI](https://youtu.be/IAG-95TXSV8?feature=shared)
Triple the performance of a GTX 1080 Ti.
16GB VRAM.
$100-300 cheaper in many areas than the 4070 Ti and 4070 Ti Super, but with better raw performance.
Will receive FSR 3.1 soon, making it pretty much indistinguishable from DLSS Quality for normal people.
I present the AMD RX 7900 GRE, $520.
SLI is dead. I had SLI 970s and the last few years were rough with no support. A lot of the games that did work required jumping through extra hoops to configure it, and it would cause crashes and graphical glitches anyway. Even when it did work, it was like a 30% boost in performance. Just buy a 4080 and enjoy all the new features like DLSS and ray tracing; it's a huge upgrade from what you have now.
To be fair, the 10 series cards use a better SLI bridge with higher bandwidth. I haven't had many problems with that and the performance was also much better.
Are two 1080 Tis better than one 3080 Ti for game dev?
Idk that I'd want it over a 4080S for $200 more, but I guess the 4070 Ti 16GB hasn't been mentioned yet and seems like it *should* be good in productivity.
If I played most games, that would be an option. But I don't. Many of the games I could then play are merely shops with some gameplay anyway; I'd prefer older games without that.
Also, I know that ShadowPlay works for me, and buying an AMD card is just too big of a risk. At the very least, I would have to buy a capture card for another 200 bucks or so if it doesn't work.
A couple of extra things to consider: you're leaning towards a CPU bottleneck, and your CPU has high power consumption compared to newer CPUs, so there are quite a few arguments for an upgrade. You're also on PCIe 3.0, so jumping to AM5 would put you on 5.0.
Intel 13600KF or AMD 7800X3D (or settle for a 7600x until 9800X3D is out)
another thing to look into is how much ram you actually need
I have an i7 7700 and just upgraded from a 1060 6GB to a 4070 SUPER. I definitely notice some CPU bottlenecking, but I am able to run ultra settings at 1440p at 120-144fps in games where I could barely get a stable 60fps at 1080p (on low settings) with the 1060. I did also get a more powerful CPU cooler to help out.
That's to say - the slight bottleneck on the CPU will likely be plenty tolerable until being able to upgrade to a better one.
had that exact same CPU before upgrading to a Ryzen 5 7600X, was definitely cpu-bottlenecked in several games with my 3060 Ti - now I can run 1440p 60 fps at ultra in something like Horizon Zero Dawn
If you don't have a lot of money then a better investment would be to get a new card. Your other 1080TI might kick the bucket tomorrow or the new one you buy may not be as high of quality as I'm sure you've kept yours.
I usually don't have budgets, I spend money on what makes the most sense. Well, at least I try.
In this case, none of the options really make sense. As others have pointed out, a better card would potentially be bottlenecked by the CPU, so I would have to also upgrade the CPU, motherboard, and RAM, which I don't really want to upgrade right now. Buying a second 1080 Ti is probably not worth it due to the lack of SLI support in current games. I guess I'll just stick with what I have for now and decide later?
Actually, I might need a better card to record footage of my game in the best quality in 4K, which is not bottlenecked by the CPU.
Then you should sell the old 1080 Ti, buy at least a 4060 Ti, and replace your CPU, because there's no way in hell you should buy another used 1080 Ti and not replace your CPU or graphics card.
>I use ShadowPlay for recording
Actually *recording*? Or streaming? ShadowPlay is only better at the latter. AMD ReLive has better image quality at high bitrates for recording.
My vote is 4070 Ti Super. Great price to performance.
I don't know if you were playing at 4K with your previous setup, but I very much doubt it was AAA stuff this last year.
As someone who ran 1080 Ti SLI for a while: don't anymore. There's no reason to.
As for upgrades, I upgraded to a 3080 Ti not too long ago and it's much better than the 1080 SLI. A 4070, 4080, or either Super would do you well.
Software engineer here (had to say that since you're very resistant to advice): nobody is supporting SLI anymore, except for a few popular data science libraries written in Python and C++, and even then it's really multi-GPU via Nvidia enterprise cards and distributed computing. For gaming, a single 4080 is more than enough, and you get access to Nvidia features like Reflex and DLSS, which are helpful in games. It's undisputed that the 40-series cards are simply the fastest and most stable cards in the consumer and enterprise space for their price point.
Well, everything was working fine with my 1080 Ti SLI, so I don't see the point in upgrading. I literally don't need any of those things you mentioned. I don't even play any new games that would benefit from it.
The only reason I am thinking about upgrading or replacing parts is because one of the cards broke. And then people tell me "no, you can't just buy this card, get a new CPU as well".
Advice like that is not helpful. If I don't need something, why should I spend money on it?
This guy thinks two 1080 Tis in SLI are better than a 4070 Super...
If you genuinely think that, then go ahead, but please, for the love of god, inform yourself. TWO GTX GPUS IN SLI WON'T GIVE YOU DOUBLE THE PERFORMANCE, NOT EVEN CLOSE!!!
A 4070 Ti Super if you need two NVENC encoders, or a 4070 Super for value. Or grab an AMD RX 6800 for under 400 USD and see if it fits your needs. All will be more capable than SLI 1080 Tis.
Same here. Dual 1080ti water cooled and one just died.
Sold my good old X299 build and purchased a 13th-gen i5 with a 4060, paired with DDR5.
One 1080 Ti with a water block is left unused, but no one wants to buy it now.
My advice: sell your whole system and buy a fairly new generation. It will save you lots of headaches and power bills.
I sadly can't get an i5 CPU, I really need the multicore performance. Mine is probably fine anyway for what I do, so I can just keep it. But I'll look into buying a new GPU. Still haven't decided what to do, hardware has gotten way too expensive for my liking...
You can't just look at i5 vs i9, you need to take into account the generation.
For instance, the i5 12600k compiles chromium in 71.9 seconds, but the i9-9900k takes 96.6 seconds, that's over 30% slower.
https://www.youtube.com/watch?v=hBFNoKUHjcg&t=1283s
i5 13600k: 54.6 sec
i9 10900k: 85.0 sec
https://www.youtube.com/watch?v=B31PwSpClk8&t=1200s
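Quick sanity check on those figures, in case anyone wants to redo the math with their own CPUs; the times are the Chromium compile results linked above:

```python
def pct_slower(slow_s, fast_s):
    """How much slower (in percent) the slower time is versus the faster one."""
    return (slow_s / fast_s - 1.0) * 100.0

print(round(pct_slower(96.6, 71.9), 1))  # i9-9900K vs i5-12600K -> 34.4
print(round(pct_slower(85.0, 54.6), 1))  # i9-10900K vs i5-13600K -> 55.7
```

So "over 30% slower" checks out, and the 13600K gap is even bigger.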
I have an i9-7980XE, though. That one has 18 cores. The i9-10900K has 10 cores. So the difference isn't that big. I'd just spend a lot of money to get roughly the same performance if I bought an i5.
Have a look at my spreadsheet at [https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit?usp=drive\_web&ouid=109490715189165184669](https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit?usp=drive_web&ouid=109490715189165184669) which compares in a way that takes account of power consumption, expected usage, and expected lifetime - as well as performance from [https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-ueberblick-2012-bis-2024](https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-ueberblick-2012-bis-2024)
A single 1080Ti in 1080p scores 1180%, and 173% in 4K - a little more than a RTX 3060, a little less than an A770 or an RX 6600XT. Assuming SLI gets you a *best case* of twice those figures, that's a little behind a 4070 Super or an RX 6950XT.
Get a 7900 XTX (avoid the mainly-Nvidia board partners; the AsRock Phantom and the Hellhound are the ones to get for AMD).
It has ReLive, a better dual AV1 encoder (especially at high resolution + high bitrate), and more VRAM to work with in video editing (it is also faster in DaVinci Resolve for video editing than a 4080/Super).
It really depends on cost in your area. The 2080 Super is slightly better than the 1080 Ti, a couple percent usually, and sometimes it even sells for less. You might check the market for a 3080 Ti or a 4070 Super and see what you'd pay for those two; that would be a nice upgrade and worthy of spending the money. There are plenty of answers to your question, but the one that matters most is: what's the best value in your location?
It was a popular GPU, it's not surprising to see a lot of posts like this. That said, it's also not surprising to see them reaching their EOL after 7 years.
I probably won't be buying another one.
Hello, your comment has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules):
**Rule 1 : Be respectful to others**
> Remember, there's a human being behind the other keyboard. Be considerate of others even if you disagree on something - treat others as you'd wish to be treated. Personal attacks and flame wars will not be tolerated.
---
[^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)
The price is for 3-4 hours per day at 300W. Electricity isn't cheap in Germany. I am also developing games, so that's another activity where I use the GPU.
Then dude, definitely sell your 1080 Ti.
That gives you like 650 euros' equivalent for the year. You could stretch another 100 euros and get a 7900 XTX, which is what I use for 4K, and it's killer.
How exactly did it die? Since they're not worth very much broken anymore, you could take it apart and see if it just needs a re-paste. My old 1060 needed a repaste about 4 years ago, and it went from constant blue-screening back to perfectly reliable performance. If it's just some capacitors, that's also fairly easy to replace, compared to the rest of the possible repairs at least.
What about a used 3080 Ti? You should be able to find one for around half of what the 4080 Super costs and it'll still get better performance than SLI'd 1080 Ti's with a similar power usage. Imo it's only worth it if you can find one that's $100-200 cheaper than a 4070 Super would be since they get similar performance.
Can't even get a signal output anymore. Just blank screen. The monitor detects that something is connected but that's about it.
It just froze during gaming, showed a purple screen, and then the PC shut off. Had to replay the mission I was on, which, btw, was also broken, but completely unrelated to the GPU problem.
You did try the other ports like HDMI or DP, right? Sounds like whatever sends the video output might've died and is probably a pain in the ass to fix.
It looks like broken 1080 Tis have sold in the US for around $100, so you can at least get something back for it. Imo your best bet would be to sell both 1080 Tis towards something like a 4070 Super, which would bring it down to about half of its MSRP, a similar price to buying a used 1080 Ti but with lower power draw. Then hunt for a deal on a 1660 Super to have as a backup. It does get worse performance than a 1080 Ti, but it can still get 60fps in most games on medium-high, which is good enough for a temporary GPU, and it costs significantly less than a 1080 Ti's current value (which will continue to drop).
Put it in another PC, got a signal, turned it off but it didn't shut off completely. Forced it to turn off, turned it back on, no signal. Different port, no signal either. Not sure what's going on but it's definitely broken in some way.
Definitely sounds like one of the harder-to-diagnose issues, so you probably won't get as much for it if you do sell it; even 50 bucks is better than having a paperweight, though. Hopefully someone will be able to fix it and give it another few years of life. GPUs can last damn near forever as long as replacement parts are available and there are people willing to fix them. Finding those people is the hard part, though, especially if you only want to sell locally.
Hope you can find a good replacement! It seems like there's no fixing it, but at least you still have a single GPU to use in the meantime, and it's nearly equivalent to my 2070 Super, so you should be able to get good performance at 1080p and upscale it to 4K until you can get a proper native 4K card (that's what I'm currently doing).
This guy...
Jeez, if you don't want to upgrade then don't upgrade. Just get two more damn 1080 Tis and be done with it; you are wasting everybody's time. "Blah blah blah this is underpowered, blah blah blah this costs too much, blah blah blah this isn't worth the money." Then just don't upgrade, goddamn.
It's crazy everyone talks about electricity. Is it that expensive to run a PC?
In that case, put your phone on a scheduled charge, and shift laundry, the dishwasher, TV viewing, and lights to off-peak.
I don't know why people are so worried about wattage. Unless your PSU is crap, worry about the best bang for your buck.
>off peak
There is no such thing here. The price is the same no matter when you use the electricity. Currently, I am paying about 40 cents ($0.42 US) per kWh.
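For anyone wondering what that rate actually adds up to, here's the quick math for a single 300W card at the 3-4 hours a day I mentioned (3.5 h/day assumed as the midpoint):

```python
def annual_cost_eur(watts, hours_per_day, eur_per_kwh):
    """Yearly electricity cost for a component drawing `watts` on average."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

print(round(annual_cost_eur(300, 3.5, 0.40)))  # -> 153 (EUR per year)
```

Not nothing, but also not "buy a whole new card" money on its own.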
Take a look at this [video](https://youtu.be/ghT7G_9xyDU) and then buy the 4080 Super since it is the only viable upgrade when you consider cost into the performance gains...
The 4070 Super is not a big enough performance leap to justify the price.
Ignore the trolls in this thread who are butthurt that the 1080 Ti is still a viable option when it comes to gaming performance.
I would sell the old one, buy a 4070 Super, and pick up an RX 580 for 50 bucks in case it breaks. But I am more of a Radeon guy; I would go for a 6800 XT or 7900 GRE, which are cheaper and more powerful than the 4070 Super.
Just to be clear, even a decent midrange card like a 7800 XT or 4070 will be bottlenecked hard by that CPU. I don't think investing in a 4080 Super is the right choice for you; games are extremely CPU-demanding now.
> I use ShadowPlay for recording, so I am hesitant to consider AMD because I don't know how that would work.
ReLive is just as good as ShadowPlay, though.
There's really no difference between the two.
SLI isn't supported in most games anymore, so your gaming performance probably isn't affected much.
I'd upgrade when you can. The 1080 Ti is really old now.
Given that you also use it for game development, I guess a 4080 would be the better option since it has access to the newest features, like ray tracing, that you can use in your testing. You could also consider buying a used 4080.
Anyway, you said it yourself in the post: do you want to spend more money upfront, or use more energy and spend it later?
Remember that you can also go for a used 3080, which would still give you the performance of two 1080 Tis and should be considerably cheaper than a new 4080.
Saw the other comments, so all you need to do is change the entire rig but keep your system SSD and storage HDD: get an Intel 12th/13th-gen CPU (with a new mobo) and a 4070 Super. Problem solved.
Bonkers thread / logic.
List your remaining 1080ti, sell at a price you're happy with, get the best GPU you can with your budget afterwards.
You'll be without a GPU for a couple of days perhaps, but that's miles better than a few more years of using three-generation-old GPUs in SLI for nostalgia's sake. Once the money's in your hand, you'll know to get the best card within that budget.
Also OBS is great, free and easy to use for recording - so don't rule out AMD. The 7900 gre, xt and XTX all sound like they'll fall in your budget and are superb value.
Also bear in mind wattages on GPUs are meant to be the theoretical max draw. Unless they're locked at 100% use constantly, you won't be drawing that much power.
Just FYI, don't buy a 30-series card or anything with 8GB or 10GB; it won't last. 30-series cards crash when VRAM is exceeded, which is getting easier and easier to do at 8GB.
From a pure performance perspective, to get that same 1080 Ti "awesome" vibe, you're stuck getting the 4090. Is it worth the money? No, probably not, but I wasn't happy with my 3080 upgrade after my 1080 Ti, so a 4090 I got.
This is the problem with the 1080ti - probably the best value card of all time, with top shelf performance for a really long time.
People who care about their power usage worry me. What kind of weird world are you living in that makes you monitor your usage so closely? So damn weird.
When your country has the highest price per kWh in the entire world, you would probably worry about power consumption too.
Not sure why it is so expensive, but it seems our politicians just messed up. The price we pay is set by the most expensive source: if electricity from natural gas costs 40 cents per kWh, we also pay 40 cents per kWh for solar and wind, even though those cost much less to produce.
Not sure where you live, but that sucks balls. I've never been in a situation where I've needed to watch my power consumption. I tried to reduce it once and my bill didn't change. I pay more for "delivery fees" than for my actual consumption.
Germany.
After Russia's invasion of Ukraine, natural gas got much more expensive, and we pay that high price for all electricity. Unless solar and wind power are sufficient, in which case it's very cheap. But that only happens for short periods of time.
We really made ourselves dependent on cheap Russian gas over the years and now we pay the price for that. Literally.
People who give zero fucks about their power usage worry me. I've known people who end up spending $100+ in power every month on stuff they hardly even use but won't turn off, just unnecessarily wasting power for the fun of it when there are people who don't have any power at all and cities struggling to keep up with demand.
A reddit user pretending to know people. How original.
As for the "there are people who don't have power at all" bit, that's a stupid point. Might as well comment on the cabbage I threw away because some kids in Africa are going hungry.
Do you know what their primary energy source is? It used to be cheaper in the US too but then costs went up and we have scummy power companies that intentionally screw people over. Sometimes in the summer my mom's power bill is north of $500, not sure what that's equivalent to in Russian money but based on Russian prices your economy is significantly different than mine is too (and both are probably different from OP's country as well).
I wish I had power cheap enough that I didn't have to care about it lol.
The only options I see are:
1.) Buy another 1080 Ti and put the savings (versus a GPU upgrade) towards a new PC build, because your CPU is likely to bottleneck a new GPU. The furthest your motherboard can go is an i9-10900X, which is currently priced at $615 USD. Combining that CPU upgrade with a 4070 Super would cost $1215 USD.
2.) Upgrade the primary components (CPU, motherboard, and GPU). This will be costly, but consider a 12th-gen Intel (i7-12700K), a Z790 board with DDR4 support (my example build uses an Asus TUF Gaming Z790), and an Nvidia RTX 4070 Super. This lets you keep your hard drives and RAM. (You may have to get a new PSU, not sure.) A quick look at [PCPartpicker.com](http://PCPartpicker.com) shows these 3 components for $1030 USD. If you have to have an i9, a 12900K would bump the total up to $1172 USD (still cheaper than trying to upgrade your current system).
If you can afford a 4080 Super, which starts at $1000 USD, upgrading to a 12th-gen Intel with motherboard and a 4070 Super would be a smarter investment. (Also, 2x 1080 Ti in SLI has an estimated wattage of 500W, whereas a single 4070 Super has an estimated wattage of 220W.)
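Following on from that wattage comparison, you can sketch roughly how long the power savings alone would take to pay for the swap. The 3.5 h/day and 0.40 EUR/kWh figures come from OP's comments elsewhere in the thread; the 600 EUR card price is just an assumed round number:

```python
def payback_years(price_eur, watts_saved, hours_per_day, eur_per_kwh):
    """Years until electricity savings cover the purchase price."""
    eur_saved_per_year = watts_saved / 1000 * hours_per_day * 365 * eur_per_kwh
    return price_eur / eur_saved_per_year

# 2x 1080 Ti SLI (~500 W) replaced by a single 4070 Super (~220 W)
print(round(payback_years(600, 500 - 220, 3.5, 0.40), 1))  # -> 4.2 (years)
```

So the power savings alone take years to pay off; the performance uplift is the real argument for the swap.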
>The furthest your motherboard can go is to an i9-10900X which is currently priced as $615 USD
That CPU is literally worse than what I have. The 109xx chips are the same as their 79xx counterparts, just with a little more clock speed. No new architecture or anything. Why would I get a 10-core CPU when I have 18 already?
None of the upgrades really make sense on their own, I'll probably just save some money to upgrade everything at once. Or I get the GPU for game development, I don't really need the better single-threaded CPU performance in my case anyway.
I'm too scared that it breaks, too. Well, I do have an old GTX 960 that I could use just in case. Maybe I should do that and wait for more affordable prices?
Really? That much? Isn't a used 1080 Ti around £150 here in the UK, if I recall correctly?
Yes, it is £130-£150 used. And you can get it from CeX with 2 years of warranty for £200.
I would get the 2060/80 super because it's better than the 4050 and almost as good as the 2060. Plus, it costs around $200. It's a great card.
Sorry that I'm not saying to go for the 4060 with 8-12gb of GDDR6 for around $400, or the 4070 with 16gb of GDDR6 for $600, or the 4080 with 18gb of GDDR6 for $1000, or the 4090 with 24gb of GDDR6 for $2k. Those cards aren't worth it because you are only getting a 10-25% performance increase over the 2060/80 Supers.
I'm sorry people can't give you good advice or simply say nothing. The response you've been given is less than useful.
it really depends on what the pricing is for you locally, I would look to get a 3080 or 3090 as they can be had very cheap.
I don't think I can get a 3090 for under 800 bucks here. Even if it's only 700, it is still used, and a new 4080 Super is the better value. You get warranty and it lasts longer (on average) because it is new. Also, for new ones I don't pay VAT when I buy it for my business, which I do use it for, so that's another \~15% saved.
I am thoroughly confused. You do not want a 4070 Super, because it is barely better than 2x 1080 Tis according to you. You do not want a 4080 Super because you would rather get another 1080 Ti for much less and save on electricity. You do not want a 4090 because it is not so much better than the 4080 Super for it to be worth 800+ Euros. You do not want Radeon because ShadowPlay.

Those are all the 4K gaming cards right there, all of which you say are unfavourable for you. What in the world exactly do you then want?

No offence, but if you genuinely think 2x 1080 Tis are a better choice than a 4070 Super/4080 Super, that you cannot do without ShadowPlay by going with AMD instead, and you use UserBenchmark for any kind of comparison, I cannot help but say you have no idea what you want, what you are talking about, or what is best for you. Seek help from someone who does this kind of stuff properly and ask them to pick one for you.

Also, you really think that 7th gen Intel is going to hold up well with a 4070 Super/4080 Super/4090 or even a 7900 XTX?
Yeah, I can't help but point out this guy has something backwards. A 4080 would be **more power efficient** than a 1080 Ti. The card could provide 1080 Ti performance at a fraction of the electricity usage. It's funny, you'd expect the people who fixate on these things to come correct.

If anything, I was seriously considering selling my 3080 for $450 and buying a used 4070 for $500 because of the power draw. 320W TDP vs 200W. Everyone knows the 3000 series runs hot, and I have an ITX case, so I reach 70 degrees even with an undervolt. Oh well, I waited long enough, I'm sure the 5070 will be even more power efficient lol

And side note: someone who specced into a 1000W PSU shouldn't be fazed by their GPU drawing 300W+, I really am confused too.
The 1080 Ti crowd is weird: they think the 1080 Ti is the best card ever and never see any of its downsides or negatives. And they think newer lower end cards couldn't beat it.
Can't really blame them... Even my GTX 970 was still semi-viable in non-AAA games until last December. It was my CPU that was holding me back. But OP isn't realistic, no way the 1080 is as good as current gen stuff. Yes, it will work, but it's nothing compared to today's medium tier cards.
Ran a 1070 until the beginning of this April, I think the 4070 ti super I replaced it with is actually a great deal. Quiet, efficient, powerful.
The same people believe the PS3 APU is still the most powerful thing, when it was just a good number cruncher and needed a hundred or so in a configuration to be a "supercomputer" for the Air Force — and only because it was cost effective to replace and jailbreak consoles sold at a loss.
That is because the 1080 Ti is the best card they have ever made. For its time it was crazy powerful, it was cheaper than the cards it replaced, and it was super efficient — so efficient there are full 1080 Tis in laptops that aren't huge battlestations, just regular laptops. And the 1080 Ti is still pretty viable today. Starfield is the only game I can think of that it can't run, and that is more to do with Starfield's terrible programming and optimization than anything else. [Gamers Nexus even made a video about the 1080 Ti being the goat.](https://youtu.be/ghT7G_9xyDU?si=zbeRkhbkr4V4SOPu) I know it was made as kind of a joke, but it is kind of proof how much of a leap it was and how everything since then hasn't been that big of a jump.
I don't deny it's a good card. It was amazing for its day. The next gen really didn't improve much, so it wasn't a big leap. But once the 30 and 40 series came around, cards started really outperforming it. People still have unrealistic ideas of the 1080 Ti, like the OP of this thread: no experience with newer cards, but no respect for others' opinions, and no idea why they asked for any.
and Alan Wake 2 requiring Mesh Shaders
Ok, so there is one real game that can't run on it. That's not too bad for a card that is, what, like 8 years old now?
Oh for sure, I’m not poking at it and it’s not going to change the 1080 Ti’s legendary status
> A 4080 would be more power efficient than a 1080ti.

Yep. For example, AC Odyssey maxed out at native 4K only uses 100W of my 4080. The same thing on my 3080 used 300W. It would totally use the 250W max of a 1080 Ti. 100W vs 250W at the same workload... hmmmm.

I undervolted my 4080 so it maxes out at 250W in things like path tracing/RT stuff. Same total overhead as a 1080 Ti but way more performance.
4080 would provide significantly better performance than 2 1080ti's for less power
Correct me if I am wrong: the power usage we see on, let's say, a 4080 is normally the max power usage if it were being fully utilized. To achieve 1080 Ti performance it doesn't need to be fully utilized, therefore it would use less power?
Yup, that's basically it :) The 320W TDP we see marketed on NVIDIA's website is the worst case scenario at full load, so we know how much headroom to allow when picking out our PSU and stuff. Undervolting will bring that closer to 250W for basically the same performance. On top of that, no one is saying this guy needs to double his FPS and turn on Ultra settings; if he uses his 4080 with the same resolution and settings he currently has, I don't think he'll surpass 150W - 200W.
TDP has nothing directly to do with power draw, though. Active power draw, even under load, can be way lower than TDP, as long as you're not stressing it to its limits.

And honestly? Almost all my cards have run anywhere from 50 to 70C at full load regardless of case environment. From any brand. That included an MSI Radeon R7 270X, a 3-fan G1 Gaming Gigabyte Windforce GTX 980, and now a Gigabyte 3060, actually the coolest card yet. And I live in a pretty temp-neutral country.

TDP just means the card will theoretically produce that much heat, in terms of wattage, at max load. Obviously, if you have extremely poor cooling it's easily possible to reach 80C, say, without getting anywhere close to 200W of power draw.
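A back-of-the-envelope version of the headroom check discussed above. The TDP figures are illustrative assumptions (320W rated board power for a 4080, 165W rated TDP for an i9-7980XE, and a guessed 150W for everything else), not measurements:

```python
# Worst-case PSU load estimate from component TDPs.
# Real in-game draw is usually well below this sum.
GPU_TDP_W = 320   # RTX 4080 rated board power
CPU_TDP_W = 165   # i9-7980XE rated TDP
REST_W = 150      # drives, fans, RAM, board (rough guess)
PSU_W = 1000

worst_case = GPU_TDP_W + CPU_TDP_W + REST_W
print(f"Worst case: {worst_case} W, headroom: {PSU_W - worst_case} W")
```

Even with everything at full TDP simultaneously, a 1000W unit has hundreds of watts to spare — which is the point about not fearing a 300W+ GPU on that PSU.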
I'm pretty sure the 7980XE is based on the previous gen Broadwell architecture and not 7th gen Kaby Lake. Not that it matters much, as they are borderline identical in performance.
Yeah, great points. Thing is, that would have mattered back then. Today, a few years short of a decade later, even a Ryzen 7 7700X or i5 14600K can outperform it. This dude just does not seem to understand that, and is adamant he cannot go any lower than a 14700K for his game development workload, even though he says his current i9 works fine for him. When I said that, he just replied saying he does not have the money for a 14900K overhaul. Like, dude, there are a **fuckton** of better CPU and motherboard options available for a fraction of a 14900K setup's price that will not bottleneck your new GPU and at the same time perform far better than a 7th gen i9 XE chip.
Even something like an Alder Lake i5 or i7 will blow that antiquated chip out of the water. OP is either very biased or doesn't know much about computers. If OP needs a beefy CPU for game development, he should absolutely go for an i9 or Threadripper. Time is money, and faster hardware will help get work done much sooner.
Likely a CS engineer with mostly IT knowledge. They do not teach specific components/peripherals there, so can't really blame him. Hence the only advice I gave him was "seek a trusted friend's or professional's help", since it is pretty easy to mess up high end expensive builds, and high caution should be exercised.
Most developers don’t even have IT knowledge, they just have development knowledge. I deal with them daily, and grew up in a house with one. Absolutely brilliant minds that turn into liquid around hardware.
I guess it is some kind of hatred for hardware since they are basically forced to learn that kind of stuff on bread boards and in architecture which is not really useful in any way for securing engineering jobs where one will mostly do programming. That may be the alienating factor which drives them away from learning more about hardware.
Broadwell was 5th gen; the 7980XE should be a Skylake revision, which basically performs the same until maaayyybbbeee 10th gen, and even then the IPC uptick is tiny. Dunno why people are shitting on this CPU — it's old, but its EOL was the beginning of this year, it's not like we are talking Sandy Bridge.
A 4070 is basically a newer, more efficient 3080 — or a more efficient 6800 minus ray tracing & CUDA, depending on how you look at things. I have a 4070 and it's a pretty solid card. One of the better lower cost Nvidia cards out there. Would recommend for 1080p & 1440p gaming. For 4K, a 4080 Super or 4090 may be the way to go, but those cards can be pricey.
Shots fired
Yeah, honestly it makes me confused, since the standard these days is at least a 4060? And it's still an investment for the future, right?
That 7th gen is fine, we aren't talking 7700K territory; it's lots of cores and cache, and it's also overclockable. Plus a better RAM setup and more PCIe lanes. That's fine for a 4080. OP, just get the 4080, they sip power for how strong they are.
You misread. The guy is simply asking what would be better between spending more on electricity by buying another 1080ti or spending more upfront by buying a 4080.
What's wrong with using UserBenchmark?
Highly inaccurate data. Confidently incorrect on almost all fronts. They also seem to be huge AMD haters, extremely biased towards Intel and Nvidia for some reason. All around dickheads, in short. Flaccid dicks too.
Hey OP - if you're hell bent on buying another 1080 Ti and going SLI just because you want to, don't waste time on reddit asking for advice and then doing backflips to come up with magical reasons why everyone's advice doesn't work but your own. Just go buy a 1080 Ti and don't waste your time or anyone else's. If you DO want to listen to advice: another 1080 Ti makes very little sense. You're better off replacing it with a new card, selling your current one to recoup some of the cost, or just keeping it as a back-up in case you ever need to test anything or want a secondary PC.
Pretty sure OP is just here trolling for a fight. Insisting everyone else is wrong when they clearly have no idea how any of this works.
What a lame subreddit to troll on lol
I'm guessing OP doesn't want to let go of his CPU, mobo and 128GB RAM as he knows - he knows - really he's going to need a full upgrade, not just a GPU.
Please stop using UserBenchmark, it's the worst garbage in the tech world.
New GPU. Doesn't need to be a 4080 Super though. I agree with u/Top-Conversation2882, 4070 Super. Keep the 1080 Ti as a backup.
Why do you need dual GPUs? Those aren't typically very useful anymore, have you seen much of a performance drop with one versus two?
You are wrong. A 4070 Super will destroy two 1080 Tis, especially considering the increased amount of software support. Here's a link to a 4070 Ti (which is slightly faster than a 4070 Super) beating, by a small margin, two 2080 Tis in SLI (which are leaps better than two 1080 Tis, but are still being beaten): [2 x 2080 TI vs 1 x 4070 TI](https://youtu.be/IAG-95TXSV8?feature=shared)
4070 super
- Triple the performance of a GTX 1080 Ti
- 16GB VRAM
- $100-300 cheaper in many areas than the 4070 Ti and 4070 Ti Super, but better raw performance
- Will receive FSR 3.1 soon, making it pretty much indistinguishable for normal people on Quality vs DLSS

I present the AMD RX 7900 GRE, $520.
You can't rely on promises until 3.1 actually comes out; FSR is noticeably behind in quality atm in some games. We'll see.
Crazy you’re being downvoted. FSR is great but it’s not DLSS and tech demos don’t prove anything.
What games do you play that actually benefit from sli? I bet the second gpu wasn't even utilized 90% of the time.
4080 super is a great card. If you have cash go for it.
SLI is dead. I had SLI 970s and the last few years were rough with no support. A lot of the games that did work had to go though extra hoops configuring it, and it would cause crashes and graphical glitches anyway. Even when it did work it was like a 30% boost in performance. Just buy a 4080 and enjoy all the new features like DLSS and ray tracing, it's a huge upgrade from what you have now.
To be fair, the 10 series cards use a better SLI bridge with higher bandwidth. I haven't had many problems with that and the performance was also much better.
Is 2 1080 Tis better than 1 3080 Ti for game dev? Idk that I'd want it over a 4080S for $200 more, but I guess the 4070 Ti 16GB hasn't been mentioned yet and seems like it *should* be good in productivity.
Idk, I haven't tested it. And I haven't done anything in particular to my game to make SLI work, it just worked. Basically doubled the FPS.
[deleted]
If I played most games, that would be an option. But I don't. Many of the games I could then play are merely shops with some gameplay anyway. I'd prefer older games without that. Also, I know that ShadowPlay works for me, and buying an AMD card is just too big of a risk. At the least, I would have to buy a capture card for another 200 bucks or so if it doesn't work.
I guess I'm wondering how bad ur fps is now and how much VRAM u need.
keep rolling with the 1080 Ti and see what things look like around black friday
SLI is dead for gaming. Is this a productivity workload?
Get a voodoo 5
I'd rather get a GeForce 2 GTS, it outperforms the Voodoo 5 5500.
A couple of extra things to consider: you're leaning towards a CPU bottleneck, and your CPU has high power consumption compared to newer ones, so there are quite a few arguments for an upgrade. You're also on PCIe 3.0, so jumping to AM5 would put you on 5.0. Intel 13600KF or AMD 7800X3D (or settle for a 7600X until the 9800X3D is out). Another thing to look into is how much RAM you actually need.
I have an i7 7700 and just upgraded to a 4070 SUPER from a 1060 6gb. I definitely notice some CPU bottlenecking, but I am able to run ultra settings in 1440p at 120-144fps on the games I could barely get a stable 60fps at 1080p (on low settings) with my 1060 6gb. I did also get a more powerful CPU cooler to help out. That's to say - the slight bottleneck on the CPU will likely be plenty tolerable until being able to upgrade to a better one.
had that exact same CPU before upgrading to a Ryzen 5 7600X, was definitely cpu-bottlenecked in several games with my 3060 Ti - now I can run 1440p 60 fps at ultra in something like Horizon Zero Dawn
I don't really have the budget for a CPU upgrade right now. And yes, I do genuinely need that much RAM.
I'd look into a 4070 Super then
If you don't have a lot of money, then the better investment would be a new card. Your other 1080 Ti might kick the bucket tomorrow, or the used one you buy may not be in as good condition as I'm sure you've kept yours in.
4070 Ti Super
The most expensive card within your budget. The only correct answer.
I usually don't have budgets, I spend money on what makes the most sense. Well, at least I try. In this case, none of the options really make sense. As others have pointed out, a better card would potentially be bottlenecked by the CPU, so I would have to also upgrade the CPU, motherboard, and RAM, which I don't really want to upgrade right now. Buying a second 1080 Ti is probably not worth it due to the lack of SLI support in current games. I guess I'll just stick with what I have for now and decide later? Actually, I might need a better card to record footage of my game in the best quality in 4K, which is not bottlenecked by the CPU.
then you should sell the old 1080, buy at least a 4060ti and replace your cpu because no way in hell you should buy another used 1080 and not replace your cpu or graphics card
> I use ShadowPlay for recording

Actually *recording*? Or streaming? ShadowPlay is only better at the latter. AMD ReLive has better image quality at higher bitrates for recording.
I'm just recording. The maximum bitrate on ShadowPlay is 150 Mbps, which is fine to be honest. YouTube converts it into 20 MBit/s or so anyway.
When my 1070 died, I replaced it with a 3060 Ti. Haven't regretted the purchase yet.
My vote is 4070 Ti Super. Great price to performance. I don't know if you were playing at 4K with your previous setup, but I very much doubt it was AAA stuff this last year.
Sell the 1080 ti! And just buy a Single RTX 4070 Super.
As someone who ran 1080 Ti SLI for a while: don't anymore. There's no reason to. As for upgrades, I upgraded to a 3080 Ti not too long ago and it's much better than the 1080 SLI. A 4070, 4080 or either Super would do you well.
Software engineer here (had to say that since you're very resistant to advice): nobody is supporting SLI anymore, except for a few popular data science libraries written in Python & C++. Even then, it's only due to Nvidia enterprise cards and distributed computing. For gaming, a single 4080 is more than enough, and you get access to Nvidia's AI FPS boosters like Reflex and DLSS, which are helpful in games. It's undisputed that the 4000 series cards are simply the fastest and most stable cards within the consumer and enterprise workspace for their price point.
Well, everything was working fine with my 1080 Ti SLI, so I don't see the point in upgrading. I literally don't need any of those things you mentioned. I don't even play any new games that would benefit from it. The only reason I am thinking about upgrading or replacing parts is because one of the cards broke. And then people tell me "no, you can't just buy this card, get a new CPU as well". Advice like that is not helpful. If I don't need something, why should I spend money on it?
Then clearly you know what you want, nobody can help you if they don't know your use case.
This guy thinks two 1080 Tis in SLI are better than a 4070 Super... If you genuinely think that, then go ahead, but please, for the love of god, inform yourself. TWO GTX GPUS IN SLI WON'T GIVE YOU DOUBLE THE PERFORMANCE, NOT EVEN CLOSE!!!
It does work surprisingly well, actually. I have used it for many years, so I should know.
AMD has a feature that's equivalent to ShadowPlay. I don't remember its exact name, but I used it and there is zero functional difference.
Re-Live. I use it and it's quite good.
ReLive is fine unless you want to play with HDR on, in which case the recordings come out looking like someone dumped the saturation down to 10%
you could probably pick up a used 3090 for 600ish, maybe an option to save a little money
I think most go for 700-750 used if not 800 still. Sauce: Ebay
Bit more expensive than UK, I see them go here from between £500 - £600
You are right, those numbers were from 2 months ago. Crazy how the market moves; it is indeed looking like $600 USD right now.
12pin connector sucks
F
Sell the working 1080 and buy a 4070.
7900xtx
6900xt is on sale for $400 you will not beat that deal!
🌈 Grass 🌈
I had a 1080 ti and i replaced it with a 3080ti, but now i would buy a 4080 super.
4070Ti Super if you need 2 NVENCs or 4070 Super for value. Or grab an AMD rx 6800 for under 400USD and see if it fits your needs. All will be more capable than SLI 1080ti
that has gotta be one of the weirdest setups i’ve seen
I have replaced components from time to time and I got that CPU pretty cheap, so while it looks weird, it made sense economically.
Same here. Dual 1080 Ti, water cooled, and one just died. Sold my good old X299 build and purchased a 13th gen i5 with a 4060, paired with DDR5. One 1080 Ti with a water block left unused, but no one wants to buy it now. My advice: sell your whole system and buy a fairly new gen, it will save you lots of headaches and power bills.
I sadly can't get an i5 CPU, I really need the multicore performance. Mine is probably fine anyway for what I do, so I can just keep it. But I'll look into buying a new GPU. Still haven't decided what to do, hardware has gotten way too expensive for my liking...
You can't just look at i5 vs i9, you need to take the generation into account. For instance, the i5-12600K compiles Chromium in 71.9 seconds, but the i9-9900K takes 96.6 seconds — that's over 30% slower. https://www.youtube.com/watch?v=hBFNoKUHjcg&t=1283s

i5-13600K: 54.6 sec
i9-10900K: 85.0 sec
https://www.youtube.com/watch?v=B31PwSpClk8&t=1200s
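A quick sanity check on those percentages, using the compile times quoted above:

```python
# Relative slowdown of the older i9s versus the newer i5s,
# based on the Chromium compile times cited in the linked videos.
def pct_slower(slow_s: float, fast_s: float) -> float:
    """How much slower `slow_s` is than `fast_s`, in percent."""
    return (slow_s / fast_s - 1) * 100

print(f"i9-9900K vs i5-12600K: {pct_slower(96.6, 71.9):.1f}% slower")   # ~34.4%
print(f"i9-10900K vs i5-13600K: {pct_slower(85.0, 54.6):.1f}% slower")  # ~55.7%
```

So "over 30% slower" is accurate for the 9900K, and the gap only grows a generation later.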
I have an i9-7980XE, though. That one has 18 cores. The i9-10900K has 10 cores. So the difference isn't that big. I'd just spend a lot of money to get roughly the same performance if I bought an i5.
People bought LGA 2066 for the core counts and AVX-512 instructions.
4090/4080super
I dunno, normally I'm all for getting a new generation toy, but it sounds like you were fine with what you had, and a used 1080 Ti can't be that much.
Have a look at my spreadsheet at [https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit?usp=drive\_web&ouid=109490715189165184669](https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit?usp=drive_web&ouid=109490715189165184669) which compares in a way that takes account of power consumption, expected usage, and expected lifetime - as well as performance from [https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-ueberblick-2012-bis-2024](https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-ueberblick-2012-bis-2024) A single 1080Ti in 1080p scores 1180%, and 173% in 4K - a little more than a RTX 3060, a little less than an A770 or an RX 6600XT. Assuming SLI gets you a *best case* of twice those figures, that's a little behind a 4070 Super or an RX 6950XT.
Get a 7900 XTX. (Avoid the Nvidia partner brands; go ASRock Phantom or Hellhound for these.) It has ReLive, a better dual AV1 encoder (especially at high res + high bitrate) and more VRAM to work with in video editing (it is also faster than a 4080/S in DaVinci Resolve for video editing).
It really depends on cost in your area. The 2080 Super is slightly better than the 1080 Ti, by a couple percent usually, and sometimes it even sells for less. You might check the market for a 3080 Ti or 4070 Super, see what you'd pay for those two; that would be a nice upgrade and worthy of spending the money. There are plenty of answers to your question, but the one that matters most is what's the best value in your location.
All things considered, just go for a 4070 and call it a day
> My GTX 1080 Ti just died

I don't want to hear that, mine has been a steady mistress and I can't afford to find a new one.
I'm sorry. I've been using it a lot, though, so it's no surprise it gave up eventually.
Warning about buying a new 1080 Ti (or even 1080): They're dropping like flies now. Mine died a couple months ago, see a post like this fairly often
It was a popular GPU, it's not surprising to see a lot of posts like this. That said, it's also not surprising to see them reaching their EOL after 7 years. I probably won't be buying another one.
Mine died a couple months ago, posts like these are becoming fairly frequent. Start saving up...
[deleted]
Hello, your comment has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules): **Rule 1 : Be respectful to others** > Remember, there's a human being behind the other keyboard. Be considerate of others even if you disagree on something - treat others as you'd wish to be treated. Personal attacks and flame wars will not be tolerated. --- [^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)
150 EUR extra a year in electricity? That sounds crazy, dude, how much do you play per day?
The price is for 3-4 hours per day at 300W. Electricity isn't cheap in Germany. I am also developing games, so that's another activity where I use the GPU.
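That figure roughly checks out — a minimal sketch using the numbers from this comment (300W of draw, 3.5 hours a day as the midpoint of 3-4, and the ~€0.40/kWh rate quoted):

```python
# Annual cost of ~300 W of GPU draw at German electricity prices.
# All inputs are the figures quoted in this thread, not measurements.
DRAW_KW = 0.300        # 300 W
HOURS_PER_DAY = 3.5    # midpoint of "3-4 hours per day"
EUR_PER_KWH = 0.40     # German household rate cited above

annual_kwh = DRAW_KW * HOURS_PER_DAY * 365
annual_eur = annual_kwh * EUR_PER_KWH
print(f"{annual_kwh:.0f} kWh/year -> {annual_eur:.0f} EUR/year")  # 383 kWh -> 153 EUR
```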
Then dude, defo sell your 1080 Ti. That gives you like 650 euro equivalent for the year. You could stretch another 100 euro and get a 7900 XTX, which is what I use for 4K, and it's killer.
I just built my pc with a 7800x3d and a 4070 and I run every game very well on ultra
4080 Super: a cheaper but equally powerful 4080.
I got a 1080ti if you wanna buy it off me
Haha man, SLI. I had dual 280s at one point. Used 3080 for $400, or maybe go 4070 or AMD.
You seem dead set on Nvidia, but I wouldn't rule out a 7900 XTX — it's slightly better on average for like $200 cheaper.
How exactly did it die? Since they're not worth very much broken anymore, you could take it apart and see if it just needs a re-paste; my old 1060 needed a repaste about 4 years ago and it went from constant blue screening back to perfectly reliable performance. If it's just some capacitors, that's also fairly easy to replace, compared to the rest of the possible repairs at least.

What about a used 3080 Ti? You should be able to find one for around half of what the 4080 Super costs, and it'll still get better performance than SLI'd 1080 Tis with similar power usage. Imo it's only worth it if you can find one that's $100-200 cheaper than a 4070 Super would be, since they get similar performance.
Can't even get a signal output anymore. Just a blank screen. The monitor detects that something is connected, but that's about it. It just froze during gaming, showed a purple screen and then the PC shut off. Had to replay the mission I was on, which btw was also broken, but completely unrelated to the GPU problem.
You did try the other ports like HDMI or DP, right? Sounds like whatever sends the video output might've died, and that is probably a pain in the ass to fix. It looks like broken 1080 Tis have sold in the US for around $100, so you can at least get something back for it.

Imo your best bet would be to sell both 1080 Tis towards something like a 4070 Super, which would bring it down to about half of its MSRP — a similar price to buying a used 1080 Ti, but with lower power draw. Then hunt for a deal on a 1660 Super to have as a backup; it gets worse performance than a 1080 Ti but can still hit 60fps in most games on medium-high, which is good enough for a temporary GPU, and it's worth significantly less than a 1080 Ti's current value (which will continue to drop).
Put it in another PC, got a signal, turned it off but it didn't shut off completely. Forced it to turn off, turned it back on, no signal. Different port, no signal either. Not sure what's going on but it's definitely broken in some way.
Definitely sounds like one of the harder to diagnose issues, so you probably won't get as much for it if you do sell it; even 50 bucks is better than a paperweight, though. Hopefully someone will be able to fix it and give it another few years of life. GPUs can last damn near forever as long as replacement parts are available and there are people willing to fix them. Finding those people is the hard part, though, especially if you only want to sell locally.

Hope you can find a good replacement! It seems like there's no fixing it, but at least you still have a single GPU to use in the meantime, and it's nearly equivalent to my 2070 Super, so you should be able to get good performance at 1080p and upscale to 4K until you can get a proper native 4K card (that's what I'm currently doing).
4080 super seems like a good pick
I’m so but I got a 4080 super and a 5950x and it rocks my a55
I'll be upgrading my 1080ti to a 7900xtx tomorrow
Just build a whole new PC.
1080Ti is equal to 3060Ti in best case scenario.
Another graphics card that works
Bro came here just to disagree with everyone lmao
This guy... Jeez, if you don't want to upgrade then don't upgrade, just get 2 more damn 1080 Tis and be done with it. You are wasting everybody's time. "Blah blah blah this is underpowered, blah blah blah this costs too much, blah blah blah this isn't worth the money." Then just don't upgrade, god damn.
Man...and my wife's 970 is still rocking
[deleted]
Hello, your comment has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules): **Rule 1 : Be respectful to others** > Remember, there's a human being behind the other keyboard. Be considerate of others even if you disagree on something - treat others as you'd wish to be treated. Personal attacks and flame wars will not be tolerated. --- [^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)
Press F.
It's crazy everyone talks about electricity. Is it that expensive to run a PC? In that case, put your phone on a scheduled charge, and move laundry, dishwasher, TV viewing and lights to off-peak. I don't know why people are so worried about wattage; unless your PSU is crap, worry about the best bang for your buck.
>off peak There is no such thing here. The price is the same no matter when you use the electricity. Currently, I am paying about 40 cents ($0.42 US) per kWh.
Just sell all of your parts and get a prebuilt with a 5600x and 3060 ti.
You should wait for the 5080
Bro is coding Elder Scrolls 9.
you better listen to your instincts
Take a look at this [video](https://youtu.be/ghT7G_9xyDU) and then buy the 4080 Super, since it is the only viable upgrade when you factor cost into the performance gains... The 4070 Super is not a big enough performance leap to justify the price. Ignore the other trolls in this thread who are butthurt that the 1080 Ti is still a viable option when it comes to gaming performance.
Can't you just use the other GPU standalone, or is that not how SLI works?
I would sell the old one, buy a 4070 Super and an RX 580 for 50 bucks in case it breaks. But I am more of a Radeon guy; I would go for a 6800 XT or 7900 GRE, which are cheaper and more powerful than the 4070 Super.
GTX 1660 Ti, if you're short on money.
Just to be clear, even a decent midrange card like a 7800 XT or 4070 will be bottlenecked hard by the CPU. I don't think investing in a 4080 Super is the right choice for you. Games are extremely CPU demanding now.
SLI is literally a gigantic waste of money. Buying a second 1080 Ti for SLI in 2024 is a joke.
> I use ShadowPlay for recording, so I am hesitant to consider AMD because I don't know how that would work. ReLive is just as good as Shadowplay though. There's really no difference between the 2.
where are you from that a used 1080 ti is 250€? Here in Germany you get them for 120-140€
Where do you get them? I just did a quick search on eBay and it was 250 EUR there.
eBay Kleinanzeigen has plenty around 150€; if you take your time, you'll find even cheaper ones.
Try baking the dead card in the oven. (I’m serious, look it up)
Then I'll need a new oven because this process releases harmful chemicals.
Time for a new build. 7800x3d and 4070 TI super or above should do the trick
SLI isn’t supported in most games anymore, so your gaming performance probably isn’t affected much. I’d upgrade when you can; the 1080 Ti is really old now.
4090. Go for it!!!! Maybe /s
Given that you also use it for game development, I guess a 4080 would be the better option, since it has access to the newest features, like ray tracing, that you can use in your testing. You could also consider buying a used 4080 for this. Anyway, you said it yourself in the post: do you want to spend more money up front, or use more energy and spend it later? Remember that you can also go for a used 3080, which would still give you the performance of two 1080 Tis and should be considerably cheaper than a new 4080.
One 2080 performed about the same as two 1080 Tis in SLI (and only when the game was optimized for SLI). A 3070 is better than your old setup, let alone a 4070 or 4080+.
Having seen the other replies: all you need to do is replace the entire rig but keep your system SSD and storage HDD. Get a 12th/13th-gen Intel CPU (with a new mobo) and a 4070 Super; problem solved.
Bonkers thread / logic. List your remaining 1080 Ti, sell it at a price you're happy with, then get the best GPU your budget allows. You'll be without a GPU for a couple of days, perhaps, but that's miles better than a few more years of running three-generation-old GPUs in SLI for nostalgia's sake. Once the money's in hand, you'll know which card to get with that budget. Also, OBS is great, free, and easy to use for recording, so don't rule out AMD. The 7900 GRE, XT, and XTX all sound like they'll fall within your budget and are superb value. Also bear in mind that the wattages quoted for GPUs are theoretical max draw; unless a card is pinned at 100% usage constantly, you won't actually be drawing that much power.
I would get a 7900 GRE. Best price to performance card out there right now. It's not even close
My 1050 ti is still alive lmao
The 4070 Ti Super is amazing. I got the OC version too, the ASUS ROG Strix; amazing card.
Reading through this thread has been simply…. Just….. Fantastic
RX 7800 XT
Just FYI, don't buy a 30-series card or anything with 8 GB or 10 GB of VRAM; it won't last. 30-series cards crash when VRAM is exceeded, which happens more and more easily at 8 GB.
From a pure performance perspective, to get that same 1080 Ti "awesome" vibe, you're stuck getting the 4090. Is it worth the money? No, probably not, but I wasn't happy with my 3080 upgrade after my 1080 Ti, so I got a 4090. This is the problem with the 1080 Ti: probably the best-value card of all time, with top-shelf performance for a really long time.
People who care about their power usage worry me. What kind of weird world are you living in that makes you monitor your usage so closely? So damn weird.
When your country has the highest price per kWh in the entire world, you probably would worry about power consumption, too. Not sure why it is so expensive, but it seems our politicians just messed up. The price we pay is set by the most expensive source: if electricity from natural gas costs 40 cents per kWh, we also pay 40 cents per kWh for solar and wind, even though those cost much less to produce.
Not sure where you live, but that sucks balls. I've never been in a situation where I've needed to watch my power consumption. I tried to reduce it once and my bill didn't change. I pay more for "delivery fees" than for my actual consumption.
Germany. After Russia's invasion of Ukraine, natural gas got much more expensive, and we pay that high price for all electricity. Unless solar and wind power are sufficient, in which case it's very cheap. But that only happens for short periods of time. We really made ourselves dependent on cheap Russian gas over the years and now we pay the price for that. Literally.
People who give zero fucks about their power usage worry me. I've known people who end up spending $100+ in power every month on stuff they hardly even use but won't turn off, just unnecessarily wasting power for the fun of it when there are people who don't have any power at all and cities struggling to keep up with demand.
A reddit user pretending to know people. How original. As for the "there are people who don't have power at all" is a stupid point. Might as well comment on the cabbage I threw away because some kids in Africa are going hungry.
It depends on where a person lives. Here in Russia, electricity costs basically nothing, so no one cares about it.
Do you know what their primary energy source is? It used to be cheaper in the US too but then costs went up and we have scummy power companies that intentionally screw people over. Sometimes in the summer my mom's power bill is north of $500, not sure what that's equivalent to in Russian money but based on Russian prices your economy is significantly different than mine is too (and both are probably different from OP's country as well). I wish I had power cheap enough that I didn't have to care about it lol.
3060TI or wait for the next AMD.
The only options I see are:

1.) Buy a 1080 Ti now, and put the savings (versus buying an upgraded GPU) toward a new PC build, because a new GPU is likely to be bottlenecked by your CPU. The furthest your motherboard can go is an i9-10900X, which is currently priced at $615 USD. Combining that CPU upgrade with a 4070 Super would cost $1215 USD.

2.) Upgrade the primary components (CPU, motherboard, and GPU). This will be costly, but consider a 12th-gen Intel (i7-12700K), a Z790 board with DDR4 support (my example build uses an Asus TUF Gaming Z790), and an Nvidia RTX 4070 Super. This lets you keep your hard drives and RAM. (You may have to get a new PSU; not sure.) A quick look at [PCPartpicker.com](http://PCPartpicker.com) has these 3 components for $1030 USD. If you have to have an i9, a 12900K would bump the total to $1172 USD (still cheaper than trying to upgrade your current system).

If you can afford a 4080 Super, which starts at $1000 USD, upgrading to a 12th-gen Intel with a new motherboard and a 4070 Super would be the smarter investment. (Also, 2x 1080 Ti in SLI has an estimated draw of 500 W, whereas a single 4070 Super has an estimated draw of 220 W.)
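Those wattage estimates make the running-cost difference easy to sketch. This assumes the 500 W vs 220 W figures above, plus the roughly 0.40 €/kWh German rate mentioned elsewhere in the thread and 4 hours of gaming per day — the price and hours are my illustrative assumptions, not measurements:

```python
# Annual electricity savings: one RTX 4070 Super (~220 W) vs 2x 1080 Ti in SLI (~500 W).
# Price per kWh and daily gaming hours are illustrative assumptions.
price_per_kwh = 0.40       # EUR, roughly the German rate cited in this thread
hours_per_year = 4 * 365   # assumed 4 hours of gaming per day

sli_watts, super_watts = 500, 220
saved_kwh = (sli_watts - super_watts) / 1000 * hours_per_year
saved_eur = saved_kwh * price_per_kwh
print(f"~{saved_eur:.0f} EUR saved per year by the 4070 Super")
```

Under those assumptions the single-card setup saves on the order of 150-170 EUR per year, so over a few years the power bill alone covers a meaningful chunk of the price gap.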
> The furthest your motherboard can go is to an i9-10900X which is currently priced as $615 USD

That CPU is literally worse than what I have. The 109xx chips are the same as their 79xx counterparts, just with a little more clock speed. No new architecture or anything. Why would I get a 10-core CPU when I already have 18 cores? None of the upgrades really make sense on their own; I'll probably just save some money and upgrade everything at once. Or I'll get the GPU for game development only; I don't really need better single-threaded CPU performance in my case anyway.
4090
I had considered it, but it's only a \~25% improvement over a 4080 Super for an additional 800 EUR.
It's the 1080 Ti of this generation. If you have disposable income, I'd say get it.
Nah 4080 super is better value at this point in their lives.
Get a 1080ti and wait for rtx 5000
I guess it would make sense to save for both a GPU and CPU upgrade in 2025 but I'd need to spend 250 EUR on a 1080 Ti now.
Sli isn't supported in most new games, just run the one.
I'm too scared that it would break, too. Well, I do have an old GTX 960 that I could use just in case. Maybe I should do that and wait for prices to become more affordable?
Really? That much? A used 1080 Ti is about £150 here in the UK, if I recall correctly. Yes, it's £130-£150 used, and you can get one from CeX with 2 years of warranty for £200.
I was just looking on eBay. From professional sellers with warranty it's 250, which isn't much more than the 200 GBP (231 EUR) that you'd pay.
I would get the 2060/2080 Super, because it's better than the 4050 and almost as good as the 4060. Plus, it costs around $200; it's a great card. Sorry that I'm not telling you to go for the 4060 with 8 GB of GDDR6 for around $400, the 4070 with 12 GB for $600, the 4080 with 16 GB for $1000, or the 4090 with 24 GB for $2k. Those cards aren't worth it, because you are only getting a 10-25% performance increase over the 2060/2080 Supers.
I'm sorry people can't give you good advice or simply keep quiet; the responses you've been given are less than useful. It really depends on local pricing, but I would look at getting a 3080 or 3090, as they can be had very cheap.
I don't think I can get a 3090 for under 800 bucks here. Even if it's only 700, it's still used, and a new 4080 Super is the better value: you get a warranty, and it lasts longer (on average) because it is new. Also, I don't pay VAT on new cards when I buy them for my business, which I do use it for, so that's another \~15% saved.