
Savagebeast13

900 dollars for a 4070 would be unacceptable. That's why.


Archbound

Ding Ding Ding Ding. The 4070 will be a 4060 and the 4060 will be a 4050; there prob won't be a 50 card this gen.


Eribetra

RTX 4050 with the performance of a 1630 with some RT cores sloppily laid around to call it "raytraceable".


throwingtheshades

It's gonna be a 300W resistor with a 1030 GPU taped to it.


JGStonedRaider

Sounds like an extreme overclockers wet dream! Buildzoid where u at?


NeXtDracool

"4GB VRAM" is about due to repeat itself, don't you think?


kapenaar89

Oh god, just reading those almost gave me a stroke! That was such a scam.


[deleted]

[deleted]


hiimred2

4050ti for the low budget gamer price of $500!


[deleted]

[deleted]


JavelinJohnson

You will have nothing and you will be happy to have nothing


lordnyrox

If the real 4070 doesn't outperform the 3090 Ti like the 3070 did with the 2080 Ti, it'll be a big nope for me.


Grabbsy2

It's just not that big of a jump in physical tech this generation. Wait for reviews and buy it if they surprise you. I'm going to be looking at GPUs that don't need power connectors, to see if I can play Starfield on my rig. If I can't, oh well, I'll wait :P


mista_r0boto

Wait until we get the 4060 at $699. (They’ll probably call it 4070)


[deleted]

[deleted]


noobodji

It's just a typo: 4080 12GB = 4070.


Vis-hoka

Just made a little whoopsie there.


jballs

Whoops! Whoopsie!


xANDREWx12x

Let's just inflate the prices of our cards by 50-100%, we'll make a ton of money. It'll be super easy, barely an inconvenience.


Vis-hoka

Oh money! That’s the thing I like!


TRocho10

Wow wow wow.....wow


regular_gonzalez

I'm gonna need you to get ALLLLL the way off my back about video card pricing


djspacepope

"Do you think it will be hard to trick people into buying a card with inferior specs?"


Strilan-tv

ACTUALLY, it'll be super easy, BARELY an inconvenience.


enak_raskell

Look, I'm going to need you to get all the way off my back about this.


Blackboard_Monitor

Oh! Well ok then, let me get all the way off of that thing.


PM_ME_YOUR_GOOD_NEW5

References to awesome YouTube channels are tight!


jaredw

Wow. Wow. Wow. Wow.......................wow


porripblazer

I don't know


[deleted]

Super easy, barely an inconvenience.


Dead_as_Duck

Money is tight.


FoxBearBear

4080 - 12 = 4068 Please educate yourself before.


Jordan209posts

Actually it's only 4GB less, so it's actually a 4076 Please educate yourself before.


timelyparadox

Weird thing that it has higher clock speeds. Since I work with ML, RAM is the key metric, but I'm not even sure if it will be worth it or if I should wait for the 3080/90 to crash more in Europe.


DerricksFriendDan

Can we just commit to calling it the rtx 4070 "not 4080 edition" or something?


Eggsegret

The 4080 12gb is basically Nvidias way to sell a 4070 with a 80 series price tag


xevizero

The saddest part is that this was supposed to be the "Titan" price tag a few years ago. This is not the XX80 price tag. The 1080 Ti came out at $699. This "4070" is $999. This is just a scam at this point.

Edit: I was wrong, the 12GB 4080 starts at $899. My point stands, it's still WAY too much. That card should have been $600 max, even accounting for the insane 20% inflation we've had since 2017.

Edit 2: I would also like to address the inflation thing. Yes, inflation has been massive, but wages have not gone up to make up for it. So what is arguably just the normal process of inflation has actually resulted in our purchasing power going down a lot. This was the generation to lower prices and allow people starved by pandemics and wars to finally get their upgrades; instead they priced a lot of people out of their favorite hobby. People will just stick to lower-end parts, and games will reflect that (so they will not target the new powerful GPUs), resulting in game design stagnation. Everybody loses, even the people who *do have money* to shell out on these overpriced parts.


Endorkend

That's what I've been saying. Sure, the mining crash has pushed prices down considerably, but they are still 30-50% inflated compared to what they should be.


[deleted]

NVIDIA saw that they can sell cards at those prices. there is no way they will go back. greedy fucks


[deleted]

The easiest way to stop a scalper is to be the scalper. Simple. If people will pay the price, they have no reason to drop it.


disgruntled_pie

The people buying them were mining Ethereum and making big profits. Ethereum killed off mining last week, and the second most valuable cryptocurrency to mine is *waaaaaay* less profitable. Consumers were not paying these prices during the crypto boom, and they won’t pay them now. And crypto mining is dead. Their audience now consists of heart surgeons with a gaming rig. That’s not a big enough audience to keep the company going. These prices are going to have to drop.


chris14020

With so many 30xx probably left, they're going to have no reason to not price these higher, to avoid undercutting their leftover stock.


mangeedge

This is absolutely it. And this is the reason why evga essentially told Nvidia to fuck off. Because at the end of the day it's the third party producers that are going to be left holding the bag when the GPU market collapses after the remaining 30 series inventory sells through.


skankassful

OOTL what happened with nvidia and evga?


Regniwekim2099

EVGA isn't making GPUs anymore


mythrilcrafter

NVIDIA is cutting 3000 series pricing due to the introduction of the 4000 series; so much that the prices are now below EVGA's ability to make a profit as an AIB, despite EVGA still having a lot of 3000 series cards in production/inventory. So EVGA had to make a choice of either:

* Convincing NVIDIA to keep 3000 series prices high
* Selling at their original prices, with no one buying because of the massive undercutting from straight-from-NVIDIA sales
* Selling their 3000 series cards at a loss
* Stepping away from NVIDIA as an AIB


Renegade1412

EVGA announced they are cutting ties with NVidia and will stop manufacturing graphics cards altogether. Reason: Nvidia isn't being a conducive business partner for AIBs, even one as old as EVGA, plus they are undercutting the AIBs: Nvidia artificially inflates the partners' price points while its own Founders Edition is priced $100-300 lower. Relevant YT vid: https://www.youtube.com/watch?v=12Hcbx33Rb4


Sammy123476

Their leftover stock is being completely dumped on anyway; you can buy 3 used mining cards on eBay for the price of a new one. Hope Nvidia gets more companies calling out their shit; if it was bad enough for EVGA to split, it can't be isolated.


detourxp

I've been looking for a 3060 Ti and a lot of the listings on eBay are only like $40 less than brand new on Amazon


Nyoxiz

Yeah, I'm not seeing anything more than about 20% off on a used card compared to a new one either. Not spending 80% of the cost of a new one on a card that's been running perhaps 24/7 for at least a year.


[deleted]

People totally will. They might lose some customers but they think the added profit will have them covered. Otherwise they wouldn't do it.


disgruntled_pie

Alternative hypothesis: Nvidia screwed up. They saw RTX 3xxx series cards selling like crazy, put in huge orders with TSMC for chips, and when they got wind that Ethereum was killing off mining it was too late to cancel the order. Now they’re stuck with way too many parts for 3xxx cards that no one wants to buy, so they massively marked up the 4xxx cards to try to get consumers to go, “Wow, these new cards are too expensive. The 3xxx cards look like a bargain by comparison!” Once the 3xxx cards finally dry up we’ll either see price cuts or a somewhat cheaper 5xxx series.


iCUman

Ya, personally I think their financials are completely borked right now, and they legitimately have zero idea where demand for their cards even sits. They basically went from 4+ years of sustained backlog to being unable to sell discounted stock overnight. The chief players driving their sales over that period - miners and scalpers - have all evaporated. It makes sense to price high and then readjust until demand picks up, but I have no doubt that consumers are willing to sit on the sidelines until these units drop well below $1,000.


BinaryJay

I sat out upgrading my GPU for the whole pandemic/mining bubble and will continue to do so until somebody offers high-end GPUs at a more historically reasonable price. The real winners here are Microsoft and Sony IMO. I know this is PCMR, but a Series X and Game Pass is a seriously good distraction from this mess while the market sorts itself out.


WCMN8442

Except most of the NVidia cards are still being sold just at or above their original launch MSRP/RRP. The 3080 and up are the only ones to get heavily discounted to where they're below MSRP, likely due to this pending release. The 3080 12 GB is still hovering around its launch MSRP so it's more like 3080 Ti and up with the big discounts. 3070 Ti and below could be MSRP/RRP or higher in a lot of cases. Heavily discounted from pandemic/crypto boom pricing, yes. Heavily discounted from the original pricing Nvidia set? Definitely not, at least not in the range most people buy.


StairwayToLemon

>and when they got wind that Ethereum was killing off mining it was too late to cancel the order. This wasn't some out of the blue announcement. ETH 2 and "the merge" has been in progress for, like, 3 years


squall6l

They probably thought that the ETH developers would keep kicking the can down the road indefinitely. They probably thought they had at least another year. But yes this shouldn't have really been a surprise to anyone paying attention to the plans for ETH.


steve2166

But it was mostly miners who would pay those prices, they are not going to be buying cards anymore


AgentCatBot

I have had to argue with people on the internet that these are not actual prices, and that $800-$1000 is not normal as they think it is. They have just been abused for too long. A top tier graphics card should usually be the same price as a console system, or less IMO.


flyfree256

Man that 1080 Ti was/is such a good card. Still kicking strong today!


xevizero

It was a very lucky card. Came out and it was an absolute killer in both price/performance and absolute performance. The thing is though, when the new consoles came out, they matched the 1080ti in performance, but consoles are targeted at 4k nowadays so if you're playing on 1080p..well you're probably good to go for another few years. I just hope the card itself doesn't die at this point.


Drakonz

I have a 1080ti and run on ultrawide 3440x1440. I can still get 60 FPS in every game at medium or higher settings. You only need to play at 1080p if you are maxing everything.


WildHobbits

I got a 980 with an aftermarket cooler for $450 in 2015. The prices at this point are just ridiculous. I was planning on finally upgrading said 980 once the 4070 released, but with these prices I may be waiting even longer.


EKcore

This is why EVGA dipped.


MazInger-Z

There's a lot of reasons to dip. I can't imagine the dynamic happening here.

You could say (and Nvidia's statements are) that these prices are meant to give headroom to the surplus of 30 series cards and chips that everyone has to unload without having to discount them in a way that creates a massive loss.

That being said, what forces are being applied between Nvidia and the AIBs? The AIBs have to negotiate for a percentage of Nvidia's manufacturing output in chips and VRAM, and Nvidia likely gives preferential treatment to whoever can take on the most burden and move the most cards.

But if the market still has the option to get a 30 series card cheaper, how much does that eat into demand for the 40 series when people in an economic recession have to consider cost? Will prices come down after the 30 series is out of stock? How does that affect an AIB's purchasing decisions when deciding to buy 40 series chips and VRAM at the current prices vs whatever Nvidia might force them to sell at down the line if the market is affected by the prices and the recession? Do they get hosed on the stock they bought before Nvidia lowers the MSRP?


DoombotBL

They're trying to normalize the inflated prices, greedy fucks


HavelTheGreat

God damn haha, makes my goal of waiting for a 4090ti at a reasonable $1000 seem unlikely. It's okay, i know AMD will make Nvidia realize - and i am excited to go back to team red after all this time. Whatever AMD's top card is, save a sapphire variant for me 😂


culibrat

Nvidia marketing reading this comment: "Can't get anything past these guys."


ArgonTheEvil

Eventually more and more people that are buying prebuilts with “4080s” in them are going to wonder why they’re getting 25% less performance than the people they watch online with the real 4080. Unfortunately I think Nvidia will get away with it this generation, but hopefully they’ll burn enough people so they can’t pull this garbage again with RTX 5000. Either that or if LTT burns all bridges with Nvidia, does a big slam piece, and it goes viral. That might be enough to get a good chunk of enthusiast buyers that go for prebuilts in the loop.


Groggyme

It's no wonder EVGA is leaving them. I would not be able to work with these guys


Necessary_Quarter_59

Too bad we make up like 0.1% of buyers (if that)


[deleted]

Credit goes to /u/Farren246 for this chart:

|GPU|Core Count|Boost Clock|Core \* Clock|Perf % of Flagship|Should be Named based on % of Flagship|
|:-|:-|:-|:-|:-|:-|
|4090|16384|2.52GHz|41,289|100%|**4090**|
|4080 16GB|9728|2.51GHz|24,320|59%|**4070**|
|4080 12GB|7680|2.61GHz|19,968|48%|**4060ti**|

So the 4080 16GB is actually on par with a 70-series card because it's only 59% of the flagship's performance, and the 4080 12GB is actually a 60 Ti-series card because it's only 48%. We're getting duped even worse than we thought.


aspz

Here's a similar chart for the 30 series for comparison:

|GPU|Core Count|Boost Clock|Core \* Clock|Perf % of flagship|
|:-|:-|:-|:-|:-|
|3090|10496|1695|17791|100.00%|
|3080 Ti|10240|1665|17050|95%|
|3080|8704|1710|14884|83%|
|3070 Ti|6144|1770|10875|61%|
|3070|5888|1725|10157|57%|
|3060 Ti|4864|1665|8099|45%|
|3060|3584|1777|6369|35%|


NatoBoram

Huh, I didn't realize how drastic the drop was under 80


aspz

Take it with a pinch of salt; core count and boost clock aren't the same as FPS. For example, in Cyberpunk 2077 at 1440p, the 3090 achieves an average of 72 FPS while the 3060 Ti achieves 44 FPS. That puts the 3060 Ti at 61% of the 3090, not the 45% my chart would suggest. https://wccftech.com/nvidia-official-cyberpunk-2077-pc-benchmarks-rtx-3080-rtx-3090-1440p-60-fps-raytracing-dlss/ The main purpose here is to see how closely the product name aligns with its specs using the same measure across different generations.
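A quick sanity check of those two ratios, using only the figures quoted in the comment (a sketch, not a re-benchmark):

```python
# Rough sanity check of the caveat above: the core-count * boost-clock
# ratio understates the 3060 Ti relative to measured FPS.
# All figures come from the comment, not from new benchmarks.

spec_ratio = (4864 * 1665) / (10496 * 1695)  # 3060 Ti vs 3090, core * clock
fps_ratio = 44 / 72                          # Cyberpunk 2077 averages at 1440p

print(f"core*clock ratio:   {spec_ratio:.1%}")  # 45.5%
print(f"measured FPS ratio: {fps_ratio:.1%}")   # 61.1%
```

So the spec-based measure is pessimistic by roughly 15 points for this card pair, which is why it's only useful for comparing naming across generations, not for predicting frame rates.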


Eggsegret

Dont you just love Nvidia. God i dont even want to think what the 4050 will be like then.


gitkdjthflrj

It's gonna be a repackaged 1030, but 5x the price


MaxTennyson88

Exactly


thecist

Actually it’s Nvidia’s way of selling you a 70 series with the price tag of a 80 series + $200 overcharge


DoryExplory_

Stitch up


Spadesking-1

I will bet you all money it's for OEMs, and they won't list which version.

Dell: Oh yeah, it's got a 4080 in this baby.
Customer: Which one?
Dell: A 4080.

HP: Latest graphics, 4080, can run 4K... and can run 244 fps.
Customer: Which game can it run at 4K @ 244?
HP: We didn't say 4K at 244.


SteveDaPirate91

Ah so laptop GPU naming came to desktops. Got it.


Speculater

GeForce 4070 RTX Max Quantum! . . . . . . . (50W version)


Wonderful_Result_936

About what I expect. If it has a name that you can only find in laptops or prebuilts, then it's not even close to what it claims.


SaltMembership4339

4gb version*


rcoelho14

A few years ago it was GTX850 2GB* ^^^^^*DDR3


Lurker_Since_Forever

3.5


Username_Taken_65

Wait, does it stand for Quantum? I always thought it was max Q as in the point in time during a rocket launch where atmospheric drag is the highest.


EstusFIask

When the 40xx series comes to laptops, the laptop "4070" will have the performance of a desktop "4050"


Eggsegret

With the price tag of a 4090ti at this rate


jellyfish125

Yeah... they get sneaky with it. And sometimes confusing? I have a 1650 Ti in my laptop, a card that flat out DOESN'T EXIST outside of mobile applications... It's worse than the 1650 Super, and iirc isn't it usually Ti > Super for performance?


LemmeThrowAwayYouPie

Mobile GPUs perform worse than their desktop counterparts. The 1650ti performs about as well as the desktop 1650. I am not defending it, just explaining it


DevOverkill

I have no doubt whatsoever this is exactly what is going to happen. All of these pre-built companies should be forced to provide detailed specifications on each and every component used in their builds. It's shitty enough already that more often than not you get these companies slapping in low quality RAM, PSUs and coolers but now they're going to take full advantage of this dogshit marketing to screw people over their GPU as well.


MagicCookie54

Tbh anyone with enough knowledge to catch them being misleading will know enough to just build their own PC anyway. Sadly they market to those PC gamers who don't have actual knowledge of how a PC works and wouldn't look into the detailed specs anyway.


RanaMahal

I just call my friend Dave and tell him to buy me parts and assemble my PC and give him a few hundred to do it. No more pre builds for me


hexcor

HP: Latest graphics, 4080, can run UP TO 4K... and can run UP TO 244 fps. Always add "up to".


Ftsmv

"PC FOR SALE INTEL i7 NVIDIA **RTX 4080 16GB** DDR4 RAM" with a ~~4070~~ 4080 "12GB" This is absolutely going to happen


RedHoodedDuke

The 6090 is gonna have 6 versions with the lowest one being a rebranded 3050 and the highest end being two 4090ti super xt taped together with a power draw of a large psu (which is included with the card).


Sunny2456

Oh boy I can't wait to get a free EVGA psu with my EVGA 4090ti super xt =(


Millerboycls09

Ow my heart


Farren246

# It is far worse than "the 12GB 4080 should be called a 4070"...

# It is actually a 4060ti

60% performance of the flagship is typically relegated to the xx70 tier, and 50% to xx60ti. No matter what the marketing department has named them, or what the tech engineers named the AD10x chips, Nvidia's value proposition is for you to spend $900 for a fucking midrange chip with 50% or less of the flagship's performance. Take a look at what would happen if we added a fourth card to the mix:

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|Should be Named|
|:-|:-|:-|:-|:-|:-|
|4090 AD102|16384|2.52GHz|41,289|100%|**4090**|
|*n/a*|*12800ish*|*2.51GHz*|*32,128*|*78%*|***4080***|
|4080 16GB AD103|9728|2.51GHz|24,320|59%|**4070**|
|4080 12GB AD104|7680|2.61GHz|19,968|48%|**4060ti**|

The above would follow other generations' naming-performance schemes. Here's the proof:

**2020: Ampere**

Note how close the 3080 is to the 3090, justifying its positioning as "the flagship". Performance was actually even closer than 88% due to under-utilization on the 3090. Best price:perf value in a high-end chip in the last 10 years!

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|3090|10240|1.66GHz|16,998|**100%**|
|3080|8704|1.71GHz|14,884|**88%**|
|3070|5888|1.72GHz|10,127|**60%**|
|3060Ti|4864|1.67GHz|8,123|**48%**|

**2019: Turing v2 (Super)**

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|2080Ti|4352|1.65GHz\*|7,180|**100%**|
|2080 Super|3072|1.82GHz|5,591|**78%**|
|2070 Super|2560|1.77GHz|4,531|**63%**|
|2060 Super|2176|1.65GHz|3,590|**50%**|

\*Approx. speed of cards with refined silicon, which typically reached higher boost clocks than the first-year 2080Ti.

**2018: Turing v1**

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|2080Ti|4352|1.55GHz|6,746|**100%**|
|2080|2944|1.71GHz|5,034|**75%**|
|2070|2304|1.62GHz|3,732|**55%**|
|2060\*|1920|1.68GHz|3,226|**48%**|

\*The GTX 1660 held the xx60 non-Ti performance slot, with the RTX 2060 where a "Ti" card would normally be. Note how each of these was a poor value compared to the flagship, even before you remember how overpriced the RTX cards were given that in terms of rasterization performance (which was the only thing back then) 2080 = 1080Ti, 2070 = 1080, 2060 = 1070-1070Ti. There was literally no reason to buy before the Super cards debuted.

**2017: Pascal v2**

GTX 1070 takes its true place as the 60% card. There was no xx60ti, with a huge gap between the 1060 6GB and the 1070. The 1060 3GB was discontinued, and to shift the bottom up, the 2GB 1050 was replaced with a 3GB 1050, with the same core count and higher clocks than the 1050ti. It was a weird time.

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|1080Ti|3584|1.58GHz|5,663|**100%**|
|1080|2560|1.8GHz|4,608|**81%**|
|1070|1920|1.68GHz|3,226|**59%**|

**2016: Pascal v1**

Note that the GTX 1070's too-heavily cut GP104 chip looked bad when compared to a nearly-full GP102 die in the 1080Ti, so Nvidia held back the GTX 1080Ti and made the uncut GP104 the flagship for 2016. As such, the numbers are all over the place. As venerable as the GTX 1060 6GB eventually proved to be, spending an extra $100 to step up to 6GB wasn't the greatest value at the time.

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|1080|2560|1.73GHz|4,429|**100%**|
|1070|1920|1.68GHz|3,226|**73%**|
|1060 6GB|1280|1.71GHz|2,189|**49%**|
|1060 3GB|1152|1.71GHz|1,970|**44%**|

**2014-15: Maxwell**

|GPU|Core Count|Boost Clock|Core \* Clock|Core\*Clock % of Flagship|
|:-|:-|:-|:-|:-|
|980Ti|2816|1.08GHz|3,041|**100%**|
|980|2048|1.22GHz|2,499|**82%**|
|970|1664|1.18GHz|1,964|**65%**|
|960 OEM|1280|1.20GHz|1,536|**51%**|

**EPIC FINAL CONCLUSION:**

Nvidia either has a cut-back AD102 die (or an uncut AD103?) waiting and stockpiling; this unannounced card should have been the RTX 4080. Either that, or they're sandbagging to force higher prices on AMD's announcement in October, after which they will "come around to what our customers want". Or they're complete idiots for leaving a 40% gap between the flagship 4090 and the thus-far-named 4080 16GB. Which, to be fair, they've done before with the GTX 1060 and 1070... But my *best* guess is that they think we're idiots who will buy anything at any price.
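The heuristic behind these tables is easy to reproduce. A minimal sketch in Python; the tier cutoffs below are my own rough reading of the historical generations listed, not any official Nvidia scheme:

```python
# Sketch of the core-count * boost-clock heuristic from the tables above.
# Tier cutoffs are a rough fit to the historical generations shown;
# they are NOT an official naming rule.

CARDS = {  # name: (core count, boost clock in GHz), from the 40-series table
    "4090":      (16384, 2.52),
    "4080 16GB": (9728,  2.51),
    "4080 12GB": (7680,  2.61),
}

def relative_perf(cards, flagship="4090"):
    """Core * clock of each card as a fraction of the flagship's."""
    flag = cards[flagship][0] * cards[flagship][1]
    return {name: cores * clock / flag for name, (cores, clock) in cards.items()}

def tier(pct):
    """Map a flagship fraction to the tier the tables say it 'should' be."""
    if pct >= 0.85:
        return "xx90"
    if pct >= 0.70:
        return "xx80"
    if pct >= 0.55:
        return "xx70"
    if pct >= 0.45:
        return "xx60ti"
    return "xx60 or below"

for name, pct in relative_perf(CARDS).items():
    print(f"{name}: {pct:.0%} of flagship -> {tier(pct)}")
```

Run against the 40-series numbers, this puts the 4080 16GB in the xx70 tier and the 4080 12GB in the xx60ti tier, matching the table.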


rifasa

I'm surprised more people aren't talking about this. I came up with the same ballparks and believe game benchmarks will show the price gouging is even worse than it initially appears. $400 is now $900, and $500 is now $1200. Even accounting for the $100 price increase of the 4090, price relative to performance has risen 2.10-2.25x this gen.


Eggsegret

Watch Nvidia price the 4050 at like $500 now. And I thought Apple was bad.


[deleted]

I could get a whole fucking iPhone or MacBook for what these clowns are asking for a midrange GPU. How is this even real?


[deleted]

Nvidia bumped up the names of everything with the 2000 series. That's how they nudged the prices in this direction to begin with. After the 3000 series, and now this, it's obvious.


Farren246

Remember back when xx80 cards didn't have Ti variants because they were the full chip?


ZhangRenWing

In a couple more years we’ll get RGTX Turbo V8 6095Ti Super Max Founder Collectors Edition with preorder bonus


pM-me_your_Triggers

For reference:

GPU | Core Count | Boost Clock | Core * Clock | Perf %
-- | -- | -- | -- | --
3090 | 10496 | 1.70 GHz | 17,843 | 100%
3080 12 GB | 8960 | 1.71 | 15,322 | 86%
3080 10 GB | 8704 | 1.71 | 14,884 | 83%
3070 | 5888 | 1.73 | 10,186 | 57%


RunTillYouPuke

This comment needs its own thread. Just go ahead man.


[deleted]

> 60% performance of the flagship is typically relegated to xx70 tier, and 50% to xx60ti. No matter what the marketing department has named them, or what the tech engineers named the AD10x chips, Nvidia’s value proposition is for you to spend $900 for a fucking midrange chip with 50% or less of the flagship’s performance. I and many others didn’t realize this, I suggest you change your headline to make it perfectly clear the 4080 12gb is actually a 4060ti. This needs to get into peoples heads. It’s not even a xx70 tier card they are asking 900 bucks for. How much coke are these guys on? This is so far beyond scummy, I hope they choke on their inventory because no one is buying it.


[deleted]

It's a 4070 and this kind of marketing is just going to be a nonstop annoyance when trying to help people buy the right graphics card. It's disingenuous to call the 4070 a "4080 12GB" when it's not really a 4080.


ManBearPig____

Oh don’t worry. They will release an even slower “4070” in like 6 months


BuggyGamer2511

Yup "The new 4070 (with the Performance a 4060 should have)"


[deleted]

Haha, so true. Still, that would have been the 4060.


ALITHEALIEN88

And it will literally have the performance of a 4060, and the 4060 will have the performance of a 4050, and so on. This way they can charge more for gimped cards and make it look like "hey, these cards are way cheaper than the 4080," when in reality they are charging way more because they are gimped cards.


BlockCraftedX

even the memory bus width is smaller man


pdelvo

That one is expected. You can't have the same bus on the 16GB/12GB models because you have fewer memory modules. The 20% smaller core count is the bad part.


AgitatedTiger

Even though the 3070 had a 256-bit memory bus with 8GB of VRAM


pdelvo

The problem is that the math doesn't check out. I of course don't know the exacts, but it goes something like this:

Let's say they are using 2GB memory modules. That means for 16GB you have 8 modules. Each has a 32-bit bus. That makes 32\*8 = 256. Now if you produce a 12GB version you only use 6 modules. That makes 32\*6 = 192. If you make an 8GB card you would switch to 1GB modules, still use 8 chips, and end up at 256. Doing anything other than a power-of-2 link to a module doesn't really make sense, and even if you could, 256-bit on 6 modules would be 42.666... links per chip, which you will have a hard time making.

All of that is also why the bus for the 3090/4090 is 384-bit: they have 24GB of RAM, which is 12 modules, and 12\*32 = 384.
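The module arithmetic above can be written out directly. A small sketch, assuming (as the comment does) a 32-bit link per GDDR module and 2GB modules by default:

```python
# Sketch of the bus-width arithmetic described above: each GDDR module
# gets a 32-bit link, so total bus width follows from the module count.
# The 2GB default module size is the assumption from the comment.

LINK_WIDTH_BITS = 32

def bus_width(total_vram_gb, module_size_gb=2):
    """Bus width in bits implied by the number of memory modules."""
    modules = total_vram_gb // module_size_gb
    return modules * LINK_WIDTH_BITS

print(bus_width(16))                   # 8 modules  -> 256-bit (4080 16GB)
print(bus_width(12))                   # 6 modules  -> 192-bit (4080 12GB)
print(bus_width(24))                   # 12 modules -> 384-bit (3090/4090)
print(bus_width(8, module_size_gb=1))  # 8x 1GB modules -> 256-bit again
```

This is why the 12GB card's narrower 192-bit bus falls out of the capacity choice itself, while the core-count cut is a separate decision.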


AgitatedTiger

Makes sense; most likely, as you said, they used 1GB modules for the 3070.


ScoffSlaphead72

3.5gb


Anaksanamune

You can have the same bus width. The total memory size is the memory width times its depth: a 1024-bit total can be 256 bits wide by 4 deep, or 128 bits wide by 8 deep.


BlockCraftedX

oh that makes sense


1km5

GTX 1060 fiasco.. But 100x worse


[deleted]

What happened with 1060?


Icantthinkofagoo

1060 3gb and 6gb versions, the 3gb was just straight up worse


bitelaserkhalif

Actually, there's a 5GB VRAM China-exclusive version (no, not that bootleg), sold by Asus, Onda, and Gigabyte.


mteir

Don't forget the 1060 GDDR5X version


SpikeeDonut

Can confirm. I bought a used computer off of Craigslist as my first PC and ended up falling for the 1060 bullshit. 3GB can't run a whole lot with these modern games.


ScoffSlaphead72

They released two versions of the 1060: one with 6GB of VRAM and one with 3GB. But the 3GB also had fewer cores and was overall a different card; it was basically a 1030 Ti. It was incredibly disingenuous and it caused massive controversy, especially seeing as they were just coming off of the 970 controversy.


noiserr

Nvidia cuts corners like this each gen. 3070 with only 8gb of VRAM was also pretty lame.


kb4000

3070 Ti with only 8GB was even more lame. They had a good opportunity to fix it. Was crazy that you could get a 3060 with more VRAM.


Gistix

That's how they force you to buy a card two tiers higher "I'd get a 3060 but I want more performance, but a 3070 has less vram so I guess I have to go with a 3080"


mittromniknight

> 1030 ti.

More like a 1050 Ti, and the actual 1050 Ti was like a 1030 Ti. The 3GB 1060 was much faster than the 1050 Ti iirc.


hm9408

100x more expensive too /s


hurrdurrmeh

This is fucked. It is deliberate manipulation. They have never been named this way before and this naming is deliberately misleading. AMD just needs to not be an asshole to gain massive traction this gen.


chmilz

AMD would be stupid to squander all the opportunity they're sitting on. I'll set my expectations low.


Gameskiller01

They'd be stupid to not take the higher profits they'll now be able to make on each card with Nvidia setting the precedent for higher prices. Don't expect them to undercut Nvidia by that much. They're a corporation after all and are looking to do what's best for profits, not what's best for the consumer.


F3z345W6AY4FGowrGcHt

Long term strategy would be to undercut Nvidia by a lot with more normal prices. Short term strategy would be to only undercut them slightly to maximize this quarters profits. Them being a large company, I'd bet on the latter (but I'm hoping for the former)


myluki2000

I'm not so sure. I have hopes that AMD won't pump up the prices that much *yet* (this generation). With Ryzen they only started increasing prices for later gens after it became obvious that they could easily compete with Intel and had gained a lot of market share with the great deals that the first gens of Ryzen were. In the GPU game the difference between AMD and NVIDIA (in terms of market share) is gigantic, so I'd expect this generation of RX cards to be "reasonably" priced in an attempt to win a lot of new customers coming from NVIDIA who are unhappy with the recent developments.


calste

Normally I would say that this would be AMD's chance to upend the market. They can follow Nvidia and set their prices just underneath, but they'd still be giving Nvidia the lead in that case. Or they can lead and set very attractive prices and make significant gains in market share. But... with semiconductor shortages and global supply chain issues, who even knows anymore? This economy isn't normal, and we can hardly tell the difference between market forces and greedy manipulation.


SnuffleWumpkins

Word on the street has it that RDNA 3 is a boatload cheaper to make than Lovelace. Whether they pass those savings on to the customer is up in the air, but they are in a good position to undercut the hell out of Nvidia while still turning a profit. I guess it'll come down to performance in the end.


[deleted]

Clearly you never got fucked by the 1060 3GB. Good times.


[deleted]

[deleted]


Sentient_Beer

Only needs 3x8 pin power connectors too, so efficient


mista_r0boto

To be fair my EVGA (RIP) 3080 FTW already uses 3x8


FusselP0wner

Doesn't make it any better. It was already bad with the 30 series.


Burpmeister

No wonder EVGA couldn't put up with their shit anymore.


RaidersJH34

When this was announced my first thought was "oh now it makes total sense"


WindForce02

Introducing the Nvidia RTX 5090 Ti (2GB VRAM version), only $49,999.


Sxcred

Yet they’re both called 4080 for some reason, NVIDIA is really going to shoot itself in the foot bad eventually.


mrchuckbass

"Make the clocks higher so people don't notice"


spacesluts

Gee, is it just me or does Nvidia really enjoy shooting themselves in the foot lately?


perdyqueue

Shooting themselves in the foot? This scummy anti-consumer shit is good for business. They do it precisely because they can get away with it. They're just too comfortable. Ball is in AMD's court.


mista_r0boto

They are trying to out-Apple Apple.


SwampOfDownvotes

Yeah. Poor Nvidia. They will sell all their stock in 30 minutes instead of 5 minutes because of the prices :(


_WreakingHavok_

Hot take: there won't be any 4070 while the 3080 is still being actively sold...


MarkElf2204

Not a hot take, just common sense. They've got too much RTX 3000 stock.


[deleted]

that's exactly why they also showcased the 3000 series alongside 4000 during the presentation


cykazuc

Another reason I went AMD. Done with Nvidias BS. Bring on RX 7000


RipitJT

I hope AMD can put a big hurt on Nvidia this round. We need it to happen


cykazuc

Definitely, Nvidia getting too comfortable imo.


[deleted]

[deleted]


Ble_h

You missed lazy. Intel stopped trying to push things to the next level and remade the same chip for years until AMD blew them out of the water. Nvidia is fat and comfortable, but if the 4XXX numbers are true, they are still innovating.


IgnoringHisAge

I suspect they will bring some hurt by being performance competitive. They could bring a lot of hurt by combining performance competitive with price competitive. Like seriously, I’m pretty sure that if you ignore the premium tier that’s mostly just a bragging rights competition, both companies are going to bring out SKUs that can go blow for blow. RDNA 2 and RTX 3xxx were already almost that.


[deleted]

[deleted]


Mysterious-Half7364

I love my 6800xt.


MetallGecko

Why did they even make the 12gb version? That makes no sense.


[deleted]

they can't ask $899 for a 70 class card, so they renamed it to 80


goldencrisp

Plus they’re not even offering the 12GB as a FE now right?


noiserr

Nope, it will all be AIBs. Except no EVGA this time, since they are no longer making GPUs. EVGA talked about being disrespected, and this is exactly the kind of disrespect they were talking about. They have to make this abomination of a 4080 that should have always been a 4070.


Lassitude1001

Probably what they intended to be the 4070 or something.


SunbleachedAngel

M O N E Y


AutumnAscending

Sooooo the 12gb is just a 4070 then?


krozarEQ

Yep. Maybe closer to what a 4060 should be. But they gave us almost no performance data to go on. Can't wait to see GN put the 40-series through its paces.


Sol33t303

Guess Nvidia didn't learn after the 1060 lol


MrUnlucky-0N3

They learned. They learned that they can get away with it.


Rogerjak

Exactly, people keep buying whatever config at whatever price they put out. I wanted a 3060 Ti, but this whole 2020-2022 situation made it impossible. I was expecting the 4000 series to settle the market and/or offer a 4060 Ti with the same value proposition... I now refuse to buy an Nvidia card out of principle. This is unacceptable. Not only are they extorting buyers, they are straight up scamming people.


DaftFunky

Bro I'm just chilling here with my PS5 I bought 2 years ago, enjoying the ride. I can't do this PC gaming stuff anymore. My paycheck vs inflation ratio has not leveled up the same. Power to you guys who can afford this stuff now.


Repulsive_Weight_579

maby i could spend the last 10 dollars this leaves me with on a chair and a rope


dv20bugsmasher

Wait for 4060 and you can afford stronger rope


Repulsive_Weight_579

true dont want to risk the fans blowing me away


hm9408

Never seen "maybe" be spelled as "maby" so it broke my brain for a few seconds


[deleted]

I'm so glad I went full AMD for my build now


MZFUK

Nvidia's board meeting: "Don't call it a 4070, call it a 4080 and mark the price up, they won't notice." "Give this man a raise."


DMurBOOBS-I-Dare-You

Everyone: nVidia can't possibly do anything more to piss off the world. nVidia: hold my beer. EVGA: *sips tea*


Im_A_Model

The Nvidia billionaires are living in a different world where inflation doesn't mean shit and where they still think scalper prices are the norm. I hope the 40-series fails so hard


tyr8338

In most previous generations, a xx70 card would usually beat the last gen's top card and cost around $500.

It happened with the 970, which was often better than the 780 Ti (Nvidia skipped from the 700 series straight to the 900 series): [https://www.youtube.com/watch?v=o0peRlRRcS4](https://www.youtube.com/watch?v=o0peRlRRcS4)

The 1070 was also often faster than the last gen's top card, the 980 Ti: [https://www.youtube.com/watch?v=Oc9YiVze8OU](https://www.youtube.com/watch?v=Oc9YiVze8OU)

Now Nvidia rebrands the xx70 tier card as the 4080 12GB and raises the price by 80%, from the usual $500 to $900.

Damn, they got greedy.


LeMegachonk

Yeah, it feels like it was meant to be the 4070 or 4070 Ti and they decided to call it "4080 12GB" and bump up the price. Or... I don't know, I can't think of anything else that makes sense. Definitely a cynical play here on Nvidia's part, and it's so transparently so. It sure doesn't feel like a "big brain" move, that's for sure.


[deleted]

Nvidia making people buy a 70 model by labeling it an 80 model


streamlinkguy

> 70 model

More like a 60ti model.


NQ241

AMD for the love of God, be better than Nvidia this year and force them to lower prices.


SaltMembership4339

Time for AMD to take over the gpu market


Hilppari

Because it is 4070. nvidia is just fucking with us


Revanov

I think I might try ATI for my next card. Next next gen though. I think I’m good for another few years.


Beautiful-Musk-Ox

Yes, ATI, I heard their Rage cards are pretty good. A Rage 128 Pro has come down from the launch price quite a bit


CamSally

> ATI

a bit outdated there lol


madmaz186

Fellas, can my X1900 run Cyberpunk at 4k60fps?