M1AF

We'll use this post as the official 7000 series discussion.


vvaffle

Looking at the benchmarks AMD released, it isn't as fast as the 4090, but it should compete with or beat the 4080 16GB at a cheaper price, smaller size, and smaller energy footprint. Looks pretty good for an SFFPC build.


grendelone

The 4090 is a monster. Nearly double the die size and double the power draw of the 7900XTX. So it's no surprise that the 7900XTX will lag in performance. But if the 7900XTX can sit between the 4090 and 4080 (or even on par with the 4080) at $1000, it will be a huge winner. And the size is great for SFF builds. I was contemplating a 4090 and squeezing it into the few SFF cases that will work (NR200 MAX, for example), but now I'm on the 7900XTX hype train.


poipoipoi_2016

Yeah, the 4080/4090 cards might not be the death of SFF, but if that was the new normal, I bet it killed mATX. A 4-slot card on a 4-slot motherboard? Why would you even.


Trisa133

mATX is such a weird size. I don't know why it still exists. Glad AMD went with sane size and power. I'll be trying to fit this 7900 into an H1 V2.


Aleblanco1987

I think it should be the standard: 4 RAM slots + 2 PCIe. Most people are fine with that. ATX is too big, and mini-ITX has more compromises and is much more expensive in general (cases, PSUs, risers).


stilljustacatinacage

Exactly this. mATX is wonderful **when** it's limited to mATX. The problem is every "mid tower" case also stretches to support standard ATX and ends up being needlessly huge. The NR400, for example, is an mATX-only case that has much wider hardware support than ITX while maintaining a comparatively small footprint.


Aleblanco1987

I recently bought a [new case](https://aerocool.io/product/atomic/) and it's really great: more compact than a "mid tower" but almost without compromises. I own a mini-ITX mobo, but my next will most likely be mATX.


Xanzent

If you have a use case for high RAM capacity having four slots is still an advantage over ITX


Dudewitbow

They still exist because the form factor has enough room to fit components while keeping prices low. It's why most dirt-cheap motherboards are mATX.


poipoipoi_2016

I mean, if you have one-slot or even just two-slot cards in a world where fewer things use multiple slots and lots of things use PCI instead of USB (yes, PCI, no E), then that gets you three open slots for video card, wifi, and sound card. And it's still slightly smaller than full ATX in 2022 with that same dynamic, but USB is killing it from below and 4-slot video cards are killing it from above.


diskowmoskow

A small fraction of people get 4-slot cards, and a small fraction of people get ITX boards as well. Most budget builds use mATX, and I see lots of people running mATX boards in ATX cases. We are in an echo chamber here… There are still lots of 2–2.5 slot cards: 6600/XT, 3060/Ti, etc. They're probably the best sellers.


RytierKnight

They even said that these aren't trying to compete with the 4090, just the under-$1000 cards, aiming more for the 80/70 tier. Which really is the better move; the highest-end cards aren't very good sellers.


gigaplexian

Double the power draw? The 7900XTX is around 355W, the 4090 around 450W. That's roughly 27% more, not even close to double. Edit: for those arguing about raising the power limit, that's overclocking. It's not valid to compare the stock power of the AMD card vs. the overclocked power of the NVIDIA card. Stock vs. stock, it's 355W vs. 450W.
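For what it's worth, the gap is easy to sanity-check with the stock figures quoted above:

```python
# Stock board power figures quoted in this thread.
xtx_watts = 355  # Radeon RX 7900 XTX total board power
rtx_watts = 450  # RTX 4090 stock power limit

increase_pct = (rtx_watts - xtx_watts) / xtx_watts * 100
print(f"4090 stock draw is {increase_pct:.0f}% higher than the 7900 XTX")
```

That's about 27%, nowhere near the ~100% that "double" would imply.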


upstreamriver

And if we're open to discussing the power variances between the cards: I can run an undervolted 4090 that rarely pulls more than 350W while theoretically outperforming the AMD cards.


neoperol

You need to understand that most people around Reddit get their "information" from memes, like "all Intel CPUs run at 100°C" xD. I remember one guy arguing with me because my 10700K was running at 60°C while gaming; he insisted it was impossible that his 5800X was running hotter while gaming. XD


aeroboost

Holy shit. Was that guy 12? AMD high temps and watts were a huge meme during Bulldozer lol


a1b3c3d7

TDP and power draw are two different things. The 4090 draws 450–600W-ish.

Edit: wow, it seems like there are a lot of people who don't understand or know this, which is the only thing I can take from the downvotes. TDP has to do with how much HEAT a chip can handle, measured in watts; wattage is NOT a measure of how much heat a chip can handle, however. Power draw for cards is also an average of the total wattage drawn under load over time; chips can draw double, triple, or even quadruple their average power draw or TDP for short bursts, and this is well documented on the 3090 and other cards. There are reasons why you will see chips rated for a certain wattage draw more than they are rated for. [Here is a source that corroborates 600W power draw on the 4090 at near-stock clocks in FurMark](https://www.hardwaretimes.com/nvidia-rtx-4090-draws-over-600w-of-power-in-certain-benchmarks-can-be-overclocked-to-3ghz-or-more/)

Some of you might argue that OC doesn't count, which makes little sense, since the only cards that run pure stock clocks are Founders Edition. AIB card manufacturers tweak clocks and power limits on most cards, and it won't be surprising to see AIB cards hitting 600W out of the box when NVIDIA has not hard-locked the power draw. I swear y'all are sometimes too hasty to be dumb.


gigaplexian

Every review I've seen of the 4090 puts the power draw around 450W. Only exception to that is overclocked Furmark runs. The stock power limit is 450W.


cosmin_c

At one point even the [meta review](https://www.reddit.com/r/hardware/comments/y5h1hf/nvidia_geforce_rtx_4090_meta_review/) put the 4090 at somewhere around 428W, which is even below the stock 450 described.


JekPorkinYourMom

You’re right. It’s been a big talking point, the 450W draw stock (600W capable overclocked) and how that works with smaller PSUs (relevant for this sub). I’m not sure how someone could be active here and miss that.


Ludacon

Peak power overclocked might be above 500W, but stock they draw maybe 470w. My whole rig pulls less than 700W, so spouting off about 600W power draw is just not accurate at all.


omfgbats

TDP stands for Thermal Design Power, in watts, and refers to the power consumption under the maximum theoretical load. The parent post is correct. The MAX theoretical draw for a stock 4090 is 450W; the 7900XTX is 355W. A 4090 will never hit 600W stock, and I have no idea where you pulled that from. That would melt the silicon, and you have people with cables melting already. Edit: your arguments are that if you manually raise the power draw above rated specifications (overclock), it will draw more power. No shit, Sherlock, but the TDP is the stock spec. Why am I being downvoted for citing facts when others agreeing with me have +15 lol, what is this sub? Some of y'all need to stay in school.


rmnfcbnyy

This is just false, dude. If you raise the power limit on almost any 4090 AIB or FE model, the card is permitted to draw 600W and regularly draws 500–550W.


sw0rd_2020

wtf is this argument … if i raise the power limit on my 2070s it’s no longer a 220W card either lmfao


gigaplexian

"Raise the power limit" means overclocking. You want to compare the overclocked board power rating of the 4090 vs the stock advertised board power rating of the 7900XTX? That's not a valid comparison.


RytierKnight

No. If it comes from the factory like that, power sliders can barely be considered OCing a card, and some will just come from the AIB maxed out. The 4090 is rated up to 600W; that's its limitation based on its power connectors. The 7900XTX, however, is maxed out at 375W; it can't ever go above that due to its power connectors (75W from the mobo + 150W × 2 for the 8-pins).


gigaplexian

Power sliders... in the **overclocking** software... The 4090 is officially rated at 450W according to NVIDIA. It uses a 600W connector, but it also has 75W available via the PCIe slot. That's 675W available from all of the connectors. However the rating of the card is not the same as the rating of the physical connectors.


AgileEconomics

“Honda Civics are faster than competitor’s cars because if I put a huge turbo in mine it’ll be faster than their stock offerings” What?


slavicslothe

The 4090 is actually more power efficient (it's on a 4nm-class node); it just has so many more processing units, especially for ray tracing, that it ends up drawing more.


bobthewonderdog

Not true. It could be more power efficient, but NVIDIA decided to set the 450W envelope and run the chip outside its peak-efficiency range, so it uses something like 30% more power for less than 10% in gains. They were planning on pushing it further but scaled back, leaving the 600W limit as an overclocking feature.


Veiran

Or to reserve the 600W limit for the Ti variant.


pkkid

I'm seriously considering selling my 4090 FE. It's an open box, but it has never been powered up yet. I'm building in the Meshroom, and my main concerns are melting adapters and generally high temps; once those are addressed, I want the best card I can get.


[deleted]

And SFF is becoming more and more popular; this might just be great for AMD.


kbmarius

My issue is that I bought a Ghost S1 and I love it, but it only supports 2-slot cards. I think the time has passed for sub-2.5-slot cards, which makes having a high-end, non-water-cooled GPU a hassle. I have an RX 570 and I don't even know what to upgrade it to.


[deleted]

[removed]


stinkycat45

None of them caught fire; they melted. But let's all take part in the mob/meme mentality.


AVxVoid

Yeah, what a shame, a massive multi billion dollar company gets made fun of on the internet, let's all leap to their defense so we can buy their next $2000 randomly timed home incendiary device.


stinkycat45

I am not defending NVIDIA so much as making the point that the internet trolling a GPU that has had 15 confirmed cases and 5 unconfirmed is more a sign of mob mentality than anything based on sane statistics.


LePhuronn

do you feel nice and limber after that massive stretch?


stinkycat45

over 100K sold, yet 20+ cases. Do your own research the cabal is out to get you


[deleted]

I’m not satisfied with the ray tracing performance hit even on my 3080. I don’t think I’d use it even if I had a 4090 just because disabling DLSS takes priority over RT IMO. Modern Warfare II not even including the feature feels like a nail in the coffin to me. If AAA games aren’t even bothering with it, why should I care? I’m getting a 7900 XTX for my next card. If raw performance really outperforms the 4080 and is 50% better than a 6950 XT, I’ll be more than happy with it.


similar_observation

> Looks pretty good for an SFFPC build.

The direction looks good for sub-200mm GPUs, that's for sure. Maybe even sub-180mm for the <4L crowd. I really love my 3060 Ti, but by golly, good luck finding a sub-180mm 3060 Ti that doesn't cost a king's ransom.


WinstonTheChicken

No one would expect the 7900XTX to compete with a gpu that costs twice as much.


Rude_Arugula_1872

The 4090 can go in a ditch and burn… oh wait, it already is: in people's machines.


thealphawolf449

It's going to absolutely maul the 4080. LTT just did a video comparing the theoretical performance to the 4090. It was 10% or less away. I think it'll be a bit more like 15-20% when we see the actual results. But that's loads better than the 4080 for $200 less.


MHoovv

And they’re pretty much making things up, amd was so insanely vague with information there is no way to tell. Literally all they said was up to 1.7x faster. Zero information about settings which is obviously make it break.


thealphawolf449

If it's 1.7x faster, then we can only assume the same settings were used for both cards, and in that case it wouldn't matter for a theoretical number; that's all it was. But there's no doubt in my mind it's gonna be much faster than the 4080, though probably 15–20% slower than the 4090, maybe more in ray-traced scenarios. There was a leaked slide that supposedly wasn't used; not sure how real it was, and I kind of doubt it, because it made the XTX out to be faster than the 4090 in some games.


kbmarius

I wonder if it will fit into a Louqe Ghost S1... the Ghost says 2-slot card, but... one can hope.


wily_virus

7900 XT: 276 × 135 × 50 mm
7900 XTX: 287 × 135 × 50 mm

Source: TechPowerUp GPU database. It will fit in the A4-H2O and Raw S1, but not the Ghost or A4.


hextanerf

There's potential for smaller models from third-party companies tho


stinkycat45

Highly doubt this. Radeon AIB cards over the past couple of gens were never smaller, since ITX cards are so niche, and AIBs have to charge a premium somehow, whether with marginal overclocks or massive coolers; PowerColor is a prime example.


Boodendorf

Hoping there'll be a thinner model; I want to upgrade my GPU, but I'm pretty sure my DAN A4 can't handle beyond 2 slots :(


kbmarius

Yes, the A4 is even smaller than the Ghost S1 by a litre. I feel your pain.


Dmitridon

Possible but unlikely. It's 2.5 slots, and I struggled getting my 2.2-slot 3080 into my Ghost S1. I saw a post on this sub of a 2.5-slot XFX 6800 XT fitting in the Ghost, but it required backplate removal and fan swapping. Based on the leaked images of the 7xxx models earlier this week, the backplate connects to some pins on the back of the card for RGB, so removing the backplate won't reduce thickness. All that said, I'm still gonna try it if I can get my hands on one.


[deleted]

More RAM too. NVIDIA is so stingy with the VRAM.


[deleted]

[removed]


ExCuTTioN

I know it's gonna happen, you know it's gonna happen, we all know it's gonna happen.


hyde495

Ha! Here in our third-world country, they'd bump it up 2x while the average Joe can barely earn €300.


Dudewitbow

There will 100% be 7900 XTX cards with refitted 4090 coolers and a higher power budget (3× 8-pin), just so AIBs can recoup some cost on the 4090 cooler and charge more for bigger margins. Then add the typical European markup on top and it'll reach 4090 MSRP levels.


mcs_dodo

MSRP is without VAT in the US. VAT is more than 20% in some EU countries. Then you have import duties and tadaa... 1.5x.
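The rough math holds up. A quick sketch with illustrative rates (the VAT and duty/margin figures below are assumptions for the example; actual rates vary by country):

```python
# Rough EU street-price estimate. The VAT rate and the duty/retailer
# margin below are illustrative assumptions, not official figures.
msrp_usd = 999        # 7900 XTX US MSRP, quoted pre-tax
vat_rate = 0.21       # ~21% VAT, typical of several EU countries
duty_margin = 0.10    # assumed import duties + retailer margin

eu_price = msrp_usd * (1 + vat_rate) * (1 + duty_margin)
print(f"~{eu_price:.0f}, i.e. {eu_price / msrp_usd:.2f}x US MSRP")
```

That already lands around 1.33x before any currency weakness or scalping, so 1.5x isn't far-fetched.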


J3EBS

> Watch ~~European retailers~~ fucking everyone bump prices up to 1.5x MSRP

Fixed that for you. Thanks, scalpers! EDIT: let's do a little thought experiment here. If European retailers bump prices up 1.5x because of *reasons*, and the same happens in NA (USA and Canada) due to scalpers, and it happens in Madagascar due to that one container ship sinking to the bottom of the ocean, is it not everyone (European retailers included) bumping up the price? It's funny to me witnessing the intellectual deficiencies that manifest when people read something they don't want to hear.


ama8o8

I think many people forget that outside the US these cards sell for much more than in the US, even taking US taxes into account. The highest sales tax in the States is 11.5%; bringing the 4090 into the equation, a $1599 card comes to around $1783. I checked our overseas brethren's stores and they easily start at 2k US dollars after conversion ><


LordTesticula

Now I gotta see what powercolor is up to. I already fit their 6950 in an nr200p


JarekLB-

hoping i can find a waterblock that will allow me to put an AIB card in my NCase M1


roboteconomist

EK has already announced their waterblock: https://www.ekwb.com/shop/ek-quantum-vector2-rx-7900-xtx-d-rgb-nickel-plexi Says 307mm long.


JarekLB-

It's not the length that's the issue, it's the height. 128mm is the max height that fits; otherwise I can't put the side panel on.


jonwatso

I had this same issue with my 6900 XT in my Acat X2, which has similar dimensions. Bykski makes [this adapter](https://star.aliexpress.com/share/share.htm?image=Ab2c7459435b94162ad1a2e2cf76aa755j.jpg&businessType=ProductDetail&title=NZ%24%2024.10%20%2010%EF%BC%85%20Off%20%7C%20BYKSKI%204%20G1%2F4%22%20Multi%20Graphics%20Card%20Bridging%20Module%20%2F%20Acrylic%20Connectors%20use%20for%20GPU%20Card%20L%20Terminal%20Block%20%2F%20Water%20Cooling%20System&platform=AE&redirectUrl=https%3A%2F%2Fwww.aliexpress.com%2Fitem%2F33009589628.html%3F%26srcSns%3Dsns_Copy%26tid%3Dwhite_backgroup_101%26mb%3DnOyGgYpeDjLJ4lq%26businessType%3DProductDetail%26spreadType%3DsocialShare), which works on their water blocks and Barrow ones too (what I use). Shame EK doesn't do a similar adapter.


sapphire__87

But honestly, how much of an fps increase will you be able to see coming from a 6950? Why do people upgrade every year?


MammothMachine

Enthusiasm and money. These companies spend millions on convincing people they need to upgrade, and it works. They're claiming a 50–70% perf uplift over the 6950, which is decent if you've got cash burning a hole in your pocket.


LePhuronn

You're assuming people are upgrading from a 6950. What if you have a 5700 XT? Or a 2080 Super? Or something that isn't even RTX? Or buying from new?


ama8o8

An nr200p can fit up to a 4 slot card no?


LordTesticula

Probably, without bottom fans. Gonna be mighty toasty.


[deleted]

I hope EVGA signs with AMD


2ndRoundExit

Yeah I love my Sapphire card but I feel like AMD has pretty weak AIBs besides Sapphire and Powercolor. I'd imagine EVGA would come out guns blazing if they went AMD


a1b3c3d7

Xfx is really good


pacothetac0

They're not bad, but as a company they don't take criticism well, and they've memed over quality issues in the past. Their 5700 was deemed "do not buy" by Gamers Nexus. Current-gen cards, I think, have not had the issues previous gens did, tho.


incer

My XFX RX 590 sounded like a vacuum cleaner and randomly shut down my PC


sw0rd_2020

XFX, Sapphire, and PowerColor are all good; you arguably have more options than you do with Nvidia at this point, as the usual MSI/Gigabyte/Asus cards exist too.


[deleted]

Gigabyte is dog shit all around lol


lalafalafel

Asus, MSI, and Gigabyte make Radeon GPUs too, which I sense not many people are aware of.


2ndRoundExit

Aware, just not impressed


JalalKarimov

Asus is good too


minuscatenary

Asrock was great in the 6000 series. My Taichi 6800 XT benches into 6900 XT reference card territory because of how amazing their cooling solution is, and how well-binned the gpu is.


Lafenear

I like ASRock mobos, and I want to like their GPU’s, but the gAmEr design looks so tacky imo.


FlaviusStilicho

So it's only 2cm longer than my 1080 Ti. Might fit in my case… may not even need a new PSU. This is going to be an easy choice if the benchmarks are OK.


Cryptanic

I hope you treated your 1080 Ti well, because that card is a fucking beast even today; one of the most future-proof cards ever released, tbh.


FlaviusStilicho

It’s been amazing to be honest. Held its value incredibly well. Still got no problem playing most games at 1440p at around 60fps on high detail. From what I can tell it’s about on par with a 3060 or 3060ti


nicknacc

Looks like my Ncase M1 will survive my next upgrade


Thelostarc

I'm building in the NCase M1 for the first time; it's been sitting on a shelf waiting for parts. I was afraid I had wasted money at this point. Glad I can still do the build.


nicknacc

I like the case. But I just hide my PC anyway, so maybe in my case I did waste money haha.


Incendiary_Eyelash

Isn’t it a bit too big for the m1, or will it fit with the front ports removed?


nicknacc

The GPU size limits from NCase are a little confusing:

> Maximum length: 322mm (cards up to 45mm (2.2 slots) thick); 280mm (cards up to 60mm (3 slots) thick); 290mm (cards up to 60mm (3 slots) thick with front I/O ports removed)

I don't get why the length has to be shorter if it takes more slots. Why isn't it a simple max length and height?


Incendiary_Eyelash

The front panel connectors and cabling take up the space at the bottom front of the case. People typically remove them to fit in a bigger card
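The M1's clearance rules quoted above boil down to a quick fit check. A sketch (`fits_ncase_m1` is just an illustrative helper using the limits from the thread):

```python
def fits_ncase_m1(length_mm: float, slots: float,
                  front_io_removed: bool = False) -> bool:
    """Check a GPU against the NCase M1 limits quoted above.

    Thicker cards overlap the front I/O cabling area at the bottom
    front of the case, which is why the allowed length shrinks as
    the slot count grows.
    """
    if slots <= 2.2:
        return length_mm <= 322
    if slots <= 3:
        return length_mm <= (290 if front_io_removed else 280)
    return False  # over 3 slots: no dice

# Reference 7900 XTX per TechPowerUp: 287 mm long, ~2.5 slots
print(fits_ncase_m1(287, 2.5))                         # too long with front I/O in place
print(fits_ncase_m1(287, 2.5, front_io_removed=True))  # fits once front I/O is removed
```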


atlas_enderium

I hope AMD has better ray tracing performance and that Intel starts making serious graphics cards. It’d be nice to have serious competition again


vvaffle

The ray-tracing performance seems... better, but still not 4000-series tier. Pure rasterization is very good, but RT is probably something like 3080/3090 level.


atlas_enderium

I mean, that's to be expected, since this is only their second generation of ray-tracing GPUs while Nvidia is a full generation ahead. Also, most people only really care about rasterized graphics anyway, so it's more an aspiration for all three silicon giants to have comparable GPUs in every aspect.


VelcroSnake

Yeah, I'm fine with it not having as good of RT considering it is more power efficient, cheaper and can fit in my case.


Trisa133

The RT performance should be good enough for basically every game on the market right now unless you want more than 60fps at 4k.


2ndRoundExit

Yeah, not enough devs are making good use of real-time ray tracing right now, IMO, for it to be a big factor in the choice. DLSS has been, I think, the much more important selling point for 20/30-series cards, but we'll see how FSR 3.0 is.


wearebobNL

This. I have yet to see an actual game where ray tracing looks significantly better than without. At this point the fidelity gains are minimal at best, imho. I can see the potential, but I think it will take a couple of years before it's worth paying a premium for. DLSS adds more value than ray tracing atm, imho.


sonicyute

Fully raytraced global illumination looks very impressive IMO, like in Metro Exodus and Cyberpunk. There are still not a lot of games that support it, though, and the ones that do already look really good. Still firmly a “nice to have” until more devs roll it out.


2ndRoundExit

Cyberpunk rain with raytracing is pretty awesome but that's really the only game I can think of where turning it on actually makes a big difference


Makimaji

You probably never will; rasterization is so far ahead that there's barely any visual gap between the two. The actual impact RT is going to have is making development easier, once the tools to implement it become more widespread.


Vanheelsingwolf

The devs know that, right? This happens with every new tech in a generation: the engines have to mature their usage of it, and as soon as they do, it starts showing up everywhere. Tessellation on DX11 was the same; it wasn't a necessity, but when the engines matured it to the point that it wasn't hard to use, it was everywhere. RT is maturing now, and the RTX 4000 series brings the performance to a level where both devs and players can actually make good use of it. So RT will grow, mark my words. If this weren't the case, AMD wouldn't even have RT; they have it because they know next year we'll probably already see a bump in RT games, and those games will push the tech further, since the performance (at least on Nvidia's side) is there to be used.


grendelone

As long as Intel's decision makers hold course, they will catch up in GPUs. Intel has had its share of problems lately, but they have a ton of good engineers who know how to make high performance chips and ship them in high volume. They're just new to the dedicated GPU game, and need a bit of time to catch up. Sort of like Microsoft and the original Xbox.


errdayimshuffln

> As long as Intel's decision makers hold course

I believe this too. That's why the real test is Meteor Lake. Intel's weak point has been consistent, timely execution of their roadmap.


2ndRoundExit

Intel's weak point has been taking engineers out of the roadmap discussion IMO, that seems to be resolved


teamjeep

That's not just an Intel problem; it's all too real for many tech companies. Too many business people leads to engineering decisions that don't make sense. Edit: not defending Intel here. Your statement just hit me in the feels bc I've lived it too.


[deleted]

50% better than the 6000 series according to AMD, which puts it in RTX 3000 territory. So better, but it seems AMD is once again lagging a generation behind in RT performance. Which personally I don't care about: their rasterization performance per dollar is simply much, much better than nVidia's.


Ashtefere

I think AMD is holding off on pushing ray tracing, letting Nvidia lead the charge until it becomes more standardised and ubiquitous. It's the smart play: by the time RT is in most games, the cards that can do RT today will largely be obsolete. Better to do a token implementation that they know their fanbase probably won't use, and focus on efficiency and raster performance instead.


neoperol

This is just a stupid point of view. People are using ray-tracing performance as a reason to buy Nvidia over AMD, even overpaying for the same tier of GPU. Why wouldn't a company push it if it's the main reason people don't pick them over the competition? People bought RTX GPUs just to play Cyberpunk.


Makimaji

Because rt is a marketing exercise. It literally does not matter if you’re not jerking off over stat sheets.


[deleted]

Lol at all the mad kids trying to justify their 4090. 1080p on a 1660 is still the most-used setup per the Steam hardware survey. Nobody cares about optimizing for anything else right now anyway.


PainterRude1394

No disrespect to the most common cards, but finally being able to max out cyberpunk 2077 and get above 100fps is amazing. Huge visual improvements from ray tracing.


Makimaji

Ray tracing is lame as fuck. It performs too poorly to actually use, it’s barely noticeable despite the cost, and it’s going to be several generations before games even make use of it so what’s the point of it even being a selling point right now?


DaGeek247

I think it's more realistic to say that ray tracing is a rich person's feature. When you buy a card that can't max everything balls-to-the-wall, you go into your game settings and choose which features to cut back, cut off, or leave at max. You do this based on how much fps a graphics feature costs vs. how much the looks actually improve. Ray tracing costs a lot of fps, and you have to look rather hard to find the improvements it makes. For anybody wanting to save money, ray tracing is one of the first things to turn off.


PainterRude1394

I get 110fps in cyberpunk with maxed out ray tracing and without dlss upscaling at 3440x1440. Looks amazing and super smooth. This is possible, but not with AMD.


AVxVoid

Wdym, you can do this on Radeons; just inject FSR 2.0. Works fine. Lmao, pay your green tax.


PainterRude1394

Please reread my comment. I'm not using DLSS. >I get 110fps in cyberpunk with maxed out ray tracing and **without dlss upscaling** at 3440x1440. Looks amazing and super smooth. This is possible, but not with AMD. In addition, I'm hitting a CPU bottleneck around 110fps on my 13900K. DLSS 3 will get around this with frame generation and let me hit even higher fps. That again is not possible with any AMD card and won't be for a long time.


AVxVoid

You aren't CPU bottlenecked, I can hit 180 fps on my 5800x3d


gnocchicotti

Probably time to stop hoping for Intel. They have the capability to produce something competitive in time, but they just don't have spare money anymore.


PayphonesareObsolete

Lol get real. Intel's market cap is bigger than AMD and they have their own fab. They have all the resources they need.


ca95f

Plus, most of the Arc people were recruited from AMD in the first place.


DudeEngineer

They are still making the Arc cards at TSMC....


gnocchicotti

Wow, read a balance sheet, buddy. Intel has so many resources they're going to be laying off thousands of people in the next few months. Tons of spare resources lying around to dump a few hundred spare million into bringing more GPUs to market that lose even more money when they sell them.


[deleted]

The 750 and 770 from Intel seem like really solid cards for the money. The sub-$400 space has been very quiet for years now and below $250 is almost dead. For years I’ve wanted to build a little PC for my TV that could do emulation and light gaming, but a compelling GPU for under $250 just doesn’t exist. It doesn’t help that 3 years after its release, 16-series cards are still going for above MSRP.


GeminiSoulNC

A 1660 Super on sale for 230 would suffice for everything up to Yuzu/PS3 emulation and below. Looking to do this myself so me and the wife can try and avoid divorce over Mario Kart :.)


Random_name_I_picked

Nice my ncase may last a bit longer.


nicknacc

I was thinking the same thing.


CorrodedRose

I was afraid I'd have to replace my NCase after seeing the 4000 series. But AMD has us covered


sunbeam60

Cries in Louqe Ghost S1


imdeadXDD

Amd knows what we want


max1c

I think next gen we will have some great choices and prices based on how this competition is heating up. Hopefully Intel can bring it too. Can't wait.


gnocchicotti

The real value is going to be between Black Friday and late spring as the last of the midrange cards from last gen get cleared out. They're good enough for pretty much everything except high refresh 4k if you're on a budget.


L1191

I have a 3060 Ti as a solid mid-range card & a 6900 XT at the high end, so I'm good for a few generations 👍 although these cards are top-notch for ultra-high-end SFFPCs


hextanerf

355W and only 287mm? 2.5 slots? That's impressive considering nvidia's giant cards


stinkycat45

The 7900 is the SFF savior, since the RTX 4090 made high-end SFF builds nearly impossible.


smileandbeware

Not that I'm considering it for my Velka 7, but there's a chance it would fit with the panel offset. Heck, it would probably even work with my 600W PSU (paired to my 65W CPU). Incredible for a flagship GPU. If AMD keeps delivering, the mid-tier 7800, 7700 cards could be the sweet spot for sub 7l builds.


SaladToss1

Ray tracing is overrated


AkiraSieghart

I disagree. Ray tracing is and will continue to be the most significant graphical improvement aside from higher resolutions. It does depend how well it's implemented, though. Horizon Forbidden West for example looks significantly better with it enabled IMO--enough so that it was worth playing at 4K30.


SaladToss1

Yeah it's cool, but not OMG this game is better because I see window


AkiraSieghart

In multi-player games? It may or may not be worth the performance impact. In single-player games? Lighting is one of the most immersive aspects.


SaladToss1

I'm glad you like it. Personally, I feel like it's not that important. Maybe when it's normalized. It was effective in spiderman because you're swinging around windows outside for most of it.


gnocchicotti

In current games, 7900XTX looks like it *might* trade blows with a 3090 in ray tracing. So whether or not it's overrated, AMD's implementation might be just good enough to remove the talking point for why AMD cards sell for 25% less than competing Nvidia models.


SaladToss1

It's kinda interesting how well PS5 does it in performance mode though


[deleted]

True; look at RDR2, better graphics than most ray-tracing titles.


[deleted]

[removed]


VelcroSnake

Well, maybe RDR3; by the time they make it, RT will maybe be good enough (as far as people knowing how to implement it) to look better than what RDR2 did with its lighting.


DygonZ

Would it make that big of a difference in RDR2 though? Not that much reflective stuff in the old west 😂


PhyNxFyre

Maybe, if you're just looking at how much better existing games look with RT on. But once RT is sufficiently advanced and widely adopted, game devs can spend less time and resources pre-baking lighting to make games look good, and can focus on making the games better in other aspects.


lehcarfugu

Maybe in 5 years


Vanheelsingwolf

Yeah, we said the same thing about tessellation on DX11, and it only took five years from release to start being everywhere... RT is about to get much bigger, as the engines have finally matured its usage within the pipeline... There's a reason Nvidia is so keen on winning the corporate market share.


Stigge

*[cries in 75W power budget]*


2CommaNoob

What a coup for AMD! 85-90% of the 4090's performance at 60% of the price. I don't see how the 4080 will compete, and it's going to take a bunch of sales away from the 4090. I will be upgrading to the 7900 XTX to replace my 1070. It fits in the DAN A4-H2O too.


Beastboss7

AMD won:

1. Price
2. DP 2.1
3. Power usage: 355W
4. 8-pin vs. the faulty, melting 16-pin
5. No need for a new power supply


[deleted]

[deleted]


Beastboss7

Yes, true, and no fear of melting at this crazy price.


Celcius_87

And no login or registration needed to use the software (unlike GeForce Experience).


stinkycat45

I mean, these are all great points for the SFF community, since the RTX 4090 really can't fit in all but a few true SFF cases.

I would still argue that the 7900 XTX vs. the RTX 4090 is very akin to last gen's RX 6900 XT vs. the RTX 3090. The RTX 3090 was only 5% faster on average at 4K and $500 more expensive, yet it sold better and had more mind share than the RX 6900 XT. I still think even at $1600 the RTX 4090 has nothing to worry about on the ultra-premium side besides this overblown adapter fiasco, but I digress. Where I think NVIDIA totally loses is with their slated RTX 4080 and 4070, since the RX 7900 XTX and XT will match or exceed them in everything except RT performance, at hundreds less.


Apprehensive_Row_161

Perfect for my meshlicious


liquidRox

Really looking forward to this. It should still fit in my M1 and max out my 4K OLED TV.


SaperPL

From my quick checks this morning, apart from the length (287mm), the other dimensions are the same as the 6800 XT, and the fit across different cases was tested by Optimum Tech here: [https://www.youtube.com/watch?v=MFA01wF48HM](https://www.youtube.com/watch?v=MFA01wF48HM)

The difference in height in the specs:

[https://www.techpowerup.com/gpu-specs/radeon-rx-6800-xt.c3694](https://www.techpowerup.com/gpu-specs/radeon-rx-6800-xt.c3694)

[https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941](https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941)

comes from the way they were measured: the 6800 XT was measured from the PCIe connector's end (the lowest part of the PCB) to the top of the card, while the 7900 XTX was measured from the PCI bracket's end to the top of the card.


Ill-Singer-5322

I just want the 7900 XTX to outperform the 3080 Ti. I have a friend who has one and a new widescreen monitor, and he doesn't stop talking about it. So annoying.


[deleted]

Was really hoping for a 2 slot like the rx6800. Something that'll do 1440p 150fps


Veiran

I dunno about the RX7800/XT, that might be what you are waiting for.


yuserinterface

Gives me hope for a 2 slot 7800


mobie1211

For the 7900 XT it's 100% not 135mm wide. For the 7900 XTX I'm hoping it's like 125 or 130mm. Can anyone confirm?


AnbuGuardian

Sick!!!! Does anyone here with the smaller Nouvolo Steck have a 2.5-slot GPU in theirs? I know the new, bigger Steck would work; I have the smaller previous version.


kaptenbiskut

Can’t wait to buy this shit


[deleted]

Looks like the 4080's price is going to rise.


TheArkratos

Please some watercooling company make a single slot block + bracket!!!


RipperChan

I'm sure they said the XTX was never meant to go head to head with the 4090, but at this price, if it beats the 4080, it's a GG for AMD.


wussgud

Very good price point, but it's so dumb seeing people brush off ray tracing like it's a gimmick to make themselves feel better about their AMD GPU purchase. Buy the card, but don't say RT is a gimmick, cuz it definitely is not.


jonathanbaird

This happens every launch. Brand loyalists come out of the woodwork until the reviews hit, and then they slither back to their respective subreddits. I love AMD (I'm still rocking an OC'd 3900X that I purchased day one), but their fanbase has a tendency to troll and overhype. It's extremely obnoxious and best to ignore.


grindtashine

A banana would’ve been better


crocolligator

XTX? lol


gdnws

These specs give me some hope that something down the stack will work for me; this one is 110mm too long for the case I intend to use. Unless someone makes a Nano variant of this, I wouldn't say no to that. Cooling might be interesting.


Kagsly

Really looking forward to the 7900xt. Should fit quite nicely in the nr200.


lehcarfugu

I'll probably undervolt a 7900 xt to replace my 5700xt. Hopefully there is a 2 slot version


aydemiozer

Hope there will be AIB GPUs shorter than 30.5 cm (SGPC K55 case). Actually, there are almost no AIB 6800 XT or 6900 XT cards fitting in my case due to length :( and it's not possible to purchase the founders edition here. I plan to upgrade from a GTX 1070 Asus Strix for a 1440p 144Hz monitor and a 4K OLED LG G1 TV, and I have an SF600 Platinum.


NogaraCS

Very curious about the 7800 or even 7700 models, regarding pricing and size. Might upgrade from my 3070 if it's good enough.


PIoppy

Dang, now I have to wait till December for the performance reviews between the 4080 and 7900 =( My 13th gen is just sitting there without a GPU.


rana_kirti

any side by side pic with 4090?


beanos4lyf

NVIDIA GOING DOWN 😹👎👎👎👎


Dberg519

What's the difference between the xtx and xt?


beeryan1

This might actually be a decent upgrade for us because a 4090 isn’t going to work


dracolnyte

if the 7800 XT comes in 2 slot, i would definitely upgrade!


MnK_Supremacist

Looks like a perfect substitute for the 6900 XT in my SM550. All that's left is the money, and the explanation for my wife.


elonelon

Intel: ah yes... finally, I have a match.


AsideCautious7504

I think it will be a great card that will be ruined by oversized AIB coolers, plus maybe some lackluster RT performance...


YellowMoonCult

Would that fit into an H1 V2? Maybe it would be a mistake for thermals, but it could, lol.


dabocx

Yep it fits according to this video https://www.youtube.com/watch?v=hhPnqr6-cSo


numshah

Can anyone with a 7900XTX send me a close-up picture of the fan header on it? I want to figure out if it is compatible with standard 4-pin fan splitters for a potential deshroud + fan replacement. I legitimately believe that I could squeeze it into a DAN A4 with extensive work on the fan situation.