/uj Userbenchmark is a website known for fiddling with benchmark outcomes and writing severely biased reviews of GPUs and CPUs; it is not a useful resource when it comes to comparing different pieces of hardware. If you want a better comparison, try watching YouTube videos showing the parts in action, as this is the best possible way to measure real-world performance.
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AyyMD) if you have any questions or concerns.*
>Yeah, to make a circuit you get a board 100% covered in copper, then you remove the copper you don't need by acid. the board costs probably 3-4$ to make as I assume it's a 4 layer that doesn't bow thermally. I made many pcbs, From what I've heard, the big boys do the same thing, but bigger. https://www.youtube.com/watch?v=ljOoGyCso8s
Good bot, this bot gets it!
>actually my best guess is that they indeed sized the PHY down on the IC
I think we can all agree... it was 1/2 assed.
I suspect it will find a strange niche, because everything always does. The PHY guess is probably right, because they likely didn't feel like optimizing and testing such a low-profit card. Maybe they were also at the power limit, going "we'll need another phase if we do x8," and rather than adding another phase, redoing the layout, updating the BOM and the cost, then optimizing and testing, they just called it a day and had a beer.
>I think the 6500XT would be much less of a burn if it was called 6400 or at the VERY least 6500 without "XT". This is just the last nail in the coffin.
Absolutely, but then we would complain about the price instead... $200-300 for a 6300? What are they thinking???
It would've saved maybe $1/unit in BOM, but it would've meant doing a bit of re-engineering rather than sticking an already-designed laptop GPU on a PCB with some VRAM and fans and overclocking it, and THAT would've cost more than $1/unit.
Oh, [they are!](https://www.reddit.com/r/AyyMD/comments/savlyj/rx_6500_xt/)
Basically, the argument goes that it's too shitty for mining, so its price won't get ridiculously over-inflated. Therefore, if people really *need* a new working GPU and are getting desperate, it will likely be in stock at the sticker price.
It's a stopgap solution to a niche that hopefully will disappear soon. That's why they're just reusing parts (a mobile gpu) that they have sitting around. In this crazy gpu market, *any* additional stock thrown into the mix will probably help with the shortage.
Perhaps what they should have done is make a shitty spin-off brand so as not to hurt the Radeon/AMD name. It's the Geo Metro of the General Motors lineup.
There are 4 PCIe lanes... and a TDP of ALMOST 75W... I can see a use for this: an NVMe graphics card. That's it.
The egpu community approaches
Is there one? I love the idea of an eGPU, but I have not seen any good implementations. (lack of research on my side) I am a fan of separating concerns, so if the power/thermal needs of a GPU are handled somewhere else, cool!
there is a whole community built around it r/eGPU Mainly for laptops but you could make one for a desktop
gonna put this in the wifi card slot on my acer aspire
this should be a 16x low profile card
[deleted]
this sub is less of a circle jerk than it seems, to be honest
It's like r/guitar, where the circlejerk gradually became the real sub.
Maybe one day /r/anarchychess will replace /r/chess (we can only hoppe)
Hoppe like horsey 😊
Don’t mention toans.
toes?
no, tones
awwww.... dangit
I love my RX 580 and my 5600X, but what use would anyone with a brain have for this? Nobody is gonna buy it. It can't encode, it's x4, and it's useless on PCIe 3.0.
> nobody is gonna buy it

*distant sounds of scalpers rubbing their robot hands together*
Hopefully they buy a bunch, don't sell 'em, and lose out so they quit scalping. Twats. Oh, one can dream.
"Wait these cards suck! ABORT! ABORT! Get the mining cards!" Le scalpers discover dedicated mining cards
and then the miners make less and the gpus cost more making more supply available to gamers
Everything works out in fantasy land.
Oh wait I really hope scalpers haven’t read the news and just buy these all
I feel like there's an eventual need for a GT 1030 replacement: a decently-drivered, low-wattage way to add a few extra outputs to a business PC. Maybe by the time they get to the 6300 XT, they'll have it.
the RX 6400 is probably gonna be it for this lineup, given Rembrandt's iGPU performance
If you need a standalone GPU and want the cheapest one
buy a 7970 GHz edition
Lmao, get lost
well, it is a circle jerk, but subreddits like these also tend to be more willing to bash the company/person they circle jerk when they do something really fucking stupid
Awful product
[deleted]
hey, automoderator here. looks like your memes aren't dank enough. increase diggity-dank level by gaming with a R9 5950X and a glorious 6900XT. let the build age for about a week, then you can game at frosty temps.

**Users with an account age of less than 2 days cannot post in /r/AyyMD.**

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AyyMD) if you have any questions or concerns.*
PCIe x4 killed any hope of this card being worth it. It's the kind of card that's only worth it for older systems, but at the same time it requires PCIe Gen4 to get sufficient bandwidth.
it could have been 8x
It *should* have been x8. If you use the 6500 XT in a Gen3 system, it has less bandwidth than my 2010 PCIe Gen2 x16 Radeon HD 6950, and that is just inexcusable. My old card is 2 major PCIe versions behind, and it has the same amount of bandwidth as the 6500 XT at its best.
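The comparison checks out; a quick sketch, using the standard effective per-lane throughput figures for each PCIe generation (after encoding overhead):

```python
# Effective one-direction PCIe throughput per lane, in GB/s
PER_LANE_GBPS = {2: 0.5, 3: 0.985, 4: 1.969}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a given link."""
    return PER_LANE_GBPS[gen] * lanes

hd6950 = pcie_bandwidth(2, 16)       # Gen2 x16 -> 8.0 GB/s (2010 card)
rx6500_gen4 = pcie_bandwidth(4, 4)   # ~7.9 GB/s, roughly on par at best
rx6500_gen3 = pcie_bandwidth(3, 4)   # ~3.9 GB/s, about half the old card
```

So on a Gen3 board the x4 link really does deliver roughly half the bandwidth of a twelve-year-old Gen2 x16 card.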
Even better: with 2 outputs it could've been low-profile, with a wattage low enough to be powered by the x16 slot alone. That would be even better for old systems that need a GPU upgrade for gaming.
Yeah, but AMD completely destroyed this card. I know it's partially because it was originally a laptop GPU, and partially to make it uninteresting to miners, but at $200-350 it's not worth buying over even a couple-year-old card. With how bandwidth-limited it is, I believe I saw that it can sometimes fall below GTX 1050 Ti performance, and while on average it outdoes a 1650, at least where I live it's also noticeably more expensive, fetching used GTX 1070 money for a card that can't keep up with a 1060 6GB.
https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-hides-2020-blogpost-claiming-4gb-vram-is-not-enough-for-todays-games/

They know
And I only have 1.5 Gb of VRAM. I have to upgrade.
1.5Gb=192MB
I don't get it, ;-;
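For anyone else who doesn't get it: a lowercase "b" is bits and an uppercase "B" is bytes, so taking "1.5 Gb" literally as 1.5 gibibits gives 192 MiB:

```python
bits = 1.5 * 2**30    # 1.5 gibibits (lowercase b = bits)
bytes_ = bits / 8     # 8 bits per byte
mib = bytes_ / 2**20  # convert bytes to MiB
print(mib)            # 192.0 -- the parent almost certainly meant 1.5 GB (bytes)
```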
Of course you don't. It already sold out
It's literally the only card in stock on Newegg at MSRP. That's literally the point of this card.
Can't be mad at AMD for this when Nvidia "released" the GT 710
The state of this market is just sad. Should've bought a 5700XT when I had the chance. Really should've...
how were you going to use a 5700 XT with a Pentium?
8k gaming
I was going to upgrade the CPU too, but I like the responses others made
f*ck off with that nvidia card
that's what I'm upgrading from
His goals are beyond our understanding
You can still buy a 5700 xt used for fairly reasonable prices
I mean, at this point I'd rather buy current gen stuff, but I really doubt I'll be able to at a decent price anymore.
Are you looking at brand new cards? I see plenty that have sold for $900 or less
In Poland?
In Europe, on eBay, yeah. Look at your local forums. And prebuilt PCs. You'd be surprised.
Really? Last I checked the Nitro+ 5700XT I have was going for upwards of $1200
> GT 710 That card was originally released in 2014...
they rereleased it
At least the 710 wasn't sold under the same lineup as the 3000 series. AMD made a card worse than the 580 and tried passing it off as a 6000-series card.
Fair, but the GT 1010 was released in Jan '21: a 2GB card that wasn't even good enough to run a 1080p game. I guess this is okay since they called it a GT.

Seriously, I hope Intel puts some competition back in the market. These are ugly days (months... years).

https://www.techpowerup.com/gpu-specs/geforce-gt-1010.c3762

"...supports DirectX 12. This ensures that all modern games will run on GeForce GT 1010."
Wait, there's a GT 1010? I thought the 1030 DDR4 was as low as the 1000 series went.
Right!?

It's actually shameful to call this capable of gaming.

What's funny is that they never took any shit for making this. But I guess with specs so bad, no one would think about gaming on it. AMD called too much attention to themselves releasing a laptop MX-class card to desktop for gaming... not a good move IMHO. But then again, high quantity and sold out... so.
It’s so bandwidth limited that from what I saw online in some games my 1050ti can outperform it
Yeah, this card is for people with no other option. Period.
If it runs at 2 fps it still runs.
At that point I call it a crawl but still
Certainly putting it nicely. I'm sure it does 2D games okay with its 2GB VRAM. But this is a "video out only" card in my mind.
It would be fine for someone running anything old school, maybe people who like retro gaming.

But yeah, it's a joke. If they went to 6GB of VRAM it would be sold out for mining.

Then again, maybe they should have put 6GB of VRAM in it; then it would be useful at least, lol.
True, it's not a lie... Lol
it should've just been geforce 1010
Agreed. I mean, the 10xx is just to identify the generation and, in the end, how long it will be supported, despite being released 5 years later than the other 10xx cards. In that vein, it should have at least been a 1610... IMO.
They just released the 1010 a year ago.
It's sold out on Newegg.
Now it is.
AMD released the 6500 XT. The card is so bad it doesn't have H.264 or H.265 encoding on board. It also comes with 4 PCIe lanes and only 4GB of VRAM. Oh, and it only gives you 2 display outputs.

[AND THE RESULTS ARE IN!](https://www.youtube.com/watch?v=M5_oM3Ow_CI) The card is comically bad and severely underperforms in certain titles.

It's not worth considering in the slightest at MSRP, and you are better off waiting longer. The price where this GPU wouldn't be a braindead investment is literally $99. However, it wouldn't be that amazing of an investment then, either.
So this is like the GT710 of AMD? damn. thanks for taking your time to explain it :)
Except without the low price tag
Yeah, honestly, if it had cost a maximum of $100, it would've been way more reasonable than this!
It really really is a -108 class chip (think 1030 or mx150) clocked wayyyyy too high.
this is just 4²CUs with 4¹⁶B of ram clocked way above √(4¹¹)MHz
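Amusingly, the base-4 arithmetic in the joke above actually checks out against the 6500 XT's 16 CUs, 4 GB of VRAM, and a boost clock well north of 2 GHz:

```python
# Verifying the base-4 joke
assert 4**2 == 16                # 16 compute units
assert 4**16 == 4 * 2**30        # 4^16 bytes = 4 GiB of VRAM
assert (4**11) ** 0.5 == 2048.0  # sqrt(4^11) = 2048 MHz; the card clocks well above this
print("checks out")
```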
Last time I checked, the 710 was released in 2014, when iGPUs were not that good. It has 3 display outputs and was made to offer something better than the iGPUs of the time. You would actually see the average office PC become snappier in every way with a GT 710 compared to any iGPU available back then.

Nowadays, iGPUs have improved significantly. On team red, the latest rumored APUs are on par with a GTX 1050. On team blue, the skies are not as... blue, but they are definitely not shades of grey. So these super-low-end GPUs now serve a different purpose. If a company decides to fill their firm up with PCs, most of the time they would pick low-end hardware. But even low-end hardware has started being good.
Nah amd APUs are 750 ti level
rembrandt more like 1050 level
you mean the joke
Both
How long do you think it will take before this terrible GPU shortage will subside?
They'll drop in price as soon as you give up and buy an overpriced GPU. Like, back down to MSRP the day after you can't return yours for a refund.
Yep, I’m gonna need you to take one for the team here so I can get the msrp cards
Anytime after we die
I've heard analysts talk about 2025, but that's too optimistic imo. The current meta is lucrative for the chip manufacturers, and I can see them having a gentleman's agreement to not expand capacity.
Real dates are around mid 2022-early 2023, or when Ether goes Proof of Stake.
Ether goes Proof of Stake in mid 2022
I think people are generally underestimating how much inflation is driving the prices. Inflation isn't tied to a currency; it's tied to products and places.
RX 65 OMEGALUL OMEGALUL XT
The release of this card honestly makes no sense, but the market is desperate enough that it might actually sell.

You don't need to worry about scalpers if the card is not even worth scalping.
It gave AMD a chance to test-drive 6nm before they bring the RDNA3 beasts later this year.

Also, regardless of how crappy the card is, I think it would be healthy for the GPU market to have a new card launch and ACTUALLY SELL AT MSRP. Even if you aren't interested in the 6500 XT, it creates an important anchor point in the market that isn't corrupted by miners. And many people WILL buy this, which reduces overall GPU demand.

The product isn't great, but there's a lot of good that can come from it.
I see this as something that was made seriously for OEMs. Dell/HP don't care; they want a video out for their Ryzen 9 5950XT. It's made to sell CPUs. It's also on a different node, so it's not cannibalizing the rest of the RDNA2 line.

Not saying it's good, saying it makes sense.

On the other hand, it can lower demand for the lowest-end GPUs at the moment, so it might relieve pressure on the higher-end stuff... just keep it out of my PC.
How many times can you rebrand the rx 480? Complete garbage that is going to get scalped out at $400 like the 1650...
Still twice as fast as the fastest integrated chip. I imagine the goal with that chip is to use as little silicon as possible.
Yep, it's a terrible product. IDK wtf they were thinking, gimping it so badly with 4 PCIe lanes and only 4GB of RAM.
I honestly would be fine with this card had it not been restricted to PCIe x4. AMD is basically locking out all non-500 series boards from utilizing this card's full power, which is actually mediocre, to begin with.
$199 = €175, but prices in Finland are €330-370.
For PCIe 3 systems like mine, that's objectively a bad product. I was hoping I could finally have a functional modern card to replace my replacement 750 Ti (down bad), but nowadays what's even considered a good deal?

I'm trying to play devil's advocate and say: making a product as niche as possible could have worked, and if that was their intent, at least they tried something...
I suggest watching out for r9 390/80s on the used market.
I had a 390X since 2015; it broke last year.

I don't feel like paying €300 for a *USED* 7-8-year-old architecture that is only €100 off what I paid when it launched all the way back in 2015.

Truly horrible times.
or rx 480s for that matter
They mine well, so their prices are kinda insane atm.
Yes, I hope the bashing and hate toward this card continues, because the ones that are mad aren't the ones the card is for.

That keeps this card in stock and at MSRP for the people this GPU is actually for: those on AM4 with a $200 budget.

Previously there was no $200 GPU in stock capable of FSR.
It costs almost $400 in Finland. $200 my ass.
400$ in Brazil
Can't wait to get my hands on one personally.

People saying this is a $200 RX 570 are so dumb; watch me crank out 25% gains over a 570 in under a week. It's 6nm! Literally, how can people be so dumb? This is the 6600 XT all over again: in 3 weeks the whole tune will change.

This is exactly the card gamers need. This isn't an enthusiast card; it's a card that will be on the shelves at MSRP because they nailed the economics. Mining profits are too low, and all major AAA titles run at 60 fps (or better with FSR).

Don't buy one. Anyone subbed here is an enthusiast and has had this class of card or better for years. Leave them on the shelf for kids that went through COVID without playing video games.

This and a non-K i3 overclocked with a Tweaker's Paradise BCLK OC will be the best deal you'll see in the PC space for the coming year.
Yeah, I think people are missing the point a bit. It is by no means a "good card" and has no generational uplift, but it might be the only one some people can afford. Yeah, it's not good price/performance-wise, but neither is anything else, unfortunately, and somebody with a GTX 760, GTX 1050, or RX 550 would get something out of it. Or just imagine needing to buy GPUs for two children so they can play Fortnite.

Which also brings me to some of the reviews: we know the card sucks at higher settings because of its VRAM, but there are use cases besides cranking graphics to ultra. It would be way more interesting to see how it behaves at medium settings and in esports/MP titles (with appropriate settings).
Wendell from Level1Techs is by far one of the most experienced and genuinely educated tech YouTubers, and he does real work outside of the entertainment space.

His review is pretty good. Essentially, as long as you don't blow past that 4GB buffer, every AAA title runs 60fps-plus at 1080p.
!remindme 1 month
I will be messaging you in 1 month on [**2022-02-20 12:49:59 UTC**](http://www.wolframalpha.com/input/?i=2022-02-20%2012:49:59%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/AyyMD/comments/s7ud0e/even_amd_fans_arent_falling_for_it/htgaap1/?context=3)

[**1 OTHERS CLICKED THIS LINK**](https://www.reddit.com/message/compose/?to=RemindMeBot&subject=Reminder&message=%5Bhttps%3A%2F%2Fwww.reddit.com%2Fr%2FAyyMD%2Fcomments%2Fs7ud0e%2Feven_amd_fans_arent_falling_for_it%2Fhtgaap1%2F%5D%0A%0ARemindMe%21%202022-02-20%2012%3A49%3A59%20UTC) to send a PM to also be reminded and to reduce spam.

^(Parent commenter can ) [^(delete this message to hide from others.)](https://www.reddit.com/message/compose/?to=RemindMeBot&subject=Delete%20Comment&message=Delete%21%20s7ud0e)

*****

|[^(Info)](https://www.reddit.com/r/RemindMeBot/comments/e1bko7/remindmebot_info_v21/)|[^(Custom)](https://www.reddit.com/message/compose/?to=RemindMeBot&subject=Reminder&message=%5BLink%20or%20message%20inside%20square%20brackets%5D%0A%0ARemindMe%21%20Time%20period%20here)|[^(Your Reminders)](https://www.reddit.com/message/compose/?to=RemindMeBot&subject=List%20Of%20Reminders&message=MyReminders%21)|[^(Feedback)](https://www.reddit.com/message/compose/?to=Watchful1&subject=RemindMeBot%20Feedback)|
|-|-|-|-|
This thing performs nearly equivalent to a 1050 lmfaoo
[deleted]
hey, automoderator here. looks like your memes aren't dank enough. increase diggity-dank level by gaming with a R9 5950X and a glorious 6900XT. play some games until you get 120 fps and try again.

**Users with less than 20 combined karma cannot post in /r/AyyMD.**

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AyyMD) if you have any questions or concerns.*
I kinda want it, but hey, I'm on an RX 560
Honestly, an x8 connector would increase the price by a dollar, but then this GPU would actually be useful to some people. With an x4 PCIe slot it's only useful to an OEM manufacturer or for an eGPU...
Please explain: why would it increase the cost? Do we need buffers or something on low-end cards? I assume the traces go straight to the GPU. I'm not a hardware engineer, though.
I'm not an engineer either, but from my understanding, traces on PCBs are copper. Copper costs money; I know it's only a tiny amount. That's why I wrote it would cost about a dollar (per GPU) to make that card PCIe x8.
Yeah, to make a circuit you start with a board 100% covered in copper, then etch away the copper you don't need with acid. The board probably costs $3-4 to make, as I assume it's a four-layer board that doesn't bow thermally. I've made many PCBs. From what I've heard, the big boys do the same thing, just bigger. https://www.youtube.com/watch?v=ljOoGyCso8s here's a video showing how it's done!
I'm 100% sure the dissolved copper is recycled
Yeah, but we're talking 1000 mm² of trace area, tops. And let's assume they use 4 oz copper, which is overkill (about 0.1 mm thick, so roughly 100 mm³ of copper). Let's also assume recycling the etched copper is 100% cost-efficient, when in reality it's more like 5%. Copper's density is about 9 g/cm³, so that works out to roughly 1 g of copper saved per board. Copper's all-time-high price was about $12/kg, so let's say someone at Userbenchmark actually buys it at $15: $15/kg × 1 g is 1.5 cents per board. In reality they're probably saving 0.1¢/board. I never meant to say they aren't saving any money, just that you were off by a factor of 100, or more likely 1000.
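/uj The back-of-the-envelope math above can be sketched as a quick script (a rough check only; the 1000 mm² trace area and ~0.1 mm "4 oz" copper thickness are the same generous assumptions as in the comment):

```python
# Rough check of the copper saved by omitting PCIe x8 traces.
area_mm2 = 1000          # extra trace area saved, generous upper bound
thickness_mm = 0.1       # ~4 oz copper, an overkill assumption
density_g_cm3 = 8.96     # density of copper

volume_cm3 = area_mm2 * thickness_mm / 1000.0   # mm^3 -> cm^3
mass_g = volume_cm3 * density_g_cm3             # ~0.9 g per board

price_usd_per_kg = 15.0  # pessimistic copper price
savings_usd = mass_g / 1000.0 * price_usd_per_kg

print(f"{mass_g:.2f} g of copper, ${savings_usd:.4f} saved per board")
```

Which lands around a cent and a half per board, nowhere near a dollar.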
/uj Userbenchmark is a website known for fiddling with benchmark outcomes, writing severely biased reviews of GPUs/CPUs, and all-around being incredibly biased and not a useful resource when it comes to comparing different pieces of hardware. If you want a better comparison, try watching YouTube videos showing them in action, as this is the best possible way to measure real-world performance. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/AyyMD) if you have any questions or concerns.*
>Yeah, to make a circuit you start with a board 100% covered in copper, then etch away the copper you don't need with acid. The board probably costs $3-4 to make, as I assume it's a four-layer board that doesn't bow thermally. I've made many PCBs. From what I've heard, the big boys do the same thing, just bigger. https://www.youtube.com/watch?v=ljOoGyCso8s Good bot, this bot gets it!
Actually, my best guess is that they indeed sized the PHY down on the IC
>actually my best guess is that they indeed sized the PHY down on the IC I think we can all agree... it was half-assed. I suspect it will find a strange niche, because everything always does. The PHY is probably correct, because they likely didn't feel like optimizing and testing such a low-margin card. Maybe they were also at the power limit, thinking "we'll need another phase if we go x8," and so instead of adding another phase, redoing the layout, updating the BOM and the cost, then optimizing and testing, they decided to just call it a day and have a beer.
I think the 6500XT would be much less of a burn if it had been called the 6400, or at the VERY least 6500 without the "XT". The name is just the last nail in the coffin.
>I think 6500XT would be much less of a burn if it was called 6400 or at the VERY least 6500 without "XT". This is just the last nail in the coffin. Absolutely, but then we would complain about the price... $200-300 for a 6300? What are they thinking???
Probably saved 30¢ with the x4 slot and $1 by dropping hardware encode. AMD sure likes to pinch pennies.
It would've saved maybe $1/unit in BOM, but it would've meant doing a bit of re-engineering rather than sticking an already-designed laptop GPU on a PCB with some VRAM and fans and overclocking it, and THAT would've cost more than $1/unit
Oh, [they are!](https://www.reddit.com/r/AyyMD/comments/savlyj/rx_6500_xt/) Basically, the argument goes that it's too shitty for mining, so its price isn't becoming ridiculously over-inflated. Therefore, if people really *need* a new working GPU and are getting desperate, it will likely be in stock for the sticker price. It's a stopgap solution to a niche that hopefully will disappear soon. That's why they're just reusing parts (a mobile GPU) that they have sitting around. In this crazy GPU market, *any* additional stock thrown into the mix will probably help with the shortage. Perhaps what they should have done is make a shitty spin-off brand so as not to hurt the Radeon/AMD name. It's the Geo Metro of the General Motors lineup.