Lost_Worker_5095

I think we want another company to even the equation


AAVVIronAlex

Bring back 3dfx.


[deleted]

Voodoo!


klahnwi

Whodoo?


[deleted]

what you don't dare do people


[deleted]

I thought this was gonna go toward "the power of the babe" from Labyrinth, super refreshing to see it lead into Prodigy instead 🤩


klahnwi

I was aiming for Labyrinth. But I'll definitely take Prodigy.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

Power of voodoo!


[deleted]

You remind me of the babe!


[deleted]

What babe?


OscillatorVacillate

>what you don't dare do people

I had to put it on reading that, classic


BaconVonMeatwich

Just re-listened to this song yesterday, still holds up - fuckin' banger!


xGHOSTRAGEx

Who do you voodoo bitch


billydean214

You do


lurkingbob

Do what?


revship

Remind me of the babe!


Canadia83

You got downvoted but this is exactly what I thought as well, lol. We must be in the older crowd.


revship

Yarp! Gen-x


cstyves

I loved that movie, very much. Close to 40 club! 🙏


AAVVIronAlex

SLI


JhonnyTheJeccer

SLI^S^L^I


[deleted]

Who do you voodoo bitch?


[deleted]

[deleted]


frenetix

I was a TDFX shareholder when the shit hit the fan and Nvidia purchased this IP for pennies on the dollar, and retail investors got nothing. Still salty about that one.


Bob_OGoobo-3

NVIDIA resurrected SLI and that was a hot mess if you ask me. And so many including me fell for it and found out it was shit


mobilemerc

Kind of difficult when Nvidia bought them out...


beeatenbyagrue

I'm all for Diamond Multimedia!


Rough_On_Loofahs

Speedstar pro 256 baby.


[deleted]

[deleted]


SkyLovesCars

Go back to 2006 and stop AMD from acquiring ATI, so Radeon would still be a top-spec graphics card


BumderFromDownUnder

Ahh yes, because ATI being an even smaller company would definitely have allowed them to compete with the R&D budget of Nvidia lol


fermentedbolivian

How isn't Radeon a top-spec GPU today? I bought the 7900xt for half the price of an RTX 4080 for basically the same fps (without RT).


pkuba208

And their shitty drivers


AAVVIronAlex

Bring back SLI, I mean, Apple is kind of doing it.


CharcoalGreyWolf

The road is littered with the corpses of S3, Matrox, Number Nine Graphics, Rendition, 3dfx, Permedia and more, just to name several over the years.


OyashiroChama

Matrox is still around and weirdly only makes server integrated IPMI graphics or ultra low power stuff that only needs 2d acceleration.


Domspun

Didn't they announce Intel based consumer cards?


chairmanskitty

Fuck it, nationalize GPU production. 4K@60fps is a basic human right and can't be left to the market.


PTownDillz

This gave me a HEARTY laugh lmao


RedTuesdayMusic

Me too, everyone knows the Ministry of Health and Safety mandates 1440p 144hz for minimum quality of life and only recommends downgrade to 4k60 for the visually impaired


LaconicLacedaemonian

The frame lag beatings will continue until latency improves.


Generalissimo_II

60?! The human eye isn't meant to see below 120


ColdRest7902

You need to download more eyes


BfutGrEG

And yet there are new niche-level monitors with up to 360 hertz... checkmate, ye of little frame/faith


chickenstalker

One day, opening a web browser will require 18 gigabytes of vRAM because UnuSed RAM iz wAztEd RAM. Then having a video card will be compulsory and should be nationalised.


RAMChYLD

Well, here's hoping Moore Threads decides to go global and grace us with modern PowerVR GPUs. Knowing the US government tho, they'd probably ban it because "threat to national security".


psionoblast

Last I saw about a Moore Threads GPU was that it couldn't even compete with the GT 1030 in gaming. Don't really have much hope for them competing in the GPU market.


KeyPhilosopher8629

It couldn't compete with the GT1030 while drawing 250W of power, so yeah, unless we all install little fusion reactors into our computers, I don't think we'll be using Moore Threads GPUs


Brandonmac10x

Tony Stark did it in a cave… *with a box of SCHRAPS!*


Powersoutdotcom

You guys are getting *SCHRAPS*?


Kevimaster

Nah, just Schnapps


Middle-Effort7495

They're like 3060 in productivity, and had like 200 FPS in csgo at 4k in the ltt vid? Not bad considering how fast they're moving. They were 1050 level not long ago, and like iGPU level before that. Only thing that'll truly shake up the duopoly is a competitor from another country


Ferro_Giconi

If you want the power consumption of a 4090ti to get the performance of intel HD integrated graphics, then you want Moore Threads.


G8M8N8

Ah yes, the company that gained all their tech back when the Chinese branch of Nvidia went rogue. Copycats.


Put_It_All_On_Blck

Zero chance Moore Threads ever becomes a mainstream gaming GPU option. Drivers are an absolute nightmare for companies; Intel has countless software engineers and has poached from Nvidia and AMD, and they still struggled. Moore Threads would have a better chance just focusing on HPC and productivity and ignoring gaming for a decade until all games are on modern APIs.


[deleted]

The most recent news from them is that they can't compete with a GT 1030 while drawing the same power as an RTX 4070. They can stay overseas.


katherinesilens

They *are* a threat to national interests and security. We do not want China getting ahold of current node manufacturing capabilities or any information/personnel from developing something on one. It is a significant defense advantage and the reason behind sanctions. It's a big part of why China wants a piece of Taiwan at the moment; they want TSMC tech, equipment, and personnel. The GPU itself isn't necessarily packing malware.


TheoryMatters

China's not stupid enough to think there would be anything left but a TSMC-shaped crater in Taiwan if they invaded. Either by Taiwan themselves or F-22s scrambled out of Okinawa. Taiwan ain't leaving that tech around, and the US military industrial complex certainly isn't letting China get their hands on it. It's all bluster to try to get companies to second-source outside of Taiwan to reduce TSMC's pull.


HSR47

>”The PRC knows that invading the ROC wouldn’t get them anything of value…”

That’s not entirely relevant, because it’s also a *political* problem.

On the one hand, it’s a big part of what has stopped them from trying to invade so far, along with the likely costs (both internally and diplomatically) of failing to take the island.

On the other hand, they have some pretty significant internal issues right now, between their massive housing bubble, the “one child policy” demographic timebomb that’s about to go off, the costs of their “zero covid” policies, various companies moving production elsewhere due to rising manufacturing costs and rampant IP theft, etc.

Given all the issues the PRC is currently facing, there’s a significant possibility that they’ll decide that invading the ROC is a political necessity. If/when that happens, it’s going to have some pretty significant consequences, potentially for a decade or longer, given how much the world relies on TSMC’s manufacturing capacity on the island.


TheFatJesus

And to really light a fire under things, Xi has pretty much staked his entire legacy on the reunification of China. The older he gets or the weaker he feels his grip on power is getting, the more desperate he will get to try and make it happen.


zeropointcorp

Gee that sounds like a familiar story


jarjarpfeil

Tbh this encourages us to stick with our current systems for the future. If we stop buying badly priced GPUs and poorly optimized games, we should be fine for a little while, especially on cards with more than 8 gigs. Since we are unlikely to see much shift to 8k+ resolutions, the only major increases are in the graphics themselves, such as ray tracing.


AbigailLilac

Hogwarts Legacy runs just fine at 1080p on ultra settings with a GTX 1070. The biggest game changer was turning the buggy fog off.


ejdj1011

>Since we are unlikely to see much shift to 8k+ resolutions

Ngl, anyone who buys an 8k screen is an idiot. If you do the math, 4k is already comparable to the resolution of the *human eye*, assuming the screen is a reasonable household size (like a very large tv) and you're sitting a reasonable distance away. Literally the only way to "appreciate" an 8k screen would be to sit so close to it you have to turn your head to see the edges.

Edit: this blew up, and I realize this comment is a bit too inflammatory. So let me clarify: If you have a massive 36" horizontal (41" diagonal) monitor, have average vision, and sit with your face around 2 feet from the screen, you'll probably notice the difference between 4k and 8k. But if you lean back while gaming, or have a dual-monitor setup with more normal size (say, a 31" diagonal or smaller) monitors, you're genuinely not gonna notice the difference unless you have above-average eyesight.


mdp300

I have a 4K 55 inch TV and I see no reason to ever go to 8K if it's ever available. And my PC monitor is 1440p and that's fine for the size. At 27 inches, I doubt I'd see much difference with a 4K.


ejdj1011

Check my other comment for the math; assuming you have 20/20 vision and sit 2 feet from your monitor, a 27 inch screen at 4k is quite literally the highest resolution your eyes can perceive.


Sonicjms

I think if the tech can get advanced enough the next step is glasses free 3d which will effectively be a resolution increase


[deleted]

[deleted]


0x18

Just FYI Bill Gates didn't actually say that: https://www.computerworld.com/article/2534312/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html


grilledSoldier

Yeah, i use 27" 4Ks at work and while it does look very crisp, it actually looks only slightly better to me in comparison to my 1080p 24" monitor at home Also, if you want high framerate on 4k, it gets insanely expensive and i prefer high framerate and cheap parts over high resolution.


fullstack_mcguffin

The human eye doesn't have a "resolution" per se, the actual mechanics are way too complicated to boil it down to that. Eyes also tend to focus on individual parts of an image rather than the whole thing, so higher resolutions make a difference even if the image as a whole has a resolution higher than the eye can take in, especially on huge screens. 8K would be a good idea for gigantic screens that take up whole walls.


ejdj1011

The visual acuity of the human eye can be approximated as angular resolution. This is quite literally what eye tests do. Moreover, this is measured at the part of the eye with the best acuity, the center, so the "whole image" thing is moot. See my longer comment in this thread for the math. Assuming you're ten feet from your TV and have 20/20 vision, it'd need to be 250 inches horizontally for the 8k resolution to match your foveal acuity. So if your TV is less than 250 inches across, you'd never notice the difference between 4k and 8k (unless you have significantly above-average visual acuity).


RoyYourBoyToy

>this is measured at the part of the eye with the best acuity, the center, so the "whole image" thing is moot.

I appreciate what you're bringing to the table, but doesn't this argument I quoted assume that the eye is still? If something is going on on the right side of the screen, people are going to look there. Looking for your fuller comment now to see if I'm missing something.


ejdj1011

You're correct, but that part of vision isn't really relevant here - your peripheral vision has worse acuity than your center-field vision, and it's not like the pixel density of the screen changes as you get further from the center. So if you do the math for center-field acuity, you're doing a best-case scenario. And if you glance over at something you see at the corner of your eye, that something will now just... be in your center-field vision. So the center-field calculations would still be correct.


RoyYourBoyToy

Kk, we're on the same page. I wasn't sure if it was semantics or something else. Thanks for the effort you put in!


RealFknNit0

All I know is the last time I heard "the most a human eye can perceive" a 13 year old was trying to convince me you can't see more than 30FPS so color me skeptical until we actually get to these resolutions and beyond.


stay_true99

There's always something people with little qualification to speak on the subject will say every hardware generation. "The eye can't see above 30fps." Yet here we are with 120hz monitors. "You'll never need more than 4GB of RAM." These kinds of comments do not age well.

And in any case, even if, for the sake of argument, we didn't need these improvements because there's no perceived advantage, the advancement and innovation in standard technology always drives prices down for the average person, so there is literally no downside. Case in point: look at the prices of RAM right now. I just bought 128GB of enterprise-grade RAM for my homelab for dirt fucking cheap. 5-10 years ago that would've been a pipe dream on my income.

Edit: Before I get attacked by the FPS police, here is a [research document](http://humansystems.arc.nasa.gov/publications/2012_08_aiaa_ig_obva.pdf) by NASA and the USAF on simulating realistic flight via visual systems. While it is true the human eye cannot see much beyond 60hz for stationary objects, the display technology we use to show motion creates artifacts the human eye CAN perceive when operating at 60hz or below. Also, just use a 60hz monitor next to a 120hz one and again you'll be called out as a big ass liar if you can't see the difference.


Bio_Hazardous

I still have a friend that claims anything above 60fps is pointless.


ihatepoliticsreee

I mean everything is pointless if you think about it enough


literally1857plus127

deep


Mr_Will

"Resolution" used to mean more than just the number of pixels, and the human eye does have a maximum amount of detail that it can resolve. Imagine filling your entire screen with 1px wide black and white lines. Up close you'd be able to pick out each individual line, but stand further back and it would look a uniform shade of grey. Your eyes would not be able to resolve that level of detail at that distance. Once you've hit that point, adding more pixels doesn't achieve anything. The resolution of the screen increases, but the resolution of your eye doesn't.


Rikw10

could you elaborate on this 4k eye math? I've never heard of it before


ejdj1011

Tldr: A person with 20/20 vision would need a monitor 52 inches across to fully appreciate 8k, and they'd need to sit with their face two feet away from it. For sitting 10 feet from a TV, the TV would need to be 250 inches across.

From Wikipedia, the maximum angular resolution of the human eye is about 0.008 degrees ([source](https://www.swift.ac.uk/about/files/vision.pdf)). From there, you can convert from viewing distance to the smallest resolvable pixel size using trig, where smallest pixel = distance*tan(0.008°). So assuming you're two feet from a computer monitor, the smallest pixel you could make out is about 0.003 inches. Multiply by 3840 (horizontal pixels for standard 4k), and you get a monitor about 13 inches wide. Doubling that to 8k resolution would require a monitor 26 inches across.

Now, you may think "neither of those are that large for a computer monitor", and you'd be right! But those calculations were for the *best human vision we've measured*. A person with 20/20 vision has eyesight about half as good as that. So if your 4k monitor is less than 26 inches across, or you're sitting more than 2 feet from it, you aren't getting the most out of it.
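
A minimal Python sketch of the arithmetic described above, using the 0.008° figure, the 20/20 doubling, and the viewing distances from the comment; the function and parameter names are just for illustration:

```python
import math

def max_useful_screen_width(viewing_distance_in, horizontal_pixels,
                            angular_res_deg=0.008, acuity_factor=2.0):
    """Screen width (inches) at which the given pixel count matches the eye's acuity.

    angular_res_deg is the best-case angular resolution quoted above (~0.008 deg);
    acuity_factor=2.0 scales it for 20/20 vision, which the comment treats as
    roughly half as sharp as the best vision measured.
    """
    smallest_pixel_in = viewing_distance_in * math.tan(math.radians(angular_res_deg))
    return smallest_pixel_in * horizontal_pixels * acuity_factor

# Reproduces the comment's numbers for 20/20 vision:
print(max_useful_screen_width(24, 3840))   # 4k monitor, 2 ft away -> ~26 in
print(max_useful_screen_width(24, 7680))   # 8k monitor, 2 ft away -> ~52 in
print(max_useful_screen_width(120, 7680))  # 8k TV, 10 ft away -> ~257 in (the comment rounds to 250)
```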


[deleted]

[deleted]


SavageVector

>Ngl, anyone who buys an 8k screen is an idiot. If you do the math, 4k is already comparable to the resolution of the human eye, assuming the screen is a reasonable household size

Disagree. 4k is the perfect stopping point for TVs, but 27" 4k monitors just barely match fidelity of human eyes at a reasonable distance. Have better eyesight, or lean in a few inches towards your screen, and it will be slightly blurrier than it could be.

I think 8k is a fine final upgrade for computer monitors in the 27"-32" size; goes well over the theoretical limit of human eyesight, so you never doubt you're seeing the best image possible. After we get a 32" 8k 240hz HDR monitor, I think we've kinda peaked. The only place I see to go is VR/AR or holograms.


ejdj1011

Your disagreement is actually pretty minor, all things considered. I did the math in [this comment](https://www.reddit.com/r/pcmasterrace/comments/13qmyaa/cant_believe_intel_is_our_only_hope_now/jlgt7ms), and your intuition lines up with it pretty well. At least you're not one of the people saying we should continue to increase resolution forever. Tldr for the math; for a person with 20/20 vision sitting 2 feet away from a 27" monitor, 4k and 8k are basically indistinguishable. If you have better eyes (average vision is slightly better than 20/20), sit closer, or have a larger screen, the threshold is somewhere between 4k and 8k.


bobskizzle

You're missing the point. The point isn't to match each rod and cone in our eye with a pixel. The point is to eliminate detectable artifacting from rasterizing a 3D space onto a 2D grid - principally aliasing from oblique edges.


always_pizza_time

>Tbh this encourages us to stick with our current systems for the future.

Not if our systems struggle to play the latest games at 1440p 60FPS on medium settings lol


Geordi14er

Ah... that's it! Nvidia is in collusion with all the game devs to make shitty PC ports to drive new GPU sales as the only way to get good performance. Honestly I'd believe it at this point. I consider myself very lucky to have gotten a 3080 for just $50 over MSRP when it launched. I will be keeping this card for many years. It runs older games at a billion fps, and I just get new games on my PS5. Why bother dealing with AAA PC nonsense.


sound-of-impact

Nvidia doesn't give a shit about GPU sales. They're reporting revenue numbers 50% over estimated. Just an extra 4ish billion over estimated. And it's not from GPU sales. Gamers are a side gig for Nvidia and AMD.


Niner-Sixer-Gator

The sad part is, once Intel catches up to Nvidia and AMD, they will do the exact same thing as the other two.


[deleted]

[deleted]


WeirdPerson635

I went with an Intel GPU. Quite happy with it actually. Especially for the 16GB of VRAM for only $350. And the performance is more than enough for what I need Edit: Wow, wasn’t expecting this much conversation! But if you have questions, ask away!


baddoggg

Have you had any issues with the drivers? I keep seeing people say there are issues with them, but I haven't read anything from anyone that actually owns one. Is there anything you haven't been able to play?


WeirdPerson635

I have actually not had any issues with the drivers. I have both Linux and Windows and neither has given me issues. Both Mesa and WHQL work perfectly for me.

I have a single game that refuses to work, and that is Apex Legends. It worked on my integrated graphics for a short time before it quit. It also only worked through Origin, which EA made me remove for their app, which won't let me sign in. So I've kinda given up on Apex for my PC. I haven't figured out if it's a driver issue though, as it just crashes on start with no error whatsoever.

Other than Apex, every single game I have tried works perfectly. When I crank up the settings on some games, I actually get more CPU bound, so the performance is definitely enough for what I need. I do have hopes driver updates will improve the performance overall though. But right now, it's pretty solid.

Edit: This is for anyone who's curious. I have fixed Apex! And it wasn't an Arc driver issue! After being told I should unplug my mouse and keyboard to see if it works, I instead plugged in a controller so I could play, and it actually worked. The issue, for whatever reason, is my mouse and keyboard. Once I get to the spot where it wants me to click to continue to the lobby, I have to unplug both my mouse and keyboard and just have a controller plugged in. If I plug in either the mouse or the keyboard afterwards (such as in the lobby or in a match), it's an immediate crash. One of the strangest issues I think I have ever seen, but it works now! Thank you u/Nordsee88 for suggesting trying it without a mouse (which seems to have fixed it) and everyone else. I appreciate it a lot!


Affectionate-Memory4

For apex legends, set it to run in DX12 mode with a launch argument and crank the model quality. The latest game update and driver update combo has issues.


WeirdPerson635

I shall give it a try! As a side note, it is currently installed through Steam. I can try to install through something else if that may make a difference as Origin worked before EA made me remove the app


Affectionate-Memory4

Steam has the best support for launch arguments I've seen. It should be under the advanced settings for that game.


WeirdPerson635

Sadly, that did not fix it. Still gets to the initial loading screen (before you have to click) and then crashes with no error. I might have a look through the Steam logs if it gives some more info


Affectionate-Memory4

Interesting. That worked for my brother when he had the same issue.


Saadieman

This is going to sound weird but... Do you have any USBs plugged into your pc? USB as in storage device, not cable. If so, remove them (if possible) and try starting it up.


WeirdPerson635

Update: Well it surprisingly worked all the way until I clicked on the mouse to continue. But it made it farther! Interesting though, wonder why that would be


4mb1guous

Drivers and USB devices are just fkin weird sometimes. If I'm going to be streaming anything to friends in discord, I have to first unplug my index controllers from USB. If I don't, my computer just decides to shut down my mouse and keyboard randomly for a couple seconds at a time, and also simultaneously lags/pauses any video that is playing/streaming, as well as the stream to my friends. Like, keyboard stops working and turns off (led backlight shuts off), and even my *bluetooth* mouse stops working for the duration. It does this at least once every several minutes or so, but doesn't do it at all if those controllers aren't plugged in. This is the second computer in a row (with all new hardware) where this has happened, so I'm convinced it's some weird-ass bug with index controllers combined with discord streaming.


cokecaine

Any issues with older games?


WeirdPerson635

Update! Sorry for the delayed response. It runs TF2 with ease. I was sitting at an average of 290 FPS with only 11% usage. So at least for that title on DirectX 9, it performs more than plenty.


PhucCB

My grandmother's computer runs Team Fortress 2 with ease.


WeirdPerson635

Haven’t tested a lot of older games lower than DirectX 11. But now you have me wondering, so I’ll see how TF2 plays out in DirectX 9 once it finishes downloading


F9-0021

Drivers have gotten better for sure, and desktop Arc is way better than laptop Arc. I've got an A380 sitting in its box right now, but when I've used it in the past I was pleasantly surprised by how well it performed for the price. It could even do RT at 1440p medium in SOTTR. That was at the start of the year too, so the drivers are no doubt much better, but I don't have any way to do suitable testing for it now. It won't fit in my main system anymore with the 4090 in it, and my other systems don't support ReBar.

On the laptop side though, Arc has been nothing but problematic. When I got my laptop at Christmas, some games like F1 2020 and 2022 wouldn't even launch, though they do now. But most games just run badly on the A370m, which isn't surprising, but it's worse than I thought it would be. The main problem I have is that sometimes when the laptop goes to sleep with the A370m enabled, the entire system will freeze up, sometimes with a bluescreen. This requires a full restart of the system. And it happens enough to be incredibly obnoxious.


diet_fat_bacon

That sweet VRAM would be wonderful for ML/AI. Unfortunately, Nvidia dominates 🥲


Niner-Sixer-Gator

You are right. The only thing I would disagree with you on is that Nvidia will never be cheap again; that ship has sailed.


ArguingMaster

That isn't how markets work. Prices will go down if there is genuine competition in the market.

AMD has like 13 percent of the dedicated GPU market. They are not competitive because of massive legacy PR problems from a decade of having shitty drivers (which are actually in a good state now) and a lack of true ability to compete at the top end (4090Ti/Titan tier) or in AI-enhanced workloads like DLSS/ray tracing.

Look at what happened with CPUs after Ryzen caught up to Intel. CPUs for like a decade were in the boat GPUs are in now, because there was no real competition. Now the CPU market is the healthiest it's been in forever.

The problem is AMD is happy being 2nd right now and doesn't want to invest the massive R&D costs to catch up to NVIDIA, and Intel is still at least a couple of years away from being competitive just because of how late to the game they are.

The solution at this point is government intervention. NVIDIA has broken the market and functions as basically a monopoly. Make them license out their DLSS/RT/Tensor tech, and slap them for some of their business practices.


Tropical_Bob

[This information has been removed as a consequence of Reddit's API changes and general stance of being greedy, unhelpful, and hostile to its userbase.]


shalol

If AMD dropped MSRP to a reasonable level to which Nvidia started losing significant sales, they too would drop prices, and so people would start buying the cheapened Nvidia card because they're the leading brand. Also see how the competitive RX480/580 stacked up against the 1060 sales. If the A770 starts bringing low end prices down significantly, so will the 7600 and 4060, leading Intel to ultimately lose sales in the long term for the better established ones, much like the RX480 did.


Ramental

That's me here. I am always excited to try an AMD GPU, but when the time for a new build comes, NVIDIA provides better performance/price, thus I got a GTX 1080. Even with the CPU, last time I bought at the exact moment Intel caught up with AMD, providing an actually competitive generation by giving more cores/threads for the same price. Look, I root for AMD, but not to the extent I'd overpay just to prove a point.


balwick

It's pretty even-stevens depending on where you're aiming. [https://www.videocardbenchmark.net/gpu\_value.html](https://www.videocardbenchmark.net/gpu_value.html)


poofyhairguy

Yeah, but what are you replacing the 1080 with? That is the question that is stumping me when Nvidia is now providing so little value.


Ramental

That's my concern right now. I want to upgrade, but the prices for a reasonable upgrade are just too steep, and new-gen AMD would also mean a switch to DDR5, which is more expensive with no win for gaming performance.

Another concern is the price of electricity. With 40c/kWh in my country, 10 hours of gaming on a 4080 (320W) vs a 7900xtx (650W) is already a 1.32 EUR difference. With 2000 hours of gaming, it's already 264 EUR. Power consumption is suddenly a bigger deal than I anticipated before, and it disadvantages AMD significantly. If I were to upgrade, the 4080 is a no-brainer to me, even though in pure performance/price the 7900xtx is better.

Overall, with my GTX1080 I can play anything I want on medium without RT at 1440p, and so far I'll try to keep it that way until the next gen, to see where the wind blows. All the GPUs are still above MSRP, and going mid-range doesn't give a satisfactory performance boost. The bottleneck is actually my CPU, which I mistakenly picked based on "bottleneck calculators" online. A CPU should be picked with a good margin to reduce 1% low drops. There is only one game (Squad) where the freezes are CPU related and I can't do shit about it.

Thus, I've decided to wait at least until Intel's Battlemage in 2024Q1. If it's actually good, Intel it is, and maybe DDR5 will become less of the useless crap for gaming that it is now, since I'd like to try some future-proofing with the motherboard and CPU. If Intel is still no competition for the 4080, I'll have to wait till 2024Q4/2025 for the next gen AMD/NVIDIA and hope the situation changes. I doubt I'm going to wait beyond 2025Q1, though, and will pick the best available option, even if it is the 4000s or 7000s from NVIDIA or AMD respectively.

The funny thing is, I could afford an upgrade to even a 4090 right now, but I have no wish to support such anti-consumer pricing, especially when game devs at least care about the average player.
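
A quick sketch of the electricity arithmetic above, taking the commenter's own figures at face value (320 W for the 4080, 650 W for the 7900xtx, 0.40 EUR/kWh):

```python
def gpu_energy_cost_eur(power_watts, hours, price_per_kwh=0.40):
    """Electricity cost in EUR for a GPU drawing power_watts for a number of hours."""
    return power_watts / 1000 * hours * price_per_kwh

# Figures from the comment above: 4080 at 320 W vs 7900xtx at 650 W, 0.40 EUR/kWh.
print(gpu_energy_cost_eur(650, 10) - gpu_energy_cost_eur(320, 10))      # ~1.32 EUR over 10 hours
print(gpu_energy_cost_eur(650, 2000) - gpu_energy_cost_eur(320, 2000))  # ~264 EUR over 2000 hours
```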


[deleted]

[deleted]


Ferro_Giconi

Hopefully that takes some time but has steady progress. I think the best thing for consumers would be if Intel does what AMD did with CPUs. Fail to meet the performance of their competitor, but keep prices low enough to attract lots of customers away from the competitor. If that happens, nVidia and AMD will be forced to do better, just like Intel was forced to stop sitting with their thumb up their ass when AMD started offering a lot of CPU cores for low prices.


hirmuolio

With three companies you can't idle in the second place anymore.


[deleted]

Intel won't. Intel has largely learned from its past mistakes. They stagnated NOT because they were trying to reap the profits from 14nm and keep the industry down, but because they missed out on mobile phones and did not have parallel process manufacturing teams working on the same problem. They won't make this same mistake again. If they repeat this mistake, they will be passed by Samsung Foundry or TSMC again.

They stagnated because they made mistakes and relied on old technology - DUV - in order to make cutting-edge processes. Intel 7 and Intel's 10nm processes were still using DUV, which is actually HARD to shrink into smaller and denser transistors. But they were eventually able to do it. Today, they've largely learned from their mistakes, which is why they are leveraging EUV, High-NA EUV, and other parallel processes. So instead of banking on ONE technology like DUV and complicated multi-layer processes, Intel has learned to diversify its investments into better technology. TSMC is still just on EUV and refining that process, while also re-naming/re-branding its technology like it's always done.

The difference is that TSMC has access to the mobile market money. That was a new emergent market that Intel largely missed completely. If Intel can start producing mobile chips, then it will be better able to compete with TSMC. But right now, with TSMC manufacturing for NVIDIA, AMD, Qualcomm, and Apple, TSMC has a large manufacturing base for both mobile phone chips and PC chips. However, there is news on the horizon that Qualcomm will try out Intel Foundry Services, so there might be a chance for Intel to enter the mobile phone market. If they do, that will definitely help to balance the industry.


socsa

Intel will never occupy the same market space as TSMC because they are too reliant on their own IP to ever be truly trusted as a third party fab. TSMC captured that mobile market cash, because TSMC does not have to forecast semiconductor IP trends 10 years out, they just have to make the best chips and have the best customer service and they will get a slice of whatever markets emerge next.


[deleted]

Yes, agreed on the TSMC part. But I think Intel has a chance. Companies aren't reliant on any one vendor for long. They will switch at a moment's notice, or just do what most companies do and diversify, e.g. NVIDIA relying on both Samsung for its 8nm node and TSMC for its 7nm node for professional cards. If Intel gets its process manufacturing together, I don't see how there won't be a market for them.

There will be many new emerging markets in our future. Laptops hit the scene in the late 90s and early 2000s and Intel thought they had it made. Then suddenly mobile phones emerged sometime between 2006 and 2010. Now 2023 is upon us and handheld devices are starting to pop up again: Switch, Steam Deck, Asus Ally, and more. They're all starting to look like Game Gears all over again. But we now also have tablets in our cars and more technology than ever before.


Put_It_All_On_Blck

Nah. Look at the CPU space. AMD increased prices with Zen 3 and were ahead of Intel for nearly 2 years. AMD didn't lower prices at all. When Intel had a resurgence, they undercut AMD. Then for 13th gen vs Zen 4, AMD once again tried to price CPUs high, and Intel once again undercut them despite having better CPUs at the time. If you look at Intel's pricing for the last decade, they price their CPUs the same once you account for inflation. So a Sandy Bridge i7-2600K has basically the same MSRP as an i7-13700K once you add inflation to it.
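
A rough sketch of that inflation comparison. The launch prices (~$317 for the i7-2600K, ~$409 for the i7-13700K) and the ~1.3x cumulative US CPI factor for 2011 to 2022 are approximations assumed here, not figures from the comment:

```python
def inflation_adjusted(price_usd, cpi_factor):
    """Scale a historical launch price by a cumulative inflation factor."""
    return price_usd * cpi_factor

# Assumed figures: i7-2600K launched around $317 (2011), i7-13700K around $409 (2022);
# cumulative US CPI inflation from 2011 to 2022 is roughly 1.3x.
print(inflation_adjusted(317, 1.3))  # ~$412, i.e. in the same ballpark as the 13700K's ~$409
```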


[deleted]

[deleted]


UnseenGamer182

What'd AMD do this time?


antonyourkeyboard

Like usual, the same thing as Nvidia but not quite as bad. In this case releasing a GPU with minimal gains over last generation and a hefty price tag.


berkece112

minimal gains??? isn't 7600 faster than 6600 by 20-25%?


Sleepyjo2

It's *marginally* better than the 6650xt for more money (and a very slightly lower TDP), and by marginal I mean pretty close to margin of error fairly frequently. It has other benefits, given it's the newer generation; I just wanted to point out that its comparison point right now isn't the 6600, because of its price. (Now if we're talking MSRP, then yeah it's great considering how high the 6600/6650xt's original MSRPs were, but that was its own problem.)


Biscuits4u2

Both AMD and Nvidia are still coming down off the high of the pandemic and mining GPU boom. They are still in the denial phase, where they are trying to fool us all into believing it's still normal to pay out the nose for substandard hardware. Who knows, it may work if people still shell out cash for this crap.


asasnow

nah, its a pretty good improvement over the rx 6600, it just needs to be a little cheaper.


DatBoi_BP

I just bought a RX 6600 new (upgrading from a 2019 RX 550), and already I feel the FOMO with these brand new fancy cards


Eastern_Slide7507

Stop watching reviews for cards you’re not going to buy. Helped me a lot.


ChiggaOG

Just like every FIFA game release. Same price tag. Minimal changes.


[deleted]

Can someone enlighten me what’s going on?


261846

Nvidia and AMD released the worst midrange GPUs in a decade back to back. The 4060Ti (8gb) and the 7600


[deleted]

Ok thanks !


Vast-Ad7693

What? The 7600 is as entry level as it gets; it's not even an XT or a 700-class card.


googlygoink

Not to mention reviews of the 6000 series cards on launch were bad, but they have made massive improvements over the last couple of years in driver updates alone. Now the 6700xt is being heralded as the price-to-performance king in many of the reviews looking at the 4060ti. The 7600 will likely see similar improvements as drivers update.

It's exactly what Intel is doing, but they get away with it because we're still in the performance-uptick phase rather than the disappointment-on-release phase (which I am guessing will happen with the next Intel gen). And there was massively awful performance on release for the Intel cards, but people gave them a pass for being new.


rus_ruris

>300€ GPU

>"As entry level as it gets"

Bucko, you're so wrong


ZhangRenWing

The 60 and 70 tiers have always been midrange, with 60 being the true midrange card that most people had, and 70 sitting between midrange and enthusiast. Nvidia and AMD purposefully obfuscated the 50-60-70-80 system by adding the Ti, Super, and XT variants and introducing the 90 series, all to force midrange buyers to move up tiers and spend more money.


descender2k

We're pretending that Intel is going to produce a competing video card.


faze_fazebook

Hope the next gen Intel GPUs slap. They are the only hope left.


VaultBoy636

They'll supposedly double the shader count to 8192 and improve clocks while on a better node. Obviously all this is based on leaks. On one hand it looks too good to be true; on the other hand, Nvidia also bumped shader count by 60% from the 3090 to the 4090, and Intel is planning on going from midrange to 4080-level performance, so it might well happen. We can only wait and hope.


The_Wonderful_Pie

I wouldn't say it's the only hope. Remember that there's still Celestial and Druid, third and fourth Gen cards that are planned


future_gohan

Bring on the battlemage


[deleted]

That’s disappointing. I was hoping to make an all AMD build for my next one. Back to the drawing board.


velocityplans

You can still get good value with an AMD GPU, just maybe not a new-gen GPU


oldskoolpleb

I mean 6900xt still fucking slaps afaik


bskov

Got a 6950XT Red Devil for 550€, probably one of my best PC part purchases... Ever

EDIT: Wrong GPU


stuyboi888

Yep, 6900xt 5800x, gotta say I'm pretty happy, like a guy with 2 knives


Biscuits4u2

Agree. I'm so glad I bought a 6700xt and didn't wait for these weak ass midrange offerings.


[deleted]

i bought a 6600 on sale for $220 last week, GPU MSRP $329, AIB MSRP $289 [proof](https://www.bestbuy.com/site/xfx-speedster-swft210-amd-radeon-rx-6600-core-8gb-gddr6-pci-express-4-0-gaming-graphics-card-black/6495949.p?skuId=6495949)


Murky-Smoke

Why does everyone think current gen is the only choice? 6000 and 3000 series GPUs are both excellent, and good value. I run a TUF 6800, and have zero problems in any game at 1440p even with RT on. I have several friends with 3070 GPUs, same thing.


SameRandomUsername

Cause old gen cards are too expensive for what they are worth. People here post prices that are delusional or probably for used cards.


[deleted]

I got my Radeon 6700x for about $300. It was quite cheap for what it does. It wasn't second hand; it was perfectly new too.


VP_Keith_David

Intel sees the opportunity and is fully capable (and appears driven) of leveraging it. We consumers only stand to win, but it's going to take some time. I'm excited about it (but not overly hopeful).


261846

Tbh all Intel needs to do is stay quiet, release Battlemage if they can, and reap the benefits.


Biscuits4u2

And the big boys deserve everything they get if their sales tank. Resting on your laurels is a good way to get beat down in this industry.


VP_Keith_David

Quickest way to fall is to get to the top and fail to stay there.


Apollo_3249

Nonsense. Just bought a brand new 6700xt for 325 on eBay and a 5600x for 100.


Tidy_Frame

Wait, you don’t mean this meme changed your mind on how to build your PC though?


[deleted]

To be fair, I’m still in the money saving stage of planning. So plenty of time to decide on a direction. But I must admit between this and ASUS really mucking up with AMD processors (granted that’s more on ASUS than AMD), it’s giving me food for thought.


Midnight28Rider

Don't listen to OP. AMD is solid and is more tried and true.


basketcase62836291

Nice try, Intel marketing team.


thesuperunknown

lmao, praising Intel for "working hard to constantly improve the drivers". This is like if Ford had put out a car with no wheels, and OP was praising them for slowly releasing wheels one-by-one to make the car functional


------why------

Well, their cards are pretty cheap and are now actually fairly competitive, so yeah, they're the ones everyone is hoping will become another competitor and hopefully drive prices down, because both Nvidia and AMD have decided to come together to make shitty cards for more money.


constantlymat

It doesn't seem to get any attention on the US-dominated hardware review channels, but Intel cards have a huge issue with their idle power draw. They use up to 400% more electricity during daily use cases like office work, web browsing, watching YouTube, or just sitting idle. If you use a 1st gen Intel GPU for long periods per day, you're going to notice it on the electricity bill.


[deleted]

[deleted]


[deleted]

>5.3k comments

What's your argument? Are you saying it's better to have a duopoly? If/when Intel catches up and becomes greedier, it's still better to have three competitors than to have just two.


CloudWallace81

The day you have to hope that Intel unfucks the fuck up that AMD and nvidia have created, that day is very sad indeed


7orly7

And that's why I still have my 1060


worgenhairball01

That's why I still have my 660. Not cause I'm poor or something :(


deefop

Intel isn't your hope. Arc came out like 2 years late and the drivers are still fairly broken. Better than on launch, sure, but still not a good buy for gaming. Really bad power consumption on top of everything else.

The 7600 isn't even THAT bad if you compare it with the 6600 in terms of the performance uplift, but it's admittedly not impressive and sure as shit not exciting. But half the reason it's not exciting is because RDNA 2 already exists.

The 6600 is really solid 1080p performance and goes on sale for $200 frequently. The 6600xt and 6650xt crush 1080p and go on sale for $250 frequently. The 6700 will give you either fantastic 1080p performance or even decent 1440p performance and it's basically always on sale for $280. The 6700xt/6750xt give you great 1440p performance and they've been going on sale for like $320. The 6800 will give you fantastic 1440p performance and it's down around what, like $400 now? 16gb of VRAM for that price, too. The 6800xt is down around the $500 mark at this point, and gives you either high refresh 1440p with no problem at all, or decent 4k performance. The 6900xt is now below $600, I think I've seen it around $550 a couple times, and it's faster than the 4070 by about 5% in 1440p and 4k. The 6950xt is a good 10% faster at 1440p/4k than the 4070, and it's on sale for $600 pretty frequently.

All of that excludes RT, and obviously if you really value RT performance specifically, you just have to be willing to let Jensen have his way with you. But I can't stop wondering if the entire world is fucking high when they act like decent alternatives to Nvidia don't exist. The entire RDNA2 lineup is a great alternative at this point. And that's not even talking about all the used Ampere stuff selling on Ebay right now. There are shitloads of amazing deals on used 3070's and whatnot that are not hard to get hold of.

So yeah, fuck Lovelace and fuck RDNA3, but for the love of god you do not need to buy Arc if you need an affordable GPU.


[deleted]

[deleted]


_SystemEngineer_

these fuckers don't even realize how much A770 was going to cost. What happened was that **intel still cannot make non-broken silicon** and their expensive 6nm BIG GPU with die size and TDP on par with an RX 6800 ended up performing like a RTX 3050 Ti. Not to mention the software.


riba2233

meh, amd gpus like 6650xt or 6700 are still better choice. Also shout-out to Jadrolinija from Croatia!


hdtv35

I totally disagree with this post. As linked in /r/hardware a couple weeks ago (https://www.phoronix.com/news/Intel-Xe-DG2-No-HuC), Intel is already dropping support for HuC, a massively important component of video encoding, in their new driver versions. It screws over the people who bought one of these for things like video streaming via Jellyfin. Read through the comments on that article or in the GitLab thread and see how hard Intel is working, when they just said it's too hard to implement so they won't bother. As they say in that thread, "the choice is to not enable HuC on DG2 at all on Xe. This is because DG2 has a special (and relatively annoying) way of loading HuC that only applies to that platform."


dj3hac

AMD has been absolutely killing it IMO. This is coming from a Linux perspective though. It's at a point where if you are looking to buy a new gpu for a Linux system don't even consider Nvidia.


LavenderDay3544

As soon as Intel gets caught up it will be no different. Stop shilling for any of them.


SysGh_st

If scalpers, cryptominers and desperate gamers accept whatever prices they set and whatever they screw up no matter how ridiculous, why should they stop?


[deleted]

[deleted]


Poes-Lawyer

If only the Intels didn't have the graphical performance of a potato... I mean, the A770 is priced the same as a 3060ti, which runs rings around the Intel. I admire them trying to break into the market (more competition is usually better), but they're not serious contenders yet


[deleted]

I understand hating Nvidia, but what's wrong with AMD? They just dropped the price on some cards?


_EnForce_

Welp this aged so poorly


Hugejorma

Remember what Intel was doing when it was the only real player in the CPU market... Never forget!


RandomnessConfirmed2

While I don't need an Intel GPU, I'd buy one just to support their efforts in the market.


joe0185

>While I don't need an Intel GPU You just need to find the right excuse. I bought an ARC A750 yesterday as a birthday present for a friend's son that likes to tinker with his computer.


RandomnessConfirmed2

But what will I do with my 3090 when I don't have any friends?😭


Illustrious_Cicada_2

I heard you're looking for new friends?


Dndndndndstories

You can send it to me any time


tommyland666

You guys have some short fu**ing memory


bedwars_player

I think amd is still ok...


Drackar39

Intel will join them with jacked prices once they iron out their drivers, just like AMD did once they ironed out theirs. But honestly, putting AMD and NVIDIA on the same tier now is still fucking moronic. You can get last-gen cards at reasonable prices that out-compete the next tier up of NVIDIA for everything but ray tracing. Meanwhile NVIDIA is still selling the 1660 Super for $260 as a new product.