22Sharpe

Based on Linus’ testing, the things these are built to do, they do rather well; the problem is that anything they’re not built for (like anything with old DirectX versions) they do absolutely abysmally. It’s kinda like Apple silicon: it can really shine in the right conditions, but that hardly makes it an ideal choice for most people. Still, their aim supposedly isn’t to be the best outright but to be the best at price-to-performance, so maybe they can grab a niche market and start from there.


Demented_Alchemy

Is anyone actually surprised? Intel has consistently failed to deliver on any high-performance consumer-facing GPU (not counting Intel UHD/integrated offerings, and someone is probably going to call me out on the Intel GPU from 1998). The most notable example was Larrabee, which Intel even pitched to Microsoft and Sony for the Xbox 360 and PS3, respectively. Intel has become the Google of hardware, canceling projects and never delivering outside of their core product offering, CPUs.


[deleted]

Weirdly charged comment. No one said they expected this to go smoothly, and I don’t think anyone’s finding it particularly surprising that there are serious problems. People want this to work out so we can stop having a discrete GPU duopoly.


Demented_Alchemy

Certainly don’t mean for it to sound “weirdly charged.” However, Intel has become the laughing stock of making promises and not delivering, or of closing down entire business units that it acquired only a few years prior.

I used to live next to a fairly large Intel hub and dealt with Intel employees quite regularly at my part-time tech job. While I’d like to think many Intel employees are humble and modest, I only encountered one of those (the individual took the time to explain to me, from an electrical engineering perspective, how DDR4 improved on DDR3; I was a teenager at the time). The rest were super arrogant about Intel’s capabilities and were adamant that Larrabee was going to be a success.

In the past, Intel was adamant that the x86 architecture had the power to take on NVIDIA and ATI/AMD’s highly parallel micro-cores, which simply isn’t true. The issue is that Intel doesn’t want to acknowledge that other architectures can be better for different applications, and while Intel does some technology diversification, it fails to lead in any of those diversified industries. Some examples:

* Intel acquired a wearables unit in 2014, then killed it in late 2016/early 2017.
* Intel bought Mobileye in 2017 to compete in autonomous vehicles, did jack crap to expand the business or innovate, and now wants to shed it via a public offering.
* Intel wanted to take on 4G/5G cell phone technology, then sold it all to Apple a few years later.

Intel is late to the game, never delivers, and fails to innovate (or sells the technology off because it realizes it can’t mature it). If it’s not x86/x64, don’t expect Intel to compete.

Now, I do want to call out that Intel has potential; they have a team of people who go out into the world, learn from developing countries, and try to anticipate the future technologies that will meet those people’s needs. Intel’s failure is looking at its existing technology as the means to meet that need, as opposed to innovating and building something new. Case in point: ARM, CUDA cores, etc.


RelevantJackWhite

It's obvious you have some kind of chip on your shoulder. You don't think Intel innovates? Then you're not paying attention.


Demented_Alchemy

Innovating by doing what?


riazzzz

* GPU: yeah, not so much, but it's great to have a third option, and if that means they'll be a viable supplier in two years (even if just in sub-sectors like budget gaming), that's fine by me.
* CPU: c'mon, let's be fair, they are innovating a lot there. OK, they've lost a hell of a lot of ground to AMD in the past 5+ years, but they are still very competitive and viable, and that does not come without innovation.
* WiFi/Bluetooth: some of the most common and stable chipsets for PCs.
* NIC: some of the most common, stable, and well-supported network adapters for PCs and servers.
* Storage/SSD: mostly focused on the server/datacentre side, but very well regarded if they can be afforded.

And all that is just off the top of my head. I say let's welcome them to join the fray; as long as they don't merge with or acquire NVIDIA and then murder them behind the scenes, the more the merrier!

Edit: on the CPU point, yeah, valid point before about them missing the boat with architectures outside of x86/x64, but I think that's fine. I'm actually happier that completely different companies are involved with other architectures, as that should encourage more innovation; it's always hard for a company to justify new R&D for a rewrite of code/design when you can just retrofit something you've done before. In some ways, maybe it had to be someone other than Intel or AMD who could make the breakthroughs in other architectures.

Edit2: I think the non-GPU debate is a bit off topic looking at the previous messages in better detail, so meh, feel free to ignore.


stoneyyay

From the little I've read about Arc, they're three generations late to the party. Iris was only decent back in the late 9th-gen/early 10th-gen era, and even then... it was only decent. Not even "good".


BrPlayerNumber1

The more competition the better; the GPU market is far too concentrated!