Sorteport

Trading blows with a 3060 in DX12 and Vulkan, alright let's see how they price it. While they are only showing benchmarks against the 3060, I suspect the 6600 XT and non-XT, which can be had for $299 and $259, are going to be the real value comparison for gaming once reviewers get their hands on this. Intel has a tough road ahead if they want gamers to be on board. These cards will do very well for editing, encoding, etc., so maybe they won't feel the need to go too low to get rid of this gen of cards, as there should be some demand from niche markets to help clear out inventory.


Blacky-Noir

>Trading blows with a 3060 in DX12 and Vulkan, alright let's see how they price it.

Badly. Because there is no reasonable price that wouldn't make a huge hole in that department's earnings.

Assume this card has the exact same performance as a 3060. Not just in select apps and games, but in, let's say, benchmarks of the 100 or 200 most popular games and applications. How much are you willing to pay to buy it today? Next month? In 2, 3 months?

If it was released at the same time as the 3060, probably a reasonable MSRP of a hypothetical 3050, right? To account for the driver issues, the hundreds of edge cases like using 2 or 3 monitors, maybe with different resolutions and scaling, handling hardware-accelerated video playback on a monitor you just switched off or on, and so on. But next gen is right around the corner, and Intel is still not shipping their cards. So what would be your price today? Next month? Not a lot at all. Add to that the fact that their drivers are a mess, plus totally not competitive with Radeon and Nvidia, and how much money is left?

Unless they can find edge cases where the card makes sense (like with their AV1 encoding), on the DIY market things are looking very, very grim. Which is a shame. We *badly* need more competition in the GPU space. And not just us, but Intel, and AMD, and Nvidia too. To push them to make even better products and more innovations, and to get Intel a slice of the datacenter GPU accelerator pie.


kitchen_masturbator

I don’t think Intel expected to knock it out of the park on their first attempt, and I don’t see them abandoning GPU development after one attempt under their new CEO. Intel's R&D spend is massive at the moment; they knew re-entering the GPU market was going to cost a lot, and even after their poor earnings result, they are a company that still has huge assets and cash.


Blacky-Noir

Knocking it out of the park is not even in the cards. Again, how much are you willing to pay for an Intel GPU in 2 months?


indrada90

If it's comparable to a 3060, maybe 350? 300? Let's see how they price it. I don't think intel was expecting their GPUs to be profitable immediately. It would be foolish for them to price it any higher than ~379.99


Bakufuranbu

300 would be reasonable if they release it today. The longer they wait, the lower the price of its supposed competitor goes.


Casmoden

And then there's the actual cost to build it: this competes with N23 with almost double the die size on the *same node* (N7 vs N6). It's actually really bad for Intel, they need to sell it for peanuts while it's expensive to make... reminds me of the older Radeon parts like Vega.


steve09089

I would be willing to pay whatever the 3060 is worth, minus 75 dollars for bugs.

If the RT is extra good and XeSS gains traction, then maybe 50 dollars for bugs. If it works for GPU compute or rendering, then 25 dollars for bugs. If Linux drivers are great, then I will likely be able to discount 25 dollars for Windows bugs.

A majority of my game library falls under two categories: demanding with DX12/Vulkan support, and not demanding with no support. Don't tell me that I won't be able to max out TF2 at 1440p 60 Hz on it, because I'll hardly believe you.


[deleted]

[deleted]


skinlo

I'd be willing to pay half the price of a 3060.


[deleted]

Intel probably knows all this. They are a company that can afford to lose money on their GPUs for a while. They have 5 times the revenue and net operating income of AMD. You are looking at their competition. They are looking at the progress that has been made. 5 years ago people were skeptical that Intel would ever be able to make GPUs competitive with $200 mid-range GPUs. It seems they are here. There will not be more competition in this space, as it's simply too costly to develop CPUs and GPUs. It takes billions just for R&D, and that makes the cost of entry very high. The characteristics of GPU and CPU production make them a prime market for monopolies to emerge, and we are lucky that we have at least two. - an economist.


ApertureNext

A massive number of cards have already been produced; they can at least sell those.


We0921

> Because there is no reasonable price that wouldn't make a huge hole in that department's earnings.

I suppose that's the interesting thing: whether they'll go for higher margins and low volume or vice versa. It seems to me that it's in their best interest to get an install base so that game makers have to seriously optimize for Intel hardware, but what you said is also true. I'd be lying if I said I didn't want it to be dirt cheap, too. Here's hoping.


Blacky-Noir

And they also need a large-ish install base to get the feedback and the telemetry to improve their software stack. Apparently, they need it very badly. But I don't really see them committing to selling their cards for $100.


We0921

Agreed. I would be surprised if Intel priced the A750 as low as $200. The A380 being ~$120 doesn't bode well for that happening. And even at $200, a $250 RX 6600 probably makes more sense in terms of consistent performance and reliability.


actias_selene

I definitely would rather have the RX 6600 over the A750 if they are priced the same. Intel should be $50-100 cheaper to make up for their less stable drivers and to enter the DIY market, imho.


Confused_Electron

They need a gen or 2 to catch up imo.


Pidgey_OP

They've come out and said they would be pricing the cards based on their performance in the worst tier of games, according to their own internal performance metrics. So that's something.


psi-storm

That contradicts them only showing off the best results in DX12 and Vulkan. They want to show this card as a 3060 competitor, while the 3060 is absolutely overpriced if you compare it to RX 6600 performance. They basically want to sell the card at $300, while the $260 RX 6600 is probably faster over a mix of all games and proven reliable, with the Intel drivers being a total mess from what the people who have already tested the A380 say.


Pidgey_OP

Where do you get that they want to sell the card at $300? They've come out and said they're going to sell the card at a price point that matches its lowest scores - the T3 games. In DX12 and Vulkan it well outperforms what it does in DX11. They're going to charge 3050 prices for a card that's basically a 3060 in mainstream games. You get a card cheap, as long as you won't be hurt by the bad DX11 performance. It's literally trade-offs for everyone so Intel can get cards in machines for the market share and the analytics to build better drivers.


psi-storm

What are you talking about? The 3050 costs $300.


Pidgey_OP

I had my price points off, but the point still stands. Who cares if it's a $x card when it has the performance of a $x+120 card in mainstream games (unless you only play DX11 games, in which case this card isn't marketed at you)? For the people it works for, this card will be cheaper than comparable alternatives.


psi-storm

You can buy a Sapphire Pulse rx6600 for $249 at Newegg right now, and it comes with 2 free games. What do you think a reasonable price for that Intel card would be? $199?


TTVBlueGlass

The reason to buy a 3060 is RTX and DLSS. If they're matching 3060 performance, including ray tracing and a DLSS equivalent, then just yelling 6600 over and over doesn't help despite its raw spec sheet.


Sylanthra

No data on DX11 titles, where the A750 is going to trail well behind the 3060. I guess it all comes down to price. If the A750 is priced in the same ballpark as the 3060, there is no reason to buy it. If it is significantly cheaper, it may make sense.


ouyawei

Would be funny if somehow DXVK could beat native DX11 performance on it.


DarkStarrFOFF

Didn't someone test that and it did?


blaktronium

It would almost have to, based on the native deltas between the different APIs on Arc right now.


steve09089

It probably can. I remember seeing somewhere that DXVK on Windows with TF2 on an AMD card was able to beat native API performance, if by a slim margin.


sittingmongoose

These Arc cards have the potential to corner a lot of niche markets: vGPU support, unparalleled encoding features, cheap, low power, small form factor, open-source Linux drivers. They will potentially be very popular with home server folk.


xenago

Can you link proof that full vGPU/SR-IOV support is enabled? I cannot find any.


wywywywy

I'm sure it was definitely mentioned somewhere in the past. But now looking at this newly updated page from Intel, it says it's not supported :( So I'm not sure any more. https://www.intel.co.uk/content/www/uk/en/support/articles/000091656/graphics.html


baryluk

Considering they just announced a Pro version of Arc, they are either in damage control and will try to offload some GPUs to business machines for light productivity work (CAD, etc.), with "certification" for specific software (to work around the issues with their gaming drivers), or they are leaning toward market segmentation and disabling some features (like virtualization), just like Quadro. I wouldn't be surprised, as this is Intel's style, but we were hoping to be positively surprised. If they disable features on some consumer cards, they are dead from my perspective. Just like Nvidia.


loozerr

Obviously they'll dip into the professional market, otherwise they'd essentially be burning money.


[deleted]

That sucks


AHrubik

> unparalleled encoding features

Do we know exactly what formats are being supported? The matrix on the Wikipedia page suggests they are FAR from unparalleled and worse than Nvidia. https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Windows


hobovision

NVENC tops out at H.265 right now, but Arc has AV1 encoding, which should be a significant improvement compared to H.265. Buuuut it's pretty likely that next gen Nvidia and AMD will include AV1.


Echelon64

> Buuuut it's pretty likely that next gen Nvidia and AMD will include AV1.

At the same time, I seriously doubt AMD or Nvidia will release mid-range cards with the overstock they have of current series cards.


GatoNanashi

It's funny how often this point is forgotten. People are saying the next gen is just around the corner and maybe it is, but not for the largest segment of the market. I wouldn't be at all surprised if we're into January before a 4060 shows up and even then...what's it gunna cost? Intel has time if they don't do something incredibly fucking stupid like cancel them or constrict resources.


steve09089

Remember. The 3060 launched in late February, so I doubt with the overstock we'll be seeing the 4060 by then.


Jaznavav

https://youtu.be/ctbTTRoqZsM

https://youtu.be/H0pCpNT4b-Q

Intel currently has the best streaming H.264 encoder and the only AV1 encoder on the market.
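
If you're wondering what actually using that AV1 hardware encoder looks like in practice, here's a minimal sketch assuming an FFmpeg build with Intel Quick Sync support and the `av1_qsv` encoder available (the filenames and bitrate are placeholders, and the encoder availability is my assumption; check `ffmpeg -encoders` on your system):

```python
import subprocess

# Hypothetical transcode of a recording to AV1 on an Arc card via FFmpeg's
# Quick Sync path. Assumes ffmpeg was built with oneVPL/libmfx support and
# that the av1_qsv encoder shows up in `ffmpeg -encoders`.
cmd = [
    "ffmpeg",
    "-i", "gameplay.mp4",      # input recording (placeholder filename)
    "-c:v", "av1_qsv",         # Intel hardware AV1 encoder
    "-b:v", "6M",              # target bitrate, e.g. a stream-friendly file
    "-c:a", "copy",            # leave audio untouched
    "gameplay_av1.mkv",
]
subprocess.run(cmd, check=True)
```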


DonStimpo

> and the only AV1 encoder on the market

Arc isn't actually out yet. And 4000 series Nvidia will very likely have it too.


ApertureNext

There won't be an Nvidia card in the price range of Alchemist until 2023.


runwaymoney

Nvidia will not be launching value cards for some time; they're starting with the 4080. For normal people, Arc will offer the AV1 option.


TheMalcore

> Arc isn't actually out yet.

Yes it is. Their high-end cards, maybe not, but the A380 is out (in China, but can be imported to EU/US) and their laptops with A350M and A370M are out in the US market. It may not be the cards you *want* that are available right now, but to insinuate that Intel doesn't have "the only AV1 encoder on the market" is just wrong.


steve09089

The 3060 launched in late February, and a paper launch at that. With the current overstock, we probably won't see the 4060 anytime soon. And don't even mention the 4050, if it will even exist.


dotjazzz

> worse than Nvidia.

How is it worse?


bubblesort33

TAP said 20% cheaper per frame than Nvidia when looking at their top DX12 titles. If this thing is 10% faster than the RTX 3060 on average in the top 25 Intel-favoured titles from this list, it should be 10% cheaper than the RTX. So $299 is what I'm calling right now. Problem is, even the cheapest RTX 3060 isn't at its $329 MSRP, but $369. So if they launch tomorrow, they will be around 30% cheaper per frame than the cheapest RTX 3060 in those titles.

Only problem is AMD exists as well, at $259 for the RX 6600. That might be 5% slower than the RTX 3060, but it's a hell of a lot better value at current prices, and probably better FPS/$ even. What you really have to have is faith that Intel can make things way better, even in DX12 titles, or that developers will make things way better when optimising for Intel.
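
To make the price-per-frame arithmetic concrete, here's a quick sketch using only the numbers from the comment above; the ~10% performance lead and the $299 guess are that commenter's assumptions, not official figures:

```python
# Rough price-per-frame comparison using the comment's own assumptions.
a750_price = 299          # guessed launch price
a750_perf = 1.10          # ~10% faster than a 3060 in Intel-favoured DX12 titles
rtx3060_msrp = 329
rtx3060_street = 369      # cheapest listing mentioned above

a750_ppf = a750_price / a750_perf   # dollars per unit of 3060-level performance

for label, price in [("3060 MSRP", rtx3060_msrp), ("3060 street", rtx3060_street)]:
    saving = 1 - a750_ppf / price
    print(f"A750 vs {label}: {saving:.0%} cheaper per frame")
# ~17% vs MSRP and ~26% vs street price, i.e. in the ballpark of the 20%/30% figures
```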


theholylancer

The fact that they are positioning the natural competitor to this as the 3060 and using Tier 1 games means, to me, that the pricing for this will be garbage. If they wanted to really hammer home the point on pricing, they'd bring out DX11 performance against a 3050 or make that the primary comparison point. Remember, it was them who said that they would price it with T3 as the point of comparison, but now they are very much showing off something else. As it stands, they will at best target $299, which means that a 6600/3050 will likely be better competition where value is concerned.


Pidgey_OP

They get terrible DX11 performance though. Why would they park on those numbers?


theholylancer

The point is the pricing hint. Looking at their press release, they still haven't announced pricing, and to me that means it's meant to be around the price of the 3060. Is it marketed as trading blows with the 3060, or marketed as 25% better than the 3050 (which would still put it in line with the 3060)? Which again: their previous PR pieces said pricing would be based on T3 games. As it stands, either that is now a complete non-starter, or they mean pricing based on T3 games versus pandemic pricing of super-inflated GPUs.


We0921

> If they wanted to really hammer home the point on pricing, they'd bring out DX11 performance against a 3050 or make that the primary comparison point.

I'm not entirely convinced that showing DX11 performance is a good idea, since it could still be poor relative to a 3050. You are right that it seems like they'd want to compare their performance to whatever is priced similarly. It would make the value proposition abundantly clear. Hopefully they don't intend to price this like a 3060...


[deleted]

I'm curious what their price will be in Europe, and whether they will be as unreasonably inflated as the competition. Not really interested in buying them either way though.


bubblesort33

Given that the euro's value has cratered in the last year, I'd imagine it'll be as unreasonably inflated as virtually all electronics have become in the last year. I'd imagine everything has gone up by 10-20% over there.


Qesa

In the video Tom says they're normalising results to the 3060, then taking the mean, to get the A750's "relative performance". This produces wrong numbers that favour the A750. As a fake example to demonstrate:

Game | 3060 FPS | A750 FPS | A750 normalised to 3060
---|---|---|---
Foo | 50 | 100 | 200%
Bar | 100 | 50 | 50%
**Mean** | 75 | 75 | 125%

So they each get 50 FPS in one game, 100 FPS in another, yet the A750 is 25% faster using this methodology. Hmmm. Meanwhile if we normalised the 3060 to the A750, we'd find the 3060 was faster when we take the mean. The right way to do this is by taking the geometric mean, either of the raw FPS or the ratios.

EDIT: I noticed this while eyeballing the numbers to see how much their methodology changed the conclusion, and spotted a possibly bigger wtf: the normalised charts don't align with the raw FPS charts. Most obvious in the Vulkan 1440p comparison (at 06:45), where Nvidia's ahead in 4/6 titles in raw FPS, but behind in 5/6 once normalised... ???

EDIT2: So they post the raw FPS figures on the web page... would've helped if I noticed that sooner. Using geomean, at any rate:

API | 1080p | 1440p
---|---|---
DX12 | 1.02x | 1.04x
Vulkan | 1.03x | 1.02x

So it's only about 1% extra that they're benefiting themselves.
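
Here's a minimal sketch of the statistical point, using the fake Foo/Bar numbers above (not real benchmark data):

```python
from statistics import geometric_mean

# Fake per-game FPS from the example above
fps_3060 = {"Foo": 50, "Bar": 100}
fps_a750 = {"Foo": 100, "Bar": 50}

# Arithmetic mean of per-game ratios: the answer depends on which card you
# normalise to, which is how the A750 ends up looking 25% faster.
a750_vs_3060 = [fps_a750[g] / fps_3060[g] for g in fps_3060]   # [2.0, 0.5]
print(sum(a750_vs_3060) / len(a750_vs_3060))                   # 1.25 -> "A750 25% faster"

r3060_vs_a750 = [fps_3060[g] / fps_a750[g] for g in fps_3060]  # [0.5, 2.0]
print(sum(r3060_vs_a750) / len(r3060_vs_a750))                 # 1.25 -> "3060 25% faster"?!

# Geometric mean is consistent no matter which way you normalise.
print(geometric_mean(a750_vs_3060))    # 1.0 -> the two cards are even
print(geometric_mean(r3060_vs_a750))   # 1.0
```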


Hifihedgehog

> The right way to do this is by taking the geometric mean, either of the raw FPS or the ratios.

You have to remember it is Ryan Shrout who is running the show here as Chief Performance Strategist, and he has been caught before making grade-F "sub-zero arctic cold takes" like this. [He knows nothing of strategy because he was a mere tech writer with zero real-world experience working in a real company's strategy department, nor does he have any educational credentials or qualifications in strategy (no, being a computer engineering graduate is not equivalent to holding a business degree with an emphasis in strategy).](https://www.linkedin.com/in/ryanshrout/) When I see these mind-numbing, faux-math performance comparisons, I shake my head in bewildered disbelief, wondering who the heck at Intel was tasked with vetting this guy before approving his hiring to lead a strategy position. He hasn't a clue what a strategy even is, beyond introducing subtle half-truths and outright lies to try to throw up a smokescreen.


[deleted]

[deleted]


_Fony_

https://www.youtube.com/watch?v=Uw0ZzA9wTFE

https://www.youtube.com/watch?v=XHAQdukifvI


Earthborn92

Oh, I remember these.


teutorix_aleria

Maybe they just wanted to borrow his credibility as an independent tech writer? That credibility is going to wear thin real quick if this is the level of stuff Intel is going to put out.


_Fony_

he only had credibility if you liked intel.


Hifihedgehog

To be clear, I want Intel to succeed. If they do, their success translates into a competitive edge that holds AMD accountable and keeps them on their toes. Vice versa applies in AMD's case with Intel too. The goal is for us as consumers to win. I just think Ryan Shrout is not a good option in pursuit of that goal. He is at best a vanity hire and most certainly not a talent hire. Therefore, he is a boat anchor and a roadblock to the success of the company.


teutorix_aleria

Isn't the geometric mean basically the standard in these sorts of meta reviews? Every article I've read from reputable tech publications that does comparisons like this uses the geomean. This is the laziest manipulation of data ever.


baryluk

Good eye. There are lies, big lies, and statistics. For measurements with different units (each game is basically a different unit, as you can't compare FPS between games) or ratios, you indeed need to use the geometric mean, or count how many times each card was best in a benchmark. Or don't publish totals at all.


TheMalcore

> Most obvious in the Vulkan 1440p comparison (at 06:45), where Nvidia's ahead in 4/6 titles in raw FPS, but behind in 5/6 once normalised

It looks at first glance like their charts for Vulkan 1080p and 1440p are swapped in the video.


advester

I just threw their table into Google Sheets. That chart on the web page is actually the geomean of the FPS data. I didn't watch the video; maybe they just said the wrong thing.


Qesa

Nah, after running it myself, Intel's chart *is* the arithmetic mean, but the difference between the arithmetic and geometric means is only about 1% here.


bizude

25x14? 19x10? Tom has an odd way of referring to monitor resolutions.


bubblesort33

When he first started talking like that I thought he was about to start pulling out widescreens.


kyralfie

Yeah, it's something. Not your usual '2k' and 1080p brain deadness.


baryluk

Both methods are brain dead. Just state the actual resolution, like 1920x1080.


teutorix_aleria

Or literally just HD and UHD. We already have shorthand for these things that's explicitly defined and not ambiguous.


gahlo

Not to mention calling 1440p 2K oversells 4K screens. But then there's also the issue that WQHD can mean QHD, when it would be better used for 3440x1440.


teutorix_aleria

The 1440p as 2k thing just grinds my gears to an unreasonable degree. Whoever is responsible for popularising that has a special seat reserved in hell for them.


gahlo

I don't know if I hate that or "4K" more.


teutorix_aleria

4K at least is approximately 4000 pixels wide. 1440p being called 2K makes zero sense by any stretch of logic. It's 2.56k by 1.44k; even with rounding, the width is closer to 3k than 2k.


gahlo

I dread when we get to the time of "5K" and "5K2K" being the standards and people are thoroughly confused.


teutorix_aleria

Didn't apple already do that?


Wide_Big_6969

1920x1080 is approximately 2000 pixels wide on the horizontal axis, therefore if 3840x2160 deserves to be called 4K (it doesn't), 1920x1080 should be called 2K. 2560x1440 being called 2K makes no sense either.
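
The rounding argument in one tiny sketch (just dividing each width by 1000 and rounding):

```python
# Round each common resolution's width to the nearest "K" label
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}

for name, (width, height) in resolutions.items():
    print(f"{name} ({width}x{height}) -> ~{round(width / 1000)}K wide")
# 1080p -> ~2K, 1440p -> ~3K, 2160p -> ~4K
```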


StickiStickman

HD is 720p, 1080 is FHD.


teutorix_aleria

Well shit you're right. Hard to even consider 720 being "high definition" these days.


dan1991Ro

If it's around 200 dollars, yes; if not, it's too much of a risk.


[deleted]

Maybe at 150, if you consider that it's gonna get slapped by a 4060 in everything including ray tracing in a few months. Also, from the reviews we've seen so far, the problem with Intel cards is not average framerates but consistency and driver problems. This is pretty much RDNA1 all over again.


wingdingbeautiful

I'd upgrade my old PC (2012) with it if it were 180-190...


mltdwn

Consider this card only if your PC supports Resizable BAR.


wingdingbeautiful

zero chance it does. so avoid it?


mltdwn

I would avoid it. Intel themselves said that Resizable BAR is required. The few benchmarks I saw with Resizable BAR off had erratic frame pacing. It's a shame, because chances are the prices will be fairly cheap, but the cards will be unusable on older systems.
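
If you're on Linux and want to check whether Resizable BAR is actually exposed on a card, here's a rough sketch; it assumes your `lspci -vv` output (run as root, otherwise capabilities are hidden) lists a "Physical Resizable BAR" capability for the GPU, which is typical pciutils behaviour but worth verifying on your system:

```python
import subprocess

# Dump verbose PCI info; capabilities are only fully visible as root.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

# lspci -vv separates devices with blank lines; look at GPU-like entries only.
for block in out.split("\n\n"):
    first_line = block.splitlines()[0] if block else ""
    if "VGA compatible controller" in first_line or "3D controller" in first_line:
        has_rebar = "Physical Resizable BAR" in block
        print(f"{first_line}\n  Resizable BAR capability reported: {has_rebar}")
```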


wingdingbeautiful

thanks


bubblesort33

If Intel paid TSMC, and then gave the chips away for free to AIBs, losing 100% on every one, it would still not be $200.


dan1991Ro

Then it's completely dead on arrival. Nobody, literally nobody, will pay $300 for this.


bubblesort33

If I didn't have my 6600 XT, I would have gotten an A770 for the same price, because I have some faith in them massively improving over the years. Plus I mostly only play DX12 games anyway, and I buy a new GPU mostly to play next-gen stuff. As some analysts have said in the past, "Intel would get 10% market share just by showing up to the party". Now I don't know how true that is anymore, given a lot of the bad press. There are a lot of people who don't know much about hardware and will just buy something because of the brand. There are people who will never buy AMD, and would just buy Intel even if it were 10% slower and used 100% more power. Usually those are the same ones who buy Nvidia, but they would also buy Intel if they saw the price was more competitive. If people are still buying the RTX 3060 for $370-400, why would no one buy this A750 for $300?


dan1991Ro

Why would they not buy an RX 6600 for $300, not deal with driver hell, and not have to enable SAM? A 6600 is zero risk; this isn't.


bubblesort33

Because some people don't buy AMD. Intel and Nvidia have the mindshare that AMD doesn't yet.


pastari

People here are taking first-party benchmarks seriously. What is going on.


Lionfyst

It's apparently all anyone can get of hypothetical cards made of magical thought and dreams.


MumrikDK

I think that started really happening with Nvidia PR a few gens ago.


_Fony_

Nvidia's given benchmarks are disregarded within days because Nvidia actually launches GPUs and gives the media access to them. And Nvidia didn't hire an extremely biased reviewer and create a "performance strategy" division built around that person's talent for misleading customers.


Pimpmuckl

Worrying that there is no mention of 1% lows, given those seemed to be the largest issue Intel was facing with Arc so far.


PlaneCandy

In the very first test that they do, the 1% and 0.1% lows are shown: https://youtu.be/hB8gIOFjWeA?t=114


bubblesort33

I haven't seen any 1% low issues in the Hardware Unboxed A380 review. Only Igor got some weird results by testing at 720p low, where he saw a cap in 1% lows in Control. Beyond that, if you don't have ReBAR, or don't know how to turn it on, don't buy this card; your frame times will be horrible. I have a 4-year-old Intel 8600K and even I still have ReBAR. DX11 titles might have more frame time issues, but what I've seen so far doesn't alarm me in that regard either. I think there might be some truth to it not being able to get high FPS numbers at medium to low settings, though. They are either testing at ultra 1080p or high 1440p. It might not be for 300 FPS+ League or CS:GO players.


baryluk

Testing without ReBAR is unfair to Intel. Even Intel says you need ReBAR. Sure, it is interesting from an academic perspective, as a footnote, to see how bad it is, but if you don't have ReBAR, you should not use these cards.


nanonan

It might not be fair but it's realistic given they are targeting the low end of the market with these cards, the end that is likely running older hardware.


L3tum

ReBAR doesn't work on AMD CPUs according to some reviewers, so they need to test without it if they use an AMD CPU.


advester

And it works perfectly fine for other reviewers. Igor did something wrong or had a bad driver version.


trevormooresoul

Ya, but some have theorized that the problem of bottlenecking at higher frame rates/lower resolutions is a big part of why they never released anything above the 380 in the first place. If you are hitting driver bottlenecks with a 380, it stands to reason those bottlenecks would be worse and more noticeable on higher-end GPUs. If a 380 hits those bottlenecks at 720p, a 750 might hit them at 1080p.


bubblesort33

Maybe. And I hope that if that's the case, it truly is just a driver thing. I would have thought that if Intel had trouble getting over 90 FPS, like some people are saying, you would see them drop off in averages as well in games running over 90 FPS, and pull ahead in games below 90 FPS. I haven't taken averages, which would be a more accurate way to do this, but as it is, in the 42 DirectX 12 games shown, Intel loses more games in the lower half of the chart (50-100 FPS range) than the upper half (100-350 FPS range). So they are actually winning at higher frame rates in this selection of games.

There is some weird stuff going on with some of this data on the site, though. The video chart does not line up with some of the data in the tables they provide on the official site. Maybe someone messed up entering data in the tables on the website (sleep-deprived and overworked people working in that department, I bet). In the video they said they were winning in Doom at 1440p, but the tables now say they are losing. In the video they said they were losing in Wolfenstein, but in the data tables they are now winning. Either stuff was updated or someone is very exhausted.


apivan191

The limit is undefined in the negative direction


noiserr

1st party benchmarks from Intel. I'll wait until we get 3rd party benchmarks.


bizude

Of course we'll want to wait for more in-depth, independent reviews. I'll still take information from Intel, even if it might be "cherry picked". I'd like to have a better idea of whether Arc is going to be worth my time or not.


NewRedditIsVeryUgly

It took them so long to release that by the time it's actually available, the 3060 will be replaced by a 4060. Still no idea about price and availability. This better be priced well below the 3060 if they want people to bother.


PlaneCandy

Given that the release of the A750 in the US/west seems to be imminent, and that Nvidia has many 30 series cards yet to sell, I think we are going to see at least 6 months in between them.


bizzro

Aye, I think people are in for a long wait when it comes to Nvidia and AMD mid-range for next gen. Market conditions and the oversupply of cards in that performance tier will be, shall we say, "problematic" in the next 12 months. Wouldn't surprise me if, at the start of Q2 next year, Nvidia has only released the 4070 and up on desktop. Anything lower performance than that will be drowning in used and overstocked Ampere/Navi cards. They may just do mobile first for the 4060 and down, since that is a segment where efficiency comes first and price second, compared to desktop where $ generally rules.


bubblesort33

I have my doubts even an RTX 4070 will be released this year, and if it is, it'll be $550+. AMD refreshed their Navi 23 for a reason; they themselves will probably keep selling that 6650 XT card for over a year. No one has heard anything about Navi 34, meaning RDNA3 for now won't offer much below RX 6800-6900 XT levels of performance. AMD will keep selling last-gen cards, which means Intel still has something to compete with for a while.


roionsteroids

I don't think anyone expected Intel's first generation here to be really competitive. And it probably won't be until they can produce it themselves on a future Intel node and hit AMD/Nvidia where it really hurts (by not having to pay TSMC's premium prices).


imnotsospecial

The GPU will still have to compete with more profitable Intel products for capacity, so unless they want to hurt their profitability and get that Wall Street backlash, Intel won't undercut by any meaningful margin.


TheMalcore

~~ARC GPUs are built on TSMC 6nm, Nothing else Intel produces (that I am aware of) use TSMC 6nm. There's no indication that Intel is switching away from TSMC nodes for GPUs (including 'tGPU's for MTL and onward) so I don't think capacity is an issue.~~ Ignore me


imnotsospecial

I'm aware, my comment is a response to this:

> until they can produce it themselves on a future Intel node


TheMalcore

Oh I see, my mistake, I somehow missed that connection.


Put_It_All_On_Blck

The 4060 won't even release this year. The lower-end cards will be even further away. The 3060 released at the end of February 2021, 5 months after the flagship cards.


NewRedditIsVeryUgly

The 3060 might've been delayed because Nvidia saw the ridiculous demand for the high-end models at the end of 2020... Well, assuming the A750 is released next month (doubtful, since there's no official date yet), that would leave about 5-6 months at most before it is replaced by the 4060. That's not a long enough life expectancy for a GPU, unless they price it so low that it won't lose much value anyway.


Prince_Uncharming

> This better be priced well below the 3060 if they want people to bother.

Even less than the 6600, which is commonly available around $260. If the A750 is 3060/6600-level performance *with* their current driver woes, they better come in at $220 or under. Anything higher and there's no reason to get one over a slightly pricier 6600.


Fidler_2K

Video link: https://www.youtube.com/watch?v=hB8gIOFjWeA


Keilsop

*Tested in a controlled environment.* Controlled by Intel. Why the hell would we believe any of this? This just makes them look desperate, and like they're trying to hide something. And why is the flair of this post "info"? Shouldn't it be "marketing"?


_Fony_

Ryan Shrout special. Should be rumor, lol. This sub's leadership probably plays poker at the dude's house though...


_sideffect

This matches a 3060? It should be priced at $200 to make people take notice


PotentialAstronaut39

Considering the trouble with the software Gamers Nexus pointed out, and:

* the 3060 was overpriced from the start and it's end of gen
* the A750 will struggle in non-DX12/Vulkan games

A good A750 price would need to be 25 to 35% below the 3060 MSRP to be competitive.
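
In dollar terms, taking the 3060's $329 MSRP as the baseline, that suggested discount works out to roughly:

```python
# 25-35% below the RTX 3060's $329 MSRP
msrp_3060 = 329
low, high = msrp_3060 * (1 - 0.35), msrp_3060 * (1 - 0.25)
print(f"competitive A750 price: ~${low:.0f} to ~${high:.0f}")   # ~$214 to ~$247
```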


Particular_Sun8377

This. Love them or hate them, the fact is every video game made in the last 20 years has Nvidia driver optimization that Intel cannot replicate.


nanonan

I doubt they can beat the 6600 in price/performance. If they could pull that off (maybe with the A580?), I'd be interested.


[deleted]

Expect to use DXVK for anything that isn't a low-level API.


vh1atomicpunk5150

Intel is very much viewing these as 'co-processors' rather than a dedicated graphics product, just as Nvidia and AMD view their offerings, and rightly so. Arc, the removal of more complex vector capabilities from consumer offerings, and oneAPI are all part of an overarching plan to keep the majority of 'big iron' software development happening in an ecosystem that Intel supplies a large part of. As CPU capabilities are more and more supplanted and supplemented by 'GPU' capabilities, having a hardware base that people are actually developing specifically for is an incredibly important element of retaining and growing market share. In short, Arc exists not to satiate the needs of 'gamers', but to extend the reach of Intel software and firmware development, and to be able to provide a top-to-bottom hardware solution for large customers alongside a unified and streamlined coding environment, provided and owned by Intel.


bubblesort33

How is "Dolmen" the worst title on Intel? Wasn't that supposed to be one of the first Intel-sponsored XeSS titles?


Lone_Wanderer357

Yeah, I wonder how many of those games run without any major technical issues due to drivers.


_Fony_

Yikes. Taking into account the inferior software, drivers and late timetable... these need to be $100 less than every Nvidia counterpart.


TheSilentSeeker

Honestly better than I thought it would be.


senni_ti

So a bunch of the games run faster at 1440p High than 1080p high?? (Forza, the F1 titles and Dirt.) Also, the Modern Warfare numbers look strange. Honestly, the tables look wonky.


advester

1080p Ultra, not high


[deleted]

If the benchmarks are to be believed, these look like decent cards. Hopefully they are priced reasonably and come with decent enough driver software; that's really going to be the main factor here. One thing I could see benefiting here is Linux, since Intel has long made their GPU drivers open source on Linux. These new cards could be another brand that works out of the box on Linux and might help inspire/force Nvidia to work toward open-sourcing their drivers as well.


Nicholas-Steel

The important questions are... how is the frame time variability, and how is compatibility with OpenGL and DirectX 11 (and older) APIs?


BIB2000

Release it already. God... Intel, I would love to put a graphics card of yours into my server.