Intel Arc A750 graphics card on par with RTX 2080 in official benchmarks

Written by Chad Norton on Fri, Jul 29, 2022 10:58 PM

Intel has yet to announce an official release date for their first-generation Arc A-series graphics cards, while Nvidia and AMD remain on track to launch their next-gen GPUs by the end of the year. So in order to make up for their late entrance, the Blue Team has been unusually transparent about the new hardware recently, and that includes some official benchmarks from Intel themselves…

Intel’s latest video on their Arc graphics cards showcases some key details of the upcoming A750 GPU. It’s not the top-end card of the series, but it is one of the higher-end models. In the video, Intel’s own Ryan Shrout shows off the Arc A750 in action in Death Stranding, achieving around 90fps at 1440p.

(Skip to around 1:08 in the video for the Death Stranding FPS performance)

That’s pretty impressive for Intel’s first outing in the dedicated GPU market, especially since it isn’t the flagship card of the series. In fact, based on our very own Death Stranding PC benchmarks, the Arc A750 performs on par with an RTX 2080 at 1440p, which delivers around 87fps at the same resolution and graphics settings (set to Default in the video above).

Death Stranding average FPS at 1440p:

GPU      | Low  | Medium | Default | Very High
RTX 2080 | 99.4 | 97.3   | 87      | 78.5
Arc A750 | -    | -      | ~90     | -
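
For a rough sense of the gap, here's a back-of-the-envelope comparison in Python using the Default-preset numbers above; note the A750 figure is an approximate read-off from Intel's video, so treat it as a ballpark:

# Back-of-the-envelope comparison using the 1440p "Default" preset
# numbers from the table above. The A750 figure is approximate.
rtx_2080_fps = 87.0
arc_a750_fps = 90.0  # read off Intel's video, not a measured result

delta = (arc_a750_fps - rtx_2080_fps) / rtx_2080_fps
print(f"Arc A750 vs RTX 2080 at 1440p Default: {delta:+.1%}")
# prints roughly: Arc A750 vs RTX 2080 at 1440p Default: +3.4%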

That makes the A750 a promising contender against Nvidia’s RTX 30 series and AMD’s RX 6000 series, especially if Intel prices the GPUs right (the RTX 2080 cost $699 at launch). But launching so close to the RTX 40 series and RX 7000 series, which are rumored to bring massive performance leaps over even the previous generations, may still be a bit too late for some PC gamers.
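
Since pricing is the open question, here is a purely hypothetical fps-per-dollar sweep; the RTX 2080's $699 MSRP comes from the paragraph above, while every A750 price point below is an assumption, as Intel has announced none:

# Hypothetical value check: fps per dollar at launch prices.
# The $699 RTX 2080 MSRP is from the article; the A750 prices
# below are pure assumptions, since Intel hasn't announced one.
rtx_2080_fps, rtx_2080_price = 87.0, 699.0
arc_a750_fps = 90.0  # approximate, from Intel's video

baseline = rtx_2080_fps / rtx_2080_price
for price in (249, 299, 349, 399):
    ratio = (arc_a750_fps / price) / baseline
    print(f"A750 at ${price}: {ratio:.2f}x the RTX 2080's launch fps/$")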

What do you think? How do you feel about the performance benchmarks for the Intel Arc A750 GPU? Are they better or worse than you expected? How much do you think this card should cost for its performance? And do you think it would be a good purchase right before RTX 40 and RX 7000 even with a price cut? Let us know!

Rep -12 | 09:22 Jul-31-2022

Hmmm, so over a generation behind, and not even to the standard of a Ti. Poor show Intel, the people deserved better. This better be cheap as chips or they might as well cut their losses in the GPU market.

1
Rep 1 | 10:52 Jul-31-2022

That's their mid-range GPU.

3
Rep -12 | 15:27 Aug-01-2022

Depends on who you ask whether a 3060 is "mid range"; personally I wouldn't buy anything lower than an xx70 minimum. xx60 cards just don't last the time I want from something I'm spending £300+ on. I'd rather spend £400-£500 and have it last at least 4-5 years running games on ultra, like my 1070 did.


But either way, we're talking about something in the lower-to-mid tier of last-gen stuff just as the current gen is about to become last gen. Like, what's a 3060 going to be worth after Xmas when you can get a 4060 for £350-ish? Maybe £199, £230 at a push? So even the A780 is probably going to be around a 3070-3070 Ti, the actual mid-point of the about-to-be-last generation, so worth £300-£350 after Xmas?


Fine if all you want is a budget card, but not everyone does.

0
Rep 1 | 18:25 Aug-02-2022

The RTX 3070 is mid-range by definition. A mid-range chip, by that definition, is at or close to half of the reticle limit. The RTX 3060 is low-end.
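
For what it's worth, a quick sanity check of that definition; the ~858 mm² single-exposure reticle limit and the die areas below are approximate public figures, not numbers from this article:

# Sanity check of the "mid-range = roughly half the reticle limit"
# definition. Die areas are approximate public figures; the reticle
# limit is the commonly cited ~858 mm^2 single-exposure maximum.
RETICLE_LIMIT_MM2 = 858.0

dies = {
    "GA102 (RTX 3080/3090)": 628.4,
    "GA104 (RTX 3070)": 392.5,
    "GA106 (RTX 3060)": 276.0,
}

for name, area in dies.items():
    print(f"{name}: {area:.1f} mm^2 = {area / RETICLE_LIMIT_MM2:.0%} of reticle")
# GA104 lands at ~46% (close to half, i.e. "mid-range" by this
# criterion), while GA106 at ~32% would indeed read as low-end.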

0
Rep 105 | 16:33 Jul-30-2022

Pretty meh performance to be honest. It is good that Intel is bringing competition, but they are an entire gen behind Nvidia in raster and RT performance.

2
Rep 1 | 11:48 Jul-31-2022

That's their mid-range GPU. The A780 is their high-end; the A750 is their mid-range.

2
Rep 55 | 14:08 Jul-30-2022

Surprisingly that's not bad.

1
Rep 1 | 12:01 Jul-30-2022

So it's on par with the RX 6650 XT and RTX 3060 Ti; not bad if the price and power consumption are right.

3
Rep 6 | 19:32 Jul-31-2022

And if it had come out six months ago...


As for performance per watt, if the A380 is any measure of the Alchemist architecture's efficiency, the next tier up is going to be behind AMD and even Nvidia in that regard too.

0
Rep 1 | 15:08 Aug-01-2022

For a first attempt it seems alright. We desperately need more competitors in the GPU and CPU market. A duopoly is almost as bad as a monopoly.

3
Rep 6 | 23:42 Aug-01-2022

I think we can all agree that more competition is an absolute must in this space. My worry is that up until now Intel has been stumble-launching their discrete GPUs, and that's not competition, it's a failure to execute. And note that the DG1 was their first attempt.


So far in the second half of this year you have a leveling off of demand, saturated supply and prices continuing to fall. They can't rely as much on tempting buyers to take a chance on them if the price difference ends up being much closer than it might have been months ago, especially if it only runs 2 out of your 5 favorite games properly. Meanwhile the current status is no release dates and more delays until drivers are ready. The current estimate is a late September release for me, but that could easily slip again.


That said, I'd be super interested in a 'who's buying Arc' poll when we finally get some release info, if anything just to get a feel for how it might go...

0
Rep 1 | 01:22 Aug-03-2022

Wait, they released a dedicated GPU 2 years ago? :O


I wasn't aware.

0
Rep 1 | 18:27 Aug-02-2022

Well, the DG1 was the beginning. It's not like these GPUs are their second generation; it's still their first.

0
Rep 6 | 22:10 Aug-02-2022

Yeah, I'm not so sure; the DG1 architecture and process are from nearly 2 years ago at this point and distinctly different from DG2 (Arc A-series). That's not generational?

0
Rep 97 | 05:05 Jul-30-2022

Might actually work out like that if they weren't lazy and hadn't just scaled up the drivers they have now. They didn't understand that a unified driver doesn't mean one driver for all GPUs.

1
Rep 1 | 08:51 Jul-30-2022

Drivers are very hard to code though.

0
Rep 272 | 13:32 Aug-01-2022

Yeah. And some people get paid accordingly for that, so they have to do the job well.

1
Rep 1 | 15:10 Aug-01-2022

True, but if AMD and Nvidia had to wait until their drivers were good to release a product, we never would have had GPUs. For the past 2 years AMD and Nvidia have had crappy drivers; should they have stopped selling GPUs? Should they NOT release the RTX 4000 and RX 7000 series until they make their drivers good?

0
Rep 272 | 13:06 Aug-02-2022

How were Nvidia drivers crappy? I can't vouch for AMD as I don't use their GPUs (outside of the Steam Deck), but Nvidia hasn't given me any problems for a long time, tbh. Though I do tend to own flagships; not sure if this is any different on mid-range cards by any chance?

1
Rep 1 | 18:29 Aug-02-2022

The fact that they are not always stable, and that they get massive improvements in performance with new drivers, means that they are crappy. Now, I don't have the latest GPUs, so maybe that's part of the problem, but I've built many PCs with the RTX 2000/GTX 1600 series and some with the RX 5000/6000 series in the past 2 years, and I've had issues while testing the systems with specific drivers, so I had to roll back drivers in some cases to achieve full stability, so that the brain-dead doctors don't think the PC is broken.

0
Rep 272 | 19:24 Aug-02-2022

Hmm, I don't think I've ever had to roll back a driver with my Nvidia cards. The only time I needed to do that was when I modded the VBIOS of my laptop's (at the time) GTX 980M cards and they wouldn't work with a certain driver version after the flash. Other than that incompatibility with a mod, on the stock cards I've run since, nothing like that; stuff just worked fine.


Granted, I don't update drivers every time there's a new one. I stick with what I have until I need a new software feature or driver support for a new game; then I'll grab whatever is the latest at the time. So maybe I'm just lucky I don't run into problems by skipping multiple revisions of drivers at a time? But then surely I'd run into something...

1
Rep 1 | 01:24 Aug-03-2022

I have the same strategy: I get on a good driver and stick with it until a game prompts me to update it. But when you are building a new PC you install the latest drivers, and I've been hit with duds and many problems from both AMD and Nvidia. If it were for me, I wouldn't mind the occasional driver crash or PC freeze, along with features not working, control panels not starting, color problems, etc., but for a "client" (I do it for free) you have to make sure it works 100%.

0
Rep 272 | 13:17 Aug-03-2022

Hmm, I can't say I fish for particular drivers, to be fair. Whatever I randomly get has always worked; no such things as NVCP not working or color issues. I really wonder if that could be a GPU-class problem at that point, like less care put into the lower-end cards? Would be really odd if that's the case, though.

0
Rep 97 | 01:33 Aug-02-2022

Intel took the integrated GPU drivers they were used to and ported them to an AIB card. A driver traditionally coded so badly, and so limited in its updates, that it has become its own joke; a driver so bad that when Intel found a simple code problem, performance in a certain area increased by 100x.


Arc GPUs' performance tanks when ReBAR is disabled. AMD and Nvidia drivers are miles ahead of where Intel is; a little bugginess is nothing compared to the lack of effort Intel has put into its drivers over the years, or even now.
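
As an aside on the ReBAR point: one rough way to eyeball whether large BARs are actually exposed on a Linux box is to look at the GPU's memory regions in lspci output. A minimal sketch, assuming pciutils is installed and the output format of a reasonably recent lspci:

# Rough ReBAR eyeball check on Linux: print the memory-region sizes
# lspci reports for VGA-class devices. A GPU capped at a 256M BAR
# likely has Resizable BAR disabled; a multi-GB region suggests it
# is active. This parses human-readable output, so treat it as a
# heuristic, not a proper PCI capability query.
import re
import subprocess

out = subprocess.run(["lspci", "-v"], capture_output=True, text=True).stdout
device = None
for line in out.splitlines():
    if re.match(r"^\S", line):  # non-indented line starts a new device block
        device = line if "VGA compatible controller" in line else None
    elif device and "Memory at" in line:
        size = re.search(r"\[size=(\w+)\]", line)
        if size:
            print(device.split()[0], "region size:", size.group(1))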

0
Rep 1 | 01:25 Aug-03-2022

That's to be expected; Intel didn't have dedicated GPUs until... well, not even now. Their iGPUs were either for accelerating CPU tasks that ran well on GPUs or for serving as a basic display adapter. Plus, next to nothing is optimized for Intel GPUs; APIs and games just ignore Intel (and for good reason, until now).

0
