Intel's upcoming 10nm Xe GPU to feature hardware-level ray-tracing support

Written by Stuart Thomas on Fri, May 3, 2019 3:32 PM

Intel has confirmed its upcoming Xe GPU architecture will support hardware-accelerated real-time ray tracing. At the moment, Intel has only confirmed ray tracing will be supported in the initial wave of Xe GPUs designed for data centre usage, although this is a strong indicator that the first consumer Intel graphics cards will support ray tracing when they launch next year.

“The Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API’s and libraries,” confirmed Intel in a blog post.

Considering Intel’s direct competitor Nvidia has already led the charge with ray-tracing on its current crop of GeForce RTX GPUs, it stands to reason that Intel would want to match this feature set.

The focus for Intel’s initial chatter around ray-tracing was the potential benefits for the movie industry. Pixar was namechecked more than once, for example. 3D animations already utilise ray tracing but the task is performed on the CPU rather than the GPU. CPUs are much, much slower at ray tracing but render the image with 100% accuracy. Graphics cards are comparatively quick and dirty at the process, at the expense of ray tracing accuracy. From the way Intel is talking, it sounds as if Team Blue may have zeroed in on a close-to-100% accuracy GPU rendering method for ray tracing, which would prove a huge boon for the animation industry.
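For a sense of the per-ray work involved: a ray tracer fires one or more rays per pixel and tests each against the scene's geometry, which is why dedicated hardware for these intersection tests matters so much. As a purely illustrative sketch (not Intel's or any studio renderer's actual implementation), the classic ray-sphere intersection test reduces to solving a quadratic:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    # Vector from sphere centre to ray origin
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None  # only count hits in front of the origin

# A ray from the origin fired down +z toward a unit sphere centred at z=5
t = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(t)  # 4.0 — the ray strikes the near surface of the sphere
```

A full renderer repeats tests like this billions of times per frame (against far more complex geometry, via acceleration structures), which is the workload ray tracing hardware is built to speed up.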

“Studios continue to reach for maximum realism with complex physics processing for cloth, fluids, hair and more, plus modeling the physics of light with ray tracing,” wrote Jim Jeffers, a senior principal engineer and senior director of Intel’s Advanced Rendering and Visualization team. “These algorithms benefit from mixed parallel and scalar computing while requiring ever-growing memory footprints. The best solutions will include a holistic platform design where computational tasks are distributed to the most appropriate processing resources.”

The big question now is whether this ray tracing technology trickles down to the consumer graphics cards. With Intel supporting it at a hardware-level for Xe, the odds are certainly high.

This would just leave AMD without confirmed ray tracing support, although there’s an expectation that the 7nm Navi architecture will support ray tracing on PC. Sony recently teased out details on the upcoming PlayStation 5, confirming a custom Navi GPU as well as what is probably fairly rudimentary ray tracing capabilities. This leads us to expect the Navi desktop chips will also follow suit.


Rep
24
Offline
06:31 May-04-2019

Sounds good, but it's still Intel, so it's a no-go for me

-1
Rep
36
Offline
12:31 May-04-2019

That's stupid. I never give a $h1t about who is making a product; I judge the product by its value and performance. If it's good I'll definitely look into it. We sorely need competition in the GPU market, and AMD alone is not cutting it.

8
Rep
45
Offline
admin approved badge
01:54 May-04-2019

I've gotta say this thing looks pretty nice. I hope Intel can bring some healthy competition to AMD and nVidia.

2
Rep
59
Offline
admin approved badge
20:16 May-03-2019

Bring on the competition, my wallet will (hopefully) cry tears of joy in the future.

6
Rep
1,041
Offline
senior admin badge
17:46 May-03-2019

looks like a rising battle of Intel+nVidia vs AMD actually

6
Rep
272
Offline
admin approved badge
01:12 May-04-2019

Suppose it's easier to take down a common enemy together, flanking both sides (CPU and GPU).

1
Rep
18
Offline
17:42 May-03-2019

What will the memory of those GPUs be?

0
Rep
35
Offline
17:41 May-03-2019

I will for sure wait for benchmarks, but I think I will upgrade my GPU to an Intel one once it comes out. Wonder if my CPU can handle the GPU.

1
Rep
57
Offline
17:20 May-03-2019

That's actually amazing. Ray tracing is indeed awesome for games, though it got a bad rep for tanking fps heavily because the current gen is definitely not ready. It's a start, nonetheless.

0
Rep
-6
Offline
17:07 May-03-2019

my new gpu will be intel or amd, depends on price and perf

2
Rep
272
Offline
admin approved badge
16:56 May-03-2019

@Stuart - not sure where you guys heard it, but we can run 100% photorealistic, pathtraced rendering engines on our GPUs and have been able to since about 2012. One such engine I personally use is FStorm, but others like Iray+, Octane, RedShift, Arnold and VRay RT (among others) are also capable of outputting photorealistic and physically accurate results.


RedShift is the rendering engine of choice for Blizzard, for example (do not be fooled by the render style of Overwatch - the lighting there is very complex!) while engines like Octane and FStorm are used in advertising and interior design industries.

4
Rep
272
Offline
admin approved badge
17:03 May-03-2019

With GPU rendering our biggest limitation is the VRAM amount. The second one is lack of CUDA.


Whatever we render on GPU has to fit into the VRAM - models, textures, shaders, any caches, the final framebuffer, etc. The more VRAM you have - the larger and the more complex scenes you can render. On CPU you'd just buy more RAM. Not so easy with GPUs...


And for CUDA - Nvidia GPUs can run ALL the render engines, since they run CUDA and OpenCL. AMD GPUs don't have CUDA, so a few high-profile and very good engines become unavailable to them. For me - no CUDA - no sale, since I couldn't work with the tools I like/need.

3
Rep
272
Offline
admin approved badge
17:09 May-03-2019

A big bonus to GPU rendering, by the way, is SCALABILITY. You buy a CPU and you're stuck with what you have. Want more power? Buy another CPU or an entire new PC to render.


With GPUs - a single PC could have up to a double-digit number of GPUs just for rendering. You don't need to keep upgrading the motherboard or buying a new PC just to increase your rendering power. Besides, the GPU interface standard, the PCIe connector, hasn't changed for a decade now and you can always step up and down the versions if you need to. How many changes to CPU sockets have we had in the last 10 years? And which tech do you think gained more performance over that period? GPUs or CPUs..? :)

3
Rep
87
Offline
18:43 May-03-2019

why r u getting downvoted to the ground?

-1
Rep
272
Offline
admin approved badge
01:10 May-04-2019

I didn't notice. But likely someone who doesn't understand how different offline 3D work is to video games and to whom the information I presented made no sense. Happens all the time, tbh.

-1
Rep
1,041
Offline
senior admin badge
17:46 May-03-2019

I'd say the twist is "realtime" rendering -
for example, Blender has only provided a realtime renderer for a couple of months, afaik

0
Rep
272
Offline
admin approved badge
01:08 May-04-2019

Depends on what is considered realtime. With VRay RT, FStorm, Iray and Corona (this one is CPU) we could do "realtime" rendering too - set the renderer to be interactive and we can play with the materials and lights with realtime feedback of how things look. Progressive rendering is nothing new and Blender's implementation seems to me to be a hybrid of a game engine and a real renderer - less noise, faster feedback, but also less accurate when compared with their Cycles renderer.


The wording is vague, so I'm not exactly sure what is being marketed here. It's kinda like how AMD marketed Ryzen in a way that ignored the 5960X as the first consumer 8-core, saying it was the 6900K (three years apart).

1
Rep
-122
Offline
16:27 May-03-2019

my new gpu will be intel for sure

-1
Rep
57
Offline
17:17 May-03-2019

Wait for benchmarks first! :D

7
Rep
20
Offline
16:22 May-03-2019

Very interesting. Intel may have known about the inclusion of RTX features long before we did;
I don't think they just started working on it after the release of RTX.

0
