|Recommended System Requirements|
|Game||Radeon X1300 XT||GeForce 8400 GS|
|Deus Ex: Mankind Divided||4530%||3900%|
|No Man's Sky||2863%||2460%|
|Pro Evolution Soccer 2017||803%||680%|
|Forza Horizon 3||5363%||4620%|
|Watch Dogs 2||3928%||3380%|
In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce 8400 GS are very slightly better than those of the AMD Radeon X1300 XT.
The GeForce 8400 GS was released less than a year after the X1300 XT, and so they are likely to have similar driver support for optimizing performance when running the latest games.
Both GPUs exhibit very poor performance, so rather than upgrading from one to the other you should consider looking at more powerful GPUs. Neither of these will run the latest games at playable frame rates.
The Radeon X1300 XT and the GeForce 8400 GS have the same amount of video memory, but are likely to provide slightly different experiences when displaying game textures at high resolutions.
The X1300 XT has 6.4 GB/sec greater memory bandwidth than the GeForce 8400 GS, which means that the memory performance of the X1300 XT is marginally better than the GeForce 8400 GS.
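The 6.4 GB/sec figure follows directly from the memory specifications in the table below: both cards use DDR2 at 400 MHz (two transfers per clock), but the X1300 XT has a 128-bit bus against the 8400 GS's 64-bit bus. A quick sketch of the arithmetic:

```python
# Peak memory bandwidth = memory clock x transfers per clock x bus width in bytes.
# DDR2 at 400 MHz performs two transfers per clock (800 MT/s effective).

def bandwidth_gb_s(mem_clock_mhz, bus_width_bits, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

x1300_xt = bandwidth_gb_s(400, 128)   # 12.8 GB/s
gf_8400_gs = bandwidth_gb_s(400, 64)  # 6.4 GB/s
print(x1300_xt - gf_8400_gs)          # 6.4 GB/s in the X1300 XT's favour
```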
The Radeon X1300 XT has 12 Shader Processing Units and the GeForce 8400 GS has 16. However, their actual shader performance ratings are 4 and 18 respectively. The GeForce 8400 GS's 14-point advantage in shader performance is not particularly notable, as the X1300 XT performs better overall when other relevant data is taken into account.
The Radeon X1300 XT requires 22 Watts to run and the GeForce 8400 GS requires 18 Watts. We would recommend a PSU with at least 300 Watts for the GeForce 8400 GS, but we do not have a recommended PSU wattage for the X1300 XT. The 4-Watt difference is not significant enough for the X1300 XT to have a noticeably larger impact on your yearly electricity bill than the GeForce 8400 GS.
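To put the 4-Watt gap in perspective, here is a rough running-cost estimate. The usage hours and electricity price are illustrative assumptions, not figures from this comparison:

```python
# Yearly electricity cost of a given power draw.
# hours_per_day and price_per_kwh below are hypothetical example values.

def yearly_cost(watts, hours_per_day, price_per_kwh):
    """Annual cost of running a load of `watts` for `hours_per_day` every day."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

# Extra cost of the X1300 XT's 4 W over the 8400 GS, assuming
# 4 hours of gaming per day at $0.15 per kWh.
extra = yearly_cost(22 - 18, 4, 0.15)
print(extra)  # well under a dollar per year
```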
Can I Run It
|Core Speed||500 MHz||vs||450 MHz|
|Release Date||12 Aug 2006||vs||17 Apr 2007|
|GPU Link||GD Link||GD Link|
|Memory||256 MB||vs||256 MB|
|Memory Speed||400 MHz||vs||400 MHz|
|Memory Bus||128 Bit||vs||64 Bit|
|L2 Cache||-||vs||0 KB|
|Delta Color Compression||no||vs||no|
|Shader Processing Units||12||vs||16|
|Actual Shader Performance||0%||vs||1%|
|Texture Mapping Units||-||vs||8|
|Texture Rate||-||vs||3.6 GTexel/s|
|Render Output Units||-||vs||4|
|Pixel Rate||-||vs||1.8 GPixel/s|
|Max Digital Resolution (WxH)||-||vs||2048x1536|
|Max Power||22 Watts||vs||18 Watts|
|Recommended PSU||-||300 Watts & 20 Amps|
|Recommended Processor||-||Intel Celeron 440 2.0GHz|
|Recommended RAM||-||2 GB|
|Maximum Recommended Gaming Resolution||-||800x600|
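The texture and pixel rates listed above follow from the core clock multiplied by the number of TMUs and ROPs respectively, which is a quick way to sanity-check such spec tables:

```python
# Fill rate = core clock x number of units.
# For the 8400 GS: 450 MHz core, 8 TMUs, 4 ROPs.

def fill_rate_g_per_s(core_mhz, units):
    """Fill rate in G(Texel|Pixel)/s from core clock in MHz and unit count."""
    return core_mhz * units / 1000

print(fill_rate_g_per_s(450, 8))  # 3.6 GTexel/s, matching the table
print(fill_rate_g_per_s(450, 4))  # 1.8 GPixel/s, matching the table
```

The X1300 XT's entries are blank in the table, so no equivalent figures can be derived here for it.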
|Mini Review||Radeon X1300 XT is an entry-level GPU based on the 90nm variant of the R500 architecture. |
It's based on the RV530 Core and offers 12 Pixel Shaders, 4 TMUs and 4 ROPs, on a 128-bit memory interface of standard DDR2. The central unit runs at 500MHz and the memory clock operates at up to 400MHz.
Despite featuring more Pixel and Vertex Shaders, the Radeon X1300 XT consumes less power than the X1300 PRO, thanks to its energy-efficient DDR2 memory (versus the X1300 PRO's DDR) and a lower-clocked central unit.
Still, its performance is quite limited, and today's modern games are either only playable at the lowest settings or not playable at all. As it's not based on a unified-shader architecture, DirectX 10 and 11 games aren't supported.
|GeForce 8400 GS is an entry-level GFX based on the 80nm variant of the first-shader unified architecture. |
It's based on the G86 Core and offers 16 Shader Processing Units, 8 TMUs and 4 ROPs, on a 64-bit memory interface of DDR2. The central unit runs at 450MHz and the memory clock operates at 400MHz. It will consume no more than 40 Watts.
It's therefore essentially a GeForce 8500 GT with slightly lower power consumption and no SLI support.
This GPU is only suited to a 1024x768 (or lower) resolution. Most of today's games will only run at low settings, and extremely demanding games might not run at all. DirectX 11 based games aren't supported, and its performance decreases drastically as the resolution increases.
Note: This GPU had two revisions which added HDMI support, DDR3 memory and different core configurations.
|Recommended CPU||-||AMD Athlon LE-1600|
Intel Pentium 4 HT 570J
Intel Pentium 4 HT 571
Intel Pentium 4 HT 560J
Intel Pentium 4 HT 561
|Possible GPU Upgrades||-||Radeon HD 4850 Golden Sample Edition|
Radeon HD 4850 Gigabyte OC Edition
GeForce 9800 GX2
GeForce GTS 450 PNY XLR8 Performance 1GB Edition
GeForce GTS 450 Point of View 1GB Edition