Recommended System Requirements

Game | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Cyberpunk 2077 | 9726% | 4402%
Hitman 3 | 12683% | 5757%
Assassin's Creed: Valhalla | 8887% | 4018%
Resident Evil 8 | 9726% | 4402%
FIFA 21 | 4857% | 2171%
Grand Theft Auto VI | 13726% | 6235%
Call of Duty: Black Ops Cold War | 8596% | 3884%
Genshin Impact | 9726% | 4402%
The Medium | 12161% | 5518%
Far Cry 6 | 14117% | 6414%
In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce FX 5900 Ultra are massively better than the Nvidia GeForce 4 MX 440.
The FX 5900 was released just over a year after the 4 MX (May 2003 versus February 2002), so the FX 5900 is likely to have better driver support, meaning it will be better optimized for running the latest games than the 4 MX.
Both GPUs exhibit very poor performance, so rather than upgrading from one to the other you should consider looking at more powerful GPUs. Neither of these will be able to run the latest games in any playable way.
The FX 5900 has 128 MB more video memory than the 4 MX, so it is likely to be slightly better at displaying game textures at higher resolutions. This is supported by the FX 5900's superior memory performance overall.
The FX 5900 has 10.4 GB/sec greater memory bandwidth than the 4 MX, which means that the memory performance of the FX 5900 is slightly better than the 4 MX.
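The bandwidth figures above follow directly from each card's memory specs. A minimal sketch of the calculation, assuming (as the comparison table's numbers suggest) bandwidth is the listed memory clock multiplied by the bus width in bytes, with no additional DDR doubling applied to the listed clock:

```python
def memory_bandwidth_gb_s(clock_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/sec: transfers per second times bytes per transfer.

    Assumption: the listed memory clock already reflects the effective transfer
    rate, so bandwidth = clock * (bus width in bytes). This reproduces the
    table's figures for both cards.
    """
    bytes_per_transfer = bus_bits / 8
    return clock_mhz * 1e6 * bytes_per_transfer / 1e9

# GeForce 4 MX 440: 200 MHz memory on a 128-bit bus
mx440 = memory_bandwidth_gb_s(200, 128)
# GeForce FX 5900 Ultra: 425 MHz memory on a 256-bit bus
fx5900 = memory_bandwidth_gb_s(425, 256)

print(f"4 MX:    {mx440:.1f} GB/sec")   # 3.2 GB/sec
print(f"FX 5900: {fx5900:.1f} GB/sec")  # 13.6 GB/sec
print(f"Delta:   {fx5900 - mx440:.1f} GB/sec")  # 10.4 GB/sec
```

The 10.4 GB/sec gap comes from the FX 5900 doubling both the bus width and (roughly) the memory clock.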
The GeForce 4 MX 440 has 2 Shader Processing Units while the GeForce FX 5900 Ultra has 4, giving actual shader performance ratings of 1 and 2 respectively. This shader advantage, combined with the FX 5900's better showing across the other relevant specifications, means the FX 5900 delivers a marginally smoother and more efficient experience when processing graphical data than the 4 MX.
The GeForce FX 5900 Ultra requires 59 Watts to run; no power figure is listed for the 4 MX.
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Core Speed | 275 MHz | 450 MHz
Boost Clock | - | -
Architecture | NV17 | NV35
OC Potential | - | Poor
Driver Support | - | -
Release Date | 06 Feb 2002 | 12 May 2003
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Memory | 128 MB | 256 MB
Memory Speed | 200 MHz | 425 MHz
Memory Bus | 128 Bit | 256 Bit
Memory Type | DDR | DDR
Memory Bandwidth | 3.2 GB/sec | 13.6 GB/sec
L2 Cache | - | -
Delta Color Compression | no | no
Memory Performance | 0% | 0%
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Shader Processing Units | 2 | 4
Actual Shader Performance | 0% | 0%
Technology | - | 130nm
Texture Mapping Units | - | 4
Texture Rate | - | 1.8 GTexel/s
Render Output Units | - | 4
Pixel Rate | - | 1.8 GPixel/s
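The FX 5900's listed texture and pixel rates can be derived from its core clock and unit counts. A quick sketch, assuming the standard definition of theoretical fill rate (core clock multiplied by the number of texture mapping units or render output units):

```python
def fill_rate_g_per_s(core_clock_mhz: float, units: int) -> float:
    """Theoretical fill rate in Gigaops/sec: core clock times number of units.

    With TMUs this gives texture rate (GTexel/s); with ROPs it gives
    pixel rate (GPixel/s).
    """
    return core_clock_mhz * 1e6 * units / 1e9

# GeForce FX 5900 Ultra: 450 MHz core clock, 4 TMUs, 4 ROPs
texture_rate = fill_rate_g_per_s(450, 4)  # 1.8 GTexel/s
pixel_rate = fill_rate_g_per_s(450, 4)    # 1.8 GPixel/s
print(f"Texture rate: {texture_rate:.1f} GTexel/s")
print(f"Pixel rate:   {pixel_rate:.1f} GPixel/s")
```

The 4 MX's rates cannot be computed the same way here because its TMU and ROP counts are not listed in the table.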
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Max Digital Resolution (WxH) | 1600x1200 | 2048x1536
VGA Connections | 1 | 1
DVI Connections | 1 | 1
HDMI Connections | 0 | 0
DisplayPort Connections | - | -
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Max Power | - | 59 Watts
Recommended PSU | - | -
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
DirectX | 7.0 | 9.0b
Shader Model | - | 2.0
Open GL | 1.2 | 1.5
Open CL | - | -
Notebook GPU | no | no
SLI/Crossfire | no | no
Dedicated | yes | yes
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Recommended Processor | - | -
Recommended RAM | - | -
Maximum Recommended Gaming Resolution | - | -
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Mini Review | GeForce4 MX 440 is part of the GeForce4 GPUs released by NVIDIA in 2002. Only compatible with DirectX 7 or less and with a max memory of 128 MB, it can't play today's games. Still, games before 2003 should be fully playable at max settings... | The top-to-bottom family of NVIDIA GeForce FX GPUs provides both high-performance gaming for enthusiasts and best-in-class features for mainstream users. Whether you're a bleeding-edge gamer who desires the most advanced gaming technology available, or a PC user in search of the perfect combination of power, performance, and value, GeForce FX solutions deliver.
Specification | GeForce 4 MX 440 | GeForce FX 5900 Ultra
---|---|---
Recommended CPU | - | -
Possible GPU Upgrades | - | -
GPU Variants | - | -