NVIDIA
==Retro value==
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical legacy features: fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.

One problem with NVIDIA cards prior to the GeForce4 is that some vendors built their cards too cheaply. The most noticeable result is poor analog signal quality, which causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult, but GeForce4 and newer cards are the most likely to be good.

Perhaps the finest choices are the GeForce FX series, because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.
==Cards==
===NV1 / STG2000===
[[File:Dedge3d.jpg|180px|thumb|NV1]]
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. Its audio support consists of wavetable MIDI and DirectSound, with very little DOS support. DOS VGA compatibility is limited.
It comes in various memory configurations, with 4 MB at maximum. The final drivers support Direct3D, but it is a software-only implementation.

Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.
===NV3 | RIVA 128 & 128 ZX===
[[File:Riva128.jpg|180px|thumb|RIVA 128]]
The RIVA 128 was released in late 1997 and is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3. The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL compatibility. It renders at only 16-bit color depth. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1).
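
The RAMDAC figure puts a hard ceiling on resolution and refresh rate. A rough sketch of the arithmetic; the ~1.32 blanking overhead is an assumed typical CRT figure, not an NVIDIA specification:

```python
# Rough sketch: a RAMDAC's pixel clock bounds resolution and refresh
# rate. The 1.32 blanking overhead factor is an assumption based on
# typical CRT timings, not an official figure.

def max_refresh_hz(width, height, ramdac_mhz, blanking_overhead=1.32):
    """Approximate highest refresh rate a RAMDAC can drive."""
    pixel_clock = ramdac_mhz * 1_000_000
    return pixel_clock / (width * height * blanking_overhead)

# RIVA 128 (206 MHz) vs. RIVA 128 ZX (250 MHz) at 1600x1200:
r128 = max_refresh_hz(1600, 1200, 206)   # roughly low 80s of Hz
zx = max_refresh_hz(1600, 1200, 250)     # noticeably higher headroom
```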
The RIVA 128 shows a number of rendering quality compromises. It has blending issues that introduce dithering patterns/noise. Its texture mapping does automatic mipmap generation, which combined with its per-polygon mipmapping causes texture popping and other quirks. The per-pixel mipmapping option in the control panel does not operate as true per-pixel mipmapping and can cause problems. The chip also has trouble maintaining texture alignment, so visible gaps often occur. Finally, its bilinear texture filtering does not blur as much as you may be used to; sometimes textures look almost as if they were only point-sampled. Overall, the RIVA 128 is an interesting mix of speed and compromise.

A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.

In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. For texture-intensive games the increased memory results in higher performance. Higher resolutions are also supported. The RIVA 128 ZX was also integrated onto Intel's i440BX-based RC440BX motherboard.
===NV4 | RIVA TNT===
[[File:Tntpci.jpg|180px|thumb|TNT PCI]]
[[File:TNT_AGP_16MB_-_Diamond_Viper_550.jpg|180px|thumb|TNT AGP]]
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; rather, its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a "sanding" or dither-pattern effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. The RIVA TNT was also integrated onto Intel's SR440BX and PowerColor's DREAMCODE motherboards, both based on Intel's i440BX chipset.
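
The memory pressure of uncompressed textures is easy to estimate. A back-of-the-envelope sketch, ignoring framebuffer and Z-buffer needs:

```python
# Why large textures were of limited use without texture compression:
# an uncompressed texture costs width x height x bytes per texel, and a
# full mip chain adds roughly a third on top. Framebuffer and Z-buffer
# memory is ignored here for simplicity.

def texture_bytes(width, height, bits_per_pixel, mipmapped=True):
    """Approximate VRAM footprint of one uncompressed texture."""
    base = width * height * bits_per_pixel // 8
    return base * 4 // 3 if mipmapped else base

one_tex = texture_bytes(1024, 1024, 32)   # one large 32-bit texture, ~5.3 MB
card_ram = 16 * 1024 * 1024               # a 16 MB TNT
fits = card_ram // one_tex                # only a handful fit at once
```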
===NV5-6 | RIVA TNT2 & TNT2 M64 and Vanta===
[[File:TNT2_M64_16MB_PCI.jpg|180px|thumb|TNT2 M64 16MB PCI (notice the solder dots for the 32MB version)]]
[[File:TNT2_M64_32MB_(may_have_a_not-original_cooling_solution).jpg|180px|thumb|TNT2 M64 32MB PCI]]
[[File:TNT2_M64_32MB_AGP.jpg|180px|thumb|TNT2 M64 32MB AGP (notice the solder dots for the 16MB version)]]
[[File:TNT2_32MB_-_Diamond_Viper_V770_non-ultra.jpg|180px|thumb|TNT2 AGP]]
[[File:TNT2_Ultra_32MB_-_Diamond_Viper_V770.jpg|180px|thumb|TNT2 Ultra AGP]]
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). These budget models use a 64-bit memory interface rather than the 128-bit interface of the regular variant, effectively halving their memory bandwidth. The M64, however, is still faster than the original TNT in some situations.
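
The effect of the narrower bus is simple arithmetic. A sketch, assuming a typical 150 MHz memory clock (actual retail clocks varied):

```python
# Sketch of the bandwidth halving: peak memory bandwidth is bus width
# (in bytes) times memory clock. The 150 MHz figure is an assumed
# typical retail TNT2 clock, not an official specification.

def peak_bandwidth_mb_s(bus_bits, mem_clock_mhz):
    """Peak transfer rate in MB/s for single-data-rate memory."""
    return bus_bits // 8 * mem_clock_mhz

tnt2 = peak_bandwidth_mb_s(128, 150)   # regular TNT2: 128-bit interface
m64 = peak_bandwidth_mb_s(64, 150)     # TNT2 M64: same clock, 64-bit bus
```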
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.

The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII. As with the original TNT, the RIVA TNT2 only supports trilinear approximation rather than true trilinear filtering. NVIDIA would not implement true trilinear filtering until the NV1x (GeForce 256 / GeForce2 / GeForce4 MX) architecture.
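
What palettized texture support means in practice can be sketched as a per-texel palette lookup. An illustrative toy version; real hardware performs the lookup at sampling time:

```python
# Sketch of an 8-bit palettized texture: each texel stores a one-byte
# index into a 256-entry color table, and the hardware resolves the
# actual RGBA color when sampling. Chips without this feature must have
# such textures expanded (4x the memory) or the game misrenders.

def expand_palettized(indices, palette):
    """Expand 8-bit texel indices to RGBA tuples via palette lookup."""
    return [palette[i] for i in indices]

palette = [(i, i, i, 255) for i in range(256)]   # a simple grayscale ramp
texels = expand_palettized(bytes([0, 128, 255]), palette)
```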
+ | |||
+ | The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards. | ||
+ | |||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
===NV10 | GeForce 256===
[[File:Gf256sdr.jpg|180px|thumb|GeForce 256 SDR]]
Released in October 1999, the GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version followed later. The DDR version is roughly twice as fast as the TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to the 2x level), cubic environment mapping, and support for hardware dot product bump mapping.
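
Ordered grid super-sampling can be sketched as rendering on an enlarged grid and averaging blocks down to output pixels. A toy grayscale version for illustration:

```python
# Sketch of ordered grid super-sampling (OGSS): render at an n-times
# larger grid, then average each n x n block down to one output pixel.
# Toy single-channel version; real hardware does this per color channel.

def downsample_ogss(hi_res, n):
    """Average n x n blocks of a 2D grid into one pixel each."""
    h, w = len(hi_res), len(hi_res[0])
    return [[sum(hi_res[y * n + dy][x * n + dx]
                 for dy in range(n) for dx in range(n)) / (n * n)
             for x in range(w // n)]
            for y in range(h // n)]

# A 2x-supersampled hard edge: the block straddling the edge is
# averaged into an intermediate value, i.e. the edge is anti-aliased.
hi = [[0, 255, 255, 255],
      [0, 255, 255, 255]]
lo = downsample_ogss(hi, 2)
```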
+ | |||
+ | ===NV11 to NV1A | GeForce 2 / 2 MX / 4 MX=== | ||
+ | [[File:Geforce_2MX_200_32MB_-_Active_cooling.jpg|180px|thumb|GeForce 2MX 200]] | ||
+ | [[File:Geforce_2MX_200_32MB_-_Passive_cooling.jpg|180px|thumb|GeForce 2MX 200]] | ||
+ | [[File:Geforce_2MX_400_64MB.jpg|180px|thumb|GeForce 2MX 400]] | ||
+ | [[File:Geforce_2MX_400_PCI_larger_PCB.jpg|180px|thumb|GeForce 2MX 400 PCI]] | ||
+ | [[File:Geforce_2MX_400_PCI.jpg|180px|thumb|GeForce 2MX 400 PCI]] | ||
+ | [[File:Geforce_2_GTS.jpg|180px|thumb|GeForce 2 GTS]] | ||
+ | [[File:Geforce_2Ti_-_Nice_purple!.jpg|180px|thumb|GeForce 2 Ti]] | ||
+ | [[File:Geforce_4MX_420_64MB.jpg|180px|thumb|Geforce 4MX 420]] | ||
+ | [[File:Geforce_4MX_440.jpg|180px|thumb|Geforce 4MX 440]] | ||
+ | [[File:Geforce_4MX_460_(Medion_-_without_original_cooling_solution).jpg|180px|thumb|Geforce 4MX 460 (without original cooling solution)]] | ||
+ | The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process allowing higher clock rates while reducing the power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturer used a low quality analog circuit design that produces a blurry image output. | ||
+ | |||
+ | GeForce 2 MX / NV11 is the low end series of the GeForce 2, released in September 2000. These cards have half of the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TDMS channels, providing dual display output (called "TwinView"). It also has "Digital Vibrance Control" that allows calibration of various image output aspects. The 3D performance of GeForce 2 MX at 16-bit color depth is slightly faster than a GeForce 256 SDR. With its relatively low price and with the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge. | ||
+ | |||
+ | GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with significantly higher clock speed than NV11, allows it to match NV15 performance. These features were advertised as "Lightspeed Memory Architecture II," which was a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the "AccuView" anti-aliasing capabilities which are considerably advanced in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shaders support, as well as environment mapped bump mapping support and higher level anisotropic filtering support found in the NV2x architecture. | ||
+ | *[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life] | ||
+ | *[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II] | ||
+ | |||
+ | NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge. | ||
+ | |||
+ | Estimated model performance ranking: | ||
+ | |||
+ | GF2 MX100 < GF2 MX200 < GF2 MX < GF2 MX400 < GF4 MX420 < GF2 GTS < GF2 Pro < GF2 Ti VX < GF2 Ti < GF2 Ultra < GF4 MX440 | ||
+ | |||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
===NV20 | GeForce 3===
[[File:GeForce3_Ti200.JPG|180px|thumb|GeForce 3 Ti 200]]
[[File:Geforce_3_Ti200_-_Hercules.jpg|180px|thumb|GeForce 3 Ti 200 - Active cooling]]
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this "Lightspeed Memory Architecture". Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.

GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to the 8x level). MSAA is considerably less demanding of fillrate than SSAA; 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called "Quincunx" that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to deliver better quality anti-aliasing than 2X MSAA/SSAA without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). Compared to its main competitor, ATI's R200 series, the GeForce 3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.
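
The Quincunx filter is commonly described as a five-tap blend: the pixel's own sample at weight 1/2 plus four diagonally neighboring samples at 1/8 each. A sketch using those commonly cited weights, which are an assumption rather than an official NVIDIA figure:

```python
# Sketch of the Quincunx blend as commonly described in contemporary
# reviews: 1/2 weight on the pixel's own sample plus 1/8 on each of
# four diagonal neighbor samples. Extra smoothing (and blur) from only
# two stored samples per pixel. Weights are assumed, not official.

def quincunx(center, corners):
    """5-tap quincunx blend: 1/2 center + 1/8 per corner sample."""
    assert len(corners) == 4
    return 0.5 * center + sum(corners) / 8.0

# A white pixel on a hard edge gets pulled toward its dark neighbors:
edge = quincunx(255, [255, 255, 0, 0])
```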
+ | |||
+ | ===NV25 to NV28 | GeForce 4=== | ||
+ | [[File:Geforce_4200_64MB.jpg|180px|thumb||GeForce 4 Ti4200 64MB]] | ||
+ | The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAM DAC for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the "Ti 4200 with AGP 8x." | ||
+ | |||
+ | In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest. | ||
+ | *[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti] | ||
+ | |||
===NV3x | GeForce FX===
[[File:Gf5200u.jpg|180px|thumb|FX 5200 Ultra]]
[[File:Geforce_FX_5600_256MB.jpg|180px|thumb|FX 5600 256MB]]
[[File:Geforce_FX_5900_with_original_cooling_solution.jpg|180px|thumb|FX 5900 Ultra]]
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. They offer similar anti-aliasing and anisotropic filtering features to older models, but with improved performance. The high performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games in high resolution with anti-aliasing and anisotropic filtering.

5200 < 5500 < 5200 Ultra < 5600 < 5600 Ultra < 5700 < 5700 Ultra < 5800 < 5800 Ultra < 59x0 < 59x0 Ultra
===NV4x | GeForce 6===
[[File:Geforce_6600.jpg|180px|thumb|Geforce 6600 128MB]]
[[File:Geforce_6800.jpg|180px|thumb|Geforce 6800]]
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved all-around performance but drops palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.

Some of the GF 6800s and most of the GF 6600s made use of a PCIe to AGP bridge chip.
+ | |||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
===GeForce 7===
The GeForce 7 was the last generation of NVIDIA graphics chips to support AGP. ATI/AMD continued to offer AGP-compatible graphics solutions for a while longer, up to the Radeon HD 4670. While mostly a direct improvement of the GeForce 6 series, the GeForce 7 was the first generation to use NVIDIA's new GPU naming scheme. The initial flagship 7800 GTX was later superseded by the 7900 GTX.
===GeForce 8===
[[File:Geforce_8800GTS_320MB.jpg|180px|thumb|Geforce 8800GTS 320MB]]
The GeForce 8 series was introduced in late 2006 as the first Direct3D 10 compatible GPU, and as the first product of a paradigm shift toward GPUs as general-purpose computational devices, opening them up to non-graphical uses with CUDA and OpenCL. Many architectural changes have implications for older games, such as improved math precision, lack of proper 16-bit framebuffer support, and different gamma-correction behaviour (anti-aliasing and texture filtering). Two different chips go by the same 8800 GTS brand.
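
The gamma-correction difference can be sketched by comparing a naive average of sRGB-encoded values with an average done in linear light. The 2.2 exponent is the usual sRGB approximation, assumed here for illustration:

```python
# Sketch of the gamma-correction behaviour difference: averaging two
# sRGB-encoded values directly versus converting to linear light first.
# The 2.2 exponent is the common sRGB approximation, an assumption for
# illustration rather than the exact hardware transfer function.

def average_naive(a, b):
    """Blend two encoded values directly (older-hardware style)."""
    return (a + b) / 2

def average_gamma_correct(a, b, gamma=2.2):
    """Blend in linear light, then re-encode."""
    lin = (a ** gamma + b ** gamma) / 2
    return lin ** (1 / gamma)

# Blending black and white (values in the 0.0..1.0 range): the
# gamma-correct result is visibly lighter than the naive 0.5.
naive = average_naive(0.0, 1.0)
correct = average_gamma_correct(0.0, 1.0)
```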
+ | |||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | <br><br/> | ||
+ | |||
==Compatibility notes==
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.

==Video captures==

===NV1 / STG2000===
{{#ev:youtube|2H-gEKsd5SY}}
{{#ev:youtube|9bJGnzb0Asw}}
{{#ev:youtube|t9Zqy5BZQ1E}}
{{#ev:youtube|_4l_2Bqbi9Q}}
{{#ev:youtube|ZOstiHUk_EM}}
{{#ev:youtube|Ibs90LY_Ph8}}
{{#ev:youtube|JG0kqg8QlLw}}
{{#ev:youtube|nc6cH0zuMSs}}
{{#ev:youtube|VK9sg_93iCE}}

===NV3 | RIVA 128 & 128 ZX===
{{#ev:youtube|0GAEXE3eu0o}}
{{#ev:youtube|Cqxia9tPFrs}}
{{#ev:youtube|QpGVOAuDfNc}}
{{#ev:youtube|h2KGiIen4n4}}
{{#ev:youtube|LZrWRMMdwW4}}
{{#ev:youtube|hv07UKRetPY}}

==Related links==
*[http://www.vogonsdrivers.com/index.php?catid=24 VOGONS Drivers NVIDIA section]
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]
*[http://www.vogonswiki.com/index.php/Interesting_Vogons_Threads#Graphics_cards VOGONS threads about graphics cards]

[[Category:Hardware]]
[[Category:Graphics Cards]]
Latest revision as of 09:13, 13 January 2019
Retro value
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver (link). This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIAs Stereo3D driver extension for shutter glasses such as ELSA Revelator. GeForce 8 and above support only XP and newer.
Cards
NV1 / STG2000
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible so it is only useful with games that use its proprietary API. It is the same technology used by Sega Saturn and as such various games were ported to use the NV1. Its audio consists of wavetable MIDI and DirectSound support but very little DOS support. DOS VGA compatibility is limited.
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.
NV3 | RIVA 128 & 128 ZX
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for Real-time Interactive Video and Animation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3. RIVA 128 has all the hardware features required for Direct3D 5 and has also good OpenGL compatibility. It renders at only 16-bit color depth. The 3D performance is competitive with Voodoo Graphics (Voodoo1).
The RIVA 128 shows a number of rendering quality compromises. It has blending issues that introduce dithering patterns/noise. Its texture mapping does automatic mipmap generation. This combined with its per-polygon mip mapping causes texture popping and other quirks. The per-pixel mipmapping in the control panel does not operate as true per-pixel mipmapping and can cause problems. It also has some kind of problem maintaining texture alignment and visible gaps often occur. Finally its bilinear texture filtering is not as blurring as you may be used to. Sometimes textures look almost as if they are only point-sampled. Overall RIVA 128 is an interesting mix of speed and compromise.
A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as AGP content on PCI. The 2D-features include video scaling and color conversion capabilities. The chip features also PAL/NTSC output so many cards were released with TV-out, some even with TV-in.
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX. The RIVA 128 ZX is an upgraded chip that has a 250 MHz RAMDAC and supports up to 8 MB SGRAM. For texture intense games the increased memory results in higher performance. Higher resolutions as also supported. The RIVA 128 ZX was also integrated onto Intel's i440BX-based RC440BX motherboard.
NV4 | RIVA TNT
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with Voodoo2 but with more flexibility such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss though and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions. Great OpenGL support. All NVIDIA cards have the advantage of seperately clocked memory which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128 - one of its main competitors, the RIVA TNT does not support true trilinear filtering, but rather, its trilinear filtering is an approximation implementation. This results in reduced visual quality when viewing textures / MIP levels at far distances (which can be best described as a "sanding" or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. The RIVA TNT was also integrated onto Intel's SR440BX and PowerColor's DREAMCODE motherboards, both of which are based on Intel's i440BX chipset.
NV5-6 | RIVA TNT2 & TNT2 M64 and Vanta
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 resolution texture maps. It also typically has 32 MB of RAM and it is clocked much higher so performs noticeably better. Some popular budget variants include Vanta and TNT2 Model 64 (M64). These budget models use a 64-bit memory interface (rather than 128-bit as opposed to the regular variant), effectively halving its memory bandwidth. The performance of the M64 model, however, is faster than the original TNT in some situations.
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII. Likewise with the original TNT, the RIVA TNT2 only supports trilinear approximation rather than true trilinear filtering. NVIDIA would not implement true trilinear filtering until the NV1x (GeForce 256 / GeForce2 / GeForce4 MX) architecture.
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.
NV10 | GeForce 256
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.
NV11 to NV1A | GeForce 2 / 2 MX / 4 MX
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process allowing higher clock rates while reducing the power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturer used a low quality analog circuit design that produces a blurry image output.
GeForce 2 MX / NV11 is the low end series of the GeForce 2, released in September 2000. These cards have half of the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TDMS channels, providing dual display output (called "TwinView"). It also has "Digital Vibrance Control" that allows calibration of various image output aspects. The 3D performance of GeForce 2 MX at 16-bit color depth is slightly faster than a GeForce 256 SDR. With its relatively low price and with the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with significantly higher clock speed than NV11, allows it to match NV15 performance. These features were advertised as "Lightspeed Memory Architecture II," which was a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the "AccuView" anti-aliasing capabilities which are considerably advanced in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shaders support, as well as environment mapped bump mapping support and higher level anisotropic filtering support found in the NV2x architecture.
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.
Estimated model performance ranking:
GF2 MX100 < GF2 MX200 < GF2 MX < GF2 MX400 < GF4 MX420 < GF2 GTS < GF2 Pro < GF2 Ti VX < GF2 Ti < GF2 Ultra < GF4 MX440
===NV20 | GeForce 3===
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon; NVIDIA called this "Lightspeed Memory Architecture". Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, the GeForce 3 can outperform those cards by up to 50% in situations where anti-aliasing is used or where the HSR features save considerable fillrate, although in some cases it still loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is clocked higher, making it the fastest of the three. The original GeForce 3 and the Ti 500 were only released in 64 MB configurations, while the Ti 200 was available with 64 MB or 128 MB. The GeForce 3, however, benefited very little from 128 MB of memory.
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called "Quincunx" that combines 2X MSAA with a RAMDAC-based filter; it was intended to deliver better quality than 2X MSAA/SSAA without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x) incur a heavy performance decrease (sometimes as much as 50%). Compared to its main competitor, ATI's R200 series, the GeForce 3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.
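The fillrate difference between MSAA and SSAA mentioned above comes down to simple arithmetic: SSAA textures and shades every sub-sample, while MSAA shades each pixel once and replicates the color to the covered sub-samples. A minimal illustrative sketch (the function and figures are made up for illustration, not a driver API):

```python
# Illustrative sketch of why 4X MSAA costs far less fillrate than 4X SSAA:
# SSAA runs texturing/shading per sub-sample, MSAA only once per pixel.

def texture_ops(width, height, samples, msaa):
    """Texture/shading operations needed for one full-screen pass."""
    pixels = width * height
    if msaa:
        # MSAA: one shaded color per pixel, replicated to covered samples.
        return pixels
    # SSAA (or no AA with samples=1): every sub-sample shaded independently.
    return pixels * samples

w, h = 1024, 768
no_aa   = texture_ops(w, h, 1, msaa=False)
ssaa_4x = texture_ops(w, h, 4, msaa=False)
msaa_4x = texture_ops(w, h, 4, msaa=True)

print(ssaa_4x // no_aa)  # 4 -> 4X SSAA quadruples the texturing work
print(msaa_4x // no_aa)  # 1 -> 4X MSAA textures no more than no AA at all
```

MSAA still pays extra memory bandwidth for storing and resolving the additional samples, which is why it is cheaper than SSAA but not free.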
===NV25 to NV28 | GeForce 4===
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is generally similar to the GeForce 3. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and Ti 4600 were rebranded as the Ti 4800 SE and Ti 4800 respectively, while the AGP 8x variant of the Ti 4200 was simply known as the "Ti 4200 with AGP 8x".
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.
===NV3x | GeForce FX===
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of earlier models, but perform better. The high-performance models such as the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way. There were some PCIe models made, named GeForce PCX 5xxx.
5200 < 5500 < 5200 Ultra < 5600 < 5600 Ultra < 5700 < 5700 Ultra < 5800 < 5800 Ultra < 59x0 < 59x0 Ultra
===NV4x | GeForce 6===
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all around but drops palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.
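On GPUs without hardware palette support, wrappers typically work around the missing feature by expanding each 8-bit palette index into a 32-bit RGBA texel at upload time. A minimal sketch of that expansion (the function name and data layout are illustrative, not from any particular wrapper):

```python
# Sketch: expand an 8-bit palettized texture into 32-bit RGBA, the usual
# software workaround on GPUs that dropped hardware palette support.

def expand_p8_to_rgba(indices, palette):
    """indices: iterable of 8-bit palette indices, one per texel.
    palette: 256 (r, g, b, a) tuples.
    Returns a flat bytes object, 4 bytes per texel."""
    out = bytearray()
    for i in indices:
        out.extend(palette[i])
    return bytes(out)

# Tiny 2x2 texture using a 256-entry palette (mostly unused here).
palette = [(0, 0, 0, 255)] * 256
palette[1] = (255, 0, 0, 255)   # index 1 -> opaque red
palette[2] = (0, 255, 0, 255)   # index 2 -> opaque green

texels = [1, 2, 2, 1]
rgba = expand_p8_to_rgba(texels, palette)
print(len(rgba))  # 16 bytes: 4 texels x 4 bytes each
```

The cost of this workaround is fourfold texture memory use and the loss of fast palette animation, which is why games that rely on palette tricks can misbehave on these GPUs.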
Some of the GF 6800s and most of the GF 6600s made use of a PCIe to AGP bridge chip.
===GeForce 7===
The GeForce 7 was the last generation of NVIDIA graphics chips to support AGP; ATI/AMD continued to offer AGP-compatible graphics solutions for a while longer, up to the Radeon HD 4670. While mostly a direct improvement of the GeForce 6 series, it was the first generation to use NVIDIA's new GPU naming scheme. The initial flagship 7800 GTX was later superseded by the 7900 GTX.
===GeForce 8===
The GeForce 8 series was introduced in late 2006 as the first Direct3D 10-compatible GPU generation, and as the first product of a paradigm shift in which GPUs became general-purpose computational devices, opening them up to non-graphical uses through CUDA and OpenCL. Many architectural changes have implications for older games, such as improved math precision, the lack of proper 16-bit framebuffer support, and different gamma-correction behaviour in anti-aliasing and texture filtering. Note that two different chips were sold under the 8800 GTS name.
==Compatibility notes==
* With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.