<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://www.vogonswiki.com/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Martin</id>
		<title>Vogons Wiki - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://www.vogonswiki.com/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Martin"/>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php/Special:Contributions/Martin"/>
		<updated>2026-04-20T06:38:53Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.30.2</generator>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3dfx&amp;diff=1387</id>
		<title>3dfx</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3dfx&amp;diff=1387"/>
				<updated>2013-05-19T06:21:31Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Voodoo 4/5 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:3Dfxlogo old.png|right|150px|alt=Old 3dfx logo]]&lt;br /&gt;
&lt;br /&gt;
3dfx (written as 3Dfx until 1999) was a 3D graphics chipset manufacturer and later a graphics card manufacturer. Founded in 1994, the company was one of the pioneers of 3D graphics in the PC industry in the mid-to-late 1990s. Its products were mainly popular as PC 3D game accelerators, but were also used in arcade machines and professional visualization systems.&lt;br /&gt;
&lt;br /&gt;
[[File:3dfxlogo new.png|right|140px|alt=New 3dfx logo]]&lt;br /&gt;
&lt;br /&gt;
They played an important role in the 3D graphics industry until December 15, 2000, when most of their assets were purchased by [[NVIDIA]] Corporation, after which the company filed for bankruptcy and officially went defunct in 2002.&lt;br /&gt;
&lt;br /&gt;
== General 3dfx information ==&lt;br /&gt;
&lt;br /&gt;
3dfx cards - namely their proprietary Glide API - can be considered one of the prime reasons to use vintage hardware today, because many early 3D games from 1996 onward had versions for at least some 3dfx cards, and in many cases these cards delivered the superior image quality. A notable example is Unreal (1998), a game that was initially geared towards software rendering, but had a Glide renderer added during development as soon as it was clear that the Voodoo would come out as the best 3D accelerator. The game also shipped Direct3D and OpenGL renderers, but Direct3D was still in its infancy at the time and even the OpenGL renderer was not Epic's best effort; players with competitor cards therefore had to wait for Epic's patches to improve the graphics, and in the end it took fan-made patches to provide competitive renderers.[http://www.xbitlabs.com/articles/graphics/display/voodoo3-3000.html]&lt;br /&gt;
&lt;br /&gt;
It was also common for game developers to put 3dfx logos on their games' boxes, leading to the misconception that a fair number of games supported Glide when they actually did not, or merely shipped a special MiniGL driver for 3dfx cards. This was again due to 3dfx being the dominant 3D solution at the time, and a brand well known among PC gamers. If software does not directly access either glide.dll, glide2x.dll/ovl or glide3x.dll, it cannot be said to support the Glide API directly.&lt;br /&gt;
&lt;br /&gt;
Since Glide was a proprietary interface, there were third-party efforts from early on to bring it to all 3D cards. Glide wrappers have reached a level where they can properly emulate how those games would look on a real Voodoo card and can be considered a viable alternative to the real hardware. One caveat is that games written for Win9x are not necessarily compatible with modern operating systems, so only an (ideally period-correct) Win9x system can be guaranteed to play all Glide games properly.&lt;br /&gt;
&lt;br /&gt;
The main weak points of all vintage cards, apart from incompatibility with modern mainboards/operating systems, are the lack of full-screen anti-aliasing (addressed with the V5) and anisotropic filtering (introduced with the GeForce 256, but only genuinely useful from about the GeForce 3 onward).&lt;br /&gt;
&lt;br /&gt;
A disadvantage of 3dfx's SLI multi-GPU solution (V2 SLI/V5 for consumer cards) is that it is prone to slight horizontal artifacts somewhat akin to screen tearing, which result from the multiple chips not working fully synchronously. This can be prevented by activating VSync in the drivers or in the games, a solution which in turn causes mouse lag in many cases.&lt;br /&gt;
&lt;br /&gt;
All AGP 3dfx cards use the port as a mere 66 MHz PCI port and do not utilize any of the special features that AGP offers. All AGP 3dfx cards are AGP 2x (3.3V) and must not be used in newer 1.5V slots; only retail Voodoo 4 4500 cards are AGP 4x 1.5V capable. PCI and AGP versions of 3dfx hardware were considered to perform virtually identically in contemporary games, but testing has shown that later games can profit considerably from the higher bus bandwidth.[http://translate.google.com/translate?hl=en&amp;amp;sl=de&amp;amp;tl=en&amp;amp;u=http%3A%2F%2Fwww.voodooalert.de%2Fde%2Fcontent%2Ftests%2Fagp_vs_pci_v44500.php]&lt;br /&gt;
&lt;br /&gt;
Cards of Voodoo2 SLI/Voodoo 3 class scale with CPUs up to about a ~1 GHz Intel Pentium III Coppermine, although a PIII 500 Katmai should be enough to play all Glide games fluidly. AMD's K6 line can be considered second choice when building a 3dfx-centered PC, because these CPUs can be a significant bottleneck in some later games. Pentium Classic and Pentium MMX CPUs will only be able to run early Glide titles at full speed. Older games should be able to cope with faster CPUs; exceptions are listed [[List of games that require specific CPUs to run properly|here]]. Lastly, it should be noted that Voodoo Graphics cards will not work with K7 (Athlon) CPUs, and Voodoo2 cards need special third-party drivers to work with these CPUs.&lt;br /&gt;
&lt;br /&gt;
Community-made resources for 3dfx cards include drivers such as Amigamerlin, x3dfx and SFFT, which can provide more features and speed than the latest official drivers from 2000 (some even allow the cards to be run under Windows XP), as well as tools such as V.Control which provide more in-depth tweaking options. For potentially better OpenGL compatibility or speed, one can try the MesaFX standalone OpenGL driver or Metabyte's WickedGL MiniGL driver.&lt;br /&gt;
&lt;br /&gt;
A method to improve texture quality on Voodoo cards is setting a negative LOD bias in the driver settings, resulting in a sharper image. However, this causes a slight performance hit and leads to very noticeable texture shimmering. Because of this, the function is essentially only usable together with FSAA on VSA-100 cards.[http://www.anandtech.com/show/580/22][http://quake3tweaks.tripod.com/lod_nn4.html]&lt;br /&gt;
&lt;br /&gt;
All 3dfx cards share the fact that the graphics core and memory always run at the same frequency. This makes overclocking generally harder than on other cards, because in many cases the memory will hit its limit earlier than the core. Still, it is possible: V1/V2 cards can be overclocked via environment variables, while the other cards offer an option in the driver for this. With most drivers this needs to be unlocked with a [http://www.falconfly.de/downloads/overclock.zip special utility from 3dfx], which also allows the user to set VSync options for both DirectX and OpenGL. Better cooling, e.g. by means of a case fan, is advised when overclocking.&lt;br /&gt;
&lt;br /&gt;
Note that 3dfx cards can react quite sensitively to overclocked system buses, such as the commonly attempted 133 MHz FSB on 100 MHz-specified Intel 440BX chipset boards. Such an overclock on these boards will result in an 89 MHz (instead of 66 MHz) frequency for the AGP bus due to the lack of a proper divider, which can result in garbled BIOS screens and IDE drive corruption. Finally, note that all overclocking happens at the user's own risk.&lt;br /&gt;
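The out-of-spec AGP frequency mentioned above follows directly from the 440BX's fixed 2/3 FSB-to-AGP clock ratio; a minimal sketch of the arithmetic:&lt;br /&gt;

```python
# The 440BX derives the AGP clock from the FSB with a fixed 2/3 divider,
# intended for a 100 MHz FSB. There is no extra divider for 133 MHz, so
# the AGP bus ends up far out of spec.
def agp_clock_mhz(fsb_mhz: float) -> float:
    return fsb_mhz * 2 / 3

print(f"{agp_clock_mhz(100):.1f} MHz")  # in spec (~66 MHz)
print(f"{agp_clock_mhz(133):.1f} MHz")  # ~89 MHz, risky for AGP cards
```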
&lt;br /&gt;
== Getting the best compatibility ==&lt;br /&gt;
&lt;br /&gt;
For better compatibility and versatility, it is common practice among vintage computer enthusiasts to have multiple video or sound cards in one system. Back in the day, this was widespread and indeed necessary for the 3D-only 3dfx cards with a loop cable (V1/V2). That way, one can easily have a faster card for OpenGL/D3D (or a card supporting one of the other proprietary 3D APIs) combined with e.g. V2 SLI, which will automatically engage when Glide is chosen in games. This may cause issues with some cards if OpenGL/D3D is for some reason needed on the 3dfx card(s).&lt;br /&gt;
&lt;br /&gt;
Another way is to combine said non-3dfx card with a 2D+3D 3dfx card, one of them being AGP and the other PCI. Since both are full video cards, one would need to perform the switch in the BIOS under &amp;quot;Primary VGA adapter&amp;quot;, &amp;quot;Boot from AGP/PCI&amp;quot; or similar (if the BIOS supports it), depending on which card is needed. This method has the disadvantage of requiring the monitor cable to be moved each time, because there is no passthrough; a monitor with multiple inputs or a VGA or KVM switch would solve that problem, potentially with DVI for one of the cards if available. This should work very reliably without any conflicts.&lt;br /&gt;
&lt;br /&gt;
It is also possible to take advantage of multi-monitor support in Windows 98, either with a multi-input monitor (switching between inputs on the monitor itself) or with two monitors. However, this has been reported to cause Windows to fall back to software OpenGL rendering as long as the secondary display is enabled, so it is perhaps not the optimal solution. Direct3D hardware acceleration only works on the primary display.&lt;br /&gt;
&lt;br /&gt;
Finally, for maximum Glide compatibility, one could even use three cards (e.g. V1, V2 and V3/4/5) and switch between the cards by copying the appropriate glide2x.dll/glide3x.dll drivers into the game directory depending on which card the game should run with. When using this method, it is important to install the drivers in ascending order, so that games which access the drivers in the Windows folder use the newest 3dfx card. For DOS games, one would analogously copy Glide2x.ovl into the game folder.&lt;br /&gt;
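The copying routine described above can be sketched as a small script (illustrative only: the folder layout and card names are assumptions, and on a real retro system this would more likely be a batch file):&lt;br /&gt;

```python
import shutil
from pathlib import Path

# Hypothetical layout: a per-card copy of the Glide driver DLLs kept in
# separate folders, populated when each card's drivers were installed.
DRIVER_STORE = {
    "voodoo1": Path("C:/glide/voodoo1"),  # glide2x.dll for Voodoo Graphics
    "voodoo2": Path("C:/glide/voodoo2"),  # glide2x.dll for Voodoo2
    "voodoo3": Path("C:/glide/voodoo3"),  # glide2x/glide3x.dll for V3 and up
}

def use_card(card: str, game_dir: Path) -> None:
    """Copy the chosen card's Glide DLLs into the game directory, so the
    game loads them instead of the ones in the Windows folder."""
    for dll in DRIVER_STORE[card].glob("glide*"):
        shutil.copy2(dll, game_dir / dll.name)

# Example: run Unreal on the Voodoo2 SLI pair.
# use_card("voodoo2", Path("C:/games/unreal"))
```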
&lt;br /&gt;
To get games which were originally made for Voodoo Graphics to work with Voodoo2 boards, one can use the following SST variables in the autoexec.bat, either directly or via an external batch file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;SET SST_GRXCLK=90&lt;br /&gt;
SET SST_FT_CLK_DEL=0x4&lt;br /&gt;
SET SST_TF0_CLK_DEL=0x6&lt;br /&gt;
SET SST_TF1_CLK_DEL=0x6&lt;br /&gt;
SET SST_VIN_CLKDEL=0x1&lt;br /&gt;
SET SST_VOUT_CLKDEL=0x0&lt;br /&gt;
SET SST_TMUMEM_SIZE=2&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Voodoo Graphics ==&lt;br /&gt;
&lt;br /&gt;
[[File:Voodoo_1.jpg|200px|thumb|Diamond Monster 3D (4 MB)]]&lt;br /&gt;
[[File:Canopus_Pure3D.jpg|200px|thumb|Canopus Pure3D (6 MB)]]&lt;br /&gt;
&lt;br /&gt;
The Voodoo Graphics chipset, based on the SST1 architecture, was 3dfx's first foray into the PC market. Its release in 1996 was primarily made possible by EDO DRAM declining in price, which allowed good profits on an adequately equipped ~$300 Voodoo Graphics solution. The PCI cards, which were manufactured by board partners, feature a frame buffer processor, a texture mapping unit (TMU), a RAMDAC and 4 MB EDO DRAM (some later versions were released with 6 or even 8 MB). Both the RAM and the graphics processors operate at 50 MHz, with 2 MB RAM used as framebuffer and 2 MB as texture memory. The RAM banks are on independent 64-bit buses. A Pentium 90 with 8 MB RAM was considered the minimum specification for these cards.&lt;br /&gt;
&lt;br /&gt;
The chipset was rich in features, boasting perspective-correct texture mapping, bilinear texture filtering, level-of-detail MIP mapping, sub-pixel correction, polygonal Gouraud shading and texture modulation. It natively supported Direct3D 5 and was notable for supporting all of its required features at adequate speed, unlike previous 3D chipsets such as the [[S3]] Virge and [[Matrox]] Mystique. It also introduced Glide, 3dfx's own proprietary API, which worked initially under DOS and later under Windows 9x and NT 4.0/2000. Glide was essentially a subset of OpenGL, dropping features deemed unnecessary for PC gaming at the time as well as functions not supported by the SST1 architecture. OpenGL games were initially only supported through MiniGL, an OpenGL driver implementing only the functions needed by a specific game, most notably for Quake-engine games. In May 1999, 3dfx released a full OpenGL ICD, providing support for all OpenGL applications.&lt;br /&gt;
&lt;br /&gt;
Voodoo Graphics does not have 2D functions like VGA or GUI acceleration, meaning that it has to be used in conjunction with a standard 2D card by means of a [[VGA passthrough cable]]. Voodoo cards have relays onboard that switch between passthrough mode and output mode, controlled by the driver or DOS game/Glide. Unfortunately the passthrough impacts 2D quality because of the signal passing through additional circuitry that may not be of optimum quality. High resolution GUI modes are most noticeably affected.&lt;br /&gt;
&lt;br /&gt;
Thanks to 3dfx's efforts with game developers and publishers and the excellent performance of their solution, the company's technology was quickly adopted as the de facto standard in PC 3D gaming. The Voodoo 1 enjoyed lengthy support from game developers. Despite supporting resolutions of at most 640x480 (800x600 without the use of Z-buffering) and 16-bit color depth, the card remained usable with games into 2000.&lt;br /&gt;
&lt;br /&gt;
It has been reported that Voodoo1 cards produce artifacts on the screen and do not play most DOS Glide games correctly when the FSB is higher than 100 MHz, even if the PCI bus runs at default clock speeds.&lt;br /&gt;
&lt;br /&gt;
The prime competitors upon its release were the [[PowerVR]] PCX1 and [[Rendition]] Vérité V1000 chipsets, the latter of which already featured complete 2D processing onboard. Other competitors included the [[Matrox]] Millennium II/[[Matrox Mystique]], [[ATI]] Rage II, [[S3]] Virge and [[NVIDIA]] RIVA 128, all of which had 2D functions, but only the RIVA 128 could be said to match the Voodoo 1 in performance, while of course lacking Glide support.&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:'''&lt;br /&gt;
The card's prime use case is early statically linked Glide games in DOS that depend on the first Voodoo chipset. Later games, from about 1997 onward, are better played on the subsequent Voodoo cards.&lt;br /&gt;
&lt;br /&gt;
Cards with more than 4 MB are a trade-off: they have somewhat better compatibility with later games, but lose some compatibility with first-generation titles. 6 MB versions only have more texture memory and are therefore still limited to 640x480; 8 MB boards can display 800x600 thanks to the extra framebuffer memory. Both offer somewhat smoother frame rates in games with heavier texture memory usage, such as Unreal and Quake 2.&lt;br /&gt;
&lt;br /&gt;
== Voodoo Rush ==&lt;br /&gt;
&lt;br /&gt;
[[File:Jazz Multimedia Voodoo Rush.jpg|200px|thumb|Jazz Adrenaline 3D (Alliance ProMotion)]]&lt;br /&gt;
[[File:Voodoo_Rush_with_Macronix_2D.jpg|200px|thumb|Procomp G108 (Macronix)]]&lt;br /&gt;
&lt;br /&gt;
Voodoo Rush was released in August 1997 for the PCI bus and addressed the main shortcoming of the Voodoo Graphics by being a complete 2D/3D solution. The chipset combined either an [[Alliance Semiconductor]] AT25/AT3D or [[Macronix]] 2D core on the same board as the exact same Voodoo chipset (on some cards the 3dfx part came as a daughterboard).&lt;br /&gt;
&lt;br /&gt;
The combination of two independent chipsets created a bottleneck for the 3dfx part and therefore roughly 10% lower performance. The cards had 4, 6 or 8 MB total memory, with only the 8 MB versions offering 4 MB for texture space, similarly to Voodoo Graphics. Some cards had slightly higher clocks to close the performance gap. The cards also were not always fully compatible with existing games, leading to specific Voodoo Rush patches for some titles, e.g. Tomb Raider.&lt;br /&gt;
&lt;br /&gt;
The AT3D chipset has rudimentary 3D functions which can be activated, meaning that Rush cards that feature it have two 3D chipsets.&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:''' Rush cards were an infamous early attempt at a 2D/3D card by 3dfx and should be avoided when building a vintage gaming system; they are primarily a curiosity. That said, the cards may be useful in fringe cases, such as a system with only a single available PCI slot that cannot support a Banshee or Voodoo 3 because of a weak power supply or weak voltage regulators. They should not be difficult to acquire, since demand is not as high as for other 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
== Voodoo2 ==&lt;br /&gt;
&lt;br /&gt;
[[File:3dfx_Voodoo_2.jpg|200px|thumb|Provideo PV830 (reference Voodoo2 with 110MHz rated RAM)]]&lt;br /&gt;
&lt;br /&gt;
Released in early 1998, the Voodoo2 chipset (SST96) expanded upon its predecessor by adding a second texture processor, featuring 8 or 12 MB EDO DRAM and supporting Direct3D 6. The clock was increased to 90 MHz, almost doubling the performance compared to the Voodoo1. Games utilizing the Voodoo2's second texture unit by means of single-pass multitexturing see a further performance increase; the first notable games to do so were GLQuake, Quake II (both 1997) and Unreal (1998). Single-pass trilinear filtering was possible as well.&lt;br /&gt;
&lt;br /&gt;
The cards also support SLI (Scan-Line Interleave), a technique which allows two cards to run simultaneously, drawing the lines of the image in turn, boosting performance and enabling a resolution of up to 1024x768. With one card installed, up to 800x600 is possible regardless of memory. SLI operation requires a special SLI cable; it is also possible to use a [[How to make a Voodoo 2 SLI cable|modified floppy drive cable]].&lt;br /&gt;
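Conceptually, SLI simply assigns alternating scanlines to the two cards; a toy sketch of that split (an illustration of the idea, not how the hardware is actually programmed):&lt;br /&gt;

```python
# Conceptual model of Scan-Line Interleave: card 0 renders the even
# scanlines, card 1 the odd ones, and the final frame interleaves both.
def sli_assignment(height: int) -> dict:
    return {card: [line for line in range(height) if line % 2 == card]
            for card in (0, 1)}

# For a 6-line frame: card 0 draws lines [0, 2, 4], card 1 draws [1, 3, 5].
print(sli_assignment(6))
```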
&lt;br /&gt;
Following the same principle as the Voodoo1, there are three independent 64-bit RAM buses: one for the frame buffer processor and one for each TMU. While 4 MB RAM is available for the frame buffer, textures have to be copied into the RAM of both TMUs. So even though there are technically 4 or 8 MB of texture memory on a card, effectively only 2 or 4 MB are available for textures. With SLI this amount does not grow; instead, the textures are copied two more times.&lt;br /&gt;
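The effective texture memory can be computed from the figures in this section (a sketch; numbers are per single card, since SLI does not add unique texture space):&lt;br /&gt;

```python
# Voodoo2 memory layout: 4 MB framebuffer, and two TMUs that each hold a
# full copy of the texture data, so only half the texture RAM is unique.
def effective_texture_mb(total_mb: int, framebuffer_mb: int = 4,
                         tmus: int = 2) -> float:
    return (total_mb - framebuffer_mb) / tmus

print(effective_texture_mb(8))   # 8 MB card  -> 2.0 MB of unique textures
print(effective_texture_mb(12))  # 12 MB card -> 4.0 MB of unique textures
```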
&lt;br /&gt;
Voodoo2 still requires the passthrough cable and use of a separate 2D card. However, the chipset does have some 2D features and there is a driver for Linux that allows one to use Voodoo2 as a GUI accelerator.&lt;br /&gt;
&lt;br /&gt;
A large number of cards from different manufacturers were released, with some deviating from the reference design and/or featuring extra cooling and even slight factory overclocks. The Voodoo2 remained the standard for PC 3D accelerator cards throughout 1998.&lt;br /&gt;
&lt;br /&gt;
Competitors included the NVIDIA RIVA TNT, ATI Rage 128 and the Matrox G200.&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:''' The iconic Voodoo2 SLI setup holds nostalgic value for many. Voodoo2 SLI is viable for almost all Glide games, and has the advantage over the Voodoo 3 that it can play more of the Glide games originally designed only for the Voodoo1, given the necessary environment variable configuration. Weak points include occasional slight stutter in texture-intensive games due to texture thrashing, and possible image quality problems resulting from the passthrough design.&lt;br /&gt;
&lt;br /&gt;
== Banshee ==&lt;br /&gt;
&lt;br /&gt;
Released in 1998, the Banshee was 3dfx's first fully integrated 2D+3D card. It combines a new 2D core, a single-TMU Voodoo2 and the RAMDAC into one chip. It is clocked at 100 MHz, meaning the midrange Banshee was actually slightly faster in the then-prevalent single-textured games than the high-end Voodoo2, yet clearly falls behind in games utilizing multi-texturing. Banshee cards were the first 3dfx cards to universally feature some kind of cooling solution and came equipped with 8 MB/16 MB SDRAM or SGRAM, with both PCI and AGP versions available.&lt;br /&gt;
&lt;br /&gt;
Its 2D acceleration was very capable. It rivaled the fastest 2D cores from Matrox, Nvidia, and ATI, consisting of a 128-bit GUI engine and a 128-bit VESA VBE 3.0 VGA core. DirectDraw is accelerated, and the GUI portion supports all of the Windows Graphics Device Interface (GDI) in hardware. The GUI engine achieved near-theoretical maximum performance with a null driver test in Windows NT.&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:''' Banshee cards are far superior to the Voodoo Rush, although they have a few bugs in areas such as video playback and DOS VESA modes. As such they are not ideal gaming choices, although they can still be useful for some games.&lt;br /&gt;
&lt;br /&gt;
== Voodoo 3 ==&lt;br /&gt;
&lt;br /&gt;
[[File:3dfx_Voodoo_3_3000.jpg|200px|thumb|Voodoo 3 3000 AGP]]&lt;br /&gt;
[[File:3dfx_Voodoo_3_3500.jpg|200px|thumb|Voodoo 3 3500 TV]]&lt;br /&gt;
&lt;br /&gt;
The Voodoo 3, codenamed &amp;quot;Avenger&amp;quot;, was announced at COMDEX in November 1998 and released on April 3, 1999. Following the buyout of STB, 3dfx was now manufacturing their own cards. The Voodoo 3 was basically a higher-clocked Banshee core outfitted with a second texture unit and some bugfixes. The cards were released in four different flavors: the 125 MHz Voodoo 3 1000, the 143 MHz Voodoo 3 2000, the 166 MHz Voodoo 3 3000, and the 183 MHz Voodoo 3 3500 TV with integrated TV tuner. Except for the low-end V3 1000, which could also come with 8 MB, all cards featured 16 MB. The V3 line came in both PCI and AGP versions, with the 3500 being AGP-only. Some PCI versions featured SGRAM instead of the standard SDRAM. Thanks to the integrated 350 MHz RAMDAC (V3 3000/3500), the maximum resolution is 2048x1536 at about 75 Hz.&lt;br /&gt;
&lt;br /&gt;
Now facing stronger competition from [[NVIDIA|NVIDIA's]] RIVA TNT line, which already supported 32-bit color depth, 1024x1024 textures and AGP texturing, the Voodoo 3 line was somewhat panned by critics and called outdated in terms of features. It was still considered very competitive speed-wise, however, because 32-bit rendering introduced a big performance hit on competitor cards. At that time, 3dfx's marketing centered on speed, but to demonstrate that the image quality was still better than their previous year's high-end setup, they coined the term &amp;quot;22-bit&amp;quot;, describing the fact that the RAMDAC of the card would apply either a 2x2 box or 4x1 line filter to the image, depending on the driver settings, masking some of the dithering.[http://www.beyond3d.com/content/articles/61/1]&lt;br /&gt;
&lt;br /&gt;
Its prime competitor was the NVIDIA RIVA TNT2, whose Ultra variant was widely considered by reviewers to be the better choice overall, mainly due to superior Direct3D performance and more features. Still, arguably, 2048x2048 textures and 32-bit rendering were not as significant in 1999 as the better game compatibility of the Voodoo 3. Other competitor cards included the ATI Rage 128 Pro and S3 Savage 4.&lt;br /&gt;
&lt;br /&gt;
The GeForce 256, which came out later that year, was superior in features and in D3D and OpenGL performance, yet could merely tie the Voodoo 3 in some Glide-centric/CPU-limited games such as Unreal Tournament.&lt;br /&gt;
&lt;br /&gt;
A noteworthy problem with the Voodoo 3 and other cards of this generation was their higher power demands, which certain mainboards of the time could not cope with. The issue lay specifically in the voltage regulators for the AGP slot. Intel specified 6 A at 3.3 V for this slot, but due to cost-saving measures some mainboard manufacturers used parts specified for less than that. Voodoo 3 cards were reported to draw up to 4.8 A, which could cause severe thermal issues, crashes and even hardware failures on these boards. One known manufacturer with this problem was Gigabyte. Asus had a similar issue as well, although only two models were reportedly affected; only NVIDIA RIVA TNT cards are mentioned, but it is safe to assume that the issue is present for all AGP video cards of that generation and later. Lists for both manufacturers can be found in the links section.&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:''' A Voodoo 3 2000 roughly matches Voodoo2 SLI 12 MB in speed, while only taking one slot and offering better real-world performance due to more texture memory (at 1024x768, the highest resolution available with Voodoo2 SLI, there would be 16 - 4.5 = 11.5 MB available for textures instead of 4 MB, nearly three times as much). The image quality is slightly better thanks to more advanced RAMDAC filtering and the end of the passthrough design.&lt;br /&gt;
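The 11.5 MB figure can be reproduced with a quick calculation (a sketch assuming 16-bit color and three screen-sized buffers - front, back and Z - at 1024x768):&lt;br /&gt;

```python
# At 1024x768 with 16-bit color, each buffer (front, back, Z) takes
# 1024 * 768 * 2 bytes = 1.5 MB; three of them total 4.5 MB.
def free_texture_mb(total_mb, width=1024, height=768,
                    bytes_per_pixel=2, buffers=3):
    buffer_mb = width * height * bytes_per_pixel / 2**20
    return total_mb - buffers * buffer_mb

print(free_texture_mb(16))  # Voodoo 3: 16 - 4.5 = 11.5 MB left for textures
```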
&lt;br /&gt;
As with all Voodoo cards, the V3 will run games requiring early Direct3D features (8-bit paletted textures or table fog). Taking this into account together with its good DOS compatibility/speed, wide availability and low cost, the Voodoo 3 can be considered among the best all-around cards for vintage gaming purposes of that time frame and before.&lt;br /&gt;
&lt;br /&gt;
== Voodoo 4/5 ==&lt;br /&gt;
&lt;br /&gt;
[[File:PowerColor_Voodoo_4_4500.jpg|200px|thumb|PowerColor Evilking IV]]&lt;br /&gt;
[[File:3dfx_Voodoo_5_5500.jpg|200px|thumb|Voodoo 5 5500]]&lt;br /&gt;
&lt;br /&gt;
The VSA-100 (Voodoo Scalable Architecture), codenamed &amp;quot;Napalm&amp;quot;, was the final product from 3dfx and was released in 2000. Only the single-chip Voodoo 4 4500 and the dual-chip Voodoo 5 5500 made it to market, both clocked at 166 MHz and released in both AGP and PCI versions. It is a further refinement of the architecture of all previous products, with some changes and additions such as two pixel pipelines with one texture unit each (instead of one pipeline with two texture units), larger texture caches and data paths expanded from 16-bit to 32-bit. The chip supports 32-bit color 3D rendering, 2048x2048 textures, and FXT1 and DXTC texture compression.&lt;br /&gt;
&lt;br /&gt;
Voodoo 4 4500 cards have 32 MB SDRAM. Voodoo 5 5500 cards have 64 MB SDRAM, only 32 MB of which are actually usable due to the SLI method used, much like with Voodoo2 SLI. Voodoo 5 cards require supplementary power in the form of a single Molex connector.&lt;br /&gt;
&lt;br /&gt;
The marketing was now more centered on image quality (&amp;quot;cinematic effects&amp;quot;) than speed: Through the added &amp;quot;T-buffer&amp;quot; the Voodoo 4 4500 is capable of 2x RGSSAA (rotated-grid supersampling anti-aliasing), while the Voodoo 5 5500 supports up to 4x RGSSAA.[http://www.anandtech.com/show/350/2][http://www.beyond3d.com/content/articles/37/1] Compared to other methods such as MSAA (multisample anti-aliasing), this variant is considered higher quality, because it smooths the whole screen and eliminates texture flickering to a great extent, therefore generating a much &amp;quot;calmer&amp;quot; and more realistic image especially when moving in the game.&lt;br /&gt;
&lt;br /&gt;
Unfortunately this comes with a large performance impact due to the high fillrate requirements of rendering at a higher resolution and downsampling to the actual resolution. Additionally, it has the weakness of blurring text and UI elements in games; this undesired effect appears to be more pronounced in the 2x mode than in the 4x mode. Generally speaking, the 4x mode is only practically useful up to 800x600, depending on the game.&lt;br /&gt;
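The fillrate cost of supersampling is easy to quantify, since n-times SSAA renders n times as many pixels before downsampling (a minimal sketch):&lt;br /&gt;

```python
# Supersampling renders the scene at a multiple of the output resolution,
# so the pixel (fillrate) cost scales linearly with the sample count.
def ssaa_pixel_cost(width: int, height: int, samples: int) -> int:
    return width * height * samples

base = ssaa_pixel_cost(800, 600, 1)  # 480000 pixels per frame
ssaa = ssaa_pixel_cost(800, 600, 4)  # 1920000 pixels per frame, 4x the work
print(base, ssaa)
```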
&lt;br /&gt;
Only two playable games are known to specifically take advantage of the added T-buffer hardware for effects other than anti-aliasing. The first is a custom-made Q3Test demo (ver. 1.08), into which 3dfx hacked motion blur support to promote their then-new cards. Screenshots from this demo emerged around November 1999, and the actual program was released in December 2000, after 3dfx went bankrupt. The demo consists of three maps and does not feature bots; the motion blur effect is seen on weapons, powerups and moving players. It is only displayed correctly with 4xFSAA turned on for OpenGL, and there is no known way to enable it in any other version of Quake 3.&lt;br /&gt;
&lt;br /&gt;
The second is Serious Sam: The First/Second Encounter, where T-buffer support is supposed to be used for depth-of-field effects. To activate it, the game must run in 16-bit color depth, FSAA must be on, and the following commands have to be entered in the console:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;ogl_iTBuffereffect=2&lt;br /&gt;
ogl_iTBufferSamples=4&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The game should then confirm T-buffer usage in the console on startup. Other valid values for the second command are 2 and 8, depending on the number of VSA-100 chips on the card used. The FSAA implementation in this game is reported to be faster than in other titles. Still, given the experimental nature, it is very unlikely that the DoF effects work as intended.&lt;br /&gt;
&lt;br /&gt;
The Voodoo 5 generally performs similarly to the GeForce 256 DDR, but was not competitive with high-end GeForce 2 cards, especially since Glide support in new games was rapidly declining by that time. Likewise, the delayed Voodoo 4 4500 was considered obsolete by reviewers upon its introduction, as the GeForce 2 MX performed better at a similar price point (the Voodoo 4 4500 was released almost four months after the Voodoo 5 5500). Despite this, reviewers pointed out that the VSA-100's SSAA implementation was superior to the competition's.&lt;br /&gt;
&lt;br /&gt;
Unlike the GeForce cards, which were Direct3D 7 capable, the VSA-100 line was still limited to Direct3D 6, as it lacked hardware T&amp;amp;L. This feature, introduced by NVIDIA a year earlier, was slowly taking off in 2000, and games that rely on it heavily are not optimally suited for 3dfx hardware. After 3dfx's demise, fan-made drivers tried to address this problem and make some games from 2001 onward playable on these cards.&lt;br /&gt;
&lt;br /&gt;
32-bit color depth rendering can be forced in the driver for games that don't natively support it (especially Glide games).&lt;br /&gt;
&lt;br /&gt;
'''Bottom line:''' The V4 4500 is not a large improvement over the V3, since it performs similarly and its new features are consequently of limited benefit.[http://translate.google.com/translate?hl=en&amp;amp;sl=de&amp;amp;tl=en&amp;amp;u=http%3A%2F%2Fwww.voodooalert.de%2Fde%2Fcontent%2Ftests%2Fv3vsv4.php] The V5 5500 is considerably faster and provides optimal Glide gameplay up to around 1280x1024 without AA, with the added possibility of enabling anti-aliasing for higher image quality at lower resolutions, which may be especially useful for older games that are locked to these modes. Among authentic hardware, VSA-100 can provide the best visual quality in these titles.&lt;br /&gt;
&lt;br /&gt;
To attain the best possible frame rates, the cards can be combined with fast CPUs such as Athlon XPs on KT333-chipset boards; AGP 3.3V support is necessary. Note that with 4xFSAA at any resolution, or at 1024x768x32 with no AA, the Voodoo5 hits its fillrate limit in many non-CPU-limited games, making faster CPUs effectively useless.[http://www.rashly3dfx.com/products/images/133fsbCPU.gif]&lt;br /&gt;
&lt;br /&gt;
Macintosh PCI versions of the V5 5500 have a DVI output for clearer image quality. Using this with DOS games may [[General monitor advices|cause problems]] due to locked refresh rates, though.&lt;br /&gt;
&lt;br /&gt;
== Other 3dfx cards ==&lt;br /&gt;
&lt;br /&gt;
The company also released other cards, such as the budget Velocity line (name adopted after the acquisition of STB), which came with only one TMU enabled, similarly to the Banshee, although the second one can reportedly be enabled by a registry hack. 3dfx also had plans for a Voodoo 5 6000, which would have come with four VSA-100 chips and would have been powered by an external power supply dubbed &amp;quot;Voodoo Volts&amp;quot;. About 150-250 of these were made as prototypes. These cards beat NVIDIA's GeForce 2 line and are even competitive with the GeForce 3 when paired with faster CPUs, and are also capable of 8x RGSSAA. The prototypes are considered &amp;quot;legendary&amp;quot; in the enthusiast community and are highly sought after, with prices easily reaching $1000.&lt;br /&gt;
&lt;br /&gt;
== User benchmarks ==&lt;br /&gt;
&lt;br /&gt;
''Main article: [[3dfx Benchmarks]]''&lt;br /&gt;
&lt;br /&gt;
== Video captures ==&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|LPocZ-FX8SU}}&lt;br /&gt;
{{#ev:youtube|_S4qCr77jJ8}}&lt;br /&gt;
{{#ev:youtube|mXpoRJjsr-g}}&lt;br /&gt;
{{#ev:youtube|HLWIhqAfFz0}}&lt;br /&gt;
{{#ev:youtube|wGLy2iIviek}}&lt;br /&gt;
{{#ev:youtube|QLBgaLOi7N4}}&lt;br /&gt;
{{#ev:youtube|eghlSdGvuC0}}&lt;br /&gt;
{{#ev:youtube|4j07Gmrw50E}}&lt;br /&gt;
&lt;br /&gt;
== External links ==&lt;br /&gt;
*[http://www.tdfx.de/eng/grafikkarten_alle.shtml Complete database of 3dfx cards]&lt;br /&gt;
*[http://www.falconfly.de/ Best resource for 3dfx drivers + other information]&lt;br /&gt;
*[http://www.3dfxzone.it/enboard/topic.asp?TOPIC_ID=1758 Glide games list]&lt;br /&gt;
*[http://www.zeus-software.com/downloads/nglide/compatibility NGlide wrapper compatibility list]&lt;br /&gt;
*[http://vogons.zetafleet.com/viewtopic.php?t=886 Complete list of Glide games for DOS]&lt;br /&gt;
*[http://www.youtube.com/playlist?list=PL2DC6912FD577F199 3D Acceleration Comparison with many 3dfx games]&lt;br /&gt;
*[http://translate.google.com/translate?hl=en&amp;amp;ie=ASCII&amp;amp;prev=_t&amp;amp;sl=de&amp;amp;tl=en&amp;amp;u=http://www.voodooalert.de/de/content/tests/index.php Many 3dfx tests and driver comparisons]&lt;br /&gt;
*[http://patrizio1.tripod.com/var.htm List of SST variables]&lt;br /&gt;
*[http://translate.google.com/translate?hl=en&amp;amp;ie=ASCII&amp;amp;prev=_t&amp;amp;sl=de&amp;amp;tl=en&amp;amp;u=http://web.archive.org/web/20000304054232/http://www.gigabyte.de/gigadeutsch/news/news.htm List of Gigabyte mainboards which will accept a Voodoo3]&lt;br /&gt;
*[http://translate.google.com/translate?hl=en&amp;amp;ie=ASCII&amp;amp;prev=_t&amp;amp;sl=de&amp;amp;tl=en&amp;amp;u=http://web.archive.org/web/20070304033237/http://rma.asus.de/support/FAQ/faq034_lx_tntrew.htm List of affected Asus motherboards with modding instructions]&lt;br /&gt;
*[http://www.3dgw.com/faq/moodys_voodoo2_faq.htm Moody's Voodoo2 FAQ]&lt;br /&gt;
*[http://floodyberry.com/carmack/johnc_plan_1998.html#d19980216 John Carmack on texture swapping with Voodoo cards]&lt;br /&gt;
*[http://www.falconfly.de/downloads/Q3_Motion_Blur.zip Q3Test 1.08 for Voodoo 5]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1386</id>
		<title>User:Martin</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1386"/>
				<updated>2013-05-19T05:56:56Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Вфхуи ZoPиЕ m&amp;lt;br&amp;gt;&lt;br /&gt;
СФИР Et. SEPOHЖ&amp;lt;br&amp;gt;&lt;br /&gt;
Chebzon фt Ymeztoix © 1959 zem&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Worker_and_Parasite.jpg]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=File:Worker_and_Parasite.jpg&amp;diff=1385</id>
		<title>File:Worker and Parasite.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=File:Worker_and_Parasite.jpg&amp;diff=1385"/>
				<updated>2013-05-19T05:56:01Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: СФИР Et. SEPOHЖ&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;СФИР Et. SEPOHЖ&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1384</id>
		<title>User:Martin</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1384"/>
				<updated>2013-05-19T05:48:58Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Вфхуи ZoPиЕ m&amp;lt;br&amp;gt;&lt;br /&gt;
СФИР Et. SEPOHЖ&amp;lt;br&amp;gt;&lt;br /&gt;
Chebzon фt Ymeztoix © 1959 zem&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1383</id>
		<title>User:Martin</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=User:Martin&amp;diff=1383"/>
				<updated>2013-05-19T05:48:32Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Вфхуи ZoPиЕ m&lt;br /&gt;
СФИР Et. SEPOHЖ&lt;br /&gt;
Chebzon фt Ymeztoix © 1959 zem&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1371</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1371"/>
				<updated>2013-05-12T07:26:10Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is neither Direct3D nor OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to use the NV1. Its audio consists of wavetable MIDI and DirectSound support, but with very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. For texture-intensive games the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA 128 ZX was also integrated onto Intel's i440BX-based RC440BX motherboard.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a &amp;quot;sanding&amp;quot; or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. The RIVA TNT was also integrated onto Intel's SR440BX and PowerColor's DREAMCODE motherboards, both of which are based on Intel's i440BX chipset.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface (rather than the regular variant's 128-bit), effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII. Like the original TNT, the RIVA TNT2 only supports trilinear approximation rather than true trilinear filtering. NVIDIA would not implement true trilinear filtering until the NV1x (GeForce 256 / GeForce2 / GeForce4 MX) architecture.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with significantly higher clock speeds than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb|GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon; NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely with anti-aliasing or when the HSR features save considerable fillrate. In some cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speeds, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were rebranded as the Ti 4800 SE and Ti 4800, respectively. The AGP 8x variant of the Ti 4200 was simply known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of the NV2x, but performance with these is improved compared to the older models. The high-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit bus and naming suffixes like LE, XT or VE, because they have been crippled in some way. Some PCIe models were made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and is the first Direct3D 9.0c-compliant GPU generation. It introduced support for Shader Model 3.0 and PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all-around but drops palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|QpGVOAuDfNc}}&lt;br /&gt;
{{#ev:youtube|h2KGiIen4n4}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1370</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1370"/>
				<updated>2013-05-12T07:23:44Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV3 | RIVA 128 &amp;amp; 128 ZX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is neither Direct3D nor OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to use the NV1. Its audio consists of wavetable MIDI and DirectSound support, but with very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. For texture-intensive games the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA 128 ZX was also integrated onto Intel's i440BX-based RC440BX motherboard.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a &amp;quot;sanding&amp;quot; or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface (rather than the regular variant's 128-bit), effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII. Like the original TNT, the RIVA TNT2 only supports trilinear approximation rather than true trilinear filtering. NVIDIA would not implement true trilinear filtering until the NV1x (GeForce 256 / GeForce2 / GeForce4 MX) architecture.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with significantly higher clock speeds than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In some cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAM DAC for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce 4 Ti line, the Ti 4200 is the slowest and the Ti 4800 the fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and the fog table. Their anti-aliasing and anisotropic filtering features are similar to those of earlier models, but performance with these features is improved. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all-around but drops palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|QpGVOAuDfNc}}&lt;br /&gt;
{{#ev:youtube|h2KGiIen4n4}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1369</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1369"/>
				<updated>2013-05-12T05:55:06Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is neither Direct3D nor OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. On the audio side it offers wavetable MIDI and DirectSound support, but very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, and 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB of SGRAM. For texture-intensive games the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB of RAM allows for very high resolutions. OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a &amp;quot;sanding&amp;quot; or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. &lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface rather than the 128-bit interface of the regular variant, effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations. &lt;br /&gt;
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII. As with the original TNT, the RIVA TNT2 only supports a trilinear approximation rather than true trilinear filtering. NVIDIA would not implement true trilinear filtering until the NV1x (GeForce 256 / GeForce2 / GeForce4 MX) architecture.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used low-quality analog circuit designs that produce blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various image output aspects. The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly faster than a GeForce 256 SDR. With its relatively low price and solid performance, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency- and bandwidth-improving features, combined with significantly higher clock speeds than NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably improved in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, as well as the environment mapped bump mapping and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In some cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAM DAC for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce 4 Ti line, the Ti 4200 is the slowest and the Ti 4800 the fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and the fog table. Their anti-aliasing and anisotropic filtering features are similar to those of earlier models, but performance with these features is improved. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all-around but drops palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|QpGVOAuDfNc}}&lt;br /&gt;
{{#ev:youtube|h2KGiIen4n4}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1368</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1368"/>
				<updated>2013-05-12T05:53:31Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is neither Direct3D nor OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. On the audio side it offers wavetable MIDI and DirectSound support, but very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, and 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB of SGRAM. For texture-intensive games the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB of RAM allows for very high resolutions. OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a &amp;quot;sanding&amp;quot; or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. &lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface rather than the 128-bit interface of the regular variant, effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations. As with the original TNT, the RIVA TNT2 only supports a trilinear approximation rather than true trilinear filtering.&lt;br /&gt;
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. In 3D performance at 16-bit color depth, the GeForce 2 MX is slightly faster than a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest of the three. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is generally quite similar to the GeForce 3. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were rebranded as the Ti 4800 SE and Ti 4800 respectively. The AGP 8x variant of the Ti 4200 was known simply as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce 4 Ti line, the Ti 4200 is the slowest and the Ti 4800 is the fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs.  They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9.  They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of the NV2x series, but their performance is improved compared to older models. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit memory bus and naming suffixes like LE, XT or VE, because they have been crippled in some way.  There were also some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x dramatically improved performance all-around but dropped palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|QpGVOAuDfNc}}&lt;br /&gt;
{{#ev:youtube|h2KGiIen4n4}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1367</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1367"/>
				<updated>2013-05-12T05:52:23Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is neither Direct3D nor OpenGL compatible, so it is only useful with games that use its proprietary API. This is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. Its audio side offers wavetable MIDI and DirectSound support but very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. In texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB of RAM allows for very high resolutions, and OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards. Unlike ATI's Rage 128, one of its main competitors, the RIVA TNT does not support true trilinear filtering; its trilinear filtering is an approximation. This results in reduced visual quality when viewing textures / MIP levels at far distances (best described as a &amp;quot;sanding&amp;quot; or dither-pattern type effect) when trilinear filtering is enabled, but otherwise the RIVA TNT's visual quality is great. &lt;br /&gt;
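The difference between true trilinear filtering and an approximation can be sketched in a few lines. This is a toy model, not the TNT's actual hardware path: the mip "samples" are plain numbers standing in for bilinear-filtered colors.&lt;br /&gt;

```python
# True trilinear filtering blends the two nearest mip levels per pixel,
# while an approximation (as on TNT/TNT2) effectively snaps or dithers
# between them - the abrupt transition is what produces the visible
# "sanding" banding at mip boundaries.

def true_trilinear(mip_a: float, mip_b: float, lod_frac: float) -> float:
    # Per-pixel linear interpolation between adjacent mip levels
    return mip_a * (1.0 - lod_frac) + mip_b * lod_frac

def nearest_mip_approx(mip_a: float, mip_b: float, lod_frac: float) -> float:
    # Pick whichever mip level is closer - no per-pixel blend
    return mip_b if lod_frac >= 0.5 else mip_a

# Halfway between two mip levels the approximation jumps abruptly:
print(true_trilinear(0.2, 0.8, 0.5))   # 0.5 (smooth blend)
print(nearest_mip_approx(0.2, 0.8, 0.49), nearest_mip_approx(0.2, 0.8, 0.51))
```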
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface rather than the 128-bit interface of the regular variant, effectively halving its memory bandwidth. The M64 is nevertheless faster than the original TNT in some situations. Like the original TNT, the RIVA TNT2 only supports a trilinear approximation rather than true trilinear filtering.&lt;br /&gt;
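The impact of the M64's narrower bus is easy to see with a back-of-the-envelope calculation. The memory clock below is illustrative only, since TNT2 boards shipped at a range of speeds.&lt;br /&gt;

```python
# Theoretical peak memory bandwidth: bus width (bytes) x memory clock.
# SDR memory transfers once per clock, so MB/s falls out directly.

def peak_bandwidth_mb_s(bus_width_bits: int, mem_clock_mhz: float) -> float:
    return (bus_width_bits / 8) * mem_clock_mhz

tnt2 = peak_bandwidth_mb_s(128, 150)  # regular TNT2, 128-bit bus
m64 = peak_bandwidth_mb_s(64, 150)    # TNT2 M64, 64-bit bus

print(f"TNT2: {tnt2:.0f} MB/s, M64: {m64:.0f} MB/s")  # M64 is exactly half
```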
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
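What the missing feature amounts to: an 8-bit palettized texture stores one byte per texel as an index into a 256-entry color table. Without hardware palette support, the driver or game must expand such textures to full 32-bit RGBA up front, quadrupling their memory footprint. A minimal sketch with a made-up grayscale palette:&lt;br /&gt;

```python
# Palettized texturing in miniature: texels are indices, the palette
# holds the actual RGBA colors. Hardware palette support performs this
# lookup per texel at sampling time; without it, the expansion below
# has to happen in software before upload.

palette = [(i, i, i, 255) for i in range(256)]    # toy 256-entry grayscale palette
indexed_texture = [0, 128, 255, 64]               # 1 byte per texel

expanded = [palette[i] for i in indexed_texture]  # 4 bytes per texel
print(expanded[1])  # (128, 128, 128, 255)
```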
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. In 3D performance at 16-bit color depth, the GeForce 2 MX is slightly faster than a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest of the three. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
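The fillrate advantage of MSAA over SSAA can be put in rough numbers. This is an illustrative cost model, not measured hardware behavior: SSAA runs the full texture and shading work for every sub-sample, while MSAA shades each pixel once and only replicates coverage/depth samples.&lt;br /&gt;

```python
# Rough per-frame shading cost in arbitrary "per-pixel work" units.

def ssaa_shading_cost(pixels: int, samples: int) -> int:
    return pixels * samples  # every sub-sample is fully shaded

def msaa_shading_cost(pixels: int, samples: int) -> int:
    return pixels * 1        # shaded once; only coverage stored per sample

res = 1024 * 768
print(ssaa_shading_cost(res, 4) // msaa_shading_cost(res, 4))  # 4x the work
```

Memory bandwidth for the extra samples still has to be paid in both schemes, which is why MSAA is cheaper but not free.&lt;br /&gt;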
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is generally quite similar to the GeForce 3. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were rebranded as the Ti 4800 SE and Ti 4800 respectively. The AGP 8x variant of the Ti 4200 was known simply as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce 4 Ti line, the Ti 4200 is the slowest and the Ti 4800 is the fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs.  They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9.  They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of the NV2x series, but their performance is improved compared to older models. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit memory bus and naming suffixes like LE, XT or VE, because they have been crippled in some way.  There were also some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x dramatically improved performance all-around but dropped palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|QpGVOAuDfNc}}&lt;br /&gt;
{{#ev:youtube|h2KGiIen4n4}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1362</id>
		<title>Intel/Real3D</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1362"/>
				<updated>2013-05-11T09:31:32Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* i740 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
&lt;br /&gt;
===i740 ===&lt;br /&gt;
Also known as the Intel740 and codenamed &amp;quot;Auburn&amp;quot;, this was Intel's first foray into the discrete graphics market. It was released in January 1998 and was the result of Intel's collaboration with Lockheed Martin's Real3D graphics division. It supports Direct3D and OpenGL ICD. It was designed to take full advantage of the AGP bus, and as such, it can only texture from system memory via the AGP bus. All of its local onboard graphics memory was used purely as a display frame buffer. Therefore, manufacturers were able to produce cards with as little as 2 MB of onboard memory (one such example being Intel's own Express 3D graphics card). The i740 also supports texture map resolutions up to 1024x1024. 3D rendering color depth is limited to 16-bit, but the chip has good dithering quality. Its performance allows it to compete well with NVIDIA's RIVA 128, ATI's Rage Pro and the original 3Dfx Voodoo Graphics chipset, but it falls woefully short of higher-end graphics solutions released around the same timeframe, such as 3Dfx's Voodoo2. Much of the bottleneck results from its sole use of AGP texturing - the i740 has to access textures through a channel that is often much slower than onboard graphics memory. It was also difficult to port i740 cards to the PCI bus - it often necessitated the use of an AGP-to-PCI bridge chip, resulting in increased card cost. The PCI variants could texture from onboard graphics memory and are faster than their AGP counterparts in some tests. The PCI variants also came with larger amounts of onboard memory - one such card was a PCI version of Real3D's Starfighter, which was manufactured with as much as 24 MB of onboard memory.&lt;br /&gt;
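The scale of the AGP texturing bottleneck can be sketched with rough peak-bandwidth figures. The numbers are illustrative theoretical peaks; real-world AGP throughput was lower still because the bus is shared with the CPU and subject to latency.&lt;br /&gt;

```python
# AGP is a 32-bit (4-byte) bus at ~66.66 MHz; "2x" transfers twice per
# clock. Compare against a modest 64-bit 100 MHz local SDRAM interface.

def agp_peak_mb_s(multiplier: int) -> float:
    return 4 * 66.66 * multiplier          # 4-byte bus x 66.66 MHz x rate

def local_sdram_mb_s(bus_bits: int, clock_mhz: float) -> float:
    return (bus_bits / 8) * clock_mhz

print(f"AGP 2x:       {agp_peak_mb_s(2):.0f} MB/s")            # ~533 MB/s
print(f"64-bit SDRAM: {local_sdram_mb_s(64, 100):.0f} MB/s")   # 800 MB/s
```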
&lt;br /&gt;
The i740 supports Windows 9x and NT 4.0. Drivers for Windows 3.1x were also released, and the Windows 3.1x drivers are heavily based on Chips and Technologies' GUI accelerator drivers (since Intel had acquired Chips and Technologies in July 1997).&lt;br /&gt;
&lt;br /&gt;
The i740 has good overall DOS VGA compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
===i752 and i754 ===&lt;br /&gt;
The i752 and i754 are slight evolutions of the i740 architecture, announced in April 1999. As of 2013, they are also Intel's last graphics chips used on discrete cards. The i752 supports AGP 2x while the i754 supports AGP 4x. New features include support for multitexturing, anisotropic filtering, MPEG-2 motion compensation and DVI displays. Both were cancelled around their release, and very few boards made it into the wild.&lt;br /&gt;
The i752 and i754 technology respectively lived on as the IGP in Intel's i810 and i815 motherboard chipsets. Some of this technology further found its way into Intel's later IGPs, notably their Extreme Graphics and Graphics Media Accelerator (GMA) lines.&lt;br /&gt;
&lt;br /&gt;
==Video captures ==&lt;br /&gt;
===i740 ===&lt;br /&gt;
{{#ev:youtube|XDxjuXikgs8}}&lt;br /&gt;
{{#ev:youtube|7d2pNPGm8Nc}}&lt;br /&gt;
{{#ev:youtube|MGbXh0n9I_U}}&lt;br /&gt;
{{#ev:youtube|__B1DzxDjYw}}&lt;br /&gt;
{{#ev:youtube|x8sZ3kazUaU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/getfile.php?fileid=330  VOGONS Drivers Intel i740 Driver and BIOS compilation] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1361</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1361"/>
				<updated>2013-05-11T09:27:37Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is unique in that it is the only consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory (2 MB in total). The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of it usually run at higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers knocked the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
Nowadays the 3D Blaster VLB is a very rare card, and when it shows up on auction websites the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is even more difficult to find than the 3D Blaster VLB itself, and would also be very steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips designed for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing geometry setup calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. The main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting; Quake 2 and Quake III: Arena are notable examples. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, although the Permedia 3 placed more emphasis on gaming. Both are also Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after their release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, extending what the Delta co-processor performed in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support; if purchased in a retail box, there will usually be an &amp;quot;AGP 4x&amp;quot; sticker on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and includes VESA VBE support out of the box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
{{#ev:youtube|m9PE2jLxVTY}}&lt;br /&gt;
{{#ev:youtube|7CS7cJFFV7I}}&lt;br /&gt;
{{#ev:youtube|A94iqwj4mPw}}&lt;br /&gt;
{{#ev:youtube|HOQrR0Bv-Mc}}&lt;br /&gt;
{{#ev:youtube|5EA9xtM22tA}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.3dlabs.com/content/Legacy/drivers/driverSelect.asp 3DLabs legacy driver archive]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1353</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1353"/>
				<updated>2013-04-29T02:57:58Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia 3 and GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. The card is unique in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers panned the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
Nowadays the 3D Blaster VLB is a very rare card, and when one does show up on auction websites the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips designed for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing geometry setup calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. The main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting; Quake 2 and Quake III: Arena are notable examples. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, although the Permedia 3 placed more emphasis on gaming. Both are also Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after their release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, extending what the Delta co-processor performed in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support; if purchased in a retail box, there will usually be an &amp;quot;AGP 4x&amp;quot; sticker on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and includes VESA VBE support out of the box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.3dlabs.com/content/Legacy/drivers/driverSelect.asp 3DLabs legacy driver archive]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1352</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1352"/>
				<updated>2013-04-29T02:56:37Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia and Permedia NT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. The card is unique in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers panned the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
Nowadays the 3D Blaster VLB is a very rare card, and when one does show up on auction websites the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips designed for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing geometry setup calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. The main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting; Quake 2 and Quake III: Arena are notable examples. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, although the Permedia 3 placed more emphasis on gaming. Both are also Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after their release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support; if purchased in a retail box, there will usually be an &amp;quot;AGP 4x&amp;quot; sticker on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and includes VESA VBE support out of the box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.3dlabs.com/content/Legacy/drivers/driverSelect.asp 3DLabs legacy driver archive]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1351</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1351"/>
				<updated>2013-04-28T22:37:28Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Related links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. The card is unique in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers panned the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
Nowadays the 3D Blaster VLB is a very rare card, and when one does show up on auction websites the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips designed for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. The main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting; Quake 2 and Quake III: Arena are notable examples. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, although the Permedia 3 placed more emphasis on gaming. Both are also Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after their release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support; if purchased in a retail box, there will usually be an &amp;quot;AGP 4x&amp;quot; sticker on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and includes VESA VBE support out of the box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.3dlabs.com/content/Legacy/drivers/driverSelect.asp 3DLabs legacy driver archive]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1350</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1350"/>
				<updated>2013-04-28T22:35:23Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia 3 and GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. The card is unique in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers panned the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card; when one does show up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed; however, the memory upgrade module is even harder to find than the 3D Blaster VLB itself and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, though the Permedia 3 placed more emphasis on gaming. Both are Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be used extensively in games until a year or so after their release; other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much as the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in the latter.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - retail boxes usually carry an &amp;quot;AGP 4x&amp;quot; sticker to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1349</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1349"/>
				<updated>2013-04-28T22:34:25Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia 3 and GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card, or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner as cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets). The 3D Blaster VLB supported Creative Labs' own proprietary API, the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at higher resolutions (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers criticized the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and offered better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card; when one does show up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed; however, the memory upgrade module is even harder to find than the 3D Blaster VLB itself and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, though the Permedia 3 placed more emphasis on gaming. Both are Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be used extensively in games until a year or so after their release; other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much as the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT and ATI's Rage 128 GL (both released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in the latter.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - retail boxes usually carry an &amp;quot;AGP 4x&amp;quot; sticker to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1348</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1348"/>
				<updated>2013-04-28T22:33:16Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia 3 and GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card, or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner as cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets). The 3D Blaster VLB supported Creative Labs' own proprietary API, the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at higher resolutions (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers criticized the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and offered better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card; when one does show up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed; however, the memory upgrade module is even harder to find than the 3D Blaster VLB itself and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, though the Permedia 3 placed more emphasis on gaming. Both are Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The Permedia 3 and GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be used extensively in games until a year or so after their release; other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much as the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in the latter.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - retail boxes usually carry an &amp;quot;AGP 4x&amp;quot; sticker to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1347</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1347"/>
				<updated>2013-04-28T22:32:19Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market came in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card, or as a 3D-only accelerator paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner as cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets). The 3D Blaster VLB supported Creative Labs' own proprietary API, the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at higher resolutions (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers criticized the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and offered better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card; when one does show up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed; however, the memory upgrade module is even harder to find than the 3D Blaster VLB itself and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
[[File:Permedia_PCI.jpg|200px|thumb|right|Permedia PCI]]&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2.&lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===Permedia 3 and GLINT R3 ===&lt;br /&gt;
The Permedia 3 and GLINT R3 are the third iteration of 3DLabs' Permedia architecture. Both were released in mid-1999 and were again directed at the professional 3D and CAD application market, though the Permedia 3 placed more emphasis on gaming. Both are Direct3D 6-compliant. The Permedia 3 and GLINT R3 are nearly identical to each other, but the GLINT R3 is more optimized for professional 3D and CAD applications as well as multiprocessor systems. At this time 3DLabs began producing their own graphics cards rather than licensing their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be used extensively in games until a year or so after its release; other consumer-level graphics cards at the time only supported emboss bump mapping. Three retail cards based on these chips were released: 3DLabs' Permedia 3 Create! (using the Permedia 3 chip), and the Oxygen VX1 and Oxygen GVX1 (both using the GLINT R3 chip). The Oxygen VX1 was released in both 32 MB and 16 MB configurations (the latter designated the Oxygen VX1-16). The Oxygen GVX1 featured a separate geometry co-processor chip (known as Gamma G1) for performing transformation, clipping and lighting calculations, much as the Delta co-processor functioned in the Permedia and Permedia 2 series. As with Delta, the Gamma G1 geometry co-processor is not optimized for games. Gaming performance-wise, the Permedia 3 and GLINT R3 are both slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and they are definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Their performance in professional 3D and CAD applications, however, is much better. 
The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in the latter.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - retail boxes usually carry an &amp;quot;AGP 4x&amp;quot; sticker to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The Permedia 3 and GLINT R3 require Windows 98 or Windows NT 4.0 (or newer). Overall, these chips are definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1346</id>
		<title>Intel/Real3D</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1346"/>
				<updated>2013-04-28T21:28:08Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* i740 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
&lt;br /&gt;
===i740 ===&lt;br /&gt;
Also known as the Intel740 and codenamed &amp;quot;Auburn&amp;quot;, this was Intel's first foray into the discrete graphics market. It was released in January 1998 and was the result of Intel's collaboration with Lockheed Martin's Real3D graphics division. It supports Direct3D and OpenGL ICD. It was designed to take full advantage of the AGP bus, and as such it can only texture from system memory via the AGP bus; all of its local onboard graphics memory is used purely as a display frame buffer. Manufacturers were therefore able to produce cards with as little as 2 MB of onboard memory (one such example being Intel's own Express 3D graphics card). The i740 supports texture map resolutions up to 1024x1024. 3D rendering color depth is limited to 16-bit, but the chip has good dithering quality. Its performance allows it to compete well with NVIDIA's RIVA 128, ATI's Rage Pro and the original 3Dfx Voodoo Graphics chipset, but it falls woefully short of higher-end graphics solutions released around the same timeframe, such as 3Dfx's Voodoo2. Much of the bottleneck results from its sole reliance on AGP texturing - the i740 has to access textures through a channel that is often much slower than onboard graphics memory. It was also difficult to port i740 cards to the PCI bus - doing so often necessitated an AGP-to-PCI bridge chip, increasing card cost. The PCI variants could texture from onboard graphics memory and are faster than their AGP counterparts in some tests. The PCI variants also came with larger amounts of onboard memory - one such card was a PCI version of Real3D's Starfighter, which was manufactured with as much as 24 MB of onboard memory.&lt;br /&gt;
&lt;br /&gt;
The i740 supports Windows 9x and NT 4.0. Drivers for Windows 3.1x were also released; they are heavily based on Chips and Technologies' GUI accelerator drivers, since Intel had acquired Chips and Technologies in July 1997.&lt;br /&gt;
&lt;br /&gt;
The i740 has good overall DOS VGA compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
===i752 and i754 ===&lt;br /&gt;
The i752 and i754 are slight evolutions of the i740 architecture, announced in April 1999. As of 2013, they are also Intel's last graphics chips used on discrete cards. The i752 supports AGP 2x while the i754 supports AGP 4x. New features include support for multitexturing, anisotropic filtering, MPEG-2 motion compensation and DVI displays. Both were cancelled around their release, and very few boards made it out into the wild.&lt;br /&gt;
The i752 and i754 technology respectively lived on as the IGP in Intel's i810 and i815 motherboard chipsets. Some of this technology further found its way into Intel's later IGPs, notably their Extreme Graphics and Graphics Media Accelerator (GMA) lines.&lt;br /&gt;
&lt;br /&gt;
==Video captures ==&lt;br /&gt;
===i740 ===&lt;br /&gt;
{{#ev:youtube|XDxjuXikgs8}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/getfile.php?fileid=330  VOGONS Drivers Intel i740 Driver and BIOS compilation] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Rendition&amp;diff=1341</id>
		<title>Rendition</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Rendition&amp;diff=1341"/>
				<updated>2013-04-28T09:56:55Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Related links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Rendition was a graphics chip manufacturer that produced cards for PCs from 1996 through 1999. &lt;br /&gt;
&lt;br /&gt;
==Vérité==&lt;br /&gt;
===V1000===&lt;br /&gt;
[[File:Veritev1000.jpg|thumb|V1000E]]&lt;br /&gt;
Released in October 1996, V1000 was one of the earliest chips with 2D, 3D and video functions all integrated into one ASIC while also providing impressive performance all around.  It still used an external RAMDAC, which was common at the time. The GUI acceleration is adequate but not exceptional. DirectDraw and VESA VBE 2.0 functions are fast. All V1000 cards are equipped with 4MB of EDO DRAM running synchronously with the graphics chip. &lt;br /&gt;
&lt;br /&gt;
The chip is based upon MIPS-like RISC CPU technology with microcode programmability and a fixed-function pixel engine. The CPU-like design provided flexibility, allowing the chip to be tweaked for various use cases. Its 3D capabilities were second only to Voodoo Graphics in 1996; 3D performance is perhaps around 50% of Voodoo's. The V1000 is best utilized with Rendition's own APIs, Speedy3D (DOS) and RRedline (Win9x). The first 3D-accelerated form of Quake was [[VQuake]], using Speedy3D. Direct3D support is less optimal, though Direct3D 5 games like Jedi Knight are quite playable. The chip is not really capable of adequate OpenGL performance, although an ICD is available.&lt;br /&gt;
&lt;br /&gt;
Legacy VGA modes are very slow. For example, Doom, which uses VGA Mode X, will run at around 10 fps even on a Pentium III. There is a DOS utility program (renutil) that remaps some VGA modes to VESA VBE modes, but Mode X cannot be improved in this manner.&lt;br /&gt;
&lt;br /&gt;
There are two V1000 chips, the V1000E and V1000L. The V1000L operates on 3.3 V instead of 5 V, so it uses less power and may be clocked slightly higher. V1000E boards should be used with the available BIOS update TSR for improved performance.&lt;br /&gt;
&lt;br /&gt;
===V2x00===&lt;br /&gt;
[[File:Rendition_2100.JPG|thumb||V2100]]&lt;br /&gt;
The second-generation Vérité chip followed in 1997. It is similar in design to the V1000 but drastically enhanced, with a focus on single-cycle operation. The V2200 gets much more work done per clock than the V1000, which improves both 2D and 3D performance. The chip was offered in a budget form as the V2100 and a high-end form as the V2200, although the chips are identical and clock speed is the only differentiation. The V2100 operates at 40-45 MHz while the V2200 runs at 55-60 MHz. Memory is 4-8 MB of SGRAM operating asynchronously, by default usually clocked at twice the chip clock. The V2x00 is AGP-capable but operates as a 66 MHz PCI device, without AGP's special features.&lt;br /&gt;
&lt;br /&gt;
The V2x00 is also capable of 3D rendering in 32-bit color depth.&lt;br /&gt;
&lt;br /&gt;
Unfortunately V2x00 is not able to perform per-pixel mip-mapping (only per-polygon), something even Voodoo 1 could do. Legacy VGA modes are also still very slow. Drivers also tend to be very fickle, with each release offering varying levels of stability and game compatibility.&lt;br /&gt;
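The difference between per-polygon and per-pixel mip-mapping can be sketched in a few lines. The values below are hypothetical, not Rendition's actual selection logic: per-pixel selection derives a mip level from each pixel's texture-space footprint, while per-polygon selection freezes one level for the whole triangle, causing aliasing or blur where the footprint varies across it.

```python
import math

def mip_level(texels_per_pixel):
    """Mip LOD from the texel-to-pixel footprint (clamped at level 0)."""
    return max(0, round(math.log2(max(texels_per_pixel, 1e-6))))

# Texture-space footprint of each pixel along one triangle edge: the
# polygon stretches into the distance, so the footprint grows.
footprints = [0.9, 1.3, 2.1, 3.8, 7.5]

per_pixel   = [mip_level(f) for f in footprints]            # varies across the polygon
per_polygon = [mip_level(footprints[0])] * len(footprints)  # one level for all

print("per-pixel:  ", per_pixel)
print("per-polygon:", per_polygon)
```

On the V2x00, the distant end of such a polygon is sampled from the same (too detailed) mip level as the near end, which is what makes the limitation visible in games.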
&lt;br /&gt;
===V3300===&lt;br /&gt;
The V3300 was to be Rendition's third generation 3D graphics chipset and was initially scheduled for release in 1999. It was never released, likely because it was not going to be adequately competitive.&lt;br /&gt;
&lt;br /&gt;
*Dual Pixel Engine&lt;br /&gt;
**dual-texturing for bilinear and trilinear filtering&lt;br /&gt;
**specular highlighting (per vertex), Anti-aliasing&lt;br /&gt;
**3 million triangles/second triangle setup engine, 200 million pixels/s trilinear fillrate&lt;br /&gt;
*Dual independent 250 MHz RAMDAC CRT controllers&lt;br /&gt;
*iDCT transformations &amp;amp; motion compensation support (DVD playback acceleration)&lt;br /&gt;
*Compatible with 166 MHz SDRAM/SGRAM&lt;br /&gt;
*128-bit bus architecture&lt;br /&gt;
*AGP 2X execute mode support&lt;br /&gt;
*0.35 μm process &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===V4400===&lt;br /&gt;
After canceling V3300, Rendition accelerated development of V4400.  This chip was to utilize Micron's EDRAM technology and have 4MB of integrated memory. The project was eventually canceled.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
{{#ev:youtube|NADd6w0wFk8}}&lt;br /&gt;
{{#ev:youtube|7HQ8J3Vthz8}}&lt;br /&gt;
{{#ev:youtube|hIDYwsVS85E}}&lt;br /&gt;
{{#ev:youtube|pYXF_VhAhlI}}&lt;br /&gt;
{{#ev:youtube|ZC6PYQpWoPQ}}&lt;br /&gt;
{{#ev:youtube|mfxOP4g3-Zk}}&lt;br /&gt;
{{#ev:youtube|cV3DRebD38g}}&lt;br /&gt;
{{#ev:youtube|1K_Ox3Shixk}}&lt;br /&gt;
{{#ev:youtube|wdhPe5max4E}}&lt;br /&gt;
{{#ev:youtube|pHgGRCj4yhE}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://vogons.zetafleet.com/viewtopic.php?t=23019 Rendition thread on VOGONS]&lt;br /&gt;
*[http://gona.mactar.hu/v1000/ Gona's V1000-E vs. V1000L-P benchmarks]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
*[http://vintage3d.org/index.php Vintage3D] - Rendition sections with benchmarks and screenshots&lt;br /&gt;
*[https://sites.google.com/site/martinsguistuff/home/hardware/rendition-verite-game-and-application-resources Rendition V2x00 Game and Application Compatibility List]&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=18 VOGONS Drivers Rendition section]&lt;br /&gt;
*[http://web.archive.org/web/200101240126/http://renaddiction.com/index.htm Renaddiction fan site] (Internet Archive)&lt;br /&gt;
*[[VQuake]] - details about how VQuake works.&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Rendition&amp;diff=1340</id>
		<title>Rendition</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Rendition&amp;diff=1340"/>
				<updated>2013-04-28T09:35:15Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Rendition was a graphics chip manufacturer that produced cards for PCs from 1996 through 1999. &lt;br /&gt;
&lt;br /&gt;
==Vérité==&lt;br /&gt;
===V1000===&lt;br /&gt;
[[File:Veritev1000.jpg|thumb|V1000E]]&lt;br /&gt;
Released in October 1996, V1000 was one of the earliest chips with 2D, 3D and video functions all integrated into one ASIC while also providing impressive performance all around.  It still used an external RAMDAC, which was common at the time. The GUI acceleration is adequate but not exceptional. DirectDraw and VESA VBE 2.0 functions are fast. All V1000 cards are equipped with 4MB of EDO DRAM running synchronously with the graphics chip. &lt;br /&gt;
&lt;br /&gt;
The chip is based upon MIPS-like RISC CPU technology with microcode programmability and a fixed-function pixel engine. The CPU-like design provided flexibility, allowing the chip to be tweaked for various use cases. Its 3D capabilities were second only to Voodoo Graphics in 1996; 3D performance is perhaps around 50% of Voodoo's. The V1000 is best utilized with Rendition's own APIs, Speedy3D (DOS) and RRedline (Win9x). The first 3D-accelerated form of Quake was [[VQuake]], using Speedy3D. Direct3D support is less optimal, though Direct3D 5 games like Jedi Knight are quite playable. The chip is not really capable of adequate OpenGL performance, although an ICD is available.&lt;br /&gt;
&lt;br /&gt;
Legacy VGA modes are very slow. For example, Doom, which uses VGA Mode X, will run at around 10 fps even on a Pentium III. There is a DOS utility program (renutil) that remaps some VGA modes to VESA VBE modes, but Mode X cannot be improved in this manner.&lt;br /&gt;
&lt;br /&gt;
There are two V1000 chips, the V1000E and V1000L. The V1000L operates on 3.3 V instead of 5 V, so it uses less power and may be clocked slightly higher. V1000E boards should be used with the available BIOS update TSR for improved performance.&lt;br /&gt;
&lt;br /&gt;
===V2x00===&lt;br /&gt;
[[File:Rendition_2100.JPG|thumb||V2100]]&lt;br /&gt;
The second-generation Vérité chip followed in 1997. It is similar in design to the V1000 but drastically enhanced, with a focus on single-cycle operation. The V2200 gets much more work done per clock than the V1000, which improves both 2D and 3D performance. The chip was offered in a budget form as the V2100 and a high-end form as the V2200, although the chips are identical and clock speed is the only differentiation. The V2100 operates at 40-45 MHz while the V2200 runs at 55-60 MHz. Memory is 4-8 MB of SGRAM operating asynchronously, by default usually clocked at twice the chip clock. The V2x00 is AGP-capable but operates as a 66 MHz PCI device, without AGP's special features.&lt;br /&gt;
&lt;br /&gt;
The V2x00 is also capable of 3D rendering in 32-bit color depth.&lt;br /&gt;
&lt;br /&gt;
Unfortunately V2x00 is not able to perform per-pixel mip-mapping (only per-polygon), something even Voodoo 1 could do. Legacy VGA modes are also still very slow. Drivers also tend to be very fickle, with each release offering varying levels of stability and game compatibility.&lt;br /&gt;
&lt;br /&gt;
===V3300===&lt;br /&gt;
The V3300 was to be Rendition's third generation 3D graphics chipset and was initially scheduled for release in 1999. It was never released, likely because it was not going to be adequately competitive.&lt;br /&gt;
&lt;br /&gt;
*Dual Pixel Engine&lt;br /&gt;
**dual-texturing for bilinear and trilinear filtering&lt;br /&gt;
**specular highlighting (per vertex), Anti-aliasing&lt;br /&gt;
**3 million triangles/second triangle setup engine, 200 million pixels/s trilinear fillrate&lt;br /&gt;
*Dual independent 250 MHz RAMDAC CRT controllers&lt;br /&gt;
*iDCT transformations &amp;amp; motion compensation support (DVD playback acceleration)&lt;br /&gt;
*Compatible with 166 MHz SDRAM/SGRAM&lt;br /&gt;
*128-bit bus architecture&lt;br /&gt;
*AGP 2X execute mode support&lt;br /&gt;
*0.35 μm process &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===V4400===&lt;br /&gt;
After canceling V3300, Rendition accelerated development of V4400.  This chip was to utilize Micron's EDRAM technology and have 4MB of integrated memory. The project was eventually canceled.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
{{#ev:youtube|NADd6w0wFk8}}&lt;br /&gt;
{{#ev:youtube|7HQ8J3Vthz8}}&lt;br /&gt;
{{#ev:youtube|hIDYwsVS85E}}&lt;br /&gt;
{{#ev:youtube|pYXF_VhAhlI}}&lt;br /&gt;
{{#ev:youtube|ZC6PYQpWoPQ}}&lt;br /&gt;
{{#ev:youtube|mfxOP4g3-Zk}}&lt;br /&gt;
{{#ev:youtube|cV3DRebD38g}}&lt;br /&gt;
{{#ev:youtube|1K_Ox3Shixk}}&lt;br /&gt;
{{#ev:youtube|wdhPe5max4E}}&lt;br /&gt;
{{#ev:youtube|pHgGRCj4yhE}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://vogons.zetafleet.com/viewtopic.php?t=23019 Rendition thread on VOGONS]&lt;br /&gt;
*[http://gona.mactar.hu/v1000/ Gona's V1000-E vs. V1000L-P benchmarks]&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
*[http://vintage3d.org/index.php Vintage3D] - Rendition sections with benchmarks and screenshots&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=18 VOGONS Drivers Rendition section]&lt;br /&gt;
*[http://web.archive.org/web/200101240126/http://renaddiction.com/index.htm Renaddiction fan site] (Internet Archive)&lt;br /&gt;
*[[VQuake]] - details about how VQuake works.&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=PowerVR&amp;diff=1339</id>
		<title>PowerVR</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=PowerVR&amp;diff=1339"/>
				<updated>2013-04-28T09:30:22Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* PCX-2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Series 1 ==&lt;br /&gt;
&lt;br /&gt;
=== Midas ===&lt;br /&gt;
OEM&lt;br /&gt;
&lt;br /&gt;
=== PCX-1 ===&lt;br /&gt;
&lt;br /&gt;
=== PCX-2 ===&lt;br /&gt;
Released in 1997. It supports bilinear filtering, and MiniGL drivers for it are in the 1.0.x.x version range. It supports Direct3D 3 and its own API, SGL.&lt;br /&gt;
&lt;br /&gt;
A demo of a seal swimming quickly through polluted water in a sinusoidal pattern was used to showcase this card. Ultim@te Race was a typical bundled game.&lt;br /&gt;
&lt;br /&gt;
Something about infinite planes and dummied-out 24-bit rendering.&lt;br /&gt;
&lt;br /&gt;
Something about Crime Cities' MiniGL driver to get newer games &amp;quot;&amp;quot;working&amp;quot;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Series 2 ==&lt;br /&gt;
=== PowerVR SG ===&lt;br /&gt;
Announced in early 1998 and never released; however, some games (e.g. Half-Life) were tooled to support it, which suggests prototypes existed in 1998.  &lt;br /&gt;
&lt;br /&gt;
MiniGL drivers for this are in the 1.1.x.x version range.&lt;br /&gt;
&lt;br /&gt;
=== Neon250 ===&lt;br /&gt;
Released in late 1999; its chip is closely related to the PowerVR design used in the Sega Dreamcast. There is still no OpenGL support. It supports the SGL2 API, which almost no games used, and cards are rare.  &lt;br /&gt;
&lt;br /&gt;
MiniGL drivers for this are in the 1.2.x.x version range.&lt;br /&gt;
&lt;br /&gt;
== Series 3 ==&lt;br /&gt;
=== Kyro 1 ===&lt;br /&gt;
[[File:3D_Prophet_4000XT_PCI.jpg|200px|thumb||3D Prophet 4000XT]]&lt;br /&gt;
&lt;br /&gt;
The original Kyro was released in 2000 and dropped support for the PowerVR SGL and SGL2 APIs. Basically, it is a 'false DirectX 7' card that supports just DirectX 6 features. Its low core and memory clock frequencies and lack of hardware T&amp;amp;L support crippled its performance against its main competitors (namely the GeForce2 MX), but thanks to its tile-based deferred rendering architecture it performed decently in games that had a lot of overdraw. Although it lacked hardware T&amp;amp;L, it was still fairly feature-rich compared to other DirectX 6 graphics chips, with texture compression (S3TC), environment-mapped bump mapping (EMBM), ordered-grid super-sampling anti-aliasing and anisotropic filtering (up to level 2x) support. While its anisotropic filtering quality was on par with that of the GeForce 256 / GeForce 2 series, it incurred a very large performance decrease when enabled.&lt;br /&gt;
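A toy sketch of why tile-based deferred rendering helps with overdraw (illustrative only; the real Kyro pipeline is far more involved): an immediate-mode renderer textures every fragment that passes the depth test as triangles arrive, while a deferred renderer first resolves visibility per pixel inside a tile and textures only the one surviving fragment.

```python
# Toy model: several opaque triangles cover the same pixel at various depths.
# Convention for this sketch: a smaller depth value is nearer the camera.

def immediate_mode(fragments):
    """Texture every fragment that passes the depth test, in submission order."""
    textured = 0
    nearest = float("inf")
    for depth in fragments:
        if depth == min(depth, nearest):  # i.e. this fragment is not farther
            textured += 1                 # texturing work, possibly overwritten later
            nearest = depth
    return textured

def tile_deferred(fragments):
    """Resolve visibility first, then texture only the visible fragment."""
    min(fragments)   # hidden-surface removal inside the tile
    return 1         # a single texturing pass per covered pixel

overdraw = [5.0, 3.0, 4.0, 1.0, 2.0]   # five layers submitted in arbitrary order
print("immediate-mode texture fetches:", immediate_mode(overdraw))
print("deferred texture fetches:      ", tile_deferred(overdraw))
```

With unsorted submission the immediate-mode renderer textures several fragments per pixel, while the deferred renderer always textures exactly one, which is how the Kyro stretched its modest clock speeds in high-overdraw games.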
&lt;br /&gt;
Tiled rendering was also a drawback for supporting more modern graphics features that rely on render-to-texture and framebuffer effects.&lt;br /&gt;
&lt;br /&gt;
Unsurprisingly, given its KYRO name (a play on Cairo, Egypt), many tech demos showcasing this card take place in ancient Egypt.&lt;br /&gt;
&lt;br /&gt;
=== Kyro 2 ===&lt;br /&gt;
[[File:3D_Prophet_4500_AGP.jpg|200px|thumb||3D Prophet 4500]]&lt;br /&gt;
Released in 2001. Essentially, the Kyro 2 is just a Kyro with higher core and memory clock frequencies. Most of the marketing was about features the older series already had, including tiled rendering, full sorting, internal true-color rendering, etc. The emphasis on hidden surface removal was supposed to make up for the lack of HW T&amp;amp;L. It also does not support cubic environment mapping. In games that don't make heavy use of hardware T&amp;amp;L, the Kyro 2 can perform well against the GeForce 2 GTS. But, like most products not from NVIDIA or ATI, it faded quickly due to its lack of hardware T&amp;amp;L and its lateness to the market.&lt;br /&gt;
&lt;br /&gt;
Oddly enough, the Kyro 2 does not support AGP 4x out of the box, but that can be remedied through a hardware mod [http://www.paraknowya.com/articles/agpmod/index.shtml (link)].&lt;br /&gt;
&lt;br /&gt;
Like the original Kyro, it has no support for the SGL or SGL2 APIs.&lt;br /&gt;
&lt;br /&gt;
=== Kyro 2 SE===&lt;br /&gt;
Announced in 2002 but cancelled at the last minute. Only one known card was ever produced - Hercules' 3D Prophet 4800. Supposedly it introduced a hardware T&amp;amp;L implementation (advertised as EnT&amp;amp;L or &amp;quot;Enhanced T&amp;amp;L&amp;quot;), but it ended up just being a software hack, much in the way 3dfx's &amp;quot;Geometry Assist&amp;quot; feature functioned in their Voodoo3/4/5 drivers. Regardless, it would have easily been swallowed up by the competition from NVIDIA and ATI at the time.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===PCX-2 ===&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|u2LVfam6w1o}}&lt;br /&gt;
{{#ev:youtube|aBuGwrWP-s4}}&lt;br /&gt;
{{#ev:youtube|gOB8hItI04Y}}&lt;br /&gt;
{{#ev:youtube|gYiiPHsMzBI}}&lt;br /&gt;
{{#ev:youtube|GT0OILijFfo}}&lt;br /&gt;
&lt;br /&gt;
===Kyro ===&lt;br /&gt;
{{#ev:youtube|yUxU93Geet4}}&lt;br /&gt;
{{#ev:youtube|5RlFs5dEm28}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=PowerVR&amp;diff=1338</id>
		<title>PowerVR</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=PowerVR&amp;diff=1338"/>
				<updated>2013-04-28T09:27:48Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Series 1 ==&lt;br /&gt;
&lt;br /&gt;
=== Midas ===&lt;br /&gt;
OEM&lt;br /&gt;
&lt;br /&gt;
=== PCX-1 ===&lt;br /&gt;
&lt;br /&gt;
=== PCX-2 ===&lt;br /&gt;
Released in 1997. It supports bilinear filtering, and MiniGL drivers for it are in the 1.0.x.x version range. It supports Direct3D 3 and its own API, SGL.&lt;br /&gt;
&lt;br /&gt;
A demo of a seal swimming quickly through polluted water in a sinusoidal pattern was used to showcase this card. Ultim@te Race was a typical bundled game.&lt;br /&gt;
&lt;br /&gt;
Something about infinite planes and dummied-out 24-bit rendering.&lt;br /&gt;
&lt;br /&gt;
Something about Crime Cities' MiniGL driver to get newer games &amp;quot;&amp;quot;working&amp;quot;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Series 2 ==&lt;br /&gt;
=== PowerVR SG ===&lt;br /&gt;
Announced in early 1998 and never released; however, some games (e.g. Half-Life) were tooled to support it, which suggests prototypes existed in 1998.  &lt;br /&gt;
&lt;br /&gt;
MiniGL drivers for this are in the 1.1.x.x version range.&lt;br /&gt;
&lt;br /&gt;
=== Neon250 ===&lt;br /&gt;
Released in late 1999; its chip is closely related to the PowerVR design used in the Sega Dreamcast. There is still no OpenGL support. It supports the SGL2 API, which almost no games used, and cards are rare.  &lt;br /&gt;
&lt;br /&gt;
MiniGL drivers for this are in the 1.2.x.x version range.&lt;br /&gt;
&lt;br /&gt;
== Series 3 ==&lt;br /&gt;
=== Kyro 1 ===&lt;br /&gt;
[[File:3D_Prophet_4000XT_PCI.jpg|200px|thumb||3D Prophet 4000XT]]&lt;br /&gt;
&lt;br /&gt;
The original Kyro was released in 2000 and dropped support for the PowerVR SGL and SGL2 APIs. Basically, it is a 'false DirectX 7' card that supports just DirectX 6 features. Its low core and memory clock frequencies and lack of hardware T&amp;amp;L support crippled its performance against its main competitors (namely the GeForce2 MX), but thanks to its tile-based deferred rendering architecture it performed decently in games that had a lot of overdraw. Although it lacked hardware T&amp;amp;L, it was still fairly feature-rich compared to other DirectX 6 graphics chips, with texture compression (S3TC), environment-mapped bump mapping (EMBM), ordered-grid super-sampling anti-aliasing and anisotropic filtering (up to level 2x) support. While its anisotropic filtering quality was on par with that of the GeForce 256 / GeForce 2 series, it incurred a very large performance decrease when enabled.&lt;br /&gt;
&lt;br /&gt;
Tiled rendering was also a drawback for supporting more modern graphics features that rely on render-to-texture and framebuffer effects.&lt;br /&gt;
&lt;br /&gt;
Unsurprisingly, given its KYRO name (a play on Cairo, Egypt), many tech demos showcasing this card take place in ancient Egypt.&lt;br /&gt;
&lt;br /&gt;
=== Kyro 2 ===&lt;br /&gt;
[[File:3D_Prophet_4500_AGP.jpg|200px|thumb||3D Prophet 4500]]&lt;br /&gt;
Released in 2001. Essentially, the Kyro 2 is just a Kyro with higher core and memory clock frequencies. Most of the marketing was about features the older series already had, including tiled rendering, full sorting, internal true-color rendering, etc. The emphasis on hidden surface removal was supposed to make up for the lack of HW T&amp;amp;L. It also does not support cubic environment mapping. In games that don't make heavy use of hardware T&amp;amp;L, the Kyro 2 can perform well against the GeForce 2 GTS. But, like most products not from NVIDIA or ATI, it faded quickly due to its lack of hardware T&amp;amp;L and its lateness to the market.&lt;br /&gt;
&lt;br /&gt;
Oddly enough, the Kyro 2 does not support AGP 4x out of the box, but that can be remedied through a hardware mod [http://www.paraknowya.com/articles/agpmod/index.shtml (link)].&lt;br /&gt;
&lt;br /&gt;
Like the original Kyro, it has no support for the SGL or SGL2 APIs.&lt;br /&gt;
&lt;br /&gt;
=== Kyro 2 SE===&lt;br /&gt;
Announced in 2002 but cancelled at the last minute. Only one known card was ever produced - Hercules' 3D Prophet 4800. Supposedly it introduced a hardware T&amp;amp;L implementation (advertised as EnT&amp;amp;L or &amp;quot;Enhanced T&amp;amp;L&amp;quot;), but it ended up just being a software hack, much in the way 3dfx's &amp;quot;Geometry Assist&amp;quot; feature functioned in their Voodoo3/4/5 drivers. Regardless, it would have easily been swallowed up by the competition from NVIDIA and ATI at the time.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===PCX-2 ===&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|u2LVfam6w1o}}&lt;br /&gt;
{{#ev:youtube|aBuGwrWP-s4}}&lt;br /&gt;
{{#ev:youtube|GT0OILijFfo}}&lt;br /&gt;
&lt;br /&gt;
===Kyro ===&lt;br /&gt;
{{#ev:youtube|yUxU93Geet4}}&lt;br /&gt;
{{#ev:youtube|5RlFs5dEm28}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Matrox&amp;diff=1337</id>
		<title>Matrox</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Matrox&amp;diff=1337"/>
				<updated>2013-04-28T09:17:51Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Millennium */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Chips==&lt;br /&gt;
===Impression===&lt;br /&gt;
&lt;br /&gt;
===Millennium===&lt;br /&gt;
[[File:Matrox Millennium 8mb rotate.jpg|thumb|Millennium]]&lt;br /&gt;
Successor to the Impression series, the Millennium is a capable VGA and GUI accelerator with good output quality. Its intended audience was CAD users and others who desired high-performance, high-resolution GUI acceleration, and it was priced accordingly.&lt;br /&gt;
&lt;br /&gt;
Like the Impression, it is a rudimentary 3D accelerator with support for Gouraud shading. It does not support hardware texture mapping and is not Direct3D compatible. A few games included with the card interfaced with it.&lt;br /&gt;
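Gouraud shading simply interpolates the vertex colors linearly across a primitive; a minimal sketch of the per-scanline color interpolation such hardware accelerates (illustrative of the technique, not Matrox's implementation):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at parameter t."""
    return a + (b - a) * t

def gouraud_span(color_left, color_right, width):
    """Interpolate an RGB color across one scanline span of `width` pixels."""
    span = []
    for x in range(width):
        t = x / (width - 1) if width > 1 else 0.0
        span.append(tuple(round(lerp(cl, cr, t))
                          for cl, cr in zip(color_left, color_right)))
    return span

# Red fading to blue across a 5-pixel span:
print(gouraud_span((255, 0, 0), (0, 0, 255), 5))
```

A full rasterizer also interpolates the edge colors vertically between vertices; the span loop above is the inner step repeated for every scanline.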
&lt;br /&gt;
===Mystique===&lt;br /&gt;
[[File:Mystique 2.jpg|thumb|Mystique]]&lt;br /&gt;
&lt;br /&gt;
''See main article: [[Matrox Mystique]]''&lt;br /&gt;
&lt;br /&gt;
[[Matrox Mystique]] is a 2D/3D/video accelerator for the PC. Matrox released the first Mystique on August 14, 1996. Newer versions, including the Mystique 220, kept appearing until summer 1997. Cards usually had 2-4 MB of SGRAM, expandable to 8 MB with a special memory add-on card. However, apart from enabling higher resolutions, upgrading the memory did not make much difference. &lt;br /&gt;
&lt;br /&gt;
The Mystique was aimed at the mid-range consumer and business markets, offering the excellent 2D performance traditional for Matrox. It also has basic 3D capabilities, delivered mainly through the Matrox Simple Interface API. On the 2D side, the card has no known flaws; the image is crisp with good color. The 3D part, however, lacks many functions, which were omitted to improve overall performance. Overall, the [[Matrox Mystique]] is a good choice for 2D graphics alongside a [[3dfx]] Voodoo.&lt;br /&gt;
&lt;br /&gt;
===Millennium II===&lt;br /&gt;
[[File:Matroxmillennium2.jpg|thumb|Millenium II 8MB PCI]]&lt;br /&gt;
This chip is mysteriously similar to the Mystique, but it uses WRAM instead of SGRAM, which gives it better high-resolution GUI performance. The 3D acceleration appears to be identical, with even the same bugs.&lt;br /&gt;
&lt;br /&gt;
It comes in both AGP and PCI versions.&lt;br /&gt;
&lt;br /&gt;
===G100===&lt;br /&gt;
This was primarily a budget VGA/GUI &amp;quot;productivity&amp;quot; accelerator card. It has somewhat improved 3D hardware compared to the Mystique and Millennium II, with bilinear filtering, but it still lacks critical features like full alpha blending.&lt;br /&gt;
&lt;br /&gt;
===G200===&lt;br /&gt;
[[File:MatroxG200.jpg|thumb|Millennium G200]]&lt;br /&gt;
The G200 is Matrox's first in-house 3D accelerator with full Direct3D 5 feature compliance. It typically comes with 8MB RAM and is capable of rendering at any resolution that can fit within that. It is capable of 32-bit rendering color depth although the performance hit is considerable. It is AGP 2x compliant and can use AGP texturing.&lt;br /&gt;
&lt;br /&gt;
Unreal and Unreal Tournament may display an incorrect, overly bright image. Disabling multi-texturing in Unreal.ini fixes this. Z-fighting may also be a problem, and enabling the 32-bit Z-buffer can help.&lt;br /&gt;
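These settings live in the [D3DDrv.D3DRenderDevice] section of Unreal.ini. The key names below are taken from later Unreal/UT Direct3D drivers and are an assumption here; they may differ between driver and game versions.

```ini
; Unreal.ini - Direct3D renderer settings (key names may vary by driver version)
[D3DDrv.D3DRenderDevice]
UseMultitexture=False   ; works around the over-bright image on the G200
Use32BitZBuffer=True    ; assumed key name; reduces Z-fighting
```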
&lt;br /&gt;
The G200's OpenGL support was very poor until years into its life. Initially a slow OpenGL-to-Direct3D wrapper was used to support a few games. Eventually a full OpenGL driver was released, at around the same time as the G400's. Note that the final drivers for the G200 include an OpenGL driver with a bug that breaks transparent water. This is remedied with a later G400 driver package that contains a G200 OpenGL ICD; overwrite the older ICD in the Windows directory.&lt;br /&gt;
&lt;br /&gt;
Retail cards were ''Millennium G200'', ''Mystique G200'', ''Marvel G200'' and ''G200 MMS''. Millennium uses SGRAM while Mystique has slightly slower SDRAM but also TV-output. Marvel features video in/out capabilities. There is also a G250 chip which was OEM-only. It is built on 250nm manufacturing instead of G200's 350nm and typically does not need a heatsink.&lt;br /&gt;
&lt;br /&gt;
===G400===&lt;br /&gt;
[[File:MatroxMillenniumG400Max.JPG|thumb|Millennium G400 Max]]&lt;br /&gt;
The G400 was essentially an improved and upgraded G200. Main improvements include two rendering pipelines, a 128-bit memory bus, dual VGA monitor output, Direct3D 6 compliance, and environment-mapped bump mapping (EMBM) support. It is over twice as fast as the Millennium G200. The G400 Max was similar in performance to the TNT2 Ultra and Voodoo3 3500.&lt;br /&gt;
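Environment-mapped bump mapping works by perturbing the environment-map lookup with a per-texel (du, dv) offset read from a bump texture. A simplified sketch of the idea follows (illustrative only, with made-up 3x3 maps; not Matrox's hardware path):

```python
def embm_sample(env_map, bump_map, u, v, scale=1.0):
    """Offset the environment-map coordinates by the bump texel's (du, dv)."""
    du, dv = bump_map[v][u]        # signed perturbation texel
    h = len(env_map)
    w = len(env_map[0])
    pu = min(w - 1, max(0, u + round(du * scale)))   # clamp to the map
    pv = min(h - 1, max(0, v + round(dv * scale)))
    return env_map[pv][pu]

# 3x3 environment map and a bump map that pushes the center lookup down-right:
env  = [[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]]
bump = [[(0, 0)] * 3, [(0, 0), (1, 1), (0, 0)], [(0, 0)] * 3]

print(embm_sample(env, bump, 1, 1))   # perturbed: samples env[2][2]
```

Because the perturbation comes from a texture, the effect animates cheaply (e.g. rippling water) without any extra geometry, which is why it was a headline feature of the era.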
&lt;br /&gt;
It lacks most DVD acceleration features but has an interesting DVD Max mode for output onto the second display.&lt;br /&gt;
&lt;br /&gt;
Initially the card did not have an OpenGL driver. Matrox compensated for over a year with a miniGL driver called TurboGL, which supported mainly Quake 1/2/3-based games. In early 2000 the final OpenGL driver was ready.&lt;br /&gt;
&lt;br /&gt;
Variants include the ''Millennium G400'', ''Millennium G400 Max'', ''Marvel G400'', and ''Marvel G400-TV''. There are also a number of OEM models with different specifications. Some cards come with slightly slower SDRAM instead of SGRAM.&lt;br /&gt;
&lt;br /&gt;
===G450===&lt;br /&gt;
A cost-reduced version of G400 with similar performance and features. G400 Max is faster.&lt;br /&gt;
&lt;br /&gt;
Retail products were ''Millennium G450'' and ''Marvel G450 eTV''.&lt;br /&gt;
&lt;br /&gt;
===G550===&lt;br /&gt;
A Direct3D 6 GPU in practice. It does have a hardware transform and lighting unit but it is not Direct3D 7 compliant. This was only used for the Headcasting software. Performance of this card is slightly above the G400 and G450.&lt;br /&gt;
&lt;br /&gt;
The retail model was ''Millennium G550''.&lt;br /&gt;
&lt;br /&gt;
===Parhelia===&lt;br /&gt;
[[File:MatroxParhelia128.jpg|thumb|Parhelia revision 1]]&lt;br /&gt;
Matrox's first Direct3D 8 accelerator, although it was initially advertised as having partial D3D9 capabilities. Performance is similar to a GeForce4 Ti. The initial version of the GPU has some bugs with secondary displays and also shipped with a low clock speed because of manufacturing difficulties. A later version increased clock speeds but also eliminated AGP 2x (3.3 V) support.&lt;br /&gt;
&lt;br /&gt;
It features a unique anti-aliasing technique called fragment anti-aliasing that provides very high quality (claimed 16X-equivalent MSAA). This technique has some caveats though, such as incompatibility with stencil buffering, and so was not further developed.&lt;br /&gt;
&lt;br /&gt;
The first Matrox chip with full DVD acceleration.&lt;br /&gt;
&lt;br /&gt;
It does not support Windows 9x.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
{{#ev:youtube|K2CXlvWRzF4}}&lt;br /&gt;
{{#ev:youtube|uPY9lsMDW-o}}&lt;br /&gt;
{{#ev:youtube|bdA2UwA1YPc}}&lt;br /&gt;
{{#ev:youtube|qrV6eAPdMlA}}&lt;br /&gt;
{{#ev:youtube|SQTntSm17Xs}}&lt;br /&gt;
{{#ev:youtube|AntyM8KtIAs}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://web.archive.org/web/20040110103309/http://grafi.ii.pw.edu.pl/gbm/matrox/ MatroX Files] - site with technical information about various Matrox cards. Includes overclocking, BIOS modification, tweaks, etc.&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1336</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1336"/>
				<updated>2013-04-28T09:15:08Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical legacy features: fog tables and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. This is the same technology used by the Sega Saturn, and as such various Saturn games were ported to the NV1. Its audio side offers wavetable MIDI and DirectSound support but very little DOS support, and DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. In texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. The M64 is nevertheless faster than the original TNT in some situations.&lt;br /&gt;
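Why the narrower bus halves bandwidth can be sketched with simple arithmetic; a minimal illustration (the 150 MHz memory clock below is an illustrative figure, not an exact card spec):

```python
# Peak memory bandwidth = bus width in bytes * effective memory clock.
# The clock figure is illustrative, not taken from a specific TNT2 board.

def peak_bandwidth_mb_s(bus_width_bits, mem_clock_mhz):
    """Return theoretical peak memory bandwidth in MB/s."""
    return (bus_width_bits // 8) * mem_clock_mhz

# Same memory clock, different bus widths:
regular = peak_bandwidth_mb_s(128, 150)  # regular TNT2: 2400 MB/s
m64 = peak_bandwidth_mb_s(64, 150)       # TNT2 M64: 1200 MB/s

assert m64 * 2 == regular  # the 64-bit bus halves peak bandwidth
```

The M64 can still beat the original TNT in some situations because its higher core and memory clocks partly offset the narrower bus.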
&lt;br /&gt;
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
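For context, an 8-bit palettized texture stores one byte per texel plus a shared 256-entry color table that the texture unit indexes at sampling time. A minimal sketch of that lookup, assuming a hypothetical grayscale palette (the names here are illustrative, not any driver's API):

```python
# Each texel is one byte indexing a 256-entry RGB palette. Hardware without
# this feature forces games such as Final Fantasy VII to expand or emulate
# palettized textures, often incorrectly.

PALETTE = [(i, i, i) for i in range(256)]  # hypothetical grayscale palette

def depalettize(indices, palette):
    """Expand 8-bit palette indices into RGB tuples, as a palette-capable
    texture unit effectively does when sampling."""
    return [palette[i] for i in indices]

texels = bytes([0, 128, 255])
assert depalettize(texels, PALETTE) == [(0, 0, 0), (128, 128, 128), (255, 255, 255)]
```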
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption; for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to a 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. At 16-bit color depth, the 3D performance of the GeForce 2 MX is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, as well as the environment-mapped bump mapping and higher-level anisotropic filtering support, found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate; in other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment-mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to the 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better-quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). Compared to its main competitor, ATI's R200 series, the GeForce 3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were rebranded as the Ti 4800 SE and Ti 4800, respectively. The AGP 8x variant of the Ti 4200 was known simply as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog tables. They offer similar anti-aliasing and anisotropic filtering features to the NV2x series, but with improved performance. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all around but dropped palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1335</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1335"/>
				<updated>2013-04-28T09:09:34Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical legacy features: fog tables and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features, and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. This is the same technology used by the Sega Saturn, and as such various Saturn games were ported to the NV1. Its audio side offers wavetable MIDI and DirectSound support but very little DOS support, and DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. In texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. The M64 is nevertheless faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
Later in 1999, Acer Laboratories (ALi) integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption; for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to a 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output. At 16-bit color depth, the 3D performance of the GeForce 2 MX is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, as well as the environment-mapped bump mapping and higher-level anisotropic filtering support, found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate; in other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment-mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to the 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better-quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). Compared to its main competitor, ATI's R200 series, the GeForce 3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were rebranded as the Ti 4800 SE and Ti 4800, respectively. The AGP 8x variant of the Ti 4200 was known simply as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog tables. They offer similar anti-aliasing and anisotropic filtering features to the NV2x series, but with improved performance. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all around but dropped palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1334</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1334"/>
				<updated>2013-04-28T09:08:57Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV4-6 | RIVA TNT &amp;amp; TNT2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various Saturn games were ported to use the NV1. Its audio capabilities consist of wavetable MIDI and DirectSound support, but very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, and 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. For texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB of RAM allows for very high resolutions. OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
Later in 1999, Acer Laboratories (ALi) integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.&lt;br /&gt;
&lt;br /&gt;
These cards do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to a 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, the GeForce 256 / NV10 was the first Direct3D 7-compliant GPU, notable for introducing hardware transform and lighting (T&amp;amp;L). It was initially released with standard SDRAM, but a DDR version followed later. The DDR version is roughly twice as fast as the TNT2. The chip also introduced ordered-grid super-sampling anti-aliasing, anisotropic filtering support (up to the 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000.  These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;).  It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output.  The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20).  It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably more advanced in quality and performance than those of the NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support, and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;.  Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. However, in some cases it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speeds, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs.  They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9.  They are very useful for old games because they still support palettized textures and the fog table. Their anti-aliasing and anisotropic filtering features are similar to those of older models, but performance with these features is improved. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit memory bus or naming suffixes like LE, XT or VE, because they have been crippled in some way.  A few PCIe models were also made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all-around but drops palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1333</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1333"/>
				<updated>2013-04-28T09:01:13Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs first ventured into the consumer graphics market in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card, and when it shows up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be very steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is integrated into the Permedia 2's core logic, resulting in a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering at 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples being Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2. &lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;br /&gt;
The GLINT R3 is the third iteration of 3DLabs' Permedia architecture. It was released in mid 1999 and again was directed at the professional 3D and CAD application market. It is also Direct3D 6-compliant. At this time 3DLabs began producing their own graphics cards rather than license their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after its release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three popular cards based on this chip were released: 3DLabs' Permedia 3 Create!, Oxygen VX1 and the Oxygen GVX1. The Oxygen GVX1 featured a separate geometry co-processor (known as GAMMA) chip for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. Likewise with Delta, the GAMMA geometry co-processor is not optimized for games. Gaming performance-wise, the GLINT R3 is slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and it is definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Its performance in professional 3D and CAD applications, however, is much better. The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - if purchased in a retail box, there usually will be an &amp;quot;AGP 4x&amp;quot; sticker present on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The GLINT R3 requires Windows 98 or Windows NT 4.0 (or newer). Overall, this chip is definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1332</id>
		<title>Intel/Real3D</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1332"/>
				<updated>2013-04-28T08:08:43Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
&lt;br /&gt;
===i740 ===&lt;br /&gt;
Also known as the Intel740 and codenamed &amp;quot;Auburn&amp;quot;, this was Intel's first foray into the discrete graphics market. It was released in January 1998 and was the result of Intel's collaboration with Lockheed Martin's Real3D graphics division. It supports Direct3D and has an OpenGL ICD. It was designed to take full advantage of the AGP bus, and as such, it can only texture from system memory via the AGP bus. All of its local onboard graphics memory is used purely as a display frame buffer, so manufacturers were able to produce cards with as little as 2 MB of onboard memory (one such example being Intel's own Express 3D graphics card). The i740 also supports texture map resolutions up to 1024x1024. 3D rendering color depth is limited to 16-bit, but the chip has good dithering quality. Its performance allows it to compete well with NVIDIA's RIVA 128, ATI's Rage Pro and the original 3Dfx Voodoo Graphics chipset, but it falls woefully short of higher-end graphics solutions released around the same timeframe, such as 3Dfx's Voodoo2. Much of the bottleneck results from its sole use of AGP texturing - the i740 has to access textures through a channel that is often much slower than onboard graphics memory. It was also difficult to port i740 cards to the PCI bus - it often necessitated the use of an AGP-to-PCI bridge chip, resulting in increased card cost. The PCI variants could texture from onboard graphics memory and are faster than their AGP counterparts in some tests. The PCI variants also came with larger amounts of onboard memory - one such card was a PCI version of Real3D's Starfighter, which was manufactured with as much as 24 MB of onboard memory.&lt;br /&gt;
&lt;br /&gt;
The i740 supports Windows 9x and NT 4.0. Drivers for Windows 3.1x were also released, and the Windows 3.1x drivers are heavily based on Chips and Technologies' GUI accelerator drivers (since Intel had acquired Chips and Technologies in July 1997).&lt;br /&gt;
&lt;br /&gt;
The i740 has good overall DOS VGA compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
===i752 and i754 ===&lt;br /&gt;
The i752 and i754 are slight evolutions of the i740 architecture and were announced in April 1999. As of 2013, they are also Intel's last graphics chips used on discrete cards. The i752 supports AGP 2x while the i754 supports AGP 4x. New features include support for multitexturing, anisotropic filtering, MPEG-2 motion compensation and DVI displays. Both were cancelled shortly after release, and very few boards made it out into the wild.&lt;br /&gt;
The i752 and i754 technology respectively lived on as the IGP in Intel's i810 and i815 motherboard chipsets. Some of this technology further found its way into Intel's later IGPs, notably their Extreme Graphics and Graphics Media Accelerator (GMA) lines.&lt;br /&gt;
&lt;br /&gt;
==Video captures ==&lt;br /&gt;
===i740 ===&lt;br /&gt;
{{#ev:youtube|XDxjuXikgs8}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/getfile.php?fileid=330  VOGONS Drivers Intel i740 Driver and BIOS compilation] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1331</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1331"/>
				<updated>2013-04-28T08:06:53Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV1 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various Saturn games were ported to use the NV1. Its audio capabilities consist of wavetable MIDI and DirectSound support, but very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, and 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip with a 250 MHz RAMDAC and support for up to 8 MB SGRAM. In texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance than its predecessors. It is competitive with the Voodoo2 but more flexible, offering 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). However, 32-bit color rendering comes with a significant speed loss, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions. OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and TNT2 M64. The M64 model uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. The M64 is nevertheless faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
These cards do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock of the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces a blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various image output aspects. In 3D performance at 16-bit color depth, the GeForce 2 MX is slightly faster than a GeForce 256 SDR. With its relatively low price and solid performance, it became a popular card. The GeForce 2 MX core was later integrated into the northbridge of NVIDIA's nForce IGP motherboard chipset.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks hardware pixel and vertex shader support, as well as the environment mapped bump mapping and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb|GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. However, in some cases it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of older models, but performance with these features is improved. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit memory bus or naming suffixes like LE, XT or VE, because they have been crippled in some way. There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and is the first Direct3D 9.0c-compliant GPU series. It introduced support for Shader Model 3.0 and for PCI Express x16, though initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all-around but dropped palettized texture support, making it incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1329</id>
		<title>Intel/Real3D</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1329"/>
				<updated>2013-04-28T08:04:34Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
&lt;br /&gt;
===i740 ===&lt;br /&gt;
Also known as the Intel740 and codenamed &amp;quot;Auburn&amp;quot;, this was Intel's first foray into the discrete graphics market. It was released in January 1998 and was the result of Intel's collaboration with Lockheed Martin's Real3D graphics division. It supports Direct3D and has an OpenGL ICD. It was designed to take full advantage of the AGP bus, and as such, it can only texture from system memory via the AGP bus. All of its local onboard graphics memory was used purely as a display frame buffer. Therefore, manufacturers were able to produce cards with as little as 2 MB of onboard memory (one such example being Intel's own Express 3D graphics card). The i740 also supports texture map resolutions up to 1024x1024. 3D rendering color depth is limited to 16-bit, but the chip has good dithering quality. Its performance allows it to compete well with NVIDIA's RIVA 128, ATI's Rage Pro and the original 3Dfx Voodoo Graphics chipset, but it falls woefully short of higher-end graphics solutions released around the same timeframe, such as 3Dfx's Voodoo2. Much of the bottleneck results from its sole use of AGP texturing - the i740 has to access textures through a channel that is often much slower than the onboard graphics memory. It was also difficult to port i740 cards to the PCI bus, often necessitating the use of an AGP-to-PCI bridge chip, which increased card cost. The PCI variants could texture from onboard graphics memory and are faster than their AGP counterparts in some tests. The PCI variants also came with larger amounts of onboard memory - one such card was a PCI version of Real3D's Starfighter, which was manufactured with as much as 24 MB of onboard memory.&lt;br /&gt;
&lt;br /&gt;
The i740 supports Windows 9x and NT 4.0. Drivers for Windows 3.1x were also released, and the Windows 3.1x drivers are heavily based on Chips and Technologies' GUI accelerator drivers (since Intel had acquired Chips and Technologies in July 1997).&lt;br /&gt;
&lt;br /&gt;
The i740 has good overall DOS VGA compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
===i752 and i754 ===&lt;br /&gt;
The i752 and i754 are slight evolutions of the i740 architecture and they were announced in April 1999. As of 2013, they are also Intel's last graphics chips used on discrete cards. The i752 supports AGP 2x while the i754 supports AGP 4x. New features include support for multitexturing, anisotropic filtering, MPEG-2 motion compensation and DVI displays. Both were cancelled upon release and very few boards made it out into the wild.&lt;br /&gt;
The i752 and i754 technology respectively lived on as the IGP in Intel's i810 and i815 motherboard chipsets. Some of this technology further found its way into Intel's later IGPs, notably their Extreme Graphics and Graphics Media Accelerator (GMA) lines.&lt;br /&gt;
&lt;br /&gt;
==Video captures ==&lt;br /&gt;
===i740 ===&lt;br /&gt;
{{#ev:youtube|XDxjuXikgs8}}&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1327</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1327"/>
				<updated>2013-04-28T08:01:40Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* 3D Rage II */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement on ATI's previous Rage II chip. Improvements include an increased texture cache size (now at 4 KB) allowing for improved texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear filter alpha textures, resulting in transparent textures still having a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs and up until the late 2000s, it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface. It is also ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi called it. It is a 2 pixel per clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with synchronous memory and core. &lt;br /&gt;
&lt;br /&gt;
It supports environment mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that performs well and offers a noticeable quality improvement, but it is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. It is possible to enable fog table via registry tweaks but it was not officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS.  High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is ATI's first with Direct3D 8 compliance (in fact Direct3D 8.1). The Radeon 8500 is a 4 pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500. &lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. The GeForce 3 and later have higher quality anisotropic filtering, but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of Radeon 9000 Pro is not far off of Radeon 8500. Radeon 9100 is a rename of Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 line of cards. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. The Radeon 9800 Pro is competitive with the GeForce FX 5900 Ultra, but with Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred much less performance decrease. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it expends far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until the GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to the R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. These are still Shader Model 2.0 GPUs with some extensions beyond 2.0, earning them a 2.0b designation, but they are not 3.0-compliant. This was not an issue until about 2 years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games utilize 2.0b features - for example, Oblivion has more visual effects available on the X800 than on the 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis, if the card can maintain &amp;gt;= 60 fps. This works well with human vision and gives a tangible improvement to anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
&lt;br /&gt;
The R400 series is ATI's last generation of GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===3D Rage II ===&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
&lt;br /&gt;
===3D Rage Pro ===&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1326</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1326"/>
				<updated>2013-04-28T08:00:57Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* 3D Rage Pro */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement on ATI's previous Rage II chip. Improvements include an increased texture cache size (now at 4 KB) allowing for improved texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear filter alpha textures, resulting in transparent textures still having a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs and up until the late 2000s, it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface. It is also ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi called it. It is a 2 pixel per clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with synchronous memory and core. &lt;br /&gt;
&lt;br /&gt;
It supports environment mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that performs well and offers a noticeable quality improvement, but it is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. It is possible to enable fog table via registry tweaks but it was not officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS.  High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is ATI's first with Direct3D 8 compliance (in fact Direct3D 8.1). The Radeon 8500 is a 4 pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500. &lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. The GeForce 3 and later have higher quality anisotropic filtering, but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of Radeon 9000 Pro is not far off of Radeon 8500. Radeon 9100 is a rename of Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 line of cards. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. The Radeon 9800 Pro is competitive with the GeForce FX 5900 Ultra, but with Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred a much smaller performance penalty. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it expends far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. They are still Shader Model 2.0 GPUs with some extensions beyond 2.0 (hence the 2.0b designation), but they are not 3.0-compliant. This was not an issue until about 2 years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games do utilize 2.0b features - for example, Oblivion has more visual effects available on X800 than 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis, provided the card can maintain &amp;gt;= 60 fps. Because successive frames blend together in human vision, this gives a tangible improvement to anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of the R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
&lt;br /&gt;
The ATI R400 series are ATI's last GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===3D Rage II ===&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
&lt;br /&gt;
===3D Rage Pro ===&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1325</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1325"/>
				<updated>2013-04-28T07:59:51Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement on ATI's previous Rage II chip. Improvements include an increased texture cache size (now at 4 KB) allowing for improved texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear filter alpha textures, resulting in transparent textures still having a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs and up until the late 2000s, it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface. It is also ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi named it. It is a 2 pixel per clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with core and memory clocked synchronously.&lt;br /&gt;
&lt;br /&gt;
It supports environment-mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that is high performance and offers a nice quality improvement, but is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. It is possible to enable fog table via registry tweaks but it was not officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS.  High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is the first with Direct3D 8 compliance (Direct3D 8.1, in fact). The Radeon 8500 is a 4 pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500.&lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. GeForce 3+ have higher-quality anisotropic filtering but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of the Radeon 9000 Pro is not far off that of the Radeon 8500. Radeon 9100 is a rename of the Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 line of cards. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. Radeon 9800 Pro is competitive with GeForce FX 5900 Ultra, but with Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred a much smaller performance penalty. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it expends far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
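The fillrate tradeoff between MSAA and SSAA can be made concrete with a rough back-of-the-envelope sketch (the resolution, sample count, and the ~5% edge-pixel fraction are illustrative assumptions, not measured figures for any particular card):

```python
# Rough comparison of shading work per frame under 4x SSAA vs 4x MSAA.
# SSAA shades every sample; MSAA shades each pixel once and only pays
# for extra coverage samples at polygon edges (assumed ~5% of pixels
# here - an illustrative guess).

def ssaa_shaded(width, height, samples):
    # Every sample is a full shading operation.
    return width * height * samples

def msaa_shaded(width, height, samples, edge_fraction):
    # Interior pixels are shaded once; edge pixels pay for extra samples.
    base = width * height
    edge_extra = int(base * edge_fraction) * (samples - 1)
    return base + edge_extra

w, h = 1024, 768
print(ssaa_shaded(w, h, 4))        # 3145728 shading ops
print(msaa_shaded(w, h, 4, 0.05))  # 904395 shading ops
```

Under these assumptions MSAA does under a third of the shading work of SSAA at the same nominal sample count, which is why it remains usable at higher resolutions.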
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. They are still Shader Model 2.0 GPUs with some extensions beyond 2.0 (hence the 2.0b designation), but they are not 3.0-compliant. This was not an issue until about 2 years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games do utilize 2.0b features - for example, Oblivion has more visual effects available on X800 than 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis, provided the card can maintain &amp;gt;= 60 fps. Because successive frames blend together in human vision, this gives a tangible improvement to anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of the R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
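The temporal AA idea can be sketched in a few lines: alternate between two sample patterns on successive frames, but only while the frame rate stays at or above 60 fps, so the eye blends the patterns instead of seeing them flicker. The offsets below are illustrative, not ATI's actual patterns:

```python
# Minimal sketch of temporal AA pattern selection. Two 2-sample jitter
# patterns alternate per frame; together they cover 4 distinct sample
# positions, approximating 4x AA at the cost of 2x. Below 60 fps the
# driver falls back to a fixed pattern to avoid visible flicker.

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]

def sample_pattern(frame_index, fps):
    if fps < 60:
        return PATTERN_A                 # fixed pattern fallback
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

print(sample_pattern(0, 75))   # [(0.25, 0.25), (0.75, 0.75)]
print(sample_pattern(1, 75))   # [(0.75, 0.25), (0.25, 0.75)]
print(sample_pattern(1, 45))   # [(0.25, 0.25), (0.75, 0.75)]
```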
&lt;br /&gt;
The ATI R400 series are ATI's last GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===3D Rage II ===&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
&lt;br /&gt;
===3D Rage Pro ===&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1324</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1324"/>
				<updated>2013-04-28T07:57:48Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* NV1 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical legacy features: fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
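The palettized-texture feature is simple to picture: an 8-bit texture stores one byte per texel, which indexes a 256-entry color palette. Hardware without palette support forces the game or driver to expand such textures to full RGB up front, which some old games cannot do. A minimal sketch (the palette and texture data are made up for illustration):

```python
# 8-bit palettized texture lookup: each texel is an index into a
# 256-entry palette of RGB tuples.

palette = [(i, i, i) for i in range(256)]   # illustrative grayscale ramp
texture = bytes([0, 128, 255, 64])          # 2x2 texture of palette indices

def texel(tex, width, x, y):
    # Fetch the index at (x, y) and resolve it through the palette.
    return palette[tex[y * width + x]]

print(texel(texture, 2, 1, 0))   # (128, 128, 128)
print(texel(texture, 2, 0, 1))   # (255, 255, 255)
```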
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. Its audio consists of wavetable MIDI and DirectSound support, but with very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation, while 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 exhibits a few rendering quality issues, such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games of the time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX. The RIVA 128 ZX is an upgraded chip that has a 250 MHz RAMDAC and supports up to 8 MB SGRAM. For texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
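The RAMDAC figures quoted above directly bound the displayable resolution and refresh rate: the required pixel clock is roughly horizontal resolution times vertical resolution times refresh rate, times a blanking-overhead factor. The ~1.32 overhead used below is a rough rule of thumb, not exact VESA timing math:

```python
# Approximate pixel clock (MHz) required for a CRT display mode.
# The 1.32 blanking-overhead factor is a rough approximation of the
# extra time spent in horizontal/vertical blanking, not exact GTF math.
def pixel_clock_mhz(width, height, refresh_hz, overhead=1.32):
    return width * height * refresh_hz * overhead / 1e6

print(round(pixel_clock_mhz(1280, 1024, 75)))   # 130 - fine for either RAMDAC
print(round(pixel_clock_mhz(1600, 1200, 85)))   # 215 - beyond 206 MHz, within 250 MHz
```

By this estimate, 1600x1200@85 Hz is out of reach of the original RIVA 128's 206 MHz RAMDAC but within the 250 MHz RAMDAC of the ZX.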
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed loss, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions. OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and TNT2 M64. The M64 model uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations.&lt;br /&gt;
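The bandwidth halving follows directly from the bus width: peak bandwidth for single-data-rate memory is bus width in bytes times memory clock. The 150 MHz clock below is a typical TNT2 figure, used only for illustration:

```python
# Peak memory bandwidth in MB/s for SDR memory:
# bus width (bits) / 8 bytes, times memory clock (MHz).
def bandwidth_mb_s(bus_bits, mem_clock_mhz):
    return bus_bits // 8 * mem_clock_mhz

print(bandwidth_mb_s(128, 150))   # 2400 - regular 128-bit TNT2
print(bandwidth_mb_s(64, 150))    # 1200 - TNT2 M64, half the bandwidth
```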
&lt;br /&gt;
These cards do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low-quality analog circuit design that produces a blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half of the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;). It also has &amp;quot;Digital Vibrance Control&amp;quot; that allows calibration of various image output aspects. The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly higher than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with a significantly higher clock speed than NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; which was a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, as well as the environment-mapped bump mapping and higher-level anisotropic filtering support, found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb||GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;. Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. However, in some cases it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance, and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. Anti-aliasing and anisotropic filtering features are similar to older models, but performance with these is improved. High-performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolutions with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with a 64-bit bus and naming suffixes like LE, XT or VE, because they have been crippled in some way. There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and is NVIDIA's first Direct3D 9.0c-compliant GPU generation. It introduced support for Shader Model 3.0 and for PCI Express x16, though initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all-around but dropped palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1323</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1323"/>
				<updated>2013-04-28T07:56:44Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* RIVA 128 / 128 ZX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical legacy features: fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. Its audio supports wavetable MIDI and DirectSound, but offers very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
Riva 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D accelerated resolutions up to 960x720 with Z-buffer. The 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip that has a 250 MHz RAMDAC and supports up to 8 MB SGRAM. For texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed penalty, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions. OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 M64. The M64 uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. Even so, the M64 is still faster than the original TNT in some situations.&lt;br /&gt;
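&lt;br /&gt;
The bandwidth halving of the M64 is simple arithmetic: peak memory bandwidth is the bus width in bytes times the effective memory clock. A minimal sketch; the 150 MHz clock below is assumed purely for illustration, not a documented spec:&lt;br /&gt;

```python
# Peak memory bandwidth in MB/s: bus width in bytes times memory clock in MHz.
# The 150 MHz clock is an assumed, illustrative figure, not taken from this page.
def peak_bandwidth_mb_s(bus_width_bits, mem_clock_mhz, data_rate=1):
    return bus_width_bits // 8 * mem_clock_mhz * data_rate

regular = peak_bandwidth_mb_s(128, 150)  # regular TNT2, 128-bit interface
m64 = peak_bandwidth_mb_s(64, 150)       # TNT2 M64, 64-bit interface
assert regular == 2 * m64                # halving the bus width halves the bandwidth
```
&lt;br /&gt;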
&lt;br /&gt;
These cards do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low quality analog circuit design that produces blurry image output.&lt;br /&gt;
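&lt;br /&gt;
The "twice the texture fillrate per clock" claim can be sketched numerically. The pipeline/TMU counts and clocks below are commonly cited figures assumed here for illustration, not taken from this page:&lt;br /&gt;

```python
# Theoretical texture fillrate in Mtexels/s:
# pipelines times texture units per pipeline times core clock in MHz.
# Pipeline/TMU counts and clocks are assumed, commonly cited figures.
def texture_fillrate_mtexels(pipelines, tmus_per_pipe, core_mhz):
    return pipelines * tmus_per_pipe * core_mhz

gf256 = texture_fillrate_mtexels(4, 1, 120)  # GeForce 256: 4 pipelines, 1 TMU each
gts = texture_fillrate_mtexels(4, 2, 200)    # GeForce 2 GTS: 2 TMUs per pipeline, higher clock
assert gf256 == 480 and gts == 1600
```

On paper the GTS more than triples the GeForce 256's texel rate, yet the measured gain is only around 40%, which illustrates how memory bandwidth, not fillrate, limits GeForce 2 performance.&lt;br /&gt;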
&lt;br /&gt;
GeForce 2 MX / NV11 is the low end series of the GeForce 2, released in September 2000.  These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;).  It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output.  At 16-bit color depth, the 3D performance of the GeForce 2 MX is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with a significantly higher clock speed than NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20).  It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb|GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;.  Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAM DAC for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs.  They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9.  They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of older models, but with improved performance. The high performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises NVIDIA's first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all-around but dropped palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
{{#ev:youtube|hv07UKRetPY}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1322</id>
		<title>NVIDIA</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=NVIDIA&amp;diff=1322"/>
				<updated>2013-04-28T07:54:52Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Retro value==&lt;br /&gt;
GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D driver supports two critical old features, fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.&lt;br /&gt;
&lt;br /&gt;
One problem with NVIDIA cards prior to GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality. It causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult but the GeForce4 and newer cards are most likely to be good.&lt;br /&gt;
&lt;br /&gt;
Perhaps the finest choices are the GeForce FX series because they offer the most refined quality-enhancing features and the high end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise they too are great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver [http://www.mdgx.com/files/nv8269.php (link)]. This modified driver does not support Windows 95. The GeForce 7 series is also the last that is supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only XP and newer.&lt;br /&gt;
&lt;br /&gt;
==Cards==&lt;br /&gt;
===NV1 / STG2000 ===&lt;br /&gt;
[[File:Dedge3d.jpg|thumb|NV1]]&lt;br /&gt;
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is not Direct3D or OpenGL compatible, so it is only useful with games that use its proprietary API. It is the same technology used by the Sega Saturn, and as such various games were ported to the NV1. Its audio supports wavetable MIDI and DirectSound, but offers very little DOS support. DOS VGA compatibility is limited.&lt;br /&gt;
&lt;br /&gt;
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.&lt;br /&gt;
&lt;br /&gt;
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.&lt;br /&gt;
&lt;br /&gt;
===NV3 | RIVA 128 &amp;amp; 128 ZX===&lt;br /&gt;
[[File:Riva128.jpg|thumb|RIVA 128]]&lt;br /&gt;
The RIVA 128 was released in late 1997 and it is the first Direct3D compatible GPU from NVIDIA. RIVA stands for '''R'''eal-time '''I'''nteractive '''V'''ideo and '''A'''nimation and 128 for the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3.&lt;br /&gt;
Riva 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D accelerated resolutions up to 960x720 with Z-buffer. The 3D performance is competitive with [[3dfx|Voodoo Graphics]] (Voodoo1). The RIVA 128 shows a few rendering quality issues, like visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as ''AGP content on PCI''. The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.&lt;br /&gt;
&lt;br /&gt;
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX, an upgraded chip that has a 250 MHz RAMDAC and supports up to 8 MB SGRAM. For texture-intensive games, the increased memory results in higher performance.&lt;br /&gt;
&lt;br /&gt;
===NV4-6 | RIVA TNT &amp;amp; TNT2===&lt;br /&gt;
[[File:Tntpci.jpg|thumb|TNT PCI]]&lt;br /&gt;
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed penalty, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB RAM allows for very high resolutions. OpenGL support is excellent. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 M64. The M64 uses a 64-bit memory interface (rather than the 128-bit interface of the regular variant), effectively halving its memory bandwidth. Even so, the M64 is still faster than the original TNT in some situations.&lt;br /&gt;
&lt;br /&gt;
These cards do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.&lt;br /&gt;
&lt;br /&gt;
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption - for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.&lt;br /&gt;
&lt;br /&gt;
===NV1x | GeForce 256 / 2 / 4 MX===&lt;br /&gt;
[[File:Gf256sdr.jpg|thumb|GeForce 256 SDR]]&lt;br /&gt;
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.&lt;br /&gt;
&lt;br /&gt;
The next refinement in the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used a low quality analog circuit design that produces blurry image output.&lt;br /&gt;
&lt;br /&gt;
GeForce 2 MX / NV11 is the low end series of the GeForce 2, released in September 2000.  These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called &amp;quot;TwinView&amp;quot;).  It also has &amp;quot;Digital Vibrance Control&amp;quot;, which allows calibration of various aspects of the image output.  At 16-bit color depth, the 3D performance of the GeForce 2 MX is slightly better than that of a GeForce 256 SDR. With its relatively low price and the performance it offered, it became a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth improving features, combined with a significantly higher clock speed than NV11, allows it to match NV15 performance. These features were advertised as &amp;quot;Lightspeed Memory Architecture II,&amp;quot; a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20).  It also gained the &amp;quot;AccuView&amp;quot; anti-aliasing capabilities, which are considerably advanced in quality and performance over NV11 and NV15. However, the GeForce 4 MX lacks the hardware pixel and vertex shader support, environment mapped bump mapping support and higher-level anisotropic filtering support found in the NV2x architecture.&lt;br /&gt;
*[http://www.anandtech.com/show/875/6 AnandTech: NV17 and NV25 Come to Life]&lt;br /&gt;
*[http://www.nvidia.com/object/feature_lmaii.html Lightspeed Memory Architecture II]&lt;br /&gt;
&lt;br /&gt;
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.&lt;br /&gt;
&lt;br /&gt;
Estimated model performance ranking:&lt;br /&gt;
&lt;br /&gt;
GF2 MX100 &amp;lt; GF2 MX200 &amp;lt; GF2 MX &amp;lt; GF2 MX400 &amp;lt; GF4 MX420 &amp;lt; GF2 GTS &amp;lt; GF2 Pro &amp;lt; GF2 Ti VX &amp;lt; GF2 Ti &amp;lt; GF2 Ultra &amp;lt; GF4 MX440&lt;br /&gt;
&lt;br /&gt;
===NV2x | GeForce 3 &amp;amp; 4===&lt;br /&gt;
[[File:GeForce3_Ti200.JPG|180px|thumb|GeForce 3 Ti 200]]&lt;br /&gt;
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this &amp;quot;Lightspeed Memory Architecture&amp;quot;.  Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.&lt;br /&gt;
&lt;br /&gt;
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called &amp;quot;Quincunx&amp;quot; that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.&lt;br /&gt;
&lt;br /&gt;
The next evolution in the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance and dual RAM DAC for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the &amp;quot;Ti 4200 with AGP 8x.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.&lt;br /&gt;
*[http://ixbtlabs.com/articles/gf4/index1.html Review from ixbtlabs GeForce 3 Ti500, Radeon 8500, GeForce 4 Ti]&lt;br /&gt;
&lt;br /&gt;
===NV3x | GeForce FX===&lt;br /&gt;
[[File:Gf5200u.jpg|thumb|FX 5200 Ultra]]&lt;br /&gt;
These are NVIDIA's first Direct3D 9 GPUs.  They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9.  They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of older models, but with improved performance. The high performance models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way.  There were some PCIe models made, named GeForce PCX 5xxx.&lt;br /&gt;
&lt;br /&gt;
5200 &amp;lt; 5500 &amp;lt; 5200 Ultra &amp;lt; 5600 &amp;lt; 5600 Ultra &amp;lt; 5700 &amp;lt; 5700 Ultra &amp;lt; 5800 &amp;lt; 5800 Ultra &amp;lt; 59x0 &amp;lt; 59x0 Ultra&lt;br /&gt;
&lt;br /&gt;
===NV4x | GeForce 6===&lt;br /&gt;
The GeForce 6 series (NV4x) was released in 2004 and comprises NVIDIA's first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x had dramatically improved performance all-around but dropped palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.&lt;br /&gt;
&lt;br /&gt;
==Compatibility notes==&lt;br /&gt;
*With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
===NV1 ===&lt;br /&gt;
{{#ev:youtube|2H-gEKsd5SY}}&lt;br /&gt;
{{#ev:youtube|9bJGnzb0Asw}}&lt;br /&gt;
{{#ev:youtube|t9Zqy5BZQ1E}}&lt;br /&gt;
{{#ev:youtube|_4l_2Bqbi9Q}}&lt;br /&gt;
{{#ev:youtube|ZOstiHUk_EM}}&lt;br /&gt;
{{#ev:youtube|JG0kqg8QlLw}}&lt;br /&gt;
{{#ev:youtube|nc6cH0zuMSs}}&lt;br /&gt;
{{#ev:youtube|VK9sg_93iCE}}&lt;br /&gt;
{{#ev:youtube|Ibs90LY_Ph8}}&lt;br /&gt;
&lt;br /&gt;
===RIVA 128 / 128 ZX ===&lt;br /&gt;
{{#ev:youtube|0GAEXE3eu0o}}&lt;br /&gt;
{{#ev:youtube|Cqxia9tPFrs}}&lt;br /&gt;
{{#ev:youtube|LZrWRMMdwW4}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://www.vogonsdrivers.com/index.php?catid=24  VOGONS Drivers NVIDIA section] &lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1321</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1321"/>
				<updated>2013-04-28T07:42:16Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement on ATI's previous Rage II chip. Improvements include an increased texture cache size (now at 4 KB) allowing for improved texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear filter alpha textures, resulting in transparent textures still having a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs and up until the late 2000s, it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface. It is also ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi named it. It is a 2 pixel per clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with core and memory clocked synchronously. &lt;br /&gt;
&lt;br /&gt;
It supports environment-mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that is fast and offers a noticeable quality improvement, but it is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. Fog table can be enabled via registry tweaks, but this was never officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is ATi's first with Direct3D 8 compliance (Direct3D 8.1, in fact). The Radeon 8500 is a 4-pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500.&lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. GeForce 3 and later have higher-quality anisotropic filtering, but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of the Radeon 9000 Pro is not far off that of the Radeon 8500. The Radeon 9100 is a rename of the Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 lines. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. The Radeon 9800 Pro is competitive with the GeForce FX 5900 Ultra, but in Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred a much smaller performance decrease. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it consumes far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until the GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. These are still Shader Model 2.0 GPUs with some extensions beyond 2.0 (hence the 2.0b designation), but they are not 3.0-compliant. This was not an issue until about two years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games do utilize 2.0b features - for example, Oblivion has more visual effects available on the X800 than on the 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis if the card can maintain &amp;gt;= 60 fps. This works well with human vision and gives a tangible improvement in anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of the R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
&lt;br /&gt;
The R400 series are ATI's last GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1319</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1319"/>
				<updated>2013-04-28T07:41:02Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Video captures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement over ATI's previous Rage II chip. Improvements include an increased texture cache (now 4 KB), allowing for better texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear-filter alpha textures, so transparent textures still have a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs, and up until the late 2000s it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface, and ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi named it. It is a 2-pixel-per-clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with core and memory clocked synchronously.&lt;br /&gt;
&lt;br /&gt;
It supports environment-mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that is fast and offers a noticeable quality improvement, but it is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. Fog table can be enabled via registry tweaks, but this was never officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is ATi's first with Direct3D 8 compliance (Direct3D 8.1, in fact). The Radeon 8500 is a 4-pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500.&lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. GeForce 3 and later have higher-quality anisotropic filtering, but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of the Radeon 9000 Pro is not far off that of the Radeon 8500. The Radeon 9100 is a rename of the Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 lines. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. The Radeon 9800 Pro is competitive with the GeForce FX 5900 Ultra, but in Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred a much smaller performance decrease. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it consumes far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until the GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. These are still Shader Model 2.0 GPUs with some extensions beyond 2.0 (hence the 2.0b designation), but they are not 3.0-compliant. This was not an issue until about two years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games do utilize 2.0b features - for example, Oblivion has more visual effects available on the X800 than on the 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis if the card can maintain &amp;gt;= 60 fps. This works well with human vision and gives a tangible improvement in anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of the R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
&lt;br /&gt;
The R400 series are ATI's last GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1318</id>
		<title>ATI</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=ATI&amp;diff=1318"/>
				<updated>2013-04-28T07:37:59Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;ATi Technologies produced graphics cards from the '80s through the mid '00s until merging with AMD in 2006. AMD still produces graphics cards today.&lt;br /&gt;
&lt;br /&gt;
== Graphics card series ==&lt;br /&gt;
=== Mach ===&lt;br /&gt;
===== Mach 8 =====&lt;br /&gt;
===== Mach 32 =====&lt;br /&gt;
===== Mach 64 =====&lt;br /&gt;
&lt;br /&gt;
=== Rage ===&lt;br /&gt;
[[File:ATIRage128Pro.JPG|thumb|Rage 128 Pro OEM]]&lt;br /&gt;
&lt;br /&gt;
===== 3D Rage =====&lt;br /&gt;
===== 3D Rage II =====&lt;br /&gt;
===== 3D Rage Pro =====&lt;br /&gt;
Released in the latter half of 1997, the Rage Pro was a major improvement over ATI's previous Rage II chip. Improvements include an increased texture cache (now 4 KB), allowing for better texture filtering, as well as an integrated triangle setup engine. It is the first ATI chip (and among the earliest graphics chips) to fully support AGP bus features, including execute mode (AGP texturing). It is also the first ATI chip to support OpenGL in hardware. However, like the previous Rage chips, the Rage Pro cannot bilinear-filter alpha textures, so transparent textures still have a rough appearance. Performance-wise, it is very similar to 3Dfx's original Voodoo Graphics chipset. The Rage Pro was very popular with OEMs, and up until the late 2000s it was integrated into many server motherboards.&lt;br /&gt;
&lt;br /&gt;
The Rage Pro is also the last chip to support ATI's CIF application programming interface, and ATI's last chip with Windows 3.1x support.&lt;br /&gt;
&lt;br /&gt;
===== Rage 128 =====&lt;br /&gt;
&lt;br /&gt;
=== Radeon ===&lt;br /&gt;
&lt;br /&gt;
===== R100 =====&lt;br /&gt;
[[File:Radeon7500agp.jpg|thumb|Radeon 7500 64MB]]&lt;br /&gt;
The original Radeon was a Direct3D 7 visual processing unit (VPU), as ATi named it. It is a 2-pixel-per-clock design with 3 texture units on each of the pixel pipelines. The 166 MHz Radeon DDR (aka 7200) is competitive with the GeForce 256 DDR. Clock speeds varied from 143 to 200 MHz, with core and memory clocked synchronously.&lt;br /&gt;
&lt;br /&gt;
It supports environment-mapped bump mapping (EMBM), unlike GeForce cards at the time. It has a basic form of anisotropic filtering that is fast and offers a noticeable quality improvement, but it is highly angle-dependent and cannot operate at the same time as trilinear filtering. It also offers ordered-grid supersampling anti-aliasing.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures. Fog table can be enabled via registry tweaks, but this was never officially supported.&lt;br /&gt;
&lt;br /&gt;
RV100 (Radeon VE / 7000) is a chip with dual display capabilities but with reduced 3D hardware. It lacks T&amp;amp;L and has a single pixel pipeline. It is somewhat faster than TNT2 Ultra and G400 Max.&lt;br /&gt;
&lt;br /&gt;
RV200 (Radeon 7500) is a die shrink of R100 with some improvements. It has more anisotropic filtering options and is capable of asynchronous clocking of memory and the core. The top of the line model is clocked at 290 MHz core and 230 MHz RAM, and competes with GeForce 2 Ti/Pro. There are many variations of this card.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
&lt;br /&gt;
===== R200 =====&lt;br /&gt;
[[File:Radeon8500 128mb.JPG|thumb|Radeon 8500 128MB]]&lt;br /&gt;
This generation is ATi's first with Direct3D 8 compliance (Direct3D 8.1, in fact). The Radeon 8500 is a 4-pipeline design with 2 texture units per pipeline and operates at up to 275 MHz, typically with synchronous core and RAM. It is competitive with the GeForce 3 Ti 500.&lt;br /&gt;
&lt;br /&gt;
A wide variety of supersampling anti-aliasing modes are available (2-6x, quality/performance). ATi calls it &amp;quot;Smoothvision&amp;quot;. It uses various techniques, including a jittered-grid pattern for some modes/cases and ordered-grid for others. In Direct3D, fog may force it to use ordered-grid. Drivers vary in their behavior as well.[http://forum.beyond3d.com/showpost.php?p=4859&amp;amp;postcount=64]&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering is somewhat improved, with more levels supported, but is again very angle-dependent and cannot work with trilinear filtering. GeForce 3 and later have higher-quality anisotropic filtering, but with a much higher performance impact.&lt;br /&gt;
&lt;br /&gt;
ATi introduced a tessellation function called [[TruForm]].&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
&lt;br /&gt;
RV250 and RV280, known as Radeon 9000, 9200 and 9250, are slight evolutions of the design. They have somewhat reduced specifications but are more efficient and run cooler. They were popular notebook GPUs. Performance of the Radeon 9000 Pro is not far off that of the Radeon 8500. The Radeon 9100 is a rename of the Radeon 8500 LE.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*DVI on these cards is flaky and is essentially unusable with DOS. High resolutions may also be problematic.&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R300 =====&lt;br /&gt;
[[File:Radeon9800pro256.JPG|thumb|Radeon 9800 Pro 256MB]]&lt;br /&gt;
Introduced in August 2002, the R300 GPUs are Direct3D 9.0-compliant graphics chips. R300 introduced Shader Model 2.0 support and is also OpenGL 2.0-compliant. The R300 was designed by the ArtX engineering team that ATI had acquired in February 2000. The same ArtX engineers (who were also former SGI employees) designed the Nintendo GameCube GPU (Flipper) as well as the SGI RealityEngine-based graphics processor in the Nintendo 64. The first R300-based cards released were the Radeon 9500 and 9700 lines. In 2003, the Radeon 9600 and 9800 series were added to the lineup. R300 has many improvements and noticeably better visual quality than ATI's prior chips. The Radeon 9800 Pro is competitive with the GeForce FX 5900 Ultra, but in Direct3D 9 games the GeForce FX falls far behind.&lt;br /&gt;
&lt;br /&gt;
Anisotropic filtering quality is vastly improved in the R300, with much lower angle-dependency and the ability to work simultaneously with trilinear filtering. Furthermore, compared to its initial competitor, NVIDIA's GeForce 4 Ti series, R300's anisotropic filtering incurred a much smaller performance decrease. Anti-aliasing is now performed with 2-6x gamma-corrected rotated-grid multi-sampling anti-aliasing. MSAA operates only on polygon edges, which means no anti-aliasing within textures or of transparent textures, but it consumes far less fillrate and is thus usable at higher resolutions. NVIDIA did not match the quality of this MSAA until the GeForce 8. However, ATi did not support any form of super-sampling with R300-R700, while NVIDIA did.&lt;br /&gt;
&lt;br /&gt;
The R300 enjoyed visual quality and performance supremacy over its competitors in games and applications that extensively used Shader Model 2.0. NVIDIA would not be able to match or exceed ATI's Direct3D 9.0 performance until the release of the GeForce 6 series in 2004.&lt;br /&gt;
&lt;br /&gt;
Backwards compatibility with old D3D 5 games is limited because of the lack of support for fog table and palettized textures.&lt;br /&gt;
Also, despite being Direct3D 9.0-compliant, the R300 is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
'''Notes:'''&lt;br /&gt;
*With Star Wars KOTOR and KOTOR2, use Catalyst 4.2.&lt;br /&gt;
&lt;br /&gt;
===== R400 ===== &lt;br /&gt;
[[File:RadeonX800XTPE.jpg|thumb|Radeon X800 XT PE]]&lt;br /&gt;
&lt;br /&gt;
Introduced in 2004, this is ATi's Direct3D 9.0b generation. It is very similar to R300 in general, but with 16 pipelines in the top chip instead of 8, and higher clock speeds. These are still Shader Model 2.0 GPUs with some extensions beyond 2.0 (hence the 2.0b designation), but they are not 3.0-compliant. This was not an issue until about two years after launch, when games started to outright require Shader Model 3.0 or run without some visual features. Some games do utilize 2.0b features - for example, Oblivion has more visual effects available on the X800 than on the 9800.&lt;br /&gt;
&lt;br /&gt;
A new anti-aliasing mode was introduced, called temporal AA. This feature shifts the sampling pattern on a per-frame basis if the card can maintain &amp;gt;= 60 fps. This works well with human vision and gives a tangible improvement in anti-aliasing quality. Also, while not initially available, adaptive anti-aliasing was added to the R400 series after the release of the R500 series. Adaptive AA anti-aliases within transparent textures, giving MSAA more SSAA-like capabilities.&lt;br /&gt;
&lt;br /&gt;
The R400 series are ATI's last GPUs with official Windows 98/98 SE/ME support. As with the R300 series, the R400 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present.&lt;br /&gt;
&lt;br /&gt;
===== R500 =====&lt;br /&gt;
Introduced in 2005, the Radeon X1000 / R500 series are ATI's first Direct3D 9.0c-compliant GPUs with full Shader Model 3.0 features. The R500 series is not officially supported under Windows 7. However, for full Direct3D and OpenGL support, it is still possible to use the Windows Vista driver instead under Windows 7, although WDDM 1.1 features will not be present. &lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|IG3hd1humM0}}&lt;br /&gt;
{{#ev:youtube|wWzWdwj9NvU}}&lt;br /&gt;
{{#ev:youtube|uZna8WXC4ds}}&lt;br /&gt;
{{#ev:youtube|DU5Zi69QPQs}}&lt;br /&gt;
{{#ev:youtube|wdJXf6MpN7A}}&lt;br /&gt;
{{#ev:youtube|iFHwNf7-oZk}}&lt;br /&gt;
{{#ev:youtube|i4pB5Fw8Slk}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Hardware]]&lt;br /&gt;
[[Category:Graphics Cards]]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1314</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1314"/>
				<updated>2013-04-28T07:21:56Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API, known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers panned the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI-bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is nowadays a very rare card, and when it shows up on auction websites, the price is usually very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is even more difficult to find than the 3D Blaster VLB itself, and would also be steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips for the professional 3D and CAD markets. The Permedia NT differs from the original in that it adds a separate geometry co-processor chip (known as Delta) for transformation, clipping and lighting calculations. The Delta co-processor is optimized for professional 3D and CAD applications rather than for games, however, resulting in anemic overall gaming performance. The chips' main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or shipped only a miniport driver).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 as an evolution of the original Permedia. The discrete Delta geometry co-processor featured in the Permedia NT is integrated into the Permedia 2's core logic, making it a single-chip solution. The Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which leaves games with only monochromatic lighting - notable examples are Quake 2 and Quake III: Arena. No colored lighting appears in Quake 2 even with the OpenGL renderer, and Quake III: Arena supports only vertex lighting on the Permedia 2.&lt;br /&gt;
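The practical effect of missing colored vertex blending can be sketched in a few lines: with full support, each RGB channel of the vertex colour modulates the matching channel of the texture sample, while hardware limited to a single intensity collapses the light to greyscale. This is an illustration only, not driver code; the helper names are made up.

```python
# Illustration: lighting a texel with and without per-channel
# (colored) vertex blending. Hypothetical helper names.

def lit_texel_colored(texel, vertex_color):
    # Full colored vertex blending: each RGB channel of the light
    # modulates the matching channel of the texel.
    return tuple(round(t * c) for t, c in zip(texel, vertex_color))

def lit_texel_mono(texel, vertex_color):
    # Monochromatic fallback: only a single average intensity
    # survives, so a red light merely darkens the texel.
    intensity = sum(vertex_color) / 3.0
    return tuple(round(t * intensity) for t in texel)

texel = (200, 200, 200)      # grey texture sample, 0-255 per channel
red_light = (1.0, 0.2, 0.2)  # strongly red vertex colour, 0.0-1.0

print(lit_texel_colored(texel, red_light))  # tinted red: (200, 40, 40)
print(lit_texel_mono(texel, red_light))     # just darker grey: (93, 93, 93)
```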
&lt;br /&gt;
Most Permedia 2 graphics cards lack out-of-the-box VESA VBE support, requiring a TSR such as UniVBE/SciTech Display Doctor to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
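Such a TSR is typically loaded from AUTOEXEC.BAT before any games are started. A minimal sketch is shown below; the install path is an assumption, and switches vary by UniVBE version, so consult that version's documentation.

```shell
REM AUTOEXEC.BAT fragment (illustrative): load the UniVBE TSR so that
REM DOS games can use high-resolution VESA VBE modes on a Permedia 2.
REM C:\UNIVBE is an assumed install path - adjust to suit.
C:\UNIVBE\UNIVBE.EXE
```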
&lt;br /&gt;
===GLINT R3 ===&lt;br /&gt;
The GLINT R3 is the third iteration of 3DLabs' Permedia architecture. Released in mid-1999, it was again aimed at the professional 3D and CAD application market, and it is Direct3D 6-compliant. By this time 3DLabs had begun producing its own graphics cards rather than licensing its chips out to OEMs, much as 3dfx did with its Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot-product bump mapping, although this feature would not see extensive use in games until a year or so after the chip's release; other consumer-level graphics cards at the time supported only emboss bump mapping. Three popular cards were based on this chip: 3DLabs' Permedia 3 Create!, the Oxygen VX1 and the Oxygen GVX1. The Oxygen GVX1 featured a separate geometry co-processor chip (known as GAMMA) for transformation, clipping and lighting calculations, much as the Delta co-processor did in the Permedia and Permedia 2 series, and like Delta, GAMMA is not optimized for games. In gaming performance the GLINT R3 is slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and it is no match for the RIVA TNT2 or 3dfx's Voodoo3. Its performance in professional 3D and CAD applications, however, is much better. The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in the latter.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - retail boxes usually carry an &amp;quot;AGP 4x&amp;quot; sticker to indicate it.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The GLINT R3 requires Windows 98 or Windows NT 4.0 (or newer). Overall, this chip is definitely not geared towards gaming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Video captures==&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|c7tTYwNDWlU}}&lt;br /&gt;
&lt;br /&gt;
==Related links==&lt;br /&gt;
*[http://gona.mactar.hu/DOS_TESTS/ Gona's PCI and AGP DOS game compatibility matrix]&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1313</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1313"/>
				<updated>2013-04-28T07:18:56Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Game GLINT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB graphics memory for display frame buffer and 1 MB texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run in higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system) and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB nowadays is a very rare card and usually if it shows up on auction websites, the price will be very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be very steeply-priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL. Display drivers are available for Windows 3.1x and Windows 95.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996, and they served as value-oriented graphics chips designed for the professional 3D and CAD markets. Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, the main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips that had full OpenGL ICD driver support (other manufacturers didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 and was an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are in Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2. &lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;br /&gt;
The GLINT R3 is the third iteration of 3DLabs' Permedia architecture. It was released in mid 1999 and again was directed at the professional 3D and CAD application market. It is also Direct3D 6-compliant. At this time 3DLabs began producing their own graphics cards rather than license their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after its release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three popular cards based on this chip were released: 3DLabs' Permedia 3 Create!, Oxygen VX1 and the Oxygen GVX1. The Oxygen GVX1 featured a separate geometry co-processor (known as GAMMA) chip for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. Likewise with Delta, the GAMMA geometry co-processor is not optimized for games. Gaming performance-wise, the GLINT R3 is slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and it is definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Its performance in professional 3D and CAD applications, however, is much better. The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - if purchased in a retail box, there usually will be an &amp;quot;AGP 4x&amp;quot; sticker present on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The GLINT R3 requires Windows 98 or Windows NT 4.0 (or newer). Overall, this chip is definitely not geared towards gaming.&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1309</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1309"/>
				<updated>2013-04-28T07:10:58Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB graphics memory for display frame buffer and 1 MB texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run in higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system) and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB nowadays is a very rare card and usually if it shows up on auction websites, the price will be very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be very steeply-priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996, and they served as value-oriented graphics chips designed for the professional 3D and CAD markets. Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, the main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips that had full OpenGL ICD driver support (other manufacturers didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 and was an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are in Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2. &lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;br /&gt;
The GLINT R3 is the third iteration of 3DLabs' Permedia architecture. It was released in mid 1999 and again was directed at the professional 3D and CAD application market. It is also Direct3D 6-compliant. At this time 3DLabs began producing their own graphics cards rather than license their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after its release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three popular cards based on this chip were released: 3DLabs' Permedia 3 Create!, Oxygen VX1 and the Oxygen GVX1. The Oxygen GVX1 featured a separate geometry co-processor (known as GAMMA) chip for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. Likewise with Delta, the GAMMA geometry co-processor is not optimized for games. Gaming performance-wise, the GLINT R3 is slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and it is definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Its performance in professional 3D and CAD applications, however, is much better. The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - if purchased in a retail box, there usually will be an &amp;quot;AGP 4x&amp;quot; sticker present on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The GLINT R3 requires Windows 98 or Windows NT 4.0 (or newer). Overall, this chip is definitely not geared towards gaming.&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1308</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1308"/>
				<updated>2013-04-28T07:09:25Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* GLINT R3 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB graphics memory for display frame buffer and 1 MB texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run in higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system) and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB nowadays is a very rare card and usually if it shows up on auction websites, the price will be very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be very steeply-priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996, and they served as value-oriented graphics chips designed for the professional 3D and CAD markets. Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, the main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips that had full OpenGL ICD driver support (other manufacturers didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 and was an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are in Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2. &lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;br /&gt;
The GLINT R3 is the third iteration of 3DLabs' Permedia architecture. It was released in mid 1999 and again was directed at the professional 3D and CAD application market. At this time 3DLabs began producing their own graphics cards rather than license their graphics chips out to OEMs, in the same manner that 3dfx did with their Voodoo3/4/5 series. The GLINT R3 also introduced support for hardware dot product bump mapping, but this feature would not be extensively used in games until a year or so after its release. Other consumer-level graphics cards at the time only supported emboss bump mapping. Three popular cards based on this chip were released: 3DLabs' Permedia 3 Create!, Oxygen VX1 and the Oxygen GVX1. The Oxygen GVX1 featured a separate geometry co-processor (known as GAMMA) chip for performing transformation, clipping and lighting calculations, much in the same manner that the Delta co-processor functioned in the Permedia and Permedia 2 series. Likewise with Delta, the GAMMA geometry co-processor is not optimized for games. Gaming performance-wise, the GLINT R3 is slower than even NVIDIA's RIVA TNT (released several months earlier) in most situations, and it is definitely no match for the RIVA TNT2 or 3dfx's Voodoo3. Its performance in professional 3D and CAD applications, however, is much better. The Oxygen GVX1 is slower than NVIDIA's GeForce 256 and Quadro in both games and professional 3D applications, but produces far fewer rendering anomalies in professional 3D applications.&lt;br /&gt;
&lt;br /&gt;
Later card revisions added AGP 4x support - if purchased in a retail box, there usually will be an &amp;quot;AGP 4x&amp;quot; sticker present on the box to indicate such support.&lt;br /&gt;
&lt;br /&gt;
Compared to 3DLabs' previous graphics chips, the GLINT R3 has better DOS VGA compatibility and contains VESA VBE support out-of-the-box.&lt;br /&gt;
&lt;br /&gt;
The GLINT R3 requires Windows 98 or Windows NT 4.0 (or newer). Overall, this chip is definitely not geared towards gaming.&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1306</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1306"/>
				<updated>2013-04-28T06:56:50Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia 2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB graphics memory for display frame buffer and 1 MB texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run in higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system) and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB nowadays is a very rare card and usually if it shows up on auction websites, the price will be very high. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is more difficult to find than the 3D Blaster VLB itself, and would also be very steeply-priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996, and they served as value-oriented graphics chips designed for the professional 3D and CAD markets. Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, the main advantage at the time was that 3DLabs was the only company producing consumer-level graphics chips that had full OpenGL ICD driver support (other manufacturers didn't support OpenGL at all or released a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
The Permedia 2 was released in late 1997 and was an evolution of the original Permedia. The discrete Delta geometry co-processor chip originally featured in the Permedia NT is now integrated into the Permedia 2's core logic, resulting in a single-chip solution. Permedia 2 supports AGP texturing as well as 3D rendering in 32-bit color depth. However, it does not support colored vertex blending, which results in games using only monochromatic lighting - notable examples are in Quake 2 and Quake III: Arena. No colored lighting is present in Quake 2 even when using the OpenGL renderer, and Quake III: Arena only supports vertex lighting on the Permedia 2. &lt;br /&gt;
&lt;br /&gt;
Most Permedia 2 graphics cards do not come with out-of-the-box VESA VBE support, requiring the use of TSRs such as UniVBE/SciTech Display Doctor in order to use higher-resolution VESA graphics modes in DOS. DOS VGA compatibility is also spotty. The Permedia 2 is also 3DLabs' last graphics chip with Windows 95 support.&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1305</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1305"/>
				<updated>2013-04-28T06:50:03Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Permedia and Permedia NT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is a unique card in that it is the only existing consumer-level 3D accelerator (and texture mapper) for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB graphics memory for display frame buffer and 1 MB texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)] and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the 3D Blaster VLB usually run in higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system) and reviewers knocked the card. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is a very rare card nowadays, and when one shows up on auction websites, it usually commands a very high price. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is even more difficult to find than the 3D Blaster VLB itself, and is also very steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs released the Permedia and Permedia NT in 1996 as value-oriented graphics chips designed for the professional 3D and CAD markets. The Permedia NT differs from the original in that it features a separate geometry co-processor chip (known as Delta) for performing transformation, clipping and lighting calculations. However, the Delta geometry co-processor is optimized for professional 3D and CAD applications rather than for games, resulting in anemic overall gaming performance. On the upside, 3DLabs was at the time the only company producing consumer-level graphics chips with full OpenGL ICD driver support (other manufacturers either didn't support OpenGL at all or released only a miniport driver instead).&lt;br /&gt;
&lt;br /&gt;
The Permedia and Permedia NT support Windows 9x and NT 4.0. Supported APIs include Direct3D, OpenGL ICD and HEIDI.&lt;br /&gt;
&lt;br /&gt;
Popular cards include Diamond Multimedia's FireGL 1000 and Leadtek's Winfast 3D L2200.&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1304</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1304"/>
				<updated>2013-04-28T06:40:40Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* Game GLINT */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is unique in being the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the card usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers knocked the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better game compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is a very rare card nowadays, and when one shows up on auction websites, it usually commands a very high price. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is even more difficult to find than the 3D Blaster VLB itself, and is also very steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1303</id>
		<title>3Dlabs</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=3Dlabs&amp;diff=1303"/>
				<updated>2013-04-28T06:39:14Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: Created page with &amp;quot;==Cards== ===Game GLINT ===  3DLabs' first tap into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down varian...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
===Game GLINT ===&lt;br /&gt;
&lt;br /&gt;
3DLabs' first foray into the consumer graphics market was in November 1995 with its Game GLINT (also known as GiGi) chip. It is a scaled-down variant of 3DLabs' GLINT 300TX chipset released a year earlier, which was designed for professional 3D and CAD applications. The only graphics card produced using this chip was Creative Labs' 3D Blaster VLB. It is unique in being the only consumer-level 3D accelerator (and texture mapper) ever made for the VLB bus, and it was designed for 80486 VLB users who wanted Pentium-level gaming performance. It has 1 MB of graphics memory for the display frame buffer and 1 MB of texture memory. The 3D Blaster VLB can be used as a standalone 2D/3D graphics card or as a 3D-only accelerator when paired with a 2D graphics card in the same system through a VGA pass-through cable (in the same manner that cards based on 3Dfx's Voodoo Graphics and Voodoo2 chipsets function). The 3D Blaster VLB supported Creative Labs' own proprietary API known as the Creative Graphics Library (CGL). Only a handful of CGL games were ever released [http://www.vogonswiki.com/index.php/List_of_games_supporting_proprietary_APIs#3D_Blaster_.28CGL.29 (more information here)], and only these games take advantage of the 3D Blaster VLB's 3D rendering features. Games that do take advantage of the card usually run at a higher resolution (640x400 or 640x480) and with additional graphics detail. However, with these additional rendering features enabled, performance in supported games is less than desirable (unless one has a VLB Pentium system), and reviewers knocked the card for it. Most gamers at the time opted to upgrade to a PCI Pentium system instead. The 3D Blaster VLB was supplanted several months later by its PCI bus counterpart, the 3D Blaster PCI, which used Rendition's Verite 1000 chip and had better performance.&lt;br /&gt;
&lt;br /&gt;
The 3D Blaster VLB is a very rare card nowadays, and when one shows up on auction websites, it usually commands a very high price. The 3D Blaster VLB also supposedly supports Direct3D if the graphics memory upgrade module is installed. However, the memory upgrade module is even more difficult to find than the 3D Blaster VLB itself, and is also very steeply priced. Unlike 3DLabs' other graphics chips and chipsets, the Game GLINT does not support OpenGL.&lt;br /&gt;
&lt;br /&gt;
===Permedia and Permedia NT ===&lt;br /&gt;
&lt;br /&gt;
===Permedia 2 ===&lt;br /&gt;
&lt;br /&gt;
===GLINT R3 ===&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	<entry>
		<id>https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1302</id>
		<title>Intel/Real3D</title>
		<link rel="alternate" type="text/html" href="https://www.vogonswiki.com/index.php?title=Intel/Real3D&amp;diff=1302"/>
				<updated>2013-04-28T05:55:22Z</updated>
		
		<summary type="html">&lt;p&gt;Martin: /* i740 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Cards==&lt;br /&gt;
&lt;br /&gt;
===i740 ===&lt;br /&gt;
Also known as the Intel740 and codenamed &amp;quot;Auburn&amp;quot;, this was Intel's first foray into the discrete graphics market. It was released in January 1998 and was the result of Intel's collaboration with Lockheed Martin's Real3D graphics division. It supports Direct3D and OpenGL ICD. It was designed to take full advantage of the AGP bus, and as such, it can only texture from system memory via the AGP bus. All of its local onboard graphics memory is used purely as a display frame buffer. Therefore, manufacturers were able to produce cards with as little as 2 MB of onboard memory (one such example being Intel's own Express 3D graphics card). The i740 also supports texture map resolutions up to 1024x1024. 3D rendering color depth is limited to 16-bit, but the chip has good dithering quality. Its performance allows it to compete well with NVIDIA's RIVA 128, ATI's Rage Pro and the original 3Dfx Voodoo Graphics chipset, but it falls woefully short of higher-end graphics solutions released around the same timeframe, such as 3Dfx's Voodoo2. Much of the bottleneck results from its sole use of AGP texturing - the i740 has to access textures through a channel that is often much slower than the onboard graphics memory. It was also difficult to port i740 cards to the PCI bus - this often necessitated the use of an AGP-to-PCI bridge chip, resulting in increased card cost. The PCI variants could texture from onboard graphics memory and are faster than their AGP counterparts in some tests. The PCI variants also came with larger amounts of onboard memory - one such card was a PCI version of Real3D's Starfighter, which was manufactured with as much as 24 MB of onboard memory.&lt;br /&gt;
&lt;br /&gt;
The i740 supports Windows 9x and NT 4.0. Drivers for Windows 3.1x were also released; these are heavily based on Chips and Technologies' GUI accelerator drivers (Intel had acquired Chips and Technologies in July 1997).&lt;br /&gt;
&lt;br /&gt;
The i740 has good overall DOS VGA compatibility and performance.&lt;br /&gt;
&lt;br /&gt;
===i752 and i754 ===&lt;br /&gt;
The i752 and i754 are slight evolutions of the i740 architecture and were announced in April 1999. As of 2013, they are also Intel's last graphics chips used on discrete cards. The i752 supports AGP 2x while the i754 supports AGP 4x. New features include support for multitexturing, anisotropic filtering, MPEG-2 motion compensation and DVI displays. Both were cancelled around the time of their release, and very few boards made it out into the wild.&lt;br /&gt;
The i752 and i754 technology lived on as the IGPs in Intel's i810 and i815 motherboard chipsets, respectively. Some of this technology further found its way into Intel's later IGPs, notably the Extreme Graphics and Graphics Media Accelerator (GMA) lines.&lt;/div&gt;</summary>
		<author><name>Martin</name></author>	</entry>

	</feed>