Talk:NVIDIA
I added a picture of a GeForce 3, but I'm not happy with how the picture interferes with the written part of the article. I tried adding two more lines to the bottom of the GF3 and GF4 sections to make it fit, but the extra lines won't show up on the page itself. Also, the image is a bit larger than the rest of the pictures. How do I make the thumbnail smaller? (Edit: Fixed it!)
Also, shouldn't Sera Saturn be Sega Saturn?
The GF4 MX 460 is missing from the GF2 section. Also, the performance of the GF4 MX 460 falls between a GF3 Ti200 and a Ti500, at least according to these benchmarks: Anandtech GF2 GF3 GF4 benches. So the "significant win" mentioned in the GF3 section is not accurate. It should also be considered that the GF4 MX series has LMA2, while the GF3 only has LMA1. Enigma (talk) 16:12, 23 February 2013 (EST)
- The GF4 MX460 was only sent to reviewers; they didn't release it. I believe the Ti 4200 was used for that segment instead. The MX420 was omitted because I wasn't sure where it fit in at the time. As for LMA 1 vs. 2, I'm not sure what the differences are beyond the marketing designation, so I left it vague. Swaaye (talk) 16:42, 23 February 2013 (EST)
The fact that boards from some manufacturers show blurry images due to low-quality analog circuitry is true for a lot of chipsets, not just NVIDIA. I would recommend either removing this or moving it to a separate article on known issues with certain card series. Personally, I have used a lot of early cards with NVIDIA chipsets and never had blurry output issues. Still, I would not generalize this to all cards. Enigma (talk) 12:14, 5 March 2013 (EST)
- The problem was widespread enough that people came up with modifications to address it, particularly with the GeForce 2, but in my experience it runs all the way back to the NV1. Yes, some cards are crystal clear, but some are incredibly poor as well. I'm surprised you've never seen a blurry NV card. I believe that with the GeForce 4, NVIDIA established a requirement for analog signal quality. Swaaye (talk) 07:21, 6 March 2013 (EST)
- For example: http://web.archive.org/web/20021016213841/http://www.geocities.com/porotuner/imagequality.html Swaaye (talk) 07:30, 6 March 2013 (EST)
The GF4 MX did not have considerably higher clock speeds than the later GF2s it replaced. For example, the GF2 Ti ran at 250/400 (core/memory, in MHz) and the GF4 MX440 at 275/400, while the GF4 MX420 ran at just 250/166. Enigma (talk)
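To put those clock figures in perspective, here is a minimal memory-bandwidth sketch in Python. It assumes a 128-bit memory bus on all three boards, that the 400 figures are DDR effective rates, and that the MX420's 166 MHz is plain SDR; treat it as a back-of-the-envelope illustration rather than board-exact specs.

    # Rough theoretical memory bandwidth: effective clock (MHz) x bus width (bits) / 8.
    # Clock figures come from the comment above; the 128-bit bus is an assumption.
    cards = {
        "GeForce2 Ti":    (400, 128),  # 400 = DDR effective rate
        "GeForce4 MX440": (400, 128),  # same effective rate as the GF2 Ti
        "GeForce4 MX420": (166, 128),  # SDR, so 166 MHz is already effective
    }

    for name, (eff_mhz, bus_bits) in cards.items():
        gb_per_s = eff_mhz * 1e6 * bus_bits / 8 / 1e9
        print(f"{name}: {gb_per_s:.1f} GB/s")

By that rough measure the MX440 offers no bandwidth gain over the GF2 Ti, and the MX420 sits far below both, which supports the point above.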
It is also wrong that the GF3 wins considerably against a later GF2 like the Ti or Ultra. At the time there were no Shader 1.4 games around, and for DX7 workloads its performance is nearly identical to the last GF2s. This, by the way, is also a reason the GF3 sold rather slowly in the beginning (along with its high price); the momentum for DX8 cards only came with the GF4 Ti4200. As an indication: the GF3 has the same number of pipelines as the GF2, but the later GF2 cores are clocked at 250 MHz, while the GF3 started at 200 MHz. So the focus should be much more on the new MSAA, the upgrade to 8x AF, and a bit on the pixel shaders. In retrospect, the pixel shader performance of the GF3 is still so low that it can hardly handle scenes with per-pixel lighting, so that is a feature where you are better off going for a GF4 Ti. Enigma (talk) 04:05, 8 March 2013 (EST)
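The pipeline argument above can be made concrete with a quick theoretical fill-rate calculation. This is a sketch assuming 4 pixel pipelines on both chips (as the comment states, the counts match) and the quoted core clocks:

    # Theoretical pixel fill rate = pixel pipelines x core clock (MHz),
    # giving megapixels per second. Counts and clocks are from the comment above.
    def fill_rate_mpixels(pipelines: int, core_mhz: int) -> int:
        return pipelines * core_mhz

    print("GeForce2 Ti @ 250 MHz:", fill_rate_mpixels(4, 250), "Mpixel/s")  # 1000
    print("GeForce3   @ 200 MHz:", fill_rate_mpixels(4, 200), "Mpixel/s")  # 800

On raw throughput the late GF2s come out ahead, so any GF3 win has to come from efficiency features rather than brute force, which is what the reply below points to.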
- Some elaboration was needed, yes. The GeForce 3 can dramatically win against the GeForce 2 in some cases, mainly when there is overdraw that its HSR can clean up, or when comparing its MSAA to the GeForce2's SSAA. But yes, it does lose in some cases too.
- The GeForce3 only supports PS 1.1.
- When I said the NV17 had significantly improved clock speeds, I was comparing it to the NV11. Swaaye (talk) 06:26, 8 March 2013 (EST)
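The MSAA-versus-SSAA comparison comes down to shading cost: 4x supersampling runs the full pipeline for four samples per pixel, while 4x multisampling shades each pixel once and only stores extra coverage/depth samples. A toy Python illustration (the resolution is an arbitrary choice):

    # Shading work per frame at an assumed 1024x768, ignoring overdraw.
    width, height, samples = 1024, 768, 4

    ssaa_shaded = width * height * samples  # SSAA: every sample is fully shaded
    msaa_shaded = width * height            # MSAA: one shading pass per pixel

    print(f"4x SSAA: {ssaa_shaded:,} shaded samples per frame")  # 3,145,728
    print(f"4x MSAA: {msaa_shaded:,} shaded pixels per frame")   # 786,432

That factor-of-four gap in shading work is why the GeForce 3's MSAA can look like a dramatic win over GeForce2-style SSAA even when the raw specs are close.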
Drivers
I think there should be a section that details the drivers NVIDIA released, in chronological order, since they released quite a lot of them before ending support. Besides listing the releases, I bet older drivers have some hidden advantages over the newest drivers when playing old games, so a guide for that would be nice. --Silikone (talk) 09:24, 29 March 2013 (EST)
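Underneath, such a section is just a dated list that wants to stay sorted. Here is a minimal sketch of that bookkeeping; the version strings and dates below are placeholders, not an actual release history, which would need to be researched:

    from datetime import date

    # Placeholder entries only: Detonator and ForceWare were NVIDIA's driver
    # brand names, but the version numbers and dates here are made up.
    releases = [
        ("ForceWare XX.XX", date(2004, 1, 1), "placeholder"),
        ("Detonator X.XX", date(2001, 1, 1), "placeholder"),
    ]

    # Oldest first, ready to transcribe into a chronological table.
    for version, released, note in sorted(releases, key=lambda r: r[1]):
        print(f"{released.isoformat()}  {version}  ({note})")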