Saving the honor of the G400/G450

Author’s note: My actual first-hand experiences are based upon a Matrox G400 Max and a GeForce 2 GTS. Although these are not the G450 and the GeForce 2 MX, according to the specifications the scales tip even further in favor of the G450 against the MX than in the case of the G400 against the GF2 GTS. That means that, if anything, the situation is even more extreme than sketched here.

You can see the Matrox G450 specs here and the nVidia GeForce 2 MX specs here.

 

Matrox dead?

Back in July, when I first saw the technical specifications of the new GeForce 2 MX chip on nVidia’s website, I said to myself: “Matrox is dead. The TwinView of the GF2 MX can do something the G400’s DualHead can’t, namely sending digital output to 2 LCD monitors at the same time. And the nVidia chip’s 3D is way more powerful as well”.

After this first reaction, the sobering-up came quickly. I guess there’s simply no limit to how many times someone can fall for hype – even a seasoned unbeliever like me. The amount of false info in the case of the MX chip was amazing. Within two months it became clear to me that, hype aside, the MX is no match for the G400/450 if you’re after something more than raw fps in Q3A.

 

TwinView better than DualHead? Not exactly.

Let’s begin with the fact that the card shipped for many weeks with drivers in which TwinView didn’t function at all. It reminds me of the S3/Diamond Viper II, which was also hyped as a 3D card with an integrated GPU (T&L), yet from release on, for several months, all drivers had the T&L disabled… it sounds to me like buying the 2 MB version of the Xeon CPU at an insane price tag, only to find the L2 cache disabled.

Even with the new drivers in which TwinView is enabled, it’s still a far cry from DualHead. While the DualHead of the G450 offers you a full-featured secondary display, the TwinView of the GeForce 2 MX limits the secondary monitor output to 85 Hz, thus a) crippling the possibilities of high-end monitors, b) severely limiting the resolution choices on older/cheaper monitors and c) making the whole hype about dual LCD monitors questionable, as maybe 1 in 10 LCD monitors is capable of 85 Hz.

And that’s not all: when you use TV out on the secondary connector, your primary VGA output is limited to a mere 800×600 – whereas with the Matrox G400/450 DVD Max, you can use up to 2048×1536 on the primary monitor and watch the DVD movie fullscreen on the secondary output (be it a monitor or a TV).

Another thing that might sound unimportant at first is that the TwinView of the MX works only with the newest Detonator 3 drivers (see further below for what that really means).

Update November 2000: For a couple of weeks now, there have been GeForce drivers out that solve most of the problems you can read about on this page. But that doesn’t change the fact that these problems existed for nearly 2 months, which is a long time in the world of 3D graphics.

 

3D performance crown? The Detonator 3 fiasco.

According to the marketing hype, the GeForce 2 MX should sweep the floor with all low-end and mid-range 3D competition, including the whole Matrox product range. Real-world experience shows otherwise, however.

The Detonator 3 drivers came out around mid-August and promised a 50% performance increase. While games like Q3A and Unreal Tournament did see a 25%-30% increase (certainly not 50%), a lot of other games suffered much larger performance decreases. Games like Outlaws (which already ran fine on a 3Dfx Voodoo1) started to get choppy, Unreal lost about 15% of its speed, and Requiem: Avenging Angel became unplayably choppy at 1024×768 without FSAA, whereas with the previous drivers it was perfectly fluid at 1024×768 with FSAA (which requires more than twice the 3D power). Not to mention severe visual errors in Q3A itself (see the next chapter).

I can’t help but ask: what kind of drivers did they give us? I just checked today, and the Detonator 3 v. 6.18 drivers from mid-August are still the latest ones. That means these drivers, which are incapable of running a lot of the games out there, are the only ones you can use for your GeForce 2 MX unless you are ready to ditch all the TwinView capabilities… here’s your 2 MB Xeon with the disabled L2 cache.

And in case you didn’t realize it, that means a big chunk of the 3D games out there currently runs better on a G400/450 than on a member of the GeForce family (provided you use the latest drivers). I can’t believe nVidia seems to be getting away with it.

 

Reviewers seem to be playing the same two games all the time

But they do. I haven’t seen a 3D card review that puts more than 2-3 games to the test. And that’s misleading. While the Detonator 3 drivers run Q3A just fine (except for those visual defects) and are superb for playing Unreal Tournament as well, Drakan gets choppy already at low resolutions (whereas a G400 handles it up to 1280×960 just fine), and when using FSAA, I can’t save in Half-Life without the game crashing. Add the games mentioned in the previous chapter and you have a large percentage of the bestselling games of the last couple of years – and the list would obviously be longer if someone took the time to methodically stress-test dozens of games for basic functionality.

This is how the visual defects with the new nVidia drivers look in Q3A. The artifacts move and flicker as well.

 

2D image quality: no surprises here

I’ve been using Matrox cards exclusively in my PCs for the last 4 years, so I didn’t realize until a short time ago, when I bought my Elsa Gladiac (GeForce 2 GTS), how much the 2D image quality of different cards can vary. For your information, I used the same 21″ SONY F500R monitor with the highest quality cables for all tests, at the same resolutions and color depths, with refresh rates between 100 and 160 Hz.

When I first installed the Gladiac, my very first impression was how impossibly blurry the image was. Indeed, it didn’t reach the sharpness of my 3-year-old Matrox Mystique 220 – which isn’t even connected directly to the monitor but goes through an Orchid Righteous (3Dfx Voodoo) via a loopthrough cable. And the G400 (even the G200 already) is a long way above that quality level. This may not matter when playing Q3A, but it certainly does when surfing the web or typing a letter (or doing something even more productive like DTP or imaging). If I had to limit myself to one PC with one graphics card, I’d choose the poorer 3D speed, because the alternative is damaging my eyesight.

 

3D image quality: now there’s a surprise

And that’s not all. Most reviewers seem to assume that, e.g., 640x480x32 looks the same on any and all 3D cards, and thus that the card delivering the highest frame rates at that setting is necessarily the best. I happened to play Q3A for weeks on a Matrox G400 before acquiring a GeForce 2 GTS for higher frame rates. The very first time I started the game on the GeForce 2 GTS, I was stunned by how ugly it suddenly looked – with the same settings as before.

The main reason for this is nVidia’s faulty S3TC implementation in the GeForce drivers, which makes the game a lot less fun to play. As you can see here, using S3TC doesn’t necessarily lower quality to a noticeable degree (it looks quite good on an S3 Savage card). But nVidia took a “shortcut” (most probably to increase the frame rate and thus be able to claim the fastest 3D card out there). Indeed, the 3D image quality of a GeForce 2 with all Q3A quality settings at maximum and S3TC enabled is worse than that of a G400 with 16bit rendering, 16bit textures and bilinear filtering (instead of trilinear). Click on these 4 thumbnails to see what I mean.

640x480x32, highest quality settings, GeForce 2 with S3TC enabled
640x480x16, bilinear filtering and 16bit textures, G400
1024x768x32, highest quality settings, GeForce 2 with S3TC enabled
1024x768x16, bilinear filtering and 16bit textures, G400
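
The explanation most widely circulated at the time – and it is an assumption on my part, not something nVidia has confirmed – is that the GeForce decompresses the DXT1 (S3TC) blocks at only 16-bit color precision instead of expanding the endpoint colors to 24 bits first. The little sketch below merely illustrates why that kind of shortcut coarsens the interpolated colors; it is not a description of the actual hardware, and the endpoint values are made up.

# Illustration only (Python): why blending DXT1 endpoints at 16-bit (RGB565)
# precision gives coarser in-between colors than blending at 8 bits per channel.
# The endpoint values are hypothetical; this is not the GeForce hardware path.
def expand565(c):
    # Expand a 16-bit RGB565 value to 8 bits per channel (replicating high bits).
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def blend_at_8bit(c0, c1):
    # The "good" path: expand first, then take the 2/3 : 1/3 DXT1 blend.
    a, b = expand565(c0), expand565(c1)
    return tuple((2 * x + y + 1) // 3 for x, y in zip(a, b))

def blend_at_565(c0, c1):
    # The shortcut: blend each 5/6/5-bit channel first, then expand.
    # The extra rounding at this step is what would produce visible banding
    # in smooth gradients such as the Q3A sky.
    out = 0
    for shift, mask in ((11, 0x1F), (5, 0x3F), (0, 0x1F)):
        out |= ((2 * ((c0 >> shift) & mask) + ((c1 >> shift) & mask) + 1) // 3) << shift
    return expand565(out)

c0, c1 = 0x4ABF, 0x39FE   # two nearby sky-like blues as DXT1 endpoints
print("blend at 8 bits:", blend_at_8bit(c0, c1))   # (68, 77, 252)
print("blend at 565:   ", blend_at_565(c0, c1))    # (66, 77, 255)

Whether this is exactly what the drivers do is guesswork on my part; what matters for the comparison is simply that the GeForce’s S3TC output looks worse, as the screenshots show.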

Performance comparison – this time trust your eyes instead of the specs on paper

Since most (maybe all) benchmarkers leave S3TC enabled when benchmarking GeForce family cards (some of them have openly stated they find the lower image quality worth it, as S3TC already gives you some 20% performance boost at 1024×768), I decided to take the same liberty and compare the G400 at a similarly reduced image quality: hence the 16bit rendering and textures as well as the bilinear filtering (see the screenshots in the previous paragraph).

As I had no GeForce 2 MX at hand, I benchmarked a GeForce 2 GTS and then reduced the fps results by the same percentage by which the MX trailed a GeForce DDR in an MX review posted on Tom’s Hardware Guide. As the GeForce 2 GTS is somewhat faster than the GeForce DDR, and the G400 is faster than the G450 by a similar percentage (because of the different memory interface), I’m convinced these results hold, in essence, for any G450/MX comparison.
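
To make the scaling arithmetic concrete, here is a minimal sketch; the numbers in it are placeholders rather than my measurements, and the real MX/DDR ratio comes from the Tom’s Hardware review.

# Sketch of the estimation method described above (Python). Numbers are placeholders.
def estimate_mx_fps(gts_fps, mx_to_ddr_ratio):
    # Scale a measured GeForce 2 GTS score by the MX-vs-GeForce-DDR ratio
    # taken from a published MX review to approximate an MX score.
    return gts_fps * mx_to_ddr_ratio

# Example: a hypothetical 90 fps on the GTS and an MX that reached ~75%
# of a GeForce DDR's score would give an estimated 67.5 fps for the MX.
print(estimate_mx_fps(90.0, 0.75))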

3D performance comparison chart

As you see, by comparing different cards at the same 3D image quality level, the GeForce 2 MX doesn’t look that cool any more. At 69:68 and 44:46 fps, I’d call it even. And even for this equality of relative performance, the MX needed specialized drivers that won’t work with a lot of the games out there. Go back to the Detonator 2 drivers to be able to play all the games on the market and you’ll lose the TwinView capabilities and get some 20% lower fps results, putting the GeForce 2 MX behind the G450 in 3D speed – provided you benchmark the cards head-to-head with the graphics quality, not the paper specs, set equal as the basis of comparison.

 

Conclusion: nothing is black & white

These results don’t mean, however, that the G400/G450 is a good 3D gaming solution. I just wanted to show you that a cheapo card from nVidia isn’t necessarily better in every way just because it carries the word “GeForce” in its name. While the GeForce 2 GTS is definitely one of the very fastest 3D cards on the market right now, the GeForce 2 MX is a severely crippled version, limited by its much lower memory bandwidth – just like the G400/G450. I have a feeling that if Matrox equipped a model of the G400 series with 128bit 150 or 166 MHz DDR SDRAM (as in the GeForce DDR and the GeForce 2 GTS), it would score, if not right up there with the DDR-equipped GeForce cards, at least high enough to beat a GeForce SDR or a GeForce 2 MX any time.

Of course, this is only speculation. What is fact, however, is that the G400/G450 can hold its own against its only dual-output competition, the GeForce 2 MX. Not even the 3D speed of the MX is convincing, and its image quality, partial game incompatibility (or no dual output at all) and half-baked TwinView features make it no real choice for anything other than pure 3D gaming. And even for that, I’d rather opt for a TNT2 Pro for less money and more balanced fps across games, or for a GeForce 2 GTS (Ultra) with its insane 3D power.
