NVIDIA: Too Fast For Us? - Comments, Part I
Author: Torsten Daeges       Date: March 12th 2001
Page: 2

This is part I of the comments - also check out part II.

Paragraphs in blue are my replies to the readers, and the yellow passages are the readers' follow-up responses to those replies.


I think nVidia does have to be way ahead. I mean, how much bump mapping have we seen? Yeah, like none, and it has been available since the Voodoo2.

People are only starting to use bump mapping more widely now, and this is over four-year-old technology. nVidia has to guarantee early availability of features even if they won't be used right away; otherwise we won't even see these pixel shaders being used in four years' time.
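
For readers who have never seen what bump mapping actually computes: instead of using the smooth normal interpolated across a polygon, the card fetches a perturbed normal from a texture before the lighting math runs, so flat geometry picks up bumpy shading. The per-pixel math is just a clamped dot product between that normal and the light direction. A minimal C sketch; the vector values are invented for illustration:

    /* Per-pixel bump-mapped diffuse lighting, reduced to its core.
       The normal comes from a normal map rather than the geometry;
       both vectors below are invented example values. */
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    static float dot(Vec3 a, Vec3 b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    int main(void) {
        Vec3 n = { 0.20f, 0.10f, 0.97f };  /* normal fetched from the map */
        Vec3 l = { 0.00f, 0.60f, 0.80f };  /* normalized light direction  */

        float diffuse = dot(n, l);
        if (diffuse < 0.0f) diffuse = 0.0f;  /* no light from behind */

        printf("diffuse intensity: %.3f\n", diffuse);
        return 0;
    }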

Look how dated PC graphics are when you compare them to a game like Soul Calibur on the Dreamcast or SSX on the PS2 - and that's with GeForce 2 Ultras available these days, for the love of Mike.

We need something hugely overpowered so that it can show off that little gain in graphics quality that will make a PC worth more than a console.

Even if you have a TNT2 these days, there's hardly anything that makes use of it; maybe Serious Sam comes close, but that's about it. Even then, the models are still pretty low-polygon, though at least the special effects are there.

But with hugely overpowered cards hitting the market, we'll definitely be able to see some improvement in graphics.

- Anthony


Hi,

Read your article. The points are valid, and seem to reflect a number of opinions out there.

However, I think your Dolphin analogy is flawed in one important point. Dolphin users might not care if their existing games were faster (they're designed to be fast enough on an N64 anyway, after all), but they may well be more willing to upgrade if the new Gamecube automatically added FSAA to all their old N64 games, and still played them fast enough. Quality boosts to existing games are a valid reason to upgrade.

And there are a couple of other points to consider, even if you're not an early adopter. First, the rapid pace of development is driving these cool features down into the sensibly-priced mainstream that much faster; second, older technology gets cheaper that much faster too. Now that the GF3 is available, Ultras will come down within reach.

- Daniel


You are absolutely right. I agree with all the points you make. I would also like to point out one more reason for not buying a brand-new graphics card (and not only from nVidia, for that matter): drivers. nVidia offers official reference drivers that work with all cards based on their chips, but if you want drivers from the manufacturer of the card, you usually have to wait a couple of months. Take the Gladiac Ultra from ELSA. The last official driver (for WinME) from them is a sort of beta (no VIVO, based on nVidia's 6.67), and it's so buggy that they probably should not have released it in the first place. They will get it right eventually, but the card wasn't worth buying (at the asking price) when it was released. I will probably just wait for the next chip from nVidia (which shouldn't be more than another six months from now) before buying a GeForce3.

Best of luck,

- yellowD


"When buying a brandnew nVidia product, you're either mad, a game developer or don't have to worry about money at all. Competition forces nVidia to release their chips in the moment they leave the labs - but it doesn't force you to buy them instantly."

Well, I bought a GeForce 256 SDR when it first came out. It has lasted me about two years, which I feel is good. *IF* benchmarks show the GeForce3 lives up to its hype, I will buy it when it comes out, hoping that it will last as long. If it does, then even at that price it will cost less than a dollar a day to use - not too bad.
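
David's arithmetic checks out. A quick sketch of it in C, assuming a $500 street price (the figure other readers quote below) and the same two-year lifetime his GeForce 256 managed:

    /* Cost-per-day check for David's figure.  The price and lifetime
       are assumptions taken from the letters, not official numbers. */
    #include <stdio.h>

    int main(void) {
        double price_usd     = 500.0;     /* assumed GeForce3 street price */
        double lifetime_days = 2 * 365;   /* two years of service */

        printf("cost per day: $%.2f\n", price_usd / lifetime_days);
        return 0;
    }

That works out to roughly $0.68 a day - under David's dollar, though only if the card really does last the full two years.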

- David


Torsten, you are right. Only one thing: too fast, or is it too fat for us?

- Fergus


I agree with most of what you wrote, but I think you should consider the perspective of people who don't upgrade every generation.

I currently have a TNT2 and am planning to upgrade my video card within the next three months. Which video card should I get? If I got a GeForce 2, that card would reach the obsolete point much more quickly than a GeForce 3. Also, I won't be upgrading again for a couple of years, so buying the new model will allow me to fully enjoy the games that come out a year from now. If you got a GeForce 2, you'd never get full performance out of your new games, only full performance from your old ones, which you've already beaten. I don't know about you, but I'm unlikely to replay a game I've already beaten just because the framerate is faster.

Anyways, I hope my ramblings made some sort of sense.

Sincerely,

- Giant Space Hamster


I want to say that I agree totally with your article....

If you think about it, T&L still hasn't made a major impact (and it was introduced with the first GeForce!), and it looks as if it doesn't matter much whether it is supported or not.

I do, however, like what the GeForce3 has introduced in terms of realism. Perhaps we will see these GF3 features in games sooner as a result of consumers buying the card now... even if it is useless to them unless they want a higher frame rate in Quake3. (My Voodoo3 was fast enough for me in Quake3. It makes me wish I hadn't bought my GF2.)

In any case, I think that no one "needs" a GF3 for any reason right now, not even for FPS.... Seriously, how much faster does something need to be than what was previously available?

The games do look cool, but I really doubt many (other than maybe a rare few) will support GF3-level realism even when it is possible. If you think about it, many games currently being released have graphics that would run at a good speed with the best detail on a TNT1 (e.g. Heist).

Anyway, have an awesome week!

- Mike


Torsten,

While on the whole I do agree with your comments regarding the GF3, I did want to point out a few things you may not have considered. In particular:

You wrote: "What's the point in getting from enough frames to more than enough frames per second in Quake 3?"

I'd argue: The GF3 does a lot more than just add "more than enough frames." Benchmarks popping up around the web show that this is the first time users can run Q3A at high resolutions (i.e. 1024x768) with high color depth (i.e. 32-bit) and good-quality anti-aliasing (i.e. Quincunx) and still have an enjoyable, playable game. It's really a first, having nothing to do with FPS and everything to do with image quality. While this may not be the goal of some users, it's imaginable that gamers outside the "gotta-have-the-best" crowd will be attracted to the GF3 for reasons other than frame rates.
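
Derek's image-quality point also hints at why earlier cards couldn't pull this off: anti-aliasing multiplies the framebuffer footprint. A rough C sketch of the memory cost for his scenario, assuming Quincunx stores two samples per pixel (it is built on 2x multisampling; the five-tap filter borrows neighbouring samples at resolve time):

    /* Back-of-the-envelope multisample buffer size for 1024x768,
       32-bit color, Quincunx AA.  The 2-samples-per-pixel figure is
       an assumption about how Quincunx is implemented. */
    #include <stdio.h>

    int main(void) {
        long pixels  = 1024L * 768L;
        int  samples = 2;               /* Quincunx = 2x multisampling */
        int  bytes_per_sample = 4 + 4;  /* 32-bit color + 32-bit depth */

        long bytes = pixels * samples * bytes_per_sample;
        printf("multisample buffers: %.1f MB\n", bytes / (1024.0 * 1024.0));
        return 0;
    }

That's twelve megabytes for the multisample buffers alone, before a single texture is loaded - a big bite out of anything less than a 64MB card.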

You wrote: "nVidia seems to be coming closer and closer chasing the technological possibilities, making the manufacturing process more and more expensive."

I'd argue: The only thing that makes a technological process expensive is lack of volume. That's how huge retailers are able to make money: not because they use simpler technological processes, but because they benefit from what is known as economies of scale. Simply put, with sufficient demand, if you pump out a lot of a product to meet that demand, prices will go down. The high intro price has nothing to do with technological processes and everything to do with basic economics.

Because it's new, there won't be that many GF3 cards to go around at first, so demand will outstrip supply as the hardcore purists make a mad dash for any available card (I'm guessing you and I won't be among the dashing ;-). As a result, retailers will charge an arm and a leg, because some schmoe in the mad dash will actually be willing to pay practically any amount. Incidentally, these "pay anything" people seem to be the same losers who like to brag about it later. Anyway, once the supply increases (and we can count on nVidia to start pumping these things out), retailers will have to lower prices to keep sales going; this should happen fairly quickly with the GF3. So yes, prices will be high at first, but no, we don't have to wait for some mysterious technological innovation before they fall. Basic economics will do that all on its own.
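
Derek's economics point fits in one line of arithmetic: unit cost is the fixed development cost spread over volume, plus the marginal cost per chip. A toy illustration in C; all dollar figures are invented:

    /* Economies of scale: unit cost = fixed/volume + marginal.
       Every number here is made up purely for illustration. */
    #include <stdio.h>

    int main(void) {
        double fixed_cost = 50e6;   /* assumed chip development cost */
        double marginal   = 60.0;   /* assumed cost of one more chip */
        long   volumes[]  = { 100000, 1000000, 10000000 };

        for (int i = 0; i < 3; i++)
            printf("%8ld units -> $%7.2f per chip\n",
                   volumes[i], fixed_cost / volumes[i] + marginal);
        return 0;
    }

The fixed cost washes out at volume, which is exactly the point: the price follows supply, not some mysterious process cost.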

You wrote: "Competition forces nVidia to release their chips in the moment they leave the labs - but it doesn't force you to buy them instantly.

I'd argue: The second part is certainly true. However, as for the first part, I doubt the chips are zipping out the door the second they're done. Instead, nVidia has a multistream development process, where products are being worked on at different stages. For example, it wouldn't surprise me to learn that the NV25 is already well along its development timetable. What, you don't think they built the GF3 in six months, or a year, do you? It's been in development, in one way or another, probably since around the time of the GeForce 256!

As for why nVidia releases a new card of some kind every six months (whether it's a major change, like the GF3, or a minor revision, like the GF2 Pro or Ultra), it's not really a competition thing, I think. Honestly, nVidia doesn't need to worry that much about competitors for now; it's got ATI playing catch-up at the moment. Rather, the nifty thing about nVidia's release strategy is that it leaves lots of old product lying around. Normally this is a terrible thing for computer companies, but nVidia has really turned it to their advantage.

When the GF3 releases, it'll probably push the old Riva TNT2 chipset into oblivion. That chipset is still seeing a lot of use in OEM systems. Post-GF3, nVidia will start selling old GeForce 256 chips, or GF2 MX chips, to its system integrator partners (such as Dell) instead of Riva TNT2s. Incidentally, this system integrator market is where graphics chip companies really make the big bucks. If you thought that retail sales put nVidia where they are, think again; Dell and other companies who build systems with nVidia-based cards are mainly responsible for that.

Now, while the GeForce 256 moves into prebuilt systems, it gets dumped off retail shelves in favor of all the flavors of GF2. But why have so many different flavors of GF1 and GF2, you ask? Well, all these different flavors allow nVidia to get a slightly better profit from retail consumers; for example, customers who wouldn't buy an Ultra, but would buy a GTS, might be willing to shell out some extra bucks for a Pro. If the Pro didn't exist, nVidia would lose the difference. You can do this little mental exercise at almost any point on their product line. All this adds up to a sort of trickle-down strategy that allows nVidia to squeeze every last bit of revenue from every product they've developed, while maximizing the money they get from the consumer. That's the real reason for the six-month cycle.
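
The flavor argument is classic price segmentation, and it's easy to make concrete. A toy C calculation with invented buyer counts and prices, comparing revenue with and without a middle price point:

    /* A toy version of the GTS/Pro/Ultra ladder Derek describes.
       Three buyer groups with different willingness to pay; every
       number below is invented purely for illustration. */
    #include <stdio.h>

    int main(void) {
        /* buyers willing to pay up to $250, $350, and $500 each */
        double budget = 100000, midrange = 40000, highend = 10000;

        /* Two products (GTS $250, Ultra $500): the midrange group
           buys the GTS, leaving $100 per buyer on the table. */
        double two_tier = budget * 250 + midrange * 250 + highend * 500;

        /* Three products: a Pro at $350 captures the midrange group. */
        double three_tier = budget * 250 + midrange * 350 + highend * 500;

        printf("two price points:   $%.0f\n", two_tier);
        printf("three price points: $%.0f\n", three_tier);
        return 0;
    }

With these invented numbers the Pro tier adds $4 million in revenue - exactly the per-buyer difference Derek says nVidia would otherwise lose.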

My bottom line here is that perhaps you should stop looking at this from a technical perspective, or from a common-sense perspective. In either light, nVidia's strategy makes no sense. From a business perspective, however, what nVidia is doing is not only pretty smart, it's near-genius. I actually did a case study of the graphics card market for an accounting course, comparing nVidia, ATI, and 3dfx (before they croaked). And let me tell you, the proof is in the pudding: careful examination of nVidia's financial position shows that they are doing very, very well. And the reason they're doing so well is that they've been consistently following a very smart business strategy that works for this market at this time. Things might change in the future, but for now they're on top, and on top for a reason.

Sorry for the utterly huge email, but I've been thinking about these things for a while, and the whole brouhaha about the GF3 kinda brought it to a head. Enjoy the read, and feel free to share my arguments with anyone you think might be interested. Also, feel free to reply with any comments, criticism, counter-arguments, etc.

- Derek


I feel the same way. People are always asking me, "What should I buy?" I tell them a GeForce 2, an MX, or even a Radeon, depending on the person I'm talking to (does he play a lot or just every now and then, and what else is he using the machine for?). Well, lately other people standing around chime in with "I'd wait and get the GeForce3."

I look at them as if to say "FOR WHAT!!!" - but now, thanks to you, I have a better way of wording it.

Thanks for the clever read

- Tony (Systems Engineer)


I agree. I bought a Voodoo2 for $250 and a GeForce 256 for about $250, but $500-600 for a GeForce3? That is way too much.

I will get one when Doom 3 comes out, hopefully for $250.

- wysocki


I'm right with you on the points you made. I just bought a GeForce2 Ultra card a couple of months ago, and now I'm set for a couple of years, I hope. I'm a semi frame freak, but I have realized you can't keep up with the video companies' R&D.

I said two years of being set on a video card because I bought a TNT2 Ultra and still use it in my second system, where it will probably stay until it dies. It played Q3 just fine and was truly snappy enough for everything, but I opted for a new video card to have the best at the time.

You can probably say the same thing about CPUs - that's my opinion on this subject, and I have been around since Vic-20 days. =) Up until about a year ago the industry truly needed faster CPUs, but we're at the point where there is too much CPU power for the common user - "a.k.a. killing a spider with a baseball bat." Most common computer users out there are normal everyday people browsing the web, emailing, and so on, unlike die-hard gamers looking for their next fix. I'm tired of hearing TV commercials tell every Tom, Dick, and Harry out there that they need a damn 1GHz+ machine to browse the web and do everyday tasks that a 400 to 600MHz system can handle. I see the need for faster and faster CPUs, don't get me wrong, but today it's like you see the same old thing over and over.

Well, that's enough ranting and raving, heheh. It's time everybody took a step back and thought before listening to TV hype or the next press release.

Thanks for taking the time to read these comments.

- Robert (Computer Tech)


Interesting comments, but I think you're missing the point. First of all, you can't actually go out and buy a product that will give you 10x the power of a GeForce 2 Ultra right now. It definitely won't work with a PC, if it even exists. High-end workstation cards only really have two advantages over gaming cards at this point: superior precision, and separate texture/vertex memory.

Maybe. But I figure that, with enough money, anything goes.

Second of all, what do you care if they come out with an expensive product? Honestly, that's like saying "screw Ferrari, wtf are they doing, no one can afford their car." Obviously nVidia has been very successful, and economics will dictate whether their plan pays off.

I acknowledge that nVidia has been doing and is doing a great job - and it shows in their success.

The article also isn't meant to rant about expensive products in general. Look at it this way: if you buy a Ferrari, you get a very fast and beautiful red car which costs a lot of money, but you'll have everything you paid for. All the article is trying to say is that there is NO need to buy a GeForce 3 now, since you won't get anything more out of it than if you bought an older card.

There is (almost) no additional value to it.

But look at it from another point of view. Basically you want the hardware market to stop evolving until the software market catches up to it. This is a surefire plan to halt PC evolution, for if no one produces a GeForce 3, then who will write software to take advantage of it? No one will! Besides, the GeForce 3 is in effect a toned-down Xbox core, so it wasn't even really meant for the PC market anyway. It's like Carmack said: every graphics programmer should get one; everyone else, only if they want the luxury of all that power. Besides, why should the high end of PC graphics cards be tied down to the average consumer? Like you said, that's what the economy boards are for. And if you think that people are stupid enough to pay extra for something you can't afford, why don't you go visit some of the underdeveloped third-world nations and then call yourself a hypocrite.

Sorry if my tone is acidic, but I felt that the article was more of a jealousy-inspired rant than a truly objective view of the technology.

No problem. I'm not jealous of people who get a GF3 now. The point is that there simply IS nothing to be jealous of yet: as I said in the editorial, even more frames in Quake (or any other current game) don't really matter - and the additional features won't be used for some time.

I'll get one when it's cheaper and the games start to use it. Until then, I'll save my bucks for the next Ferrari. ;-)

- Alan


(As read through a link from [H]ardOCP, and keeping in mind of course that your article about the GeForce3 is an editorial, not cold hard facts)

You have quite a few valid points in this editorial, mainly centered around the "who needs that kind of graphics processing power" point of view, which raises the question: what does nVidia hope to gain from releasing the GeForce3 at this point in time? You brought up the comparison of the Nintendo Gamecube being released early and letting users play current N64 games much faster... ultimately ludicrous, as the current N64 is quite enough. Touché, but also consider the following: it's been announced that one of VisionTek's GeForce3-based video cards will come bundled with Giants: Citizen Kabuto, a highly acclaimed PC game which has been available for a few months now. Giants arguably touts the most graphically intense gaming experience ever produced for PCs, a statement which had me running an experiment of my own after reading your editorial. Playing the first hour of Giants: Citizen Kabuto, here were my results.

Test System 1 (WinME):

  • AMD Duron 800MHz@1GHz on an ABit KT7-RAID
  • 512MB Infineon PC133 SDRAM, Maxtor 20GB 7200RPM ATA-100
  • ASUS V7100 AGP4X 32MB SDRAM (175MHz/166MHz GPU/Mem)

Result: averaging 16-17 FPS @ 640x480, all details on med-low.

Test System 2 (WinME):

  • AMD Duron 750MHz@945MHz on an ABit KT7A-RAID
  • 512MB Infineon PC133 SDRAM, 2x WD 20GB 7200RPM ATA-100 in RAID 0
  • ASUS GF2MX V7100 AGP4X 32MB SDRAM (205MHz/210MHz GPU/Mem)

(Note the ASUS V7100 almost on par with stock ASUS V7700 GF2 GTS specs!)

Result: averaging 19 FPS @ 640x480, all details on med-low.

Now, these results may not be perfectly representative, but they're the ones I've got to show. While playing the game on either test bed, the FPS would drop as low as 12 FPS on #1 and 15 FPS on #2, making the game hardly playable and that much less enjoyable - quite a pity, given the extraordinary job done on almost every aspect of the game. And this with quite a solid system, and a very decent video card backing it up! A GeForce2 Ultra would probably have made playing the game at 800x600, medium detail, a nice experience (24-30 FPS), but I wouldn't hold my breath; further testing would be required (anyone want to donate a GF2 Ultra? *laugh*).

The sheer computational power required by Giants is mind-blowing when pitted against today's other hit games, such as Counter-Strike. Speaking of which, Test System #2 will run Counter-Strike at 1280x960, all details at max, with over 40 FPS. So let's review: for CS, a GeForce3 would be utterly useless - I couldn't agree more with you. For Giants, I wouldn't be so sure. In the coming year the PC gaming industry will progress quasi-exponentially, churning out games on par with Giants, which is available today and already bringing gaming rigs to their knees around the world! When the standard for games becomes driven by a game such as Giants, a GeForce2 GTS will be the minimum required, and a GeForce3 won't even be maxing out the games' capabilities yet.
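
For anyone wondering where an "average FPS" figure like the ones above comes from: count the frames rendered over a time window and divide by the elapsed seconds. A minimal C sketch, with a busy-loop standing in for the real renderer (a real game would time against the wall clock rather than clock()'s CPU time):

    /* Average-FPS measurement, reduced to its essentials. */
    #include <stdio.h>
    #include <time.h>

    static void render_frame(void) {            /* stand-in workload */
        volatile double x = 0.0;
        for (long i = 0; i < 2000000; i++) x += 1.0 / (i + 1);
    }

    int main(void) {
        clock_t start  = clock();
        long    frames = 0;

        /* run for roughly two seconds of CPU time */
        while ((clock() - start) < 2 * CLOCKS_PER_SEC) {
            render_frame();
            frames++;
        }

        double seconds = (double)(clock() - start) / CLOCKS_PER_SEC;
        printf("average: %.1f FPS over %.2f s\n", frames / seconds, seconds);
        return 0;
    }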

Moving on, a quick blurb about nVidia's release of the GeForce3 from a marketing point of view. Note that over the past 18 months, GeForce2-based video cards have been dominating the PC market, further penetrating the name-brand PC market. Dell, for example, is now almost exclusively carrying GeForce2-based video cards in its desktop systems (and has been for at least four months), and is quickly moving to incorporate the GeForce2 Go into its laptop systems as well! More companies are moving to do the same (Gateway, for one), not to mention the so-called "clone" computer industry. Releasing the GeForce3 at this point in time will significantly reduce GeForce2 GTS and Ultra prices, making those cards even more accessible to almost everyone! Think of the impending market domination, given that ATI isn't primed to launch a counter-attack anytime soon (even if Xperts and Radeons are doing well), 3dfx is no more, and Diamond/S3 (SonicBlue?) aren't in the running anymore. This could mean gargantuan things for nVidia in the coming year!

Lastly, I'd like to bring up one more point: I'm sure you know this, but the GeForce3 isn't aimed at the average user. No wonder it'll carry a $600-800 USD price tag - but fanatically hardcore gamers will want the latest gadget, which, provided they keep on top of the latest gaming software, will give them quite the run for the mountains of money spent on it. There is, however, another market which will benefit hugely from the GeForce3's impending release: professional 3D animation. Most respected mid- or high-end 3D animation/editing studios are arguably very wealthy, but spending several thousand dollars on Diamond FireGL 2s or 3s, or on SGI boxes, isn't exactly in the cards for all of them. Even the advent of GeForce2-based cards has _significantly_ reduced the performance spread between themselves and professional video cards such as Diamond's FireGL and SGI systems, at roughly 1/10th the cost! As an aspiring 3D background modeller/texture artist speaking from experience, having the added graphics processing power of a GeForce3 backing me up could mean meeting a deadline and turning in a noticeably nicer project than with a weaker video card, simply because I could navigate a shaded/textured viewport smoothly instead of working in wireframe, switching to the shaded/textured viewport, and waiting for the GPU to show me the scene... Moreover, with the professional 3D animation field slowly moving towards NT solutions as opposed to SGI or custom systems, the GeForce3 will undoubtedly find itself a rather comfortable niche, as the GeForce2 already has over the past year.

nVidia isn't the only one with a hell of a lot to gain from releasing the GeForce3 at this point in time; the gaming and professional multimedia communities stand to gain as well. I'm with you 100% when you say "You can always buy a [GeForce2] MX!", but ONLY as far as the average user is concerned, and that's NOT who nVidia is targeting with the GeForce3. They're trying to stay one step ahead of the competition, and by releasing the GeForce3 now, that's exactly what they'll do - not to mention that the GeForce2 will still be around for anyone who feels they don't need a GeForce3, further boosting nVidia's profit margin. And why not? The GeForce2 chipset is great, both in terms of performance and price. What it comes down to, once again, is that the GeForce3 isn't for everyone - I agree - but that's up to each and every one of us to decide. Many people WILL actually benefit from buying a GeForce3, and I'm not talking about "people frames-addicted enough to buy the new products as they hit the street."

Finally, I'd like to thank you for letting me play "Devil's Advocate" this one time, and for composing this GeForce3 editorial in the first place. Bringing up the issue had to be done, because it's a more than legitimate one - not to mention that it has huge potential for discussion, as you can plainly see here! Cheers, and keep us thinking about the "latest & greatest",

- Wyldchild

P.S. If there are indeed any technical or verbal inaccuracies in this text, please don't hesitate to let me know; discussion is the fuel for learning, and I more than welcome it! Cheers.

