 May 11, 2005 - 02:00 PM - by Michael
* Competitive Incompetence?

Cooltechzone's got an interesting op-ed today on competitive incompetence, i.e. when video board makers leave a feature out and then declare it unnecessary.

It's an interesting take, too; of course, the last time I saw a maker WRONGLY state that something wasn't necessary, it was when 3dfx ignored 32-bit color.

From the piece:

"Now that I am done presenting ATI as an example, let's take a look at Intel. Intel has done the exact same thing with its 64-bit Prescott microprocessors that ATI did with SM 3.0. Right when AMD launched its 64-bit chips, Intel was the first one to say that 64-bit instructions were unnecessary. So far so good, right? I would've completely agreed with them at that point, except that AMD's 64-bit chips running in 32-bit mode performed exceptionally well, so I felt that in 64-bit mode we might see some impressive gains. I actually favored Intel's position at the time because their arguments did make sense. After all, 64-bit chips will truly show their potential when software catches up to the hardware. Only then can you enjoy the fruits of 64-bit. I suppose you can say that I was agreeing with both companies from their respective point of views."
Future-proofing is good, but only if users can be expected to still be using the product when that future finally arrives.
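
(A quick illustrative aside, not from the op-ed: the "does this chip do 64-bit?" question is something software can ask the CPU directly. Here's a minimal C++ sketch using GCC/Clang's <cpuid.h> helper; CPUID extended leaf 0x80000001 reports AMD64/EM64T "long mode" capability in EDX bit 29.)

    #include <cpuid.h>   // GCC/Clang helper around the x86 CPUID instruction
    #include <cstdio>

    int main() {
        unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

        // Query extended leaf 0x80000001; __get_cpuid returns 0 if the
        // leaf isn't supported on this processor.
        if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
            std::printf("No extended CPUID info; 64-bit mode not available.\n");
            return 1;
        }

        // EDX bit 29 is the "long mode" (LM) flag, set on 64-bit-capable chips.
        const bool longMode = (edx >> 29) & 1u;
        std::printf("64-bit (long mode) capable: %s\n", longMode ? "yes" : "no");
        return 0;
    }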

If ATi slips 3.0 shaders into their next line of cards (and it's entirely possible they will), then nobody'll have to worry about it.
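
(And for anyone wondering how a game finds out whether those 3.0 shaders are actually there: it just asks the Direct3D 9 caps. A minimal C++ sketch, assuming a Windows box with the DirectX 9 SDK and linking against d3d9.lib; again, an illustration, not anything from the article:)

    #include <d3d9.h>    // Direct3D 9 core API (link with d3d9.lib)
    #include <cstdio>

    int main() {
        // Create the Direct3D 9 object for the runtime installed on this box.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) {
            std::printf("Direct3D 9 runtime not available.\n");
            return 1;
        }

        // Pull the hardware (HAL) device caps for the primary adapter.
        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            const bool sm3 = caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
                             caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
            std::printf("Shader Model 3.0 support: %s\n", sm3 ? "yes" : "no");
        }

        d3d->Release();
        return 0;
    }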
 
