Surf Nazis Must Die!
ATI driver tweaks, Lotus eaters, and your grudges against French judges
Letters: Did we receive any letters this week that weren't about Petabytes, Pebibytes or Troglobytes?
Hell, yes. Much hot debate about NForce. Either PC part of the year or an outdated POS? Even more thoughtful responses about ATI tweaking its drivers for Quake. And a cry for help from a Lotus user. Read on...
US judge's Nazi Net ruling turns worldwide law on its head
It seems that you don't understand part of what makes America the greatest country in the history of the planet: the First Amendment to the US Constitution. No fucking socialist French judge is going to undermine that, and that is what Judge Fogel's ruling was all about.
Too stinking bad if it creates international friction.
I happen to be Jewish, and I totally disagree with the misguided, anti-freedom Jews in France that brought the lawsuit. Banning Nazi memorabilia is, in itself, Nazi-like.
There are many honorable reasons for owning such items, such as putting on a play about Nazis, or a historical collection. Even the sympathizers of Nazism should have the right to own such materials. If someone actually becomes an active Nazi, that's a different matter: They will be hunted down and blown apart.
IBM Continues Writing Lotus Out of History
As an IT/Network Admin who also has to administer two 'clustered' Domino servers with 150+ Notes users...
"the main IBM search function is hopeless at finding Lotus specific queries"
... IBM have just taken the Notes 'help' DB and put it on the web - so you can't blame IBM.
Domino/LN has to be the worst, most horrendous piece of software ever coded, from a sysadmin's point of view - and God help the corporate decision-makers who rolled it out to thousands of users on WANs.
[name and address supplied]
nForce PCs a go-go
Nvidia nForce gets first PC gig
Opinion is divided about the Nvidia nForce. "An underpowered and vastly outdated POS," reckons Robert Richardson, while Dean in Germany calls it the "PC Part of the Year" and says he "stumbled upon one" when out buying milk in Plus round the corner.
But there's more heated debate about Quake drivers...
Don't give a frag about ATI's Quake III driver tweaks
Subject: Radeon 8500 Q3 image quality
Actually, well before the hack was known about, there were people complaining in the Ars Technica forums about the image quality of the 8500 compared to most other cards. As a matter of fact, some people sent their 8500s back, and others traded their cards for Geforce3 Ti 200 cards, which cost $70 less.
The difference is easy to see, and is even obvious when you are used to seeing "normal" quake 3 texture detail and are then presented with "Radeon 8500" texture quality detail.
Obviously, if it was truly difficult to tell the difference, the default mip map levels would be further off and the texture detail would be lower.
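For context on the slider the letter mentions: Quake III's texture detail setting maps to the r_picmip cvar, which discards the highest-resolution mip levels of each texture, halving resolution in both dimensions per step. A minimal sketch of that trade-off (illustrative Python; this is not ATi's driver logic, just the arithmetic behind the setting):

```python
# Each picmip step drops one top mip level: width and height halve,
# so texel count (and texture memory) falls by a factor of four.
def effective_resolution(base: int, picmip: int) -> int:
    """Resolution of a square texture after discarding `picmip` mip levels."""
    return max(1, base >> picmip)

# A 512x512 texture at increasing picmip (lower detail) settings:
for picmip in range(4):
    res = effective_resolution(512, picmip)
    print(f"picmip {picmip}: {res}x{res}")
```

This is why a forced detail reduction is hard to spot at a glance in fast play but obvious when you stand still and inspect a wall, as the letter describes.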
It's a moot point, however, because ATi recently changed their story to "it was a bug in the drivers that made them misinterpret the quake3 texture detail slider".
But, of course, you're right about getting better frame rates helping more than visual detail. After all, how can you live with 120FPS at 1600x1200x32 when you can have 15 more FPS (which also puts the 8500's performance up near Geforce3 Ti 200 levels in Quake3).
I don't see how it could possibly be any more obvious that ATi lowered image quality specifically to make their product look competitive on the most-used 3D benchmark. Never mind that ATi isn't "optimizing" for the most-played 3D games; Quake 3 isn't even close to being one of those.
Keep writing the insightful and well-thought out articles!
- Jesse Stroik
Someone who actually gets what is going on out there. I have to say you are the first person who has clearly laid out both sides of the argument and what exactly the whole problem is.
I agree that Quake 3 is definitely used as the main basis for video card benchmarks, and I have to say it is a really unfair test of what every card out there can actually do. I think we should also see NASCAR 4 and some flight sims. Consumers should know what they are getting in the card as a whole, not just the numbers that come up when someone runs a benchmark on an FPS game. I would understand people getting up in arms about texture quality in, say, a flight sim, because textures make up the landscape. But in an FPS, where you have to ZOOM in on the texture just to see the difference? That is ridiculous. Not to mention actually standing in one place just to look at the difference.
I feel the only time I've ever done that in a FPS is when I first tried a Geforce2 running Tribes2... and that moment of awe lasted all of 30 seconds IN my base. I feel that some of those techsites are really ignorant to the big picture.
Again, great article, smart guy, and it's nice to see someone finally understands.
As regards the entire Quake III / ATI hullabaloo, to me it isn't an issue of tweaking the drivers for performance while sacrificing image quality exactly.
It IS an issue when a company alters the given performance characteristics of a product without being totally forthcoming with the modifications done to achieve the performance gain. With ATI, it would make me just as suspicious if they had claimed that the image quality had increased by 18% without informing me that they had sacrificed frames-per-second to achieve that goal. It has nothing to do with the ability to see the results in realtime at non-zoom levels or not. It is simply an issue of being forthcoming about a change to the hardware/software combination that I purchased. Reference the entire Ford/Firestone tire issue for an example of not being forthcoming.
You seem to be a little under-informed about the subject of driver tweaking. The problem is not optimization; every driver does some level of it to get the most out of common applications. The problem is that Quake III is basically THE benchmark used by hardware sites to measure the real-world performance of a video card. So the Radeon may look to be on par with, or even faster than, the latest Nvidia card according to this benchmark, and the natural assumption for the buyer is that the Radeon will be faster in every other application too. This is just not the case. Once you load up any other game, the speed you saw advertised on a website, or benchmarked in a hardware review, is gone.
ATi depends on the reputation given to them by the hardware community. They know how reviews of their cards are conducted by 90 per cent of the sites, and those sites use Quake III as the de facto benchmark. So you have to wonder about ATi's intentions in releasing this optimized driver with no publicity; in fact, it was not even mentioned to reviewers. In the days of SSE, 3DNow! and the like, optimization is a big market grabber, so why does ATi hide the fact instead of slapping it on every press release? It brings up some interesting questions which should not be dismissed so simply.
- Matt Madzia
I read your article specified in the subject field and mostly, I agree.
But you are missing the point that I think should count the most:
There is no way to turn it off!
It should be up to the user to decide whether to go for frame rate or IQ. By doing this "hack," ATI disabled Quake's own graphical adjustment options. I prefer IQ over frame rate, because I don't play against others; I'm mostly a single-player man.
And I really don't like having someone (ATI) telling me what level of IQ is enough. This issue will probably be fixed in the forthcoming driver release, but I think we should tell these companies what we think. Otherwise Nvidia might do the same thing next time, and I wouldn't like that at all!
I say Petabyte, You say Pebibyte...Let's Call The Whole Thing Off
Subject: S.I. data units considered harmful
S.I. is not an authority on computer data units, as is made obvious by their utterly failed attempts to impose terms such as "kibibyte." They didn't originate the terms "byte," "kilobyte," "megabyte," etc. and they have no power to set their definitions.
Their opinion is no more valid than it would be if they declared pounds henceforth equivalent to kilograms. The English language is defined by use, and the powers-of-two definitions are standard by virtue of near-universal use. The main exception, makers of hard drives and other mass storage devices, are engaging in deliberate deception, knowing full well that GB will be read as 2^30 bytes by a large proportion of shoppers.
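To put rough numbers on the gap the letter describes, here is a quick sketch (Python; the "120 GB" drive is a hypothetical example of the era, not a figure from the letter):

```python
# The decimal "gigabyte" drive makers use versus the power-of-two unit
# most software of the era reports under the same "GB" label.
DECIMAL_GB = 10 ** 9   # marketing gigabyte: 1,000,000,000 bytes
BINARY_GB = 2 ** 30    # power-of-two gigabyte (SI's "gibibyte"): 1,073,741,824 bytes

marketed_bytes = 120 * DECIMAL_GB         # a drive sold as "120 GB"
reported_gb = marketed_bytes / BINARY_GB  # what an OS counting in 2^30 shows

shortfall = 1 - DECIMAL_GB / BINARY_GB    # fractional gap between the two readings
print(f"{reported_gb:.2f} GB reported, {shortfall:.1%} shortfall")
```

At the gigabyte scale the two readings differ by about 7 per cent, which is exactly the ambiguity the letter argues the drive makers are trading on.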
This is a simple appeal to false authority.
Promoting or using the non-standard S.I. definitions is an attack on meaningful communication, and should be treated as such.
- Darrell Johnson