Thread: Higher quality != higher lag?
October 1st, 2003, 08:30 PM #1
Higher quality != higher lag?
I just tested something with my video card. Normally I play JK2 at 800x600 at medium quality and I get around 90fps. I changed the resolution to 1600x1200, changed the quality to the very max, then turned on aniso filtering (the in-game option, not the display settings one) and I got around 60fps.
My computer is an AthlonXP 1700+ with FX5200 (non ultra) video card.
Today I was reading the latest article at anandtech and some benchmarks hit me as being weird.
In these benchmarks, Nvidia's NV38 got the same rate regardless of whether AA/AF was on or off. FX5900 was the same. Radeon 9800XT was pretty much the same. On the other hand, the Radeon 9600 Pro got slapped real hard when AA/AF was on. I'm assuming the FX5600 Ultra being faster with AA/AF than without is some sort of error.
How do these cards not slow down too much when the demand is increased so much?
October 2nd, 2003, 01:40 PM #2
The difference would be the 256-bit memory bus. The R9600, FX5200, etc. have 128-bit buses, so AA/AF squeezes them real fast.

Good job, friend-of-friends!
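To put some rough numbers on the bus-width point: peak memory bandwidth scales linearly with bus width at a given memory clock, and AA/AF are bandwidth-heavy. Here's a back-of-envelope sketch (the clock figures are my assumptions for typical cards of that era, not measured values):

```python
def bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr=True):
    """Theoretical peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    effective_mhz = mem_clock_mhz * (2 if ddr else 1)  # DDR transfers twice per clock
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# Assumed memory clocks: ~365 MHz DDR for a 256-bit high-end card,
# ~300 MHz DDR for a 128-bit mid-range card.
print(bandwidth_gb_s(256, 365))  # ~23.4 GB/s
print(bandwidth_gb_s(128, 300))  # ~9.6 GB/s
```

So the 256-bit cards have well over twice the bandwidth headroom, which is why turning on AA/AF barely dents them while the 128-bit cards hit the wall.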