
I need a video card, what should I get?

Tagrineth

Dragony thingy
AlphaWolf said:
It's ok tag...we all lose at least once in our life :D

I hope that the release of CS:CZ may inspire ATI to fix the driver issues with the HL engine though....b/c right now it's as if they simply don't care.

No, seriously. How can a video card driver kill a CRT? They must've been very poorly made CRTs if a bad signal could KILL them. Can we say no protection or failsafes in the monitor? That's just sad. My 17" OEM Sony CRT with BUILT-IN SPEAKERS has lasted me SEVEN YEARS now, and I betcha if I got one of those driver sets famous for breaking monitors, I'd still have no problems. Why? Because my CRT isn't poorly built. This is one solid monitor. It's 17" and matches the spec of most off-the-shelf, decently priced 17" CRTs today (a realistic maximum of 1280x1024 at 60Hz - yes, I know high-end 17" CRTs can do much better than that, but not any of the models I ever see at Best Buy).
 

neoak

Triforce of Something...
AlphaWolf said:
The ATI fanboy forum, rage3d, doesn't agree.

Oh, you meant Frames per Second. Actually, yes, my Radeon can even get jumpy with 24 or more players on screen. Hmm... Gotta check Catalyst 3.10 when it's officially released... (I don't want to compromise the stability of this PC with the betas).

Or even better, check CS with the leaked HL2 engine...

Virtualbs@Rage3D said:
20 November 2003

-=COUNTER-STRIKE 1.6 PERFORMANCE COMPARISON=-
Demo: Match from ESL / mouz.GeForce vs EFF-KULT
Point of view: mouz.GeForce | Johnny R
Map: de_train
Link to demo: http://www.sogamed.com/demos.php?id=13588
Benchmark util: FRAPS 1.9D
Benchmark method: From start of match until end of 5th round (the round where there are a lot of smoke grens).

-+-Results-+- (in frames per second)

Radeon Omega Drivers rad_w2kxp_omega_2496c (Catalyst 3.9)
2003-11-12 12:08:09 - hl
Frames: 51259 - Time: 541209ms - Avg: 94.712 - Min: 44 - Max: 163

Radeon KillerSneak DNA Drivers DNA 1.8.3.9 (Catalyst 3.9)
2003-11-12 12:58:24 - hl
Frames: 65877 - Time: 540788ms - Avg: 121.816 - Min: 55 - Max: 207

Geforce 3 Ti200 @ 210 Core/500 Memory, Drivers series 45.X
2003-11-12 16:14:15 - hl
Frames: 78547 - Time: 543581ms - Avg: 144.499 - Min: 33 - Max: 231

Radeon wxp-w2k-cod-7-96-012324e (Call of Duty Hotfix - Catalyst 3.10 Beta)
2003-11-20 02:28:32 - hl
Frames: 75643 - Time: 540717ms - Avg: 139.893 - Min: 63 - Max: 240

For the first time my Radeon almost reaches my old Geforce 3 Ti200!!!
GREAT JOB ATI!!!

I urge everyone to try these drivers and post the results!!!

Cheers,
Virtualbs

__________________
Asus A7M266 (AMD761+VIA686B) / Athlon XP 2100+ Palomino / Connect 3D Radeon 9500(SoftModded)9700 / SB Live 5.1 (PCI3) / LAN card for 768/128 ADSL connection (PCI5) / Q-tec 550W PSU.
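
Those averages are easy to sanity-check: FRAPS logs total frames and elapsed time in milliseconds, and the average is just frames divided by seconds. A minimal Python sketch using the numbers quoted above (the run labels are my abbreviations, not FRAPS output):

```python
# Recompute the FRAPS averages: avg FPS = total frames / elapsed seconds.
runs = {
    "Omega 2496c (Cat 3.9)":      (51259, 541209),
    "DNA 1.8.3.9 (Cat 3.9)":      (65877, 540788),
    "GF3 Ti200 @ 210/500 (45.x)": (78547, 543581),
    "CoD hotfix (Cat 3.10 beta)": (75643, 540717),
}
for name, (frames, ms) in runs.items():
    print(f"{name}: {frames / (ms / 1000):.3f} fps")
```

The computed values land on the quoted averages to within rounding, so the relative standings above aren't a transcription fluke.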
 

Xade

Irrelevant Insight
Tagrineth said:
Eh? I don't have a problem enabling AF without AA. One driver set wouldn't allow me to enable either of them, but anyway...

...3dfx cards also worked right out of the box at the time.

It isn't because their drivers are perfect, it's because the bloody devs are coding the games around the drivers' bugs, hence making nV's drivers SEEM perfect.

(Belated reply)

And, of course, Valve aren't coding to, er, 'improve' Radeon performance or anything.

They've even been reported to have signed a secret deal with ATI, etc, etc.

We both prefer opposite companies, clearly, but you often go a little overboard with your nVidia-bashing, Tagrineth.
 
Last edited:

Tagrineth

Dragony thingy
Xade said:
(Belated reply)

And, of course, Valve aren't coding to, er, 'improve' Radeon performance or anything.

They've even been reported to have signed a secret deal with ATI, etc, etc.

We both prefer opposite companies, clearly, but you often go a little overboard with your nVidia-bashing, Tagrineth.

I bash GeForce FX and the post-NV30 nVidia that likes to run around in little circles like a retarded dog.

And Half-Life 2 runs on Radeons using the EXACT SAME CODE as it uses on the XGI Volari (and any other pure DX9, non-NV3x card). Radeons run using default API calls only; the GeForce FX, on the other hand, has had around, oh, five times the optimisation time (by Valve's own admission), and... well, I'm sure you've seen the performance numbers.

And the "secret deal" isn't exactly secret, ATi paid Valve a bunch of money to win a PROMOTION deal, not to gain favours among the game's programmers.



Added: Oh, and before I forget: That big performance test? Which shows the Radeon 9600 Pro often beating the GeForce FX 5900 Ultra? That build was released to reviewers before the "secret deal" was signed.

Double edit: And since the nVidia/GeForce FX apologists will probably never believe me when I tell them the FX line completely sucks at DirectX 9 pixel shaders (PS 2.0), which incidentally the Half-Life 2 engine uses very extensively: http://english.bonusweb.cz/interviews/carmackgfx.html. That link contains a response from the very famous nVidia apologist, Mr. John Carmack himself.
 
Last edited:

Xade

Irrelevant Insight
Those tests were done before the 52.x drivers, so their use as evidence is slightly unfair. Early estimates give nVidia a large performance boost with the new set.

And which tests was Carmack speaking of? The ones from that pre-52 shambles of an ATI-fest? Uh-huh. Point made. Valve + ATi = best friends... etc. Carmack may well be of a different opinion post-52. It would be interesting to hear what he has to say now.

Check the latest benchmarks.
 
Last edited:

Tagrineth

Dragony thingy
Xade, first off there are no "latest benchmarks" of HL2 as the engine has been kept quite locked away since that theft some time ago...

And second, I take it you never saw Valve's slide on why the 5x.xx dets should never be used for benchmarking HL2?
 

fivefeet8

-= Clark Kent -X- =-
Tagrineth said:
And second, I take it you never saw Valve's slide on why the 5x.xx dets should never be used for benchmarking HL2?

Valve said not to use the 51.75 beta dets for testing. They haven't said anything about the 52.xx and the recently released 53.03 ForceWare drivers. Add to that the fact that the game has been delayed until next year, by which point the NV40 and R420 will be the deciding factors in HL2 performance.

Frankly, the current lineup of DX9/PS2.0 games is far outclassed by some of the more recently released DX8/PS1.4 games. It probably won't be until early next year, around the time of the next generation of video cards from ATi/nVidia, that we start seeing some really good DX9/PS2.0 games. And by that time, Microsoft will be readying DX10, in which case the cycle starts all over again.

Barring Tomb Raider AOD with all options enabled, most of the currently released DX9/PS2.0 games run quite well on nVidia cards with ForceWare 52.16 and the beta 53.03. ATi doesn't have a huge lead like it does in TR or HL2. Performance in Tomb Raider AOD has actually increased substantially for the GeForce FXs with the newer drivers. They're still not able to get close to the performance of the ATi cards in the heavy PS2.0 benches, but it's definitely a lot better than the meagre showings before the 5x.xx dets.
 

Xade

Irrelevant Insight
Tagrineth said:
Xade, first off there are no "latest benchmarks" of HL2 as the engine has been kept quite locked away since that theft some time ago...

And second, I take it you never saw Valve's slide on why the 5x.xx dets should never be used for benchmarking HL2?

Even without the blending and corner-cutting, if run in proper DX9 the new drivers beat the 45s by a fair way.

Also, the HL2 engine beta has been leaked, actually, and a hefty amount of benchmarking has been done using programs like FRAPS. Crude, but...

52.x and above has brought major improvements for nVidia cards, Tagrineth, and you'd be a fool to argue with that.
 
