I rarely post, but this was so damn funny...

General Plot

Britchie Crazy
ShizZie said:
Not to bust the 6600 users, but that isn't too much of an impressive card either. Either go 6800 Ultra, or go x800. But don't half-ass it and go "almost" one or the other. (Reminds me, almost time to sell my 6800, 7800 is on the way ;))
Sorry, but when a video card costs more than a brand new game console, I draw the line. $300+ seems like a bit much to be spending on a gfx card (that's a little more than I spent on my processor when it was the newest thing on the market). So enjoy your 7800, I'll be just fine with a 6600 GT. ;) Besides, with the lack of a PCI express socket, it won't do much good for me anyways.
 

Eagle

aka Alshain
Moderator
ShizZie said:
Not to bust the 6600 users, but that isn't too much of an impressive card either. Either go 6800 Ultra, or go x800. But don't half-ass it and go "almost" one or the other. (Reminds me, almost time to sell my 6800, 7800 is on the way ;))

That's funny, mine's quite impressive over the GeForce 3 Ti 200 I was using. Sure, if I had money to burn I would buy a new $600 video card every time one came out. For the people who know the value of a dollar, right now the 6600 GT is the best card out there.
 

ShizZy

Emulator Developer
Oh and just for the record, (don't ask me how) but my socket 754 Athlon 64 3200+ owns all of you! (How the hell did it beat Redah's 3500?)... oh well, I'm not complaining.
 

Eagle

aka Alshain
Moderator
I'm betting Shizzie is overclocking (and cooliscool's is overclocked too); I think Redah, generalplot, and I all did our tests at stock speeds. I could definitely get mine down to 40 at least.
 

cooliscool

Nintendo Zealot
Right, but the general consensus was that Redah's AMD 64s would kill any P4..

Shizzie's CPU isn't overclocked. The Socket 754 3200+ is a 2.2GHz chip, to help make up for the lack of dual channel capability. SuperPI feeds on raw FPU speed (and the fact that you're all running chips on the same architecture makes this more evident), which is increased by more MHz, which makes it obvious why Shizzie's CPU beats yours here.

AMD's funny PR ratings make SuperPI results nearly impossible to compare cross-platform between similarly rated CPUs.
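(For anyone curious what a SuperPI-style test actually measures: it's a single-threaded pi calculation, so the time scales almost entirely with per-core speed, which is why more MHz wins on the same architecture. A toy sketch of the idea in Python — this is not SuperPI's actual algorithm, just an illustrative pi-digits timer using Machin's formula:)

```python
import time

def pi_digits(n):
    # Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239),
    # evaluated with scaled integer arithmetic to n digits.
    scale = 10 ** (n + 10)  # 10 guard digits

    def arctan_inv(x):
        # arctan(1/x) * scale via the alternating Taylor series
        total = term = scale // x
        x2 = x * x
        k = 3
        while term:
            term //= x2
            if (k // 2) % 2:
                total -= term // k
            else:
                total += term // k
            k += 2
        return total

    pi = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return pi // 10 ** 10  # drop the guard digits

start = time.perf_counter()
digits = pi_digits(2000)
elapsed = time.perf_counter() - start
print(str(digits)[:8])  # 31415926
print(f"{elapsed:.3f}s")
```

Since the loop is one long serial dependency chain, two cores or dual-channel RAM barely help — raw clock (and FPU/ALU throughput per clock) dominates, which matches what the thread is seeing.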
 

Eagle

aka Alshain
Moderator
cooliscool said:
Right, but the general consensus was that Redah's AMD 64s would kill any P4..

Shizzie's CPU isn't overclocked. The Socket 754 3200+ is a 2.2GHz chip, to help make up for the lack of dual channel capability.


Right, but I don't think the Windows dialog there reports the core speed; it reports the speed the processor is spec'ed at.
 

General Plot

Britchie Crazy
And the original statement was
Redah said:
Pentium4 2.4 GHz vs AthlonXP 2400+ ? AthlonXP wins
But this apparently is not true, as a 3000+ barely beat mine (by about a second; I'm willing to bet that if my RAM were better, mine may have won. After all, remember he was using Corsair XMS memory, while mine is plain ol' Wintec Industries vanilla CL3 RAM).
So I'm pretty sure that a 3000+ vs. anything at a P4C 2.6 or above would eat up any AXP, and given better RAM, I could probably take out any AXP in a straight CPU benchmark.
 

Eagle

aka Alshain
Moderator
cooliscool said:
SuperPI feeds on raw FPU speed (and the fact that you're all running chips on the same architecture makes this more evident), which is increased by more MHz, which makes it obvious why Shizzie's CPU beats yours here.

Ah ok, well then that makes sense.
 

Eagle

aka Alshain
Moderator
cooliscool said:
Right, but the general consensus was that Redah's AMD 64s would kill any P4..

HIS consensus maybe, I never would have said something like that.
 

cooliscool

Nintendo Zealot
Ya, Windows' system dialog does report core clock speed.. it has since Win2K. :p

edit: Yeah, I know from your posts in the past that you aren't a brand-whore.. but the majority of people here are (before it's said.. I'm not. I got my P4 3.2 for $55 :D).. which is why I said, general consensus. :p
 

General Plot

Britchie Crazy
And I want to point out my attachment to Intel, just so people don't think some cult got me to side with them. I've used Intel since my 386; I've had them in my systems for 13 years now, and it's an architecture I understand, mainly because of all that time getting to know them. ;) I'm not an Intel whore, but I don't want to have to learn a whole new architecture to understand what it takes to build a good AMD system. And if Intel has always done what I needed it to do, why change what isn't broken? Whether it's raw power processing the simpler CISC instructions fast, or processing CISC and RISC instructions at the cost of core speed, doesn't really make a difference to me. As far as I can tell, both manufacturers produce chips that achieve the same end result in about the same amount of time. Again, it comes down to my personal preference: Intel does what I need it to do, and maybe AMD could as well, but I'm comfortable with Intel after all these years, so I'll stay with them until they give me a reason not to.
 

cooliscool

Nintendo Zealot
Umm.. Generalplot. Over 13 years, Intel's architectures have been significantly different.. if you're referring to sockets, even then, Intel's changed sockets more than AMD has. If you mean in a programming sense, AMD's CPUs (erm, duh) use the same x86/x87 FPU instruction sets as an Intel CPU would. It's not like you can get to know Intel over 13 years and expect nothing to change..

I also don't understand that statement about RISC and CISC. RISC = Reduced Instruction Set Computer, CISC = Complex Instruction Set Computer. Modern PC CPUs are all CISC.
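(For what it's worth, the two camps aren't mutually exclusive in practice: modern x86 chips from both Intel and AMD expose a CISC instruction set, but internally break complex instructions into RISC-like micro-ops. A toy sketch of that idea — all instruction names here are made up for illustration, this is not a real decoder:)

```python
# Toy model: a CISC-style read-modify-write instruction expanded into
# RISC-like load/op/store micro-ops, roughly what a modern x86 front
# end does internally even though the visible instruction set is CISC.
def decode(instruction):
    if instruction == "add [mem], eax":
        return ["load tmp, [mem]",      # read the memory operand
                "add tmp, tmp, eax",    # do the arithmetic
                "store [mem], tmp"]     # write the result back
    return [instruction]  # simple register-to-register ops map 1:1

for uop in decode("add [mem], eax"):
    print(uop)
```

So one "complex" instruction becomes several simple internal operations, which is probably where the compressed-code analogy above comes from.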
 

General Plot

Britchie Crazy
cooliscool said:
Umm.. Generalplot. Over 13 years, Intel's architectures have been significantly different.. if you're referring to sockets, even then, Intel's changed sockets more than AMD has. If you mean in a programming sense, AMD's CPUs (erm, duh) use the same x86/x87 FPU instruction sets as an Intel CPU would. It's not like you can get to know Intel over 13 years and expect nothing to change..

I also don't understand that statement about RISC and CISC. RISC = Reduced Instruction Set Computer, CISC = Complex Instruction Set Computer. Modern PC CPUs are all CISC.
As for the architecture, Intel has ALWAYS had the same basic design: the CPU gets improved, of course, and this makes it necessary to change socket designs. But the need for an Intel chipset onboard, and how that chipset and the CPU interact, has always been the same. And this means the basic flow of instructions has always been the same; no matter how many times the apparent architecture gets changed, the basic design is always the same, and again it's one I have known for a long time. And CISC can be processed faster than RISC; this is why Intel chips have a higher clock than a standard AMD chip that can give equal performance at a lower clock. The ability to run RISC (which is sort of like compressed code, if I remember this right) means the processor has to be able to execute commands from the code as well as decompress it as it enters the core. So RISC doesn't need to move as fast to keep up with CISC for the end result to be the same, in about the same time; that is to say, the CISC chip runs at a higher clock to make up for more code being moved at once in RISC.
 
