
The insides of an Ivy Bridge CPU

Toasty

Sony battery
I sure hope they make the April 29th release date. There's an i5-3570K with my name on it. :p
 
OP
Cyberman

Moderator
An update: Intel officially released Ivy Bridge.

DirectX 11 <- I believe this is a disaster of an idea, considering that Windows has been losing considerable ground since Vista, and it appears to be only accelerating. The desktop is changing, but more importantly, how it's used is changing.

Cyb
 

Trotterwatch

New member
Currently running an i5 and couldn't be happier with the performance. Might go to Ivy Bridge at some point, but not for a while.
 

PsyMan

Just Another Wacko ;)
When it comes to performance and overclocking, Ivy Bridge processors are not such a big deal compared to Sandy Bridge anyway. In real-life scenarios they are about 3-5% faster in processing power, graphical capabilities excluded of course.

Now, with such a low TDP, some people would expect Ivy Bridge processors to overclock much better than Sandy Bridge ones. It turns out that this is not the case either; the results should be about the same. Of course, there will always be better and worse samples when it comes to overclockability.

So the real strength of this new line is the extremely low power consumption (and, for developers, the new random number generator). Not bad at all, but still not as great as the jump to Sandy Bridge.

What I'm concerned about is the next microarchitecture Intel is working on. It's supposed to have the processor's voltage regulator integrated. I'm curious about how it's going to end up.
 

Toasty

Sony battery
Yeah, after reading some of the initial reviews of IB's overclock-ability, I'm thinking I might just go with SB after all. IB can probably reach the same or even slightly better performance levels, but the tiny little chip gets mighty hot when overclocked (and voltage is increased). SBs will hopefully be coming down in price now anyway.
 
OP
Cyberman

Moderator
Ahh, Intel is trying to make more money. The price of external regulators is making them want a piece of the analog pie as well, so they are going that route.

I believe this is yet another bad idea going worse. If the regulator goes, so does the processor in this case. Most of the built-in-regulator processors I've seen have a separate section dedicated to the LDO, usually for 3.3 V down to 1.8 V. The amount of power the Intel parts use is phenomenal, so in order to get the power savings they want, they have to work around patents from TI (they hold several patents on automatic processor voltage scaling by load). That is more likely the real reason for building it in: to get around patent issues.

Voltage scaling is pretty tricky because you have the PLL in the CPU (basically a VCO and divider that sync to an input clock). That device has to ramp the clock up and down with the voltage. In CMOS technology the clock rate is directly proportional to the power consumed AND the voltage. CMOS gates need a specific amount of charge to switch (their gates are capacitors, after all), so each gate transfers a specific amount of charge every clock cycle. The amount of charge on the gate is linearly proportional to the voltage (Q = C * V), and the gate switching speed also depends on the supply voltage.
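To put rough numbers on that Q = C * V relationship, here's a minimal sketch; the per-gate capacitance and voltages are assumed round figures for illustration, not measured Ivy Bridge values:

```python
# Illustrative sketch of Q = C * V for a single CMOS gate.
# C_GATE is an assumed round number, not a real Ivy Bridge figure.
C_GATE = 1e-15  # per-gate capacitance: ~1 femtofarad (assumption)

def gate_charge(v_supply):
    """Charge needed to switch one gate: Q = C * V (linear in voltage)."""
    return C_GATE * v_supply

q_low = gate_charge(0.9)   # charge per switch at 0.9 V
q_high = gate_charge(1.2)  # charge per switch at 1.2 V
print(q_low, q_high)       # the charge scales linearly with supply voltage
```

Raising the supply voltage from 0.9 V to 1.2 V raises the charge per switch by the same 4/3 ratio, which is the linear dependence described above.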

Current is charge per unit time (1 amp is 1 coulomb per second). If a gate requires, say, 0.1 femtocoulombs to change state and you have 100 million gates switching at 1 gigahertz, that's 10 A of current drawn from the supply (the total charge transferred by all the gates per second is the supply current).
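That arithmetic can be sketched in a few lines of Python; the per-gate charge and gate count are the same illustrative figures as above, not measured values:

```python
# Supply current from gate switching: I = Q_per_gate * N_gates * f.
# All figures are illustrative assumptions, not measured Ivy Bridge values.
q_gate = 0.1e-15   # charge per gate switch: 0.1 femtocoulombs (assumption)
n_gates = 100e6    # 100 million gates switching each cycle (assumption)
f_clock = 1e9      # 1 GHz clock

i_supply = q_gate * n_gates * f_clock  # coulombs per second = amps
print(i_supply)    # ~10 A
```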

That's where MOST of the power in a CPU goes these days. Some is lost in the internal conductors, but not as much as from simply switching the state of the gates in the CPU.

Hence why Intel had to DROP the clock rates on their parts and use techniques similar to those AMD had been using to reduce power.
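Putting the pieces together: the classic CMOS dynamic-power relation is P = alpha * C * V^2 * f, so power scales linearly with frequency but with the square of voltage, which is why dropping the clock (and with it the required voltage) pays off so well. A sketch with made-up numbers:

```python
def dynamic_power(c_total, v_supply, f_clock, alpha=1.0):
    """Classic CMOS dynamic power: P = alpha * C * V^2 * f."""
    return alpha * c_total * v_supply**2 * f_clock

# Illustrative figures only, not real Ivy Bridge numbers: a modest clock
# drop allows a lower voltage, and since power goes with V squared, the
# combined saving is much bigger than either change alone.
p_fast = dynamic_power(c_total=1e-8, v_supply=1.2, f_clock=3.0e9)  # ~43 W
p_slow = dynamic_power(c_total=1e-8, v_supply=1.0, f_clock=2.4e9)  # ~24 W
print(round(p_fast, 1), round(p_slow, 1))
```

A 20% clock reduction paired with a voltage drop from 1.2 V to 1.0 V roughly halves the dynamic power in this sketch.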

Cyb
 
