The Sandy Bridge Preview
by Anand Lal Shimpi on August 27, 2010 2:38 PM EST

Update: Be sure to read our Sandy Bridge Architecture Exposed article for more details on the design behind Intel's next-generation microprocessor architecture.
The mainstream quad-core market has been neglected ever since we got Lynnfield in 2009. Both the high end and low end markets saw a move to 32nm, but if you wanted a mainstream quad-core desktop processor the best you could get was a 45nm Lynnfield from Intel. Even quad-core Xeons got the 32nm treatment.
That's all going to change starting next year. This time it's the masses that get the upgrade first. While Nehalem launched with expensive motherboards and expensive processors, the next tock in Intel's architecture cadence is aimed right at the middle of the market. This time, the ultra high end users will have to wait - if you want affordable quad-core, if you want the successor to Lynnfield, Sandy Bridge is it.
Sandy Bridge is the next major architecture from Intel, what Intel likes to call a tock. The first tock was Conroe, then Nehalem, and now Sandy Bridge. In between were the ticks, Penryn and Westmere, and after Sandy Bridge comes another tick: Ivy Bridge, a 22nm shrink of the Sandy Bridge architecture.
Did I mention we have one?
While Intel is still a few weeks away from releasing Sandy Bridge performance numbers at IDF, we managed to spend some time with a very healthy sample and run it through a few of our tests to get a sneak peek at what's coming in Q1 2011.
New Naming
The naming isn’t great. It’s an extension of what we have today. Intel is calling Sandy Bridge the 2nd generation Core i7, i5 and i3 processors. As a result, all of the model numbers have a 2 preceding them.
For example, today the fastest LGA-1156 processor is the Core i7 880. When Sandy Bridge launches early next year, the fastest LGA-1155 processor will be the Core i7 2600. The two indicates that it’s a 2nd generation Core i7, and the 600 is the model number.
Sandy Bridge CPU Comparison

| Processor | Base Frequency | L3 Cache | Cores/Threads | Max Single Core Turbo | Intel HD Graphics Frequency/Max Turbo | Unlocked | TDP |
|---|---|---|---|---|---|---|---|
| Intel Core i7 2600K | 3.4GHz | 8MB | 4/8 | 3.8GHz | 850/1350MHz | Y | 95W |
| Intel Core i7 2600 | 3.4GHz | 8MB | 4/8 | 3.8GHz | 850/1350MHz | N | 95W |
| Intel Core i5 2500K | 3.3GHz | 6MB | 4/4 | 3.7GHz | 850/1100MHz | Y | 95W |
| Intel Core i5 2500 | 3.3GHz | 6MB | 4/4 | 3.7GHz | 850/1100MHz | N | 95W |
| Intel Core i5 2400 | 3.1GHz | 6MB | 4/4 | 3.4GHz | 850/1100MHz | N | 95W |
| Intel Core i3 2120 | 3.3GHz | 3MB | 2/4 | N/A | 850/1100MHz | N | 65W |
| Intel Core i3 2100 | 3.1GHz | 3MB | 2/4 | N/A | 850/1100MHz | N | 65W |
The names can also have a letter after the four-digit model number. You're already familiar with one: K denotes an unlocked SKU (similar to what we have today). There are two more: S and T. The S processors are performance optimized lifestyle SKUs, while the T parts are power optimized.
The S parts run at lower base frequencies than the non-S parts (e.g. a Core i7 2600 runs at 3.40GHz while a Core i7 2600S runs at 2.80GHz); however, the max turbo frequency is the same for both (3.8GHz). GPU clocks remain the same, but I'm not sure if the S parts have the same number of execution units. All of the S parts run at 65W, while the non-S parts are spec'd at 95W.
Sandy Bridge CPU Comparison

| Processor | Base Frequency | L3 Cache | Cores/Threads | Max Single Core Turbo | Intel HD Graphics Frequency/Max Turbo | TDP |
|---|---|---|---|---|---|---|
| Intel Core i7 2600S | 2.8GHz | 8MB | 4/8 | 3.8GHz | 850/1100MHz | 65W |
| Intel Core i5 2500S | 2.7GHz | 6MB | 4/4 | 3.7GHz | 850/1100MHz | 65W |
| Intel Core i5 2500T | 2.3GHz | 6MB | 4/4 | 3.3GHz | 650/1250MHz | 45W |
| Intel Core i5 2400S | 2.5GHz | 6MB | 4/4 | 3.3GHz | 850/1100MHz | 65W |
| Intel Core i5 2390T | 2.7GHz | 3MB | 2/4 | 3.5GHz | 650/1100MHz | 35W |
| Intel Core i3 2100T | 2.5GHz | 3MB | 2/4 | N/A | 650/1100MHz | 35W |
The T parts run at even lower base frequencies and have lower max turbo frequencies. As a result, these parts have even lower TDPs (35W and 45W).
I suspect the S and T SKUs will be mostly used by OEMs to keep power down. Despite the confusion, I like the flexibility here. Presumably there will be a price premium for these lower wattage parts.
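To make the new scheme concrete, here is a small sketch (my own illustration, not anything Intel publishes) that splits one of these model names into the pieces described above: the generation digit, the model number, and the optional suffix letter.

```python
import re

# My own reading of the naming scheme described above, not an official format:
# "Core i<family> <generation><model><optional suffix>", e.g. "Core i7 2600K".
SKU_PATTERN = re.compile(
    r"Core i(?P<family>[357]) (?P<gen>\d)(?P<model>\d{3})(?P<suffix>[KST]?)"
)

def parse_sku(name):
    m = SKU_PATTERN.search(name)
    if not m:
        return None
    suffix_meaning = {"K": "unlocked", "S": "performance-optimized lifestyle",
                      "T": "power-optimized", "": "standard"}
    return {
        "family": "Core i" + m.group("family"),
        "generation": int(m.group("gen")),   # the leading 2
        "model": m.group("model"),           # e.g. 600
        "suffix": suffix_meaning[m.group("suffix")],
    }

print(parse_sku("Intel Core i7 2600K"))
# {'family': 'Core i7', 'generation': 2, 'model': '600', 'suffix': 'unlocked'}
```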
Comments
gruffi - Friday, August 27, 2010 - link
Why not compare it with an HD 5570? That is what Llano is supposed to have, a Redwood-class IGP. An HD 5450 is quite pointless; it only reflects competition for Ontario, and Sandy Bridge is not Ontario's competition.

And what about image quality or GPGPU support? Pure FPS numbers are only half of the truth.
wiak - Friday, August 27, 2010 - link
I don't think so. It's said that AMD's Fusion built-in GPU will have 400 SPUs (HD 5670-level graphics), and that's a far cry from the HD 5450's 80 SPUs. ;)

So if you want to game, you still have to use something from a real graphics manufacturer like AMD when it comes to GPUs built into CPUs. As an added bonus you also get updated drivers and a decade of DirectX 9 compatibility, so your old games work without any big problems.
icrf - Friday, August 27, 2010 - link
I am impressed that you have a functioning sample at least four months before it's available, ran it through enough paces for a review like this, and were allowed to release the numbers. I mean, are they trying to suppress holiday sales?

When do you think you'll have a Bulldozer sample from AMD to run a similar preview? Barring a surprise from AMD, at this point it looks like I'll be building an i7 2600 early next year. The closest comparable chip today is an i7-975 Extreme, which is the fastest quad core in Bench, and Sandy Bridge runs 13-14% faster in the only benchmark I care about (x264). I guess even that might change significantly if it can take advantage of this "alleged on-die video transcode engine." I'd not heard of that before.
Anand Lal Shimpi - Friday, August 27, 2010 - link
Honestly, we're probably several months out from having Bulldozer silicon in a similar state. With the past few generations of Intel CPUs, by around 4-6 months before launch we're usually able to get access to them and they perform very well.

With AMD the lead time is far shorter. I don't expect us to have access to Bulldozer silicon that's worth benchmarking until Q2 2011 at the earliest. I'm more than happy to be proven wrong though :-P
icrf - Friday, August 27, 2010 - link
I guess I'm mostly surprised that Intel would do it. Conroe made sense: they had to show the world as early as possible that they had something significantly faster than AMD, suppressing sales of AMD's parts in favor of their own a little later. But now that they own the performance crown, why show previews so many months early? I suppose I could be over-analyzing it and the vast majority of the market couldn't care less, so it makes little difference to their bottom line. Bragging rights simply make for good PR.

Sad to see Bulldozer so far out. I assume the server chips will ship before the consumer ones, too, so it'll be at least a solid year before it could be in my hands anyway. Oh well. To be honest, my C2D E6400 still does well enough for me. Maybe I'll just make my upgrade an Intel G3 SSD. If I got both that and SB, I don't know what I'd do with myself.
Thanks, and keep up the good work.
Anand Lal Shimpi - Saturday, August 28, 2010 - link
This preview wasn't Intel sanctioned; I believe Intel will release its own numbers at IDF in a few weeks.

Take care,
Anand
icrf - Saturday, August 28, 2010 - link
Oh, I had assumed you got this chip from Intel and they had a typical NDA that said when you could talk about what you found. Where'd it come from, then? One of Intel's motherboard partners with whom you have a friendly relationship?

aegisofrime - Saturday, August 28, 2010 - link
I must say, I'm really grateful for this article. I'm in the middle of planning an upgrade, and information like this is really valuable to me (and I guess to a lot of people as well!). I would just like you to know that your articles actually do influence some of our buying choices. So... thank you! :D

Now, all I need is a Bulldozer preview and all the pieces are in place...
vajm1234 - Friday, August 27, 2010 - link
A few things are clear and a few unclear as of now.

This Sandy Bridge review sample does not have Turbo enabled; the CPU runs at 3.1GHz all the time, regardless of workload, as Anand stated.

The article says: "Both the CPU and GPU on SB will be able to turbo independently of one another. If you're playing a game that uses more GPU than CPU, the CPU may run at stock speed (or lower) and the GPU can use the additional thermal headroom to clock up. The same applies in reverse if you're running something computationally intensive."
QUESTIONS

Q} Will the on-die GPU work in tandem with discrete GPUs, or will it shut off? If it works, will it do so when SLI or CrossFire is enabled? :p

Q} Does the turbo behavior quoted above still apply if we use discrete graphics from NVIDIA or ATI?

Q} Will there be any way to disable ONLY the GPU, and in certain cases ONLY its turbo feature?

Q} Is there any way to keep the GPU turboed up the whole time while the CPU is idle?

Q} What about accelerated HD video playback using the on-die GPU?

Q} It supports VT-x and AVX; is it possible for you, Anand, to use specific benchmarks for these instructions? The same request goes for AMD. (A quick way to check that the features are exposed is sketched after this list.)

Q} As someone asked, will there be a cheap 6-or-more-core processor for the mainstream market?

Q} Again, as per the last comment: when do you think you'll have a Bulldozer sample from AMD to run a similar preview?

This question must be answered.
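On the VT-x/AVX question above: before running instruction-set-specific benchmarks, it is worth confirming the features are actually exposed to the OS. Below is a minimal, Linux-only sketch of my own; it relies on the fact that on Intel CPUs VT-x is reported as the "vmx" flag and AVX as "avx" in /proc/cpuinfo.

```python
# Quick Linux-only check for AVX and VT-x support by reading /proc/cpuinfo.
# On Intel CPUs, VT-x is reported as the "vmx" flag and AVX as "avx".

def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("AVX supported: ", "avx" in flags)
print("VT-x supported:", "vmx" in flags)
```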
All in all, what I think is that even if there is a 15-19% performance jump, it's not worth the spending when you consider you have to upgrade the entire platform.

And moreover, limiting overclocking features? Damn, a terrible decision. I'm not in the mood for AMD, but if overclocking takes the hit then I will move 10000...% :angry:
regards
DanNeely - Saturday, August 28, 2010 - link
If you're asking about an SLI/CFX pairing with the IGP, almost certainly not. The only company to ever attempt something like that has been Lucid with the Hydra chip, and the results have been less than impressive. Architecturally, I don't know that it'd even be possible for them to try with the on-die GPU. The Hydra chip sat between the CPU and the graphics cards on the PCIe bus and looked like a single card to the OS; there's no way for them to insert themselves into the middle of the connection to the IGP.