45 Comments
alent1234 - Friday, March 16, 2012 - link
Huge battery and same usage time means something is up. Why is Apple still using 45nm? Steve Jobs should never have sued Samsung.
A5 - Friday, March 16, 2012 - link
I'm not sure who they contract their manufacturing out to now, but it is a very real possibility that they couldn't get 28nm going in time to hit Apple's production timeline. You have to remember that these chips have probably been in production for several months.

I'm guessing that the "A6" that showed up in the beta code is a die shrink of this chip for the next iPhone. I don't think they'll have an A15 design ready for a July launch.
alent1234 - Friday, March 16, 2012 - link
Samsung is still making them, along with the screens. The least Apple could have done was 32nm.

Not sure if it's Apple staying with the same process, or Samsung not offering to make CPUs for them on anything better while the move to TSMC takes time.
scook9 - Friday, March 16, 2012 - link
Probably because the 45nm node is extremely well optimized at this point. Apple cannot afford low yields with the demand they see.

dagamer34 - Friday, March 16, 2012 - link
It's not as if Samsung has released any phones with 32nm chips in them.

quiksilvr - Saturday, March 17, 2012 - link
*Facepalm* Samsung WAS making a 28nm chip for Apple as far back as October. It was going to be the new A6 and an even thinner iPad 3 was supposed to happen. BUT since Apple got their panties in a bunch Samsung told Apple to piss off.

KoolAidMan1 - Saturday, March 17, 2012 - link
28nm yields are relatively low across the board. For the number of iPads Apple is going to be selling this year, it makes sense that they went with the more dependable 45nm process. It is either that or deal with not being able to make enough iPads to satisfy demand.

Samsung denying components out of spite is a ridiculous idea. They may be competing in the consumer products division, but in terms of components Apple is by far Samsung's largest client.
pedrostee - Sunday, March 18, 2012 - link
You appear certain in your reply... any evidence, or just blowing smoke?

Shadowmaster625 - Friday, March 16, 2012 - link
Why not? I'm sure at least 50 million dumb yuppies will rush out to throw away their ipoop 2 in favor of dropping yet another $500-$1000 on an ipoop 3.

And what do you get for that extra $500 to $1000? A slightly prettier homescreen to stare at after Safari crashes!
name99 - Friday, March 16, 2012 - link
Read most of the other comments in this thread. Then ask yourself if you are not, perhaps, a little ashamed of your particular contribution to this debate.

We all have bad days, we all make mistakes. But you can make up for it by, in future, perhaps considering the tone of the people around you and trying to match that tone.
ImSpartacus - Sunday, March 18, 2012 - link
Oftentimes, it's best to just avoid replying to such comments. More often than not, the individual is simply "trolling" for a reaction.

gorash - Saturday, March 17, 2012 - link
I'd say the new screen is pretty overrated. It doesn't look super different.

KoolAidMan1 - Saturday, March 17, 2012 - link
I'd say you're pretty blind. It is almost as bad as saying that anti-aliasing doesn't make a difference in games.

After a day with the iPad I want this sort of PPI in ALL of my monitors. My desktop monitors are high end NEC IPS displays, and now all I see are pixels and blockiness in text. Crossing fingers that yields get good enough to allow this sort of pixel density in desktop and laptop displays.
It is insane that a display of this quality exists in a consumer product that only costs $500.
steven75 - Sunday, March 18, 2012 - link
Uh, mods? How about some help here.

UpSpin - Friday, March 16, 2012 - link
You can't just switch to a different manufacturing process. Their A5, with both CPU and GPU, is designed and optimized for 45nm. They would have to invest a lot of money to make it work on 32nm or 28nm, and they would have to redesign both the CPU and the GPU. It would be stupid to do this with no further changes; Intel requires a whole new generation for such a move.

For Apple it would have been idiotic, because ARM Cortex A15 technology is available now, so they are better off using their knowledge and time to build a new A6 based on the Cortex A15 with a PowerVR Series 6 GPU, built on 28nm. They would probably have to invest the same money and time, but get much, much more.
You should rather ask why Apple still uses the Cortex A9 and hasn't switched to the A15 yet (look at Qualcomm; it's possible to have A15-class technology ready already).
name99 - Friday, March 16, 2012 - link
I've said it before, I'll say it again: to me, everything looks like the A5X is a plan B because the A6 (for whatever reason) was delayed.
Look at, for example, the separate RAM that is being used, rather than PoP as in the A5. I'm sure that Apple did not want this; they were forced into it by switching to the larger A5X, and probably were not in a position, in time, to integrate double the RAM into the package.
I also am guessing that this *externally driven* RAM is a substantial part of the requirement for a large battery. I find it hard to believe this much larger battery (which supposedly gives us much the same lifetimes for various tasks as the previous iPads) is required for the screen.
People seem to keep forgetting that we went through this transition before:

The iPhone 3GS had a 1219 mAh battery, the iPhone 4 had a 1420 mAh battery. The iPhone 4 was generally considered to have rather better battery life than the iPhone 3GS, especially in tasks like movie-watching. That seems to indicate that, contrary to the "wisdom" of web commenters, there is no intrinsic reason that a high-DPI screen has to burn power.
Sure, it is possible that this particular screen design burns power in a way that the iPhone 4's does not --- but I think the onus is on the people who claim this to provide proof, not to simply assert "well of course it does".
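As a rough check on that comparison, here is a quick back-of-the-envelope sketch. The iPhone capacities are the figures quoted above; the iPad watt-hour numbers are commonly cited teardown values and are an assumption here, not something stated in this thread:

```python
# Back-of-the-envelope: how much each battery actually grew across its
# "Retina" transition. iPhone numbers are from the comment above; the
# iPad watt-hour figures are assumed (commonly quoted teardown values).

def pct_increase(old, new):
    """Percentage growth from old to new."""
    return (new - old) / old * 100

iphone_3gs_mah, iphone_4_mah = 1219, 1420
ipad_2_wh, ipad_3_wh = 25.0, 42.5  # assumed values

print(f"iPhone 3GS -> iPhone 4: +{pct_increase(iphone_3gs_mah, iphone_4_mah):.1f}%")
print(f"iPad 2     -> iPad 3:   +{pct_increase(ipad_2_wh, ipad_3_wh):.1f}%")
# Roughly +16% for the iPhone transition versus roughly +70% for the iPad
# transition, which is the gap the rest of this thread is trying to explain.
```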
Steelbom - Friday, March 16, 2012 - link
But 2048x1536 is so much higher than 960x640. The GPUs will be needed more often to power that display, and the display may have needed a more powerful backlight.

name99 - Friday, March 16, 2012 - link
This is what I am saying. THINK. What determines the power usage of an LCD display? The backlight.
So why should more pixels over the same area require more energy? The total emitted light is the same.
Now in principle there COULD be effects related to more "borders" between pixels resulting in more dead area that doesn't channel light --- aperture effects. In practice,
(a) this seems to have been taken care of in the manufacturing process --- there was an article on it flooding the web about five days ago
(b) like I said, we have the example of the iPhone retina display. If that doesn't require a substantial boost in power over its predecessor, why should the iPad display be different?
There ARE more backlight LEDs in the iPad 3 screen. But that does not imply that more backlight power is being generated --- they may just be there to create a more even backlight. Certainly my iPad 1, while having a quite acceptable screen, had patches where light would bleed through a pure black image more so than in other regions of the screen.
Steelbom - Saturday, March 17, 2012 - link
You're assuming the backlight is the same, and not more powerful. With the iPhone, they only had to pack 614,000 pixels into a small area, but with the iPad they're dealing with five times more pixels on a larger area. They even mentioned that they had to (roughly remembering it) separate the "pixels" from the "signals" in the display and lift the former up so that the signals don't get crossed. That may make it thicker, and make it need a more powerful backlight.

Plus it requires more power from the GPUs to run the display. I doubt the RAM would increase power consumption by anything but a small amount.
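For reference, the raw pixel counts behind that "five times more pixels" figure, using the resolutions mentioned in this thread:

```python
# Raw pixel counts for the two displays being compared above.
iphone4_px = 960 * 640    # 614,400 pixels
ipad3_px = 2048 * 1536    # 3,145,728 pixels

print(f"iPhone 4: {iphone4_px:,} px")
print(f"iPad 3:   {ipad3_px:,} px")
print(f"Ratio:    {ipad3_px / iphone4_px:.2f}x")  # ~5.12x, i.e. "five times more pixels"
```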
ssj3gohan - Thursday, March 22, 2012 - link
Let's put some science into this thread to end the speculation.

The transmissivity (amount of light it lets through) of a screen is a function of the transmissivity of the pixels and the fill factor (fraction of the total area that is actually pixels and not black). In general: higher pixel density = lower fill factor and lower transmissivity of the individual pixels. That is exactly what is going on here; the new iPad screen has significantly lower fill factor and thus, for the same luminance, needs a stronger backlight.
This is the sole reason for the larger battery. The vast majority of power is sucked up by the screen even in the original iPad and iPad 2.
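A minimal sketch of that relationship, with entirely made-up fill factors and transmissivities just to show the direction of the effect (the real panel numbers are not public as far as I know):

```python
# Toy model: relative backlight drive needed to hit a target luminance,
# given how much light the panel actually passes. All panel numbers are
# illustrative assumptions, not measured iPad values.

def backlight_drive(target_nits, fill_factor, pixel_transmissivity):
    """Required backlight output scales inversely with panel throughput."""
    return target_nits / (fill_factor * pixel_transmissivity)

# Same target brightness, lower fill factor at the higher pixel density:
old_panel = backlight_drive(400, fill_factor=0.60, pixel_transmissivity=0.08)
new_panel = backlight_drive(400, fill_factor=0.45, pixel_transmissivity=0.07)

print(f"Increase in backlight drive: {new_panel / old_panel:.2f}x")  # ~1.5x with these assumptions
```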
tipoo - Friday, March 16, 2012 - link
There are already reviews out and they're all saying pretty much the same battery life as the 2.

UpSpin - Saturday, March 17, 2012 - link
You might be right that it was plan B, and that they had to make some unwanted decisions. But let's talk about power consumption:

The new SoC is similar to the old one, except for a second GPU. Power consumption of the chip might be 30% more. RAM does consume some power too, but not that much; or does RAM get noticeably hot?
But the display is the deal breaker. Just take a look at the Engadget post about the iPad screen under the microscope:
http://www.engadget.com/photos/the-new-ipads-lcd-u...
It's pretty obvious that horizontally no added black gap was introduced by the switch to the higher density, but vertically about twice as much black area was added! Additionally, each LC cell consumes power even when turned off, and now they have to control 4 times as many cells, so the panel without the backlight will consume at least 4 times more power (the panel doesn't consume that much at all compared to the LED backlight, but it's still an increase). So if you increase the pixel density you get worse transmittance, and thus you have to increase the backlight brightness. With a single row of LEDs they couldn't operate the LEDs in their most efficient region, so they had to add a second row to increase brightness.
Another way to think about it: the new battery is roughly 20 Wh larger, but it has the same battery life. So if you think it's because of the RAM and SoC, those two together would have to consume an additional 2 watts. That alone is ridiculous. (A rough version of this calculation is sketched after this comment.)
It's wrong to say it's the SoC only, it's totally wrong to say it's because of the added RAM, it's also wrong to say it's because of the display only, but it's mainly because of the display.
A higher backlight brightness (maybe twice as bright), a faster GPU, and more RAM are all necessary because of the higher resolution.
And placing the RAM at the side of or on top of the chip doesn't really change the power consumption; it's just a space saving, and thus a cost saving.
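A quick sketch of the arithmetic behind that 2-watt figure (the 10-hour battery life is Apple's quoted runtime; the 20 Wh delta is the approximation used in the comment above):

```python
# If the battery grows but the rated battery life stays the same, the extra
# capacity must be feeding extra average power draw somewhere.
extra_capacity_wh = 20.0  # approximate increase, as stated above
battery_life_h = 10.0     # Apple's quoted runtime for both generations

extra_average_power_w = extra_capacity_wh / battery_life_h
print(f"Extra average draw to account for: ~{extra_average_power_w:.1f} W")
# That is about 2 W, far more than RAM plus a second GPU cluster plausibly
# adds, which is why the display and its backlight get most of the blame.
```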
tipoo - Friday, March 16, 2012 - link
With Rogue, these mobile SoC GPUs are getting into, and maybe beyond, the 200 GFLOPS range (they said 20x the per-core performance of the 543), i.e. the same range as the PS3/360 GPUs. The current MP4 is about 30, I think. Do you think the limitation will be elsewhere for actual real-world graphics performance though? Last I checked these chips still didn't have the memory bandwidth of graphics cards even from 2005, and then there's processor performance and how large apps/games can be, not to mention controls. With so much potential in Rogue and future SoC chips, I hope the other problems are looked at too.

thefrick - Friday, March 16, 2012 - link
A 45nm A5X is a deal-killer for me. The "iPad 3" is essentially an underpowered version of the iPad 2 considering the display's high resolution and lack of CPU/GPU clock increases. The next iPad will benefit from a full node shrink to (presumably) 28nm on BOTH the CPU and the 4G baseband; likely in addition to new CPU (Cortex A15) and GPU architectures. The iPad 3 is shaping up to be a repeat of the iPhone 3G (read: only survives one iOS update before becoming slow enough to impair its usefulness).

This is in addition to the battery problems the iPad 3 is likely to experience: that 45nm A5X is BIG for a mobile SoC, and will be generating a lot of heat. Hot iPad innards = significantly diminished Li-Ion battery lifetime...
labrats5 - Friday, March 16, 2012 - link
I'm not sure about your iPhone 3G comparison. Apple's iOS updates seem to be more RAM dependent than anything else. A good example is that iPhoto runs on the iPhone 4 (512MB) but not the original iPad (256MB), even though the latter's SoC is faster. The new iPad's RAM was doubled, while the iPhone 3G's wasn't.

dagamer34 - Friday, March 16, 2012 - link
Except Apple's already given their quoted battery life times and there's no change... this is an example of taking "speeds and feeds" so far that you are about to fall off a cliff.

gorash - Saturday, March 17, 2012 - link
It's basically iPad 2 with a better screen. iPad 2S.

KoolAidMan1 - Saturday, March 17, 2012 - link
I am very curious to see practical benchmarks. It is possible that the GPU upgrade increased performance for things other than rendering video. Remember that Core Image, Core Video, and other components of iOS/OS X are GPU accelerated. Applications actually feel a little bit snappier than they do in the iPad 2.

It is a minor difference but it is there. Again, looking forward to Anandtech's review.
jjj - Friday, March 16, 2012 - link
So it's as big as Ivy Bridge; that's a more interesting comparison.

Mobile GPU war: Apple can't be part of such a war, so for such a war to exist we would need Android phone makers that have their own SoCs to go for huge die sizes, forcing Nvidia, Qualcomm and everybody else to do the same. But that would push phone prices up, so maybe it would be better to have no such wars before 20/22nm. There is also the matter of heat; a huge GPU could force lower CPU clocks (like it might just be doing right now in Apple's case).
For traditional PCs, consoles and TVs, obviously, a large GPU could work even before 20/22nm, since the dollar and heat budgets are less of a problem, but there isn't much of a point in going that way unless you've got the sales volume and the software.
If anything, I would much rather see 2-3x faster storage in phones and tablets.
A5 - Friday, March 16, 2012 - link
I'd much rather have a faster GPU than faster storage. The internal storage on phones is more than fast enough for now. Faster MicroSD cards would be nice, though.

jjj - Friday, March 16, 2012 - link
That shows how little you know. Storage impacts perf in a big way atm.

darkcrayon - Friday, March 16, 2012 - link
Even on a device that does not use a swap file? Once an app is launched, which takes on average a couple of seconds, where is all this slow disk i/o that's "bottlenecking" the experience? On the other hand, I can't think of many applications on something like the iPad that aren't using the GPU for *something*.

A5 - Friday, March 16, 2012 - link
Pretty much this. The only apps that don't fit into RAM are the large, graphically intense games. Most day-to-day apps load within a second.

Mobile SoCs have more of a problem with memory bandwidth than storage speed.
tipoo - Saturday, March 17, 2012 - link
You can't really compare it to Ivy Bridge though; IB will be using 22nm transistors so it will have a much higher density, and even Sandy Bridge is on 32nm now while Apple is on 45nm. Heck, even Tegra is on 40nm, so Apple's chips are less dense overall.

Roland00Address - Monday, March 19, 2012 - link
AMD Zacate/Brazos (aka Bobcat high power/Bobcat low power) die size is 75mm^2 on TSMC low-power 40nm. 40nm has about 25% greater transistor density than 45nm (I don't have numbers for TSMC vs Samsung transistor density at 45 vs 40nm, so I am assuming 1:1). So 75mm^2 * 1.25 = ~94mm^2.

The A5X is 162.94mm^2, or about 217% the size of the 75mm^2 AMD Zacate/Brazos on TSMC 40nm; once you take into account 40 vs 45nm, that is about 174% the size of Zacate.
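The same normalization as a quick sketch (the numbers are the ones used above; the 1.25 density factor is this comment's assumption, not a measured value):

```python
# Normalize Zacate's 40nm die size to a 45nm-equivalent area and compare
# it with the A5X. The 25% density advantage of 40nm over 45nm is the
# assumption made in the comment above.
zacate_40nm_mm2 = 75.0
a5x_45nm_mm2 = 162.94
density_factor_40_vs_45 = 1.25

zacate_45nm_equiv_mm2 = zacate_40nm_mm2 * density_factor_40_vs_45  # ~93.8 mm^2

print(f"A5X vs Zacate, raw:           {a5x_45nm_mm2 / zacate_40nm_mm2:.0%}")
print(f"A5X vs Zacate, node-adjusted: {a5x_45nm_mm2 / zacate_45nm_equiv_mm2:.0%}")
# Roughly 217% raw, and about 174% once the node difference is factored in.
```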
zanon - Friday, March 16, 2012 - link
That still remains one of the oddest parts of Tegra 3. As pointed out in Anand's own Medfield review, current ARM cores by themselves can be very easily choked by lack of memory bandwidth (handling that much better seems to be a major part of why Medfield did so well). With the Tegra 2 it was somewhat understandable, because it was quite an early part, but at the end of 2011, with a quad-core part and an updated (though still weak) GPU, it was very odd that Nvidia of all companies would stay on a single channel when everyone else had left that behind.

Tegra 3 was pretty disappointing. I very much hope, as you say here, that Wayne will be a major leap forward and really blow everyone's socks off. It's definitely going to be wicked exciting in 2013, with both Series 6 on the GPU side and big.LITTLE A15/A7 heterogeneous SoCs on the CPU side. We're still on such a strong upward curve in the mobile space; every year is bringing incredible leaps forward and massive competition. Like being back in the early/mid-90s all over again, but even better :).
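To make the bandwidth point concrete, here is the usual peak-bandwidth arithmetic as a minimal sketch. The channel widths and transfer rates below are illustrative assumptions for a single-channel versus a wider LPDDR2 setup, not official Tegra 3 or A5X specifications:

```python
# Peak theoretical DRAM bandwidth: channels x bus width x transfer rate.
def peak_bandwidth_gbs(channels, bus_width_bits, transfer_rate_mtps):
    """Bytes per transfer times transfers per second, in GB/s."""
    bytes_per_transfer = channels * bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mtps * 1e6 / 1e9

# Illustrative configurations (assumed numbers):
single_channel = peak_bandwidth_gbs(1, 32, 800)  # one 32-bit channel @ 800 MT/s
wide_lpddr2 = peak_bandwidth_gbs(4, 32, 800)     # four 32-bit channels @ 800 MT/s

print(f"Single channel: {single_channel:.1f} GB/s")  # ~3.2 GB/s
print(f"Four channels:  {wide_lpddr2:.1f} GB/s")     # ~12.8 GB/s
# Even the wider setup is well short of a 2005-era desktop graphics card,
# which already managed tens of GB/s, which is the comparison made above.
```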
tipoo - Saturday, March 17, 2012 - link
I've always felt like Tegra was designed for marketability over all else. Every Tegra revision was supposed to be the leader of mobile SoCs, but every time they turned out to be more hot air than performance. Quad + 1 cores is marketable; dual-channel memory to actually feed the cores isn't. An 8 "core" GPU is marketable, but it's handily slaughtered by a year-old Imagination Tech (SGX) chip.

Roland00Address - Monday, March 19, 2012 - link
1) They are cheap to make since the die size is so small. When carriers don't subsidize the device, margins can be small. Apple can make their chips bigger since A) they sell so much volume, and thus can push downward pressure on their marginal costs by buying in bulk, and B) they are the market leader, so they can charge more for their device.

2) Tegra 2 was the Android development platform for Android 3.x, thus everybody knows the software and you don't have to pay money to tweak it.
So good marketing combined with being cheap to make means you can make your money and sell the device.
iwod - Friday, March 16, 2012 - link
So the next SoC in the iPhone is coming up in 6 months' time. This definitely won't be the A5X, as it simply won't fit in an iPhone-sized device.

An A5X with a 28nm die shrink? But as someone stated above, this doesn't make any sense, because switching nodes requires tuning and redesign. It would be better if they simply designed the A6 around the new node.
So what will the A6 be? Cortex A15 + A7 with Rogue? Sounds great! But both the A7 and A15 aren't anywhere near ready in a few months' time. And it would take Apple a month to stock up on parts.
Any ideas?
Steelbom - Saturday, March 17, 2012 - link
I think we'll see a quad-core Cortex A15 in an iPhone, likely with 600 series graphics, and then the same in the iPad 4 but with roughly five times more powerful graphics.

dagamer34 - Saturday, March 17, 2012 - link
Going quad-core when they know they are going to have a dual-core Cortex A15 next year is silly and dumb.

Steelbom - Sunday, March 18, 2012 - link
What? They should have a quad-core Cortex A15 next year. The A6 found in iOS Beta is a quad-core, so it's pretty easy to guess that it'll likely be an A15 quad.

Mike1111 - Saturday, March 17, 2012 - link
IMHO, 162.94mm^2 makes it very unlikely that the A5X will end up in the next iPhone. An A6 makes more sense, maybe even with big.LITTLE A7+A15. It was said that there will be devices available with big.LITTLE by the end of the year, so if the next iPhone launches in October like it did last year it could happen (yes, Apple needs a lot of chips, but they also have the advantage of owning and designing both the chip and the phone, so the timing advantage and disadvantage might cancel each other out).

I mean, what are the alternatives? An A5X die shrink? An A6 with just 2xA15?
Steelbom - Sunday, March 18, 2012 - link
We won't see an A5X in the iPhone 5. The whole point of the extra graphics power is for powering the retina display in the iPad 3. It'll either stay with an SGX543MP2 or, more likely, we'll see some 600 series chips.

I'm actually hoping Apple opts for a faster dual-core Cortex A15 rather than a quad-core Cortex A15.
Mike1111 - Tuesday, March 20, 2012 - link
Maybe it's a "quad-core" in the sense of 2xA7 + 2xA15. ARM said it's also possible to expose all 4 cores to the OS (standard would be the OS sees either 2xA7 or 2xA15). Apple with full control over software and hardware could easily expose all 4 cores to iOS.