The netbook market has exploded since its introduction; there are over 40 million netbooks out in the wild today, which is tremendous growth for what is essentially a new class of PC. With such a large number of mobile devices, it's only natural for companies like NVIDIA to look for ways to get a slice of the netbook pie. As a graphics company, NVIDIA is quick to point out how poorly Intel's IGPs perform, and the GMA 950 paired with Atom netbooks is particularly slow. As an alternative to Atom+GMA 950, NVIDIA created the ION platform, which would provide dramatically improved graphics along with HD video decoding.

The first such implementation combined the GeForce 9400M chipset with an Atom N270/N280 for netbooks, or an Atom 330 for a nettop. A single-core Atom CPU is just barely able to handle 720p H.264 decoding on its own (with the CoreAVC codec—other less optimized codecs would still drop frames). 1080p support? Fahgeddaboutit! NVIDIA's ION nettops provided the necessary hardware to make a tiny HTPC box capable of handling Blu-ray playback, and the CPU and chipset are efficient enough that passive cooling isn't a problem.

On the netbook side, ION was a tougher sell. 1080p support is a nice bullet feature, but when most netbooks have 1024x600 LCDs, does HD support really matter? Plus, you would need an external USB Blu-ray drive to make it work. There's still gaming, and with Flash 10.1 (now at Beta 3) acceleration you can certainly argue that an ION netbook provides a superior user experience compared to stock Atom netbooks, but the caveats don't end there. NVIDIA stated that their chipset power requirements were "competitive" with Intel's chipset, but they appear to be factoring performance into the equation. Our own numbers suggest that a good GMA 950+N280 solution is anywhere from 17% (H.264 decode) to over 35% (Internet surfing) more power efficient than the original ION—that's using the ASUS 1005HA and the HP Mini 311 as points of reference. So you'd get more features, but at the cost of battery life and a higher price.
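As a quick aside on how such efficiency figures are derived: given equal battery capacity, average power draw is inversely proportional to runtime, so relative efficiency falls out of two runtime measurements. Here's a minimal sketch of that math, using hypothetical runtimes rather than our actual test numbers:

```python
# Relative power efficiency from battery runtimes, assuming equal
# battery capacity (Wh) in both machines. The runtimes below are
# hypothetical placeholders, not measured results from this review.
def relative_efficiency(runtime_a_hours, runtime_b_hours):
    """Return how much more power efficient machine A is than B,
    as a fraction (0.35 == 35% lower average power draw)."""
    power_a = 1.0 / runtime_a_hours  # average draw, arbitrary units
    power_b = 1.0 / runtime_b_hours
    return (power_b - power_a) / power_b

# e.g. 9.2 hours vs. 6.0 hours of Internet surfing on the same battery
print(f"{relative_efficiency(9.2, 6.0):.0%} more efficient")  # prints "35% more efficient"
```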

Things got quite a bit more complicated with the release of the Pine Trail platform and Pineview processors. Besides the fact that Pine Trail is even more power efficient (up to 70% more battery life relative to ION in Internet testing), Intel moved their IGP solution into the CPU package and eliminated the old FSB link. The Atom N450 links to the NM10 chipset with a proprietary DMI connection and NVIDIA doesn't make—and legally can't make—a compatible chipset, so using a non-Intel chipset with N450 simply isn't an option. The problem with Pine Trail is that HD video decoding remains difficult, unless you add a separate decoder chip, and gaming and other aspects of the user experience are still lackluster—N450 netbooks typically make do with Windows 7 Starter. Lucky for NVIDIA, they have some new technology called Optimus that makes all of this a moot point.

If you've got any math skills, you've probably already put two and two together to figure out what NVIDIA is announcing today. The Next Generation ION (NG-ION) platform consists of a Pineview netbook with a discrete graphics chip from NVIDIA, with Optimus allowing the GPU to switch on/off as needed. Note that there is no Optimus technology for nettop solutions, which will simply use an NVIDIA discrete GPU all the time. On a nettop that's always plugged in, NG-ION might use ~3W more power at idle, but that's not enough to worry about. There's also a benefit to just keeping things simple by using a standard discrete GPU.
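To put that ~3W in perspective, here's a back-of-the-envelope calculation of what the extra idle draw costs over a year on an always-on nettop (the electricity rate is our own assumption, not a figure from NVIDIA):

```python
# Annual energy and cost of a constant 3 W idle-power delta on an
# always-on nettop. The $/kWh rate is an assumed ballpark figure.
extra_watts = 3.0
hours_per_year = 24 * 365
kwh_per_year = extra_watts * hours_per_year / 1000.0  # ~26.3 kWh
cost_per_kwh = 0.12  # assumed rate, $/kWh
print(f"{kwh_per_year:.1f} kWh/year, ~${kwh_per_year * cost_per_kwh:.2f}/year")
```

At roughly three dollars a year, the "discrete GPU always on" approach really is a non-issue for a plugged-in box.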

Simplifying NG-ION like we just did is great for the layman, but there are plenty of other technical aspects to discuss that make things a bit more interesting. We don't have hardware for testing, so all we can pass along is NVIDIA's performance information, but it makes sense as we'll see in a moment. We'll also discuss some of the implementation-specific details, expected availability, etc.

Getting Technical with Next Generation ION

  • yyrkoon - Wednesday, March 3, 2010 - link

    One of the things that gets me is that they will not / can not port this technology to the desktop. Would it not be great to have switchable graphics on a low-powered IGP platform, and then get a boost when you need / want it? But nvidia still drives up the power required to use parts on the desktop.

    But, let me back up a minute. Would it not be nice to have a mobile part in a desktop for max efficiency? Let's say something like the equivalent of the 250M, with very low power usage, but very good performance for the power usage statistics? I am thinking ~35-40W max under load.

    Even the 7600GT, for its time, could not beat these power usage numbers, and for a single monitor at around 1440x900 it did not perform terribly. That, and the 7600GT was one of the most power-thrifty discrete cards offered for the desktop that gave decent performance at or around this resolution. Am I wrong in thinking the 250M GPU could trump the 7600GT in both of these areas? If I am, then I am sure there is something that *can*.

    Also, look, I am pro Microsoft. I really like Windows 7, especially the 64-bit variant of Ultimate. It runs really nicely on a "cheap" laptop with only a T3400 CPU, but with 4GB of memory. Anyways, what is up with nvidia and their "nothing but Windows" stance on this? Then again, is there something wrong with the other hardware available to make better use of this current technology? ARM comes to mind, as well as a different CPU produced by Intel, or even AMD.

    Maybe the above is moot, because there is already something to fill those gaps, or they do not want to compete with themselves because of the new emerging hardware (based on ARM, was it?) they seem to have announced recently. I really do not know the whole story, but it does seem rather short-sighted to me that they would limit this hardware to a single software platform, no matter which it is. Give your customers freedom while using your hardware, and perhaps they will respond in kind by buying your hardware to begin with (and all that).
  • Penti - Tuesday, March 2, 2010 - link

    Twice as fast? What are you on?

    It's game-able with a 9600M; it's not really game-able with the integrated 9400M.
  • JarredWalton - Tuesday, March 2, 2010 - link

    Okay, so it's "over twice as fast". It's still not a performance part. 3DMark isn't usually the best source of data for true performance. Looking to actual games, 9600M typically scores around 2 to 3 times as high as 9400M. The 9400M achieves playable frame rates at minimum details and 800x600 in nearly all games, but only about half are playable at 1366x768. Something like a 9600M is playable in all titles at 1366x768. It's still pretty anemic compared to a $100 desktop card, or a 9800M part.
  • Penti - Wednesday, March 3, 2010 - link

    I was looking at the games (which is included in most reviews/benchmarks at that site).

    9400M does fairly well on a high-speed CPU though, I'll give you that. But it's still a pain to run most games.

    Dedicated memory helps; I wonder if the NG-ION will be helped by it. Looks like it will be pretty low bandwidth. The 9600M is old of course, but not much else has been available. Of course I'd rather see, say, a Mobility HD5650, but that's still only comparable in performance to a 9800M GS. They fit the power envelope though. But that won't happen till they move to Core i lappys, for Apple's part. But of course even the difference between the 9400M and 9600M can feel enormous. You don't really need to be able to play at higher resolutions than the lappy screen either way. I do agree that it's pretty anemic any way though, especially for the 17" MacBook Pro, but then again it's not a gaming computer. It's not the same as desktop, where you need to game at around 1920x1200 and have screens up to 2560x1440. Being able to play at all is pretty good on a laptop.
