Intel's Ivy Bridge: An HTPC Perspective
by Ganesh T S on April 23, 2012 12:01 PM EST - Posted in
- Home Theater
- Intel
- HTPC
- Ivy Bridge
Before proceeding to the business end of the review, let us take a look at some power consumption numbers. The G.Skill ECO RAM was set to DDR3-1600 during the measurements. We measured the average power drawn at the wall under different conditions. In the table below, the Blu-ray movie from the optical disc was played using CyberLink PowerDVD 12. The Prime95 + Furmark stress test was run for 1 hour before any measurements were taken. The MKVs were played back from a NAS attached to the network. The testbed itself was connected to a GbE switch (as was the NAS). In all cases, a wireless keyboard and mouse were connected to the testbed.
Ivy Bridge HTPC Power Consumption

| Scenario | Average Power at the Wall |
| --- | --- |
| Idle | 37.7 W |
| Prime95 + Furmark (full load) | 127.1 W |
| Blu-ray from optical drive | 57.6 W |
| 1080p24 MKV playback (MPC-HC + QuickSync + EVR-CP) | 47.1 W |
| 1080p24 MKV playback (MPC-HC + QuickSync + madVR) | 49.8 W |
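For context, here is a small back-of-the-envelope calculation using the measured figures above; the two-hour movie length and the electricity rate are assumptions for illustration only:

```python
# Illustrative arithmetic based on the measured wall-power figures above
IDLE_W = 37.7
BLURAY_W = 57.6
MKV_EVR_CP_W = 47.1
MKV_MADVR_W = 49.8

movie_hours = 2.0     # assumed movie length
rate_per_kwh = 0.12   # assumed electricity rate, USD per kWh

bluray_kwh = BLURAY_W * movie_hours / 1000
print(f"2-hour Blu-ray: {bluray_kwh:.3f} kWh (~${bluray_kwh * rate_per_kwh:.3f})")
print(f"Blu-ray playback over idle: {BLURAY_W - IDLE_W:.1f} W")
print(f"madVR over EVR-CP for MKV playback: {MKV_MADVR_W - MKV_EVR_CP_W:.1f} W")
```

In other words, a full movie costs only around a tenth of a kWh at the wall, and switching from EVR-CP to madVR adds less than 3 W on this platform.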
The Ivy Bridge platform ticks all the checkboxes for the average HTPC user. Setting up MPC-HC with LAV Filters was a walk in the park. With good and stable support for the DXVA2 APIs in the drivers, even software like XBMC can take advantage of the GPU's capabilities. The QuickSync decoder and the DXVA decoder are equally efficient, and essential video processing steps such as cadence detection and deinterlacing work beautifully.
For advanced users, the GPU is capable of supporting madVR for most usage scenarios even with slow memory in the system. With fast, low-latency DRAM, it is even possible that madVR can be used as a renderer for the most complicated streams. More investigation needs to be carried out to check the GPU's performance under different madVR algorithms, but the initial results appear very promising.
Does this signify the end of the road for the discrete HTPC GPU? Unfortunately, that is not the case. The Ivy Bridge platform is indeed an HTPC dream come true, but it is not future proof. While Intel will end up pleasing a large HTPC audience with Ivy Bridge, there are still a number of areas which Intel seems to have overlooked:
- Despite the rising popularity of 10-bit H.264 encodes, the GPU doesn't seem to support decoding them in hardware. That said, software decoding of 1080p 10-bit H.264 is not complex enough to overwhelm the i7-3770K (though that may not be true for the lower-end CPUs). A quick way to identify such clips is shown in the sketch after this list.
- The video industry is pushing 4K, and to many people it makes more sense than the 3D push did. 4K should see a much faster rate of adoption than 3D, but Ivy Bridge seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but none of the current motherboards for Ivy Bridge CPUs support 4K over HDMI.
- It is not clear whether the Ivy Bridge GPU supports decode of 4K H.264 clips. With the current drivers and LAV Filter implementation, 4K clips were decoded in software mode. This could easily be fixed through a driver / software update. In any case, without the ability to drive a 4K display, the capability would be of limited use.
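Since hardware decoders typically fall back to software decoding silently, it can be useful to check in advance whether a clip is 10-bit (or 4K) before expecting QuickSync/DXVA to handle it. Below is a minimal sketch using ffprobe to inspect the first video stream; the file name is a placeholder and ffprobe is assumed to be installed:

```python
# Check whether a clip is 10-bit H.264 (e.g. pix_fmt yuv420p10le),
# which Ivy Bridge's fixed-function decoder does not handle in hardware.
import subprocess

def video_stream_info(path):
    out = subprocess.check_output(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_streams", path],
        universal_newlines=True)
    info = {}
    for line in out.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            info[key] = value
    return info

info = video_stream_info("sample.mkv")  # placeholder file name
codec, pix_fmt = info.get("codec_name"), info.get("pix_fmt")
print(codec, pix_fmt, info.get("width"), "x", info.get("height"))
if codec == "h264" and "10" in (pix_fmt or ""):
    print("10-bit H.264: expect software decoding on Ivy Bridge")
```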
Discrete HTPC GPUs are necessary only if one has plans to upgrade to 4K in the near term. Otherwise, the Ivy Bridge platform has everything that an HTPC user would ever need.
70 Comments
anirudhs - Monday, April 23, 2012
I can barely notice the difference between 720p and 1080i on my 32" LCD. Will people notice the difference between 1080p and 4K on a 61" screen? It seems we have crossed the point where improvements in HD video playback on Sandy Bridge and post-Sandy Bridge machines are discernible to normal people with normal screens.
I spoke to a high-end audiophile/videophile dealer, and he tells me that the state of video technology (Blu-Ray) is pretty stable. In fact, it is more stable than it has ever been in the past 40 years. I don't think "improvements" like 4K are going to be noticed by those other consumers in the top 1%. This seems like a first-world problem to me - how to cope with the arrival of 4K?
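One way to put the question above on a quantitative footing is to compute the pixels per degree of visual angle at a given screen size and seating distance, and compare it against the roughly 60 pixels per degree often quoted for 20/20 vision. The sketch below does this for the 61-inch case; the 8-foot viewing distance is an assumption for illustration:

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """Approximate horizontal pixels per degree of visual angle."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    px_size_in = width_in / horiz_px
    # visual angle subtended by a single pixel, in degrees
    deg_per_px = math.degrees(2 * math.atan(px_size_in / (2 * distance_in)))
    return 1 / deg_per_px

# 61" screen viewed from ~8 feet (96 inches); numbers are illustrative
for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(name, round(pixels_per_degree(61, w, h, 96), 1), "px/deg")
```

By this rough measure, a 61-inch 1080p screen viewed from 8 feet is already close to the acuity limit, so the extra resolution of 4K mainly shows up at closer seating distances or on larger screens.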
digitalrefuse - Monday, April 23, 2012
... Anything being discussed on a Web site like Anandtech is going to be "a first-world problem"... That being said, there's not much of a difference between 720 lines of non-interlaced picture and 1080 lines of interlaced picture... If anything a 720p picture tends to be a little better looking than 1080i.
The transition to 4K can't come soon enough. I'm less concerned with video playback and more concerned with desktop real estate - I'd love to have one monitor with more resolution than two 1080P monitors in tandem.
ganeshts - Monday, April 23, 2012
OK, one of my favourite topics :) Why does an iOS device's Retina Display work in the minds of the consumers? What prevents one from wishing for a Retina Display in the TV or computer monitor? The latter is what will drive 4K adoption.
The reason 4K will definitely get a warmer welcome compared to 3D is the fact that there are no ill effects (eye strain / headaches) with 4K, unlike 3D.
Exodite - Monday, April 23, 2012
We can certainly hope, though with 1080p having been the de facto high-end standard for desktops for almost a decade I'm not holding my breath. Until there's an affordable alternative for improving vertical resolution on the desktop I'll stick to my two 1280*1024 displays.
Don't get me wrong, I'd love to see the improvements in resolution made in mobile displays spill over into the desktop but I'd not be surprised if the most affordable way of getting a 2048*1536 display on the desktop ends up being a gutted Wi-Fi iPad blu-tacked to your current desktop display.
aliasfox - Monday, April 23, 2012
It would be IPS, too! :-P
Exodite - Monday, April 23, 2012
Personally I couldn't care less about IPS, though I acknowledge some do. Any trade-off in latency or ghosting just isn't worth it, as accurate color reproduction and better viewing angles just don't matter to me.
ZekkPacus - Monday, April 23, 2012
Higher latency and ghosting that maybe one in fifty thousand users will notice, if that. This issue has been blown out of all proportion by the measurable-stats-at-all-costs brigade - MY SCREEN HAS 2MS SO IT MUST BE BETTER. The average human eye cannot detect any kind of ghosting/input lag in anything under a 10-14ms refresh window. Only the most seasoned pro gamers would notice, and only if you sat the monitors side by side. A slight loss in meaningless statistics is worth it if you get better, more vibrant looking pictures and something where you CAN actually see the difference.
SlyNine - Tuesday, April 24, 2012
I take it you've done hundreds of hours of research and documented your studies and methodology so we can look at the results. What if Anand did video card reviews the same way you're spouting out these "facts"? They would be worthless conjecture, just like your information.
Drop the "but it's a really small number" argument. Until you really document what the human eye/brain is capable of, all you're saying is that it's a really small number.
Well, a THz is a really small number too, and the human body can pick up things at 700 terahertz. It's called the EYE!
Exodite - Tuesday, April 24, 2012
Look, you're of a different opinion - that's fine. I, however, don't want IPS.
Because I can't appreciate the "vibrant" colors, nor the better accuracy or bigger viewing angles.
Indeed, my preferred display has a slightly cold hue and I always turn saturation and brightness way down because it makes the display more restful for my eyes.
I work with text and when I don't do that I play games.
I'd much rather have a 120Hz display with even lower latency than I'd take any improvement in areas that I don't care about and won't even notice.
Also, if you're going to make outlandish claims about how many people can or cannot notice this or that you should probably back it up.
Samus - Tuesday, April 24, 2012
Exodite, you act like IPS has awful latency or something. If we were talking about PVA, I wouldn't be responding to an otherwise reasonable argument, but we're not. The latency between IPS and TN is virtually identical, especially to the human eye and mind. High-speed (1/1000 s) cameras are required to even measure the difference between IPS and TN.
Yes, TN is 'superior' with its 2ms latency, but IPS is superior with its <6ms latency, 97.4% Adobe RGB accuracy, 180-degree viewing angles in both planes, and lower power consumption/heat output (in either LED or cold-cathode configurations) due to less grid processing.
This argument is closed. Anybody who says they can tell a difference between 2ms and sub-6ms displays is being a whiny bitch.
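As a quick sanity check on the 2 ms versus 6 ms debate above, the quoted response times can be compared against the frame interval at common refresh rates (a minimal arithmetic sketch; the figures are simply the ones quoted in the thread):

```python
# Frame interval vs. quoted panel response times (illustrative arithmetic only)
refresh_rates_hz = [60, 120]
response_times_ms = {"TN (quoted)": 2.0, "IPS (quoted)": 6.0}

for hz in refresh_rates_hz:
    frame_ms = 1000.0 / hz
    print(f"{hz} Hz -> {frame_ms:.1f} ms per frame")
    for panel, rt in response_times_ms.items():
        print(f"  {panel}: {rt:.0f} ms = {rt / frame_ms:.0%} of a frame")
```

At 60 Hz both figures are a small fraction of a single frame; at 120 Hz the 6 ms figure becomes a much larger fraction of the frame interval, which is roughly where the disagreement in the thread lies.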