CEO Dr. Lisa Su has just left the stage at AMD’s financial analyst day, where she presented an update on AMD’s computing and graphics business. As AMD already covered its technology roadmaps for the next two years earlier in the presentation, we’ll jump right into the new material.

Not mentioned on AMD’s GPU roadmap, but now confirmed by Dr. Su, is that AMD will be launching new desktop GPUs this quarter. AMD is not saying much about these new products quite yet, though based on the company’s description it sounds like we’re looking at high-performance products (and for anyone asking, the picture of the card is a placeholder; AMD doesn’t want to show any pictures of the real product quite yet). These new products will support DirectX 12, though I will caution against confusing that with Feature Level 12_x support until we know more.

Meanwhile the big news here is that these forthcoming GPUs will be the first AMD GPUs to support High Bandwidth Memory (HBM). AMD’s GPU roadmap coyly labels this as a 2016 technology, but in fact it is coming to GPUs in 2015. The advantage of going with HBM at this time is that it will allow AMD to greatly increase their memory bandwidth while bringing down power consumption. Couple that with the fact that any new GPU from AMD should also include the company’s latest color compression technology, and the implication is that the effective increase in memory bandwidth should be quite large. AMD sees this as one of the keys to delivering better 4K performance along with better VR performance.
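To give a rough sense of the scale involved, here is a back-of-the-envelope sketch in Python. The figures are assumptions on our part based on published first-generation HBM and GDDR5 specifications, not anything AMD has confirmed for these products:

```python
# Back-of-the-envelope peak memory bandwidth. The HBM numbers assume
# first-generation HBM (1024-bit per stack at 1 Gbps/pin) and a guessed
# stack count of 4; the GDDR5 numbers match a 512-bit card like the R9 290X.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth in GB/s: pins times per-pin rate, converted to bytes."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

gddr5 = bandwidth_gb_s(512, 5.0)       # 320 GB/s (R9 290X class)
hbm = 4 * bandwidth_gb_s(1024, 1.0)    # 512 GB/s across 4 HBM stacks

print(f"GDDR5: {gddr5:.0f} GB/s | HBM: {hbm:.0f} GB/s | {hbm / gddr5:.1f}x")
```

And that is before factoring in color compression, which raises the effective (rather than raw) bandwidth further.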

In the process AMD has also confirmed that these HBM-equipped GPUs will allow them to experiment with new form factors. By placing the memory on the same package as the GPU, AMD will be able to save space and produce smaller cards, allowing for designs other than the large 10”+ boards that are typical of high-end video cards. AMD competitor NVIDIA has been working on HBM as well and has already shown off a test vehicle for one such card design, so we have reason to expect that AMD will be capable of something similar.


With apologies to AMD: NVIDIA’s Pascal Test Vehicle, An Example Of A Smaller, Non-Traditional Video Card Design

Finally, while talking about HBM on GPUs, AMD is also strongly hinting that they intend to bring HBM to other products as well. Given their product portfolio, we consider this to be a pretty transparent hint that the company wants to build HBM-equipped APUs. AMD’s APUs have traditionally struggled to reach peak performance due to a lack of memory bandwidth – 128-bit DDR3 only goes so far – so HBM would be a natural fit for APUs.
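To put that bandwidth ceiling in perspective, a quick calculation (DDR3-2133 is an optimistic assumption on our part; many systems run slower memory):

```python
# How far "128-bit DDR3 only goes so far" actually goes. DDR3-2133 is an
# optimistic assumption; DDR3-1600 is more common in practice.

def ddr_bandwidth_gb_s(bus_width_bits, transfers_mt_s):
    """Peak bandwidth in GB/s for a DDR memory interface."""
    return (bus_width_bits / 8) * transfers_mt_s / 1000

print(ddr_bandwidth_gb_s(128, 2133))  # ~34 GB/s, dual-channel DDR3-2133
print(ddr_bandwidth_gb_s(128, 1600))  # ~26 GB/s, dual-channel DDR3-1600
# For comparison, a single first-generation HBM stack is ~128 GB/s.
```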

Comments

  • 01189998819991197253 - Friday, May 8, 2015 - link

    @NinjaFlo
    Unfortunately Nvidia can't figure out HDMI handshaking. It's a shame because I would love to use an Nvidia GPU in my HTPC, but HDMI handshake issues are a deal breaker. This issue has persisted for years and Nvidia hasn't even acknowledged it.
  • Hicks12 - Friday, May 8, 2015 - link

    @NinjaFlo, see, this is how chizow should structure his messages... I wouldn't say he has balls for being an ignorant poster who simply dismisses every AMD-based post as completely made up and insists Nvidia will always be best because of its halo product (this is why Nvidia rushes to have the most expensive 'halo' card, as it means people like him religiously buy Nvidia :D).

    But anyway, I see your criteria for a 'good' card for your own preferences, and it's great that you found the card and are happy with that purchase (it's awesome when we actually buy a card!). Can I just focus on that efficiency criterion though?

    Going by Anandtech's benchmarks (http://www.anandtech.com/bench/product/1036?vs=105...),
    the R9 290X does beat the 780 in almost every game benchmark apart from Thief, GRID, and BioShock (depending on resolution); it loses by a very small margin in those but wins by a tangible amount elsewhere.

    The power usage in Crysis 3:
    780: 327W
    R9 290X: 365W

    That's not much... almost 12% more power, but the FPS is for the most part more than 10% greater than the 780's, so it seems to be fairly damn even on efficiency. If 40 watts is a crucial amount then the PSU is really being pushed way too far :P, at least a bit of breathing room is required, ha!

    If the PC was on 24/7 at load then that 40W difference would be £56 a year (at least for me, on my average energy tariff). Don't know about you, but if you game 24/7 then you're lucky and that makes it a fair comparison; personally I could only muster about 3 hours of free time a night at a push due to other commitments, so it would be much less for me (£7)...
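    If anyone wants to sanity-check that, here's a quick Python sketch (the ~£0.16/kWh tariff is my assumption, picked to match the figures above):

    ```python
    # Extra yearly electricity cost of a 40 W power gap between two cards.
    # The 0.16 GBP/kWh tariff is an assumption chosen to match the quoted sums.

    def annual_cost_gbp(extra_watts, hours_per_day, tariff_per_kwh=0.16):
        """Yearly cost in GBP of drawing extra_watts for hours_per_day."""
        kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        return kwh_per_year * tariff_per_kwh

    print(annual_cost_gbp(40, 24))  # 24/7 at load: ~56 GBP
    print(annual_cost_gbp(40, 3))   # ~3 hours a night: ~7 GBP
    ```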

    Depending on when you bought it, obviously, the price could have been the same or different; at launch the 780 was $649 but the R9 290X was $550 (guess that makes up for the difference in power? :D).

    The stock coolers of the R9 290 series were horrible; the one thing I always say Nvidia has done right is providing a good stock cooler. Third-party coolers had no issue with the R9 290 series though, so it was hardly any different in noise/temps compared to the 780 when both had the same third-party cooler.

    At the end of the day people pick what they want. You're spot on that the GPU is decided by the person's criteria; if someone asks me to spec a PC I always ask what games they're playing, as ultimately that is what everyone cares about, and some games just play horribly on AMD or Nvidia, so it's game dependent (to a point, obviously).

    I think my reply has turned into something else. I am agreeing with you, but trying to dispel the idea that the R9 290X is an inefficient beast compared to the 780; it's not, as it provides a similar increase in FPS relative to the power it draws, and it doesn't draw much more than the 780 to begin with. Comparing it to the 980 is a different matter; Nvidia and AMD have had different development schedules since the beginning of time, so of course it's Nvidia's turn to be a 'generation' ahead of AMD.

    Anyway, back to enjoying the wonders of pc gaming :D
  • chizow - Saturday, May 9, 2015 - link

    lmao, I've structured them like this in the past but I really shouldn't have to. It should be obvious to anyone who is interested in these parts that Nvidia wins in pretty much every consideration other than price:perf, and even then it is always close enough, except at the ultra high-end, that Nvidia offers a relevant alternative at a slight premium.

    Being called ignorant by someone who can't even follow a simple process/generation discussion, however, is certainly a first.
  • chizow - Saturday, May 9, 2015 - link

    @NinjaFlo haha great post man, and right on with the comprehensive list of objective criteria. It's something I've laid out many times to AMD fanboys in giving examples of the features, tech, and considerations that make it a no-brainer to go with Nvidia, but then you just get responses in return from AMD fanboys downplaying or marginalizing such features as unimportant, not bugs, bad for PC gaming etc. Wonder where they get that from? haha.

    It's like straight out of AMD's "marketing" playbook. AMD fanboys love to throw out marketing as if it is some nebulous, pejorative thing out there, but when Nvidia and their fans point to actual features and support they use and enjoy daily, it's all negative marketing responses in reply. I guess there is a big difference in marketing strategy though: Nvidia markets awesome stuff, AMD markets why you shouldn't care about that awesome stuff. In the end, they certainly do serve their respective portions of the market. Nvidia markets to the overwhelming majority of the market that is willing to pay a slight premium for better features/support; AMD markets to tech bottomfeeders that need to save a few bucks above all else.

    In any case you will want to wait for sure and hold off on that 295X2, as AMD doesn't support HDMI 2.0 on any of their current parts. That will greatly limit your 4K options to actual desktop monitors, but the option to run either an HDTV or a G-Sync 4K monitor would be important to me at this point, I think.

    But yes, in the end it is about buying products that make PC gaming better, which is another reason to favor Nvidia given all the work and money they've invested into tech like GameWorks that makes the PC gaming experience better than consoles. I don't think you can just leave it at pcmasterrace anymore though lol, my new thing is:

    #geforcemasterrace, and I hate the whole hashtag business! :)
  • Hicks12 - Saturday, May 9, 2015 - link

    You do realize that DisplayPort 1.2 is the most common connector used for 4K monitors.... I haven't even seen an HDMI 2.0 monitor!

    It's important to note G-Sync doesn't support anything other than DisplayPort.... Says it all really, right? DisplayPort is the main connection on modern GPUs :).

    You still can't admit that AMD and Nvidia both produce good GPUs and that ultimately it comes down to the individual. Stop making the sweeping statement that Nvidia is better at everything, because it's flat out bullshit and makes you look like a troll haha.

    Price:performance is a serious category... You can't ignore that unless you're a die-hard fan of the company and will buy anything they release.

    It's pointless even trying to get this through to you; as others have said, you don't listen and just keep saying the same old crap :).
  • chizow - Monday, May 11, 2015 - link

    Again, where do I confuse the two? You do realize that you have hundreds more options at 4K using HDMI 2.0 in the HDTV space, and often get a better quality panel (IPS, vs. desktop monitors that are still mainly TN), right? I clearly state HDTV *OR* G-Sync 4K; it is all about leaving your options open, and Nvidia provides that second option with HDMI 2.0 support.

    Where did I say AMD didn't produce good GPUs? None of which changes the fact that Nvidia produces BETTER ones at similar price points. Again, you can say it comes down to the individual, but that does NOT change the fact that Nvidia offers more features and better support for those features in games, which results in a superior end-user experience. So yes, it is going to simply come down to how much importance you place on price over these other considerations, but as we have seen, the overwhelming majority of the market prefers Nvidia even when AMD leads in areas like price:performance (see the 290/X getting slaughtered in the marketplace compared to the GTX 970/980).

    It is pointless to keep pointing this out to you, as NinjaFlo and others have done, when you simply don't understand: there is more that goes into an end-user experience and a buying decision than the FPS in the corner of your screen and the price in your cart, and FreeSync vs. G-Sync is just one more of those considerations.
