I stopped by Imagination Technologies at CES 2013 to talk about their upcoming GPU IP blocks and a few recently announced SoCs, and one demo running on one such SoC caught my eye. This is the year we finally get to see PowerVR Series6 Rogue shipping in silicon and in the flesh, and the SoC in this case is LG's home-entertainment-oriented H13 (hence the H), announced at the show. 

Imagination was showing off the H13 running a variety of their own custom-built demos highlighting OpenGL ES 3.0 features we've gone over already, such as multiple render targets, occlusion queries, expanded MSAA support, and new texture formats. We don't know much about LG's H13 at this point beyond the fact that it likely isn't an ARM Cortex A15 based design, but it does include a two-cluster Series6 GPU (G6200?) at unspecified clocks. Alongside that live demo on real production silicon was a four-cluster Series6 GPU being simulated on an FPGA inside a PC, running much slower (obviously) thanks to very constrained memory bandwidth and clocks. 

I'm told that LG's H13 isn't necessarily the first production silicon with Rogue inside, but it is the first that's been shown and demoed in the wild. That alone makes it particularly exciting. I expect to see more Rogue designs emerge later this year. 

The second piece of news concerns a subset of OpenGL ES 3.0 features that will be enabled retroactively on all PowerVR Series5XT hardware. The entire lineup apparently has the hardware to expose these OpenGL ES 3.0 features as optional extensions inside OpenGL ES 2.0. Features include multiple render targets; occlusion queries; seamless cube maps; sampler access from vertex shaders; floating point textures; GLSL full-precision floating point; R and RG textures; min/max blends; and multisample render buffers. 

Imagination ran some of the same demos on an OMAP5430 development board with an SGX544MP2 GPU. I have no doubt that Imagination has enabled some of these ES 3.0 features retroactively so that some of their major customers can transition seamlessly to OpenGL ES 3.0 without leaving OpenGL ES 2.0 devices behind. 

Comments

  • Camzl1 - Saturday, January 12, 2013 - link

    I love the photos, and how the laptops are running some version of Ubuntu with Unity and still have the Windows logo sticker stuck to them.
  • Daniel Egger - Sunday, January 13, 2013 - link

    Wow, those are really bad. I take it they've been taken with the famous Samsung Android camera you talked so vividly about in the podcast?
  • Alexvrb - Sunday, January 13, 2013 - link

    Hey... you don't buy a Samsung for the camera. You buy it for the... uh... well you just buy it.

    All jokes aside, other than the camera, I've had good luck with Samsung hardware in the past. Good build quality, compared to my experience with LG and HTC anyway. Unfortunately, the only decent Samsung WP8 device (Ativ S) got skipped over by Verizon (Andrizon? Veridroid?) and instead they launched the "baby brother" Ativ Odyssey.

    I might give Nokia a whirl though. Owners of 820 (and derivatives) and 920 seem quite happy, plus they've got some nice exclusive free software (which can be removed without rooting if I don't like any of it, and redownloaded at will).
  • KitsuneKnight - Sunday, January 13, 2013 - link

    Daniel wasn't talking about the camera on Samsung's phones, but the "Samsung Galaxy Camera"... which is a point-and-shoot camera that runs Android, that they talked about in the latest podcast.
  • Brian Klug - Sunday, January 13, 2013 - link

    Which ones do you think are hideously bad? I thought the board shots and most of what comes out of that camera (while not DSLR or micro 4/3rds level) are pretty decent.

  • Daniel Egger - Monday, January 14, 2013 - link

    The first one with volumetric lighting is really awful, and the one with the AA comparison seems to have some really nasty compression artifacts. But most of them look rather out of focus and/or blurry.

    The board shot is indeed one of the better ones though I think the camera might have focused on the power cable rather than the GPU.

    I'm quite sure that even my venerable Panasonic LX-3 would have yielded much better results in that environment.
