Corsair Graphite 760T: Introduction and Packaging

Corsair has been releasing one case after another lately, expanding an already extensive lineup with an even greater variety of products. It has been less than three months since the release of the Obsidian 250D, a cubic Mini-ITX case, and only two days since another member of the Obsidian series, the mid-tower ATX Obsidian 450D, was announced. Today, Corsair announced the release of yet another case, the Graphite 730T/760T.

Unlike the Obsidian 450D, which was released to fill a specific gap in the already heavily populated Obsidian series, the Graphite 730T/760T does not appear to have such a purpose. Only two Graphite cases are currently available, the 230T and the 600T, and considering the MSRP of the 730T/760T and the fact that their aesthetic design is closer to that of the 230T, it seems more likely that they have been released to replace the 600T than to coexist with it. As such, the primary changes are a modified aesthetic and improved performance.

We should clarify that the 730T and the 760T are essentially the same case; the major difference is that the former has an opaque left panel and the latter an acrylic window. The Graphite 760T also has a basic two-speed fan controller installed and is offered in both Black and Arctic White; it is the Arctic White version that we will be reviewing today. Corsair informed us that the new Graphite cases will become available through North American retailers in late April.

Corsair Graphite 760T Specifications
Motherboard Form Factor: Mini-ITX, Micro-ATX, ATX, E-ATX, XL-ATX
Drive Bays:
  External: 3 x 5.25"
  Internal: 6 x 2.5"/3.5" (front drive cage)
            6 x 2.5"/3.5" (optional front drive cages)
            4 x 2.5" (rear of motherboard tray)
Cooling:
  Front: 2 x 120/140 mm (2 x 140 mm included)
  Rear: 1 x 140 mm (included)
  Top: 3 x 120/140 mm (optional)
  Left Side: -
  Bottom: 1 x 120 mm, optional (drive cage must be removed/relocated)
Radiator Support:
  Front: up to 240 mm / 280 mm
  Rear: 120 mm / 140 mm
  Top: up to 360 mm / 280 mm
  Side: -
  Bottom: 120 mm
I/O Ports:
  2 x USB 3.0
  2 x USB 2.0
  1 x Headphone
  1 x Mic
  Fan speed toggle
Power Supply Size: ATX
Clearances:
  HSF: 180 mm
  PSU: Any
  GPU: 340 mm (with drive cage) / 460 mm (without drive cage)
Dimensions: 568 mm x 246 mm x 564 mm (H x W x D)
            22.4 in x 9.7 in x 22.2 in (H x W x D)
Prominent Features:
  Hinged side panel with full window
  360 mm radiator support
  Removable magnetic top panel
  Two-speed fan control
  Side-mounted tool-free SSD trays
  Removable, reconfigurable 3.5" drive cages
Price: 189 USD (MSRP)
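
The clearance figures above are the ones most likely to trip up a build, since the GPU limit depends on whether the front drive cage stays in place. As a quick illustration only, the hypothetical helper below checks a planned cooler and graphics card against those published limits; the function name and example parts are ours, not a Corsair or AnandTech tool.

    # Hypothetical clearance check based on the specification table above.
    # All lengths are in millimetres; the limits are taken from the table.

    CPU_COOLER_MAX_MM = 180        # maximum heatsink/fan height
    GPU_MAX_WITH_CAGE_MM = 340     # maximum GPU length with the front drive cage installed
    GPU_MAX_WITHOUT_CAGE_MM = 460  # maximum GPU length with the cage removed

    def fits_graphite_760t(cooler_height_mm, gpu_length_mm, keep_drive_cage=True):
        """Return a list of clearance warnings; an empty list means the parts fit."""
        issues = []
        if cooler_height_mm > CPU_COOLER_MAX_MM:
            issues.append(f"CPU cooler exceeds the {CPU_COOLER_MAX_MM} mm height limit")
        gpu_limit = GPU_MAX_WITH_CAGE_MM if keep_drive_cage else GPU_MAX_WITHOUT_CAGE_MM
        if gpu_length_mm > gpu_limit:
            note = " (removing the drive cage raises the limit to 460 mm)" if keep_drive_cage else ""
            issues.append(f"GPU exceeds the {gpu_limit} mm length limit{note}")
        return issues

    # Example: a 165 mm tower cooler and a 310 mm graphics card with the cage in place.
    print(fits_graphite_760t(165, 310) or "Everything fits")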

The Graphite 760T comes in Corsair's traditional, visually simple brown cardboard box, the proportions of which hint that this is not a typical mid-tower case. Printed on the box are a schematic of the case and a short presentation of its most important features. Inside the box, the case is wrapped in a cloth-like bag and protected by very thick expanded polyethylene foam slabs.

The bundle of the Graphite 760T is very basic, especially considering the class of the case. Corsair supplies only the necessary screws and bits, a few short cable ties, and an installation guide; there are no cable straps or other extras. The one positive note is that the supplied parts are black. If you like getting "extras", this is disappointing, but for some users the extras would simply be more clutter.

Comments

  • Black Obsidian - Friday, March 28, 2014 - link

    So you agree that your thermal load has no meaningful relationship to the object it's supposed to be a proxy for (an active system); the part I'm having difficulty comprehending is how you see that as in any way advantageous.

    I'm going to cut down a much longer reply by simply bringing us to my ultimate point, which is that you appear to be starting by assuming a spherical cow in a vacuum (http://en.wikipedia.org/wiki/Spherical_cow for the reference, in case it's too obtuse).

    A static thermal load, like a spherical cow in a vacuum, virtually eliminates variables and simplifies the problem, which absolutely meets your stated goal of achieving repeatable results. But in so doing, it fails to emulate real-world situations (your other stated goal), because nobody, to my knowledge, actually HAS vacuum-breathing spherical cows (or static thermal loads in need of computer cases to house them).

    If your static thermal load is any easier to translate into the performance of an actual active load than one particular active load is to translate into a different active load, I'm clearly failing to understand how that is so.
  • E.Fyll - Friday, March 28, 2014 - link

    I fear that you understood little of what I said. I will consider the "spherical cow" mention as a joke, since it only applies to highly simplified theoretical studies, not laboratory testing. As a matter of fact, the "spherical cow" approach is a much better description of what you refer to as "real-world testing". Since you are referring to the results generated by a single system and you are actually trying to make comparisons with it, you are making half a thousand guesses and assumptions in order to make a guess about how a change of a single component would affect the thermal performance of a single case, let alone the comparisons between different cases or between different systems.

    What I said is that my thermal load is not directly comparable to that of an active system. It can, however, be used to compare the thermal performance of different cases, and it displays the true performance of a case, unaided by external factors. I cannot tell you how a case will perform with every possible configuration that could be installed inside it; however, I can tell you which case has better stock thermal performance regardless of the configuration that will be installed. On the other hand, testing with an active system creates results that again are not comparable to those of any other system and, as it adds a ton of variables, it is also useless for comparisons between different cases. If I were to do something like this, I would only be showing you some numbers that cannot be used to compare cases and cannot be used as a reference for any other system, even one almost entirely identical to the test system; it does not get any more useless than that.

    By what you are saying, you are suggesting that I drop a methodology that can generate repeatable results and display the actual performance of the cases, and replace it with a "testing" procedure that will produce results impossible to compare to other systems and useless for the comparison of different cases; in other words, meaningless and misleading.

    Let me try another, far simpler argument. I would need much less time and a fraction of the energy required to perform such testing if I were to simply press the power-on button with the system depicted in the review, run some applications, and write down the numbers. Actually, it would reduce the time needed to test a single case from 2-3 days to about...30 minutes. I could essentially double my output (and my income, while cutting my energy costs). So, unless you actually believe that I am mentally deranged, take my word for it; there is no "real-world" testing that could produce any results meaningful to anyone.

    As you yourself said, I strive for scientific rigor and repeatability. If you still believe that "real-world" testing is in any way better than testing done with lab equipment and by someone who at least understands the basics of the scientific method, then by all means, feel free to discard these results as "pointless" and refer to other sites for "meaningful" testing.
  • MarcusMo - Sunday, March 30, 2014 - link

    Agreed on every point. To those that cringe at the notion of reading the two full answers above, would you agree with the following summary:

    - "Real world" testing does not give any real world insight since the variance between individual systems is too great.
    - Gaining any absolute knowledge about how your system will perform in a certain case is thus impossible. Let it go, people.
    - The best we can hope for is accurate comparisons between cases, but that is not going to happen as long as we cling to the flawed "real world" testing methodology. This is the rationale for using a synthetic load method in case reviews from now on.

    I think part of the acceptance problem lies in the lack of any comparative data at this point. Once you have a couple of relevant test points as a reference, I think people will see the upside of your awesome work. Keep it up!
  • britjh22 - Thursday, April 3, 2014 - link

    "I cannot tell you how a case will perform with every possible configuration that could be installed inside it; however, I can tell you which case has better stock thermal performance regardless of the configuration that will be installed. "

    I understand why the change in methodology was made, but I think part of the issue that people have with this new format is that it is too technical/scientific. I think most of us come to AT to read articles about various hardware because we are interested consumers and possible buyers, not interested engineers.

    While the new format is more scientifically rigorous, you yourself indicate above that the data you end up with is not representative of any system that the reader may install, so what use is it to us? Yes, we can see what the stock cooling with a simulated load may be, but is that any less or more helpful than the previous methodology of a fixed system tested across the cases? It may be more valuable from an engineering standpoint, but it may be less useful to a consumer who is comparing cases.

    While I understand the desirability of a single test that can be applied to any case regardless of form factor, a tiered system that better represents buyer expectations may be preferable. This could be something like a standard mATX system, an ITX system, an ATX system with a tower cooler, and an ATX system with a CLC for the CPU. While I realize this is not as easy for a reviewer to keep on hand, that is what you are "up against" with the other review sites. When a consumer comes to a case review and sees the AT review with some simulated thermal load, and a competitor review with a system that is a close approximation of what they have or are planning, which do you suppose they are more likely to take to heart?

    You did make the point that there are significant differences between OEMs for similar items, with different sensor points, etc. However, you provide no actual evidence of this, while stressing your knowledge, your education, and that apparently, unlike your readers, you "at least understands the basics of the scientific method", which just makes you look arrogant. I think a great article to support this new testing methodology would be to show just how much of a difference switching just a motherboard with different/differently placed sensors makes.

    Additionally, to help shore up the consumer value of these articles, I think more space/effort needs to be devoted to what the case is like to build in. There are basic statements like "Building a system inside the Graphite 760T is a seamless procedure, aided by the large size of the case. Most of the time required to build a system inside this case will most likely be for the routing of the cables", but it doesn't ring true with any personal experience or flair, something that Dustin did quite well and that I would guess readers are missing, myself included.
  • creed3020 - Friday, March 28, 2014 - link

    Wholeheartedly agree with the above, especially the last paragraph. Scientific value and rigor have been added to the reviews but real-world, comparable metrics are arguably absent.

    I also don't see how results from one review are going to be compared to another with the style of these graphs. Obviously we don't know exactly what the graphs/charts/data grids will look like in advance but the Thermal Load graph for instance already has 4 different series worth of data. Overlaying another 4 series for just one other case is going to look very messy, never mind what it would look like with 10 others.
  • JarredWalton - Friday, March 28, 2014 - link

    I'd suggest that if you look at the final CPU/GPU/etc. temperature at the end of the test sequence, that's an easy figure to compare with other test systems. "System A has 55C on the CPU, system B has 60C on the CPU -- A is better." Hopefully we'll have enough cases to work from in the next week or two so that we can start showing additional (useful) charts.
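
As a quick illustration of the comparison Jarred describes, ranking cases by the final logged CPU temperature takes only a few lines; the system names and readings below simply restate his hypothetical example and are not measured results.

    # Hypothetical final CPU temperatures at the end of the test sequence (deg C).
    final_cpu_temps_c = {"System A": 55.0, "System B": 60.0}

    # Lower final temperature under the same thermal load means better case cooling.
    for system, temp in sorted(final_cpu_temps_c.items(), key=lambda item: item[1]):
        print(f"{system}: {temp:.1f} C")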
  • BlakKW - Saturday, March 29, 2014 - link

    Ok, I guess you've convinced me that you know what you're talking about, and I will try to stay open-minded as more results are compiled. But one thing that bothers me: after all your efforts to create uniform methods, why not use the same nice set of fans in every case?

    Doesn't using stock (supplied) fans introduce a huge variable from case to case, both in thermal and acoustic testing? I would rather know how the case itself performs, as opposed to the possibly cheap fans that are included...
  • Aikouka - Thursday, March 27, 2014 - link

    I skimmed a bit of it, but this sentence stuck out at me...

    "Corsair provides ample clearance behind the motherboard tray for the routing of cables."

    If I had to give just one rule in regard to technical writing, then it would be to avoid subjective analysis. In other words, how do I know that his idea of what's "ample" is the same as mine? An actual measurement would be best, and possibly a comparison of that value to competing cases.

    Also, I miss the ability to compare cases to each other. Heat and noise are huge factors to me when considering cases, which is one reason why the lack of a side fan-mount is a no-go for me. However, we only get heat values for the current case in the reviews.
  • E.Fyll - Friday, March 28, 2014 - link

    I am not really sure if I should take that as a compliment. I usually get bashed because my writing is "too technical". :)

    You are right. The clearance, however, is not even across the entire section, and people hardly care about a few mm of difference, which is why I believed that a qualitative evaluation would suffice. It is 21.6 mm between the panel and the motherboard tray, falling to 15.2 mm at the raised sections near the openings and rising to 27.9 mm behind the 5.25" bays. It is also practically zero where the 2.5" slots are mounted.

    I will consider adding precise measurements in my future reviews.
  • Aikouka - Friday, March 28, 2014 - link

    I probably wouldn't worry too much about reporting varying differences unless a difference causes a problem. I think most users know the offending cable is typically the ATX power cable. The reason I'm so picky about space is another Corsair case: the Obsidian 800D. The 800D was a decent case with a lot of interesting design choices, but a not-so-good one was the lack of clearance in the back. I used a Corsair HX750 with it, and the bulky ATX power cable caused the solid side panel to bow out, as there just wasn't enough room to accommodate it. In my 900D, I actually use the same cabling kit that you are.

    Another issue is usually the power cables connecting to hard drives. That's where this cable comes in handy: http://amzn.com/B0086OGN9E . The plugs can be moved along the wire, which means you can get them exactly where you need them without trying to contort your poor power cables and stuff them in between drives.
