EVGA Releases GeForce GTX 1070 Ti FTW Ultra Silent: 3 Slot Bracket, ACX 3.0 Cooler
by Nate Oh on December 7, 2017 12:30 PM EST
This week, EVGA launched another model in their GeForce GTX 1070 Ti family: the FTW Ultra Silent, featuring the ACX 3.0 cooling system. Starting from its triple-slot-wide PCIe bracket, the card offers more of everything relative to the lower-end SC Black Edition: another 8-pin PCIe connector, 10+2 power phases, and a 235W power draw. As a mix between the ACX 3.0-equipped SC Black Edition and the iCX-equipped FTW2, the FTW Ultra Silent is a little over 17mm thicker than the SC Black Edition due to a heftier heatsink. In turn, that excess cooling capacity gives the FTW Ultra Silent a shot at living up to its name, while also allowing for overclocking headroom.
In advertising 1607+/1683+ MHz clockspeeds, EVGA is referring to their Precision XOC overclocking utility, and specifically its 'OC Scanner X' feature, in light of the GTX 1070 Ti’s requirement to ship at reference clocks. OC Scanner X stress-tests the card while watching for artifacting, determines what it believes to be the “optimal” overclock, and then applies it; for gamers who are not comfortable with manual overclocking, this automatic approach brings GTX 1070 Tis a step closer to actual factory overclocks. Though as with all automatic overclocking features, manual tweaking is likely more precise and better tuned for a given end-user.
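Conceptually, a scan of that sort is a simple hill-climb: raise the clock offset in steps, stress test, and fall back to the last stable value once artifacts appear. A minimal Python sketch, where `apply_clock_offset` and `run_stress_test` are hypothetical stand-ins for whatever Precision XOC does internally:

```python
def find_optimal_offset(apply_clock_offset, run_stress_test,
                        step_mhz=13, max_offset_mhz=260):
    """Hill-climb toward the highest stable core clock offset.

    apply_clock_offset(mhz) -- hypothetical: sets the GPU core offset.
    run_stress_test()       -- hypothetical: returns True if no
                               artifacts or crashes were detected.
    """
    stable = 0
    offset = step_mhz
    while offset <= max_offset_mhz:
        apply_clock_offset(offset)
        if not run_stress_test():
            break               # artifacts appeared: stop climbing
        stable = offset         # this offset passed; remember it
        offset += step_mhz
    apply_clock_offset(stable)  # settle on the last known-good offset
    return stable
```

This is only an illustration of the general technique; the actual scanner's step sizes, test workloads, and artifact detection are not public.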
Meanwhile, the FTW Ultra Silent picks up a couple more high-end board features with dual BIOS chips and a backplate, though it misses out on the king of all premium graphics card attributes – RGB LEDs. The FTW Ultra Silent comes with a white LED logo like the rest of the EVGA GTX 1070 Ti family, while the FTW2 remains the only one with RGB LED capability.
Selected EVGA GeForce GTX 1070 Ti Models

| | EVGA GTX 1070 Ti FTW Ultra Silent | EVGA GTX 1070 Ti FTW2 | EVGA GTX 1070 Ti SC Black Edition |
|---|---|---|---|
| Base Clock | 1607+ MHz | 1607+ MHz | 1607+ MHz |
| Boost Clock | 1683+ MHz | 1683+ MHz | 1683+ MHz |
| VRAM Clock / Type | 8Gbps GDDR5 | 8Gbps GDDR5 | 8Gbps GDDR5 |
| Outputs | 3x DP 1.4, 1x HDMI 2.0, 1x DVI-D | 3x DP 1.4, 1x HDMI 2.0, 1x DVI-D | 3x DP 1.4, 1x HDMI 2.0, 1x DVI-D |
| Power Connectors | 2x 8-pin | 2x 8-pin | 1x 8-pin |
| Bracket Width | 3 Slot | 2 Slot | 2 Slot |
| Cooler Type | Open Air (ACX 3.0) | Open Air (iCX) | Open Air (ACX 3.0) |
The FTW Ultra Silent shares its $500 price with the FTW2, essentially at the price ceiling for the GTX 1070 Ti, but trades the FTW2's iCX extras for a larger heatsink. And despite the wider three-slot PCIe bracket supporting that heatsink, the FTW Ultra Silent has the same display output configuration as the FTW2: 1x DVI-D, 3x DisplayPort, and 1x HDMI (maximum 4 monitors supported). The FTW Ultra Silent also comes with the eponymous Ultra Silent fan profile, but details thereof were not specified.
At this time, the GTX 1070 Ti FTW Ultra Silent is only available at the EVGA store for $499.99, and appears to be in stock.
DanNeely - Thursday, December 7, 2017 - link
Any chance of getting one of these in to test the silent claim?
The thought of having to go back to hearing noisy card fans every time I game (open air is quieter than blowers, but was still obnoxious after a year+ of running everything on water) is a very big part of why I don't see myself giving up watercooling in the future despite the cost/assembly time penalty it imposes.
IdBuRnS - Thursday, December 7, 2017 - link
Yea, calling a card with 2 fans "ultra silent" is a bit suspect.
JoeyJoJo123 - Thursday, December 7, 2017 - link
Uhh, it's not that unreal.
Some history, I built a mobile-mini-ITX PC (PC Part Picker List here: https://pcpartpicker.com/list/NLptZL) and I originally had removed the GTX 970 stock cooling shroud for a Raijintek Morpheus Core II, because I was going to put 2x high-pressure fans into the GPU compartment of the FTZ01. It wouldn't make that much sense to put 2 high pressure fans over a GPU's own 2 open air fans, and the extra cooling mass of a GPU air cooler would help.
Well, just about 4 months ago, when cleaning out the system, I decided to get a GPU PWM fan adapter and splitter so that the GPU fans would actually spin according to the GPU temperature, not according to the mobo's readout of the CPU temperature. (And many games appear to not tax the CPU at all while the GPU is running near full bore, so the GPU fans don't spin up accordingly). (I also applied Thermal Grizzly liquid metal thermal interface to the GPU die at the same time.)
Well for about 3 days I thought dusting the case just made a huge difference in sound levels, but in reality the GPU fans were just off. During games it'd rise up to 80C and level off, with 0 active cooling. Found out I _needed_ to have MSI afterburner up all the time so that my GPU will recognize the fan curve I set it at, for whatever reason. It now hardly rises over 51C sustained.
So yeah, if you have enough thermal mass, (and this card doesn't seem to have as much as a morpheus II core) and a very good thermal interface, then silent GPU operation is actually not impossible. What's difficult is making sure that the GPU is OK for a variety of thermally restricted cases, like many of the modern tempered glass sidepanel + rotated GPU and riser cable orientation some users are going for, where open-air coolers suffer due to having little way to radiate heat away.
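The fan curve described above, of the kind set in MSI Afterburner, is just piecewise-linear interpolation between a few (temperature, duty) points. A minimal Python sketch with made-up curve points (not the commenter's actual settings):

```python
def fan_duty(temp_c, curve=((40, 0), (55, 30), (70, 60), (85, 100))):
    """Piecewise-linear fan curve: map GPU temperature (C) to duty (%).

    `curve` is a sorted tuple of (temperature, duty) points; the
    values here are illustrative, not from any real profile.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]      # below the curve: idle/zero-fan mode
    if temp_c >= curve[-1][0]:
        return curve[-1][1]     # above the curve: full speed
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between adjacent curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

The point of routing the fans through a GPU-temperature-driven curve, rather than the motherboard's CPU readout, is exactly the failure mode described: a game can load the GPU heavily while the CPU stays cool, so a CPU-keyed curve never spins the GPU fans up.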
ikjadoon - Thursday, December 7, 2017 - link
ATX needed an update a damned decade ago. ATX was never designed to support 150W to 300W TDP add-in cards.
The location and angle are **completely** asinine nowadays: a perpendicular card limited to two to three slots? If you want more cooling, you have to invest $100s in water cooling. Or kill your expansion slots--have we all forgotten how gargantuan ATX cases are? Why should we be needing to compromise any expansion slots? Why is it still perpendicular? Why is it stuck in two--maybe three--slots? Why do the power plugs come out the top, towards the side panel of the case? Why are we shoehorning 80mm-sized fans, where we've phased them out of almost every other location in the system? Why are we hinging one of the heaviest components solely on two little metal tabs and a PCIe slot? Why, if we want better cooling, are we often dumping 200W of excess heat inside cases? It's all kinds of stupidity nowadays with ATX and GPUs.
If case/motherboard/PSU manufacturers want to make some real innovation, kill ATX. It's an ugly standard and while the rest of the industry has progressed (look at the progress of CPUs from 1990 to 2017....now look at ATX cases from 1990 to 2017)... this 30+ year-old standard is holding us back. We got rid of the headphone jack, floppy drives, IDE, serial and parallel ports, and now we're phasing out USB type-A. But this goddamned ATX standard lives? Why? Kill this junk standard.
What are the alternatives? Oh, there's plenty, but they'll have to sit down and actually write out a plan. Do some combination of BTX and/or MXM. Lay the GPU flat on the motherboard so we can mount proper coolers, like Colorful did with one of their GTX 1070 models.
Move the expansion slots somewhere else--do we really need 6 more PCIe slots in this day and age of singular GPUs? So few people need 7 expansion cards populated at once. Nobody needed 10 3.5" bays and we got rid of that. Do we honestly need 8 SATA ports, then, still? And don't get me started on front-panel connectors.
But, no. Let's instead debate on tempered glass and RGB and mesh panels because--man--case design is still pretty hard, eh? Pathetic blocks called "heatsinks" on high-end X299.
Say what you want about Intel and AMD and NVIDIA for not "innovating" year over year. But at least they aren't ATX case designers in 2017.
ikjadoon - Thursday, December 7, 2017 - link
/rant
But, for real. ATX can live but it needs a legitimate update. Maybe the vertical AIB cards work for server farms, but let's experiment a little here. ASRock's making moves with that mITX X299 board. Cases can innovate, too.
JoeyJoJo123 - Thursday, December 7, 2017 - link
The problem is that the moment you change the form factor of a GPU, you lose backwards compatibility, and you kill off and lose your customer base over time.
This is why USB type A ports remain a consumer favorite port. Everything's backwards compatible. Everything just works, even if it's a bit slower on older USB ports.
For people who _need_ better GPU cooling, there's custom GPU air coolers, GPUs that are bought stock in wider 3-slot format, GPU adapters for Asetek CPU-liquid-cooling, GPUs that are bought stock in hybrid Air/Water cooling, GPUs that are bought stock with water blocks, GPUs that can be converted to fully custom watercooling loops.
The fact of the matter is that GPUs operate within normal tolerances up to ~85C or so, and two-slot coolers are _adequate_ for that operating range while providing good performance. It's only once you want to _really_ overclock your GPU that you'll find that you end up compromising on loudness using stock coolers. At this point you have plenty of aftermarket options that don't compromise much on cooling capacity or sound level, but instead compromise on _cost_.
The market's fine as it is, and I think you're blowing this way out of proportion. The typical $200 or less cards that the mass market buys operate fine and give normal people a big boost in game performance without a complicated installation.
And for the minority of people who want more, there are those options out there if you're willing to pay for it.
DanNeely - Thursday, December 7, 2017 - link
Simply put, no one with enough moral authority to drive a breaking change cares enough about commodity desktops to push one through. Dell/HP/etc can all go full custom to hit targets, and don't need to worry about existing standards (eg the HP systems with a 12V-only PSU and sata/molex power cables coming off the mobo where they put the DC-DC converters).
AMD's too bruised and battered after a number of years of playing also ran to Intel/Nvidia; and while their fortune has improved recently they've got many more important targets to put time and money into.
Intel is a laptop first company these days, and their biggest customers (server makers and giant OEMs) can design custom form factors already. They may also still be feeling lingering pain from the failure of BTX.
At this point I don't think we'll ever even see a minor fiddle of part of the ATX spec. Even at the level of putting the 24 pin connector on a diet by making a bunch of the 3.3/5/ground wires optional. Modern PCs use a tiny fraction of the power that systems 20 years ago did at those voltages; making some of the wiring to support them optional would make for cheaper more flexible cables and is IMO the lowest hanging fruit on the spec. I don't ever see it happening though. No one who could make it happen cares anymore.
PeachNCream - Thursday, December 7, 2017 - link
ATX is a lot like a dancing llama with eczema. It can still dance okay, but it has to stop periodically to scratch itself. Yeah, that gets annoying, but it'd be really expensive and time consuming to train another dancing llama so you just kinda deal with the eczema problem and try not to let it bother your enjoyment of the show or you hope the people putting on the performance thought to bring along some good skin lotion. Besides, most people these days are going out to watch the dancing sloth (because they use something other than an ATX-standard desktop for computing) so the itchy llama people don't really have an incentive to change things.
xchaotic - Thursday, December 7, 2017 - link
It's amazing how they suddenly need 2x 8pin for a 235W card where a much more power hungry 1080ti (300w+) manages with 8+6. Marketing everywhere.
ZeDestructor - Friday, December 8, 2017 - link
Safety margin for boosts, presumably.
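For reference, the connector arithmetic behind this exchange: the PCIe slot supplies up to 75 W, a 6-pin connector up to 75 W, and an 8-pin connector up to 150 W. A quick sketch of the headroom each configuration allows:

```python
# PCIe power budgets (watts) per the PCI-SIG electromechanical spec
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def budget(connectors):
    """Total deliverable power for a card: slot plus listed connectors."""
    return SLOT + sum(connectors)

ftw_ultra_silent = budget([EIGHT_PIN, EIGHT_PIN])  # 2x 8-pin
ref_8_plus_6     = budget([EIGHT_PIN, SIX_PIN])    # 8-pin + 6-pin

print(ftw_ultra_silent)  # 375 W available for the 235 W card
print(ref_8_plus_6)      # 300 W available for an 8+6-pin card
```

By those figures, two 8-pin connectors give the 235 W FTW Ultra Silent roughly 140 W of margin, consistent with the boost-headroom explanation above.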