Comments Locked

100 Comments

Back to Article

  • nandnandnand - Friday, October 14, 2022 - link

    It is now safe to upgrade to the RTX 4070 12 GB.
  • TheRealArdrid - Friday, October 14, 2022 - link

    Not if it's still priced at $900.
  • haukionkannel - Friday, October 14, 2022 - link

    No... It will be a 4070 Ti and the price is $899! Or $999, because the 4090 is selling so well...
  • Kangal - Friday, October 14, 2022 - link

    Is it likely that Nvidia saw a leak or a glimpse of what's coming from AMD's RX-7000 lineup, which changed their mind?
    By removing this card, they give themselves more space at the top for future cards to try and claim the performance or the value crown.

    Just like we saw Nvidia scramble after AMD released the RX-5000 series: they had to launch a whole new "RTX Super" lineup to remain competitive, and they did it at the 11th hour.
  • Kurosaki - Friday, October 14, 2022 - link

    🤣
  • iranterres - Friday, October 14, 2022 - link

    No it's not.
  • Metroplex7k - Sunday, October 16, 2022 - link

    Yes it is.
  • philehidiot - Monday, October 17, 2022 - link

    Oooh, me too...

    Oh yes it isn't!
  • pixelstuff - Saturday, October 22, 2022 - link

    Oh no it is.
  • zamroni - Friday, October 14, 2022 - link

    It's even supposed to be a 4060, because it uses the third-tier (*104) chip and only has a 192-bit memory bus.

    And the 16 GB 4080 should be a 4070, because it uses the second-tier (*103) chip and only has a 256-bit memory bus.
  • Kangal - Monday, October 17, 2022 - link

    See below:
    (_% of Max Shaders - MSRP Price : Chipset Name ~ Shader Cores = Name of GPU Card)

    100% - $1000 : GM-200-400 _3072 = 9-GTX Titan X
    100% - $1200 : GP-102-450 _3840 == 10-Titan Xp
    100% - $2000 : GA-102-350 _10752 ======= RTX 3090 Ti
    100% - $2500 : TU-102-400 _4608 ==== 20-Titan RTX
    ~98% - $1500 : GA-102-300 _10496 ======= RTX 3090
    ~97% - $1600 : AD-102-300 _16384 =========== RTX 4090
    ~95% - $1200 : GA-102-250 _10240 ======= RTX 3080 Ti
    ~94% - $1000 : TU-102-300 _4352 ==== RTX 2080 Ti
    ~93% - $700 : GP-102-350 _3584 == GTX 1080Ti
    ~92% - $650 : GM-200-200 _2816 = GTX980Ti
    ~81% - $700 : GA-102-200 _8704 ======= RTX 3080
    ~80% - $1200 : AD-103-300 _9728 ========= RTX 4080

    ~67% - $550 : GM-204-300 _2048 = GTX980
    ~67% - $600 : GP-104-410 _ 2560 == GTX 1080
    ~67% - $800 : TU-104-450 _ 3072 ==== RTX 2080-S
    ~64% - $700 : TU-104-410 _ 2944 ==== RTX 2080
    ~63% - $450 : GP-104-300 _ 2432 == GTX 1070Ti
    ~58% - $900 : AD-104-400 _ 7680 =========== RTX 4080-12GB
    ~57% - $600 : GA-104-400 _ 6144 ======= RTX 3070 Ti
    ~56% - $600 : TU-104-400 _ 2560 ==== RTX 2070-S
    ~55% - $500 : GA-104-300 _ 5888 ======= RTX 3070
    ~54% - $330 : GM-204-200 _1664 = GTX970
    ~50% - $380 : GP-104-200 _ 1920 == GTX 1070
    ~50% - $500 : TU-106-410 _ 2304 ==== RTX 2070

    ~47% - $400 : TU-106-400 _ 2176 ==== RTX 2060-S
    ~45% - $400 : GA-104-200 _ 4864 ======= RTX 3060 Ti
    ~42% - $350 : TU-106-200 _ 1920 ==== RTX 2060
    ~33% - $200 : GM-206-400 _1024 = GTX960
    ~33% - $250 : GP-106-410 _ 1280 == GTX 1060Ti*
    ~33% - $280 : TU-116-400 _ 1536 === GTX 1660Ti
    ~33% - $330 : GA-106-300 _ 3584 ======= RTX 3060
    ~31% - $230 : TU-116-300 _ 1440 === GTX 1660-S
    ~30% - $200 : GP-106-300 _ 1152 == GTX 1060
    ~30% - $220 : TU-116-200 _ 1408 === GTX 1660

    ~28% - $160 : TU-116-300 _ 1280 === GTX 1650-S
    ~25% - $160 : GM-206-250 _ 768 = GTX950
    ~24% - $250 : GA-106-150 _ 2560 ======= RTX 3050
    ~21% - $130 : GM-206-200 _ 640 = GTX750Ti
    ~20% - $140 : GP-107-400 _ 768 == GTX 1050Ti
    ~17% - $100 : GM-206-100 _ 512 = GTX750
    ~17% - $110 : GP-107-300 _ 640 == GTX 1050
    ~11% - $150 : TU-117-300 _ 896 === GTX 1650

    * GTX 1060 Ti is just a nickname for the enhanced 6GB model with 9Gbps memory, as opposed to the cut-down, slower model with only 3GB of memory.

    Based on this we can see trends happening, especially when it comes to price:

    9-series* -> 10-series = average +12% price difference (-30% min, max +50%)
    10-series -> 16-series = average +16% price difference (+11% min, max +21%)
    10-series -> 20-series = average +54% price difference (+32% min, max +109%)
    20-series -> 30-series = average +5% price difference (-20% min, max +22%)
    30-series -> 40-series = average +43% price difference (+8% min, max +69%)
    Averages of averages: -1% min, +26% ave, +55% max = Overall price trend per generation
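    The value arithmetic implied by this table is simply MSRP divided by the percentage of the flagship's enabled shaders. A minimal sketch of that calculation, using a few rows from the table above:

```python
# Value metric: MSRP dollars per percent of the flagship die's shaders.
# (card, % of max shaders, MSRP) rows taken from the table above.
cards = [
    ("GTX 1650-S", 28, 160),
    ("RTX 3080",   81, 700),
    ("RTX 4080",   80, 1200),
    ("RTX 4090",   97, 1600),
]

for name, pct, msrp in cards:
    print(f"{name}: ${msrp / pct:.2f} per %")
# The GTX 1650-S works out to about $5.71 per %, the RTX 4080 to $15.00 per %.
```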
  • Bruzzone - Tuesday, October 18, 2022 - link

    Kangal (san), very complete table, thank you. mb
  • Kangal - Saturday, October 22, 2022 - link

    Here's the same list!
    But shorter, ranked from best value to least value, according to USD price and the amount of hardware/silicon you're purchasing.

    ($5.71 per %) = (0.1750 % per $) = GTX 1650-S
    ($5.88 per %) = (0.1700 % per $) = GTX750
    ($6.06 per %) = (0.1650 % per $) = GTX960
    ($6.11 per %) = (0.1421 % per $) = GTX970
    ($6.19 per %) = (0.1615 % per $) = GTX750Ti
    ($6.40 per %) = (0.1563 % per $) = GTX950
    ($6.47 per %) = (0.1545 % per $) = GTX 1050
    ($6.67 per %) = (0.1500 % per $) = GTX 1060

    ($7.00 per %) = (0.1429 % per $) = GTX 1050Ti
    ($7.07 per %) = (0.1415 % per $) = GTX980Ti
    ($7.14 per %) = (0.1400 % per $) = GTX 1070Ti
    ($7.33 per %) = (0.1364 % per $) = GTX 1660
    ($7.42 per %) = (0.1348 % per $) = GTX 1660-S
    ($7.53 per %) = (0.1329 % per $) = GTX 1080Ti
    ($7.58 per %) = (0.1320 % per $) = GTX 1060Ti*
    ($7.60 per %) = (0.1316 % per $) = GTX 1070

    ($8.21 per %) = (0.1218 % per $) = GTX980
    ($8.33 per %) = (0.1200 % per $) = RTX 2060
    ($8.48 per %) = (0.1179 % per $) = GTX 1660Ti
    ($8.51 per %) = (0.1175 % per $) = RTX 2060-S
    ($8.64 per %) = (0.1157 % per $) = RTX 3080
    ($8.89 per %) = (0.1125 % per $) = RTX 3060 Ti
    ($8.96 per %) = (0.1117 % per $) = GTX 1080
    ($9.09 per %) = (0.1100 % per $) = RTX 3070

    ($10.00 per %) = (0.1000 % per $) = RTX 3060
    ($10.00 per %) = (0.1000 % per $) = RTX 2070
    ($10.00 per %) = (0.1000 % per $) = 9-GTX Titan X
    ($10.42 per %) = (0.0960 % per $) = RTX 3050
    ($10.53 per %) = (0.0950 % per $) = RTX 3070 Ti
    ($10.64 per %) = (0.0940 % per $) = RTX 2080 Ti
    ($10.71 per %) = (0.0933 % per $) = RTX 2070-S
    ($10.94 per %) = (0.0914 % per $) = RTX 2080

    ($11.94 per %) = (0.0838 % per $) = RTX 2080-S
    ($12.00 per %) = (0.0833 % per $) = 10-Titan Xp
    ($12.63 per %) = (0.0791 % per $) = RTX 3080 Ti
    ($13.64 per %) = (0.0733 % per $) = GTX 1650
    ($15.00 per %) = (0.0667 % per $) = RTX 4080
    ($15.31 per %) = (0.0653 % per $) = RTX 3090
    ($15.52 per %) = (0.0644 % per $) = RTX 4080-12GB
    ($16.49 per %) = (0.0606 % per $) = RTX 4090
    ($20.00 per %) = (0.0500 % per $) = RTX 3090 Ti
    ($25.00 per %) = (0.0400 % per $) = 20-Titan RTX

    So here is the upgrade path for graphics cards. This spans 8 years, chosen in the most robotic way possible to maximise value. The constraints: never go downwards in performance (a given), always hit the best value available (obvious), and never go backwards in generation (a severe limitation) with each upgrade step.
    GTX750, GTX960, GTX 1050Ti, GTX 1660, RTX 2060, RTX 3080, RTX 4080.
    $100 -> $200 -> $140 -> $220 -> $350 -> $700 -> $1200.
    ($5.88 per %), ($6.06), ($7.00), ($7.33), ($8.33), ($8.64), ($15.00).
  • Kangal - Saturday, October 22, 2022 - link

    I also find it interesting how weird the Turing period was!
    The higher-priced, early options were mostly terrible value, whilst the later "Super" options and the lower-priced 16-series ranged from decent to great value. It really was a weird era, see below:

    The "GTX 1650-Super" has the best value ($5.71 per %) in history.
    The "RTX 2080" offers exactly half that value ($10.94 per %), making it a bad choice.
    The "20-Titan RTX" offers exactly half of that again ($25.00 per %), making it the worst value in history.
  • lilkwarrior - Tuesday, November 1, 2022 - link

    There are considerable flaws in evaluating value this way, especially for creative and machine-learning professionals. The math breaks down with the proliferation of dedicated hardware for ray-tracing and machine learning (tensor cores) in RTX GPUs.

    Especially with the AV1 and real-time path-tracing capabilities of the 4090 + invaluable capabilities like saturating what's possible on HDMI 2.1 panels.
  • Kangal - Friday, November 4, 2022 - link

    Not really.
    Ray Tracing still isn't mainstream, it's pretty niche. Also, it doesn't really make drastic differences to the experience. In many cases, people would rather have a more fluid image (+120fps), or sharper image (+1800p resolution), or even allocate those funds into a better monitor (size, brightness, contrast, colour, GtG, etc etc). I think you have a point, when RT becomes dominant, but that will only mean we can't compare the newer cards against the older one, but the comparison itself would still be sound.

    This is the fairest way to compare them. By comparing one card to its own family. And basing the product based on the amount of hardware/transistors you're buying. Then giving those cards a value rating based on its MSRP price versus the hardware. You can discern patterns happening in the market, as you compare how the price and hardware is trending between the different generations.

    Another way of doing comparisons would be to run each GPU at stock settings and test them individually on a test system. Then compare their performance on a suite of games, run an aggregate (eg 50 games), and derive an FPS-per-dollar figure. But this is tricky and also imperfect: some new games will be incompatible with older hardware; some new hardware will not scale well on older games as it hits a ceiling / point of diminishing returns; and some titles will not have optimised game-ready drivers, which also skews the results (drivers may get updated later, or a later game update may break the optimisations/bugs).

    Again, the focus of my post was primarily about gaming, the hardware, and the money. As soon as you start mixing in other factors like Machine Learning, Encoding, Professional Applications that's when you have a problem. Because the target goal could be achieved by technological improvements in the software side too.
  • thestryker - Friday, October 14, 2022 - link

    After seeing the RTX 4090 *not* top all of the charts in 1080p/1440p resolution I couldn't help but wonder what the performance would have been like for these cards. That's ignoring the fact that this card was also using a chip which historically would have been for a high end 60 or 70 series card. The gloves are clearly off over at nvidia and they don't care about what they're doing because nobody's leaving them as a customer. I've known for a while that they weren't a great company, but between the price reset on the 20 series and everything that went on during the 30 series I hope people start looking elsewhere. The GPU market is in a terrible place and needs a hard reset.
  • WaltC - Friday, October 14, 2022 - link

    I read that the 12GB GPU will simply be renamed--possibly 4070, etc.
  • thestryker - Friday, October 14, 2022 - link

    Sure, but will nvidia really drop what was going to be a $900 card down to $500-600? I don't think so.
  • haukionkannel - Friday, October 14, 2022 - link

    Of course not!
    This will be a 4070 Ti at $899, and they will release a cut-down version as the 4070 at $699 later, with 10GB or 6GB of memory!
  • nandnandnand - Friday, October 14, 2022 - link

    >4070 with 6 GB
    bruh
  • Byte - Friday, October 14, 2022 - link

    Historically the Ti cards were mid-cycle refreshes. So a 4075 for $899 and a 4075 Ti for $999, followed by a 4070 for $599 and a 4070 Ti for $699, are all possible.
  • philehidiot - Monday, October 17, 2022 - link

    They don't need to. They're conceding the naming is confusing, not the pricing is wrong. They could rename it the GTX69SpaceDocker and charge the same. That they're "unlaunching" is either because an increased delay was required to allow rebranding or, more likely, they have now seen the market response and decided it's in their best interests to allow the 30 series to sell out before launching the now 4070.

    But these prices and energy consumptions mean this is another skip generation for me. I'm on a Vega64 and the most I've used my GPU for is Hashcat of late. Quite happy with older games minus the DRM that breaks them.

    My PC uses around 350W whilst gaming. That's okay. Nvidia laughed at the 250W TBP of the Vega64 when it came out. I will NOT be buying a card that uses circa 300W.... because I generate my own 'leccy and you'd be surprised how much more aware you are of usage when you're breeding your own angry pixies. You actually start to care about power factor...
  • meacupla - Friday, October 14, 2022 - link

    The RTX 4090 is completely CPU bound in 1080p, and it even manages to get bottlenecked at 1440p in some titles.
    I think you would be wasting your money if you don't play at 4K with a RTX 4090
  • Makaveli - Friday, October 14, 2022 - link

    lol you can bet your bottom dollar there will be people pairing a 4090 with coffee lake cpus at 1080/1440p then complaining in forums about it.
  • StevoLincolnite - Friday, October 14, 2022 - link

    I am tempted to pair it up with my Core 2 Quad Q6600 @ 3.6Ghz+16GB DDR2 Ram.
    Because screw normal common sense.
  • webdoctors - Friday, October 14, 2022 - link

    I still have that CPU in the closet for turning into a NVR, but not happy with the power draw :(
  • James5mith - Saturday, October 15, 2022 - link

    You aren't happy with the power draw on a Core2 Q6600? It was a 105W TDP part, back when TDP was actually the maximum power draw.

    God forbid you want to go past "low midrange" on modern CPUs; all of them start at 140W+ TDP, and TDP is now just a low-end estimate of power draw.
  • SirDragonClaw - Monday, October 17, 2022 - link

    NVRs spend most of their time at low CPU usage with a few spikes every now and then. A modern system (something like a 12th-gen i3) will average under 30 watts of power draw. A Q6600 running as an NVR uses well over 120 watts on average (I know, I had one).

    In the UK the difference in power cost for these two devices is about $288 USD per year at the current power prices.
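    A sketch of that arithmetic; the electricity rate here is my assumption (UK rates in late 2022 were roughly in the $0.35-0.40/kWh range), so treat the exact dollar figure as approximate:

```python
# Rough yearly running-cost difference between a Q6600 NVR (~120 W average)
# and a modern low-power box (~30 W average), both running 24/7.
watts_old, watts_new = 120, 30
hours_per_year = 24 * 365
kwh_saved = (watts_old - watts_new) * hours_per_year / 1000  # 788.4 kWh/year
rate_usd_per_kwh = 0.37  # assumed late-2022 UK rate in USD
print(f"{kwh_saved:.1f} kWh/year, about ${kwh_saved * rate_usd_per_kwh:.0f}/year")
```

At that assumed rate the difference lands in the same ballpark as the $288/year figure quoted above.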
  • Tomatotech - Sunday, October 16, 2022 - link

    Raspberry Pi? Total package about $50 or less if you can buy used. Obviously depends on how many cameras and if they are HD/SD and frame rate. Uses about 10w at 70% CPU connected via Ethernet. (Mine uses 2w while running pi-hole etc. Ethernet uses less power than wifi)
  • meacupla - Sunday, October 16, 2022 - link

    Donate it to a PC recycling charity. Use a low power or mobile CPU based system. They are not even that expensive for older models.
  • Eidigean - Friday, October 14, 2022 - link

    I have an 8700k with a Radeon 580 that could use a GPU upgrade.

    I also have an i7-4960X that still doesn't suck (OC'd from 3.6 GHz to 4.0 GHz and still under-volted) with a GTX 1070 ($200 eBay special 3 years ago) that might also benefit from a GPU upgrade. My 30" 2560x1600 10-bit panel only runs at 60fps, so I don't expect to be CPU bound.

    Not looking to heat my house with a 4090, so likely getting a 4080, but the Ivy Bridge-E 4960X system used to have two GTX 580s in SLI, so the 1000W power supply with four 12V rails is up for it.

    The DDR3-2133 CAS 9 memory in there has lower latency than most newer kits. Totally getting my money's worth 9 years later. The DDR4-3200 CAS 14 memory in the Coffee Lake rig is pretty close in latency (divide the CAS timing by the RAM speed to get the latency).
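    That parenthetical rule of thumb can be made concrete: first-word latency in nanoseconds is CAS cycles divided by the memory clock, where the memory clock is half the DDR transfer rate. A quick sketch:

```python
# First-word latency in ns: CAS cycles / memory clock (MHz) * 1000,
# where the memory clock is half the DDR transfer rate (MT/s).
def cas_latency_ns(transfer_rate_mts, cas):
    return cas / (transfer_rate_mts / 2) * 1000

print(f"DDR3-2133 CL9:  {cas_latency_ns(2133, 9):.2f} ns")   # ~8.44 ns
print(f"DDR4-3200 CL14: {cas_latency_ns(3200, 14):.2f} ns")  # ~8.75 ns
```

So the 9-year-old DDR3 kit really is slightly lower latency than the DDR4-3200 CL14 kit, as the comment says.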
  • Flunk - Saturday, October 15, 2022 - link

    No point in buying new high end GPUs for 8 year old CPUs because you'll be CPU bound all the time.
  • Samus - Sunday, October 16, 2022 - link

    You'd be best off getting a 3070 Ti or 3080 for $600-$700 to pair with that 8700K. I wouldn't blow $1000+ on a modern CPU when you can get last gen (which, let's be honest, is only 2 years old) for half the price, and it will still be relevant for years.
  • Samus - Sunday, October 16, 2022 - link

    According to a friend who just got a 4090, his CPU (12700) bottlenecks the card at 3440x1440 in BF2042. He can confirm this because if he underclocks the card, the performance stays the same. The CPU cores aren't even pegged, but that's not the problem; the software isn't feeding the GPU enough work to keep it busy, and this is a common problem across games because desktop CPUs have become so powerful that software can't use them efficiently.
  • meacupla - Sunday, October 16, 2022 - link

    Yeah, your friend is going to need a 7900X3D or 13900K to unlock more potential in the 4090.

    And to think that the 4090 is actually still handicapped using GDDR6X
  • Fallen Kell - Saturday, October 15, 2022 - link

    "The GPU market is in a terrible place and needs a hard reset."

    That only will happen if there is competition that competes at the high end with product on the shelves...
  • NextGen_Gamer - Friday, October 14, 2022 - link

    In my mind, a small price drop and a rename to GeForce RTX 4070 Ti would be completely fine. Correct me if I am wrong, but the card as was (4080 12GB) was using the full AD104 die; meaning if it is just renamed 4070, it would leave no room for the Ti unless NVIDIA moved to the much bigger AD103 chip. So, keep all the specs as they were, drop the price to $700 and call it 4070 Ti and launch a cut-down configuration at $599 called the 4070. Both of those numbers are a $100 increase over 3070/3070 Ti MSRP prices, but that falls in line with the 40xx series in general being more expensive across the board.
  • haukionkannel - Friday, October 14, 2022 - link

    4090 is selling so well that I don't expect price cuts, only a name change.
  • meacupla - Friday, October 14, 2022 - link

    The cuts to CUDA cores on the "4080 12GB" model would have put it squarely in 4060 Ti territory.
    If the "4080 12GB" model wanted to be a "4070", at the very least it would require 50% of what the 4090 packs. The 4090 has 16384 CUDA cores.

    RTX 4080 12GB vs RTX 4090: 7680/16384 = 46.9%
    RTX 3060Ti vs RTX 3090: 4864/10496 = 46.3%

    For a real "4070", the CUDA core share would have to be around 56% of the 4090's, or around 9200 CUDA cores at a minimum.
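    Those ratios are easy to verify (CUDA core counts as published by Nvidia; the 4090 has 16,384):

```python
# Enabled CUDA cores relative to the flagship of each generation.
configs = {
    "RTX 4080 12GB vs RTX 4090": (7680, 16384),
    "RTX 3060 Ti  vs RTX 3090":  (4864, 10496),
    "RTX 3070     vs RTX 3090":  (5888, 10496),  # the ~56% "real x70" baseline
}
for name, (cores, flagship) in configs.items():
    print(f"{name}: {cores / flagship:.1%}")
```

The 3070's 56.1% share of the 3090, applied to the 4090's 16,384 cores, gives roughly 9,190 cores, which is where the "around 9200" figure comes from.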
  • Rudde - Saturday, October 15, 2022 - link

    The AD103 die is not much bigger. 379mm² vs 295mm². In the 2000-series, the 2070 used the full 445mm² of the TU106 die and the 2070 super used the larger (545mm²) TU104 die.

    What I find more concerning is that although the x80-series have had a price between $550 and $700 in recent years, nVidia decided to price the 4080 as $900 and $1200, both premiums over earlier pricing. Now only the $1200 card remains, filling the x80 ti price slot with a x80 performance card.
  • catavalon21 - Sunday, October 16, 2022 - link

    Personally, I'm not fine with it, but to each their own. A 192-bit memory bus on anything above a '60 card is nonsense. No 70-or-higher card as far back as the 770 ever had such a narrow bus. Even the last two generations of sixty-plus cards (2060 Super, 3060 Ti) had 256-bit memory buses.
  • hansip87 - Sunday, October 16, 2022 - link

    Bus width doesn't really matter if a 192-bit card can beat last-gen 384-bit ones. Would you rather be stuck at 8GB one more gen for the 70 series? Because hell yeah, Nvidia doesn't care.
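    For context, peak memory bandwidth is just bus width times data rate, which is why the bus number alone can mislead; a sketch using the published 21 Gbps GDDR6X speeds:

```python
# Peak memory bandwidth in GB/s = bus width in bytes * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 21))  # "4080 12GB", 192-bit: 504 GB/s
print(bandwidth_gbs(384, 21))  # RTX 3090 Ti, 384-bit: 1008 GB/s
```

The narrower bus halves raw bandwidth; the large L2 cache on Ada is what makes up the difference in practice.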
  • meacupla - Friday, October 14, 2022 - link

    Back in 2018, the crypto market crashed, and nvidia had a massive oversupply issue as a result. They had some 300,000 cards returned from the OEMs, as the OEMs couldn't sell the GPUs.

    Now I can't find the exact article anymore, but didn't nvidia destroy some of their inventory during this time? I don't remember which exact model they reduced inventory of, but they did it to keep prices from crashing.
    If anyone can find that article, I would appreciate it.
  • beginner99 - Saturday, October 15, 2022 - link

    That is why OEMs aren't allowed to return chips anymore, and in fact are forced to buy 3000-series chips if they want 4000-series ones. Still wondering why EVGA just left? Yeah, because NV was trying to move the losses from their overproduction onto the OEMs.
  • Kurosaki - Friday, October 14, 2022 - link

    Why not just sell it as a 4070 for $300, like a normal company would have done 7 years ago?...
  • michael2k - Saturday, October 15, 2022 - link

    Lack of competition is probably why
  • zamroni - Friday, October 14, 2022 - link

    I guess there will be 4080 Super and Ti models using lower-binned AD102 at $1300 and $1450.

    A 4070 Ti will use lower-binned AD103 at $1000.

    The previously announced AD104 4080 12GB will be renamed 4070 Super at $850.
    Its lower bins will be the 4070 at $700 and the 4060 Ti at $500.
  • Kurosaki - Friday, October 14, 2022 - link

    🤣
  • web2dot0 - Friday, October 14, 2022 - link

    Gamers: 4080 12GB at $900? SCAM

    Nvidia changes the name to 4070

    Gamers: 4070 12GB at $900? ok, fine

    😂

    Come on bruhs. It's just a stupid name. It's meaningless once you put it inside your case. It'll still run the same whatever they call it.
  • nandnandnand - Friday, October 14, 2022 - link

    Wrong. The real scam is having a 4080 16 GB and 4080 12 GB with wildly different specs and performance. It was intended to trick people.
  • catavalon21 - Sunday, October 16, 2022 - link

    ...and it's trying to trick us into being okay with what is, at best, a 60-series card passing as a 70-series. A 192-bit memory bus on a 70 or 70 Ti? No thank you. Putting it 2 tiers above where it should be in the product stack should be called what it is. We shouldn't be happy with it falling only one rung.
  • nandnandnand - Sunday, October 16, 2022 - link

    Memory bus doesn't matter, just performance. As we saw with AMD's RX 6000 series, they can skimp on the memory bus and get good performance. AMD used "Infinity Cache" (L3), and Nvidia has increased the size of their L2 cache. The RTX 3090 Ti has 6 MiB of L2 cache, and the soon-to-be-renamed 4080 12 GB card has 48 MiB.

    If you don't like the price/performance of a card, buy something else. The prices will fall as they sell off the now-overproduced RTX 3000 cards.
  • Silver5urfer - Friday, October 14, 2022 - link

    Expected, because the 3090 and 3090 Ti would offer better value and higher VRAM vs this 192-bit, horrendous xx70-class AD104 chip. I'm betting the reviewers delivered their response to Nvidia. Probably Jensen realized he cannot pull off this BS scam, esp in a state where the stock market has cratered and inflation is at an all-time high.

    Also, DLSS3 is horrible tech; HUB did a fantastic video, unlike the DF shilling campaign. There's a clear-cut latency spike, plus image stability already suffers with both the DLSS and FSR upscalers; it's only worth it at 4K for the higher-quality image and for improving the TAA blur BS. It won't sell the 4080 12GB, because that card cannot get FPS beyond 200 like the 4090, and it will suffer from smearing, image instability, image corruption, low fidelity, and higher perceived latency, making the fake FPS numbers worthless.

    Now the funny part: this will just get renamed 4070 Ti, probably with a 50W power bump, and repackaged in a new cardboard box. They won't cut the price unless the whole 40-series stack gets a cut from the top, because there's a hole at the $1000 mark now, currently occupied by the 3090 Ti. How long will Nvidia keep making Ampere cards? They cannot keep it up, because they want to sell the high-cost TSMC 5nm dies.

    Another big joke is that scalpers are selling these new 4090 cards at a super-high $2500 price, and in the EU the official price of the 4090 itself is 2400 Euros. Imagine buying the card at that mark and then getting shafted by a full AD102 die as a 4090 Ti with an extra $500 premium and DP2.0. Get played by Nvidia if you support this trash company.

    3090 buyers got shafted hard by an experimental trash PCB design with a horrendous MSVDD power rail and a cancerous VRAM layout that makes the modules run at 100C max. Yeah, I know Micron rated them at 118C, but the high temp is not valid when they don't have damn cooling at all. Oh, the PCB also has power excursions, plus a memory module placed near the PCIe slot to make it bend and eventually fail. This is how garbage Nvidia plays their consumers. Yeah, I'm sitting with a 3090 planning to sell it off and get a 3090 Ti, because I ain't buying this overpriced 4080 16GB trash card, or the 4090 at such a high price, only to get shafted.
  • coburn_c - Saturday, October 15, 2022 - link

    Now we can focus on the 70% ($500) price increase on the xx80 series. Even if the performance is 70% better (it won't be), that's still not a value increase.
  • nandnandnand - Saturday, October 15, 2022 - link

    The price increase is to tell you to buy a 30-series card instead, unless you really want the best right now.
  • 0siris - Saturday, October 15, 2022 - link

    How about cancelling the price of the other 4080 as well?
  • yhselp - Saturday, October 15, 2022 - link

    Following Ampere branding, the line-up should look as follows:
    RTX 4080 12GB should be RTX 4060 (non-Ti);
    RTX 4080 16GB should be RTX 4070 (again, non-Ti);
    RTX 4090 should be between RTX 4080 and 4080 Ti.

    Following traditional, pre-Kepler branding, the line-up should look as follows:
    RTX 4080 12GB should be RTX 4050 (or RTX 4050 Ti);
    RTX 4080 16GB should be RTX 4060;
    RTX 4090 should be RTX 4070/80.

    So really, NVIDIA needs to rebrand the other cards, too.

    Adjusted for inflation and added costs, such as process, NVIDIA could still make a huge profit (more than it needs to do good business) by charging (in late 2022) the following:
    $399 for RTX 4080 12GB;
    $599 for RTX 4080 16GB;
    $829-899 for RTX 4090.

    Additionally, it might just be able to do decent business in late 2022 (after all this is the consumer, and not professional [Quadro, etc.] market) at:
    $249-299 for RTX 4080 12GB;
    $399 for RTX 4080 16GB;
    $749-799 for RTX 4090.
  • michael2k - Saturday, October 15, 2022 - link

    You’re suggesting a more powerful chip exists if you’re rebranding the 4090 as a 4070 or 4080.

    At best I think the 4090 stays unchanged, the 4080 stays unchanged, and the delaunched 4080 becomes a 4070. Then lower power parts become the 4060 and 4050.

    And I don’t see why they should reprice their product, since they are selling out as is.
    4090 $1600
    4080 $1200
    4070 $900
    4060 $600
    4050 $400

    And of course that means the 3080, 3070, 3060, and 3050 all get slotted below them.
  • meacupla - Saturday, October 15, 2022 - link

    Have you stopped to think that maybe, just maybe, the RTX 4090 FE already draws a hefty amount of power?
    It's already on par with a 3090 Ti at this point, with a maximum 500W power consumption, and that's before any factory overclock is applied.

    Like, realistically, what do you think a "4090 Ti" with your lineup will be?
    A 650~750W card that requires 2x 12VHPWR?
  • PeachNCream - Saturday, October 15, 2022 - link

    I doubt these GPUs will sell very well anyway. Sure, they offer performance, but the MSRP for a GPU and the supporting hardware to take advantage of it is in the domain of a year's worth of fuel for driving a car to and from work, never mind the energy costs of running said computer and cooling the space where it resides.
  • nandnandnand - Saturday, October 15, 2022 - link

    The high MSRP is to encourage you to buy a 30-series GPU instead, or milk the whales still willing to pay.
  • michael2k - Saturday, October 15, 2022 - link

    Both are true.
  • michael2k - Saturday, October 15, 2022 - link

    The 4090 is already sold out though. As soon as supply and demand match they’ll either drop the price or release a Ti/Super variant to keep the same price point.
  • meacupla - Saturday, October 15, 2022 - link

    4090 is only sold out because nvidia artificially restricted the flow.
    They are sitting on a ton of 4090 and 4080 inventory. They would rather pay storage fees than sell at lower prices.
  • Dribble - Tuesday, October 18, 2022 - link

    The 4090 has sold out because it's a good card: it's only $100 more than the RRP of the 3090, which given increases in costs and inflation is fine, and it's got a huge performance increase over that card. I agree the 4080 is probably too expensive; it's not the halo card, and the bunch of people with basically infinite money will all buy a 4090, so there will be very few people looking to spend only the current cost of the 4080.
  • Nfarce - Monday, October 17, 2022 - link

    The power rating and increased power bills are overrated hysteria. I skipped a generation coming from a 1080 Ti to my current 3080 Ti (both EVGA FTW3 editions), bought last year. Stress gaming load from the 1080 Ti to the 3080 Ti went from approximately 310W to 430W, and that includes moving up from a 1440p to a 4K monitor in the 4-year ownership period between the two. My monthly power bill has at best seen just a handful of dollars more in kWh usage, all other things being constant and averaged, including gaming time.
  • SirDragonClaw - Monday, October 17, 2022 - link

    They sold out day one, and I know many people trying to get one. They will sell very well, the 4090 in particular is a bargain for its level of performance.
  • PeachNCream - Wednesday, October 19, 2022 - link

    It's better in terms of hype to say you've sold out of the 10 that were on the shelf than to say you only sold 11 of the 100 that were in stock, because people expect you to lower your price when there is excess inventory available. That is what's happening at the moment to keep the MSRP up, for the time being, on this particular high-margin halo component.
  • Samus - Sunday, October 16, 2022 - link

    Meanwhile, I picked up a 3080 for $600 recently. When you take DLSS3 out of the equation, it's only 33% behind a 4080 at half the cost, and it's available now.
  • Tomatotech - Sunday, October 16, 2022 - link

    I suspect the high prices are to push gamers towards paying for the GeForce Now cloud GPU service. It’s got several million subscribers and rising, and these people are paying on a monthly basis, which is financial gold-dust to any company.

    Another part of the strategy revolved around the misnamed 4080: the current GFN top tier is “3080 tier”. Keeping the misnamed 4080 would have allowed Nvidia to offer a future “4080” tier subscription while actually delivering 4070-class service. The rename will hit their cost-of-service for a future 4080 tier, but not by a huge amount; it was a gambit that didn't work out.

    (GFN doesn't use actual 3080s etc; it uses server GPUs, which cost maybe $10-20K each and can be tweaked to be shared between more or fewer clients. The current “3080” tier apparently delivers about a 3070+ level of service, which still works well over fibre ISPs or, for non-twitch games, ADSL.)
  • catavalon21 - Sunday, October 16, 2022 - link

    "NVIDIA’s early performance figures painted a picture of a card ... delivering performance upwards of 20% (or more) behind the better RTX 4080, and on-par with the last-generation flagship, the RTX 3090 Ti."

    This is why having them independently reviewed is so important. I am skeptical of performance claims from the manufacturer who already tried to pass it off as a product it isn't.

    Will AnandTech provide a review of the 40-series cards?
  • nandnandnand - Sunday, October 16, 2022 - link

    Don't hold your breath waiting for any particular review here.
  • catavalon21 - Sunday, October 16, 2022 - link

    To my knowledge the entire 30-series product line was MIA here. If AT is out of the video card review business, which was among the best in the industry, then peace - please just say so, so we can stop waiting and look at other sites' reviews. Ryan's reviews have been some of the best anywhere for several generations of cards, and I am hopeful now that the "no one can afford them anyway" reason is off the list of "why no reviews?".
  • PeachNCream - Wednesday, October 19, 2022 - link

    IIRC they started missing on reviews back in the 10x0-series launch with several omissions there and things have since been on a downward spiral with each subsequent release. Basically, AT is not the place to go for relevant hardware reviews. They've consistently reviewed power supplies, coolers, NUC-like stuff, and external storage devices recently and even those are sparse amid a long line of slightly flavored reposts of industry press releases.
  • Oxford Guy - Wednesday, October 19, 2022 - link

    The lack of a 960 review was the first red flag.
  • hansip87 - Sunday, October 16, 2022 - link

    Seeing that outrageous 4090 CUDA core count, I don't believe the 4080 deserves the $1200 price. Its CUDA core count is about 40% lower, but the price only drops 25%. Looking at the 3070/3080, CUDA core count matters a lot for 4K, beside VRAM capacity. It should have been priced at $999 max.
  • Xajel - Monday, October 17, 2022 - link

    RTX 4080 16GB at $899 & RTX 4070Ti 12GB at $599 are still expensive but more plausible as a generational upgrade over the 3000 series.
  • Bruzzone - Monday, October 17, 2022 - link

    AD cost : price / margin on marginal revenue = marginal cost = price at q1 risk production volume;

    AD 102, 608 mm2, TSMC design production $121.60 x 2 package/test/margin = $243.20 to Nvidia x2 to the AIB = $486.40 at an adequate economic profit to the producer and channel x5.42 to x6 for MSRP = $2636 to $2918 and at dGPU peak production volume entering supply elastic pricing $1503 to $1643.

    Nvidia 4090 FE @ $1599 q1 risk production volume before ramp volume marginal cost decrease is underwater IF you can find one. They're meant to be sold direct for +15% to 20% margin and while retail is participating [?] the production and sales distribution chain won't work for +15% to +20% over their cost.

    On B.O.M and if sold direct an AIB can make around $486 on 4090 but the channel will take minimally 1/2 of that margin potential.

    AD 103, 379 mm2, TSMC design production $75.80 x 2 package/test/margin = $151.60 to Nvidia x2 to the AIB = $303.20 at adequate economic profit to the producer and channel at x4.96 to x6 MSRP = $1503 to $1819 and at dGPU peak production volume entering supply elastic pricing $1260

    4080 16 GB at $1199 is priced appropriately on production economics and a rough B.O.M assessment.

    Defunct 4080 12: AD 104, 295 mm2, TSMC design production $59 x 2 package/test/margin = $118 to Nvidia x2 to the AIB = $236 x adequate economic profit to the producer and channel at 4070 x4.31 to x3.59 MSRP = $762 at 1st quarter risk production volume before marginal cost decline, and at peak volume entering supply elastic price $635, so 4080 12 on full run marginal cost is priced approximately $136 to $263 too much.
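    The chain of multipliers above (die cost, then package/test, then the AIB step, then the retail multiplier) can be sketched as a few lines of Python. This is only the commenter's own back-of-envelope model; every figure is his estimate, not official pricing data.

    ```python
    # Sketch of the cost model described above. All multipliers and die
    # costs are the commenter's estimates, not official figures.

    def est_msrp(die_cost, retail_multiplier):
        """Follow the chain: die -> packaged chip -> AIB -> retail MSRP."""
        to_nvidia = die_cost * 2          # x2 for package/test/margin
        to_aib = to_nvidia * 2            # x2 from Nvidia to the AIB
        return to_aib * retail_multiplier # channel/retail multiplier

    # AD102 (4090-class): ~$121.60 die, x5.42 to x6 retail multiplier
    low = est_msrp(121.60, 5.42)   # ~2636, matching the figure above
    high = est_msrp(121.60, 6.0)   # ~2918
    print(round(low), round(high))
    ```

    Running the same function with the AD103 and AD104 die-cost estimates reproduces the other MSRP ranges quoted in the comment.
    
    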

    Summary; Nvidia as a central banker;

    Capital creation that comes back to Nvidia and AMD into future time.
    Nvidia and AMD announce dGPU MSRP capable of 15% to 20% margin sold direct by producers.
    Supply, production, distribution sales objective is a profit max margin and 15% to 20% isn’t it.
    Nvidia and AMD purposely supply ‘PR priced’ GPU cards to retail intercepted by brokers.
    Brokers and VARs set application specific competitive pricing and the real price ceiling.
    Real price ceiling carves in perfect price tier for Nvidia and AMD master distributors.
    Cards are sold for their actual value on segment-by-segment application ROI.
    Mass market MSRP is surreal, and the broker market is pricing appropriately.
    There is no such thing as a scalper, only application ROI perfect price competition.
    Enthusiast purchase cadence has moved to run end on supply elastic pricing.
    Unless a commercial / B2B price is justifiable.

    mb
  • biggerestbrain - Monday, October 17, 2022 - link

    Modify your article to not misuse the word "stack." Utilize the extant English lexicon appropriately. You don't have to call every collection of things a "stack" because you're obsessed with using cutesy buzzwords.
  • catavalon21 - Monday, October 17, 2022 - link

    But....that's essentially what a "stack" is - heck, Merriam-Webster lists over half a dozen definitions for it as a noun, including "an orderly pile or heap" - heck, that's closer than a lot of ways terms are used / misused on the Web. The way it's used in this article is far from new when referring to products 1..n in a - release? generation?
  • catavalon21 - Monday, October 17, 2022 - link

    I, on the other hand, type like I talk, which is rambling at times. My use of "heck" twice was not to make a point, just me not proofreading worth a darn.
  • Bruzzone - Tuesday, October 18, 2022 - link

    beat me to it, we agree. mb
  • Bruzzone - Tuesday, October 18, 2022 - link

    How about a product heap defined by its grade SKU (performance rung) price ladder? mb
  • yeeeeman - Monday, October 17, 2022 - link

    the gap between the 4080 16 and the 4090, spec-wise, is too big... the 4080 16 should be 800 bucks tops, even less.
  • nrencoret - Monday, October 17, 2022 - link

    It's sad that no comment mentions the lack of AT gpu reviews.
  • nandnandnand - Tuesday, October 18, 2022 - link

    They did.
  • ricardodawkins - Tuesday, October 18, 2022 - link

    Where is the Anandtech review for these Nvidia GPUs ?
  • Nfarce - Tuesday, October 18, 2022 - link

    Sadly, AT has been slower and slower to do new hardware reviews for years now, ever since Anand sold the website. I knew it would never again be up there with Tom's Hardware, which is still my primary go-to tech site along with Guru3D (at least one former AT writer works there now doing regular reviews).

    I do not know what's behind it, but my guess would be they are either short staffed or are no longer one of the top hardware tech review sites that gets early access review hardware like the latest GPUs (which everyone is kept under an NDA until the agreed upon test review date release).

    It's been really sad as one who started reading AT and TH at roughly the same time over 20 years ago when I got into PC building for gaming.
  • catavalon21 - Tuesday, October 18, 2022 - link

    Yep. When TH's articles were by "Tom" and AT's by "Anand", covering subjects like MMX CPUs, battles between Pentium and Athlon, and NVIDIA vs. 3dfx. Jarred Walton does GPU articles today for TH, and he penned many AT articles a while back. His reviews aren't bad, but I liked Ryan's better.
  • catavalon21 - Tuesday, October 18, 2022 - link

    Future plc owns, or controls, or publishes - not sure the exact word - both Tom's and AnandTech which they picked up from Purch in 2018. Tom's is a corporate entity now like AT.
  • Oxford Guy - Sunday, October 30, 2022 - link

    Every business that is based on the Internet is a corporate entity.
  • catavalon21 - Sunday, October 30, 2022 - link

    Fair enough, I could have worded it better. Maybe "large corporation" versus "small business" would have been better. You are correct.
  • Hrel - Wednesday, October 19, 2022 - link

    Hey Nvidia, I'm still NEVER going to spend more than $200 on a GPU. I'm far from alone on that commitment. I don't know where you get off charging 1K Dollars for a single component in a computer that as a WHOLE should cost $1,000 but I hope someone at your company can understand how insane that is.

    Remember when you guys launched the 8800GT? What was the price for that GPU 6 months after release? 9 months? Roughly $130. It played EVERYTHING out at that time on max settings or effectively max settings.

    You aren't 10% over that, you aren't 100% over that. You are at NINE HUNDRED AND TWENTY THREE PERCENT of that! There is simply no economic, engineering, or developmental reason that could EVER justify that kind of price increase.

    IDK, maybe it really is just the end of the world. But I hope not. I hope the people making these insane things happen wake up and stop before everything really does come crashing down. Old small homes need to be available for 40-60k. Used cars that are really only good for getting around town need to cost less than 2K. Yes, this means no EVs. Graphics cards need to be available at under $200.

    Maybe fire your executive board? To get your overhead down. I mean, if they can't even realize that giving 2 entirely different GPUs the same name is stupid, then they really can't be trusted to make any decisions at all. I mean come on, even a 2 year old can figure out that two entirely different things should not have the same name.
  • nandnandnand - Wednesday, October 19, 2022 - link

    Your red line of $200 is under the BOM costs for some of these GPUs. You are alone in this fight.
  • Oxford Guy - Wednesday, October 19, 2022 - link

    I think you missed the satire.
  • Oxford Guy - Wednesday, October 19, 2022 - link

    Large corporations are much better than two-year-olds at using inflation to disguise larger margin increases.
  • AnnonymousCoward - Saturday, October 29, 2022 - link

    When a bunch of Democrats in government decide to spend multiple trillions of dollars (all on loan) while handing out thousands of dollars to every citizen, INFLATION is gonna hit! Stupid policy.
  • Oxford Guy - Sunday, October 30, 2022 - link

    Meanwhile, Reagan brought America to the $1 trillion debt mark and W Bush brought America to $2 trillion and $3 trillion.

    Thanks, though, for playing the game of pretending that plutocracy's two biggest brand names are substantively different.
  • twtech - Tuesday, November 1, 2022 - link

    Based on the specs, this might be a rare case where the top-end card is the bargain in the lineup. Usually the X090 is not much faster than the X080, but it appears that won't be the case here. I'd take those MS Flight Simulator benchmarks shown with a grain of salt, as they probably won't be representative across all games.
