
  • Marlin1975 - Friday, December 20, 2019 - link

    A little disappointing. Lower performance than a 580, and priced higher than others in its class.

    If the price was quite a bit lower then it might make a decent HTPC card for some.
  • HarryVoyager - Friday, December 20, 2019 - link

    The 8GB 580s, both new and used, are going to be the elephant in the room for a while, I suspect. A reasonably cared for miner card runs only $100 and will keep you going at least until AMD is competitive enough to drive Nvidia's pricing down.
  • Yojimbo - Friday, December 20, 2019 - link

    A reasonably cared for mining card...? How do you assure that?

    The 580s have been so cheap because AMD made too many of them. Nvidia isn't so interested in low-end market share. They'll sell cards there if they can make money on them. I have a feeling GDDR6 DRAM has come down in price in the last year more than 7 nm fabrication cost has. Now they can sell in that market segment with good profits.
  • Retycint - Friday, December 20, 2019 - link

    I would imagine most miners run their cards undervolted/underclocked for maximum efficiency. And it's a 24/7 load, unlike the periodic low->high->low load of gaming cards, which is actually better for the card in the long run. So miner cards actually tend to be in better condition (save for the fans, of course).

    But then again, there's the risk of miner cards that have had modded BIOSes installed, which might have damaged the card. So I suppose it's down to buying from a reputable seller.
  • Yojimbo - Friday, December 20, 2019 - link

    They would have undervolted and then pushed it to get the most performance they could without crashes and run it 24/7. It would be sitting in a hot case in a hot room. I don't think it's a particularly desirable card. But my question is actually in response to your conclusion: How do we judge a mining card? What do you mean by a reputable seller? A 3rd party that picked up the card from some miner?
  • PeachNCream - Saturday, December 21, 2019 - link

    GPU mining was typically done with open air brackets rather than in a densely packed case. However, GPU mining is a lot less commonplace these days given the low profitability. We are past that being a thing really so finding a modern ex-miner GPU is not as easy of a prospect as it was even a couple of years ago.
  • Kangal - Saturday, December 21, 2019 - link

    The thing is, Mining Cards really are Gaming Cards.
    There's less than 1% performance and thermal difference between the two. Even Linus Tech Tips did an experiment on this, comparing a mining card that was used constantly for 4 years against the exact same card (same variant) which was still sealed in the box. No difference.

    In the worst-case scenario, the used card is not going to have a warranty, and you may need to "refurbish" it yourself: clean the case, clean the innards, reapply thermal paste, put in new fans. All up it's going to cost you $5-$20. So when you're saving yourself $50-$100 it's worth it, from a value standpoint.
  • MamiyaOtaru - Sunday, December 22, 2019 - link

    Nobody thinks there is a performance difference between an ex-mining card and a new card. Performance doesn't do a slow fade. The worry is that the mining card could be that much closer to failure.
  • Kangal - Sunday, December 22, 2019 - link

    Actually, heaps of people think mining cards run hotter, use more power, and run slower compared to brand-new ones. It's a very widespread misconception.

    Mining cards being closer to failure is actually a myth as well. If the card is running relatively normally when you buy it, a quick refurb will bring it back to like-new condition. The same applies to a Gaming Card: if the fans sound out of tune, replace the fans. If it's running hotter than normal, reapply thermal paste. If it looks dusty, clean the innards and casing.

    As was pointed out, Gaming Cards are usually at Idle, then at Maximum Performance, then Idle, then Max again. That process is actually harsher on the components: fan, TIM, logic board, plus the uneven heat dissipation. A Mining Card is usually run cooler and at lower voltage, and it's running smoothly and consistently, which doesn't cause as many "micro-cracks" in the thermal paste or the fan assembly. Not to mention, subjectively I think miners are likely to look after their hardware a little better (it's making them money) and run it in open air, so there's some merit to it.

    However, this stigma has been good for enthusiasts, generating a lot of great ex-mining cards at stupidly low prices. The old AMD HD7970, I've seen them go for $50 like 5 years ago. The AMD R9 290 was around $150 as well. Now we've got plenty of ex-mining RX 470s and RX 580s, and that's really killing it for people considering an RX 5500 XT or GTX 1660. (Un)fortunately there weren't too many Vega 56 or Vega 64 cards manufactured to affect the market too much. The Nvidia cards have seemed to keep their value much better (GTX 980, GTX 1070, etc).
  • MASSAMKULABOX - Friday, January 3, 2020 - link

    Some mining cards had NO video outputs... a slight handicap for gaming?
    Mining cards were in general looked after well... but home miners, not so much...
    Quite a lot of the miners factored in the resale value of the cards; the prices were high new, so secondhand prices would also be high. Only when everyone wants to sell their cards at the same time do prices drop.
  • StevoLincolnite - Sunday, December 22, 2019 - link

    But if you only paid $100 and it only lasts for a couple years, it's still worth it for that tier of performance, no?
  • Yojimbo - Monday, December 23, 2019 - link

    We don't have any data on this, which is why I would avoid a used mining card. We'll never get any data on this, either. The thing is, although one can't tell a well-cared-for gaming card from one not well-cared for, over the years a general knowledge of the expectation of a used part has been built up. In the case of mining it is a big unknown in my view. You don't really know which cards are mining cards and which are gaming, so any card that has been popular with miners is suspect, in my view, unless you know who you are buying from.
  • eastcoast_pete - Sunday, December 22, 2019 - link

    Which poses this question: Is there a program ("app") that can run a health check on a card? In addition to any "custom BIOS", I would also be concerned about simple aging with intense, ongoing use. When manufacturers bin chips and assign them to target speeds, they supposedly do so also based on life expectancy, at least for CPUs. So, is there a way to test how much life the GPU and RAM of a card have left in them?
  • flyingpants265 - Sunday, December 22, 2019 - link

    The elephant in the room for the RX 580 8GB, and AMD video cards in general, is the almost 200W power draw on a "1080p card", whereas the 1650 uses 75W. The 1650 may really suck for the price, but it uses less than half the power. Obviously the RX 570 is a great choice as well.

    Then there's reliability. I've seen statistics from Puget Systems and some big online retailer, and AMD had some obscenely high failure rates. AMD is a much smaller company, they might have less oversight, and heat causes a lot of damage to complex electronics. Not exactly reliable info, but I wouldn't really be surprised if it were somewhat accurate. I believe all consumer products are cheaply made, so I'd rather go with the lower-power, lower-heat, larger company. Too bad I don't have any hard data to back that up.

    Not really interested in anecdotal evidence either.
  • Spunjji - Monday, December 23, 2019 - link

    "Too bad I don't have any hard data to back that up.
    Not really interested in anecdotal evidence either."

    Next time start with that pitch, so folks can ignore the self-confessedly uninformed speculative rambling that follows.

    If the power draw is a bother on the RX580, a little undervolting will go a very long way without noticeably affecting frame rates at 1080p. It'll also help with longevity. Regardless, none of this is particularly crucial when you're saving ~$50 and getting a faster card with more VRAM.
  • Spunjji - Monday, December 23, 2019 - link

    RX 580 is still good for 1440p too, if you're not obsessed with hitting "Max" on every setting just because it's there.
  • khanikun - Monday, December 30, 2019 - link

    Reminds me of back in the day, when I moved to ATI for a very short while. 9700 Pro. Started overheating after a year, then broke. 9600 XT as temporary card. Started overheating in less than a year. 9800 XT. Started overheating in a couple months. Went back to Nvidia and haven't had a reason to look at ATI/AMD cards since.
  • Qasar - Wednesday, January 1, 2020 - link

    I have some of those cards from then, a 9800 Pro and 9600 Pro; used them not too long ago to see if they still work... and they still do... hehehehe
  • WetKneeHouston - Monday, January 20, 2020 - link

    I think it makes sense to think that heat and power would lead to reliability issues. That's why I went with the 1650 Super. It's still too powerful a card for the low-tier 1080p gaming I do (I can't notice the difference with higher settings, I suspect it's a scam lol), so I probably should have gotten the regular 1650, but they're basically the same price.
  • Yojimbo - Friday, December 20, 2019 - link

    Yeah, maybe if it cost $50 it would be worth it for running DOSBox.
  • WetKneeHouston - Monday, January 20, 2020 - link

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019 - link

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019 - link

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019 - link

    100% agreed on this.

    It's up to the consumers themselves how, where and why they will use the device as they see fit, be it gaming or streaming, or something "mundane" such as watching videos, or even emulation, sometimes even "creation" purposes.

    IMO it's very much like the same BS excuse smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice".

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass", which is pretty by design but also stupidly easy to break, so you have no choice but to make a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, HDMI, and a full-size DP port (with maybe 1 mini DP),

    but they seem to "not bother", citing silly reasons like "it is impossible / customers no longer want this".

    As you point out, the consumer decides the usage case. Provide the best possible product, give the best possible no-BS review/test data, and we the consumers will decide with the WALLET whether it is worth it or not.

    Likely it would save much $$$ and earn consumer <3, by virtue of people not buying something they will regret in the first place.

    Hell, I am gaming with a Radeon 7870 on a 144Hz 1440p monitor (it only runs at 60Hz since the card doesn't fully support higher than this). However, I still manage to game on it "just fine": maybe not ultra-spec everything, but comfortably (for me) at high to medium "tweaked" settings.

    Amazing how long these last when they are built properly and don't have the crap kicked out of them. That, and the fact that most people these days don't have hundreds to thousands to spend every year or so, should mean so much more to these mega corps than "let us sell something that most folks really do not need". Make it right and upgrades will happen when they are really needed, instead of hardware just ending up in the e-waste can in a few months' time.
  • timecop1818 - Friday, December 20, 2019 - link

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019 - link

    Yea ok sure... so you still want the VGA connector instead???
  • Qasar - Friday, December 20, 2019 - link

    DVI is a lot more useful than the VGA connector that monitors STILL come with, yet we STILL have those on new monitors. No modern monitor should have that garbage connector.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019 - link

    VGA: dead connector, limited use case, mostly business... DVI: still useful, especially in KVMs... haven't seen a DisplayPort KVM, and the HDMI KVM I had died a few months after I got it, but the DVI KVMs I have still work fine. Each of the 3 (DVI, HDMI and DisplayPort) still has its uses.
  • Spunjji - Monday, December 23, 2019 - link

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
  • Korguz - Monday, December 23, 2019 - link

    I still disagree... it's still more useful than the VGA connector. And I bet DisplayPort KVMs are quite expensive compared to DVI KVMs...
  • NetMage - Sunday, January 5, 2020 - link

    You would lose that bet - StarTech.com has DisplayPort KVMs at about the same price as DVI ones.
  • NetMage - Sunday, January 5, 2020 - link

    The problem with DisplayPort KVMs is that Windows has hot plug enabled for DP and it can’t be disabled, which means switching one monitor in multi monitor setups can cause windows to rearrange, and cause difficulty with RDP.
  • Fujikoma - Wednesday, December 25, 2019 - link

    Old connectors have their place because there are people out there who can't afford to buy a new computer simply to keep up, and replacing a specific component is all that's needed. These people usually surf the internet, watch YouTube and look at pictures family sends them. They don't need a new monitor with a new connector just because a more modern video card "shouldn't" have an old connector.
  • 29a - Friday, December 20, 2019 - link

    I agree, give us more data. I'd like to see the codec ASICs start getting tested too.
  • milkywayer - Friday, December 20, 2019 - link

    Yup. I'd like to replace my massive and heavy EVGA GTX 1080 Ti with something smaller in size just so I can play indies and lightweight games like LoL at 4K. I don't care about playing the latest AAA title at highest quality in 1080p.
  • Calihan - Saturday, December 21, 2019 - link

    Trade my 970 for your 1080ti
  • WetKneeHouston - Monday, January 20, 2020 - link

    I'm with you. I think highest quality settings are a mirage anyway.
  • Ryan Smith - Monday, December 23, 2019 - link

    Thanks for the feedback.

    I'm hesitant to commit to anything at this second as adding resolutions would significantly increase the workload required in testing these cards (each run at one resolution is around 2 hours these days). So I hope you can understand why. But it's definitely something I'll be mulling over for GPU Bench 2020.
  • eastcoast_pete - Friday, December 20, 2019 - link

    Thanks Ryan! Especially appreciate confirmation of the updated NVENC ASIC in the 1650 Super. Turing's NVENC is notably better than Pascal's, and makes this card potentially interesting for video encoding duties.
  • guachi - Friday, December 20, 2019 - link

    The 1650S overall is the card to get if buying a new card. But I'd still recommend a used 570 or 580 (and maybe a new 570 if you can get one on sale).

    Polaris will never die.

    I just wouldn't buy THIS 1650S. The noise. Ouch! 50dB?

    No.
  • lmcd - Friday, December 20, 2019 - link

    It's a convenient size, enough so that airflow in my case will result in better overall acoustics compared to a larger card. Agreed that there are better designs out there, but this isn't as awful as you're implying.
  • Spunjji - Monday, December 23, 2019 - link

    50dB is terrible under any circumstance.
  • lmcd - Friday, December 20, 2019 - link

    Gonna be honest I don't quite understand why the 1050 Ti and 1060 3GB both need to be in this graph set while the 1070 didn't make it in. Usually there aren't performance regressions from one generation to another, so it's more interesting to compare a higher card from the previous generation to a lower card from the current generation.
  • Ryan Smith - Friday, December 20, 2019 - link

    The 1070 was a $350 card its entire life. Whereas the 1050 Ti and 1060 3GB were the cards closest to being the 1650 Super's predecessor (1050 Ti was positioned a bit lower, 1060 3GB a bit higher). So the latter two are typically the most useful for generational comparisons.

    At any rate, this is why we have Bench. So you can use that to make any card comparisons you'd like to see that aren't in this article itself. https://www.anandtech.com/bench/GPU19/2638
  • catavalon21 - Saturday, December 21, 2019 - link

    In doing so, it paints an interesting picture for those of us who do not upgrade every generation. While the 970 can't be compared directly to it in Bench, it's interesting to see how many benchmarks show it besting the 980 - which was a $550 card when it debuted. Maybe the RTX series cards are worthy of their criticisms for gen-over-gen improvement in performance per dollar, but not this card. Yes, I know the 980 was 2 generations ago, but still. The 980 takes some of the benchmarks, especially CUDA, but across the board the 1650S competes very well. For a card to have 980-like performance for $160 at 100 watts, I'm impressed.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No you're wrong, according to forum keyboard warriors there's been no improvement in price/perf in the last half decade because they can't get top-tier performance for $100. ;)
  • Spunjji - Monday, December 23, 2019 - link

    That we're only seeing a price/performance improvement over Pascal more than half-way into the Turing generation kinda proves those "keyboard warriors" correct, though. It's nice, but it was annoying when on release a large chunk of the press decided to sing songs about how new boundaries of performance were being pushed (true!) while downplaying how perf/$ remained still or regressed (equally true). Throwing up some straw men now doesn't change that.
  • Spunjji - Monday, December 23, 2019 - link

    The 1060 6GB already beat out the 980 under most circumstances - at worst it was roughly equal. That was a very nice perf/$ improvement indeed for a single generation, and it's where we got most of the gains the 1650 Super is now building on.

    The 2060 is an instructive example of how the RTX series disappointed in that regard, as the cost increase roughly matched the performance increase and its RTX features are arguably useless.
  • Kangal - Saturday, December 21, 2019 - link

    Thanks for the review Ryan.
    But I have to go against you on the mention of 4GB VRAM capacity for 2020. You have forgotten something very important. Timing.

    Sure, PC Gaming makes a lot more money than Console Gaming (and Mobile Gaming is even larger!!), but that is because the wealth is not distributed fairly; it's quite concentrated. Whereas the Console Market is more spread out, so publishers can make profits more universally and over a longer timeframe. On top of that, there's the marketing and the fear of piracy. Which is the reason why Game Publishers target the consoles first, then afterwards port their titles to PC... even though they originally developed them on PC!

    I needed to mention that background first to give some clarification. Games for 2020 will primarily be made to target the PS4, and they might get ported to the PS5 or Xbox X. Even the launch titles for the PS5/XbX will actually be made for the PS4 first and have enhancements added. Remember the 2014 games which were still very much PS3/360 games?

    And it will take AT LEAST a full year for the transition to occur. So games in Early 2022 will still target the PS4, which means their PC ports will be fine for current-day low-end PCs. I mean, even with the PS5 release, PS4 sales will continue, and that's too huge a market base for the companies to simply ignore. And even in the PC Market, most gamers have something that's slower than a GTX 1660 Ti. Besides, low VRAM isn't too much of an issue; most of the time the game will only require 3GB of VRAM to run perfectly. If you have more available, say 8GB, then without any changes from your end you will see it start using, say, 6GB of VRAM. That's double! And you didn't even change the settings! Why? Most games now use the VRAM to store assets they think they might use later on, so that they don't have to load them when required. This is analogous to how Mac/Linux uses System RAM, as opposed to how Windows does. If the game does have to load them, performance will take a momentary dip, but it stays perfectly playable.

    And even if games now require more VRAM by default to be playable, in most cases that problem too can be solved. You can change individual settings one by one and see which has the most effect on graphical fidelity, and how much it penalises your VRAM/RAM usage and your framerates. I mean, look at lowspecgamer to see how far he pushes it. Though for a better idea, have a look at HardwareUnboxed on YouTube, and see how they optimise graphics for the recently released Red Dead Redemption 2 (PC) game. They fiddled with the graphics to get a negligible downgrade, but boosted their framerates by +66%.

    So I think 4GB VRAM will become the new 2GB VRAM (which itself replaced 1GB VRAM), but that doesn't mean they're compromising on the longevity of the card. I think 4GB will be viable for the midrange up to 2022, then it's strictly just low-end. Asking gamers to get 8GB instead of 4GB for these low-midrange cards is not really sensible at these prices... it is exactly like asking GTX 960 buyers to get the 4GB variants instead of the 2GB variants.
  • Korguz - Sunday, December 22, 2019 - link

    Why do you think the games will target the PS4?? Is this just your own opinion??
  • Kangal - Sunday, December 22, 2019 - link

    Because there are a lot of PS4 units hooked up to TVs right now, and they will still be hooked up until 2022. When the PS4 launched, the PS3 was slightly ahead of the Xbox 360, yet sales were nothing like the PS4's. And the PS3 was very outdated back in 2014, whereas in 2020 the PS4 is not nearly as outdated... so there's more longevity in there.

    So with all those factors and history, there's a high probability (certainty?) that Game Publishers will still target the PS4 as their baseline. This is good news for Gaming PCs with only 8GB RAM and 4GB VRAM, and performance below that of an RX 5700. Regardless, it's always easier to upgrade a PC's GPU than it is to upgrade the entire console.

    ...that's why Ryan is not quite right
  • Korguz - Sunday, December 22, 2019 - link

    Um, yea, ok sure... and you have numbers to confirm this?? Seems plausible, but also just personal opinion.
  • Kangal - Monday, December 23, 2019 - link

    During the launch of the PS4 back in 2014, the older PS3 was 8 YEARS OLD at the time and hadn't aged well, but it did commendable sales of 85 million consoles.

    I was surprised by the Xbox 360, which was 9.5 YEARS OLD and understandably more outdated, yet it did surprising sales of 75 million consoles.

    Because both consoles were quite outdated and not very modern, and marketing was strong, the initial sales of the PS4 and Xbox One were very strong in 2014. Despite this, there were about another 5 million budget PS3 and Xbox 360 sales made in this period. And it took until Early 2016 for Game Publishers to ditch the PS3 and Xbox 360. So about 1.5 years, and about 40 million sales (PS4) or 25 million sales (Xbox One) later. During this period people using 2GB VRAM graphics cards (GTX 960, AMD R9 370X) were in the clear. Only after 2016 were they really outdated, but it was a simple GPU swap for most people.

    So that's what happened, that's our history.
    Now let's examine the current/upcoming events!
    The PS4 has sold a whopping 105 million consoles, and the Xbox One has a commendable 50 million units sold. These consoles should probably reach 110 million and 55 million respectively when the PS5 and Xbox X release. And within 2 years they will probably settle on a total of 120 million and 60 million sales. That's too huge a player base for companies to ignore, and it's actually better than the previous generation. However, this current gen will have both consoles much less outdated than the previous gen, which is understandable since both consoles will only be 6 YEARS OLD. So by the end of 2022, it should (will!!) be viable to use a lower-end card, something that "only" has 4GB VRAM such as the RX 5500 XT or the GTX 1650 Super. And after that, it's a simple GPU swap to fix that problem anyway, so it's no big deal.

    Ryan thinks these 4GB VRAM cards will be obsolete within 6 months. He's wrong about the timing. It should take 2 years, or about 4x as much time. If he or you disagree, that's fine, but I'm going off past behavior and other factors. I will check back in 6 months and see if he was right or wrong... if I remember to revisit this article/comment, that is : )
  • Korguz - Monday, December 23, 2019 - link

    And yet... I know some friends that sold their PlayStations and got Xboxes... go figure.
    For game makers to make a game for a console and then port it to a computer = a crappy game for the most part. Supreme Commander 2 is a prime example of this.
  • flyingpants265 - Sunday, December 22, 2019 - link

    Most benchmarks on this site are pretty bad and missing a lot of cards.

    Bench is OK but the recent charts are missing a lot of cards and a lot of tests.

    PCPartPicker is working on a better version of Bench; they've got dozens of PCs running benchmarks 24/7, year-round, to test every possible combination of hardware and create a comprehensive benchmark list. Kind of an obvious solution, and I'm surprised nobody has bothered to do this for... 20-30 years or longer.
  • Korguz - Sunday, December 22, 2019 - link

    hmmmmmm could it be because of, oh, let me guess... cost ?????????????????
  • sheh - Saturday, December 21, 2019 - link

    In the buffer compression tests the 1650S fares worse than both the non-S cards and the 1050 Ti.
    How come?

    Curiously, the 1660S is even worse than the 1650S.
  • catavalon21 - Saturday, December 21, 2019 - link

    Guessing it's ratio differences, not related to absolute performance. A more comprehensive chart in Bench of the INT8 Buffer Compression test shows the 2080 Ti with a far lower score than any of the recent mid-range offerings.

    https://www.anandtech.com/bench/GPU19/2690
  • Ryan Smith - Monday, December 23, 2019 - link

    Yeah, it's a ratio test, and both scores fluctuate depending on things like memory bandwidth and fill rates. In this case lower bandwidth cards tend to do better, since they aren't as likely to be bottlenecked elsewhere (whereas the 2080 Ti has bandwidth to spare for days).

    It's imperfect, to say the least. But people have been asking for the data, so here it is.
  • sheh - Monday, December 23, 2019 - link

    That's strange.

    I thought, maybe, faster cards don't bother compressing since they don't need it and it uses more power. But other than that, I thought it's just a question of the supported algorithms.
  • harobikes333 - Sunday, December 22, 2019 - link

    Considering these current GPUs seem pretty darn similar, my pick between NVIDIA & AMD would be the AMD card, simply for the fact that NVIDIA needs competition in the future.
  • sharathc - Wednesday, December 25, 2019 - link

    Seen from a distance, the pic looks like an owl 🦉
  • jmunjr - Monday, January 6, 2020 - link

    One good thing about the SUPER variant of the 1650 is that it adds the Turing version of NVENC, which will boost performance for live streaming and Plex transcoding. The base 1650 used the Volta NVENC for some reason.
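
As a rough illustration of the NVENC use case mentioned in the comment above, here is a minimal sketch of driving a hardware-accelerated H.264 transcode through ffmpeg's h264_nvenc encoder from Python. The file names, bitrate, and preset are illustrative assumptions rather than values from the review, and it assumes an ffmpeg build with NVENC support plus a working NVIDIA driver.

```python
# Minimal sketch: offload an H.264 transcode to the GPU's NVENC block via ffmpeg.
# Assumes ffmpeg was built with NVENC support and an NVIDIA driver is installed;
# file names, bitrate, and preset below are illustrative, not from the article.
import shutil
import subprocess
import sys


def nvenc_transcode(src: str, dst: str, bitrate: str = "6M") -> None:
    """Transcode src to H.264 using the hardware encoder (h264_nvenc)."""
    if shutil.which("ffmpeg") is None:
        sys.exit("ffmpeg not found on PATH")
    cmd = [
        "ffmpeg",
        "-y",                  # overwrite output without asking
        "-i", src,             # input file
        "-c:v", "h264_nvenc",  # use NVENC instead of the CPU-based x264 encoder
        "-preset", "p5",       # NVENC preset (p1 fastest .. p7 best quality on newer ffmpeg)
        "-b:v", bitrate,       # target video bitrate
        "-c:a", "copy",        # pass audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    nvenc_transcode("input.mkv", "output.mp4")
```

A software encode would instead use "-c:v libx264" and run entirely on the CPU; offloading that work is what the dedicated NVENC block does during live streaming or Plex hardware transcodes, which is why the Turing-generation encoder in the 1650 Super matters at this price point.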
