The Test

As Zotac’s GeForce GTX 1650 Super is built to NVIDIA’s reference specifications, there is no need to dial it down or otherwise adjust it to represent a reference card. As such, the card has been tested as-is.

Meanwhile, as with last week’s Radeon RX 5500 XT review, because the card is aimed squarely at 1080p gaming and is clearly underpowered for anything beyond that, our benchmark results will focus on that resolution.

As for drivers, we’re using the latest releases from both vendors: release 441.41 for the NVIDIA cards, and Radeon Software Adrenalin 2020 Edition 19.12.2 for the AMD cards.

CPU: Intel Core i9-9900K @ 5.0GHz
Motherboard: ASRock Z390 Taichi
Power Supply: Corsair AX1200i
Hard Disk: Phison E12 PCIe NVMe SSD (960GB)
Memory: G.Skill Trident Z RGB DDR4-3600 2 x 16GB (17-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon RX 5500 XT 8GB
             AMD Radeon RX 5500 XT 4GB
             AMD Radeon RX 580
             AMD Radeon RX 570
             AMD Radeon RX 460 4GB
             AMD Radeon R9 380
             NVIDIA GeForce GTX 1660 Super
             NVIDIA GeForce GTX 1660
             NVIDIA GeForce GTX 1650 Super
             NVIDIA GeForce GTX 1060 3GB
             NVIDIA GeForce GTX 1050 Ti
Video Drivers: NVIDIA Release 441.41
               NVIDIA Release 441.07
               AMD Radeon Software Adrenalin 2020 Edition 19.12.2
OS: Windows 10 Pro (1903)
Comments

  • WetKneeHouston - Monday, January 20, 2020 - link

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019 - link

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019 - link

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019 - link

    100% agreed on this.

    It's up to the consumers themselves how, where, and why they use the device, be it gaming, streaming, "mundane" things like watching videos, or even emulation and "creation" purposes.

    IMO it's very much the same BS smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice."

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is pretty by design but also stupidly easy to break, so you have no choice but to pay for a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, HDMI, and a full-size DP port (with maybe 1 mini DP),

    but they seem to "not bother," citing silly reasons like "it is impossible / customers no longer want this."

    As you point out, the consumer decides the usage case: provide the best possible product, give the best possible no-BS review/test data, and we the consumers will decide with the WALLET whether it is worth it or not.

    That would likely save much $$$$$$$$$$ and earn consumer <3, by virtue of people not buying something they will regret in the first place.

    Hell, I am gaming on a Radeon 7870 with a 144Hz 1440p monitor (it only runs at 60Hz since the card doesn't fully support anything higher). I still manage to game on it "just fine"; maybe not ultra everything, but comfortably (for me) at high-to-medium "tweaked" settings.

    Amazing how long these last when they are built properly and don't have the crap kicked out of them. That, and not having hundreds to thousands to spend every year or so (which is most people these days), should mean more to these mega corps than "let us sell something that most folks really do not need." Make it right, and upgrades will happen when they are really needed instead of the hardware ending up in the e-waste bin in a few months' time.
  • timecop1818 - Friday, December 20, 2019 - link

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019 - link

    Yea, ok, sure... so you still want the VGA connector instead???
  • Qasar - Friday, December 20, 2019 - link

    DVI is a lot more useful than the VGA connector that monitors STILL come with, yet we STILL have those on new monitors. No modern monitor should have that garbage connector.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019 - link

    VGA.. dead connector, limited use case, mostly business... DVI.. still useful, especially in KVMs... I haven't seen a DisplayPort KVM, and the HDMI KVM I had died a few months after I got it.. but the DVI KVMs I have still work fine. Each of the three (DVI, HDMI and DisplayPort) still has its uses.
  • Spunjji - Monday, December 23, 2019 - link

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
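
To put rough numbers on that bandwidth point, here is a minimal back-of-the-envelope sketch. The link limits and display modes below are commonly cited nominal figures chosen for illustration (uncompressed pixel data, blanking overhead ignored); none of them come from the review itself.

```python
# Rough check of the DVI vs. DisplayPort bandwidth argument above.
# Link limits are nominal, commonly cited effective data rates (illustrative only).

LINK_LIMITS_GBPS = {
    "Single-link DVI (165 MHz pixel clock)": 165e6 * 24 / 1e9,   # ~3.96 Gbps
    "Dual-link DVI (2 x 165 MHz)": 2 * 165e6 * 24 / 1e9,         # ~7.92 Gbps
    "DisplayPort 1.2 (HBR2, effective)": 17.28,
    "DisplayPort 1.4 (HBR3, effective)": 25.92,
}

def pixel_data_rate_gbps(width: int, height: int, refresh_hz: int, bpp: int = 24) -> float:
    """Uncompressed pixel data rate in Gbps, ignoring blanking intervals."""
    return width * height * refresh_hz * bpp / 1e9

modes = [
    ("1920 x 1080 @ 60 Hz", 1920, 1080, 60),
    ("2560 x 1440 @ 144 Hz", 2560, 1440, 144),
    ("3840 x 2160 @ 60 Hz", 3840, 2160, 60),
]

for name, w, h, hz in modes:
    need = pixel_data_rate_gbps(w, h, hz)
    fits = [link for link, cap in LINK_LIMITS_GBPS.items() if cap >= need]
    print(f"{name}: ~{need:.1f} Gbps needed; fits on: {fits if fits else 'none of the listed links'}")
```

The takeaway matches the comment: 1080p60 squeezes through even single-link DVI, but high-refresh 1440p and 4K60 both exceed dual-link DVI while fitting comfortably within DisplayPort 1.2 and later.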
