The Test

While the GeForce GTX 1650 is rolling out as a fully custom launch, the nature of the entry-level segment means that boards will vary little from one vendor to the next. For one, going beyond a 75W TDP would require an external PCIe power connector. So the 75W ZOTAC GeForce GTX 1650, with its boost clock dialed down 30MHz to reference, is a good stand-in for a generic reference GTX 1650, allowing us to keep testing and analysis as apples-to-apples as possible. While not perfect, this should be reasonably accurate as a virtual reference card for reference-to-reference comparisons.

Overall, as this review primarily covers low- to mid-range cards, all game benchmarking is done at 1080p: our standard 1080p Ultra settings, as well as the High and Medium options that are better suited to these sorts of sub-$150 cards.

Test Setup
CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (F9g)
PSU: Corsair AX860i
Storage: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200, 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards:
  ZOTAC GAMING GeForce GTX 1650 OC
  NVIDIA GeForce GTX 1650 (4GB)
  AMD Radeon RX 570 8GB
  AMD Radeon RX 570 4GB
  AMD Radeon RX 460 4GB (14 CU)
  AMD Radeon R7 370 (2GB)
  NVIDIA GeForce GTX 1660 Ti
  NVIDIA GeForce GTX 1660
  NVIDIA GeForce GTX 1060 6GB Founders Edition (1280 cores)
  NVIDIA GeForce GTX 1060 3GB (1152 cores)
  NVIDIA GeForce GTX 1050 Ti (4GB)
  NVIDIA GeForce GTX 1050 2GB
  NVIDIA GeForce GTX 960 (2GB)
  NVIDIA GeForce GTX 950
  NVIDIA GeForce GTX 750 Ti
Video Drivers: NVIDIA Release 430.39; AMD Radeon Software Adrenalin 2019 Edition 19.4.3
OS: Windows 10 x64 Pro (1803), Spectre and Meltdown patched

Driver-wise, the 430.39 release was not made available before launch, nor was it the smoothest, with a 430.59 hotfix coming out just this week to resolve bugs and performance issues. In our testing, we did observe some flickering in Ashes.

Comments

  • PeachNCream - Tuesday, May 7, 2019 - link

    Agreed with nevc on this one. When you start discussing higher-end, higher-cost components, consideration for power consumption largely comes off the proverbial table, because priority is naturally assigned more to performance than to purchase price, electrical consumption, or TCO.
  • eek2121 - Friday, May 3, 2019 - link

    Disclaimer: not done reading the article yet, but I saw your comment.

    Some people look for low-wattage cards that don't require a power connector. These types of cards are particularly suited for Mini-ITX systems that may sit under the TV; the 750 Ti was super popular because of this. Having Turing's HEVC video encode/decode is really handy. You can put together a nice small Mini-ITX build with something like the Node 202 and it will handle media duties much better than other solutions.
  • CptnPenguin - Friday, May 3, 2019 - link

    That would be great if it actually had the Turing HEVC encoder - it does not; it retains the Volta encoder for cost savings or some other Nvidia-Alone-Knows reason. (source: Hardware Unboxed and Gamer's Nexus).

    Anyone buying a 1650 and expecting to get the Turing video encoding hardware is in for a nasty surprise.
  • Oxford Guy - Saturday, May 4, 2019 - link

    "That would be great if it actually had the Turing HVEC encoder - it does not; it retains the Volta encoder"

    Yeah, the lack of B-frame support stinks.
  • JoeyJoJo123 - Friday, May 3, 2019 - link

    Or if you're going with a Mini-ITX low-wattage system, you can cut out the 75W GPU and just go with a 65W AMD Ryzen 2400G, since the integrated Vega GPU is perfectly suitable for an HTPC-type system. It'll save you way more money with that logic.
  • 0ldman79 - Sunday, May 19, 2019 - link

    What they are going to do, though, is look at the faster GPU + PSU vs. the slower GPU alone.

    People with OEM boxes are going to buy one part at a time. Trust me on this, it's frustrating, but it's consistent.
  • Gich - Friday, May 3, 2019 - link

    $25 a year? So 7 cents a day?
    7 cents is more than 1kWh costs where I live.
  • Yojimbo - Friday, May 3, 2019 - link

    The US average is a bit over 13 cents per kilowatt-hour. But I made an error in the calculation and was way off; it's more like $15 over 2 years, not $50. Sorry.
  • DanNeely - Friday, May 3, 2019 - link

    That's for an average of 2h/day gaming. Bump it up to a hard core 6h/day and you get around $50/2 years. Or 2h/day but somewhere with obnoxiously expensive electricity like Hawaii or Germany.
  • rhysiam - Saturday, May 4, 2019 - link

    I'd just like to point out that if you've gamed for an average of 6h per day over 2 years with a 570 instead of a 1650, then you've also been enjoying 10% or so extra performance. That's more than 4,000 hours of higher detail settings and/or frame rates. If people are trying to calculate the true "value" of a card, then I would argue that this extra performance over time has to count for something; let's not forget the performance benefits!
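The back-of-the-envelope math in this exchange can be sketched as a quick script. Assumptions are mine, not the commenters': the roughly 75W of extra board power an RX 570 draws over a GTX 1650 is consumed the entire time while gaming, at the ~$0.13/kWh US-average rate Yojimbo cites.

```python
def gaming_cost_usd(watts, hours_per_day, rate_per_kwh=0.13, years=2):
    """Electricity cost of drawing `watts` for `hours_per_day`, every day, over `years`."""
    kwh = watts / 1000 * hours_per_day * 365 * years
    return kwh * rate_per_kwh

# Extra draw of an RX 570 over a GTX 1650 is ~75W:
light = gaming_cost_usd(75, 2)   # 2h/day: roughly $14 over two years
heavy = gaming_cost_usd(75, 6)   # 6h/day: roughly $43 over two years
print(f"2h/day: ${light:.2f}, 6h/day: ${heavy:.2f}")
```

At typical US rates the two-year difference is in the $15-$50 range the thread converges on; only at heavy daily use, or at German/Hawaiian electricity prices, does it approach the price gap between the cards.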
