
  • mode_13h - Tuesday, June 11, 2019 - link

    A year is still pretty new for a process node. It probably didn't become economically viable for GPU-sized dies until very recently.
  • CiccioB - Tuesday, June 11, 2019 - link

    Yes, and that's why AMD's balance sheet looks so weak at the end of the quarter.
    GPU sales are dragging AMD's quarterly results down, as that division is losing a lot of money compared to the CPU division.
  • evernessince - Wednesday, June 12, 2019 - link

    Lol, we all know Nvidia set the pricing way back when Turing launched. Blaming AMD for pricing Nvidia set 6 months ago is just asinine.
  • eva02langley - Thursday, June 13, 2019 - link

    And they offer twice the performance... the price/performance ratio is better than the RTX 2060's.
  • xrror - Monday, June 10, 2019 - link

    It's like... serious question here.

    Was/are Polaris and Navi actually that bad power/perf wise?
    Or
    Did nVidia hit it out of the park so hard with Maxwell and Pascal that nobody else can catch up?

    Either way it sucks for those of us who game, and don't want to pay >$600 for a tangible upgrade from GTX1070 level and/or actually have usable 4K gaming.

    Pity the person who wants a good VR rig.

    (and no, I'm not an nVidia shill - I'd love to grab another AMD card, but whoever gets me a 4K gaming card for $400 first is gonna win it)
  • mode_13h - Monday, June 10, 2019 - link

    I think you're onto something. When Nvidia set out to design the Tegra X1, they had to focus on power efficiency in a way they never had before. When they scaled that approach up to desktop GPUs, it gave them a perf/W edge that ultimately translated into more performance. Just look at the performance gap between Kepler and Maxwell, even though they shared the same manufacturing node!

    AMD has taken a couple generations to wise up. It seems they are still on the journey.
  • V900 - Monday, June 10, 2019 - link

    Yes, pretty much. Maxwell and Pascal really were that good, even with NVIDIA on an older/bigger node than AMD.

    We’ll see what Intel brings to the GPU market, though.

    As for a tangible upgrade to the 1070, the RTX 2070 is available for $450-500 right now, so no, you wouldn't have to spend >$600.
  • CiccioB - Tuesday, June 11, 2019 - link

    Anyone can catch up, if they're willing to afford the cost of redoing an inefficient architecture.
    In going from Kepler to Maxwell, Nvidia deeply redesigned the entire architecture (making it a bit fatter, and so a little more expensive), but they knew that was what it took to build a better architecture.

    AMD started with GCN in 2012 and is only delivering its "Maxwell" in 2019.
    Even though the technology has advanced, and beyond the 7nm process node, there is still plenty they lack, like the new features Nvidia added in Maxwell, Pascal, and even more in Turing.
    They have only just started to understand that memory compression is an advantage rather than wasted transistors. From that point of view they are about 6 years behind.
  • mode_13h - Tuesday, June 11, 2019 - link

    They're definitely not 6 years behind! They introduced tile rendering in Vega, which Nvidia first brought out in Maxwell. So, perhaps more like 2-3 years.
  • CiccioB - Wednesday, June 12, 2019 - link

    On geometry throughput they are 6 years behind.
    The same goes for memory compression, which lets Nvidia use roughly 33% less bandwidth; that forced AMD to use expensive HBM on high-end cards to avoid an enormous, expensive memory bus on GPUs that are already fatter than the competition for the same performance.
    And that's without mentioning the double-projection feature and the voxel acceleration for better volumetric lighting and effects (which we only see through GameWorks extensions, since no console engine is designed to support them; AMD has no dedicated acceleration for them, so they would run like a slideshow).
