AMD Ryzen 5 2400G and Ryzen 3 2200G Integrated Graphics Frequency Scaling
by Gavin Bonshor on September 28, 2018 12:30 PM EST
- Posted in: Ryzen 3 2200G, Ryzen 5 2400G
One of the last outstanding questions from our previous Ryzen APU coverage is how the integrated graphics scales with overclocking. As these low-end Ryzen APUs are all about gaming on a budget, our previous looks into core frequency and memory scaling lead naturally into examining how well the graphics overclocks and what extra performance can be had with a light touch of BIOS settings. We pushed both of our APUs to 1600 MHz on the graphics, representing up to a +45% overclock, which translates into some interesting results.
|Recommended Reading on AMD Ryzen APUs|
|2400G Review||2200G Review||Overclocking||Delidding|
|Core Scaling||Memory Scaling||Graphics Scaling||Best CPUs|
Ryzen 2000 Series APUs: Going For Gaming
The pairing of AMD's high-performance Zen compute cores and Vega graphics into a tidy little package was a resuscitation for integrated graphics, making the low-end desktop market a lot more interesting, as our Ryzen 5 2400G and Ryzen 3 2200G review concluded. In order to increase performance for gaming, there were three potential avenues to explore. The first led to our analysis of how memory frequency scales with AMD's APUs, which, given the relationship between memory performance and gaming, showed that faster memory can have a positive impact on frame rates. The second was our Ryzen 5 2400G and Ryzen 3 2200G core frequency scaling article, which showed that increasing the CPU core speed did not have the effect on gaming performance one might hope to find. The last of the set is overclocking the integrated graphics frequency, which often leads to a direct increase in frame rates.
The interesting thing to take from our Ryzen 2000 series APU overclocking guide was that the performance increase was easy to spot when every component on the amalgamated Zen and Vega package, plus the memory, was pushed. The biggest takeaway from our Ryzen APU memory scaling piece was that the Infinity Fabric interconnect did translate memory speed into gaming performance, more so than pure CPU core frequency in certain cases. It was also noted that overclocking everything at once was more difficult than increasing each part independently: what was a good individual overclock was not always stable when the CPU, memory, and graphics were overclocked together.
Focusing specifically on the integrated graphics for this article, the primary aim is to ascertain whether overclocking the Vega cores on their own yields a big enough benefit in our game testing suite to make it worthwhile. As long as the graphics can be continually fed by the memory and Infinity Fabric, we should see a good linear increase, especially when the GPU is the bottleneck. However, the resolutions aimed at these GPUs might throw interesting features into the mix.
In this review we will cover overclocking the integrated graphics, taking each APU from its stock graphics frequency up to the highest overclock we could achieve, in 50 MHz steps. The results are interesting.
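The test matrix above is easy to reason about with a little arithmetic. As a minimal sketch, the snippet below enumerates the 50 MHz test points and the percentage overclock at each one; the stock iGPU clocks used here (1100 MHz for the Ryzen 3 2200G, 1250 MHz for the Ryzen 5 2400G) are AMD's published defaults, and the 1600 MHz ceiling is the peak reached in this testing.

```python
# Enumerate iGPU test points in 50 MHz steps, from stock to the ceiling.
# Stock clocks: 1100 MHz (Ryzen 3 2200G), 1250 MHz (Ryzen 5 2400G).

def frequency_steps(stock_mhz, max_mhz, step_mhz=50):
    """Return (frequency, percent overclock) pairs for each test point."""
    points = []
    freq = stock_mhz
    while freq <= max_mhz:
        points.append((freq, round(100 * (freq - stock_mhz) / stock_mhz, 1)))
        freq += step_mhz
    return points

for name, stock in (("Ryzen 3 2200G", 1100), ("Ryzen 5 2400G", 1250)):
    steps = frequency_steps(stock, 1600)
    top_freq, top_pct = steps[-1]
    print(f"{name}: {len(steps)} test points, peak {top_freq} MHz (+{top_pct}%)")
```

Note that the same 1600 MHz ceiling represents a much larger relative overclock on the 2200G (about +45%) than on the 2400G (about +28%), which is worth keeping in mind when comparing the scaling curves of the two chips.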
Test Bed and Hardware
As per our testing policy, we take a premium category motherboard suitable for the socket and equip the system with a suitable amount of memory. With this test setup, we are using the BIOS to set the integrated graphics frequency using the provided straps on the MSI B350I Pro AC motherboard. The memory is set to the maximum supported official speed. The CPU core frequency on both the Ryzen 5 2400G and Ryzen 3 2200G was left at their respective default settings.
| Component | Hardware |
|---|---|
| Processors | AMD Ryzen 3 2200G / AMD Ryzen 5 2400G |
| Motherboard | MSI B350I Pro AC |
| Cooling | Thermaltake Floe Riing RGB 360 |
| Power Supply | Thermaltake Toughpower Grand 1200 W Gold PSU |
| Memory | G.Skill Ripjaws V |
| Integrated GPU | Vega 8 (2200G) / Vega 11 (2400G) |
| Hard Drive | Crucial MX300 1 TB |
| Case | Open Test Bed |
| Operating System | Windows 10 Pro |
We used our previous gaming suite for this comparison. It is a little dated, but it still holds up; we will be using newer games in future analysis pieces.
| Game | Genre | Released |
|---|---|---|
| Shadow of Mordor | Action / RPG | Sep |
| Total War: WARHAMMER 2 | RTS | Sep |
| Ashes of the Singularity | RTS | Mar |
| Rise of the Tomb Raider | Action | Nov |
AMD's APU Stack
Since our last big coverage of AMD's Ryzen APUs, it was announced that several new parts will be coming to the market under the 'Athlon' brand. These will be much cheaper parts, starting at $55, and offering fewer cores and fewer Vega compute units for that market.
| AMD's APU Stack | µArch | Cores / Threads | Base (MHz) | Turbo (MHz) | Vega CUs | TDP | Price |
|---|---|---|---|---|---|---|---|
| Ryzen 5 2400G | Zen | 4 / 8 | 3600 | 3900 | 11 | 65 W | $169 |
| Ryzen 3 2200G | Zen | 4 / 4 | 3500 | 3700 | 8 | 65 W | $99 |
| Athlon 240GE | Details to be disclosed in Q4 | | | | | | |
| Athlon 220GE | Details to be disclosed in Q4 | | | | | | |
| Athlon 200GE | Zen | 2 / 4 | 3200 | - | 3 | 35 W | $55 |

*The 2400GE and 2200GE are 'released' but not available at retail.
We currently have the 200GE in hand for a future review, and everything in this article should apply to that APU as well.
Pages In This Review
- AMD Ryzen 2000 Series: Going For Gaming
- Overclocking the Integrated Graphics: How To
- AMD Ryzen 5 2400G: Gaming Tests (1)
- AMD Ryzen 5 2400G: Gaming Tests (2)
- AMD Ryzen 3 2200G: Gaming Tests (1)
- AMD Ryzen 3 2200G: Gaming Tests (2)
- Overall Analysis
Comments
PeachNCream - Friday, September 28, 2018
Interesting analysis, though it's a bit of a foregone conclusion these days to expect a GPU overclock to improve performance in games more than a CPU overclock since the central processor, after a point, has very little role in increasing framerates.
This one struck me as odd though - "...Ryzen APUs are marketed for 720p gaming, and while resolutions such as 2160p and 1440p are out of reach purely for performance reasons, we have opted to use moderate settings at 1080p for our testing."
Were the tests executed at 1080p so they would align better in the Bench? It seems more reasonable to test at 720p given the various limits associated with iGPUs in general and the use of 1080p just comes across as lazy in the same way Anandtech tests CPU performance in games at resolutions so high that GPU performance masks the differences in various processors. Tom's Hardware, back when the good doctor actually ran it, yanked resolution down as low as possible to eliminate the GPU as a variable in CPU tests and it was a good thing.
stuffwhy - Friday, September 28, 2018
Just purely speculating, is it possible that 720p results are just great (60+ fps) and need no testing? One could hope.
gavbon - Friday, September 28, 2018
My reasoning for selecting 1080p gaming tests over 720p was mainly because the other scaling pieces were running at the same resolution. Not just the iGPU tests, but the dGPU testing with the GTX 1060 too. It wasn't a case of being 'lazy', but the majority of gamers who currently use Steam play at 1080p, and as it's the most popular resolution for gamers, I figured that's where I would lay it down.
neblogai - Friday, September 28, 2018
Even if the monitor is 1080p, a lot of 2200G users may want to run games at 1080p with resolution scaling, for better fps. In effect, at 720p or 900p. Most games support it these days. So, the popularity of 1080p monitors does not really make 720p tests less useful for this level of GPU performance.
V900 - Friday, September 28, 2018
Would be great if you had tested just one game at 720p.
I know this is what I would be interested in knowing/reading if I was a possible customer.
usernametaken76 - Sunday, September 30, 2018
I honestly think this "majority of gamers who currently use Steam use 1080p" argument is affected by a) laptop users (see the high number of 1366x768 users), who therefore game at whatever resolution their laptop panel is set to...
Which leads one to ask what the point of testing desktop parts is when you use that as a basis for what and how to test.
TheJian - Friday, October 5, 2018
Agree 100%. They do a lot of dumb testing here. Ryan has been claiming 1440p was the "enthusiast resolution" since the GeForce 660 Ti. I don't even think you can say that TODAY as I'm staring at my two monitors (have a 3rd also 1080p), both of which are 1200p/1080p.
For me, I need everything turned on, and you need to show where it hits 30fps at that level. Why? Because the designers of the games didn't want you to play their game at "MODERATE SETTINGS"...ROFL. Just ask them. They design EXACTLY WHAT THEY WANT YOU TO SEE. Then for some reason, reviewers ignore this, and benchmark the crap out of situations I'd avoid at all costs. I don't play a game until I can max it on one of my two monitors with my current card. If I want to play something that badly early, I'll buy a new card to do it. All tested resolutions should be MAXED OUT settings wise. Why would I even care how something runs being degraded? Show me the best, or drop dead. This is why I come to anandtech ONLY if I haven't had my fill from everywhere else.
One more point, they also turn cards etc. down. Run as sold, PERIOD. If it's an OC card, show it running with a simple checkbox that OC's it to the max the card allows as their defaults. IE most have game mode, etc. Choose the fastest settings their software allows out of their usual 3 or 4 default settings. I'm not talking about messing with OCing yourself, I mean the chosen 3-4 they give in the software for defaults. Meaning a monkey could do this, so why pretend it isn't shipped to be used like this? What user comes home with an OC card and reverts to NV/AMD default ref card speeds? ROFL. Again, why I dropped this site for the most part. Would you test cars with 3 tires? Nope. They come with 4...LOL. I could go on, but you should get the point. Irrelevant tests are just that.
flyingpants265 - Tuesday, March 5, 2019
Hate to tell you this, but 4k is the enthusiast resolution now.
808Hilo - Saturday, October 13, 2018
Is it just me or are these tests just for borderline ill people?
Playing 4k with a 1080/1800/32/S970. Works reasonably well. I also do everything else in 4k. Would I go back to lower res? No way. Artificial benchmarking is one thing, real world is 4k. Test this rez and we get a mixed GPU, APU, CPU bench. Build meaningful systems instead of artificially pushing single building blocks. Push for advancements.
Targon - Friday, September 28, 2018
The big problem with these APUs is that they limit the number of PCI Express lanes, so if you DO decide to add a video card, the APU in this case will reduce performance, compared to a normal CPU without the graphics.