200m Gaming Medley

Arguably the most interesting performance results are going to be in games, so we will start there. We have dropped most of the older titles from our test suite, as Battlefield 2 or FEAR performance doesn't mean a whole lot on modern hardware. We've tried for a more varied selection of games this time around, with entries from most major genres.

For the FPS group, we have Crysis, Enemy Territory: Quake Wars, and Unreal Tournament 3. Real-time strategy gaming is represented by Company of Heroes. Assassin's Creed and Devil May Cry 4 take care of the action-adventure genre, and GRID covers driving simulations - and all three of these also represent recent console ports/cross-platform releases. Finally, we have results from Oblivion and Mass Effect for the RPG lovers like me.

We use the built-in performance tests in Company of Heroes, Crysis, Devil May Cry 4, Enemy Territory: Quake Wars, and Unreal Tournament 3. For Assassin's Creed, GRID, Mass Effect, and Oblivion we benchmark a specific scene using FRAPS. In all cases, we run each benchmark at least four times, discard the single highest result as a potential outlier, and report the best of the remaining scores.
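As a concrete illustration of that selection rule, here is a minimal sketch; the function name and the sample frame rates are invented for the example.

```python
def report_score(runs):
    """Pick the reported benchmark score from repeated runs:
    drop the single highest run as a potential outlier,
    then report the best of the remaining results."""
    assert len(runs) >= 4, "each benchmark is run at least four times"
    return max(sorted(runs)[:-1])

# Hypothetical frame rates (FPS) from four runs of the same benchmark:
print(report_score([58.2, 57.9, 64.1, 58.4]))  # -> 58.4
```

Discarding the top run guards against a fluke result (e.g. part of the scene not rendering), while taking the best of the rest avoids penalizing a run that hit background-task stutter.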

We will use resolution scaling graphs to compare the different laptop configurations, as that will allow us to examine how the GPU and CPU affect performance. At lower resolutions we should become more CPU limited, while the higher resolutions and detail settings should put more of a bottleneck on the GPU.

Gaming performance at least matches the P-6831 in every test, and in several instances the P-7811 is substantially faster. The games where performance is tied are somewhat surprising, as most are considered GPU limited. Crysis is a virtual tie between all three models, indicating that the bottleneck is GPU memory bandwidth rather than GPU shader performance; Quake Wars is similarly bandwidth limited. In the remaining games, we see everything from a tie at 1280x800 in Assassin's Creed to as much as an 80% lead in the Devil May Cry 4 benchmark at lower resolutions.

The average performance lead of the 7811 over the 171XL in non-bandwidth limited situations does appear to be around 20%, matching the GPU core speed increase, so the 9800M GTS is definitely an improvement. Shader clocks are apparently 1250MHz on all the 8800M/9800M parts, so we would categorize any differences of more than 20% as coming from the drivers and/or 64-bit OS (or perhaps some other hardware difference).

The significantly slower CPU in the 6831 does limit performance at lower resolutions, and it's important to remember that the 6831 ships with a 1440x900 LCD - the other resolutions were tested using an external display just to show how performance scales at higher resolutions. The 171XL has a faster CPU than the 7811, so the performance leads of the 7811 would actually be somewhat higher if the CPUs were equal. Any way you slice it, though, the performance of the 7811 is very impressive for the price. The 9800M GTS does tend to be slightly slower than the 8800M GTX, but only by about 10%. Considering laptops with the 8800M GTX typically cost $2200 or more, the P-7811 is a great follow-up to the P-6831.

Comments

  • JarredWalton - Friday, August 15, 2008 - link

    9800M GT has 64 SPs; the GTS has 96 SPs (like the 8800M GTX), and the 9800M GTX has 112 SPs. There's some debate about whether this is simple rebranding or there are actual differences; judging by the performance, I'd bet on there being some changes. I believe, for example, that the 9800M has the VP3 video processing engine and is fabbed on 55nm instead of 65nm... but I might be wrong.
  • JarredWalton - Friday, August 15, 2008 - link

    Suck... I screwed that up. I don't know why NVIDIA switches GT/GTS meanings all the time. 8800 GTS 320/640 < 8800 GT < 8800 GTS 512. Now we have 8800M GTS < 8800M GT. Stupid. Also worth noting is that NVIDIA has released no specific details on the core/RAM clock speeds for the 9800M series.
  • fabarati - Friday, August 15, 2008 - link

    I was basing my information on what Clevo resellers were saying in the Notebook Review forums. There was a huge fight about this, due to NVIDIA posting the wrong specs on their webpage. When the NDA was lifted, they could come out and say that they were the same card.

    But yeah, NVIDIA is being really annoying with the suffixes. ATI has a pretty clear lineup, for now.
  • JarredWalton - Friday, August 15, 2008 - link

    Okay, updated with the clock speed info from nTune (as well as NVIDIA's specs pages). It looks like all of the shaders are 1250MHz, while the RAM speed on all the units I've seen so far is 800MHz (1600MHz DDR3). I don't know for sure what the clocks are on the 9800M GT/GTX, as I haven't seen a laptop with that GPU yet. So in order of performance, and assuming 600MHz GPU clocks on all the 9800 cores, we have:

    8800M GTS
    9800M GTS (up to ~20% faster than 8800M GTS)
    8800M GTX (up to ~50% faster than 8800M GTS)
    9800M GT (up to ~80% faster than 8800M GTS)
    9800M GTX (up to ~110% faster than 8800M GTS)

    Now, the maximum performance increase relative to the 8800M GTS is based on the game being purely shader processing limited. Many games depend on GPU memory bandwidth and fill rate as well, in which case the difference will be much smaller.
  • fabarati - Friday, August 15, 2008 - link

    Oh, and a 1440x900 resolution is a WXGA+ resolution, not SXGA+.
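The scaling figures in JarredWalton's list above follow directly from SP count times core clock. A quick sanity check, assuming performance is purely shader limited, with 600MHz cores on the 9800M parts (as assumed in the comment), a 500MHz core on the 8800M parts (an assumption here), and SP counts of 64/64/96/96/112 as implied by the ranking:

```python
# Relative shader throughput vs. the 8800M GTS, assuming performance
# scales with SP count x core clock and nothing else (pure shader limit).
# Clocks and SP counts are assumptions taken from the discussion above.
gpus = {
    "8800M GTS": (64, 500),
    "9800M GTS": (64, 600),
    "8800M GTX": (96, 500),
    "9800M GT":  (96, 600),
    "9800M GTX": (112, 600),
}

base = 64 * 500  # 8800M GTS throughput as the baseline
leads = {name: round((sps * mhz / base - 1) * 100)
         for name, (sps, mhz) in gpus.items()}

for name, pct in leads.items():
    print(f"{name}: up to ~{pct}% faster than the 8800M GTS")
```

Running this reproduces the ~20/50/80/110% figures quoted in the list; as noted above, games limited by memory bandwidth or fill rate will show much smaller gaps.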
