The Intel Core i3 530 Review - Great for Overclockers & Gamers
by Anand Lal Shimpi on January 22, 2010 12:00 AM EST
The Performance & Power Summary
I’ve added the Core i3 530 to our Bench database; if you want a full comparison of results, head over there. What I’m providing here is a subset of our tests to show the 530’s strengths and weaknesses.
In every single non-gaming test, the Core i3 530 bests the Phenom II X2 550 BE. In our gaming benchmarks, the 550 was faster in two out of eight; in the rest, the i3 took the lead. The Core i3 530 also manages to outperform the Phenom II X2 550 BE while using significantly less power. In the battle of the dual-cores, the i3 wins. AMD needs to fight with clock speed, and at 3.1GHz the 550 can’t muster enough to beat the i3.
The Athlon II X4 630 comparison is a little more complicated. In single and lightly threaded applications, the i3 is a much better performer thanks to its higher clock speed. The i3's gaming performance is also significantly better across the board. What the Athlon II X4 loses in clock speed, it makes up for in core count. Things like video encoding and offline 3D rendering are almost always faster on the Athlon II X4 630.
Applications that are bound more by the performance of one or two threads are almost always faster on the Core i3 530. As a general-purpose desktop microprocessor or a chip for a gaming rig, I’d opt for the Core i3 530. If you’re doing a lot of heavily threaded content creation, then the Athlon II X4 is the chip for you. If you’re somewhere in between, the choice is up to you. Our Photoshop test has the two processors very close to one another, with the i3 taking a slight lead.
Power efficiency obviously goes to the Core i3 530 thanks to its 32nm transistors.
107 Comments
Anand Lal Shimpi - Friday, January 22, 2010 - link
No, you're pretty much right on the money. If you have a good system already, you're always better off waiting until the next big thing. In my eyes Penryn wasn't a very big deal if you already had Conroe. Clarkdale's real advantages are in power consumption and threaded performance. If you already have a quad-core CPU, chances are that you'll want to be looking at Lynnfield/Phenom II X4 or wait for Sandy Bridge if you can.

Take care,
Anand
tno - Friday, January 22, 2010 - link
Cool! A reply from the man himself! Thanks, Anand! My leap was from a 2.4GHz Celeron to a PD805 to Penryn, so Penryn seemed like a revelation: highly efficient, easy to cool, fast, and quad-core. Now, if you happen to have any loose systems that you're not using and want to send my way so I can experience the Lynnfield difference myself, I won't object.

tno
kwrzesien - Friday, January 22, 2010 - link
I had an AMD 1.2 GHz single-core with a GeForce 2 MX. It was a HUGE upgrade!

lopri - Friday, January 22, 2010 - link
[QUOTE]We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs.[/QUOTE]

Instead of such fleeting one-liners, how about telling us the title, format, and codec in question so that we can verify it? This is a fine example of yellow journalism.

I'm still waiting for an answer on whether 2560x1600 and dual displays work with these CPUs. Considering the silence, however, I think I know the answer.
Anand Lal Shimpi - Friday, January 22, 2010 - link
It's a Dark Knight rip we use. Take the original Blu-ray, use AnyDVD HD to strip out the DRM, re-encode to reduce the file size, and toss it into an MKV container (a rough sketch of a comparable re-encode appears below). The problem appears on all H.264 content played through MPC-HC, though.

As far as resolution support goes, Intel lists 2560 x 1600 as the maximum resolution available over DisplayPort. For DVI/HDMI you're limited to 1920 x 1200. VGA will get you up to 2048 x 1536.
There are four independent display ports, so in theory you should be able to run one 2560 x 1600 panel and one 1920 x 1200 (or two 25x16 panels if you had a board with dual DisplayPort outputs).
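For anyone who wants to put together a similar test file, here's a minimal Python sketch of the re-encode step that shells out to ffmpeg with libx264, assuming both are installed. The filenames and CRF value are illustrative placeholders, not the exact settings we used for our rip.

import subprocess

# Re-encode a decrypted Blu-ray stream (e.g. produced by AnyDVD HD)
# to H.264 in an MKV container. Filenames and CRF are placeholders.
def reencode_to_mkv(src, dst, crf=18):
    subprocess.run([
        "ffmpeg",
        "-i", src,            # decrypted source stream
        "-c:v", "libx264",    # H.264 video, as in the test rip
        "-crf", str(crf),     # quality target; lower = bigger file
        "-preset", "slow",    # slower encode, better compression
        "-c:a", "copy",       # keep the original audio track
        dst,
    ], check=True)

reencode_to_mkv("decrypted_movie.m2ts", "movie.mkv")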
Take care,
Anand
lopri - Friday, January 22, 2010 - link
Thank you for the explanation, but unfortunately I couldn't replicate the 'problem' (what exactly?) you've experienced. I don't have The Dark Knight, so I tried Children of Men on a neighbor's 785G system I built for him. That title was chosen because its original content on the disc was encoded in VC-1, just like The Dark Knight. MediaInfo gave the following information:

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.1
Format settings, CABAC : Yes
Format settings, ReFrames : 4 frames
Muxing mode : Container profile=Unknown@4.1
Codec ID : V_MPEG4/ISO/AVC
Duration : 1h 49mn
Bit rate : 14.2 Mbps
Nominal bit rate : 14.5 Mbps
Width : 1 920 pixels
Height : 1 040 pixels
Display aspect ratio : 16:9
Frame rate : 23.976 fps
Resolution : 24 bits
Colorimetry : 4:2:0
Scan type : Progressive
Bits/(Pixel*Frame) : 0.296
Stream size : 10.8 GiB (88%)
Title : Video @ 14489 kbps
Writing library : x264 core 67 r1165M 6841c5e
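If you want to repeat this check on your own clips, the same fields can be pulled with the MediaInfo command-line tool. Here's a minimal Python sketch; the field template uses MediaInfo's --Inform syntax, and the filenames are placeholders for whatever test content you have:

import subprocess

# Summarize the video track of each clip via the MediaInfo CLI.
# Assumes the "mediainfo" binary is installed and on the PATH.
TEMPLATE = "Video;%Format% %Format_profile% %Width%x%Height% %BitRate%"

def video_summary(path):
    out = subprocess.run(
        ["mediainfo", "--Inform=" + TEMPLATE, path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

# Placeholder filenames -- substitute your own test clips.
for clip in ["children_of_men.mkv", "dark_knight_trailer.wmv"]:
    print(clip, "->", video_summary(clip))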
Flawless playback in both windowed and full-screen mode, on a 30" LCD. Just to be sure, I tested The Dark Knight trailer, which is a VC-1 clip, and various H.264 content in .mkv, .mp4, and .m2ts, using MPC-HC svn 1.3.3347 32-bit AND 64-bit binaries. The system had a WHQL driver dated 8/17/2009, installed via Windows Update. The only codecs installed are Matroska Splitter and AC3Filter.
So there. Now, what exactly is the problem that I don't see but you do?
WRT resolutions - Intel listed 2560x1600 on G45 as well. I even got an ADD2 card (interesting choice of name, btw) off eBay hoping it'd work, but that was simply a waste of money. I am as skeptical as can be about GMA after my bitter experiences with G35/G45, and it is puzzling why you can't verify that in your lab instead of being a messenger ("Intel says so").
Would you feel bad at all if I said I purchased G35/G45 based on your reviews, only to be greatly disappointed? I couldn't even give away a G35-based system to a junior-high kid, because the kid is someone I see on and off, and I feared potential embarrassment and unexpected calls for support.

Your reviews are full of contradictions, one after another, and I am concerned that you've lost your sense of and connection to the real world.
Shadowmaster625 - Friday, January 22, 2010 - link
Given the level of integration, what is making these motherboards so expensive? When are we going to see $35 motherboards? What would keep the prices from coming down that low?

strikeback03 - Friday, January 22, 2010 - link
IIRC the chipset itself currently costs $40.

Anand Lal Shimpi - Friday, January 22, 2010 - link
Correct. Despite moving much of the "chipset" on-package, the actual H5x chipsets are no cheaper than their predecessors. Remember that as AMD and Intel integrate more onto the CPU, they still want to preserve or increase profit margins. It's just fortunate for all of us that in the process of integration we actually get a performance benefit.

Take care,
Anand
Taft12 - Friday, January 22, 2010 - link
Sounds like we are very much in need of competition in the 3rd-party chipset market like the good old days!

Things are going in the wrong direction, with NVIDIA exiting the market and VIA and SiS long gone...