ECS P67H2-A Review: A visit back to Lucid's Hydra
by Ian Cutress on July 21, 2011 9:00 AM EST
If you remember back to October 2008, there were distinct murmurings about Hydra - an encompassing hardware and software solution to bring multiple GPUs together to act as one. Then, in January 2010, Ryan tested the Hydra chip, with the end result being: more development required. In my hands is the ECS P67H2-A, the latest board to include the Hydra solution. Armed with the latest version of the Hydra software, I'm here to review this board, to see whether it works as a suitable P67 solution, and whether Hydra has anything to offer.
In terms of the P67H2-A itself, there are still quite a few areas that need polishing to improve the end user experience. There are BIOS issues relating to usability; there's a serious overclocking issue when multithreading above 4.4 GHz; and the Hydra solution still isn't what we want (though you could argue it's getting there). Out of the box, very few people would have issues. But it's when you get into the meat of the product that some slightly uncomfortable rough edges appear - edges that competitors have worked on to deliver a better product.
The P67H2-A is a Black Series model from the ECS line-up, which means it gets the Black Series treatment - the silver, black and grey livery makes its way onto the PCB, all the slots and connectors, and the power delivery heatsink at least looks styled. The heatsink is a dual copper heatpipe design, which ECS states gives a 15-20°C reduction in PWM temperature.
Along the top we find a 2x4-pin 12 V CPU power connector, and a series of lights indicating how many of the power phases are being utilised under different CPU loads. Next to this is a 4-pin CPU fan header – one of only three fan headers on the board (a PWR header on the other side of the DIMM slots, and a SYS header at the bottom of the board). Having only three fan headers on a high-end board is rather shocking. Beyond the CPU fan header, above the DIMM slots, are voltage read points for vCore, DIMM, IMC, PCH and PLL – these require direct probe contact rather than easy slot-in connectors, and given my overclocking experience detailed below, they are not particularly necessary.
Working down the right-hand side of the board, beyond the four DDR3-2133 (OC) memory slots, are onboard power and reset buttons. The PCH provides the standard two SATA 6 Gbps and four SATA 3 Gbps ports, all supporting RAID 0, 1, 5 and 10. It is odd not to find an additional controller on this board to provide more internal SATA ports, given that so many other boards in this price range typically offer another two SATA 6 Gbps ports via a third-party controller. Instead, we get a controller for the eSATA 6 Gbps ports on the rear I/O panel.
Underneath this is a USB 3.0 header, which in my eyes is in a very odd place – this is a Hydra board, designed for multi-GPU configurations, yet if I use two medium-level GPUs (e.g. 5850s at 9.5 inches long) in the appropriate slots, the USB 3.0 header is obscured completely. This seems to be an oversight in ECS' design. Further below this header is a debug LED, which according to the documentation should double as a 'live' temperature monitor after POST – however, this doesn't happen. Also of note is that this board lacks a usable screw hole in the bottom right for affixing the board to a case – a hole is present, but it is neither in the right place nor of the right type.
Underneath the heatsinks, this board carries a Hydra 200 chip, which provides two important functions: it serves as a bypass to requiring SLI certification, and, like an NF200, it organises extra PCIe lanes. As a result, the three full-length PCIe x16 slots will run at x16/x16/x0 or x16/x8/x8 depending on whether two or three graphics cards are used. As with the NF200, we expected this to result in a small decrease in performance compared to true native x16/x16 chipset implementations, but there may also be overhead associated with using the Hydra hardware (as well as the software). I examine the Hydra results in detail later in the review.
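For clarity, the slot behaviour described above can be summarised in a small sketch – the function name and return convention here are mine for illustration, not anything exposed by ECS or Lucid:

```python
def hydra_lane_allocation(num_gpus: int) -> tuple:
    """Illustrative only: lane widths (slot 1, slot 2, slot 3) for the
    three full-length PCIe x16 slots, as described in the review."""
    if num_gpus <= 1:
        return (16, 0, 0)   # single card gets a full x16 link
    if num_gpus == 2:
        return (16, 16, 0)  # two cards: x16/x16/x0
    return (16, 8, 8)       # three cards: x16/x8/x8

# Example: two GPUs installed
print(hydra_lane_allocation(2))  # (16, 16, 0)
```

Note that with two cards the Hydra 200 presents more downstream lanes (32) than the CPU provides upstream (16) – the same lane-switching trick the NF200 uses, which is why a small performance penalty versus a native x16/x16 platform is expected.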
The PCIe layout itself is a mix of good and bad, using an x1, x16, x1, PCI, x16, PCI, x16 configuration. However, the first x1 slot can be limited in width by the heatsink, blocking any wide PCIe x1 cards, and the only full-length x1 slot will be blocked by any dual-slot GPU. If the top two PCIe x16 slots are populated with GPUs, the user could place an x1 card in the bottom PCIe x16 slot, although I'm unsure whether that slot would then run in x8 mode (as all the PCIe slots are populated) or in x0 mode (because the GPUs are in the first two PCIe x16 slots). It can be very hard to organise a tri-GPU motherboard and please everyone in this regard.
The back panel has a fairly reasonable layout that should cover most people's needs – a PS/2 port for mouse or keyboard, six USB 2.0 ports, four USB 3.0 ports (NEC controller), dual gigabit Ethernet (Realtek) with Teaming support, a clear CMOS button, 8-channel audio jacks (Realtek) and an optical S/PDIF output.