Enzotech Sapphire and Luna Rev.A


The Enzotech Sapphire Rev.A is one of Enzotech’s current offerings. I’m not sure which is designated as the flagship given the recent fluctuation in prices (both can be found at Newegg for less than $50), but the Sapphire Rev.A is an increasingly popular block thanks to its solid “all-copper” construction (in reality, there’s a plastic cap over the copper and a stainless steel flow-guiding plate) and extremely low restriction. The block is very heavy, especially compared to some of its acetal- and acrylic-topped competitors. The base is fairly thick, but has a very well-machined pin structure and a slight bow. The blue plastic cap is removable to expose a solid copper block, but once it’s removed, the mounting bracket has a lot of wiggle room, which hinders installation.

The Enzotech Luna Rev.A is the other Enzotech offering. It goes in the opposite direction with its design–it employs a very thin base with micropins and an injector. The overall construction is somewhat consistent with the Sapphire’s, though, using an all-copper design with an add-on assembly for the flashing LEDs and a metallic plastic cap. Surprisingly, the Luna Rev.A does not employ a bow–a glaring omission for a modern block. The flashing LEDs are a conspicuous addition. I’m not one for lights (flashing or otherwise), but even if I were, these lights are annoying–they flash far too quickly. Fortunately, Enzotech has posted a modification that allows the end user to disable the flashing (while keeping the lighting on). In that vein, I would have attempted a modification to install a bow with an o-ring (or a return of the Silicone Mod!), but a manufacturing defect in one of the block’s assembly screws prevents me from opening it without permanently damaging it.

This test will focus on the performance of the blocks in general and over a large flowrate spectrum. Results from the installments of Roundup #2 will be compiled, as they’re posted, into an Overall Comparison page.

Thermal Testing Methodology/Specification


My waterblock testing methodology has evolved over the past few months, and I think it’s finally at a resting point where I can start piling up test results rather than tweaking the methodology (which would prevent cross-comparisons). I use Dallas one-wire DS18B20 temperature probes at various points through my watercooling loop and at the air intake to measure temperatures, I’ve isolated the radiators so that the flowrate through them never changes, and I use six different pump settings for each block. Where applicable, I will also test various modifications to the blocks, including different orientations and removing/adding midplates, nozzles, dividers, etc. In some cases I will also modify the mounting system and present results from increased mounting pressure. For every waterblock, I perform 5 mounts of each configuration; the best configuration then goes on to be tested through the full flowrate spectrum.


  • The processor I’m using for this test is my C0/C1 i7 920. I’m running it at 21×200 (4200MHz) at 1.52V loaded on a Gigabyte EX58-UD5. It is unlapped. I’m running 3GB of G.Skill DDR3 2000MHz. All heatsinks on the board are stock and I have fans blowing over the MOSFET area for added stability. The video card is a 4850 1GB with VF830 running in the top slot. The board is sitting on my desk alongside my Odin 1200W PSU and DVDRW and HDD drives.
  • The watercooling loop I’m using is unconventional, but it allows me to test the way I want to test.
    • It consists of two MCR320s with three pairs of Yate Loon D12SH-12 fans in push/pull on each radiator. I use a D-Tek DB-1 pump on the radiator subloop.
    • For the block subloop, I use a Laing D5 and three Laing DDC3.2s for the pumps as well as Dwyer RMC-142 and RMC-144 flowmeters to monitor and track flowrates.
    • I use a shared Primochill 8-port reservoir between the two subloops.

  • I do a five-mount test for each block configuration, each with its own TIM application and a full cleaning in between. I’m fond of semi-discarding the best and worst mount data–I present it to the reader, but my final analysis and numbers are based on the median three mounts. As a reviewer, I feel it is my duty to present performance numbers that represent a product’s typical performance. The best and worst mounts are often somewhat anomalous; by performing five mounts and focusing on the middle three (in terms of thermal performance), I feel I am best representing the expected performance of a product.
  • I have 28 temperature probes in use: 24 Dallas DS18B20 Digital one-wire sensors and 4 Intel DTS sensors in the processor.
  • For temperature logging, I use OCCT v3.1.0’s internal CPU polling, which samples all four DTS sensors every second and automatically outputs to .CSV files. I also use OCCT for loading the CPU. For air intake and various water temperatures, I use Crystalfontz 633 WinTest b1.9 to log the Dallas temp probe data on my Crystalfontz 633. I also use WinTest b1.9 to log pump RPM.
  • For processor loading, I find OCCT v3.1.0 to be extremely competent. With the Small Data Set setting, it provides a constant 100% load (so long as WinTest b1.9’s packet debugger is fully disabled) and is extraordinarily consistent. It lets me start the loading and the logging simultaneously with one button push, which helps; I immediately also start logging the Crystalfontz data via WinTest b1.9. I run a 1-hour-and-40-minute program: the first minute is idle, followed by 95 minutes of load, and then 4 minutes of idle. The first 20 minutes of load data are considered warm-up, and the last 75 are used for results.
  • I have found that simply using processor temperature minus ambient temperature is not adequate for Intel’s 65nm Core 2 processors. However, I have found that ambient and core temps scale perfectly fine (1:1) with i7.
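The load windowing and median-three-mount selection described above can be sketched in a few lines of Python. This is a minimal sketch for illustration only: the function names are mine, and the assumption of one temperature sample per second is not necessarily how OCCT lays out its .CSV output.

```python
# Sketch of the analysis steps described above (assumes one core-temperature
# sample per second; OCCT's actual CSV layout may differ).

IDLE_LEAD = 60            # first minute of the run is idle
WARMUP = 20 * 60          # first 20 minutes of load discarded as warm-up
RESULT_WINDOW = 75 * 60   # last 75 minutes of load used for results

def mount_average(samples):
    """Mean core temperature over the 75-minute result window of one mount."""
    start = IDLE_LEAD + WARMUP
    window = samples[start:start + RESULT_WINDOW]
    return sum(window) / len(window)

def block_result(mount_averages):
    """Given per-mount averages from five mounts, drop the best and worst
    and average the median three, as described above."""
    middle_three = sorted(mount_averages)[1:-1]
    return sum(middle_three) / len(middle_three)
```

With five hypothetical per-mount averages, `block_result([58.2, 57.9, 58.0, 58.4, 57.7])` drops 57.7 and 58.4 and averages the remaining three.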