Wednesday, April 6, 2011

ASUS RAMPAGE III BLACK EDITION

It's been a really long time, but I finally managed to find some spare moments and energy to post something new on my blog: a post about a product worth mentioning :)

At CeBIT last month, ASUS showed off the all-new Rampage III BLACK EDITION motherboard. I received a unit from ASUS Taiwan for review about a week back and thought I should share some snapshots with you. I'll try to do a quick review later, so check back again. See the images below to soothe your eyes ;)










Here is what ASUS has to say about it:



The complete solution for gamers, overclockers, and PC enthusiasts
Performance is critical to gamers. Anything gamers buy, they want to know that their hard-earned dollar will give them either the advantage they need to win, or the power to enjoy games at the highest settings.

Introducing the Rampage III Black Edition - ASUS' ROG-branded superpowered motherboard, designed to give your system the killer edge. Every feature you could possibly want is on this motherboard, making it the PC enthusiast's dream component. When it comes to high performance hardware, nothing can touch the Rampage III Black Edition.

Higher overclocking potential
The Black Edition of Rampage III offers it all: great features, gaming-focused LAN solution, high-quality audio, and the highest overclock potential of any ASUS board. This is perfect for the target consumers of the Rampage III Black Edition: power users, enthusiasts, and extreme gamers. Users with the know-how should be encouraged to choose the Black Edition in order to push their overclock speeds ever higher and faster, as it is the most powerful and feature-packed motherboard in ASUS' lineup.

ROG ThunderBolt LAN gives you higher performance for multiplayer gaming
The two most overlooked items in a gaming rig are the Network Interface Card and the Audio Card. Why are they important? The NIC determines your latency and data throughput, which is especially important for fast-paced, frenetic multiplayer gaming, where a difference of milliseconds can mean scoring a head-shot or missing your opponent entirely. One half of ROG ThunderBolt is the Killer™ E2100 NPU, which offloads network calculations from the CPU and translates to faster in-game response. The difference you'll see with ROG ThunderBolt is immediate - you'll never want to switch back to regular NICs after you've played with ROG ThunderBolt.

ROG ThunderBolt delivers Xonar-quality audio
ROG ThunderBolt includes on-board Xonar-quality audio. A built-in headset amplifier will unlock the true performance potential of high-quality headsets. Already lauded for award-winning Signal to Noise ratio and cleaner audio quality, on-board Xonar sound card can bring games to life in a big way. Support for 3D Surround Sound and EAX will let gamers hear their opponents coming from a mile away. Hardcore competitive gamers all know that high-quality audio is essential for gaining that upper hand.

Making life easier for extreme users:
ROG features take the R3BE to the next level for PC enthusiasts, overclockers, and gamers. ROG Connect and GPU TweakIt allow users to have powerful, real-time control over motherboard and GPU settings, saving lots of time for the time-consuming task of overclocking.

Enthusiasts will also want to hone in on features such as the PCIe x16 Lane Switch, which allows users to quickly turn on and turn off PCIe lanes for easy video card troubleshooting.

GPU.DIMM Post gives users further control to ensure graphics and memory functionality, saving the time normally required to hunt down damaged components.

MSRP: $599 USD
Specifications:
  • Intel Core i7 Processor Extreme Edition / Core i7 Processor
  • Support Socket LGA 1366
  • Intel X58 / ICH10R Chipset
  • Support QPI up to 6400 MT/s
  • Triple-channel DDR3 2200(O.C.)/2133(O.C.)/2000(O.C.)/1800(O.C.)/1600/1333/1066 MHz Support
  • 4 x PCIe 2.0 x16 with CrossFireX and 3-Way SLI Technology support
  • 2 x SATA 6Gb/s ports
  • 6 x SATA 3.0 Gb/s ports with RAID 0,1,5 and 10
  • 2 x eSATA 3.0 Gb/s ports
  • 2 x USB 3.0 ports
  • Intel Gb LAN
  • 8-Channel HD Audio Codec
© 2011 ASUS Computer International. All rights reserved. Terms and conditions are subject to change without notice. Reproduction in whole or in part without written permission is prohibited. ASUS is a trademark of ASUSTeK Computer Inc.




Sunday, December 28, 2008

Intel E5200 2.5GHz Overclocked to 4.0GHz

For more than a year I had some liquid cooling gear lying around. It was a customized purchase meant for use with my old Asus Striker Extreme for building an XTREME rig..... But then the introduction of Intel's 45nm processors, and even the 65nm quads, demonstrated their severe incompatibility with the Striker Extreme and other nVidia 680i chipset boards, especially in terms of overclocking. This overclocking failure (even to a mid level) shattered all my dreams of building an extreme rig. I still remember that in the early days of the 65nm Core 2 Duos, the Striker Extreme's OCing capability with those processors was nothing but outstanding, yet it failed miserably to produce the same results with quad cores and 45nm processors. Several BIOS fixes from Asus couldn't resolve the problem. Pretty much disheartened, I sold that so-called extreme board and gave up on the idea of liquid cooling for a few reasons, one of which I just explained. For the other reasons, follow the post till the end :)

For the last few days I had been thinking about all that liquid cooling gear sitting in boxes for nothing. It was quite an investment going to waste. Having learned from my past experience, this time I decided to grab a budget board along with a budget processor. I already had an Asus P5K-VM, so I chose the E5200 as my budget processor and started my journey to get the maximum clock possible.

Following is a short rundown of all the components I used for my cheap XTREME PC:

INTEL E5200 2.5GHz
ASUS P5K-VM G33 MOTHERBOARD
CORSAIR XMS2 2X512MB-667MHz RAM
BFG 9800GT 512MB OC+ GPU
SEAGATE 160GB SATAII HDD
COOLER MASTER EXTREME 500W PSU
RAIDMAX SMILODON CASING

Dtek FuZion Extreme *Performance CPU Block *with High Flow KIT
TYGON R3603 TUBING 3/8" ID-1/2"OD.
Swiftech COOL SLEEVES 625 HORIZON UV BLUE
Swiftech MCB-120 RADBOX
HARDWARE LABS 360(TRIPLE) STEALTH EXTREME RADIATOR
FLUID XP+ NON-CONDUCTIVE COOLANT (BLUE)
THERMALTAKE 3X120MM LED FANS-78CFM EACH
ASETEK WATERCHILL XTREME 12v PUMP WITH SOFTWARE CONTROL(Qmax 1020L/H)
MICRO COOL MOSFETS HEAT SINKS
LOGISYS CCFL TUBES-UV BLUE

Of all the above-mentioned gear, the only thing missing was the processor, so I quickly grabbed one from the store to start building my new rig, and after a lot of hard work I managed to put together a good rig with pretty decent results.

Following are some overclocking details and screenshots of my cheap XTREME PC (NOW YOU MUST HAVE GUESSED WHY I CALL IT CHEAP)

E5200 @ 4.0GHz [320*12.5] / 1GB DDR2-667 @ 641MHz - Dual Channel






I guess I wasn't lucky, as I got an E5200 with a significantly higher VID of 1.225V. My guess is that fresher batches of E5200s will have lower VIDs, so try to look for a box with the most recent packaging date if you are out buying a processor. The one I purchased had a packaging date of Aug 08. A higher VID translates to higher heat output and requires a higher Vcore for stable overclocking, making it difficult to achieve maximum overclocks due to the heat the processor puts out, while also heavily stressing the motherboard components responsible for supplying Vcore.

With a CPU multiplier of 12.5, installing a liquid NB chipset cooler didn't look necessary, as higher FSBs were not required (unless you want to lower your CPU multiplier for higher memory bandwidth).

Overclocking to 4.0GHz with perfect stability wasn't easy on this board. I had no option but to supply a Vcore of 1.55V to keep my E5200 perfectly stable @ 4.0GHz while passing stability tests like OCCT and playing CRYSIS. Windows was able to boot at 1.5V and complete Hyper Pi and wPrime calculations, but OCCT crashed in less than a minute. Fortunately, thanks to the liquid cooling setup, the temps remained well under control, with core temps staying around 59C/58C (ROOM TEMP 25C). I also had to perform the VDROP/VDROOP mod to make the system completely stable; before applying the mod, the E5200 was unable to pass OCCT even @ 3.5GHz due to significant vdrop/vdroop.

Everything was done on an average motherboard that has only a 3-phase power delivery system and few OCing features: no NB strap or NB voltage management, plus large Vcore increments and low maximum values for Vcore and VDIMM. I am convinced that on any high-end board, extreme OCing would be much easier. I would also like to mention that I used additional MOSFET heatsinks to keep the area surrounding the CPU cool, especially under load.

In the end, all the hard work paid off with a 60% (1500MHz) overclock, and even the liquid cooling components finally found their use.
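For anyone checking the math, the headline figures follow directly from the FSB × multiplier relationship; here's a quick sketch with the values from this build:

```python
# Core clock = FSB x multiplier; the E5200 tops out at a 12.5x multiplier.
stock_mhz = 200 * 12.5    # stock E5200: 200MHz FSB -> 2500MHz
oc_mhz = 320 * 12.5       # overclocked: 320MHz FSB -> 4000MHz

gain_mhz = oc_mhz - stock_mhz
gain_pct = gain_mhz / stock_mhz * 100

print(f"{oc_mhz:.0f}MHz (+{gain_mhz:.0f}MHz, {gain_pct:.0f}% over stock)")
# -> 4000MHz (+1500MHz, 60% over stock)
```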

Now for those of you out there that don't have any Liquid Cooling stuff and are planning to grab some for their future build, here are my suggestions regarding water or liquid cooling:

With newer CPUs running cooler than ever before, like 45nm chips going to 4GHz+, water cooling is looking less and less attractive....at least for CPUs. Heat-pipe air coolers have also taken air cooling to the next level, where the performance difference between air and water cooling is becoming less significant. Heat pipes have extremely high thermal conductivity and can move heat nearly as well as water cooling. The benefits of heat-pipe cooling are two-fold: not only is the processor cooled by a liquid which undergoes phase change, it still reaps the benefits of the thin-fin copper cooler. It's actually an amalgamation of water cooling, phase change, and air cooling. You'll need to decide what you want from your system. If you want to do some extreme overclocking, then yes, water is better. If you want a near-silent cooling solution, yes, it's better. But if you want a quieter system and want to do minor to high-end overclocking, there are cheaper options available. A good CPU air cooler can be quiet and cool very well, and a couple of well-placed 120mm fans will do wonders for airflow in your case at a cost of only a few hundred rupees. Even water-cooled systems need good airflow in the case.

Last but not least, if you want to have fun and that shock-and-awe factor, then there is nothing better than liquid cooling......BUT, according to Anandtech, you must spend at least $300 on a WC kit to outperform the best air cooling. Some people might argue that a $200 custom loop can beat any air cooling. Either way, a good WC system will only give you 0.2GHz or 0.3GHz more OC potential over air cooling, at three times the cost of any high-end air cooler. Now the choice is yours!

Saturday, December 13, 2008

Inno3D 9800GTX 512MB GDDR3


INTRODUCTION

Last month I reviewed the XFX 8800GT Zalman Edition and made an attempt to show how it can almost compete with a stock 9800GTX once decently overclocked. It was a very good graphics card that produced excellent scores in all benchmarks. Today I'll be looking at the 9800GTX to see how far we can go with this one. Today's 9800GTX comes from Inno3D, a brand usually regarded as a budget card manufacturer catering to the lower end of the market. Before testing this card I believed the same. My last high-end GPU from Inno3D was a 7800GTX, which proved to be a bad overclocker, and since then I never dared use any GPU brand except XFX, due to their solid brand name and decent overclocking potential. But it's amazing how fast things can change. The notion that Inno3D is a budget card manufacturer doesn't hold true anymore. Not only is Inno3D producing high-end graphics cards now, they've also become one of Nvidia's main board partners. They've proven several times that they have what it takes to compete against the heavyweights, such as XFX, BFG, EVGA etc.

Inno3D specializes in multimedia products such as graphics/video cards and USB devices. Their head office is based in Hong Kong, and they have other offices around the world too. Inno3D is one of Nvidia's many partners, which means all the graphics cards they produce use Nvidia's GPUs. Here's something taken from their website:


"InnoVISION Multimedia Limited is a pioneering developer and manufacturer of a diverse range of cutting-edge multimedia PC hardware products established in 1998 Hong Kong. Our manufacturing operations have been set up since 1990 in Shenzen, China. In this short period of time we focus on manufacturing OEM/ODM products as well as contracting out extensive production and research facilities to specialist companies. We have accomplished international recognition by PC’s top reviewers as becoming an outstanding success and one of the most fast growing companies in Asia. "


The Inno3D 9800GTX is based on a G92 GPU produced on 65nm technology, on a PCB that's pretty much based on Nvidia's reference design. It uses a 256-bit memory interface and comes with the following default clocks:

CORE: 675MHz
SHADER: 1688MHz
MEMORY: 1100MHz (2200MHz Effective)
CORE TECHNOLOGY: 65nm
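The "effective" memory figure is just the double-data-rate convention: GDDR3 transfers data on both clock edges, so vendors quote twice the real clock. A small sketch of that arithmetic, plus the resulting peak bandwidth (using the standard effective rate × bus width formula):

```python
memory_clock_mhz = 1100                 # real memory clock
effective_mhz = memory_clock_mhz * 2    # GDDR3 moves data on both clock edges

bus_width_bytes = 256 // 8              # 256-bit interface = 32 bytes/transfer
bandwidth_gb_s = effective_mhz * 1e6 * bus_width_bytes / 1e9

print(f"{effective_mhz}MHz effective, ~{bandwidth_gb_s:.1f} GB/s peak")
# -> 2200MHz effective, ~70.4 GB/s peak
```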



Following details have been taken from Inno3D website:

9800GTX Features

NVIDIA® Unified Architecture

Fully unified shader core dynamically allocates processing power to geometry, vertex, physics or pixel shading operations, delivering up to 2X the gaming performance of prior generation GPUs

GigaThread™ Technology
Massively multi-threaded architecture supports thousands of independent, simultaneous threads, providing extreme processing efficiency in advanced, next generation shader programs

Full Microsoft DirectX 10 Support
World’s first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects

NVIDIA® SLI™ technology
Delivers up to 2X the performance of a single graphics card configuration for unequaled gaming experiences by allowing two cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI technology dramatically scales performance on today’s hottest games

NVIDIA® Lumenex™ Engine
Delivers stunning image and floating point accuracy at ultra-fast frame rate

16x Anti-aliasing
Lightning fast, high quality anti-aliasing at up to 16x sample rates obliterates jagged edges

128-bit floating point High Dynamic-Range (HDR)
Twice the precision of prior generations for incredibly realistic lighting effects - now with support for anti-aliasing

NVIDIA® Quantum Effect™ Technology
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU - all while freeing the CPU to run the game engine and AI

NVIDIA® ForceWare® unified Driver Architecture (UDA)
Delivers a proven record of compatibility, reliability and stability with the widest range of games and applications. ForceWare ensures the best out-of-box experience for every user and delivers continuous performance and feature updates over the life of NVIDIA GeForce GPUs

OpenGL® 2.0 Optimizations and Support
Ensures top-notch compatibility and performance for Open GL applications.

NVIDIA® nView Multi-Display Technology
Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

PCI Express Support
Designed to run perfectly with the PCI Express bus architecture, which doubles the bandwidth of AGP 8X to deliver over 4 GB/s in both upstream and downstream data transfer

Built for Microsoft® Windows Vista
Nvidia’s fourth-generation GPU architecture built for Windows Vista gives users the best possible experience with the Windows Aero 3D graphical user interface

NVIDIA® PureVideo™ technology
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color and precise image scaling for movies and video


Specification


Product: Inno3D Geforce 9800 GTX
Chipset: Geforce 9800 GTX
Memory: 512MB GDDR3
Core Frequency: 675MHz
Memory Frequency: 2200MHz
RAMDAC: 400MHz
Interface: PCI-Express
Memory Bus: 256-bit
Stream Processors: 128
Max. Resolution: 2560 x 1600
SLI Ready: Yes
Output: Dual-Link DVI, HDTV
HDCP: YES


For more information please visit Inno3D's Website

The following specs table shows how the 9800GTX compares to other GPUs in its league.



The 9800 GTX is indeed faster than the 8800 GTS 512, but it's far from being "next-gen" material. The core clock has been bumped up 25MHz and the shader clock is up 63MHz, while the memory clock has been pushed to a very healthy 1100MHz. In other words, it's just an overclocked 8800GTS 512. The 9800GTX+, on the other hand, has slightly higher core and shader clocks, while the memory clock is the same as the 9800GTX. The 9800GTX+ is, however, built on 55nm technology as compared to the 65nm of the 9800GTX.

PACKAGING

The package is nicely presented and comes with pretty much everything you need. You get the usual drivers on CD, an instruction manual, and one bundled game - Company of Heroes: Opposing Fronts from THQ. There's also a DVI-to-VGA converter, a 6-pin PCIe power cable, and two sets of video-out cables, both S-Video and component. You'll also find an extra SLI bridge connector, which is needed for a 3-Way SLI configuration. The reference PCB design is pretty standard, but the stock fan does cool the card adequately.

One of the first things you'll notice about the card is the extra PCI-E power connector. The second PCI-E power connector is mandatory for stability if you want to overclock your card. I would definitely like to see the total power consumption.


Being a dual-slot card, you can rest assured that it won't dump any heat inside your casing. The 9800 GTX can output audio using an HDMI adaptor, but you must connect an internal S/PDIF header to the card, as the 9800 GTX lacks built-in sound pass-through.


OVERCLOCKING

As I mentioned earlier, the notion of Inno3D being a budget card manufacturer doesn't hold true anymore. At least this is the case with their mainstream and high-end GPUs, not to mention their i-Chill line of GPUs, which come pre-overclocked pretty decently.

I was able to bump all the clocks up from 675/1688/2200 to 775/1944/2400, and the card was perfectly stable under the FurMark GPU stability test and hours of gaming, especially Crysis. That translates to a gain of 100MHz for the core, 256MHz for the shader, and 200MHz for the memory. At these settings this card is faster than any 9800GTX+, which has stock clocks of 738/1836/2200MHz. The most surprising thing for me was the substantial increase in shader clock, which I wasn't expecting, since Inno3D usually never changes the shader clock from default even in their factory-overclocked GPUs, increasing only the core and memory clocks in the overclocked versions. The card was able to run all the benchmarks, including 3DMark06 and the Crysis GPU benchmark, without any problem even at 800/1998/2500MHz, but FurMark showed some glitches after 30 minutes of stability testing. Most games didn't show any sign of instability at these clocks.
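To put those numbers in one place, here's a tiny sketch tallying the per-domain gains (clocks as quoted above):

```python
stock = {"core": 675, "shader": 1688, "memory": 2200}   # MHz, Inno3D defaults
oc    = {"core": 775, "shader": 1944, "memory": 2400}   # stable overclock

for domain in stock:
    gain = oc[domain] - stock[domain]
    pct = gain / stock[domain] * 100
    print(f"{domain:>6}: +{gain}MHz ({pct:.1f}%)")
#   core: +100MHz (14.8%)
# shader: +256MHz (15.2%)
# memory: +200MHz (9.1%)
```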

MY TEST SYSTEM
CPU E7200 CORE 2 DUO @ 3.17GHz
STOCK -INNO3D 9800 GTX @ 675/1688/2200MHz
OVERCLOCKED -INNO3D 9800 GTX @ 775/1944/2400MHz
STOCK- XFX 8800 GT @ 600/1500/1800MHz
OVERCLOCKED- XFX 8800 GT @ 700/1750/2100MHz
CORSAIR DDR2 @ 802MHz
WIN XP PROFESSIONAL SP2
NVIDiA GeFORCE DRIVER 184.48

EVGA Precision 1.3.3 was used for overclocking the GPU.

EVGA Precision is actually designed by Unwinder, the same person who gave us RivaTuner. Under the hood it's pretty much RivaTuner, but with a simpler and cleaner interface; everything you need is provided in a single window. This program allows you to fine-tune your Nvidia graphics card for the maximum performance possible, with core/shader/memory clock tuning, fan speed adjustment, real-time monitoring support (including in-game), Logitech keyboard LCD display support, the ability to choose different skins, and compatibility with almost all GeForce 6, 7, 8, 9 and GTX 200 series graphics cards.

FurMARK 1.5.0 was used for stability testing.

What is FurMark?
FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that's why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card.

The benchmark offers several options allowing the user to tweak the rendering: fullscreen / windowed mode, MSAA selection, window size, duration.

Providing a direct comparison of different GPUs in the same league is not possible for me, as I don't have any sponsors who can provide me with different GPUs for bench testing :) nor can I buy a couple myself, as I don't get any dough for reviewing stuff :) So for comparison, I will use charts from different reviews just to give an idea of the performance of the card I am testing. The resolution and specs in these charts may not be exactly the same, but they will be more than enough to evaluate the performance. Do keep in mind that all the test results below by Hardware Canucks and others were produced on better and higher-clocked CPUs and RAM than mine, and this can affect GPU performance.

However, I will provide a direct comparison of the XFX 8800GT Zalman Edition and the Inno3D 9800GTX, which were tested on the same system with the same settings.

Hardware Canucks System Specs:
Processor: Intel Core 2 Quad Extreme QX9770 @ 3.852GHz
Memory: G.Skill 2x 2GB DDR2-1000 @ 1052MHz DDR
Motherboard: DFI LanParty DK X38 T2R
OS: Windows Vista Ultimate x64 SP1

Bjorn3d System Specs:
Processor: Intel QX9650 @ 3.5GHz
Motherboard: XFX 790i Ultra
Memory: 2 GB (2 x 1 GB) Mushkin DDR3-2000
Graphics: Card 1 - NVIDIA 9800 GTX+ (177.39)
Card 2 - NVIDIA 9800 GTX+ SLI (177.39)
Card 3 - ATI 4850 (Catalyst 8.6)
Cooling: Big Typhoon VX
Power Supply: OCZ GameXStream 850 watts
Case: No case
OS: Windows Vista Ultimate 32-bit



RESULTS

CRYSIS
Crysis GPU Benchmark was used for measuring the FPS.




This chart from Bjorn3d is a good performance comparison of the Inno3D 9800GTX against the 9800GTX+ and ATI 4850. Though their test system had a QX9650 @ 3.5GHz with DDR3-2000 memory, we can clearly say that the Inno3D 9800GTX, once overclocked, can beat both the 9800GTX+ and the 4850.



DEVIL MAY CRY 4
FPS measured and recorded through FRAPS 2.9.4 while playing LEVEL 8





COD4
FPS measured and recorded through FRAPS 2.9.4 after 10 mins of “HEAT” play





3DMARK06




TEMPERATURES

Speaking of cooling, I must admit that the 9800GTX stock cooler is more than adequate to cool this card even at higher clock rates. The maximum temperature I recorded was 74 degrees Celsius at stock and 77 when overclocked, with the fan controlled by the drivers. The fan speed stays around 60% under full load at stock settings and jumps to 73% under full load once the card is overclocked. Bumping up the fan speed manually can drop the temps further, although there is a bit of a noise penalty to pay for this.


CONCLUSION

If you are in an upgrade mood and want a mid-range card that performs well and even overclocks very well (enough to beat any 9800GTX+), you really can't go wrong with the Inno3D 9800GTX. The price of the 9800GTX has come down quite a bit, and Inno3D products are a tad cheaper than other big brands like XFX, BFG, ASUS and EVGA. Warranty policy usually adds weight to one's decision when buying a GPU (or any other component), but in a market like ours it's all the same for every brand, and most importantly, Inno3D product quality has improved a lot over time. Once decently overclocked, you can pit this card against any 9800GTX+ or even the HD4850, though the latter is still the product of choice for many, including me :)

I hope some of you find this review useful for your next purchase.

Sunday, November 30, 2008

XFX 8800GT ZALMAN EDITION


About 2 weeks back I went to the computer market looking for a good mid-range nVidia GPU. I know a lot of people would ask why nVidia, when there are good Radeons available at very competitive prices. The reason was simple: I wanted to experiment with nVidia PhysX and GPGPU stuff, for which I must have two nVidia GPUs working in a non-SLI configuration. I already had an XFX 9600GT, which is now being used as a second GPU dedicated to PhysX work. I will try to post my PhysX findings later in another thread; in this one I'll share my experience with the XFX 8800GT 512MB Zalman Edition, which was available at a price equivalent to the 9600GTs. 9800GTs were available at Rs.15,000/- at that time, and I knew that 9800GTs are just rebranded 8800GTs with the same G92 chip. Some newer 9800GTs are on a smaller die (55nm rather than 65nm) but are hard to find, and besides, the box won't tell you whether it's a 55nm or 65nm 9800GT.

The thing which caught my attention on the XFX 8800GT was the Zalman cooler, and fortunately there was no premium for it.

(The card build quality is quite solid. XFX has used all Japan made solid state capacitors and a Ferrite Core Choke as well.)

Being a happy user of an XFX 9600GT that overclocked like hell (725/1750/2000 from stock 650/1600/1800), I was quite sure that this 8800GT would hopefully overclock pretty well, just like my XFX 9600GT, if not better. The only thing that bothered me was the GPU RAM, which has no heatsinks on it. I was worried that this would become a problem once I started overclocking the memory. The MOSFETs were also without heatsinks, but I was pretty sure that wouldn't pose any problem. Anyway, I bought the card, installed it in my system as the primary GPU, and put my old 9600GT in the PCI-E x4 slot as the secondary GPU.

I was very hopeful that I would be able to squeeze stock-9800GTX performance out of my 8800GT and save a couple of thousand bucks. Although there is a difference of 16 stream processors between the 8800GT and 9800GTX, on top of the higher clocks of the GTX (675/1688/2200), sufficiently higher clock rates on the 8800GT can compensate for the reduced stream processors. So I began my endeavor for the max overclock. After stress testing the card at stock speeds, I was quite skeptical about its overclocking potential, as the GPU core was hitting 80C, not what I was expecting from a Zalman cooler. I knew that 8800GTs with the stock cooler tend to get that hot, but an aftermarket cooler should have performed much better. Anyway, I upped the clocks to 700/1750/1900 from 600/1500/1800, and now the card was hitting 88C under load. I was able to push the memory further to 2000MHz, but had a bit of a stability issue when playing Crysis at 1680x1050, all settings high but no AA/AF. A bit disappointed, I removed the card, took off the Zalman cooler, and re-applied my favorite ARCTIC COOLING MX-2 thermal paste. I had a couple of Thermaltake copper GPU RAM heatsinks lying around, which I used for cooling the memory, as they are quite low-profile (8mm high) and fitted perfectly under the low-profile Zalman cooler. I also put Micro Cool chip sinks on the MOSFET area, just to make myself more comfortable.


(After installing Thermaltake copper and Micro Cool aluminum sinks)

After reinstalling the card I checked the temps again, which had now come down to 71C under load at stock. I then overclocked the card again, this time to 700/1786/2100MHz, and it was perfectly stable under all tests. The maximum temp was 79C, recorded through GPU-Z, and only during the Crysis bench test and the Rthdribl GPU stability test. Playing COD4, Far Cry 2 and DMC4 wasn't able to push the GPU temp above 72C. Now this is what I was looking for.

I think I can now say that the performance is almost comparable to any stock 9800GTX. I don't have a 9800GTX at hand to compare results with, but looking at the reviews of the 9800GTX on different sites, I am quite confident that I am not wrong here. I am posting my 8800GT results below for you to evaluate, along with charts from other websites to make the comparison easier. Do keep in mind their test setups while comparing.

My System Specs:

CPU E7200 CORE 2 DUO @ 3.18GHz
XFX 8800 GT @ 700/1750/2100MHz
CORSAIR DDR2 @ 802MHz
WIN XP PROFESSIONAL SP2
NVIDiA GeFORCE DRIVER 178.24


RIVA TUNER 2.11 was used for overclocking the GPU.
Besides gaming, I used Rthdribl for stress testing the GPU. Rthdribl is a "Real-time High Dynamic Range Image-Based Lighting" demo that uses DirectX(R) 9.0 high-precision texture formats and version 2.0 Pixel Shaders for real-time true HDR rendering.

The following charts have been taken from Hardware Canucks. The resolution and specs may not be exactly the same, but they are quite close and more than enough to evaluate the performance. Do keep in mind that all the tests below were performed on better CPUs than mine, and at these settings GPU performance can be affected by the type of CPU.
The BFG 8800GT OCX used by Hardware Canucks was a factory-overclocked card with clocks @ 700/1728/2000MHz. I found this review to be the best for comparison because of the BFG 8800GT OCX.

BFG 8800GT OCX & 9600GT OCX Video Card Review - Page 2 - Hardware Canucks

Hardware Canucks System Specs:

Processor: Intel Core 2 Quad Extreme QX9770 @ 3.852GHz
Memory: G.Skill 2x 2GB DDR2-1000 @ 1052MHz DDR
Motherboard: DFI LanParty DK X38 T2R
OS: Windows Vista Ultimate x64 SP1



XFX 8800GT ZALMAN

CRYSIS DEFAULT GPU BENCHMARK @ 1680 x 1050, ALL SETTINGS HIGH, NO AA
AVG FPS = 33 (28 FPS @ STOCK GPU)
GPU TEMP 79C (LOAD)


HARDWARE CANUCKS



XFX 8800GT ZALMAN

DMC4 @ 1680 x 1050, ALL SETTINGS VERY HIGH, 4xAA
AVG FPS = 100.43 (86 FPS @ STOCK GPU)
FPS measured and recorded through FRAPS 2.9.4 while playing LEVEL 7


HARDWARE CANUCKS



XFX 8800GT ZALMAN

COD4 @ 1680 x 1050, ALL SETTINGS HIGH, 0xAA, 16XAF
AVG FPS = 93
FPS measured and recorded through FRAPS 2.9.4 after 10 mins of play.


HARDWARE CANUCKS



XFX 8800GT ZALMAN

3DMARK06 @1280 x 1024
13013 (STOCK- 11908)


HARDWARE CANUCKS



FAR CRY2

XFX 8800GT ZALMAN

FAR CRY 2 SMALL RANCH @ 1680 x 1050, ALL SETTINGS VERY HIGH INCLUDING PHYSX, NO AA/AF
AVG FPS = 43 (38 FPS @ STOCK GPU)
GPU TEMP 72C (LOAD)


This is from Guru3D
Core 2 Duo E8400 Processor @ 3.0 GHz (FSB 1333)
Windows Vista 32-bit
DirectX 9/10 End User Runtime
NVIDIA GeForce 180.42
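A quick sketch converting the stock vs. overclocked averages reported above into percentage gains (COD4 and 3DMark06 are omitted since they aren't stock-vs-OC FPS pairs):

```python
# (stock FPS, overclocked FPS) from the result listings above
results = {
    "Crysis":    (28, 33),
    "DMC4":      (86, 100.43),
    "Far Cry 2": (38, 43),
}

for game, (stock_fps, oc_fps) in results.items():
    gain_pct = (oc_fps - stock_fps) / stock_fps * 100
    print(f"{game}: +{gain_pct:.0f}%")
# -> Crysis: +18%
# -> DMC4: +17%
# -> Far Cry 2: +13%
```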




CONCLUSION

Given the lower price of the 8800GT compared to the 9800GT, especially where I live, the 8800GT is still a worthy component for mainstream gamers, as it provides the best bang for your buck.

Friday, July 11, 2008

VDROP/VDROOP PENCIL MOD FOR ASUS P5K-V & VM

Here are the VDROP/VDROOP PENCIL MOD details for the Asus P5K-V and VM. I would suggest doing it ONLY if you are experiencing a significant amount of VDROP/VDROOP. Do it at your own risk; if something goes wrong, I won't be held responsible, though this worked perfectly for me. If you haven't read the article OVERCLOCKING Q6600 WITH ASUS P5K-V, do so before applying this mod.


Take any #2 pencil and fill the area of the respective resistor from one end to the other. This procedure is completely reversible in case you don't need it anymore. Don't forget to blow off the extra graphite after finishing the mod. The picture above shows (green circle) the area you need to look at, and the picture below is a close-up of that area. The resistors you need to fill in for VDROP and VDROOP are marked in GREEN and BLUE respectively.

DISCLAIMER
Overvolting and modding any component carries a high risk of damage and/or failure. I take no responsibility for any loss of or damage to your components as a result of using this mod. Do it at your own risk.