The feature set on the Z68A is a veritable laundry list of the must-haves for a board of this class, so we thought it made the most sense to present them in that fashion.

Overclocking... and the issues with the OC Genie
As we mentioned before, the Z68A-GD80 features the OC Genie, a one-touch, one-second overclocking feature. We decided to put that feature to the test to see exactly what value it adds to the board. The literature accompanying the motherboard states that the OC Genie will "...automatically detect the optimum to overclock after booting the system".
The cold hard truth here is that this feature needs to work or it shouldn't be on the board at all. The likely users of such a feature are looking for an easy overclock; they do not wish to tinker in the BIOS, fiddling with numerous settings and running stability tests to produce an overclock. And since MSI markets this feature as a one-second overclock, it stands to reason that no additional adjustments should be needed once the user has activated the OC Genie.

A welcoming feature: MSI's Z68A-GD80 B3 packs three quick buttons, including the automatic overclock button
The directions for the OC Genie state that DDR3-1333 or faster memory should be installed and that users should equip a better heatsink. Seeing as we had those bases covered, we powered down, pressed the OC Genie button and rebooted. We were greeted with a warning screen letting us know that OC Genie was active. Once we had booted into Windows we verified that the OC Genie had in fact overclocked the CPU from 3.4GHz to 4.2GHz, an 800MHz overclock and theoretically a 19-20% performance gain. A little investigating showed that the OC Genie achieved this overclock by simply bumping the multiplier from the stock 34x to 42x. CPU voltage, memory settings/timings and the like all appeared to have been left untouched by the Genie.
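The arithmetic behind that multiplier bump is simple: Sandy Bridge runs a fixed 100MHz base clock (BCLK), and the effective core frequency is just base clock times multiplier. A quick sketch of the numbers (the helper function is ours, purely illustrative):

```python
# Sandy Bridge effective CPU frequency: base clock (BCLK) x multiplier.
BCLK_MHZ = 100  # standard Sandy Bridge base clock

def cpu_freq_mhz(multiplier: int, bclk_mhz: float = BCLK_MHZ) -> float:
    """Return the effective core frequency in MHz."""
    return bclk_mhz * multiplier

stock = cpu_freq_mhz(34)  # 3400 MHz = 3.4 GHz stock
genie = cpu_freq_mhz(42)  # 4200 MHz = 4.2 GHz with OC Genie active

# The clock itself rises ~23.5% (800/3400); the "19-20% gain" figure
# corresponds to the reduction in time per task: 1 - 3400/4200 = ~19%.
print(f"+{genie - stock:.0f} MHz, {(genie - stock) / stock:.1%} higher clock")
```

Either way of counting, that is a sizable jump to attempt without touching a single voltage.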
In order to test the performance, and more importantly the stability, of the new boost, we first fired up 3DMark11 and ran the system through a single pass at the Extreme preset. Everything appeared to be going smoothly as the system chugged its way through the test until it hit the Physics (CPU) test, at which point it froze. We attempted to repeat the run numerous times, but every time, without fail, the system locked when it reached the Physics portion of the test.
When we originally noticed that the OC Genie had chosen not to adjust ANY voltages to support the 800MHz overclock we were somewhat skeptical of the overclock's stability, and unfortunately the 3DMark runs confirmed that concern.
At BSN* we take overclocking issues very seriously. As overclocking has moved over the years from an underground enthusiast pastime to an industry-accepted practice, it has also catapulted up the marketing pipeline: almost any PC component these days makes some mention of its overclockability and increased performance. The problem is that the purpose of overclocking is to increase performance. If the overclock is unstable and causes system issues, locks and so on, then you have a degradation in performance and have effectively achieved the exact opposite of what you set out to do in the first place.
The fact that the overclock failed to make it through a single pass of 3DMark11 underscores its instability. At BSN* our standard stability test for overclocked components is far more lengthy and involved than a single 3DMark11 pass in order to ferret out any stability issues; if an overclock cannot complete one 3DMark run, it has no hope of making it through the rest of our testing gauntlet. Admittedly, MSI has attempted something noble here by working to open up the performance benefits of overclocking to a wider audience, and we applaud the effort. However, we cannot recommend the OC Genie in its current state. Perhaps we got a bad board, or perhaps the OC Genie fell victim to a pre-release BIOS; either way, we would like to see MSI continue its efforts with the OC Genie, as we feel this could prove to be a great feature... as long as it is stable.

Lucid Virtu
For all of MSI's branding on the retail box and even the board itself, we find it interesting that the one technology they don't outright mention is Lucid's Virtu. Virtu is Lucid's take on GPU virtualization: essentially, the software allows the motherboard to dynamically switch between the on-die Intel HD 3000 graphics (the integrated GPU, or iGPU) and discrete (dGPU) graphics card(s).
Why is this important? After all, if you already have a discrete graphics card, why would you even want to use the iGPU? Well, there are a couple of reasons why Virtu is a genuinely useful addition. First, the iGPU is really good at one thing: video transcoding via Quick Sync. Quick Sync is hands down one of the best avenues for transcoding video, and it is surprisingly fast. The downside is that it only works when the iGPU is active, and prior to the Z68 chipset you were forced to make a choice. For those who wanted overclocking ability, the P67 chipset was the only option, but P67 did not expose the iGPU at all. To utilize the iGPU you had to choose the H67 chipset, but H67 did not support overclocking, and once you slapped in a discrete GPU the iGPU was no longer accessible. Virtu allows you to use a discrete graphics card while still having access to the transcoding power of the iGPU.
The second reason Virtu deserves your attention is that it is GPU virtualization, not switchable graphics. With switchable graphics you are either using the discrete card OR the integrated graphics; you switch between the two. With virtualization the motherboard can make use of BOTH the discrete GPU and the integrated GPU, and it can do so simultaneously. If you want to encode a movie while playing Crysis 2, Virtu allows you to do just that. Essentially, Virtu directs each graphics workload to the GPU of its choosing, so video encoding/decoding gets sent to the iGPU while 3D content such as games gets sent to the dGPU.
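Conceptually, that routing decision can be sketched in a few lines. The class and workload names below are our own illustration of the idea, not Lucid's actual API:

```python
from enum import Enum, auto

class Workload(Enum):
    TRANSCODE = auto()  # Quick Sync video encode/decode jobs
    GAME_3D = auto()    # demanding Direct3D/OpenGL rendering
    DESKTOP = auto()    # ordinary 2D desktop composition

def route_workload(work: Workload) -> str:
    """Pick a GPU for a workload, roughly as Virtu's dispatcher does.

    Transcoding stays on the iGPU, where Quick Sync lives; demanding
    3D content is handed to the dGPU; everything else remains on the
    iGPU, which also drives the display when the monitor is connected
    to the motherboard's video output.
    """
    if work is Workload.GAME_3D:
        return "dGPU"
    return "iGPU"

# Both GPUs can be busy at once: the movie encode and the game each
# land on the processor best suited to the job.
print(route_workload(Workload.TRANSCODE))  # iGPU
print(route_workload(Workload.GAME_3D))    # dGPU
```

The point of the sketch is that nothing "switches off": both paths stay live, and each job is simply dispatched to whichever GPU handles it best.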
Virtu utilizes Sandy Bridge's onboard display connector to actually display the video. When a workload is sent to the dGPU for processing, the rendered frames are then copied to the iGPU for display; the dGPU does the actual computational work, and the iGPU merely displays the result. Lucid does say that the monitor can be connected to either the iGPU or the dGPU, and even provides a chart showing which features are supported in each configuration.
In practice, Virtu was simple enough to set up: install a discrete graphics card, remember to leave the monitor connected to the motherboard's video port, reboot and install the drivers. With our particular setup we needed to enter the BIOS and change the initial video adapter to the integrated offering before the Virtu driver would even attempt to install.

Testing Lucid's GPU Virtualization clearly shows it works - power consumption went down considerably. Kudos to MSI for a proper implementation of the technology
Throughout our testing we randomly switched the monitor's physical connection between the iGPU and the dGPU, and in doing so we noticed some anomalies. At certain times Unigine's Heaven benchmark would not run at 1920x1080 when plugged into the discrete card, yet it would run at that resolution without issue when the monitor was connected to the iGPU. Another side effect of Virtu has to do with the discrete graphics drivers. Since Virtu tells Windows that you are using the iGPU, the graphics drivers (both NVIDIA's and AMD's) act as though no graphics card is installed. This prevents any changes from being made via the discrete graphics "control panel" and also greets you with random warning messages that no hardware for the installed driver has been found (such was the case with NVIDIA Optimus Technology and some CUDA applications). If you prefer to make most of your graphics setting changes in-game, this is not much of an issue... the continued warning messages, however, can become rather annoying.
Gameplay was seamless, with the software automatically switching to the discrete graphics card whenever a supported game was launched. Enough talk, let's get down to testing.
© 2009 - 2014 Bright Side Of News*, All rights reserved.