Since our last article, we have been able to peer a little deeper into the state of 4K. There’s a lot of talk about 4K now that most of the major TV vendors have shown off the displays they have planned for 2013 and beyond. We have seen displays from Viewsonic and Sharp as well as TVs from LG and Sony. Some of these displays are slowly starting to trickle out into the market, but in extremely limited quantities and with incredibly high price tags. LG’s currently ‘available’ 4K TV is a gargantuan 84” set that sports a single HDMI 1.4a connection rather than two dual-link DVI connections.
Right now, the only display you can really get your hands on is that LG 84” TV, which is limited to a 30Hz refresh rate because it runs off of HDMI 1.4a (which tops out at 3840x2160 at 30Hz and 4096x2160 at 24Hz). Until HDMI 2.0 arrives and becomes a standard, you won’t really be using an HDMI-based 4K display for much of anything other than watching TV. There are, however, 4K displays capable of 60Hz and beyond. We currently have an EIZO FDH3601, which is driven by two dual-link DVI cables, each powering one half of the display.
Our 4K testing setup on the right with the EIZO 4K monitor
Even though this display also has DisplayPort 1.1 connectors, you need DisplayPort 1.2 (capable of 4K at 60Hz) to drive a 4K display with a single cable. Therein lies our biggest problem: right now, those displays simply aren’t available to the mass market. At CES, we spoke with Viewsonic and Sharp about their 4K displays featuring a single DisplayPort 1.2 connector; Sharp told us to expect its 4K displays in late Q1 or early Q2 2013, while Viewsonic gave us a slightly later estimate with no concrete launch date.
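The link-bandwidth arithmetic behind these refresh-rate ceilings is straightforward. Here is a back-of-the-envelope sketch; the effective-bandwidth figures are approximations, and blanking intervals are ignored, so real-world requirements sit somewhat higher than these raw pixel rates:

```python
# Rough uncompressed video bandwidth needed for 4K at various refresh
# rates, versus the approximate effective bandwidth of HDMI 1.4a and
# DisplayPort 1.2. Blanking overhead is ignored, so these are floors.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_14A_GBPS = 8.16   # ~10.2 Gbps raw link rate minus 8b/10b coding overhead
DP_12_GBPS = 17.28     # DisplayPort 1.2 HBR2 effective bandwidth

for hz in (24, 30, 60):
    need = pixel_rate_gbps(4096, 2160, hz)
    print(f"4096x2160 @ {hz}Hz: ~{need:.2f} Gbps "
          f"(fits HDMI 1.4a: {need < HDMI_14A_GBPS}, "
          f"fits DP 1.2: {need < DP_12_GBPS})")
```

Even before blanking overhead, 4K at 60Hz needs roughly 12.7 Gbps of pixel data alone, which is why a single HDMI 1.4a link cannot carry it while a single DisplayPort 1.2 cable can.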
GPU Support of 4K
The reason we’re talking about displays is that we already tested both AMD’s and Nvidia’s 4K support in our initial article; however, running a dual-link DVI display on these graphics cards has been a bit difficult. The EIZO FDH3601 is recognized by Windows and the graphics drivers as two displays, because each half of the panel is driven by its own dual-link DVI cable. This can make for some fun gaming scenarios, to say the least.
Nvidia’s graphics drivers are generally designed for configurations of one or three displays; two displays does not work very well, as most games try to maximize themselves into one half of the panel or the other. We could not enable Nvidia’s Surround feature, which turns multiple monitors into one desktop, so we could only run applications windowed and then maximize them manually or with software. Running applications like Adobe Photoshop CS6 and Premiere Pro was no issue, though they still required maximizing. With Battlefield 3, we were able to get the GTX Titan to display in 4K very well, but once we enabled SLI we had one half of the display running at 30Hz and the other at 60Hz. Nvidia’s Tegra 4 also claims 4K support over HDMI 1.4a, which would limit it to 30Hz, making it effectively for video playback only.
AMD’s graphics drivers are unsurprisingly more flexible when it comes to multiple displays; AMD created Eyefinity, which originally drove Nvidia to develop its Surround technology. Eyefinity enables up to six displays on a single graphics card, with up to three available on almost all non-specialized cards. What’s good about AMD’s implementation is that it is incredibly easy to use and does almost all of the work for you; it does not discriminate between one, two or three displays or how they are set up. For this, we give AMD kudos. However, AMD’s drivers suffer from the same multi-GPU problem on the EIZO display that Nvidia’s do, and we’re not entirely sure whether the monitor itself or the way rendering is handled across multiple GPUs is to blame.
To remedy this issue, gamers and 3D modelers will need DisplayPort 1.2-capable displays such as those coming from Sharp and Viewsonic. We’ve also heard murmurings that some US display manufacturers may be getting Sharp panels and rebranding them as their own. In order to fairly measure game and application performance across all platforms, DisplayPort 1.2 is necessary, since it delivers 4096x2160 at 60Hz. We will have to wait for these displays to arrive on the market before we can accurately measure 4K performance between Nvidia, AMD and Intel (and perhaps even Qualcomm).
Intel’s currently supported 4K solution (dual Thunderbolt cables) is not really intended for anything more than desktop display and 2D applications; its GPUs simply aren’t powerful enough. Haswell should bring a significant performance boost, but don’t expect more than video playback and 2D performance for quite some time. Intel will likely support a single-cable solution with DisplayPort 1.2-enabled displays, which should make 4K more accessible to those who don’t want to spend money on a dedicated GPU just for video playback and 2D work.
Qualcomm’s 4K support is primarily being touted in its Snapdragon 800 processor, which was shown at this year’s CES playing back a 4K video. Considering that the Snapdragon 800 likely supports neither DisplayPort 1.2 nor HDMI 2.0, it will, like the Tegra 4, be limited to 4K video playback: 24Hz at 4096x2160 over HDMI 1.4a, or up to 30Hz if it outputs 3840x2160.
While little has been said about Imagination Technologies’ mobile GPUs, they have found their way into some of the 4K displays themselves: Imagination Technologies’ PowerVR graphics IP sits inside LG’s latest 4K displays, as LG showed us at CES 2013. We’re still waiting to hear more about mobile 4K support, but considering that the latest PowerVR Series6 graphics already supports 4K, we expect an official mobile announcement not to be far behind.

4K Gaming (Performance and Support)
When it comes to 4K gaming, we found ourselves a bit limited by the display we were using, as it isn’t really intended for gaming. The EIZO FDH3601 is designed primarily for 2D applications, with some 3D work possible given the right professional graphics solutions from Nvidia and AMD.
Nevertheless, we were able to get trusty ole BF3 to run on both graphics solutions, and we’ve got benchmarks. We compared Nvidia’s latest and greatest single GPU against AMD’s. Since we don’t have an HD 7990 or an ARES card, we had to make do with two HD 7970 GHz Edition PCS+ cards from PowerColor.
Here we have our Battlefield 3 results. As you can see, one GTX Titan is basically as fast as two HD 7970 GHz Edition cards. While we were unable to run two Titans successfully due to display scaling issues in SLI (likely down to the type of display being run), there is no doubt that the Titan is the best single GPU in the world for 4K gaming. Two HD 7970 GHz Edition GPUs deliver the same gaming experience at a lower cost; however, their power consumption and heat output are significantly greater. If you want the most elegant solution, the GTX Titan is the best GPU, but if you want the best bang for your buck, AMD’s HD 7970 GHz Edition has you covered.
After playing some Battlefield 3, we decided to see how other games panned out. Unfortunately for us, most games would not cooperate with the GTX Titans because of how Nvidia’s consumer drivers handle this two-input display; as mentioned earlier, most games would only run on one half of the screen. So, for the rest of our gaming benchmarks, we ran two AMD Radeon HD 7970 GHz Edition graphics cards. First, we tried our hand at Crysis 3 to see if two of AMD’s cards could, in fact, play Crysis 3 in 4K.
Looking at our results, we were unable to run two HD 7970 GHz Edition cards in CrossFire smoothly at maximum settings; that would unfortunately require three HD 7970s or two HD 7990s. In order to play Crysis 3, we had to dial the graphics preset down from Very High to High.
As you can see above, we could get most of the games playing at their highest settings; however, Batman: Arkham City in 4K simply was not playable at Ultra settings, so I had to dial it down to Very High. The game still looked absolutely stunning, but not as good as it could. Borderlands 2 and Skyrim both ran very smoothly and quickly, as you can see above. Skyrim definitely ran the fastest, but that is without the high-resolution texture pack, and I have to be honest: Skyrim definitely looked the worst of all the games. If you’re going to play Skyrim in 4K, you’re going to want to get the high-resolution texture packs.
Even with AMD’s Eyefinity enabled, we still ran into scaling issues, with games like Far Cry 3, Max Payne 3 and Counter-Strike: Global Offensive either being unplayable or only rendering on half of the screen. While we’re not entirely sure whether the graphics drivers or the games themselves were at fault, we really hope that game developers are preparing to support 4K resolutions, like it or not.
We are hoping to get our hands on some of the newest DisplayPort 1.2 displays very soon and will do a proper 4K gaming comparison when they arrive.

4K Media
In addition to 4K displays and 4K gaming, there is a huge thirst for 4K video content to drive these new displays. Currently, there are very few 4K cameras out there, and most of them are 4K cinema cameras. There are, however, bright spots in the quest for 4K content, with companies like GoPro supporting 4K video at 15 frames per second on the GoPro Hero 3 Black Edition. While 15 FPS is not really a usable video frame rate, it is now only a matter of time until the next GoPro Hero supports 4K at 24 and 30 FPS.
JVC also has a $4,999 handheld camera, the GY-HMQ10U, which records 4K video at 3840x2160 (Quad Full HD) at 24, 50 and 60 FPS. This is still technically a professional camera; however, at $5,000 it is significantly cheaper than almost all of its competitors. The next most affordable 4K camera comes from Sony at $8,000, almost double the price.
There is no doubt that 4K content is slowly catching up from the days when only Sony and Panavision made incredibly expensive 4K cameras. For part of that, I believe we have RED Digital Cinema to thank for helping drive down the cost of 4K.
Now that cameras are beginning to come down in price and improve in specification, codecs are coming along as well. The latest, developed and ratified earlier this year, is HEVC (High Efficiency Video Coding), also known as H.265. HEVC will replace AVC (H.264) over time and is expected to cut bandwidth by about 50% while maintaining the same or better quality.
This new codec will find its way into hardware and software once the movie industry decides how it plans to use it in content sometime later this year. We recently wrote an article about HEVC in which NTT DoCoMo demonstrated 4K running at about 10Mbps, significantly lower than anyone would have expected. A 10Mbps stream makes delivering 4K over the internet practical and could enable instant delivery of content to new 4K TV owners.
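To put that 10Mbps figure in perspective, here is a quick sketch of the data a viewer would actually pull down; the 20Mbps H.264 comparison point is our own assumption, derived from HEVC’s claimed ~50% bitrate savings rather than any measured stream:

```python
# Data consumed per hour of streaming at a constant bitrate. The 10Mbps
# HEVC figure comes from the DoCoMo demo; the 20Mbps H.264 figure is a
# hypothetical equivalent assuming HEVC's ~50% bitrate reduction.

def gb_per_hour(mbps):
    """Gigabytes downloaded per hour at a constant bitrate in Mbps."""
    return mbps * 3600 / 8 / 1000  # megabits/s * s/hr -> megabytes -> GB

print(f"4K HEVC  @ 10 Mbps: {gb_per_hour(10):.1f} GB/hour")  # 4.5 GB/hour
print(f"4K H.264 @ 20 Mbps: {gb_per_hour(20):.1f} GB/hour")  # 9.0 GB/hour
```

At 4.5 GB per hour, a 4K stream lands within reach of ordinary home broadband, which is exactly why the demo turned heads.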
So, it appears that 4K is well on its way to becoming the next HD standard, and 2013 is the year that fact becomes concrete. In the past, we were sold technologies like 3D while 4K was already possible, and the market pushed back. 4K promises four times as many pixels as 1080p on the same screen size, bringing the kind of visual sharpness we already expect on our phones. While I can’t necessarily say that 2013 will be the year of 4K, with so many questions still being answered, it is safe to say that 2013 will be a defining year for 4K and HD video as a whole.
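The “four times as many pixels” claim is simple to verify for the consumer 3840x2160 flavor of 4K, since both dimensions of 1080p exactly double:

```python
# 4K UHD (3840x2160) versus Full HD (1920x1080): doubling both the
# width and the height quadruples the total pixel count.

uhd_pixels = 3840 * 2160      # 8,294,400 pixels
full_hd_pixels = 1920 * 1080  # 2,073,600 pixels
print(uhd_pixels / full_hd_pixels)  # -> 4.0
```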
© 2009 - 2014 Bright Side Of News*, All rights reserved.