Apple adopts DirectX 11 GPUs, buys AMD Radeon HD 5750
Published 16 years ago by Archivebot

While roaming the CeBIT halls here in Hannover, Germany, we came across a juicy rumor which we were able to confirm a couple of dozen hours later. According to our sources, it looks like Apple's acquisition of some of AMD's brightest minds, in the form of Raja Koduri and Bob Drebin [and a few others], is now resulting in Apple acquiring AMD hardware.
It all started with the purchase of the ATI Radeon HD 4850, which, as we all know, is now an integral part of the iMac lineup, and continues with a recent switch to the ATI Radeon HD 5750. Originally, the deal between AMD and Apple was supposed to result in a steady supply of Radeon HD 4850 chips until the end of June, but this all changed when Apple requested the HD 5750. As a result, there is currently an HD 5750 shortage, which will trickle down to the market in a few weeks or less. Luckily, there are no signs of an HD 5770 shortage, so you should be able to continue buying Juniper-based parts. It's just that almost all 720-shader parts will go to Apple.
Just like any other OEM, Apple is known for changing its mind often, and we weren't surprised to learn that the giant from Cupertino has adopted AMD's DX11 hardware in the form of the HD 5750. Can you say iMac refresh?
Given that Apple continued its orders with Intel in "business as usual" fashion, we take it that the iMac refresh could be set for June 2010 and Apple's Worldwide Developers Conference, traditionally held in San Francisco, CA.
Naturally, Apple does not support DirectX 11 functionality, as that API is limited to Microsoft's Windows operating system. However, near-identical functionality is exposed on the Mac through OpenGL, in the 3.0, 3.1 and 3.2 revisions of this popular open-standard API.
Among other things, the HD 5750 also supports extended functionality in OpenCL, another open standard Apple is relying on heavily. Now you know it, folks: new Mac Pro and iMac designs are all but certain to feature Radeon HD 5750 graphics, and that can't be bad. 720 shaders meet DirectX 11 / OpenGL 3.2, up from DirectX 10.1 and somewhat problematic OpenCL support.
All in all, it is good to know that Apple won't be more than a year late with the GPUs incorporated in its otherwise excellently designed products.
Original Author: Theo Valich
Webmaster's note: This news article is part of our Archive; if you are looking for up-to-date articles, we would recommend a visit to our technology news section on the frontpage.