Wednesday, July 8, 2020


There’s news out that Apple isn’t just going to be getting rid of Intel silicon — it’s supposedly going to be pushing GPUs out of its products as well. That read is based on a slide Apple passed around at WWDC that lays the whole thing out pretty clearly:

[Slide: Apple's WWDC slide laying out its GPU plans for Apple Silicon Macs. Image by Longhorn]

At first glance, that looks like an open-and-shut case. Apple Silicon Macs will have Apple GPUs, with a new tile-based deferred renderer and support for both Metal GPU Family Mac 2 and Metal GPU Family Apple, whereas previous GPUs will retain support only for the former. So, do Apple GPUs replace every single AMD solution on future Macs within two years?

That’s a heck of a lot less clear.

Apple has already told us that it will finish its switch away from Intel hardware in about two years, and the company has announced a developer transition program for that process. It has made no equivalent commitment on GPU hardware.

Yes, Apple has experience building its own mobile GPUs, but scaling up a GPU design isn't like scaling up a CPU design. Assuming Apple's A12Z isn't a huge departure from the A12X, we can expect a die size of roughly 122mm², and that's for the entire SoC, CPU cores included. The monolithic die of a larger desktop chip like the Core i9-10900K is 206.1mm² (comparing monolithic die to monolithic die), only modestly bigger.

122mm² doesn't even get the ball rolling for a high-end GPU. The RTX 2080 Ti has a die size of 775mm²; even AMD's Radeon VII, while much smaller, is a 331mm² design. It's also no accident that Intel has taken years to bring a discrete GPU to market, even though Tiger Lake systems with integrated Xe-class graphics are expected to ship in the very near future. Just as it takes time to scale up a CPU design, it takes time to scale up a GPU design.
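To put those die sizes side by side, here is a quick back-of-the-envelope comparison in Python. It uses only the figures cited above, compares raw area ratios, and ignores process density and yield differences; note too that the A12X number covers the whole SoC, not just its GPU:

```python
# Die sizes cited in the article, in mm^2.
dies = {
    "Apple A12X/A12Z (full SoC)": 122.0,
    "Intel Core i9-10900K (CPU)": 206.1,
    "AMD Radeon VII (GPU)": 331.0,
    "Nvidia RTX 2080 Ti (GPU)": 775.0,
}

baseline = dies["Apple A12X/A12Z (full SoC)"]
for name, area in dies.items():
    # Area ratio relative to the A12X die.
    print(f"{name}: {area:.1f} mm^2 ({area / baseline:.1f}x the A12X)")
```

The takeaway: a 2080 Ti-class die is over six times the area of Apple's entire current SoC, which is why "just scale it up" is not a two-year afterthought.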

What may happen is this: Apple may launch its own laptop and desktop ARM CPUs with its own integrated GPU silicon, while simultaneously supporting third-party (most likely AMD) GPUs on higher-end Mac hardware. So long as we're talking about an SoC, it makes obvious sense for Apple to field its own silicon. The question is whether Apple will start building its own discrete add-in cards. It would have to, if it wants to supply top-end graphics performance itself.

The only way to avoid that requirement would be to design a motherboard platform capable of handling the combined heat of a workstation-class CPU and GPU in the same "socket." This can theoretically be done; modern motherboards can field sockets that handle 250W+ of power draw. But you would need to design an integrated part with an on-package HBM2 stack to feed both the CPU and the GPU, because there's no DDR interface that can feed a top-end workstation card.

It would be a surprise to see Apple commission its own discrete GPUs, given that we’ve heard not a whisper of a dramatic GPU ramp. Even if the company takes this step, it seems more likely to ramp up its notebook and low-power desktop graphics first while continuing to rely on AMD, Intel, or Nvidia for dGPUs at higher performance levels.

The fact that Apple hasn't said anything publicly about a GPU transition the way it has about the CPU transition suggests to me that the company is biding its time on the announcement, or possibly feels less certain about it. Supposedly Apple began investigating its own CPU in 2015 and got serious about the effort in 2018. It may not have reached the same point with GPUs quite yet.


from ExtremeTech https://ift.tt/38CiZJX
