Monday, November 15, 2021


(Photo: Laura Ockel/Unsplash, PCMag)
According to a Twitter account that posts rumors about upcoming tech, the next generation of GPUs from both AMD and Nvidia is going to be unlike anything we've seen before, both in terms of raw power and power consumption. Twitter user @greymon55 has listed possible specs for upcoming GPUs from both camps, and though these are obviously just rumors at this point, they point to what could be a massive leap forward for both companies, with the flagship cards effectively offering more than twice the power of today's tentpole products.

(Photo: greymon55 on Twitter)

Starting with Nvidia, the tipster indicates the upcoming Ada Lovelace AD102 GPU will be built on a 5nm process from TSMC, a big change given that the current Ampere lineup is produced on Samsung's 8nm process. TSMC's 5nm node is the same one behind Apple's impressive SoCs, used in both its iPhones and its new M1 Pro and Max silicon. AD102 should pack a whopping 144 streaming multiprocessors, up from 82 in the current RTX 3090, and 18,432 CUDA cores, roughly a 76 percent increase over the 3090's 10,496. This lines up with rumors posted by a different leaker a year ago. The card should retain a 384-bit memory bus and continue to use GDDR6X memory. Core clock speeds could hover in the low 2GHz-plus range, though this is one area where the rumors have fluctuated. The newest leak suggests Nvidia may be able to crank clocks up to 2.3GHz, resulting in jaw-dropping FP32 performance of between 85 and 92 Teraflops. The RTX 3090 is theoretically capable of only around 35 Teraflops, so that would be a gigantic leap forward.
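For anyone wanting to sanity-check those Teraflops figures: peak FP32 throughput is conventionally estimated as cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick sketch using the rumored AD102 numbers and the 3090's published specs (the 2.3GHz AD102 clock is the leak's figure, not a confirmed spec):

```python
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: cores * 2 FLOPs/clock (FMA) * clock, in TFLOPS."""
    return cores * 2 * clock_ghz / 1000

# Rumored AD102: 18,432 CUDA cores at a leaked ~2.3GHz
print(round(fp32_tflops(18_432, 2.3), 1))   # ~84.8, the low end of the 85-92 TFLOPS claim

# Current RTX 3090: 10,496 CUDA cores at its 1.695GHz boost clock
print(round(fp32_tflops(10_496, 1.695), 1)) # ~35.6, matching the ~35 TFLOPS cited above
```

The 85 to 92 Teraflop spread in the rumor simply corresponds to clock speeds between roughly 2.3 and 2.5GHz on the same core count.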

Of course, all that power also means the card will most likely be an energy-guzzling behemoth. This particular leaker suggests AD102 could consume somewhere between 450W and 650W at peak, and we've heard similar rumors that the card could end up around 500W. That's a large jump from the current maximum of 350W for the RTX 3090; the previous generation of GPUs generally maxed out at 250W.

(Photo: greymon55 on Twitter)

In the AMD camp, the rumors are heating up as well, pardon the pun. Dubbed Navi 31, aka the RX 7900 XT, the next flagship RDNA 3 GPU from AMD is switching from a monolithic design (which Nvidia will apparently stick with for Ada) to a Multi-Chip Module (MCM) layout similar to what the company currently uses in some of its Ryzen and Epyc CPUs. According to WCCFTech, AMD will drop Compute Units (CUs) in favor of Workgroup Processors (WGPs), and the MCM will pair a Graphics Core Die (GCD), fabbed on TSMC's 5nm line, with a Multi-Cache Die from its 6nm process. Each GCD holds 7,680 cores, so with two GCDs per card that works out to 15,360 cores total, compared with 5,120 "streaming processors" in the current RX 6900 XT. With such a radical change in architecture, though, comparisons between new and old are becoming less meaningful.

Rounding out the specs will be up to 32GB of GDDR6 on a 256-bit interface, 256MB of Infinity Cache per module for a total of 512MB of on-die "3D" cache, and clock speeds hovering in the mid-2GHz range. That could allow the GPU to achieve theoretical FP32 performance of around 75 Teraflops, lower than the rumored output of Nvidia's Ada card, though FP32 prowess doesn't translate directly into gaming performance. Finally, the rumormonger says they have no data on total board power, but they expect it to land between 350W and 550W.
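The same rule of thumb applies on the AMD side: the rumored ~75 Teraflop figure falls straight out of 15,360 cores at a mid-2GHz clock (the 2.45GHz value below is our assumption for "mid-2GHz," not a leaked number):

```python
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: cores * 2 FLOPs/clock (FMA) * clock, in TFLOPS."""
    return cores * 2 * clock_ghz / 1000

# Rumored Navi 31: 15,360 cores at an assumed ~2.45GHz
print(round(fp32_tflops(15_360, 2.45), 1))  # ~75.3, in line with the rumored ~75 TFLOPS
```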

To summarize: we know these are just rumors, and they could be wildly off the mark, but rumors like this are all that has come through the transom for some time now, and they match leaks from almost a year ago, so our anticipation level for these GPUs is definitely high. The most interesting question they raise is one the original poster put on Twitter: "Double performance, double power consumption, can you accept it?" Our answer? Oh, hell yes we can.




from ExtremeTech https://ift.tt/3DiT3Br
