NVIDIA GeForce RTX 3070 Ti spotted with 16GB GDDR6 memory

Authored by videocardz.com and submitted by PokemonAreCoolz

Lenovo has confirmed that their Legion T7 gaming desktop system will ship with GeForce RTX 3080 and GeForce RTX 3070 graphics cards.

Lenovo launches Legion T7 with GeForce RTX 3070 Ti 16GB

Interestingly, Lenovo also confirmed that their Legion T7 system will feature a GeForce RTX 3070 Ti model. This SKU has not been announced or even teased by NVIDIA in any form. It does, however, align with rumors that the RTX 3070 series will be offered in both 8GB and 16GB memory configurations. What remains unclear is whether the model is really called the RTX 3070 Ti or the RTX 3070 SUPER; we have heard both names in private conversations with AIBs.

NVIDIA GeForce RTX 30 Series Specifications (VideoCardz.com)

|              | RTX 3090      | RTX 3080      | RTX 3070 Ti/SUPER | RTX 3070      |
|--------------|---------------|---------------|-------------------|---------------|
| GPU          | 8nm GA102-300 | 8nm GA102-200 | 8nm GA104-TBC     | 8nm GA104-300 |
| Board        | PG132 SKU 30  | PG132 SKU 10  | TBC               | PG142 SKU 10  |
| CUDA Cores   | 10496         | 8704          | TBC               | 5888          |
| Base Clock   | 1395 MHz      | 1440 MHz      | TBC               | 1500 MHz      |
| Boost Clock  | 1695 MHz      | 1710 MHz      | TBC               | 1725 MHz      |
| FP32 Perf.   | 35.6 TFLOPS   | 29.8 TFLOPS   | TBC               | 20.3 TFLOPS   |
| Memory       | 24GB GDDR6X   | 10GB GDDR6X   | 16GB GDDR6        | 8GB GDDR6     |
| Memory Clock | 19.5 Gbps     | 19 Gbps       | TBC               | 14 Gbps       |
| Memory Bus   | 384-bit       | 320-bit       | 256-bit           | 256-bit       |
| Bandwidth    | 936 GB/s      | 760 GB/s      | TBC               | 441 GB/s      |
| TGP          | 350W          | 320W          | TBC               | 220W          |
| MSRP         | $1,499        | $699          | TBC               | $499          |
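The derived figures in the table are internally consistent: FP32 throughput follows from 2 operations (a fused multiply-add) per CUDA core per clock, and bandwidth from the per-pin data rate times the bus width. A quick Python sanity check, using only the table's own numbers as inputs:

```python
# Sanity check of the spec table: FP32 TFLOPS and memory bandwidth are
# derived quantities, so they should follow from the other columns.
cards = {
    # name: (CUDA cores, boost clock in MHz, memory clock in Gbps, bus width in bits)
    "RTX 3090": (10496, 1695, 19.5, 384),
    "RTX 3080": (8704, 1710, 19.0, 320),
    "RTX 3070": (5888, 1725, 14.0, 256),
}

for name, (cores, boost_mhz, mem_gbps, bus_bits) in cards.items():
    # 2 FP32 ops per core per clock (one fused multiply-add)
    tflops = 2 * cores * boost_mhz * 1e6 / 1e12
    # bandwidth = per-pin data rate * bus width / 8 bits per byte
    bandwidth = mem_gbps * bus_bits / 8
    print(f"{name}: {tflops:.1f} TFLOPS, {bandwidth:.0f} GB/s")

# Matches the table (35.6 / 29.8 / 20.3 TFLOPS; 936 / 760 GB/s), except the
# RTX 3070, where the formula gives 448 GB/s vs the 441 GB/s listed above.
```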

The Legion T7 systems are equipped with an Intel Z490 motherboard and either a Core i9-10900K or a Core i7-10700K processor, so these are definitely new additions to the desktop Legion lineup.

There is, however, something to consider. NVIDIA clearly did not provide partners with the full specifications until the very last moment. We have heard that the final BIOS for the Ampere series was delivered only recently. The doubled FP32 (CUDA core) count per SM was also not communicated clearly to partners until just a few days ago, which is why some AIBs still list incorrect CUDA core counts (5248/4352/2944) on their websites. This means Lenovo may still be relying on old data, which could have changed over the past few days.
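The halving is easy to see: Ampere doubles the FP32 units per SM from 64 to 128, and the stale counts on AIB websites are exactly half the final figures. A minimal sketch, using the SM counts implied by the table (CUDA cores divided by 128):

```python
# The old AIB listings (5248/4352/2944) are exactly the final CUDA core
# counts computed with 64 FP32 units per SM instead of Ampere's 128.
sm_counts = {"RTX 3090": 82, "RTX 3080": 68, "RTX 3070": 46}  # implied by the table

for name, sms in sm_counts.items():
    old_count = sms * 64   # pre-announcement counting, 64 FP32 units per SM
    new_count = sms * 128  # Ampere: doubled FP32 units per SM
    print(f"{name}: {old_count} -> {new_count}")
# RTX 3090: 5248 -> 10496, RTX 3080: 4352 -> 8704, RTX 3070: 2944 -> 5888
```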

Another reason this listing might not be accurate is that all three cards (3080/3070 Ti/3070) are listed with non-X GDDR6 memory. Theoretically, then, these might be custom OEM variants with slower memory technology that NVIDIA has not yet announced.

Lenovo Legion T7 with GeForce RTX 3070 Ti

Many thanks to Martin Refseth (HDR) for the tip!

MidgetsRGodsBloopers on September 2nd, 2020 at 15:17 UTC »

The level of ignorance in this thread about what VRAM is actually for is depressing.

IT HOLDS TEXTURES.

It also holds the framebuffer, which is up to 3 raw, uncompressed images at the screen resolution you're using.

Resolution and VRAM requirements haven't been strongly correlated since cards passed 1 GB. It used to be a big thing a couple of decades ago - oh, you want to play at 1280x1024? Yikes, that 384 MB might be cutting it close; maybe get a 512 MB card to be safe.

The difference between VRAM use at 640x480 and 8K isn't that big compared to the size of a modern VRAM pool.
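Putting rough numbers on that (a back-of-the-envelope sketch, assuming 32-bit color and triple buffering, and ignoring depth and G-buffers, which add more but don't change the conclusion):

```python
# Rough framebuffer cost at different resolutions, assuming 4 bytes/pixel
# (32-bit color) and triple buffering. Real engines keep more render
# targets, but the point stands: this is small next to an 8GB pool.
BYTES_PER_PIXEL = 4
BUFFERS = 3

for label, w, h in [("640x480", 640, 480), ("1080p", 1920, 1080),
                    ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    mib = w * h * BYTES_PER_PIXEL * BUFFERS / 2**20
    print(f"{label}: {mib:.0f} MiB ({mib / 8192:.2%} of 8GB)")
# 640x480: ~4 MiB; 8K: ~380 MiB -- under 5% of a modern 8GB VRAM pool
```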

Many games ALLOCATE more VRAM than they actually USE. Calidudy allocates ALL your VRAM regardless of how much you have. It's difficult to determine at any given time how much VRAM is actually IN USE vs RESERVED.
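This is also why monitoring tools mislead: what they report is what the driver has handed out, not what the game is actively touching. A minimal sketch with NVIDIA's NVML counters (via the pynvml bindings), which only expose that allocation-level view:

```python
# NVML reports allocated/reserved VRAM, not the working set a game is
# actively using -- there is no per-frame "actually in use" counter.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {mem.total / 2**20:.0f} MiB")
print(f"allocated (what tools call 'used'): {mem.used / 2**20:.0f} MiB")
print(f"free: {mem.free / 2**20:.0f} MiB")
pynvml.nvmlShutdown()
```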

Just a few years ago we were having this same talk about 3GB vs 4GB vs 6GB vs 8GB. Digital Foundry determined that VRAM requirements are vastly overstated in most cases; in the case of RE2, I believe, a 3GB card would happily run it even when the game said it needed far more.

Finally, texture memory limitations are the EASIEST thing for a developer or end-user to work around. You lower the texture setting one notch.

You can rest assured, developers will take into account the amount of VRAM available in their target audience and optimize their engine and presets accordingly.

Superbone1 on September 2nd, 2020 at 13:24 UTC »

But does having more VRAM actually do that much for us? Do people with newer 8-10GB cards feel like it's not enough? They've also said these cards are more optimized already.

Chewy12 on September 2nd, 2020 at 11:46 UTC »

They intentionally gave these base cards an underwhelming amount of RAM so people would still feel the need to upgrade later.