Nvidia's RTX 5070 Ti and RTX 5070 allegedly sport 16GB and 12GB of GDDR7 memory, respectively — Up to 8960 CUDA cores, 256-bit memory bus, and 300W TDP
The only thing left to be seen is the pricing.
Prolific leaker Kopite has detailed Nvidia's RTX 5070 family, and on paper, the overall bump in specs over Ada Lovelace is a mixed bag. The leaker has a proven track record, having previously revealed the RTX 5090 and RTX 5080 specifications. The RTX 5070 family comprises the RTX 5070 Ti and the base RTX 5070. Rumor has it that Nvidia will debut Blackwell with the RTX 5090, RTX 5080, and the RTX 5070 family at CES next month.
According to the leaked data from Kopite, the RTX 5070 Ti sports the GB203-300-A1 die, similar to the RTX 5080, and packs 8960 CUDA cores across 70 SMs, roughly 17% more than the RTX 4070 Ti's 7680. Over a 256-bit interface, the RTX 5070 Ti gets 16GB of GDDR7 VRAM, rumored to run at 28 Gbps for a total bandwidth of 896 GB/s. That bandwidth puts the RTX 5070 Ti quite close to the RTX 5080 despite the 20% delta in core counts.
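The quoted bandwidth figures follow directly from bus width and per-pin data rate. A quick sketch of the arithmetic, using the rumored (and unconfirmed) numbers from the leak:

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored RTX 5070 Ti: 256-bit bus, 28 Gbps GDDR7
print(memory_bandwidth_gbs(256, 28))  # 896.0 GB/s

# Rumored RTX 5070: 192-bit bus, 28 Gbps GDDR7
print(memory_bandwidth_gbs(192, 28))  # 672.0 GB/s
```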
For context, the RTX 4080 and 4070 Ti had a noticeable spec gap, pushing Nvidia to price them almost $400 apart. It is reasonable to expect that this delta will not be as large with Blackwell, but the ball is in Nvidia's court. The RTX 5070 Ti is expected to draw 300W of power, 15W more than its predecessor.
Moving on, the RTX 5070 is allegedly powered by the GB205-300-A1 die. This is a step down from the RTX 4070, which used the AD104-250, an XX104-class GPU that sits a tier above XX105/205-class silicon. The smaller die saddles the RTX 5070 with a comparatively modest 6144 CUDA cores, though that is still about 4% more than the RTX 4070's 5888. That aside, it offers 12GB of GDDR7 memory across a 192-bit interface for 672 GB/s of bandwidth. The TDP climbs to 250W, 25% more than the RTX 4070's 200W.
| GPU Name | RTX 5070 Ti | RTX 4070 Ti | RTX 5070 | RTX 4070 |
|---|---|---|---|---|
| Die | GB203-300-A1 | AD104-400-A1 | GB205-300-A1 | AD104-250-A1 |
| CUDA Cores | 8960 | 7680 | 6144 | 5888 |
| Bus Width | 256-bit | 192-bit | 192-bit | 192-bit |
| Memory | 16GB GDDR7 | 12GB GDDR6X | 12GB GDDR7 | 12GB GDDR6X |
| TDP | 300W | 285W | 250W | 200W |
The RTX 5080's substantial reduction in specs compared to the RTX 5090 has set an underwhelming tone for the rest of the Blackwell lineup. While the RTX 4090 had 68% more cores than the RTX 4080, that disparity has widened to 102% this generation, at least according to the unconfirmed leaks.
Samsung is expected to initiate mass production of its GDDR7 24Gb (3GB) memory early next year. A potential RTX 50 Super refresh could employ these newer modules for 50% higher VRAM capacities, but that's speculation. Theoretically speaking, Nvidia could announce a 48GB RTX 5090 SUPER, but it doesn't have to since both Intel and AMD have dropped out of the high-end market. Users demanding higher VRAM capacities, likely for AI, will have to pay through the nose and opt for Nvidia's Blackwell data center accelerators or future Blackwell workstation GPUs.
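The 50% capacity uplift follows from the module math: the chip count is fixed by the bus width (one GDDR7 package per 32-bit channel), so capacity scales with module density. A sketch of that arithmetic, assuming the conventional one-package-per-channel layout:

```python
def vram_capacity_gb(bus_width_bits: int, module_density_gb: int) -> int:
    """Total VRAM in GB, assuming one GDDR7 package per 32-bit channel."""
    chips = bus_width_bits // 32
    return chips * module_density_gb

# 256-bit card: today's 16Gb (2GB) modules vs. Samsung's upcoming 24Gb (3GB) modules
print(vram_capacity_gb(256, 2))  # 16 GB
print(vram_capacity_gb(256, 3))  # 24 GB

# 192-bit card
print(vram_capacity_gb(192, 2))  # 12 GB
print(vram_capacity_gb(192, 3))  # 18 GB
```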
Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.
Newoak It reminds me of the US auto industry, where rich people's cars kept getting more expensive, and poor people's cars kept getting worse and more expensive, with small turbocharged engines, CVT transmissions no one liked, and stop-start systems and environmental regulations everyone hates. In G-d I trust. -
hotaru251 Honestly hoping AMD's low/mid-range GPUs are actually decent at ray tracing, because if THAT'S a 5070, the 5060 is gonna be bad and likely a worse buy than the 4060. -
giorgiog I know it's Christmas and all, but maybe the hardware "journalists" should stop being shills and immediately <Mod Edit> on NVIDIA for gimping these cards with anemic amounts of VRAM. I'm keeping my 3080 for another 2 years. -
nitrium My RTX 2060 6GB, which turns 6 years old in March 2025, is so long in the tooth that I have to replace it with something. I'll probably get an RTX 5070, but I suspect it won't be the upgrade it should be given how much time has passed. -
thestryker I have a feeling that if the leaked VRAM capacities are all correct, we're going to see a midcycle refresh on everything but the 5060 Ti (rumored to have 16GB) using 24Gb GDDR7 modules (8GB becomes 12GB, 12GB becomes 18GB, and 16GB becomes 24GB). I don't really care what sort of compression Nvidia pulls off; there are no guarantees when it comes to the way games are developed. Having to turn down options your GPU is capable of running because it's not paired with enough VRAM is just a place nobody should be in.
The mediocre releases seen the last couple of years makes it really easy to not upgrade, but I do feel bad for anyone putting together something new or who needs to. -
derekullo The Nvidia A4000 still feels like the sweet spot if you want to experiment with AI and large language models: 16 gigabytes of VRAM, and it only takes up a single slot at $700 each! -
palladin9479 thestryker said: I have a feeling that if the leaked VRAM capacities are all correct we're going to see a midcycle refresh on everything but the 5060 Ti (rumored to have 16GB) using 24Gb GDDR7 modules (8GB becomes 12GB, 12GB becomes 18GB and 16GB becomes 24GB). I don't really care what sort of compression nvidia pulls off there are no guarantees when it comes to the way games are developed. Having to turn down options your GPU is capable of running because it's not paired with enough VRAM is just a place nobody should be in.
The mediocre releases seen the last couple of years makes it really easy to not upgrade, but I do feel bad for anyone putting together something new or who needs to.
Those amounts are correct because GDDR7 is still primarily shipped in 16Gb (2GB) modules. Sometime next year Samsung is supposed to have its 24Gb (3GB) modules available, so expect a Super or Ti refresh using them at a price premium.
That said, there is no game that you'll be playing at enjoyable framerates that is going to struggle with VRAM. It's like demanding all computers come with more than 32GB of RAM. (32GB is actually overkill, but 16GB is starting to be too little and there isn't really a middle ground here.) -
thestryker
palladin9479 said: Having said, there is no game that you'll be playing at enjoyable framerates that is going to struggle with VRAM. It's like demanding all computers come with more then 32GB of ram. (32GB is actually overkill, but 16GB is starting to be too little and there isn't really a middle ground here).
Here we go again with your lies regarding VRAM capacity. I've asked you to stop before, and here we are again: please just stop.
Rather than repeat myself, here:
https://forums.tomshardware.com/threads/intel-arc-b580-review-the-new-249-gpu-champion-has-arrived.3864870/post-23393536
https://forums.tomshardware.com/threads/intel-arc-b580-review-the-new-249-gpu-champion-has-arrived.3864870/post-23393123
Oh and I'll add another one for good measure:
https://i.imgur.com/3sJZprP.jpeg
https://i.imgur.com/J8xVzVN.jpeg
https://www.computerbase.de/artikel/gaming/indiana-jones-und-der-grosse-kreis-benchmark-test.90500/seite-2
edit: Graphics-intensive games today tend to follow console cycles, so pretty much everything designed in the PS4/Xbox One era was fine with 8GB of VRAM, and current titles are typically fine up to 12GB, but as we've seen with the increasing limitations of 8GB cards, this is unlikely to last. -
Alvar "Miles" Udell With the 5070 Ti likely pushing north of $800, I just can't get excited about it, and with the vanilla 5070 sticking to 12GB of VRAM, I can't get excited about that either: 12GB in 2024 is like 8GB in 2020; it's already getting insufficient. -
bit_user
palladin9479 said: It's like demanding all computers come with more then 32GB of ram. (32GB is actually overkill, but 16GB is starting to be too little and there isn't really a middle ground here).
There are 24 GB DDR5 DIMMs, of course.
thestryker said: edit: Graphics intensive games today tend to work on console cycles so pretty much everything designed in the PS4/Xbox One era was good with 8GB VRAM and current are typically good up to 12GB VRAM, but as we've seen with increasing limitations with 8GB VRAM this is unlikely to last.
One thing about consoles is that even when you use a PS5 with a 4K TV, not all games will render natively at 4K. A lot will just render at 1080p and then scale, which is why the PS5 Pro put so much effort into improving scaling quality.
My point is that the PS5 has 16 GB of memory shared between the GPU and CPU, but the memory used by the GPU might still only be holding textures and assets sized for 1080p rendering. So, however much you figure they devote to the GPU, it might be rather lower than what a PC version of the same game would use at 1440p or 4K.