You may be a little confused. After all, doesn't the RTX 3050 already exist? Yes, but as a laptop GPU. Rumours are now swirling that there is going to be a desktop variant of the RTX 3050 launched imminently. We've pulled together all the information we can find to let you know exactly what's being discussed. Here's what you need to know about the expected Nvidia RTX 3050 release.
NVIDIA RTX 3050 Release Date
The RTX 3050 is expected to be officially announced at CES in Las Vegas this week, potentially on January 4th. Given that the card is yet to be officially confirmed, we don't have a solid, locked-in release date. But the good old rumour mill is optimistic about a release on or around January 27th. The expectation is that a higher-end RTX 3090 Ti GPU will be announced alongside the 3050, as Nvidia seems to be trying to cater to both the lower end and the higher end of the market.
Interestingly, the suggestion is that there are actually going to be two versions of the RTX 3050 GPU. Let's look at this in a bit more detail.
We're expecting the 3050 to be a budget GPU, but there is a suggestion that it will come with two different model numbers: GA106-150 and GA106-140. The 150 variant is expected to come with 2560 CUDA cores (CUDA stands for Compute Unified Device Architecture, Nvidia's proprietary parallel computing platform), 8GB of VRAM, and 80 Tensor Cores. The 140 version, meanwhile, is expected to have 2304 CUDA cores, 4GB of VRAM, and 72 Tensor Cores. The working assumption is that the RTX 3050 will use GDDR6 memory, like the RTX 3060.
There's a lot of uncertainty around the exact specs, and there haven't been any benchmark leaks yet for us to accurately compare performance. But given this is a budget GPU, you can probably expect it to be competing against the likes of Intel's Arc graphics cards, and the imminent AMD Radeon RX 6500 XT.
Anything Else Worth Knowing?
We are still waiting for the official announcement before we know how much these cards will cost. The RTX 3060 launched at $329/£300, which probably offers a benchmark; we'd expect the 3050 to be cheaper than this. Although given the ongoing chip shortage, we'll wait to see exactly how aggressively Nvidia decides to price it. The chip shortage has meant that budget GPUs have been thin on the ground, and we still don't know what availability will be like. That's even before scalpers get in on the act.
There's also speculation that these cards will be followed in due course by a slightly more powerful GA106-powered Ti version. Having three products that fundamentally use the same core GPU makes a lot of sense in terms of reducing manufacturing costs. Hopefully, this will then be reflected in the price of these products.
We'll keep a close eye on Nvidia at CES, and we'll update this as soon as we know more. So check back in soon.
Update: RTX 3050 Announced
We were expecting the RTX 3050 to be announced by Nvidia at CES, and they didn't disappoint. Billed as an affordable option for gamers, the RTX 3050 is claimed to bring "the performance and efficiency of the Ampere architecture to more gamers than ever before". It is expected to be capable of powering the latest games at over 60fps. The examples cited were Control, F1 2021, DOOM Eternal, Call of Duty: Black Ops Cold War, and Marvel's Guardians of the Galaxy.
The 3050 is equipped with second-generation RT cores for ray tracing, alongside third-generation Tensor cores for DLSS and AI. Other specs include:
- 8GB of GDDR6 memory
- 9 Shader Teraflops
- 18 RT Teraflops
- 73 Tensor Teraflops
The RTX 3050 will cost from $249, and will be available worldwide as soon as January 27th (exactly as predicted).
Update: Further Details Revealed
Nvidia has finally confirmed the full specifications for the RTX 3050. In addition to what was already known, these are as follows:
- 2560 CUDA cores
- 2nd Generation Ray Tracing Cores
- 3rd Generation Tensor Cores
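As a rough sanity check, the quoted 9 shader teraflops lines up with the confirmed 2560 CUDA cores. The sketch below assumes a boost clock of roughly 1.78 GHz, which is our assumption rather than a figure from the announcement, and uses the standard rule of thumb that each CUDA core performs one fused multiply-add (two FP32 operations) per clock:

```python
# Rough estimate of FP32 "shader teraflops" from core count and clock speed.
CUDA_CORES = 2560
OPS_PER_CLOCK = 2        # one fused multiply-add counts as two FP32 operations
BOOST_CLOCK_HZ = 1.78e9  # assumed boost clock (~1.78 GHz), not officially quoted here

tflops = CUDA_CORES * OPS_PER_CLOCK * BOOST_CLOCK_HZ / 1e12
print(f"Theoretical FP32 throughput: {tflops:.1f} TFLOPs")  # about 9.1
```

This is peak theoretical throughput, not real-world gaming performance, but it shows the headline teraflop figure is consistent with the core count rather than an independent measurement.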
Prices start from £239 in the UK. The cards also support Nvidia DLSS (Deep Learning Super Sampling) and Nvidia RTX. In theory, these will boost performance while delivering the best possible image quality. But it won't be until we see real-life benchmarks that we can gauge how good the performance really is.