Thursday, January 2nd 2025

NVIDIA Plans GeForce RTX 5080 "Blackwell" Availability on January 21, Right After CES Announcement

A report from Hong Kong tech outlet HKEPC indicates that NVIDIA's GeForce RTX 5080 graphics card will launch on January 21, 2025. The release follows a planned announcement event on January 6, where CEO Jensen Huang will present the new "Blackwell" architecture. Specifications anticipated from prior rumors point to the RTX 5080 using the GB203-400-A1 chip, containing 10,752 CUDA cores across 84 streaming multiprocessors (SMs). The card keeps 16 GB of memory but upgrades to GDDR7 technology running at 30 Gbps, while other cards in the series are expected to use 28 Gbps memory. The GPU is manufactured on TSMC's 4NP 4 nm node. This improvement in manufacturing technology, combined with architectural changes, accounts for most of the expected performance gains, as the raw CUDA core count increases by only about 10% over the RTX 4080. NVIDIA is also introducing greater segmentation between its Blackwell SKUs: the RTX 5090 is expected to have nearly double the CUDA cores and double the GDDR7 memory capacity.
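The bandwidth implied by these rumored memory specs is simple arithmetic; a quick sketch (the 256-bit bus width comes from the same rumor mill, not from NVIDIA):

```python
# Peak memory bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 5080: 30 Gbps GDDR7 on a 256-bit bus
print(bandwidth_gb_s(30, 256))    # 960.0 GB/s

# RTX 4080 for comparison: 22.4 Gbps GDDR6X, also 256-bit
print(bandwidth_gb_s(22.4, 256))  # ~716.8 GB/s
```

If the 30 Gbps figure holds, that is roughly a 34% bandwidth uplift over the RTX 4080 on the same bus width.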

NVIDIA is organizing a GeForce LAN event two days before the announcement, marking the return of the gathering after 13 years; the timing suggests NVIDIA wants to capture gamers' hearts with 50 hours of non-stop gameplay. Meanwhile, AMD currently has no competing products announced in the high-end graphics segment, leaving NVIDIA without direct competition in this performance tier. This market situation could affect the final pricing of the RTX 5080, which will be revealed during the January keynote. While the January 21 date appears set for the RTX 5080, launch dates for other cards in the Blackwell family, including the RTX 5090 and the RTX 5070 series, remain unconfirmed. NVIDIA typically releases the different models in its GPU families on separate dates to manage production and distribution effectively.
Sources: HKEPC (X post deleted), via VideoCardz

24 Comments on NVIDIA Plans GeForce RTX 5080 "Blackwell" Availability on January 21, Right After CES Announcement

#1
Legacy-ZA
Let's see if they can get enough on the shelves; there's always a shortage at launch.

Does the new/younger generation enjoy going to LANs? All I know is, my old bones are aching, I can't carry my rig along with me anymore. ;P
#2
N/A
I would trade the faster 16GB for the slower 24GB. Who cares about bandwidth when the buffer is full.
#3
Calmmo
N/A: I would trade the faster 16GB for the slower 24GB. Who cares about bandwidth when the buffer is full.
Definitely expecting a ti with 24gigs a few months later from defective GB202 silicon.
#4
Macro Device
Legacy-ZA: Does the new/younger generation enjoy going to LANs?
I'd definitely give that a shot but I'm banned from visiting all these four countries. Better luck next time I guess.
#5
napata
N/A: I would trade the faster 16GB for the slower 24GB. Who cares about bandwidth when the buffer is full.
Lack of bandwidth can cripple performance though. You could end up with a GPU slower than a 4080 if you went for a 192-bit bus. I can assure you more people would raise a ruckus about that than about the 16GB. More VRAM beyond 16GB doesn't add anything in 99.9% of cases, and I say that as someone who owned a 4090 and will get a 5090 when it releases.
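The 192-bit scenario checks out arithmetically. A quick sketch (the 192-bit card is a thought experiment from the comment above, not a rumored SKU; 30 Gbps is the rumored 5080 data rate and 22.4 Gbps is the RTX 4080's actual one):

```python
# Compare a hypothetical 192-bit GDDR7 card at 30 Gbps against the
# RTX 4080's real 256-bit GDDR6X configuration at 22.4 Gbps.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8  # GB/s

hypothetical_192 = bandwidth_gb_s(30, 192)   # 720.0 GB/s
rtx_4080 = bandwidth_gb_s(22.4, 256)         # ~716.8 GB/s
print(f"192-bit GDDR7: {hypothetical_192} GB/s vs 4080: {rtx_4080} GB/s")
```

Even with the faster GDDR7, a 192-bit card would barely exceed the 4080's bandwidth, so bandwidth-bound workloads would see essentially no gain.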
#6
Daven
I expect the 5080 to be about 20% faster than the 4080 (10% more CUDA cores and ~10% higher clocks). Any higher would require significant IPC changes, as even faster clocks could be limited by using the same 4 nm process node.
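That ~20% figure comes from compounding the two gains multiplicatively; a rough sketch assuming performance scales linearly with both core count and clock speed:

```python
# Compound the rumored ~10% core-count increase and the assumed ~10% clock uplift.
cores_gain = 1.10   # 10,752 vs the 4080's 9,728 CUDA cores is ~10.5%, rounded down
clock_gain = 1.10   # assumption from the comment above, not a confirmed spec
combined = cores_gain * clock_gain             # 1.21
print(f"~{(combined - 1) * 100:.0f}% faster")  # ~21% faster
```

Gains compound (1.10 × 1.10 = 1.21), so "10% + 10%" lands slightly above 20%, before any IPC or bandwidth effects.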
#7
mb194dc
Doesn't seem a very exciting product; it will be a bit faster than the 4080 but probably more $ as well.

5090 looks interesting, but will have a Titan price.
#8
RedelZaVedno
Yeah, the 5080 is gonna have a tough time matching the 4090 (26% performance difference vs the 4080 in 4K). Maybe it can roughly catch it on averages if GDDR7 and the clocks really make such a difference (I doubt it), but the 256-bit bus will be a limiting factor at 4K and above, so I don't expect it to match the 4090's 0.1%/1% lows, which are in my opinion more important for a smooth experience than averages. It will all come down to pricing in the end. It will be a decent GPU if priced at $999 and a complete shitshow if priced at $1,299 or higher. The truth is Ngreedia can do whatever it pleases as it has no one left to compete with :banghead:
#9
Daven
RedelZaVedno: Yeah, the 5080 is gonna have a tough time matching the 4090 (26% performance difference vs the 4080 in 4K). Maybe it can roughly catch it on averages if GDDR7 and the clocks really make such a difference (I doubt it), but the 256-bit bus will be a limiting factor at 4K and above, so I don't expect it to match the 4090's 0.1%/1% lows, which are in my opinion more important for a smooth experience than averages. It will all come down to pricing in the end. It will be a decent GPU if priced at $999 and a complete shitshow if priced at $1,299 or higher. The truth is Ngreedia can do whatever it pleases as it has no one left to compete with :banghead:
Competition will be fierce below $500 this time, with products from AMD covering the stack all the way up to the 5070 Ti.

5070 Ti 100%
9070XT 95% $449
5070 80%
9070 75% $349
B770 70% $349
5060 Ti 65%
9060XT 65% $279
5060 55%
9060 55% $229
B580 50% $249
B570 45% $199

Nvidia will need to drop 5060 and 5070 series prices compared to the 4060 and 4070 series launch prices to compete. Only the 5080 and 5090 will lack competition; however, the SKUs above might limit how high Nvidia can price them.
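Taking the speculated table at face value, relative value can be ranked as performance points per dollar (every figure here is the commenter's guess, including the hypothetical 5070 Ti = 100% baseline; the NVIDIA cards are omitted because no prices were speculated for them):

```python
# Perf-per-dollar from the speculated table: name -> (performance %, price $).
speculated = {
    "9070 XT": (95, 449), "9070": (75, 349), "B770": (70, 349),
    "9060 XT": (65, 279), "9060": (55, 229), "B580": (50, 249),
    "B570": (45, 199),
}
# Rank by performance-per-dollar, best value first.
for name, (perf, price) in sorted(speculated.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1],
                                  reverse=True):
    print(f"{name:8s} {perf / price:.3f} perf%/$")
```

On these guessed numbers the cheaper AMD SKUs come out ahead on pure perf/$, which is the usual pattern for mid-range launches.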
#10
tpa-pr
Legacy-ZA: Let's see if they can get enough on the shelves, there is always a shortage on the launch date.

Does the new/younger generation enjoy going to LANs? All I know is, my old bones are aching, I can't carry my rig along with me anymore. ;P
The mere thought of trying to lift my rig makes my back ache :)

Anyway, I hope the 5080 gives the uplift and features people are looking for. I'm remaining firmly Team Red but new tech is new tech.

I also hope it'll be reasonably priced but based on all GPU pricing lately (except maybe Intel?) I suspect I'll be disappointed.

Is it likely they'll announce their new tech (e.g. DLSS 4) alongside it, or will that be held for the 5090? Or will the 5090 be announced at the same time? I'm wholly unfamiliar with CES.
#11
RedelZaVedno
Daven: Only the 5080 and 5090 will be lacking competition however the above SKUs might limit how high Nvidia can price them.
I wouldn't call that "only," as it's a huge deal from a marketing perspective. The xx80 class had at least some kind of competition dating back to the ATI days. The way I see it, the complete lack of competition at the top gives Nvidia the status of undisputed dGPU market champion, and with that comes a price premium. They can easily charge $100, maybe even $200, more for every tier and still outsell the competition 5:1, as the average Joe looks at what's best first and then at what he/she can actually afford from that brand, unfortunately disregarding other options on the market.
#12
Daven
RedelZaVedno: I wouldn't call that "only" as it's a huge deal from a marketing perspective. xx80 class had at least some kind of competition dating back to ATI tech times. The way I see it lack of competition at the top is giving Nvidia a status of an undisputed dGPU market champion and with that comes a price premium. They can easily charge 100 maybe even 200 bucks more for every tier and still outsell competition like 5:1 as average Joe looks at what's the best first and then what from that brand can he/she actually afford, disregarding other options on the market unfortunately.
AMD has been inconsistent about competing up and down the stack for a while now. There were no high-end SKUs in the Polaris 400 and 500 series, and none in the RDNA1 5000 series either. The Radeon VII was a one-off weird SKU that barely beat the mid-range 5700 XT that came shortly afterwards. Vega 56 and 64 sucked, so I won't even include them. Before the 6900 XT launch in December 2020, AMD really hadn't had a high-end SKU since Fury in mid-2015.

And now after just two generations of high-end SKUs (6900XT and 7900XTX), AMD is back to mid-range only again.
#13
Prima.Vera
Calmmo: Definitely expecting a ti with 24gigs a few months later from defective GB202 silicon.
Most likely it will be a 320-bit card with 20GB of VRAM. Which is fine.
The pricing however....
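The 320-bit/20 GB pairing follows from how GDDR capacity scales with bus width: each memory chip sits on a 32-bit channel, and current GDDR7 modules hold 2 GB. A sketch (assuming one chip per channel, i.e. no clamshell configuration):

```python
# VRAM capacity implied by bus width: one 32-bit channel per memory chip.
def vram_gb(bus_width_bits: int, chip_gb: int = 2) -> int:
    """Total VRAM in GB, assuming 2 GB modules and no clamshell."""
    return (bus_width_bits // 32) * chip_gb

print(vram_gb(320))  # 20 GB: the speculated cut-down GB202 card
print(vram_gb(256))  # 16 GB: matches the rumored RTX 5080 (GB203)
print(vram_gb(512))  # 32 GB: a full 512-bit GB202 configuration
```

A 24 GB card on GB202 would instead need a 384-bit bus (12 chips), which is why the bus width and capacity rumors travel together.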
#14
Bomby569
Couldn't care less; call me when they release the 5060 and 5070 lineups.
#15
RedelZaVedno
Daven: AMD has been inconsistent with competing up and down the stack for a while now. There were no high-end SKUs for the 400 and 500 Polaris series. Also no high-end SKUs for the 5000 RDNA1 series. The Radeon VII was a one-off weird SKU that barely beat the mid-range 5700XT that came shortly afterwards. Vega 56 and 64 sucked so I won't even include them. Before the 6900XT launch in Dec 2020, AMD really didn't have a high-end SKU since Fury in mid-2015.

And now after just two generations of high-end SKUs (6900XT and 7900XTX), AMD is back to mid-range only again.
Yeah, but Vega 64 competed with the 1080, and the Radeon VII took on the 1080 Ti, or at least they tried. Now AMD doesn't even pretend to compete anymore. The only hope for anything resembling competition in the dGPU market segment is Intel atm, and that's a big maybe, as it's TSMC-limited just like AMD. That's why I don't believe RDNA4 will be well priced. Sure, the 9070 XT could come out at $499 and sell out everywhere in a heartbeat, but why would Radeon/AMD do that, knowing that it has very limited wafer capacity allocated to dGPUs? I'm afraid they'll try to charge as much as they can for it, market share be damned.
#16
Daven
RedelZaVedno: Yeah, but Vega 64 competed with the 1080, and the Radeon VII took on the 1080 Ti, or at least they tried. Now AMD doesn't even pretend to compete anymore. The only hope for anything resembling competition in the dGPU market segment is Intel atm, and that's a big maybe, as it's TSMC-limited just like AMD. That's why I don't believe RDNA4 will be well priced. Sure, the 9070 XT could come out at $499 and sell out everywhere in a heartbeat, but why would Radeon/AMD do that, knowing that it has very limited wafer capacity allocated to dGPUs? I'm afraid they'll try to charge as much as they can for it, market share be damned.
AMD executives have explicitly said RDNA4's purpose is to increase market share in order to attract developers.

www.techpowerup.com/326415/amd-confirms-retreat-from-the-enthusiast-gpu-segment-to-focus-on-gaining-market-share?amp

Everything you said contradicts statements made by these executives. Either they are lying or they are hell bent on the right perf/$ to gain market share. Time will tell.
#17
RedelZaVedno
Daven: AMD executives have explicitly said RDNA4's purpose is to increase market share in order to attract developers.
I've heard that one and don't believe it. The truth probably is (as Coreteks' sources leaked) that RDNA4 was originally designed as a new iteration of the chiplet architecture and failed miserably for some reason. Now it's damage control, trying to "salvage" and sell as a monolithic die what they can. And secondly, if AMD were serious about increasing its dGPU market share, one would think they would allocate more of TSMC's wafer capacity to the Radeon division, yet they've shrunk it according to Taiwanese TSMC insiders. Why would they do that if they expect ultra-strong RDNA4 sales in 2025/26?
#18
wEeViLz
When will we see reviews on these cards?
#19
MxPhenom 216
ASIC Engineer
wEeViLz: When will we see reviews on these cards?
Jan 21st? Or a couple of days before that.
#20
Daven
RedelZaVedno: I've heard that one and don't believe it. The truth probably is (as Coreteks' sources leaked) that RDNA4 was originally designed as a new iteration of the chiplet architecture and failed miserably for some reason. Now it's damage control, trying to "salvage" and sell as a monolithic die what they can. And secondly, if AMD were serious about increasing its dGPU market share, one would think they would allocate more of TSMC's wafer capacity to the Radeon division, yet they've shrunk it according to Taiwanese TSMC insiders. Why would they do that if they expect ultra-strong RDNA4 sales in 2025/26?
So your belief is the same one everyone else has used to critique AMD's GPU division for the last ten years? (e.g. iteration, meant to be, damage control, failure, salvage, etc.)
#21
RedelZaVedno
Daven: So your belief is the same belief that everyone else uses to critique AMD's GPU division for the last ten years? (e.g. iteration, meant to be, damage control, failure, salvage, etc).
There's not much to be proud of in the Radeon division imho. I've been buying ATI products since the early 1990s, and it was all downhill once AMD acquired the company, especially once the old ATI dev team was gone.
#22
Frank_100
Is that photo real?
I expect a 5080 card to be much larger.
Something about the size of a briefcase.
#23
Dawora
RedelZaVedno: Yeah, the 5080 is gonna have a tough time matching the 4090 (26% performance difference vs the 4080 in 4K). Maybe it can roughly catch it on averages if GDDR7 and the clocks really make such a difference (I doubt it), but the 256-bit bus will be a limiting factor at 4K and above, so I don't expect it to match the 4090's 0.1%/1% lows, which are in my opinion more important for a smooth experience than averages. It will all come down to pricing in the end. It will be a decent GPU if priced at $999 and a complete shitshow if priced at $1,299 or higher. The truth is Ngreedia can do whatever it pleases as it has no one left to compete with :banghead:
It's so easy to tell who is an AMDead GPU user..

How is 256-bit limiting?
The 5080 is something like 960 GB/s.
The 4090 is 1008 GB/s.

I don't see any limiting here?
mb194dc: Doesn't seem a very exciting product, will be a bit faster than 4080 but probably more $ as well.

5090 looks interesting, but will have a titan price.
What is the price? I didn't see it yet.
#24
mb194dc
Dawora: It's so easy to tell who is an AMDead GPU user..

How is 256-bit limiting?
The 5080 is something like 960 GB/s.
The 4090 is 1008 GB/s.

I don't see any limiting here?

What is the price? I didn't see it yet.
Just rumours from China, converted to USD: $2,600 IIRC.
