
GALAX Confirms GeForce RTX 3080 20GB and RTX 3060, RTX 3060 Matches RTX 2080

Well, GDDR6X DRAM modules might be less scarce than RTX 3080 chips, so why sell the cheaper version of the card with "just" 10GB, right?
 
What the game allocates and what it actually needs as a minimum not to bottleneck the game are two different stories.
Then I hope you're right. I went for a 1080 Ti when it came out because the R9 struggled too much on new game releases.
 
Even the RTX 2060 matched the GTX 1080, so it's not a big deal.


This.

Hasn't the overall community already established that *all* GPU makers make BS performance claims? And that the performance leather jacket man was talking about was really all about ray tracing, rather than actual performance?
 
I bet even a single 4GB VRAM GPU gives a better gaming experience than a Crossfire/SLI setup at 4K. Nvidia has already given up on SLI anyway.
They totally gave up on multi-GPU, but 1070 SLI seemed to perform really well once prices came down.
 
Then I hope you're right. I went for a 1080 Ti when it came out because the R9 struggled too much on new game releases.
Well, I'm talking about 4K resolution. Some people claim that you need more than 10GB to run certain games at 4K with the settings at Ultra, like Battlefield V played at 4K Ultra with DXR.
Here you have a 2070 Super playing Battlefield V at 4K Ultra with DXR, on a 9600K. Memory usage: 7GB. The game runs at around 45 FPS, but it's just a 2070 Super, and its 8GB is still enough. If you turned any setting down in the game, you wouldn't do it because of a VRAM deficiency but to bump the FPS. Maybe if the CPU were OC'd to 5GHz (in-game it shows 4.5-4.6GHz), the game would run faster.
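If anyone wants to see what a game is actually holding in VRAM while it runs, here's a minimal sketch using NVIDIA's NVML bindings for Python (assuming the pynvml package is installed; an overlay like Afterburner/RTSS reports the same counter):

```python
# Minimal VRAM usage poll via NVML (assumes `pip install pynvml`).
# Prints allocated memory on GPU 0 once per second; run it alongside the game.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} GiB / {mem.total / 1024**3:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Keep in mind this reports allocated memory, which, as noted above, isn't the same as what the game strictly needs to avoid a bottleneck.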
 
Unfortunately, the mass market does not understand many things.
They (tried to) buy the 3080 because of the hype surrounding it, to slot into their Ryzen 3 3300X system, DVI-linked to the 60Hz 1080p HP monitor they bought on Black Friday at Walmart.
More is always better, so more VRAM is definitely better, right? At least this way they might release their current 3080s onto the used market.

(I would like to believe that) Nvidia isn't stupid. They made the current cards with the current VRAM size because they figured it would be more than enough for over 90% of the users buying a 3080.
VRAM beyond that size is a commodity: either you have special use cases (e.g. planning for 4K/8K texture mods on EVERYTHING in 2077, with custom meshes including the 2B ass port) or you buy for future-proofing, which is a gamble in itself since it depends heavily on game developers actually using what Nvidia gives them. SLI was a good example of FOMO, now that it's officially dead for the average consumer. MAAAAAYBEEE it might help DLSS 3.0, but that's a tin-foil-hat theory built on the premise that they have to use the extra VRAM for something.

Currently, the real bottleneck that no one talks about is the audio/video interface. I don't know if it's possible to get around it by ganging multiple links (as seen on some monitors), but 8K 60fps "HDR+" with no compression is already impossible over a single HDMI 2.1/DP 2.0 connection (the marketing numbers they give you up front are the maximum resolution with compression, without HDR+, and at 60fps or lower). So you gain all this processing power that your monitor can't even fully display because of the physical bandwidth limit of the interface. That's why the marketing has shifted to 8K gaming: now that 4K high-refresh (120+) HDR gaming is a reality, the new "1440p", they want you to lust after the next unobtainable thing.
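To put rough numbers on the interface limit, here's a back-of-the-envelope sketch. The ~6% blanking overhead is an assumption (exact timings vary by mode), and the HDMI 2.1 figure is its 48 Gbit/s FRL rate after 16b/18b encoding:

```python
# Back-of-the-envelope check of the uncompressed bandwidth an 8K60 HDR signal needs,
# versus the effective data rate of a single HDMI 2.1 FRL link.
width, height = 7680, 4320      # 8K UHD active pixels
refresh = 60                    # Hz
bits_per_pixel = 30             # 10 bits per channel, RGB (HDR)
blanking_overhead = 1.06        # ~6% extra for reduced-blanking timings (assumption)

required_gbps = width * height * refresh * bits_per_pixel * blanking_overhead / 1e9
hdmi21_effective_gbps = 48 * 16 / 18   # 48 Gbit/s raw FRL, 16b/18b coding

print(f"8K60 10-bit RGB needs ~{required_gbps:.1f} Gbit/s uncompressed")
print(f"A single HDMI 2.1 link carries ~{hdmi21_effective_gbps:.1f} Gbit/s")
print("Fits without compression:", required_gbps <= hdmi21_effective_gbps)
```

The numbers come out to roughly 63 Gbit/s needed versus ~43 Gbit/s available, which is why the "8K60" figures on spec sheets assume DSC.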
 
I disagree. If you don't know the first thing about GPUs, chances are you're not following new releases anyway. Early adopters are tech-savvy; they know what they're buying into.
I'm an early adopter. I have to be for my work and have been for at least 15 years - it's just that I don't have to worry about value for money or how wise the decisions are because I'm not paying for it.

Over the years, very few of my "early adopter" experiences have panned out. Most of them were simply victims of early adopter tax. It's why it's called a tax.

If you wait just a month or two, you can usually get a better version of that product for less money, with more stable drivers and a better all-round experience.
 
I'm an early adopter. I have to be for my work and have been for at least 15 years - it's just that I don't have to worry about value for money or how wise the decisions are because I'm not paying for it.

Over the years, very few of my "early adopter" experiences have panned out. Most of them were simply victims of early adopter tax. It's why it's called a tax.

If you wait just a month or two, you can usually get a better version of that product for less money, with more stable drivers and a better all-round experience.
What does "pan out" mean to you?
For an early adopter, it's all about getting the latest hardware, as soon as it is available. Not getting the best deal and teething problems go with the territory. Otherwise we'd all be early adopters.
 
What does "pan out" mean to you?
For an early adopter, it's all about getting the latest hardware, as soon as it is available. Not getting the best deal and teething problems go with the territory. Otherwise we'd all be early adopters.
As per my original reply, "Sometimes they get lucky and accidentally purchase a true gem", such as the Maxwell GTX 980 'FE' reference cards, which turned out to be a relatively long-lasting architecture, on solid drivers, with well-built cards/coolers that weren't undercut by the later AIB models.

You said it yourself: not getting the best deal, and teething problems, for the sake of impatience. If you don't call that a sucker, what do you call a sucker?
 
As per my original reply, "Sometimes they get lucky and accidentally purchase a true gem", such as the Maxwell GTX 980 'FE' reference cards, which turned out to be a relatively long-lasting architecture, on solid drivers, with well-built cards/coolers that weren't undercut by the later AIB models.

You said it yourself: not getting the best deal, and teething problems, for the sake of impatience. If you don't call that a sucker, what do you call a sucker?
I don't call a sucker someone who takes a gamble knowing the risks in advance. That's just an informed decision.
 
They have already pulled one rabbit from the hat... Ryzen.
True, but every time they come out with a new GPU, they hype it up only to have it be a failure.
 
Yeah it's gonna be a long time before people switch to 4K high refresh, probably after the 3080 has become obsolete.
4K 120Hz is nice, but 1440p 240Hz is still nicer :D, especially when you mix in competitive games.

100% agree. My next monitor will be 1440p 240Hz.
 
For an early adopter, it's all about getting the latest hardware, as soon as it is available. Not getting the best deal and teething problems go with the territory.
I don't call a sucker someone who takes a gamble knowing the risks in advance. That's just an informed decision.

Exactly this, I want new hardware when it's brand new.

I had a GTX 1080 and have been waiting since Turing; I watched reviews and knew what I wanted. To an extent I even enjoy the 'experimental' type features that don't have wide support, and being on that bleeding edge. Heck, even that GTX 1080 was purchased launch week, and I enjoyed over 4 years of amazing performance from it, including Nvidia allowing FreeSync to work and enabling RTX for the lols, and even today it performs admirably.

To me, the only ways the RTX 3080 10GB doesn't 'pan out' are if the card outright dies or has some widespread hardware issue that needs an RMA (even then it's manageable), or, more likely, if a new product releases soon that reduces the RTX 3080 10GB's value by worsening its price/performance ratio compared to the then-new products.
 
List one AMD GPU in the last 5 years that made Nvidia buyers reconsider their purchase decision :banghead:
There are none; people just learned, after waiting for Vega, Radeon VII and Navi, that AMD has nothing to offer besides a slightly cheaper alternative to Nvidia's midrange...
AMD's GPUs from the past five years are completely irrelevant. The RDNA 2 microarchitecture is a big leap for them, and this time round they're genuinely worrying Nvidia. Why do you think Nvidia just dropped a 2080 Ti-performance card for ÂŁ500? They're concerned.

Regardless of any doubts based on past history, when two or more companies release new products within a month of each other, it's common sense to see what both have on offer...
 
AMD's GPUs from the past five years are completely irrelevant. The RDNA 2 microarchitecture is a big leap for them, and this time round they're genuinely worrying Nvidia. Why do you think Nvidia just dropped a 2080 Ti-performance card for ÂŁ500? They're concerned.

I mean, you have to understand the skepticism. We hear these same arguments every AMD launch.

Before, it was RDNA. Big words, OK performance but nothing game-changing, and plenty of driver issues.

Before that, it was Vega and so on and so forth...
 
You have to admit, though, high-refresh gaming is good stuff. Consoles just can't do it, bb.
I'm playing some games at 30-40 FPS with G-SYNC on, and it's as fluid as it can be. Since I'm not a pro gamer, this whole MOAR FPS thing is just overrated.
 
I'm playing some games at 30-40 FPS with G-SYNC on, and it's as fluid as it can be. Since I'm not a pro gamer, this whole MOAR FPS thing is just overrated.

It really depends on the game, IMO; some games are programmed to look smooth at 30-60 FPS. The 2010 Aliens vs Predator game is a great one to test this idea with, because it's actually not bad at 60 FPS and it doesn't allow high refresh natively, but if you hit Alt+Enter twice, the game will run at 144 or 165 FPS, and it's a night-and-day difference. It's really hard to go back to that distortion of 60Hz (yes, G-SYNC was on at all times for this testing).
 
It really depends on the game, IMO; some games are programmed to look smooth at 30-60 FPS. The 2010 Aliens vs Predator game is a great one to test this idea with, because it's actually not bad at 60 FPS and it doesn't allow high refresh natively, but if you hit Alt+Enter twice, the game will run at 144 or 165 FPS, and it's a night-and-day difference. It's really hard to go back to that distortion of 60Hz (yes, G-SYNC was on at all times for this testing).
That's just it: high refresh is great for some games, yet every non-high-refresh monitor announcement is met with boos around here.
 
5700 and 5700XT
The 5700 XT maybe, but the 5700 was basically a slightly better 2060, nine months later and with no RTRT. If you bought a 2060 for RTRT, you wouldn't have any regrets.
The 5700 XT didn't have RTRT either, but it came in at around $100 less.
 
It depends a lot on the price and performance of the 20 GB model. Say the price turns out to be $999 but with only a 5-10% performance boost; then I don't think the ones that bought the 10 GB model will be that angry.
Also, as it stands right now, the $1500 RTX 3090 is rumoured to be only ~10% faster than the 3080 (in games), but that's just a rumour for now; we will know tomorrow.

So the 3090 is actually ~10% faster than the 3080. The 3080 20GB, then, would cost maybe $900 for 10 more GB and, I guess, a 5% performance increase?
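Just to put rough numbers on that, here's a quick sketch; the 20GB card's price and uplift are the guesses from this thread, the 3090 figure is the rumoured ~10%, and $699 is the 3080 10GB US MSRP:

```python
# Rough relative price-per-performance, using the 3080 10GB as the baseline.
# The 20GB and 3090 figures are guesses/rumours quoted in this thread, not confirmed specs.
cards = {
    "RTX 3080 10GB":     {"price": 699,  "relative_perf": 1.00},
    "RTX 3080 20GB (?)": {"price": 900,  "relative_perf": 1.05},  # guess from this thread
    "RTX 3090":          {"price": 1500, "relative_perf": 1.10},  # rumoured ~10% faster
}

for name, c in cards.items():
    dollars_per_perf = c["price"] / c["relative_perf"]
    print(f"{name:<18} ${c['price']:>4}  ->  ~${dollars_per_perf:.0f} per unit of 3080-level performance")
```

On those assumed numbers, both bigger cards cost noticeably more per frame than the 10GB card; the extra VRAM is what you'd really be paying for.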
 