
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Idk, should I upgrade from my HD 4850 512 MB GDDR3? Not much performance difference, it seems. Minus the lack of ray tracing. :roll:
Well, if you're only playing Baldur's Gate II, ray tracing isn't supported anyway...
 
Joined
Nov 6, 2016
Messages
1,752 (0.59/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Trying to pull 2-year-old arguments? Hardly. Given that many AAA titles are raytracing-enabled, that major game engines like UE now support raytracing, and that some of the biggest games are going to be raytraced - Cyberpunk 2077 and Minecraft, just as examples - nobody believes those old arguments anymore. And you seem to have a bit of a split personality - your beloved AMD is saying the new consoles and their RDNA2 GPUs will support raytracing as well. So what are you really trying to say? Are you trying to prepare for the eventuality that AMD's raytracing performance sucks?

Someone needs to tell this guy to chill out and to stop taking criticism of a corporation and their products personally (weird, right?). I never understood the individuals who appoint themselves as the defenders of someone else's honor, especially when that someone isn't even a person but an abstract entity. Everyone needs to take a breath and remind themselves these are inanimate pieces of hardware...

.... Can't we all just get along?

P.S. The guy he's replying to isn't trying to "prepare for AMD ray tracing to be bad"; he's making the point that consoles dominate how things get done because they represent the vast bulk of gaming. So what he's saying is that however AMD does ray tracing will be the predominant way ray tracing gets done, and therefore games will primarily optimize for AMD's method over Nvidia's.

I'm just imagining the head of a development team calling 911:
911 Operator: "What's the problem this evening?"

Developer: "My family had a package delivered today that had an Nvidia video card in it and a .357 round...and now there's some guy wearing a leather jacket snooping around the bushes outside of my house!"

911 Operator: "OK, can you describe the individual so the police know who they're looking for?"

Developer: "Yeah, OK.... Um.... He's probably about 5'1" tall, has a $5 haircut and he's ranting and raving like a madman.... Something about 'buying more is saving more', he might be mentally unstable."

911 Operator: "OK, you said he's wearing a leather jacket; what else is he wearing?"

Developer: "He's just wearing the leather jacket and a pair of knee-high construction boots, but other than that, he's completely naked from the waist down."

911 Operator: "We'll have someone there immediately."
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Bit of a hard question to answer: do you "need" anything in this space? Do you even "need" a dedicated GPU?

This is about high-end gaming, and more VRAM to work with is better: higher-resolution textures, better shadow quality, and so on.
10 GB on a new flagship 3080 is... just extremely lackluster.

Like I said, it's like Big N is following Big I and just selling us a nothingburger due to the lack of competition in these price brackets.
To play games at ultra settings at 1080p/60 Hz, YES, you need a dedicated GPU for most titles.

The flagship isn't a 3080, however.

lol at the third line... here is a tissue for your premature ejac... errr, opinion. :p
 
Joined
Nov 6, 2016
Messages
1,752 (0.59/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7 nm iteration to 1.9 GHz; why would Nvidia only manage 1.7? I would believe this only if it's on Samsung 8 nm and that node really isn't good...

Maybe they can boost higher, but then they'd be consuming 450-600 watts.

If I remember correctly, Nvidia said that AMD reserved most of the 7 nm capacity over at TSMC. But they went ahead and beat AMD to reserving 5 nm for next year. So wait for an RTX 3000 Super series on TSMC 5 nm.

Where have you heard that about reserving 5 nm? I don't know how that's possible, considering AMD has had 5 nm reservations for Zen 4 for a while now.

The point, however, is that RT tech is here... it isn't a gimmick now that everyone is all in. Capeesh? :)
Have a cookie... just saying 300 W is nothing compared to some cards in the past. :)

Yeah, the past as in 45 nm to 28 nm... for them to be that high at 8 nm (Samsung is 8 nm, right?) is a little curious. But then again, I've heard leaks that Nvidia is really worried about RDNA2 and was forced into this position.
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
To play games at ultra settings at 1080p/60 Hz, YES, you need a dedicated GPU for most titles.

The flagship isn't a 3080, however.

lol at the third line... here is a tissue for your premature ejac... errr, opinion. :p

Well, that's the point: "do you 'need' X" is a difficult question to answer. Now you're quantifying it, now we can talk about something.
Do you need more than 10 GB of VRAM? Well, yeah, if a game developer wants to stuff massive, high-quality textures onto your 4K-or-higher-resolution screen, you do.
Why would the 3090 have 24 GB if 10 were enough...

The flagship isn't the 3080, but it's close enough, really; this is the first time we have a 3090 as a clear bracket above. Before this it was more 50 - 60 - 70 - 80 and then an 80 special edition.
The 80 is way up there, too far up there to have a pedestrian 10 GB of VRAM; today I'd give that to the 3060, honestly. The 1060 had 6...

No clue what your reaction to "the third line" is about.
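(To put rough numbers on the VRAM argument above, here's a back-of-envelope sketch; the texture size, format, and resident count are hypothetical, chosen purely for illustration.)

```python
# Rough VRAM estimate for game textures (illustrative numbers only).
def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate footprint of one texture in MiB."""
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3  # a full mip chain adds roughly one third
    return size / 2**20

# One uncompressed 4096x4096 RGBA8 texture with mips: ~85 MiB.
print(f"{texture_mib(4096, 4096):.0f} MiB per texture")

# 150 such textures resident at once would already blow past 10 GB:
print(f"{150 * texture_mib(4096, 4096) / 1024:.1f} GiB total")  # ~12.5 GiB
```

Real engines use block compression (BCn) that cuts this by a factor of 4-8, which is exactly why "is 10 GB enough" has no clean answer.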
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
As in Cyberpunk 2077? Yeah, right, that's a weak title. Minecraft? Yeah, also a weak title. Anything made on UE for the foreseeable future? Also weak and unimportant. Console ports made for DXR? Also unimportant .. ;) Riiiiight.

Those titles are not any weaker or stronger with or without RT; reading comprehension, buddy.

They were all there before we even knew RT was going to be in them. I'm hard passing on paying a premium for features I barely get to use and see little value in. It is just a graphical update, not much more. There is no new content for it, no new gameplay, nothing that gives me even a minute more fun than I used to have.

I'm not paying to early-adopt and beta-test technology that really isn't feasible even in the economic reality we used to have. And that reality has gotten worse, too. It's not like we're enjoying big paychecks because all is going so well lately, are we? Why, then, would the industry force us onto much bigger dies with much higher manufacturing costs? You know the answer.

The fact that power budgets apparently need to go up to 320 W even for a non-enthusiast 3080 is telling. It speaks of a design that needed to be stretched to meet its targets. It's Vega all over again, really, and I passed on that too. 320 W is not entering my case, especially not just for some fancy lighting.

My principles haven't changed; maybe now people understand that pro-Nvidia doesn't exist over here... I am pro-consumer and pro-progress. This isn't progress; it's brute-forcing something that used to be done much more efficiently, and making us pay the bill. Hell no. I'm still interested in how Nvidia and AMD fill up the mid-range. It had better be really good price/perf-wise, or I'm left with this 1080 for another few years. Which is fine, by the way; it does all I need it to.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Someone needs to tell this guy to chill out and to stop taking criticism of a corporation and their products personally (weird, right?). I never understood the individuals who appoint themselves as the defenders of someone else's honor, especially when that someone isn't even a person but an abstract entity. Everyone needs to take a breath and remind themselves these are inanimate pieces of hardware...

.... Can't we all just get along?

P.S. The guy he's replying to isn't trying to "prepare for AMD ray tracing to be bad"; he's making the point that consoles dominate how things get done because they represent the vast bulk of gaming. So what he's saying is that however AMD does ray tracing will be the predominant way ray tracing gets done, and therefore games will primarily optimize for AMD's method over Nvidia's... at least whenever Nvidia doesn't throw a dump truck of cash at developers and hold a gun to their heads to use black-box tech like GameWorks... Something tells me Nvidia deals with developers the way the Mexican cartels deal with bribing government officials: "silver or lead," haha.

I'm just imagining the head of a development team calling 911:
911 Operator: "What's the problem this evening?"

Developer: "My family had a package delivered today that had an Nvidia video card in it and a .357 round...and now there's some guy wearing a leather jacket snooping around the bushes outside of my house!"

911 Operator: "OK, can you describe the individual so the police know who they're looking for?"

Developer: "Yeah, OK.... Um.... He's probably about 5'1" tall, has a $5 haircut and he's ranting and raving like a madman.... Something about 'buying more is saving more', he might be mentally unstable."

911 Operator: "OK, you said he's wearing a leather jacket; what else is he wearing?"

Developer: "He's just wearing the leather jacket and a pair of knee-high construction boots, but other than that, he's completely naked from the waist down."

911 Operator: "We'll have someone there immediately."
Trying to belittle the head of a corporation that currently beats your beloved company on every level does not change the actual facts, no matter how hard you try. Raytracing is here to stay; lots of games use it, and even more are going to. Both Intel and AMD are also getting on the raytracing train with their future products, so it is the future. This leak shows pretty clearly that raytracing is going to be a big focus point of Ampere. We've heard many attempts to joke about Huang's leather jackets etc., and they don't achieve anything except showing you've run out of arguments. Let's just stick to the facts and the topic of this article - Ampere specs - not leather jackets :D
 
Joined
Nov 6, 2016
Messages
1,752 (0.59/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red team has much confidence in RDNA2 / Big Navi. I for one am really curious just how much Nvidia is going to push raytracing forward. The rumours about a 4x increase could be true, given the TDP and classic CUDA core counts from this leak. Maybe even more than that? Can't wait to see.

What are you talking about? The reason he was bringing up AMD nodes is that someone literally made the completely false claim that Nvidia had 5 nm reserved to the point of blocking AMD out, and people were correcting him. That's all; it seems straightforward, with no ulterior motives, at least to me.

Trying to belittle the head of a corporation that currently beats your beloved company on every level does not change the actual facts, no matter how hard you try. Raytracing is here to stay; lots of games use it, and even more are going to. Both Intel and AMD are also getting on the raytracing train with their future products, so it is the future. This leak shows pretty clearly that raytracing is going to be a big focus point of Ampere. We've heard many attempts to joke about Huang's leather jackets etc., and they don't achieve anything except showing you've run out of arguments. Let's just stick to the facts and the topic of this article - Ampere specs - not leather jackets :D
Dude, it's a joke to lighten the mood, and instead of laughing, you decided to somehow take it personally again... Also, why does making a joke about Nvidia automatically make someone an AMD fanboy? I don't follow how that works... Are there only two choices in this world, either be personally offended for Nvidia or be an AMD fanboy? How about I have no loyalty and buy whatever offers the best value at the price I can afford? Is that OK with you, or am I forced into this false binary you've imagined?

Never would I have imagined that a light-hearted joke to cut the tension, directed at someone a million miles away from this forum, would result in someone here being personally offended and then attacking me personally, when I did no such thing. Why was it necessary to make it personal?

"ran out of arguments"? What argument was I making? I believe I didn't make a single argument in this entire forum, and it seems like you're imagining a fight that doesn't exist.... I'm sincerely just trying to understand what occurred here, and why you think I'm engaging in combat when all I did was make a joke to make people laugh, and haven't attacked anyone or called anyone out. So far the only thing I did was try and point out to someone that people were talking about AMD process nodes to correct someone else who had brought it up first and that made a false statement about Nvidia reserving all the 5nm production when there's numerous, reputable sources to the contrary .... Why are you attempting drag me into a fight with you for no reason I can discern?

Again, I'm not trying to be combative, and never have been. I'm just sincerely and genuinely trying to understand what's going on, to de-escalate the situation and find out what's occurred so it can be remedied.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Those titles are not any weaker or stronger with or without RT.

They were all there before we even knew RT was going to be in them. I'm hard passing on paying a premium for features I barely get to use and see little value in. It is just a graphical update, not much more. There is no new content for it, no new gameplay, nothing that gives me even a minute more fun than I used to have.
So are higher resolutions, shadows, more detailed models, and basically all the graphical advances over the years. All just graphical updates. By your logic, we could all just stay at 640x480. In reality all those advances, including raytracing, improve immersion. And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first (I agree on that), adding more immersion and realism is a huge boost when the gameplay is good.
 
D

Deleted member 185088

Guest
I was expecting the 3070 to comfortably beat the 2080 Ti given the new node and architecture, but it doesn't seem like it, and given nVidia's dominance in the market I don't expect them to price Ampere reasonably. I only hope AMD can do a Zen 2 for GPUs. Navi was very good (besides driver issues); it beat Turing while being cheaper, but the performance offered was a bit low. It would've been better to have a 5800 XT comparable to a 2080/S for $100 less.
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
So are higher resolutions, shadows, more detailed models, and basically all the graphical advances over the years. All just graphical updates. By your logic, we could all just stay at 640x480. In reality all those advances, including raytracing, improve immersion. And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first (I agree on that), adding more immersion and realism is a huge boost when the gameplay is good.

Blanket statements. Not interested in this level of debate, sorry. Realism and immersion were achieved prior to RT. It is not necessary, except to feed an industry that is running into the limits of mainstream resolutions. Which underlines my statement: I can do just fine riding this 1080 for a few more years - a 2016 card that still hits 120 fps at 1080p. That is why RT exists. Don't kid yourself. Nothing is stopping me from jumping in when there IS actual content for it that CAN actually run on something that doesn't need a nuclear fission plant and matching cooling.

Smart companies create new demand to feed their business. It's always been this way. It's up to us to tell them to stick it back in their behind for further refinement... or we can jump on it like gullible fools and pay their premium for them. In hindsight everyone can agree that Turing was exactly that: early-adopter territory. Why is Ampere different? The content still isn't there, the price has increased yet again... explain the madness.

I didn't buy 300W AMD cards, and I don't plan to start now even if the color is different.

Well, these might still be inaccurate leaks, but 10 GB sounds weird to me too. As does the idea of having another 3080 variant with 20 GB.

I always thought that AMD's policy of making the 5500 with two memory sizes was stupid: 4 GB is definitely not enough for some games, while 8 GB will never be fully utilized by such a weak GPU with low bandwidth. I prefer the manufacturer to pick the optimal memory size for the SKU and stick with that.

And there's a good chance that 10 GB might not be enough for 4K in two years.

The whole thing looks like a complete failure if you ask me, and Turing was no different. Too pricey and bulky, with too little to show for it to justify that price.

I think it's very likely we will see a quick SUPER update to this lineup. And it could be big: if this truly is all down to foundry issues, an update might fix things big time. Fix, there, I said it, because this looks broken af.
 
Joined
Jan 6, 2013
Messages
81 (0.02/day)
I was hoping for DisplayPort 2.0 on these cards. Disappointing, because these cards will be able to do high frame rates at 4K.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Why would the 3090 have 24 GB if 10 were enough...
Well, these might still be inaccurate leaks, but 10 GB sounds weird to me too. As does the idea of having another 3080 variant with 20 GB.

I always thought that AMD's policy of making the 5500 with two memory sizes was stupid: 4 GB is definitely not enough for some games, while 8 GB will never be fully utilized by such a weak GPU with low bandwidth. I prefer the manufacturer to pick the optimal memory size for the SKU and stick with that.

And there's a good chance that 10 GB might not be enough for 4K in two years.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,816 (1.71/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
So, 350 W max; with the TDP limit raised, the card could get close to 400+ watts OCed.
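(That estimate is just the power-limit slider applied to the leaked 350 W figure; the +15% offset in this sketch is a typical AIB slider range, assumed for illustration, not a confirmed spec.)

```python
# Board power with a raised power limit. 350 W is the leaked TDP;
# the +15% slider range is an assumption, not a confirmed spec.
def oc_power_w(stock_tdp_w, power_limit_pct):
    """Estimated board power with the power limit raised by a percentage."""
    return stock_tdp_w * (1 + power_limit_pct / 100)

print(f"{oc_power_w(350, 15):.1f} W")  # 402.5 W -> "close to 400+ watts OCed"
```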
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Well, that's the point: "do you 'need' X" is a difficult question to answer. Now you're quantifying it, now we can talk about something.
Do you need more than 10 GB of VRAM? Well, yeah, if a game developer wants to stuff massive, high-quality textures onto your 4K-or-higher-resolution screen, you do.
Why would the 3090 have 24 GB if 10 were enough...

The flagship isn't the 3080, but it's close enough, really; this is the first time we have a 3090 as a clear bracket above. Before this it was more 50 - 60 - 70 - 80 and then an 80 special edition.
The 80 is way up there, too far up there to have a pedestrian 10 GB of VRAM; today I'd give that to the 3060, honestly. The 1060 had 6...

No clue what your reaction to "the third line" is about.
Do I really have to answer why a flagship has 24 GB while another card down the stack has less? You've been here long enough... think about it. :)

But it ISN'T the flagship; there isn't a 'close enough'. 10 GB = pedestrian, lolololol. It's 25% more than the 2080... and a mere 1 GB less than the 2080 Ti. I'd call it an improvement... especially at the settings where it is intended to play games. ;)

That is a reaction to the third line of the post I quoted. Premature is premature. ;)
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
What are you talking about? The reason he was bringing up AMD nodes is that an obvious Nvidia fan literally made the completely false claim that Nvidia had 5 nm reserved to the point of blocking AMD out, and people were correcting him. Do you even bother reading things before commenting on them?
Oh really? And this is what?
Ouch, it does indeed say boost, but things really don't add up. AMD managed to boost their first 7 nm iteration to 1.9 GHz; why would Nvidia only manage 1.7? I would believe this only if it's on Samsung 8 nm and that node really isn't good...

He was not the one starting the process nodes argument. Let's just end it right here and move on.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
I was expecting the 3070 to comfortably beat the 2080 Ti given the new node and architecture, but it doesn't seem like it, and given nVidia's dominance in the market I don't expect them to price Ampere reasonably. I only hope AMD can do a Zen 2 for GPUs. Navi was very good (besides driver issues); it beat Turing while being cheaper, but the performance offered was a bit low. It would've been better to have a 5800 XT comparable to a 2080/S for $100 less.
Well, looking at memory bandwidths, the 3070 might tie or beat a 2080 Super in pure rasterized performance, and that's a very decent showing.

Where the 3070 will most probably beat the 2080 Ti handily is in ray tracing...
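(A quick sketch of the bandwidth math behind that guess. The 3070 figures below are the rumored specs circulating with this leak and should be treated as assumptions; the Turing numbers are the official ones.)

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 14.0))  # rumored RTX 3070 (GDDR6): 448.0
print(bandwidth_gbs(256, 15.5))  # RTX 2080 Super (GDDR6):   496.0
print(bandwidth_gbs(352, 14.0))  # RTX 2080 Ti (GDDR6):      616.0
```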
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.

2 year old arguments?
Impressive.


raytracing performance sucks?
It doesn't matter, as we are not talking about performance, but only about power consumption.

So... why would "RT cores" (of which we now have "more") consume power when NOT running one of the handful of RT-enabled games?
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
The most interesting card is the 3070 - if it's a full chip, nothing gimped, and on 7 nm EUV.
 
Joined
Sep 15, 2016
Messages
484 (0.16/day)
The most interesting thing about this leak is that the 3090 will be sold by AIB partners and won't be exclusive like the previous-generation Titans.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Blanket statements. Not interested in this level of debate, sorry. Realism and immersion were achieved prior to RT. It is not necessary, except to feed an industry that is running into the limits of mainstream resolutions. Which underlines my statement: I can do just fine riding this 1080 for a few more years - a 2016 card that still hits 120 fps at 1080p. That is why RT exists. Don't kid yourself. Nothing is stopping me from jumping in when there IS actual content for it that CAN actually run on something that doesn't need a nuclear fission plant and matching cooling.

You started it, were proven wrong, and now you're running away. The evolution of computer graphics is incremental, and there's no denying that. Trying to downplay a significant advance in technology just because your favourite company is behind is silly. You started this level of debate by trying to downplay RT with false claims. If you don't want to have such debates, don't start them next time, and stay on topic.

Jinxed said:
2 year old arguments?

Impressive.

Yes, AMD fans have been trying to downplay raytracing since Turing GPUs were released. So yes, 2 years LOL.
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
The whole thing looks like a complete failure if you ask me, and Turing was no different.
Who replaced Vayra with one of our fanboys? :(

NV is penalized by some for leading the way. If it (RT) weren't the way forward, AMD wouldn't follow. Pricing and power use are likely to be out of line, but I'd imagine most any title will now be playable with full RT... at least that is the hope, given the rumored power consumption. Couple that with consoles getting RT titles, and there is a better chance than ever to see a lot more momentum on that front, no? Such absolutes from you... :( ... not used to that.
 
Joined
Sep 17, 2014
Messages
22,468 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You started it, were proven wrong, and now you're running away. The evolution of computer graphics is incremental, and there's no denying that. Trying to downplay a significant advance in technology just because your favourite company is behind is silly. You started this level of debate by trying to downplay RT with false claims. If you don't want to have such debates, don't start them next time, and stay on topic.

No, I'm not running away; I'm making a choice as a consumer. It's really that simple. If AMD produces something similar, it's a no-buy in either camp.

I agree about the evolution of computer graphics. It's about picking your battles. Remember 3dfx?

Who replaced Vayra with one of our fanboys? :(

NV is penalized by some for leading the way. If it (RT) weren't the way forward, AMD wouldn't follow. Pricing and power use are likely to be out of line, but I'd imagine most any title will now be playable with full RT... at least that is the hope, given the rumored power consumption. Couple that with consoles getting RT titles, and there is a better chance than ever to see a lot more momentum on that front, no? Such absolutes from you... :( ... not used to that.

Wait and see mode for me, simple. I care about content before hardware.

Let's see that momentum first. So far the consoles aren't out, and we know it takes a while for content to mature on them. Patience, young gwasshoppa.

The most interesting card is the 3070 - if it's a full chip, nothing gimped, and on 7 nm EUV.

The last hope...
 
Joined
Apr 10, 2020
Messages
504 (0.30/day)
I'll postpone my buying decision until RDNA2 comes out, plus a few months for prices to settle. I'll go with AMD if RDNA2 brings more value to the table; if not, I'll opt for Nvidia, or skip this generation altogether if the price/performance ratios of both GPUs suck. The 1080 Ti is still a decent GPU, after all, and I really don't care about ray tracing at this point in time. Maybe that will change in 2-3 years.
 