
Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed

Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Just clicked a few links, I see "half".
Are you joking with me? lol, all sources claim it's a brand-spanking-new design that will feature some exceptional things, including VRS and RT.
At the end of the day it's rumours and speculation, but note that a couple of those sources have been right before.
 

Otonel88

New Member
Joined
Oct 3, 2019
Messages
27 (0.01/day)
When is Nvidia expected to announce the new Ampere cards?
And AMD with Big Navi?
Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release)
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
When is Nvidia expected to announce the new Ampere cards?
And AMD with Big Navi?
Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release)
Couldn't agree more, lol
 
Joined
Dec 31, 2009
Messages
19,372 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Why not check the internet? There's a wealth of speculation and rumours for both Big Navi / RDNA2 and Ampere. lol

*snip*
From a link (or two) of yours...
AMD claims up to a 50 percent improvement in performance for the same power consumption.

And with Nvidia's Ampere claiming the same thing (50% more performance for the same power)... if we assume both are true, that leaves the landscape in a similar position, no?

I wonder how AMD is going to accomplish this on the same node with just an arch tweak. Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.
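To put a number on that "same place" point, here is a minimal sketch (Python) with made-up baseline figures, not benchmarks, showing that if both vendors scale performance-per-watt by the same 1.5x, the relative gap between them doesn't move:

```python
# Illustrative only: baseline "performance at equal power" units are invented, not benchmarks.
baseline = {"AMD": 100.0, "NVIDIA": 130.0}
claimed_gain = 1.5  # both vendors are rumored to claim +50% perf per watt

next_gen = {vendor: perf * claimed_gain for vendor, perf in baseline.items()}

ratio_before = baseline["NVIDIA"] / baseline["AMD"]
ratio_after = next_gen["NVIDIA"] / next_gen["AMD"]

print(ratio_before, ratio_after)  # both print 1.3 -- the relative standing is unchanged
```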
 
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I wonder how AMD is going to accomplish this on the same node with just an arch tweak.

Aren't they moving to EUV? That should give them something... or not. Or RDNA2 would have to be radically different, like Zen 3 is purported to be. But that doesn't seem to make sense at this point. Or HBM, anyone? lol. Get those power savings somehow.

Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.

They could be hedging on RTX gains this year seeing as they likely know the steep ass hill AMD has to climb.

Lots of ifs, lots of coulda, lots of we don't really know. Dash of salt.....voila!
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
From a link (or two) of yours...


And with Nvidia's Ampere claiming the same thing (50% more performance for the same power)... if we assume both are true, that leaves the landscape in a similar position, no?

I wonder how AMD is going to accomplish this on the same node with just an arch tweak. Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.
But you assume AMD is going to do nothing more than an arch tweak, whereas the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet about, just like how they kept Zen quiet.
 
Joined
Dec 31, 2009
Messages
19,372 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
But you assume AMD is going to do nothing more than an arch tweak, whereas the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet about, just like how they kept Zen quiet.
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
Agreed. :clap::peace:
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
If we follow the rumors, Ampere won't be Ampere, Zen 3 will suck, Cascade Lake will mop the floor with it, and the new NV graphics might not come out at all.
Well, God knows what is true or not, and if rumors are all we have, then we users and customers know nothing about the actual product and will know nothing until it shows up. It is in the companies' best interest not to reveal anything serious or meaningful that would jeopardize the product in any way. These days the rumors are manufactured, not leaked.
This new NV graphics card will be good. Why wouldn't it be? RT is being pushed forward, it's been our "leather jacket dude's" dream, so now let's wait and see what the price will be, how this plays out against AMD, and what the latter shows with RDNA2. BUT, if anyone tries to justify the price being jacked up again because there are more RT cores in the GPU, or because it's the high end of the highest end, then please leave the planet and never come back.
 
Joined
Feb 26, 2016
Messages
551 (0.17/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Monsgeek M5W w/ Cherry MX Silent Black RGBs
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
Ya I remember years back before I tried my first SLI setup (2x 780 Ti) I was scared to death about the stuttering people kept talking about. I've run 780 Ti SLI, Titan X (maxwell) SLI, and now 1080 Ti SLI...I haven't had a lick of stuttering on any games I play. I mean zippo. I generally run vsync @ 60fps in 4k and my games have been butter smooth. If I ever feel like a game is running right around that 60fps limit for my cards and may fluctuate (which can cause input lag in those rare situations), then I switch out of true vsync and enable adaptive vsync at the driver level and that will take care of any issues.

My experiences with multi gpu have been great. It's obviously not for everyone given that the scaling is never 1:1, and in some cases not even close, but if you have tons of cash and / or are just a hardware enthusiast that wants maximum image quality and / or framerates, it's something I'd recommend people try.



Ya I'd like to think they'll get the prices back down into the realm of reason, but I am skeptical with NVidia. :) I may need to just plan on buying 2nd hand Turing once the prices get down into reasonable territory.
The 1070 Ti was actually the best performance/$ card of the 10 series, with the 1080 Ti right behind. Take a look at this: https://docs.google.com/spreadsheets/d/1-JXBPMRZtUx0q0BMMa8Nzm8YZiyPGkjeEZfZ2DOr8y8/edit?usp=sharing This is my database for NVIDIA graphics cards, and I also track launch MSRPs and price/TFLOPS (for this context, it's directly relevant when comparing GPUs of the same architecture). I only use launch MSRPs, not adjusted MSRPs like the GTX 1080, which launched at $699 but was later dropped by $100 or $200, I don't remember the exact cut.
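For anyone who wants to reproduce that kind of comparison, here is a minimal sketch (Python) of a price/TFLOPS calculation in the spirit of that spreadsheet; the MSRP and TFLOPS figures below are approximate launch numbers, so treat them as illustrative rather than authoritative:

```python
# Approximate launch MSRP (USD) and FP32 TFLOPS; figures are illustrative, not authoritative.
cards = {
    "GTX 1070 Ti": {"msrp": 449, "tflops": 8.2},
    "GTX 1080 Ti": {"msrp": 699, "tflops": 11.3},
}

for name, c in cards.items():
    # Lower $/TFLOPS means better value; only meaningful within the same architecture.
    print(f"{name}: {c['msrp'] / c['tflops']:.1f} $/TFLOPS")
```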

Would also be nice to hear some rumours about the improvements to the RTX hardware and tensor cores too; shader counts alone aren't going to tell us much about performance in isolation.

@Berfs1 I can't see any of that panning out; since when did Nvidia improve on their best and then undercut it? So I think the prices we have now plus $50 minimum, at least at launch, with Supers or something coming out later once stocks of the 2xxx series run out, makes sense tbf.
Yea, the pricing was a rumor, but I can believe that the performance will be substantially better. Also, since you mentioned the Super cards, here are a few facts not a lot of people know about them: they have better price/performance, but lower performance/watt than the non-Supers. Not many people have touched on that, so I just wanted to put it out there that performance/watt is actually worse on the Supers than the non-Supers. Everyone wants the most performance at the lowest cost and lowest power; you can only get a good balance of two of those, and you will never get max performance at the lowest power.
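As a quick illustration of that two-out-of-three trade-off, here is a tiny sketch (Python) with invented numbers for a hypothetical non-Super and Super card; the only point is that a card can win on price/performance while losing on performance/watt:

```python
# Hypothetical cards; the numbers are made up purely to illustrate the trade-off.
cards = {
    "non-Super": {"perf": 100, "price": 500, "power_w": 175},
    "Super":     {"perf": 115, "price": 500, "power_w": 215},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf/$, "
          f"{c['perf'] / c['power_w']:.3f} perf/W")
# The Super wins perf/$ at the same price, but loses perf/W due to the higher power draw.
```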
 
Joined
Dec 28, 2006
Messages
4,378 (0.67/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
That 320-bit bus looks weird: by allowing your xx80-grade card to have such high memory bandwidth, you start to cripple your xx80 Ti's performance at higher resolutions (unless it uses a 512-bit bus or HBM2).
Though I'd be happy to be wrong, as better-grade products for the same or cheaper price are always welcome.

The last card with a 320-bit bus was the 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon and the controller is really 384-bit for the Ti.
 
Joined
Oct 25, 2005
Messages
193 (0.03/day)
Location
Long Island, NY
Processor 9700K
Motherboard Asrock Z390 Phantom Gaming-ITX/ac
Cooling Alpenfohn Black Ridge
Memory 32GB Micron VLP 18ADF2G72AZ-3G2E1
Video Card(s) 3090 FE
Display(s) Samsung G9 NEO
Case Formd T1
Power Supply Corsair SF750
As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again!

All the game developers, bar maybe a few, suck! Tired of quarter-baked games filled with bugs or heavily monetized with microtransactions...
 
Joined
Feb 26, 2016
Messages
551 (0.17/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Monsgeek M5W w/ Cherry MX Silent Black RGBs
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
The last card with a 320-bit bus was the 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon and the controller is really 384-bit for the Ti.
Not the last card to have a 320-bit bus; the GTX 470 had 320-bit, as did the GTX 560 Ti (448 cores) and the GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352-bit or 384-bit: the 1080 Ti and 2080 Ti had 352-bit, while the 780 Ti and 980 Ti had 384-bit, so who knows at this point lol. But I have a hunch the Ampere Titan may have 48GB of VRAM and the 3080 Ti 20GB/24GB, since the 80 Ti cards have always had half or nearly half the VRAM of the Titan cards, with the one exception being the Titan Z (which was a dual-GPU card).
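For context on why bus width drives those capacity guesses: each 32-bit GDDR6 channel normally carries one 1GB or 2GB package, and a clamshell layout doubles that. Here is a small sketch (Python) of that arithmetic; the capacities are just what the math allows, not confirmed specs:

```python
# Each 32-bit GDDR6 channel typically carries one 8Gb (1GB) or 16Gb (2GB) package;
# clamshell mode puts two packages per channel, doubling capacity.
def vram_options_gb(bus_width_bits):
    channels = bus_width_bits // 32
    options = set()
    for gb_per_package in (1, 2):
        options.add(channels * gb_per_package)       # one package per channel
        options.add(channels * gb_per_package * 2)   # clamshell: two packages per channel
    return sorted(options)

for bus in (320, 352, 384):
    print(f"{bus}-bit -> {vram_options_gb(bus)} GB")
# 384-bit -> [12, 24, 48] GB, which is where the 24GB/48GB Titan-class guesses come from.
```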
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Not the last card to have a 320-bit bus; the GTX 470 had 320-bit, as did the GTX 560 Ti (448 cores) and the GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352-bit or 384-bit: the 1080 Ti and 2080 Ti had 352-bit, while the 780 Ti and 980 Ti had 384-bit, so who knows at this point lol. But I have a hunch the Ampere Titan may have 48GB of VRAM and the 3080 Ti 20GB/24GB, since the 80 Ti cards have always had half or nearly half the VRAM of the Titan cards, with the one exception being the Titan Z (which was a dual-GPU card).
Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller but multiple 64-bit controllers, and even those can be partially disabled to get 32-bit increments.

This rumor does, however, contain two specific and odd details:
1) A different SM count per GPC (Nvidia usually keeps this the same within an architecture)
2) A separate die for a chip with a 320-bit memory controller
These are two pieces of information that are either completely true or completely wrong, and anyone under NDA would immediately know whether this entire rumor is true or just BS.
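To make the 32-bit-increment point concrete, here is a minimal sketch (Python, purely illustrative) of how a die built from six 64-bit controllers could expose 384-, 352-, or 320-bit buses depending on what gets disabled:

```python
# Illustrative: a GPU with six 64-bit memory controllers (384-bit total).
# Each controller can run fully enabled (64-bit), half-disabled (32-bit), or turned off.
FULL_CONTROLLERS = 6

def effective_bus_width(half_disabled, fully_disabled):
    active_full = FULL_CONTROLLERS - half_disabled - fully_disabled
    return active_full * 64 + half_disabled * 32

print(effective_bus_width(0, 0))  # 384-bit: everything enabled
print(effective_bus_width(1, 0))  # 352-bit: one controller running at half width
print(effective_bus_width(0, 1))  # 320-bit: one controller fully disabled
```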

As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again!

All the game developers, bar maybe a few, suck! Tired of quarter-baked games filled with bugs or heavily monetized with microtransactions...
Why upgrade so often? Upgrade when you need to.

But you're right about game developers; high quality work is more the exception than the norm. It's not Nvidia that's "killing" gaming, it's poor software.
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller but multiple 64-bit controllers, and even those can be partially disabled to get 32-bit increments.

This rumor does, however, contain two specific and odd details:
1) A different SM count per GPC (Nvidia usually keeps this the same within an architecture)
2) A separate die for a chip with a 320-bit memory controller
These are two pieces of information that are either completely true or completely wrong, and anyone under NDA would immediately know whether this entire rumor is true or just BS.


Why upgrade so often? Upgrade when you need to.

But you're right about game developers; high quality work is more the exception than the norm. It's not Nvidia that's "killing" gaming, it's poor software.
It's not the developers' fault; it's the gaming companies' boards of directors that conjure up techniques to squeeze as much money as possible into their pockets while greatly limiting development funding.

They want less money invested into proper game development while still trying to make huge profits.

Nowadays games are popped out too quickly with lots of issues and bugs. It's really too bad, and hopefully things change.
 
Joined
Oct 25, 2005
Messages
193 (0.03/day)
Location
Long Island, NY
Processor 9700K
Motherboard Asrock Z390 Phantom Gaming-ITX/ac
Cooling Alpenfohn Black Ridge
Memory 32GB Micron VLP 18ADF2G72AZ-3G2E1
Video Card(s) 3090 FE
Display(s) Samsung G9 NEO
Case Formd T1
Power Supply Corsair SF750
Why upgrade so often? Upgrade when you need to.

For no logical reason. I just like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.
 
Joined
Dec 28, 2006
Messages
4,378 (0.67/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
For no logical reason. I just like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.

I would say the average gamer is on an RX 580 or GTX 1060, and that's the middle road they're going to target.
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
I would say the average gamer is on an RX 580 or GTX 1060, and that's the middle road they're going to target.
I have the following setup (also in the drop-down menu under System Specs), and I have absolutely NO issues playing at Ultra High picture quality settings at 1440p. My FPS in games like DOOM, RAGE 2, the entire METRO series, Wolfenstein: The New Order, The Old Blood, II: The New Colossus, Youngblood, the Resident Evil 2 remake, Resident Evil 7: Biohazard, Mad Max, Prey, Dying Light, etc. ranges from 60 FPS up to 100 FPS. I've never experienced any slowdowns in a game except in Dying Light a couple of times: when I'm picking up stuff too quickly, FPS drops to about 50 and then goes back up to my average of 75 for that game. DOOM runs at about 70-100 FPS depending on the area, and the same goes for the Wolfenstein games for the most part.

Specs:
* Sapphire Radeon RX 580 8GB Nitro+ SE
* AMD Ryzen 7 1700X @ stock
* G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200
* Asus 27" (MG278Q) 144Hz WQHD 1440p

My next upgrade:
Radeon RX 6700XT 8GB
AMD Ryzen 7 4800X
Same Ram & Monitor.
X670 chipset mobo Socket AM4 etc., I can dream right? lol
End of my Ramblings....
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
It's not the developers' fault; it's the gaming companies' boards of directors that conjure up techniques to squeeze as much money as possible into their pockets while greatly limiting development funding.

They want less money invested into proper game development while still trying to make huge profits.

Nowadays games are popped out too quickly with lots of issues and bugs. It's really too bad, and hopefully things change.
Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.

But as I've mentioned in other threads before, it's often not only about money spent, but rather about "quick turnover". These companies' boards often want rapid product cycles rather than spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally do proper testing before shipping.
 
Joined
Sep 15, 2011
Messages
6,762 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Estimated or confirmed release date?
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
I want to tell you something that nobody mentions about multi-card setups: input delay.
People have to understand that SLI "frames" are not really frames :x
Think about it: what is the point of having 2x the FPS but 3x the input delay?!

Yeah, it's true, all the games have been really bad for the last 10 years :x
 
Joined
Jul 19, 2011
Messages
540 (0.11/day)
Well, the 3070 is a 2080S on steroids; it's 60% faster than the 2070, with 60% of the chip size at the same power, higher clocks, and a lower price... than the 2080S.

And how will the 3070 get over the same bandwidth limitations as the 2080S if it uses the same 256-bit GDDR6 memory? The 16 Gbps stuff is already quite expensive. It's been shown that even a standard 2080 performs the same as the 2080S when the memory is set to the same speed, despite the SM deficit.

This is why the 3080 is getting a 320-bit bus (otherwise it would perform the same as the 3070), while saving room for a 3080 Ti, which will most likely get a full 384-bit bus.
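For what those bus widths mean in raw numbers, here's a quick back-of-the-envelope sketch (Python) using the 16 Gbps GDDR6 figure mentioned above; the bus widths are the rumored ones, not confirmed:

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

for bus in (256, 320, 384):
    print(f"{bus}-bit @ 16 Gbps GDDR6: {bandwidth_gb_s(16, bus):.0f} GB/s")
# 256-bit -> 512 GB/s, 320-bit -> 640 GB/s, 384-bit -> 768 GB/s
```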
 
Joined
Sep 15, 2011
Messages
6,762 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Btw, when is Intel supposed to launch their new GPUs?
 
Joined
Mar 23, 2005
Messages
4,092 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.

But as I've mentioned in other threads before, it's often not only about money spent, but rather about "quick turnover". These companies' boards often want rapid product cycles rather than spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally do proper testing before shipping.
That is exactly what I meant to say, lol.

Btw, when is Intel supposed to launch their new GPUs?
If Intel is serious about discrete graphics, I can see them matching both AMD & Nvidia in about three years' time. As for the release, I heard sometime in 2020, but it's only going to barely handle 1080p, last I read.
 