
Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
And there is zero chance that, after more than 15 years, they simply gave up because it just didn't make sense to keep paying for testing, support and updated profiles for games both new and a decade old. Right?

I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from nvidia because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.
And because of its enormous power consumption, nvidia was always beaten by AMD's better performance-per-watt solutions.

Look at the Radeon HD 3870 and Radeon HD 3870 x2.
Then Radeon HD 4870 and Radeon HD 4870 x2...

Today the CF technology is much better than it was 15 years ago, and if they push for it, they can make it work.
Even if they spread those R&D costs across the entire product line - the lineup is extremely overpriced today anyway, and can absorb those expenses.

If only I were the manager in charge at AMD...
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Why would anyone not want to see multi-GPU configurations healthy, up and running?
Because in this case it means using more than one oversized GPU.

I would like to see "chiplet"-like multi-chip solutions though (less wasted silicon, cheaper to produce, etc.).
 

bug

Joined
May 22, 2015
Messages
13,844 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from nvidia because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.
And because of its enormous power consumption, nvidia was always beaten by AMD's better performance-per-watt solutions.

Look at the Radeon HD 3870 and Radeon HD 3870 x2.
Then Radeon HD 4870 and Radeon HD 4870 x2...

Today the CF technology is much better than it was 15 years ago, and if they push for it, they can make it work.
Even if they spread those R&D costs across the entire product line - the lineup is extremely overpriced today anyway, and can absorb those expenses.

If only I were the manager in charge at AMD...
Are you aware multi-GPU support still exists? It's built right into DirectX now. What changed is AMD and Nvidia gave up doing all the work. Game engine developers are still able to offer you multi-GPU. But they also aren't eager to do the work, because, as I have already told you, the work doesn't pay off.
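For what it's worth, here is a rough sketch (my own illustration, not something AMD or Nvidia ship) of what "built into DirectX" means in practice: D3D12 will happily enumerate every adapter in the system and report how many linked nodes a device exposes, but everything past that point (splitting the frame, copying resources between GPUs, keeping them in sync) is the engine's job. Assumes Windows with the D3D12/DXGI headers, linked against d3d12.lib and dxgi.lib.

Code:
// Hypothetical example: enumerate adapters and check for linked-node
// (multi-GPU) support under D3D12. Link with d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            // GetNodeCount() > 1 means the driver exposes linked GPUs (the old
            // SLI/CF style); separate cards just show up as separate adapters.
            // Either way, the application has to schedule work across them itself.
            wprintf(L"Adapter %u: %s, nodes: %u\n",
                    i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}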
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
Are you aware multi-GPU support still exists? It's built right into DirectX now. What changed is AMD and Nvidia gave up doing all the work. Game engine developers are still able to offer you multi-GPU. But they also aren't eager to do the work, because, as I have already told you, the work doesn't pay off.

Keeping flagships up and running is a very good advertisement policy for the companies.
Radeon RX 6950 XT is the current flagship but it is slower than the competition.

Imagine if AMD had kept its original "sweet-spot" strategy and released the ultimate dual-GPU Radeon RX 6995X2 with two Radeon RX 6800 GPUs that would ultimately destroy any nvidia offering... Would it not be better for AMD?
Of course...

But who thinks about it?
 
Joined
Oct 27, 2009
Messages
1,195 (0.22/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
Yeah, why were they expecting it to leave?
 
Joined
Dec 5, 2017
Messages
157 (0.06/day)
Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you would probably get at least a 50-60% FPS uplift on average...


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Well, the answers are obvious and have already been given. How do you split the workload between two entirely separate compute engines? You either split the frame or do alternate frame rendering. Both introduce lag and stutter that is completely unavoidable and inherent to the design. I too miss the days of CFX/SLI, but at the end of the day, it was not worth the investment: maybe it was for a few people running quadfire 7970s with an 8000x2560 eyefinity setup, but not all the industry players who had to support the crap, or the market as a whole.

The whole point of the chiplet direction is that the separate dice no longer function as separate processors. That's not the case yet for the HPC cards, where those scheduling concerns don't play into the massively parallel workloads. But what you want is a single command front-end along with some other modifications that allow the compute chiplets to work in tandem with a single master chiplet that interfaces with the rest of the system. This results in an entirely software agnostic implementation where the MCM is seen as one GPU by the OS/driver.
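As a toy illustration of why AFR in particular can never be smooth (made-up 20 ms frame times, not a measurement): each GPU still spends a full frame time on its frame, so input latency never drops, and without explicit frame pacing the finished frames arrive in bursts.

Code:
// Hypothetical AFR timeline with two GPUs and a fixed 20 ms render time per frame.
#include <algorithm>
#include <cstdio>

int main()
{
    const double frame_ms = 20.0;    // assumed per-GPU render time (50 fps on one GPU)
    double gpu_free[2] = {0.0, 0.0}; // time at which each GPU becomes idle
    double last_present = 0.0;

    for (int frame = 0; frame < 8; ++frame)
    {
        int gpu = frame % 2;            // AFR: even frames on GPU0, odd frames on GPU1
        double start  = gpu_free[gpu];  // frames submitted as fast as possible, no pacing
        double finish = start + frame_ms;
        gpu_free[gpu] = finish;

        std::printf("frame %d on GPU%d: latency %.0f ms, gap since previous present %.0f ms\n",
                    frame, gpu, finish - start, finish - last_present);
        last_present = std::max(last_present, finish);
    }
    // Average rate doubles (two frames per 20 ms), but presents arrive in pairs
    // (gaps of ~0 ms and ~20 ms) and every frame still takes the full 20 ms,
    // which is exactly the micro-stutter and latency problem described above.
    return 0;
}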
Yes, yes. :rolleyes:

The market has two options:
1. Top-down: a forced kill-off that ignores the users' preferences and likes;
2. Bottom-up: the users force the game developers to get their act together and begin to resurrect CF/multi-GPU support:

AMD Radeon RX 6800 XT tested in Multi-GPU configuration - VideoCardz.com
That simply will not happen. Putting the onus on the game developers with mGPU led to its own death. A lot of people were saying things like, "why would devs bother investing for RTX so 5% of the market can get some fancy reflections at 30 FPS for a few minutes until they turn it back off". That same line of reasoning applies several times over in this case. The benchmarks you show are almost all of the benchmarks that exist because no games offer the support.

I would also like to remind you that, quite ironically, your benchmarks are looking at average FPS, primarily in games that are already above 144fps. Is 230 avg 120 min better than 160 avg 120 min? I would say no...
 
Joined
Apr 30, 2020
Messages
1,001 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
AMD's current cards support more mGPU games than Nvidia's. It has to do with how driver flags from DX11 are implemented in DX12 games; certain game engines still look for those flags even on DX12.

Every generation so far, both manufacturers have only managed an increase of about 50-60% over their previous GPU. Consider that going from 1080p to 2160p is a 400% increase in pixel count, since you double both dimensions of the frame. It is pointless to think that a single card in the next 3-4 years will be able to do 2160p at 120 fps or higher with that limited increase in performance. Mathematically it does not add up. And I haven't even added in what ray tracing in games takes away in performance.
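Rough numbers behind that (assumed figures from the post, not benchmarks): 2160p is 4x the pixels of 1080p, and 120 fps instead of 60 roughly doubles the load again, so at 50-60% per generation you need somewhere between three and five generations, assuming performance even scales linearly with pixel count.

Code:
// Back-of-the-envelope check: how many 50-60% generational jumps cover a 4x
// (4K@60) or 8x (4K@120) heavier target? Assumed numbers, ray tracing ignored.
#include <cmath>
#include <cstdio>

int main()
{
    const double pixel_ratio = (3840.0 * 2160.0) / (1920.0 * 1080.0); // = 4.0
    const double targets[]   = { pixel_ratio, pixel_ratio * 2.0 };    // 4K@60, 4K@120
    const double gains[]     = { 1.5, 1.6 };                          // per-gen uplift

    for (double target : targets)
        for (double gain : gains)
            std::printf("%.0fx heavier target at +%.0f%% per gen: %.1f generations\n",
                        target, (gain - 1.0) * 100.0,
                        std::log(target) / std::log(gain));
    // ~3.4 / ~3.0 generations for 4K@60, ~5.1 / ~4.4 for 4K@120.
    return 0;
}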
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you would probably get at least a 50-60% FPS uplift on average...


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Because the previous implementations honestly sucked: they were a hacked-in solution with a lot of complications.

VRAM didn't add up, it was broken in various game engines, and it required non-stop tweaks and updates from the GPU developers to stop it breaking titles, as well as precisely matching cards (yes, AMD let you vary that a little, by reducing performance).


DX12 was meant to fix that situation, but it never did.
 
Joined
Sep 17, 2014
Messages
22,722 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
AMD's current cards support more mGPU games than Nvidia's. It has to do with how driver flags from DX11 are implemented in DX12 games; certain game engines still look for those flags even on DX12.

Every generation so far, both manufacturers have only managed an increase of about 50-60% over their previous GPU. Consider that going from 1080p to 2160p is a 400% increase in pixel count, since you double both dimensions of the frame. It is pointless to think that a single card in the next 3-4 years will be able to do 2160p at 120 fps or higher with that limited increase in performance. Mathematically it does not add up. And I haven't even added in what ray tracing in games takes away in performance.

You do realize that in the SLI days (Kepler, for example) the gen-to-gen performance jump was in fact smaller? The 670 > 770 was +25%. The 970 was again about 30% faster.

Now fast forward to today. Maxwell > Pascal was a 45~50% perf jump per tier. Turing wasn't as big a jump, but still rivalled older generational jumps while adding hardware for RT. Ampere was again a major leap, albeit with a power increase; the generational performance increase has truly never been higher on GPUs. I'm not really mentioning AMD here because there is no consistency, but they've either followed suit or stalled.

It is in fact the polar opposite of what you're thinking. Right now, we've already had such major performance jumps that 4K has become feasible within two generations of it slowly gaining market share on monitors and TVs. Resolution isn't what kills your performance. New games do.

It's really about a bit more than API flags, btw. I ran 660s in SLI... it was horror. More often than not, the experience was either subpar or you just got half the performance you paid for, but STILL had the extra heat, noise, etc.

I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from nvidia because AMD's original strategy for "sweet spot" GPUs was to use X2 cards

If only I were the manager in charge at AMD...

Right. Remember this?


You'd be another Raja ;)
 
Joined
Feb 1, 2019
Messages
3,669 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Well, the answers are obvious and have already been given. How do you split the workload between two entirely separate compute engines? You either split the frame or do alternate frame rendering. Both introduce lag and stutter that is completely unavoidable and inherent to the design. I too miss the days of CFX/SLI, but at the end of the day, it was not worth the investment: maybe it was for a few people running quadfire 7970s with an 8000x2560 eyefinity setup, but not all the industry players who had to support the crap, or the market as a whole.

The whole point of the chiplet direction is that the separate dice no longer function as separate processors. That's not the case yet for the HPC cards, where those scheduling concerns don't play into the massively parallel workloads. But what you want is a single command front-end along with some other modifications that allow the compute chiplets to work in tandem with a single master chiplet that interfaces with the rest of the system. This results in an entirely software agnostic implementation where the MCM is seen as one GPU by the OS/driver.

That simply will not happen. Putting the onus on the game developers with mGPU led to its own death. A lot of people were saying things like, "why would devs bother investing for RTX so 5% of the market can get some fancy reflections at 30 FPS for a few minutes until they turn it back off". That same line of reasoning applies several times over in this case. The benchmarks you show are almost all of the benchmarks that exist because no games offer the support.

I would also like to remind you that, quite ironically, your benchmarks are looking at average FPS, primarily in games that are already above 144fps. Is 230 avg 120 min better than 160 avg 120 min? I would say no...
60 fps and I'm happy :).

SLI is inefficient, hard to get working properly, and would make GPUs even more scarce, so yep, I agree with the popular view. :)
 
Joined
Jun 11, 2017
Messages
283 (0.10/day)
Location
Montreal Canada
I'm not running an IPS 4K monitor. They are small, and smaller plus higher-res gets expensive. In the TV market you can find some sweet deals, but you have to really look at the specs. I bought a Samsung TV for like $299.00 in a Boxing Day sale. There was a 40-inch and even a 43-inch on sale for $100.00 less, but the 43-inch did not have the specs. The 40-inch supported HDMI 2.1 and a 120 Hz refresh rate, and it also supported 1.08 billion colors vs 16.7 million.
I have had it for 6 years now and 4K gaming on it is sweet. It also has 3 HDMI inputs, all supporting it, so I have my PC, PS4 Pro and original PS3 all plugged in.

Good site to check out is

You can get a 40 inch TV cheaper than any monitor and get almost the same results.

60 fps and I'm happy :).

SLI is inefficient, hard to get working properly, and would make GPUs even more scarce, so yep, I agree with the popular view. :)

SLI is easy to set up. The major problem I always found with SLI setups, back when I ran my own computer store, was that people would plug the cards in and think they had SLI right off the bat. One simple little click in the driver and voila, SLI working in all games.

I could get SLI working in games that did not even support it.

Look, just because you do not see SLI inside the game config does not mean it does not support it. There is no SLI config in Unreal 2003 or Quake 3 Arena. All SLI configuration was done in the driver, not the game.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.

SLI was dead a long, long time ago. It was more a problem causer than a solution, even when it was supported by developers and Nvidia.

The last dual-GPU gaming card from Nvidia was the Titan Z. That was 8 years ago. It was priced at a silly $3,000. That should have been a warning, to anyone paying at least a little attention, that SLI was going to be history.

The future is MCM.
 
Joined
Jun 11, 2017
Messages
283 (0.10/day)
Location
Montreal Canada
Just wait till 7680 × 4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 fps.

Oh, and yes, the 3090 has that feature where you turn on the AI deep learning to increase FPS. You know what it really does? It just cuts corners to produce those higher FPS. If it does not need to render long-distance items, it blurs them.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Just wait till 7680 × 4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 fps.

According to the Steam Hardware Survey the vast majority are still on 1080p. Only 2.6% report using 4K.

8K is not going to go mainstream for decades. If anyone plans to buy an 8K gaming monitor before then they will see it's just a money pit to game on and hardly worth it.
 
Joined
Sep 17, 2014
Messages
22,722 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Just wait till 7680 × 4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 fps.

Oh, and yes, the 3090 has that feature where you turn on the AI deep learning to increase FPS. You know what it really does? It just cuts corners to produce those higher FPS. If it does not need to render long-distance items, it blurs them.
Bigger numbers do not equal better games or gaming. PPI is a thing; most people game at a downscaled res, or a lower internal render res on 4K panels, and they own one for the simple fact that there is nothing else anymore.

Even ultrawide has more traction in the gaming market than a 4x resolution boost, for very little reason other than 'omg its more pixuls'. After 15 minutes of gaming you stop noticing the added detail because, quite simply, PPI is a thing your brain won't ignore. In a desktop setting, 4K is overkill.

The fact that some enthusiasts and tubers do the marketing for corporations does not make it something the majority either needs or wants. 8K is another step in the wrong, utterly wasteful direction. Maybe you missed the climate discussion lately... the age of rampant growth is finite; in comes an age of bare necessity. It's about time, too. mGPU is a horribly wasteful tech.
 
Joined
Jun 11, 2017
Messages
283 (0.10/day)
Location
Montreal Canada
Bigger numbers do not equal better games or gaming. PPI is a thing; most people game at a downscaled res, or a lower internal render res on 4K panels, and they own one for the simple fact that there is nothing else anymore.

Even ultrawide has more traction in the gaming market than a 4x resolution boost, for very little reason other than 'omg its more pixuls'. After 15 minutes of gaming you stop noticing the added detail because, quite simply, PPI is a thing your brain won't ignore. In a desktop setting, 4K is overkill.

The fact that some enthusiasts and tubers do the marketing for corporations does not make it something the majority either needs or wants. 8K is another step in the wrong, utterly wasteful direction. Maybe you missed the climate discussion lately... the age of rampant growth is finite; in comes an age of bare necessity. It's about time, too. mGPU is a horribly wasteful tech.
You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet. Games have never looked so good. Gamers did not strive for it; they just went into submission and marketing brainwashed them into less for more. Just an example: a 3090 is like 5 grand in Canada, if you can find one. I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet. Games have never looked so good. Gamers did not strive for it; they just went into submission and marketing brainwashed them into less for more. Just an example: a 3090 is like 5 grand in Canada, if you can find one. I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.

SLI cards aren't the problem. It's the developer support in games that is not available. A good example is the dual-GPU GTX 690 and the single-GPU GTX 680. A 690 is two 680s paired together on one card, but even years ago we saw them bench the same FPS because only one of the GPUs on the 690 was being used.

There's no place for SLI anymore. There are only a few die shrinks to be had before GPUs become too expensive to keep shrinking.

Multi-chip modules are the future.
 
Joined
Sep 17, 2014
Messages
22,722 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet. Games have never looked so good. Gamers did not strive for it; they just went into submission and marketing brainwashed them into less for more. Just an example: a 3090 is like 5 grand in Canada, if you can find one. I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
Games got heavier irrespective of resolution. You can run tons of games in 4K. But not the supposed triple A cutting edge... which in fact is nothing but a bunch of ultra settings that make very little sense... and perhaps a few weak RT effects. Core of most game development is scoped based on the console capability. Not the PC.

Again... marketing vs reality vs your view on what's what. The vast majority of games release on engines that are fully capable of pushing 4K efficiently. Tricks are deployed already, as I pointed out. Don't mistake submission for a complete lack of actual demand. You can run anything at any resolution, just tweak it to fit your use case. Between FSR and DLSS things only got more accessible.

People want games before pixels; higher res was always pure luxury, and it will remain so. It's as old as running 1600x1200 on your CRT.

Also what is submission to 1440p in the eyes of a GPU vendor anyway? Not exactly a good business case is it? Nvidia and AMD have every reason to push the bar ever higher. Why do you think the AMD powered consoles tout themselves as 4K machines?

The fact is, they figured out some time ago that people won't just buy into 4K, for all the aforementioned reasons. They ain't as stupid as we are... so Nvidia quickly pushed RTX out to create new buyer incentive... too bad they forgot to bring some actual content along. They rushed it to market because they knew Pascal was fine, too fine even, and AMD owned the console space. They had to have a new way to differentiate and sell product.
Wake up man ;)

SLI cards aren't the problem. It's the developer support in games that is not available. A good example is the dual-GPU GTX 690 and the single-GPU GTX 680. A 690 is two 680s paired together on one card, but even years ago we saw them bench the same FPS because only one of the GPUs on the 690 was being used.

There's no place for SLI anymore. There are only a few die shrinks to be had before GPUs become too expensive to keep shrinking.

Multi-chip modules are the future.
MCM is a case of seeing is believing... but yeah, seems to move that way.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Games got heavier irrespective of resolution. You can run tons of games in 4K. But not the supposed triple A cutting edge... which in fact is nothing but a bunch of ultra settings that make very little sense... and perhaps a few weak RT effects. Core of most game development is scoped based on the console capability. Not the PC.

Again... marketing vs reality vs your view on what's what. The vast majority of games release on engines that are fully capable of pushing 4K efficiently. Tricks are deployed already, as I pointed out. Don't mistake submission for a complete lack of actual demand. You can run anything at any resolution, just tweak it to fit your use case. Between FSR and DLSS things only got more accessible.

People want games before pixels; higher res was always pure luxury, and it will remain so. It's as old as running 1600x1200 on your CRT.

Also what is submission to 1440p in the eyes of a GPU vendor anyway? Not exactly a good business case is it? Nvidia and AMD have every reason to push the bar ever higher. Why do you think the AMD powered consoles tout themselves as 4K machines?

The fact is, they figured out some time ago that people won't just buy into 4K, for all the aforementioned reasons. They ain't as stupid as we are... so Nvidia quickly pushed RTX out to create new buyer incentive... too bad they forgot to bring some actual content along. They rushed it to market because they knew Pascal was fine, too fine even, and AMD owned the console space. They had to have a new way to differentiate and sell product.
Wake up man ;)


MCM is a case of seeing is believing... but yeah, seems to move that way.

The latest rumor is that the RX 7970 XT will be MCM and release Q4 this year. It could easily get delayed though.
 
Joined
Jan 21, 2021
Messages
67 (0.05/day)
You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet. Games have never looked so good. Gamers did not strive for it; they just went into submission and marketing brainwashed them into less for more. Just an example: a 3090 is like 5 grand in Canada, if you can find one. I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
Why do you blatantly lie?
(rhetorical question)

3090s are accessible and less than $2k

 
  • Haha
Reactions: ARF

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
Why do you blatantly lie?
(rhetorical question)

3090s are accessible and less than $2k


The question is why would you pay even that much for it? 2020 technology.
Until very recently, the guy was talking absolute truth.

About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry level, and 5K, 6K, 8K, etc. should be enthusiast class, but not necessary to upgrade to...
 
Joined
Jan 21, 2021
Messages
67 (0.05/day)
The question is why would you pay even that much for it? 2020 technology.
Until very recently, the guy was talking absolute truth.

About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry level, and 5K, 6K, 8K, etc. should be enthusiast class, but not necessary to upgrade to...
"very recently"
Prices have been down for months!
I never said the state of affairs wasn't ridiculous, I just hate blatant exaggerations.

For reference, I didn't pay much attention to games for a good 7-8 years from 2013. Except UT2004, I played that now and then. When I started checking things out again I thought for sure I'd be blown away by the graphics - I was underwhelmed. Extremely. Literally almost no progression. If you disregard ray tracing (which you can pretty much do) there's actually been no progression. Sure frame rates are up and 720p/1080p/1440p is now 1080p/1440p/4k, and if you look veeeerry carefully things look ever so slightly more realistic...
 
Joined
Sep 17, 2014
Messages
22,722 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
"very recently"
Prices have been down for months!
I never said the state of affairs wasn't ridiculous, I just hate blatant exaggerations.

For reference, I didn't pay much attention to games for a good 7-8 years from 2013. Except UT2004, I played that now and then. When I started checking things out again I thought for sure I'd be blown away by the graphics - I was underwhelmed. Extremely. Literally almost no progression. If you disregard ray tracing (which you can pretty much do) there's actually been no progression. Sure frame rates are up and 720p/1080p/1440p is now 1080p/1440p/4k, and if you look veeeerry carefully things look ever so slightly more realistic...
Well... there are real gems in graphics that are truly new, and it definitely is more detailed, but yeah, none of these shiny effects truly enabled 'new gameplay'.

It's really more of the same; even RT effects are just another 'pass' of polish over what is essentially the same core of graphics. I would have expected way better integration; even 15 years ago games would play around with light and shadow, tying gameplay elements to it (stealth, for example). Those kinds of integrations of gameplay and graphics, or something like physics, are so rare... I don't get it.

About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry level, and 5K, 6K, 8K, etc. should be enthusiast class, but not necessary to upgrade to...
If everyone is on 4K what is the upgrade path exactly? How would Nv sell 4xxx series?
 
  • Like
Reactions: 64K

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
If everyone is on 4K what is the upgrade path exactly? How would Nv sell 4xxx series?

- Higher FPS;
- New more demanding games;
- Higher Hz 4K screens - 4K@60, 4K@120, 4K@144, 4K@360...
 
Joined
Sep 17, 2014
Messages
22,722 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
- Higher FPS;
- New more demanding games;
- Higher Hz 4K screens - 4K@60, 4K@120, 4K@144, 4K@360...
Yes, and now reality: even HDMI 2.1 is not mainstream yet. You're mistaking wishful thinking for economic reality.
 
  • Like
Reactions: 64K