You'll hit the physics wall before you end up anywhere close to full RT, wanna bet on that?

It's not a delusion. It takes several generations for new standards to be widely adopted.
Yes, but I switched my 1070 to a 3070 Ti before I upgraded the CPU, so I had a Core i7 8th gen with a 3070 Ti for a while; the CPU helped, but the GPU helped way more.

Yes, but I'd say it's because you have a Core i9 13th gen now and had a Core i7 8th gen then.
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
You'll hit the physics wall before you end up anywhere close to full RT, wanna bet on that?
System Name | RyzenGtEvo/ Asus Strix Scar II |
---|---|
Processor | AMD R9 5900X/ Intel i7-8750H
Motherboard | Crosshair Hero 8 Impact/ Asus
Cooling | 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full cover, all EK
Memory | Corsair Vengeance RGB Pro 3600 CL14, 16 GB in four sticks/ 16 GB/ 16 GB
Video Card(s) | PowerColor RX 7900 XT Reference/ RTX 2060
Storage | Silicon Power 2 TB NVMe/ 8 TB external/ 1 TB Samsung Evo NVMe, 2 TB SATA SSD/ 1 TB NVMe
Display(s) | Samsung U28E850R 28" 4K FreeSync/ Dell shiter
Case | Lian Li O11 Dynamic/ Strix Scar II
Audio Device(s) | Creative X-Fi 7.1 onboard, Yamaha DTS AV setup, Corsair Void Pro headset
Power Supply | Corsair HX1200i/ Asus stock
Mouse | Roccat Kova/ Logitech G wireless
Keyboard | Roccat Aimo 120
VR HMD | Oculus Rift
Software | Win 10 Pro
Benchmark Scores | 8726 Vega 3DMark Time Spy/ laptop Time Spy 6506
You're being extreme. Silicon is still being worked on and with, and will be for at least 7 more generations of shrink, and it likely won't be replaced quickly or easily; few materials have a suitably workable bandgap, and even GaN devices are very new and niche, and they're in the lead experimentally over other silicon alternatives.

It's well known we're at the end of silicon, and to continue onwards new materials will need to be used, but I would never bet against the tech industry; that is most unwise.
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
You're being extreme. Silicon is still being worked on and with, and will be for at least 7 more generations of shrink, and it likely won't be replaced quickly or easily; few materials have a suitably workable bandgap, and even GaN devices are very new and niche, and they're in the lead experimentally over other silicon alternatives.
I actually think innovation, possibly photonics, will extend the viability of silicon way beyond the Angstrom era, thanks to the vast number of working processes developed for silicon fabrication.
The USB-C video port on a GPU may actually evolve in future to transmit USB and/or PCIe data alongside video.

USB-C is likely not coming back.
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
The USB-C video port on a GPU may actually evolve in future to transmit USB and/or PCIe data alongside video.
Asus has recently shown an NVMe drive attached to a GPU's PCB, transmitting data over the PCIe link. By extension, it's enough to install a USB4 controller on the GPU's PCB to inject USB and PCIe data flowing to/from the USB-C port alongside the DisplayPort video data.
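Bandwidth is the catch, though: the DisplayPort video and any tunnelled PCIe/USB traffic would share one USB4 link. A quick back-of-the-envelope sketch in Python, using published spec rates (the 4K144 scenario is just an example, and blanking/DSC are ignored):

```python
# Back-of-the-envelope bandwidth budget for a hypothetical GPU USB-C port
# carrying DisplayPort video plus tunnelled PCIe/USB data over one USB4 link.
USB4_LINK_GBPS = 40.0  # USB4 Gen3x2 raw link rate (spec value)

def dp_video_gbps(width, height, hz, bpp=30):
    """Approximate uncompressed video bandwidth, ignoring blanking and DSC."""
    return width * height * hz * bpp / 1e9

video = dp_video_gbps(3840, 2160, 144)  # 4K at 144 Hz, 10-bit colour
leftover = USB4_LINK_GBPS - video

print(f"4K144 10-bit video: {video:.1f} Gbps")
print(f"Left for tunnelled PCIe/USB: {leftover:.1f} Gbps")
# ~35.8 Gbps of video leaves only ~4 Gbps, so a demanding display and
# fast tunnelled storage would contend for the same link.
```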
I remember a rumour that Intel would stop Arc production after first gen. It wasn't true.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
How about using one GPU only for raster and one only for RT, similarly to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3dfx cards that didn't have 2D support)? ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?

It's well known we're at the end of silicon, and to continue onwards new materials will need to be used, but I would never bet against the tech industry; that is most unwise.
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
How about using one GPU only for raster and one only for RT, similarly to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3dfx cards that didn't have 2D support)? ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?
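One practical obstacle to that split is inter-card traffic: a dedicated RT card would need the raster card's G-buffer every frame. A rough sketch of that cost in Python, assuming around 20 bytes of G-buffer per pixel and the theoretical ~32 GB/s of PCIe 4.0 x16 (both figures are ballpark assumptions; real throughput is lower):

```python
# Estimate the per-frame G-buffer transfer time between a raster GPU and a
# hypothetical dedicated RT GPU over PCIe 4.0 x16.
BYTES_PER_PIXEL = 20   # assumed: depth + normals + albedo + material data
PCIE_GBS = 32.0        # PCIe 4.0 x16 theoretical bandwidth, GB/s

def transfer_ms(width, height):
    """One-way G-buffer transfer time in milliseconds."""
    gbuffer_bytes = width * height * BYTES_PER_PIXEL
    return gbuffer_bytes / (PCIE_GBS * 1e9) * 1e3

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {transfer_ms(w, h):.2f} ms each way")
# 4K comes out to ~5 ms one way -- a big slice of a 16.7 ms (60 fps)
# frame budget spent before either card does any actual RT work.
```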
He is Taliban AMD and you have nothing to discuss with him. I have him on ignore.

Well, it doesn't beat the 4090 in any other game... Valhalla is just a heavy AMD game. Your refusal to admit it doesn't even beat it in one game, though, is troubling.
Comparing anything with DLSS 3 FG on sounds more like an Nvidia advert than a review to me. One must not forget about the possible input latency issues with FG.

He is Taliban AMD and you have nothing to discuss with him. I have him on ignore.
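The latency point is easy to put a floor on: interpolation-based FG has to hold back a finished frame so it can generate the in-between one, so it adds at least one source-frame interval. A minimal sketch, ignoring the generation time itself:

```python
# Lower-bound latency cost of interpolation-based frame generation:
# the display can't show the interpolated frame between N and N+1
# until frame N+1 has actually been rendered.

def added_latency_ms(base_fps):
    """Minimum extra latency: one source-frame interval, in ms."""
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base: +{added_latency_ms(fps):.1f} ms at minimum")
# 30 fps base: +33.3 ms -- which is why FG feels worst exactly where
# the extra fluidity would be most welcome.
```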
The RTX 4090 is the undisputed king at the moment. An expensive king, it's true, and the price is the only weapon these Taliban have while denying the array of technologies that increase the value of an nVidia video card. Anyway, their desperation can be seen in how they use a game sponsored by AMD to hide the drama from the others.
Something funny that stuck in my mind from the review of a well-known compatriot for RTX 4060 Ti/ RX 7600:
4060 Ti - driver installation and running games without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
And he also found that he had to use nVidia software to be able to get the results of the AMD video card in the games that did not have a benchmark utility, because AMD has nothing similar.
I extracted his results from two reviews because he "omitted" to compare the results; they were too much against AMD. How can you compare in Cyberpunk the 34 fps obtained by the RX 7600 (an iGPU-level disaster) with the 111 fps obtained by the 4060?
The video cards were tested only in 1080p.
The sources are here
iGPUs have been using USB-C for quite some time. I have a Z390 motherboard from ASRock with three monitor ports from the CPU: DP, HDMI and Thunderbolt 3.

This is already possible; the CPU graphics on my motherboard are wired to a USB-C DisplayPort source. But even if that is the case, it's still a rather unusual format for monitors.
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
iGPUs have been using USB-C for quite some time. I have a Z390 motherboard from ASRock with three monitor ports from the CPU: DP, HDMI and Thunderbolt 3.
The new thing would be installing a USB4 controller on the GPU's PCB, so that the USB-C port carries not only DP video data but other protocols too. We have never had this solution on a GPU.
Processor | 5800X3D |
---|---|
Motherboard | MSI Tomahawk X570
Cooling | Thermalright
Memory | 32 GB 3200 MHz E-die
Video Card(s) | 3080
Storage | 2 TB NVMe
Display(s) | 165 Hz 1440p
Case | Fractal Define R5
Power Supply | Toughpower 850 Platinum
Mouse | HyperX Hyperfire Pulse
Keyboard | EVGA Z15
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
They also really need their FG tech to be as good as DLSS 3's. I just wish Nvidia would toss us 3000-series people a bone with FG... but I guess I'll have to wait for AMD/Intel solutions...
It's no and no currently.

If AMD really wants this chiplet approach to work for GPUs, they need to be more aggressive in securing advanced nodes from TSMC, or they'd better be sure Samsung can really deliver and go as advanced as possible with them.
That's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford disparity with Nvidia/Intel. It's an investment that pays off. Isn't that supposed to be an advantage of the chiplet approach, that you get better yields because the chip is not as big/complex?

It's no and no currently.
At TSMC, Apple always has priority and currently uses 90% of 3nm capacity. Plus, for the first time, they don't pay for defective dies, which means 3nm yields are lower than expected, perhaps 65-70% at the moment.
At Samsung, yields are unknown on the GAAFET 3nm node. It's unclear if they're at 50 or 60% currently, which is low. Nvidia really had problems with them on 8nm.
Everybody wants to be on a cutting-edge node for the most advanced products, but it takes a few years to improve yields towards 90%. It's a painfully slow process...
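For context on why die size matters so much here, yield is commonly approximated with the Poisson model Y ≈ e^(−A·D₀). A small sketch with an illustrative defect density (not a real foundry figure):

```python
import math

# Poisson die-yield model: Y = exp(-die_area * defect_density).
D0 = 0.1  # defects per cm^2 -- illustrative only, not a foundry number

def yield_pct(die_area_cm2):
    """Fraction of dies expected to be defect-free, as a percentage."""
    return 100 * math.exp(-die_area_cm2 * D0)

print(f"600 mm^2 monolithic die: {yield_pct(6.0):.0f}% yield")
print(f"150 mm^2 chiplet:        {yield_pct(1.5):.0f}% yield")
# ~55% vs ~86%: smaller dies waste far less silicon per wafer,
# which is the whole appeal of the chiplet approach.
```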
Processor | AMD Ryzen 7 5700X3D |
---|---|
Motherboard | MSI MPG B550I GAMING EDGE WIFI Mini ITX |
Cooling | Noctua NH-U12S Chromax Black |
Memory | Corsair Vengeance RGB Pro SL 32 GB (2 x 16 GB) 3600MHz CL18 |
Video Card(s) | AMD RX 6750XT Reference Design |
Storage | 2.5 TB 2.5" SSD / 3 TB HDD |
Display(s) | ASUS 27" 165HZ VG27WQ / Vertical 16/10 iiyama 25" 75Hz ProLite XUB2595WSU-B1 |
Case | be quiet! Dark Base 700 RGB |
Audio Device(s) | PSB Alpha P3 / LOXJIE A30 Amp |
Power Supply | EVGA SuperNOVA 650 GA |
Mouse | Cooler master MM720 |
Keyboard | Roccat horde |
VR HMD | Oculus Rift S (please Valve, release a new headset) |
Software | Windows 10 |
Did he use DDU? It's well known that nvidia and amd don't like each other. Nothing surprising.

Something funny that stuck in my mind from the review of a well-known compatriot for the RTX 4060 Ti/RX 7600:
4060 Ti - driver installation and running games without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
I'm pretty sure I have seen a perf summary somewhere in Adrenalin. Aren't a ton of tests made without any manufacturer software anyway?

And he also found that he had to use nVidia software to be able to get the results of the AMD video card in the games that did not have a benchmark utility, because AMD has nothing similar.
I don't think AMD could be more "aggressive". They are currently TSMC's second most preferred client, with access to several of the latest customised nodes.

That's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford disparity with Nvidia/Intel. It's an investment that pays off.
Yes, they can get a higher chiplet yield per wafer due to their smaller size, but a 3nm wafer itself is so much more expensive at the moment that only Apple can afford the capacity they booked and paid for two years ago.

Isn't that supposed to be an advantage of the chiplet approach, that you get better yields because the chip is not as big/complex?
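Extending the yield sketch above to cost per good die shows how both points can be true at once: chiplets yield better, but the wafer bill still dominates. All prices and defect densities below are illustrative assumptions, not foundry quotes:

```python
import math

# Cost per good die = wafer price / (dies per wafer * yield).
# All numbers are illustrative assumptions, not foundry quotes.
WAFER_AREA_CM2 = math.pi * 15.0 ** 2  # 300 mm wafer

def cost_per_good_die(wafer_price, die_area_cm2, defect_density):
    dies = WAFER_AREA_CM2 / die_area_cm2                   # ignores edge loss
    good = dies * math.exp(-die_area_cm2 * defect_density)  # Poisson yield
    return wafer_price / good

# Mature node, big monolithic die vs pricier new node, small chiplet:
print(f"Mature node, 600 mm^2 die: ${cost_per_good_die(13000, 6.0, 0.10):.0f}")
print(f"New node, 150 mm^2 chiplet: ${cost_per_good_die(20000, 1.5, 0.20):.0f}")
# The chiplet can still come out cheaper per good die, but a full GPU needs
# several of them plus advanced packaging, which eats back into the saving.
```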
Nvidia has already said that the hardware necessary to make DLSS 3 work is there in Turing and Ampere, just "probably" not fast enough to do it at the proper speed (whatever that means). That, to me, is a hint that DLSS 3 for Ampere and Turing is coming soon - probably once Ada sales have reached or exceeded Nvidia's expectations.

I bet that shortly after FSR 3.0 goes public, if it's really completely hardware-agnostic and you can run it on a GTX 980, they'll just announce "DLSS 3.5" with some Ada improvements and "the new ability to run on Ampere, Turing, Pascal and Maxwell", with the implication that it's the sendoff gift for the 900-series GPUs... It'd basically be a repeat of what they did with the image-sharpening feature: they added it as far back as Kepler shortly after AMD rolled it out to Polaris and announced that "other GPUs are coming soon", basically stealing the spotlight.
System Name | Home |
---|---|
Processor | Ryzen 3600X |
Motherboard | MSI Tomahawk 450 MAX |
Cooling | Noctua NH-U14S |
Memory | 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16 |
Video Card(s) | MSI RX 5700XT EVOKE OC |
Storage | Samsung 970 PRO 512 GB |
Display(s) | ASUS VA326HR + MSI Optix G24C4 |
Case | MSI - MAG Forge 100M |
Power Supply | Aerocool Lux RGB M 650W |
That's not what that rumour was saying. The rumour was that Intel would do just one or two cards each generation, with very low ambitions, instead of having a complete lineup and competing at all levels. For all intents and purposes, it seems to have been true. But that doesn't mean this rumour will turn out to be true as well; hopefully not.

I remember a rumour that Intel would stop Arc production after first gen. It wasn't true.