
NVIDIA GeForce RTX GPUs to Support the DirectX 12 Ultimate API

Joined
Mar 13, 2012
Messages
279 (0.06/day)
Microsoft has a top secret weapon against Sony, an ace up their sleeve that they will reveal closer to the release date:

You will be able to run Windows 10X on XBOX Series X
 
Joined
Sep 17, 2014
Messages
22,723 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
As I pointed out to him.

Other sites ran with DX12 Ultimate as the news piece.

Here it's Nvidia's PR piece first and foremost. Laughable.

Heh, gotcha. I'll keep my sensors out for that one some more. It never really occurred to me; perhaps Nvidia had a PR piece and AMD left it at the blog post we saw? It's not like we don't see AMD news here. But we don't get a feed of their blog, and neither do we get one from Nvidia. And we KNOW these companies have different PR approaches, with Nvidia throwing a few more dollars at it.

Microsoft has a top secret weapon against Sony, an ace up their sleeve that they will reveal closer to the release date:

You will be able to run Windows 10X on XBOX Series X

If they put full-fat Windows 10 on this machine, I'm buying it for my next HTPC. 100%.

And if they make it homebrew-capable, with an official press release saying they will allow it, I'm game too. Heck, it's the reason I bought a PS3 back in the day. A console that is just a gaming machine is useless to me.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Heh, gotcha. I'll keep my sensors out for that one some more. It never really occurred to me; perhaps Nvidia had a PR piece and AMD left it at the blog post we saw? It's not like we don't see AMD news here. But we don't get a feed of their blog, and neither do we get one from Nvidia. And we KNOW these companies have different PR approaches, with Nvidia throwing a few more dollars at it.



If they put full-fat Windows 10 on this machine, I'm buying it for my next HTPC. 100%.

And if they make it homebrew-capable, with an official press release saying they will allow it, I'm game too. Heck, it's the reason I bought a PS3 back in the day. A console that is just a gaming machine is useless to me.
 
Joined
Aug 20, 2007
Messages
21,560 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Joined
Nov 4, 2005
Messages
12,019 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
It doesn't depend on the engineers, though. This is politics, with Intel having the upper hand.


Cell and other RISC-based processors are great at exactly what they are coded and engineered for. Unfortunately, CISC is far superior at complex and out-of-order processing. If video games and other workloads were "on rails" / in order, they would be optimized for Cell or other RISC-based architectures, but the branchy, out-of-order nature of so many processing workloads means x86-64 will be here for a long time. It performs so much better at general serialized workloads thanks to "hardware acceleration", which is essentially a dedicated part of the core that makes it behave like a RISC processor.

Why is Intel doing big.LITTLE or LITTLE.big or whatever? Power savings and thermal management: the little part of the core that can run RISC code at uber speed, because it is merely lookup tables being compared, is small and can be power-gated and shut off, or can power-gate the other parts of the chip that make it a whole CISC core, making it as efficient or more so.

We could also ask why we don't have ASICs running everything, and the answer again is: I want to listen to YouTube music while typing and reading a document in an odd application-specific format, and that would require specific hardware for each task. And what happens when the ASIC you just bought doesn't meet standard X, Y or Z, or when you don't have enough expansion ports for all your ASICs? You build a general-purpose CISC x86-64 processor.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
There is a great thread on the matter at https://hardforum.com/threads/have-...in-cpu-microarchitecture-performance.1953702/

The reason why I raised the question is that there are some very serious concerns, and it actually turns out to be the general consensus that x86 is hitting an IPC and clock ceiling and that further development won't be possible.

Because of this, a new architecture and standard will be needed at some point in the future.
I expect it to become clear in 2-3 years.

The answer to your first question is "no". x86 is a dead-end ISA. It hits both an IPC wall and a frequency wall, and core count is limited by things like Amdahl's law, as you mention. The next figure shows the IPC and frequency walls that Intel chips have been approaching in recent years. Frequency is bounded at about 5 GHz and IPC at about 8-wide.

This deceleration in CPU development doesn't have anything to do with lack of competition; it is a consequence of physical and technological limits, and that is why the deceleration affects everyone, not only Intel. For instance, this is a visual representation of the frequency wall for Intel, AMD, IBM, Sun, DEC and others.

The existence of the IPC wall has been known since the 80s or so. It is the reason why Intel tried to use the migration from 32-bit to 64-bit to replace the x86 ISA with a new scalable ISA. The approach failed miserably (especially because it relied on the existence of a smart-enough compiler that was never built), but at least Intel and HP engineers tried.

There is absolutely nothing that AMD and Intel engineers can do to solve the current situation. Academic and industrial researchers (including people at Intel labs) have been investigating lots of new architectures and microarchitectures for decades, but not one of the proposals has worked well enough to be moved to commercial production.

Of course, engineers can optimize the existing commercial µarchs here and there to get 2% IPC here and 4% there, and foundry engineers can optimize this and that to get some MHz here and some MHz there from the next node. But that is all.

There are some academics working on a continuation of Intel EPIC: the guys in the Mill CPU group are working on a very wide VLIW µarch (up to 33-wide) with some improvements over the original Intel approach and much more advanced compilers. Some technical discussion about the Mill happened on RWT last year, with Ivan Godard (chief of the Mill project and a compiler guy). Like many there, I am completely skeptical about the Mill project. I think it will be another fiasco.


As explained before, adding more cores is limited by Amdahl's law. This is not a programming-model problem, but just a consequence of the sequential nature of some algorithms.

A big.LITTLE approach does not solve the performance problem, because the sequential portions of code will have to be executed on the big cores, whose performance will continue to be limited by both frequency and IPC, as happens on current cores.

Moreover, those heterogeneous approaches have the additional problem that the partition of silicon into big and small is static and made during the design phase, whereas different applications require different combinations of latency and throughput: one application would work better on a 16 BIG + 256 LITTLE configuration, whereas another application would work better on a 4 BIG + 1024 LITTLE configuration. If your heterogeneous CPU is 8 BIG + 512 LITTLE, then those two applications will run inefficiently compared to the respective optimal silicon cases.
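The Amdahl's-law limit being described can be sketched in a few lines of Python; the 95% parallel fraction and the core counts below are illustrative assumptions, not figures from the thread.

```python
# Amdahl's law: if a fraction p of a program is parallelizable,
# speedup on n cores is capped by the sequential remainder (1 - p).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95%-parallel workload cannot make good use of hundreds of cores:
for n in (16, 256, 1024):
    print(f"{n:5d} cores -> {amdahl_speedup(0.95, n):5.1f}x")

# The asymptotic limit as n grows is 1 / (1 - p) = 20x here, which is
# why the sequential portions keep the big cores on the critical path
# no matter how many LITTLE cores the design adds.
```

This is why, in the configurations above, the BIG/LITTLE split matters: the sequential fraction runs on the big cores alone, so throwing more small cores at the problem only squeezes the parallel term toward zero.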
 
Joined
Nov 4, 2005
Messages
12,019 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
The reason why I raised the question is that there are some very serious concerns, and it actually turns out to be the general consensus that x86 is hitting an IPC and clock ceiling and that further development won't be possible.


What is this clock ceiling, and why do you think it has anything to do with x86? Do you realize that what you're blaming on x86 is a function of process node size and designed TDP limitations? IPC is a function of how efficiently a given processor design executes programs per clock, and again, out-of-order serial execution is the name of the game, simply because not every person types in the same font, size and color, or in the same program...

You seem fairly intelligent, but I would go read up on architectures, and maybe design some logic circuits of your own, to understand why we are here. And no, it's not because Intel has a "monopoly" to protect.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
There is a great thread on the matter at https://hardforum.com/threads/have-...in-cpu-microarchitecture-performance.1953702/

The reason why I raised the question is that there are some very serious concerns, and it actually turns out to be the general consensus that x86 is hitting an IPC and clock ceiling and that further development won't be possible.

Because of this, a new architecture and standard will be needed at some point in the future.
I expect it to become clear in 2-3 years.
Are you ever going to be relevant, though? That's the question.
 
Joined
Dec 28, 2012
Messages
3,969 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
There is a great thread on the matter at https://hardforum.com/threads/have-...in-cpu-microarchitecture-performance.1953702/

The reason why I raised the question is that there are some very serious concerns, and it actually turns out to be the general consensus that x86 is hitting an IPC and clock ceiling and that further development won't be possible.

Because of this, a new architecture and standard will be needed at some point in the future.
I expect it to become clear in 2-3 years.
Oh, of course a bunch of HardOCP posters are more knowledgeable than silicon engineers. That would explain how, after that thread was created, Ryzen 3000 came along and smashed that IPC ceiling right in the face. And Ryzen 4000 is coming this year with even more performance improvements. OOOF. If you took ARM and scaled it up with large caches and cores complex enough to match x86 processors in a variety of applications, they would pull just as much power, because current ARM designs can only match x86 in Geekbench. Throw anything remotely demanding at them and they choke. DEC is long dead. The RISC vs CISC argument was beaten to death in the 90s.

x86 will die when Apple, AMD and Walmart go out of business, right before the US ceases to exist after being conquered by North Korea.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Can we stop derailing this thread about DX12 with CPU uArch trolling, plzkthx. Just ignore ARF and all will be good.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11

Damn, if people were hoping AMD would drop some classy yet subtle, high-class ray tracing demo, they just got burnt.

That already looks dated; quite a feat.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,721 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Microsoft has a top secret weapon against Sony, an ace up their sleeve that they will reveal closer to the release date:

You will be able to run Windows 10X on XBOX Series X

I'd rather run W7 until MS gets their head out of the sand.

Considering DEC Alpha is literally discontinued, no.

What, back in 2003-2005 (SktA/462)?
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Damn, if people were hoping AMD would drop some classy yet subtle, high-class ray tracing demo, they just got burnt.

That already looks dated; quite a feat.
As a demo, I'd agree; I think the same. The rays were cool, just a bit WTF-ish in totality.
 
Joined
Feb 23, 2019
Messages
6,109 (2.86/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Joined
Apr 18, 2013
Messages
1,260 (0.29/day)
Location
Artem S. Tashkinov
While we do not know why Microsoft decided to call this the "Ultimate" version, it is possibly used to convey clearer information about which features are supported by the hardware. The leaked slide also mentions consoles, so it is coming to that platform as well.

It's a great source of income for AMD, whose RDNA 1.0 products are not DX12 Ultimate compatible. Now all their fans will have to buy new RDNA 2.0 cards because, "Your GPU is not DX12 Ultimate compatible! Shame on you!"

When RDNA 1.0 cards first came out, I already said their buyers were screwed, and now they really are. With all the hate/flak NVIDIA receives from AMD fans, NVIDIA's products have turned out to be more future-proof. What a shame.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
It's a great source of income for AMD, whose RDNA 1.0 products are not DX12 Ultimate compatible. Now all their fans will have to buy new RDNA 2.0 cards because, "Your GPU is not DX12 Ultimate compatible! Shame on you!"

When RDNA 1.0 cards first came out, I already said their buyers were screwed, and now they really are. With all the hate/flak NVIDIA receives from AMD fans, NVIDIA's products have turned out to be more future-proof. What a shame.

RDNA2 supporting RTRT does mean that we will see a lot more titles providing ray tracing renderers. But much like DX11 vs DX12, I don't believe we'll see titles that support RTRT to the exclusion of all other renderers, simply because it would exclude potential customers without RTRT, like RDNA1 owners.

Do I think it's a bit of a s**tty move to launch RDNA1, then RDNA2 with much better features under a year later? Yes. But on the flipside, RDNA1 cards have been cheaper than NVIDIA cards, and it's not like people have been unaware of the lack of RTRT in RDNA1 (e.g. W1zz's reviews have always called it out). It's not like AMD has forced people to buy their cards, or that those cards are bad - I would've gone for AMD this round if not for the driver issues, simply because RTRT was not compelling enough for me. (Of course, RTRT just got a lot more compelling with these console announcements.)

So I really don't think RDNA1 buyers are screwed in any way, shape or form. The hardware is competitive, and will remain so as long as rasterisation remains the primary rendering technique, which I'm quite sure will be the case for another decade at least. The only people likely to be salty are those who keep graphics cards for that amount of time and expect the "fine wine" driver treatment to keep them relevant... no amount of driver updates can add hardware RTRT.
 
Joined
Sep 17, 2014
Messages
22,723 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It's a great source of income for AMD, whose RDNA 1.0 products are not DX12 Ultimate compatible. Now all their fans will have to buy new RDNA 2.0 cards because, "Your GPU is not DX12 Ultimate compatible! Shame on you!"

When RDNA 1.0 cards first came out, I already said their buyers were screwed, and now they really are. With all the hate/flak NVIDIA receives from AMD fans, NVIDIA's products have turned out to be more future-proof. What a shame.

It was clear from the get-go what the cards would and would not have, and also that there was a real chance RT was going to make a dent sometime soon. Customers made a choice and the info was there. It's called due diligence. The choice on offer was great, because it was an actual choice, beyond a 5% perf gap at the same dollar and endless BS in the margins about drivers being good or not.

The interesting question, I think, is how the line-up and the stack will look for both camps. Will they fill it up with RT-capable cards top to bottom? Or will it cut off hard at the midrange at some price point, because it's just not going to be viable anyway? Will we see another split line-up, or a double one? How much of the GPU will be devoted to RT?

Interesting times! Much more so now than during Turing, IMO. It's picking up steam.
 