
NVIDIA's New Ada Lovelace RTX GPU Arrives for Designers and Creators

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,597 (2.41/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
Opening a new era of neural graphics that marries AI and simulation, NVIDIA today announced the NVIDIA RTX 6000 workstation GPU, based on its new NVIDIA Ada Lovelace architecture. With the new NVIDIA RTX 6000 Ada Generation GPU delivering real-time rendering, graphics and AI, designers and engineers can drive cutting-edge, simulation-based workflows to build and validate more sophisticated designs. Artists can take storytelling to the next level, creating more compelling content and building immersive virtual environments. Scientists, researchers and medical professionals can accelerate the development of life-saving medicines and procedures with supercomputing power on their workstations—all at up to 2-4x the performance of the previous-generation RTX A6000.

Designed for neural graphics and advanced virtual world simulation, the RTX 6000, with Ada generation AI and programmable shader technology, is the ideal platform for creating content and tools for the metaverse with NVIDIA Omniverse Enterprise. Incorporating the latest generations of render, AI and shader technologies and 48 GB of GPU memory, the RTX 6000 enables users to create incredibly detailed content, develop complex simulations and form the building blocks required to construct compelling and engaging virtual worlds.

"Neural graphics is driving the next wave of innovation in computer graphics and will change the way content is created and experienced," said Bob Pette, vice president of professional visualization at NVIDIA. "The NVIDIA RTX 6000 is ready to power this new era for engineers, designers and scientists to meet the need for demanding content-creation, rendering, AI and simulation workloads that are required to build worlds in the metaverse."

Global Leaders Turn to NVIDIA RTX 6000
"NVIDIA's professional GPUs helped us deliver an experience like none other to baseball fans everywhere by bringing legends of the game back to life with AI-powered facial animation," said Michael Davies, senior vice president of field operations at Fox Sports. "We're excited to take advantage of the incredible graphics and AI performance provided by the RTX 6000, which will help us showcase the next chapter of live sports broadcast."

"Broadcasters are increasingly adopting software and compute to help build the next generation of TV stations," said Andrew Cross, CEO of Grass Valley. "The new workstation GPUs are truly game changing, providing us with over 300% performance increases—allowing us to improve the quality of video and the value of our products."

"The new NVIDIA Ada Lovelace architecture will enable designers and engineers to continue pushing the boundaries of engineering simulations," said Dipankar Choudhury, Ansys Fellow and HPC Center of Excellence lead. "The RTX 6000 GPU's larger L2 cache, significant increase in number and performance of next-gen cores and increased memory bandwidth will result in impressive performance gains for the broad Ansys application portfolio."

Next-Generation RTX Technology
Powered by the NVIDIA Ada architecture, the world's most advanced GPU architecture, the NVIDIA RTX 6000 features state-of-the-art NVIDIA RTX technology. Features include:
  • Third-generation RT Cores: Up to 2x the throughput of the previous generation with the ability to concurrently run ray tracing with either shading or denoising capabilities.
  • Fourth-generation Tensor Cores: Up to 2x faster AI training performance than the previous generation with expanded support for the FP8 data format.
  • CUDA cores: Up to 2x the single-precision floating point throughput compared to the previous generation.
  • GPU memory: Features 48 GB of GDDR6 memory for working with the largest 3D models, render images, simulation and AI datasets.
  • Virtualization: Will support NVIDIA virtual GPU (vGPU) software for multiple high-performance virtual workstation instances, enabling remote users to share resources and drive high-end design, AI and compute workloads.
  • XR: Features 3x the video encoding performance of the previous generation, for streaming multiple simultaneous XR sessions using NVIDIA CloudXR.

Availability
The NVIDIA RTX 6000 workstation GPU will be available from global distribution partners and manufacturers starting in December.

View at TechPowerUp Main Site | Source
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
There is no DisplayPort 2.0 even on creators' cards.

Again, these are probably somewhere in the DP 2.0 certification process.

Moaning about this in thread after thread isn't going to speed the certification process up for you or anyone else.

In fact, if the actual engineers doing the certification were reading all these threads, they wouldn't be doing their jobs in the most efficient manner, would they?

Feel free to keep prattling on about this, but either I or someone else will just echo the same statement: they don't have the certification yet. That doesn't mean it can't be offered in the future.

Remember, these cards aren't being loaded onto a FedEx delivery truck right now.
 
Joined
Jun 18, 2021
Messages
2,547 (2.03/day)
What are we supposed to use to distinguish these new GPUs from the previous ones? The previous one was called RTX A6000, and now this is RTX 6000? Right, because that's not confusing at all... and whatever happened to Quadro!?

Again, these are probably somewhere in the DP 2.0 certification process.

Moaning about this in thread after thread isn't going to speed the certification process up for you or anyone else.

In fact, if the actual engineers doing the certification were reading all these threads, they wouldn't be doing their jobs in the most efficient manner, would they?

Feel free to keep prattling on about this, but either I or someone else will just echo the same statement: they don't have the certification yet. That doesn't mean it can't be offered in the future.

Remember, these cards aren't being loaded onto a FedEx delivery truck right now.

Hmm, I haven't seen all the news from today's announcement, but I'm going to press doubt on that. If they were going to support DP 2.0, the certification should already have been done and/or they would be advertising it (it's not like certification is the first spec test done).

In reality, it's not like Quadro (or whatever name we're supposed to use to distinguish workstation GPUs now... like, what the fuck, NVIDIA!?) needs the higher bandwidth when these cards will be used in virtualized scenarios or at lower refresh rates and can leverage DSC, but it still would have been nice to see it implemented.
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
I don't know what VESA has on their plate right now. For sure, they don't just accept a bunch of hardware submissions and press "Approved" to clear everything at once.

There's also the possibility that DP 2.0 certification hinges on some sort of software support (firmware or driver) that NVIDIA must provide.

That might explain why no AIB partner cards have any mention of DP 2.0 either.
 
Joined
Jun 29, 2018
Messages
537 (0.23/day)
I don't know what VESA has on their plate right now. For sure, they don't just accept a bunch of hardware submissions and press "Approved" to clear everything at once.

There's also the possibility that DP 2.0 certification hinges on some sort of software support (firmware or driver) that NVIDIA must provide.

That might explain why no AIB partner cards have any mention of DP 2.0 either.
Intel managed to get it even for the A380. The lack of support on Ada-based cards is suspicious. If it were under active certification, I'm sure NVIDIA would have mentioned it in the PR materials.
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
But DP 2.0 certification isn't final for the Intel A770.

There's a small chance that NVIDIA completely forgot about DisplayPort 2.0 and neglected to include that technology on their Ada generation cards despite having enough presence of mind to include HDMI 2.1.

What do you think the odds are that NVIDIA thought that they could just skip DisplayPort 2.0 with their 40 series cards?
 
Joined
Jun 29, 2018
Messages
537 (0.23/day)
But DP 2.0 certification isn't final for the Intel A770.
You're right, but what Intel wrote in their specification ("**Designed for DP2.0, certification pending VESA CTS Release") is what I expected NVIDIA to do if they were also in the certification process, but they didn't.
There's a small chance that NVIDIA completely forgot about DisplayPort 2.0 and neglected to include that technology on their Ada generation cards despite having enough presence of mind to include HDMI 2.1.

What do you think the odds are that NVIDIA thought that they could just skip DisplayPort 2.0 with their 40 series cards?
That's why I wrote it was suspicious ;) I guess we'll have to wait for an official NVIDIA response to this issue.
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
Intel managed to get it even for the A380. The lack of support on Ada-based cards is suspicious. If it were under active certification, I'm sure NVIDIA would have mentioned it in the PR materials.
True that. You don't omit such an important tech development from your marketing, or at least from a teaser.

You're right, but what Intel wrote in their specification ("**Designed for DP2.0, certification pending VESA CTS Release") is what I expected NVIDIA to do if they were also in the certification process, but they didn't.
Exactly! The least they could have done was inform people, like Intel did, that the technology will be included, to assure prospective buyers that DP 2.0 is baked into the hardware regardless of the formal certification process.

With or without a VESA certificate, if the hardware is capable, it should be mentioned without second thoughts, just like the HDMI 2.1 port worked from day one. Certification simply brings formal recognition that the industry standard was implemented. The port itself should work anyway, provided NVIDIA supports it in software too.

No one sane would hide such an important video capability of a GPU that is expected to work once VESA's blessing is available. This makes me think that nothing was submitted for DP 2.0 certification and the 4000-series cards will run on the older DP 1.4a.
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
Right? Seriously with the no DP2.0? I'm beginning to understand why EVGA dropped out...
Exactly. Here is the spec sheet for the RTX 6000. They did not bake DP 2.0 hardware support into the boards of the most expensive cards on the market.
Intel's lowest-end card, the A380, has a DP 2.0 port at 40 Gbps...
(image: RTX 6000 Ada specification sheet)
 
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
And before anyone points out that the bandwidth increase is not that great (32.4 Gbit/s raw to 40 Gbit/s), the encoding scheme also changed (8b/10b to 128b/132b), so effective bandwidth goes from about 26 to 38.6 Gbit/s, which is pretty massive once you account for 4K 10-bit color, HDR, or daisy-chaining.
Exactly. The only way to speed up improvements in monitors and bring more 4K/5K 10-bit panels into the mainstream is to install DP 2.0 ports and be free from the HDMI 2.1 FRL that brought so many giant OLED TVs into the PC space. When I look at high-quality 4K HDR monitors, I am horrified by the prices of the Asus ProArt line. It cannot be the case that high-quality OLED panels in giant TVs still cost up to three times less than a similarly well-specced, truly HDR monitor. As HDR requires ~25% more bandwidth, it is time DP 2.0 hit the ground running and got monitor vendors to speed up mainstream innovation and image quality.
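If anyone wants to sanity-check those numbers, here's a rough back-of-the-envelope script. The link rates and encodings are from the public VESA specs; the 4K 120 Hz mode and the ~8% blanking overhead are just illustrative assumptions:

```python
# Rough DisplayPort budget check: DP 1.4a (HBR3) vs DP 2.0 (UHBR10), 4 lanes.

def effective_gbps(raw_gbps: float, payload_bits: int, total_bits: int) -> float:
    """Usable bandwidth after line-encoding overhead."""
    return raw_gbps * payload_bits / total_bits

dp14a = effective_gbps(4 * 8.1, 8, 10)       # 8b/10b    -> ~25.9 Gbit/s
dp20 = effective_gbps(4 * 10.0, 128, 132)    # 128b/132b -> ~38.8 Gbit/s

def mode_gbps(h, v, hz, bpc, blanking=1.08):
    """Approximate uncompressed RGB bandwidth (3 channels, ~8% blanking)."""
    return h * v * hz * 3 * bpc * blanking / 1e9

need = mode_gbps(3840, 2160, 120, 10)        # 4K 120 Hz, 10-bit HDR
print(f"DP 1.4a effective: {dp14a:.1f} Gbit/s")
print(f"DP 2.0  effective: {dp20:.1f} Gbit/s")
print(f"4K120 10-bit RGB needs ~{need:.1f} Gbit/s")
print(f"Fits DP 1.4a uncompressed: {need <= dp14a}")  # False -> needs DSC
print(f"Fits DP 2.0  uncompressed: {need <= dp20}")   # True
```

(The ~25% HDR premium mentioned above is just the 10-bit vs. 8-bit ratio.)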
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,168 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I'll be keen to see if they make an RTX 2000 like the A2000 (low-profile, 75 W max). Let's see the extent of the perf-per-watt improvement.
 
Joined
Nov 4, 2005
Messages
11,980 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
In fact, if the actual engineers doing the certification were reading all these threads, they wouldn't be doing their jobs in the most efficient manner, would they?
How does it feel to be a living straw man?
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
Very hollow obviously. :D

It will be interesting to see how NVIDIA navigates through the next few weeks before the first shipments start. Perhaps more interesting will be how they react after AMD and Intel make their next moves.
 
Joined
Sep 2, 2022
Messages
92 (0.11/day)
Location
Italy
Processor AMD Ryzen 9 5900X
Motherboard ASUS TUF Gaming B550-PLUS
Memory Corsair Vengeance LPX DDR4 4x8GB
Video Card(s) Gigabyte GTX 1070 TI 8GB
Storage NVME+SSD+HDD
Display(s) Benq GL2480 24" 1080p 75 Hz
Power Supply Seasonic M12II 520W
Mouse Logitech G400
Software Windows 10 LTSC
No NVLink even on the professional cards?
That would be really weird. They may have removed NVLink from the 4090 to reserve a certain level of performance for the workstation GPUs and force people to buy those instead of the GeForce cards.
But removing NVLink from the RTX 6000 could convince the people who benefit from that technology to stay with the old-generation cards. And the same can be said for those who used it with the 3090.
Unless they found another way to scale the memory of multiple GPUs the way NVLink does.
 
Joined
Nov 30, 2021
Messages
135 (0.12/day)
Location
USA
System Name Star Killer
Processor Intel 13700K
Motherboard ASUS RO STRIX Z790-H
Cooling Corsair 360mm H150 LCD Radiator
Memory 64GB Corsair Vengence DDR5 5600mhz
Video Card(s) MSI RTX 3080 12GB Gaming Trio
Storage 1TB Samsung 980 x 1 | 1TB Crucial Gen 4 SSD x 1 | 2TB Samsung 990 Pro x 1
Display(s) 32inch ASUS ROG STRIX 1440p 170hz WQHD x 1, 24inch ASUS 165hz 1080p x 1
Case Lian Li O11D White
Audio Device(s) Creative T100 Speakers , Razer Blackshark V2 Pro wireless
Power Supply EVGA 1000watt G6 Gold
Mouse Razer Viper V2 Wireless with dock
Keyboard ASUS ROG AZOTH
Software Windows 11 pro
No NVLink even on the professional cards?
Nobody uses it, and it requires a massive amount of time to create drivers for. Like ten people use NVLink.
 
Joined
Jun 18, 2021
Messages
2,547 (2.03/day)
That would be really weird. They may have removed NVLink from the 4090 to reserve a certain level of performance for the workstation GPUs and force people to buy those instead of the GeForce cards.
But removing NVLink from the RTX 6000 could convince the people who benefit from that technology to stay with the old-generation cards. And the same can be said for those who used it with the 3090.
Unless they found another way to scale the memory of multiple GPUs the way NVLink does.

Professional applications don't need NVLink because they don't need the level of synchronization it provides (which games, for example, do require). They can just share resources through their regular PCIe connection; that's good enough for the kinds of loads they'll be running, which are heavily parallelized and easily distributed across multiple processors.
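As a toy illustration of that point (PyTorch purely as an example, assuming a box with two or more CUDA GPUs and no bridge), an embarrassingly parallel job can be chunked across cards using nothing but ordinary PCIe transfers:

```python
import torch

# Chunk one large batch across all visible GPUs; every transfer below goes
# over the plain PCIe bus -- no NVLink, no peer-to-peer, no special driver path.
devices = [torch.device(f"cuda:{i}") for i in range(torch.cuda.device_count())]

batch = torch.randn(8192, 4096)              # host-side workload
results = []
for dev, chunk in zip(devices, batch.chunk(len(devices))):
    x = chunk.to(dev, non_blocking=True)     # host -> GPU copy over PCIe
    results.append((x @ x.T).sum())          # fully independent per-GPU work

# Only tiny scalars come back, so inter-GPU bandwidth never matters here.
print(sum(r.item() for r in results))
```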
 
Joined
Sep 2, 2022
Messages
92 (0.11/day)
Location
Italy
Professional applications don't need NVLink because they don't need the level of synchronization it provides (that games require for example). They can just share resources through their regular pcie connection, it's good enough for the type of loads they'll be doing that are heavely parallelized and easily distributed to multiple processors
NVLink was introduced for a reason: there are scenarios where a program needs fast GPU calculations and CUDA cores and, at the same time, a massive amount of GPU memory. Rendering is one of these situations. Very complex animations can need more than 24 GB of VRAM. Considering that rendering such a scene can take many hours even on GPUs, you can imagine that the author would do anything possible to avoid crashes.
Even if a new card can complete a render faster, it is worthless to some people or studios if it doesn't have enough memory.
By the way, programs like Octane, 3ds Max, Maya, Blender, Redshift, DaVinci Resolve, etc. can use NVLink. Considering that programs like Max and Maya are the standard tools of the video game and movie industries, it's easy to understand that this technology has its benefits and can be a must-have for some.
For scientific calculations it can be even more valuable.

So I can't believe that they are ditching it just like that, without explaining why. I'm curious to know what is going on.
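For what it's worth, the bridge-less fallback is visible from software. A hypothetical two-GPU check in PyTorch (the device count and buffer sizes are assumptions):

```python
import torch

# Without an NVLink bridge, direct GPU<->GPU access depends on PCIe
# peer-to-peer support; otherwise copies are staged through host RAM.
if torch.cuda.device_count() >= 2:
    print("P2P GPU0->GPU1:", torch.cuda.can_device_access_peer(0, 1))

    # Manual "memory pooling": shard one oversized scene buffer across cards.
    shard0 = torch.empty(6 * 1024**3 // 4, device="cuda:0")  # ~6 GB of fp32
    shard1 = torch.empty(6 * 1024**3 // 4, device="cuda:1")  # ~6 GB of fp32
    print("sharded ~12 GB across two GPUs")
```

The catch for renderers is exactly the one described above: sharding by hand works for independent tiles, but a single scene that must be accessed as one address space is what NVLink-style memory pooling made painless.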
 
Joined
Jul 5, 2013
Messages
27,703 (6.66/day)
So I can't believe that they are ditching it just like that, without explaining why. I'm curious to know what is going on.
My theory is what Jensen hinted at elsewhere: instead of NVLink, developers can do what wasn't effective until now and use the card bus for direct data transfers and inter-GPU communication. The PCIe bus has had truly massive amounts of available bandwidth since PCIe 4.0, more than any single GPU alone can saturate. So dropping two (or more) GPUs into a system and then connecting them in tandem in software is now a doable option.

Hardware SLI no longer needs to be a thing, as it can now be done in software through the existing card bus.
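For rough numbers behind that theory (per-lane rates and 128b/130b encoding per the PCI-SIG specs):

```python
# Approximate one-direction bandwidth of an x16 slot per PCIe generation.
gens = [("3.0", 8.0), ("4.0", 16.0), ("5.0", 32.0)]   # GT/s per lane

for name, gtps in gens:
    gb_per_s = gtps * (128 / 130) * 16 / 8   # encoding, 16 lanes, bits->bytes
    print(f"PCIe {name} x16: ~{gb_per_s:.1f} GB/s")
# PCIe 3.0 x16: ~15.8 GB/s | 4.0: ~31.5 GB/s | 5.0: ~63.0 GB/s
```

For comparison, the NVLink bridge on Ampere-generation boards was rated at about 112 GB/s, so PCIe isn't quite in the same league yet, but it's no longer an order of magnitude behind.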
 
Joined
Sep 2, 2022
Messages
92 (0.11/day)
Location
Italy
My theory is what Jensen hinted at elsewhere: instead of NVLink, developers can do what wasn't effective until now and use the card bus for direct data transfers and inter-GPU communication. The PCIe bus has had truly massive amounts of available bandwidth since PCIe 4.0, more than any single GPU alone can saturate. So dropping two (or more) GPUs into a system and then connecting them in tandem in software is now a doable option.

Hardware SLI no longer needs to be a thing, as it can now be done in software through the existing card bus.
Well, if they can share their memory like that, it would be great. I guess we'll wait and see.
 