
NVIDIA Plans GeForce RTX 4060 Launch for Summer 2023, Performance Rivaling RTX 3070

Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The driver instability argument comes from people who have known AMD since before RDNA2.
To get to their stable driver today, AMD basically wiped the slate clean with RDNA, throwing everything that came before it under the bus. As you have noted, even with a clean slate, the first RDNA iteration was still a bumpy ride. Today the drivers are much better, but you can't fault people that were bitten in the past for still having a bitter taste in their mouths.
I'm not denying that there are those who simply parrot "driver instability" because they don't like AMD, but let's not pretend that's all there is to it.
I know AMD and ATi from the early 2000s, and never had any problem with drivers, except for the 5700 XT.

If people want to continue parroting "driver instability" instead of asking users and doing some research, that's their choice. I accept it, just don't agree with it. They should specify that they're too lazy to look into things before bringing up past issues that are totally irrelevant with today's products and misleading other people as a result.
 
Joined
May 19, 2009
Messages
1,868 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
The dollar is already ~1:1 with euro, good luck with that.

You mean it might be even more? Would not surprise me in the slightest. A 3080 still costs ~1k EUR in the Baltics on average.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
… at just $699 …

Hopefully Nvidia will lose a huge part of their customers, with this generation.
 
Joined
Apr 14, 2018
Messages
703 (0.29/day)
Yes. But is there any reason to upgrade every generation? I upgrade every second generation and I'm fine with the performance jump.

Everything depends on the competition/price/availability of the new AMD generation. No real competition -> high prices for consumers.

That’s a dangerous slope. Upgrade when you personally see fit, but using that as even the smallest justification for price increases being acceptable isn’t a good reason. Say we see this kind of price jump two generations back to back. We would end up with a 5080 being 2k+, regardless of who needs the performance from a “gaming” gpu.

I swap gpus a lot. Multiple times within a generation because it’s my hobby and I enjoy that within reason and a budget. The price of mid to high end hardware is vastly outstripping the value of the hobby in my opinion.

I strongly believe the 4000 series pricing is entirely related to moving back stock of 3000 series cards and the ridiculous greed exhibited by multiple parties during the Covid/crypto boom. It’s just comical to see gpu hardware getting a thumbs up in tech editorials with pricing the way it is.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
What income? I thought we were talking about gaming.
If you like to spend your time searching for driver errors, fine. No problem. I prefer to be paid for such work or be spared from it.

If you don't want to buy an AMD card again just because you got burned once with driver support for one application, that's fine. If you don't want to date again just because one of your exes snored, that's okay, too. You can live your whole life alone if you want to. You can also spend unreasonable amounts of money on an Nvidia Ada card, let me not stop you. ;).
nVidia cards are not only used for gaming; they dominate the creative sector. AMD's support for professional and semi-professional software was very bad. It has gotten somewhat better in recent years, but is still lousy compared to nVidia's.

Did you read my whole post? There are other Nvidia cards out there that are much better deals than the 4080 or 4090!
The 4090 is by far the best high end card on the market.
 
Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
That’s a dangerous slope. Upgrade when you personally see fit, but using that as even the smallest justification for price increases being acceptable isn’t a good reason. Say we see this kind of price jump two generations back to back. We would end up with a 5080 being 2k+, regardless of who needs the performance from a “gaming” gpu.

I swap gpus a lot. Multiple times within a generation because it’s my hobby and I enjoy that within reason and a budget. The price of mid to high end hardware is vastly outstripping the value of the hobby in my opinion.

I strongly believe the 4000 series pricing is entirely related to moving back stock of 3000 series cards and the ridiculous greed exhibited by multiple parties during the Covid/crypto boom. It’s just comical to see gpu hardware getting a thumbs up in tech editorials with pricing the way it is.
Well said.

If we get 30% more performance for 50% higher price this generation, and we're okay with that, then why shouldn't we get 20% more performance for 70% higher price in the next gen, and then 10% more performance for 90% higher price, and so on. Price-to-performance should be better than last gen, otherwise there is no improvement to talk about. Instead, price-to-performance in Ada is worse than last gen, yet some people are praising Nvidia purely for the performance crown, and I don't understand why.
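To illustrate the compounding argument with made-up numbers (purely hypothetical, not benchmark data), stacking "+X% performance for +Y% price" with Y > X erodes value every generation:

```python
# Hypothetical numbers only - not measured data. Shows how stacking
# "more performance at an even higher price" erodes value per dollar.
def perf_per_dollar(perf: float, price: float) -> float:
    return perf / price

base_perf, base_price = 100.0, 700.0                   # made-up baseline card
base_value = perf_per_dollar(base_perf, base_price)
gen1_value = perf_per_dollar(base_perf * 1.30,         # +30% performance...
                             base_price * 1.50)        # ...for +50% price
gen2_value = perf_per_dollar(base_perf * 1.30 * 1.20,  # +20% perf on top...
                             base_price * 1.50 * 1.70) # ...for +70% price

# Each generation delivers less performance per dollar than the last.
assert gen1_value < base_value
assert gen2_value < gen1_value
```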

If you like to spend your time searching for driver errors, fine. No problem. I prefer to be paid for such work or be spared from it.
What driver errors?

nVidia cards are not only used for gaming; they dominate the creative sector. AMD's support for professional and semi-professional software was very bad. It has gotten somewhat better in recent years, but is still lousy compared to nVidia's.
Fair enough - I don't know much about the creative sector.

The 4090 is by far the best high end card on the market.
Sure, but at what cost? Its price-to-performance ratio is worse than that of nearly every other graphics card on the market.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
That’s a dangerous slope. Upgrade when you personally see fit, but using that as even the smallest justification for price increases being acceptable isn’t a good reason. Say we see this kind of price jump two generations back to back. We would end up with a 5080 being 2k+, regardless of who needs the performance from a “gaming” gpu.
The price is formed by supply and demand. Not understanding this is a dangerous knowledge gap.

[...]I strongly believe the 4000 series pricing is entirely related to moving back stock of 3000 series cards and the ridiculous greed exhibited by multiple parties during the Covid/crypto boom. It’s just comical to see gpu hardware getting a thumbs up in tech editorials with pricing the way it is.
Yes, temporary oversupply of 3000 cards.

What driver errors?
Fair enough - I don't know much about the creative sector.
nVidia published a ray tracer, license-free; my preferred 3D software implemented it, and nVidia gave them support to code the interface. I had a Radeon at that time. RT with CUDA-core support is approximately 10-20x faster than without.
AMD did not publish anything comparable to CUDA-core support. Not even Blender or other widely popular apps got access to AMD's compute cores - only expensive plugins (>$200), with the side effect of only indirect support by AMD. I bought nVidia, and everything sped up smoothly. Over the last 10 years, the interface, render engine, and CUDA-core support got monthly to quarterly updates, all for free. Perfect.
Lately AMD has gotten better. I think there are some free interfaces now, based on OpenGL or DirectX, to support GPU-accelerated RT in Blender, e.g., as AMD joined the Blender development club some years ago.
Sure, but at what cost? Its price-to-performance ratio is worse than that of nearly every other graphics card on the market.
The 4090 has performance spikes of >100% in RT visualization and real-time preview of RT editing in complex scenery. The 3090 and 4090 are comparable to the RTX Titan, cards that have always cost $1500-2500. Nothing has changed, only the names and nVidia's propaganda to sell them as gaming GPUs. The 4080-4090 gap is weird. I have no idea why nVidia did that; it will damage its gamer base.
 

bug

Joined
May 22, 2015
Messages
13,847 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
You mean it might be even more? Would not surprise me in slightest. 3080 still cost ~1k EUR in Baltics on average.
Take the price in USD, add 20% (the average sales tax in the EU), and you may still be a little under the mark :(
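As a rough sketch of that arithmetic (assuming the ~1:1 USD/EUR rate mentioned earlier in the thread, and the 20% VAT figure from this post):

```python
# Rough EU street-price estimate: USD MSRP at ~1:1 EUR exchange plus ~20% VAT.
# Both the exchange rate and the VAT rate are the thread's own rough figures.
def eu_price(usd_msrp: float, vat: float = 0.20) -> float:
    return usd_msrp * (1.0 + vat)

print(eu_price(699))  # a $699 MSRP lands near EUR 839 before retail markup
```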

I know AMD and ATi from the early 2000s, and never had any problem with drivers, except for the 5700 XT.

If people want to continue parroting "driver instability" instead of asking users and doing some research, that's their choice. I accept it, just don't agree with it. They should specify that they're too lazy to look into things before bringing up past issues that are totally irrelevant with today's products and misleading other people as a result.
I've known ATI since their Rage3D days. Back then, there was even a 3rd-party driver that was better than the official one (that's ancient history, but still a hilarious anecdote). I knew them when Nvidia was running circles around their Linux driver. Don't think everything was always peachy just because you had a good ride.
 
Joined
Apr 14, 2018
Messages
703 (0.29/day)
The price is formed by supply and demand. Not understanding this is a dangerous knowledge gap.

The current market situation is a lot more unique than simply stating it's "supply and demand". I'd say especially so when Nvidia "planned" for the 4000 series to "co-exist" with the 3000-series cards. It sounds much more like artificial price control after they overproduced during the Covid/crypto boom.
 

bug

Joined
May 22, 2015
Messages
13,847 (3.95/day)
The current market situation is a lot more unique than simply stating it's "supply and demand". I'd say especially so when Nvidia "planned" for the 4000 series to "co-exist" with the 3000-series cards. It sounds much more like artificial price control after they overproduced during the Covid/crypto boom.
Do you have any proof of that? GPU dies are just getting bigger because of more shaders and cache. That's what drives the costs up. Fab capacity for leading nodes is now contested by a billion mobile devices that didn't exist 10 years ago.
Not even Intel, who is a newcomer, could use price to gain a competitive advantage.
 

sharafa

New Member
Joined
Nov 19, 2022
Messages
2 (0.00/day)
The price is formed by supply and demand. Not understanding this is a dangerous knowledge gap.

Yes, temporary oversupply of 3000 cards.


nVidia published a ray tracer, license-free; my preferred 3D software implemented it, and nVidia gave them support to code the interface. I had a Radeon at that time. RT with CUDA-core support is approximately 10-20x faster than without.
AMD did not publish anything comparable to CUDA-core support. Not even Blender or other widely popular apps got access to AMD's compute cores - only expensive plugins (>$200), with the side effect of only indirect support by AMD. I bought nVidia, and everything sped up smoothly. Over the last 10 years, the interface, render engine, and CUDA-core support got monthly to quarterly updates, all for free. Perfect.
Lately AMD has gotten better. I think there are some free interfaces now, based on OpenGL or DirectX, to support GPU-accelerated RT in Blender, e.g., as AMD joined the Blender development club some years ago.

The 4090 has performance spikes of >100% in RT visualization and real-time preview of RT editing in complex scenery. The 3090 and 4090 are comparable to the RTX Titan, cards that have always cost $1500-2500. Nothing has changed, only the names and nVidia's propaganda to sell them as gaming GPUs. The 4080-4090 gap is weird. I have no idea why nVidia did that; it will damage its gamer base.
A very good point from this professional point of view. As an AV worker mostly and a 3D artist sometimes, I couldn't agree more. Nvidia is dominating the market with CUDA. There is no alternative for us ATM.
I spent the last day exploring the other options with AMD hardware, and apart from Blender (which I don't use) there is nothing in 3D with AMD.
Two years ago we had ProRender in Cinema4D, but Maxon removed it. Now it is a CUDA monopoly. Octane and Redshift will support AMD in the future, but there is nothing yet.
I agree that the 4090 is a Titan-class card at this performance point, BUT:
1- no NVLink is very strange and disappointing for pros
2- DP 1.4 ... I cannot understand this on such a top-of-the-line product
3- too huge and power-hungry to put more than 2 in a computer case without a custom and very expensive water loop.

I was waiting for this generation to upgrade, and I am hugely disappointed. I think I will rent my cards from render farms for the next year; I don't want to pay for such a product line, even if I have the budget to buy 3.
From a gamer's perspective, the Ada line and Nvidia's policies are a disaster. If I had the choice, I would go AMD immediately.
 
Joined
Jun 30, 2008
Messages
267 (0.04/day)
Location
Sweden
System Name Shadow Warrior
Processor 7800x3d
Motherboard Gigabyte X670 Gaming X AX
Cooling Thermalright Peerless Assassin 120 SE ARGB White
Memory 64GB 6000Mhz cl30
Video Card(s) XFX 7900XT
Storage 8TB NVME + 4TB SSD + 3x12TB 5400rpm
Display(s) HP X34 Ultrawide 165hz
Case Fractal Design Define 7 (modded)
Audio Device(s) SMSL DL200 DAC / AKG 271 Studio / Moondrop Joker..
Power Supply Corsair hx1000i
Mouse Roccat Burst Pro
Keyboard Cherry Stream 3.0 SX-switches
VR HMD Quest 1 (OLED), Pico 4 128GB
Software Win11 x64
Quite a late release. This is likely so that they can sell overpriced GPUs to the masses, simply to make more money.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
If you like to spend your time searching for driver errors, fine. No problem. I prefer to be paid for such work or be spared from it.

What's your last experience with Radeon cards? Did you try an RDNA2-series card?
I had issues with RDNA for sure, but it seems the 6000 series was fine.

nVidia cards are not only used for gaming; they dominate the creative sector. AMD's support for professional and semi-professional software was very bad. It has gotten somewhat better in recent years, but is still lousy compared to nVidia's.
Yes, for professional work I would choose an Nvidia card because of their Studio Drivers, which are better than AMD's… but here the main focus seems to be gaming.
The 4090 is by far the best high end card on the market.
The 4090 is a ridiculous card…
Being the fastest doesn't automatically mean being the best.

Take the price in USD, add 20% (the average sales tax in the EU), and you may still be a little under the mark :(


I've known ATI since their Rage3D days. Back then, there was even a 3rd-party driver that was better than the official one (that's ancient history, but still a hilarious anecdote). I knew them when Nvidia was running circles around their Linux driver. Don't think everything was always peachy just because you had a good ride.
Comparing the Radeon 7000 series with ATi Rage3D makes no sense at all. We are speaking about totally different teams here. I would use RDNA 2 as a term for comparison…
 
Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
nVidia published a ray tracer, license-free; my preferred 3D software implemented it, and nVidia gave them support to code the interface. I had a Radeon at that time. RT with CUDA-core support is approximately 10-20x faster than without.
AMD did not publish anything comparable to CUDA-core support. Not even Blender or other widely popular apps got access to AMD's compute cores - only expensive plugins (>$200), with the side effect of only indirect support by AMD. I bought nVidia, and everything sped up smoothly. Over the last 10 years, the interface, render engine, and CUDA-core support got monthly to quarterly updates, all for free. Perfect.
Lately AMD has gotten better. I think there are some free interfaces now, based on OpenGL or DirectX, to support GPU-accelerated RT in Blender, e.g., as AMD joined the Blender development club some years ago.

The 4090 has performance spikes of >100% in RT visualization and real-time preview of RT editing in complex scenery. The 3090 and 4090 are comparable to the RTX Titan, cards that have always cost $1500-2500. Nothing has changed, only the names and nVidia's propaganda to sell them as gaming GPUs. The 4080-4090 gap is weird. I have no idea why nVidia did that; it will damage its gamer base.
It looks like you're talking from a completely creative user point of view, which I know nothing about, so I'll leave it at that. My point still stands for gaming, though.

I've known ATI since their Rage3D days. Back then, there was even a 3rd-party driver that was better than the official one (that's ancient history, but still a hilarious anecdote). I knew them when Nvidia was running circles around their Linux driver. Don't think everything was always peachy just because you had a good ride.
Do you mean Omega? I had those drivers for my 9600 XT. :)

I'm not saying that either company is perfect. What I'm saying is that I've had good and bad from both. The low point from AMD was driver support and heat issues with my 5700 XT. The low point from Nvidia was the 7800 GS AGP, which was loud and overheated, and when it did, its built-in speaker screamed so loud that it woke up the neighbours. You couldn't change its cooler, either, because it had a unique mounting hole arrangement that no aftermarket company built coolers for. It was an absolutely garbage card, totally not worth swapping my X800 XT for.

The other thing I'm saying is that you shouldn't exclude buying options just because you got burned in the past. Instead, you should visit forums like this one, and educate yourself.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
The current market situation is a lot more unique than simply stating it’s “supply and demand”. I’d say especially when Nvidia “planned” for the 4000 series to “co-exist” with the 3000 series cards. Sounds much more like artificial price control for them over producing during the Covid/crypto boom.
Oversupply of 3000-series cards is compensated by lowered 4000-series production, to stabilize the price and to get through the recession without high inventories. nVidia can't control the price; they can only control the supply of newly produced 4000-series cards. What consumers are willing to pay is their decision. The higher the price, the lower the sales, but the higher the income per unit. It is up to the management to optimize all this mathematically in order to achieve the optimum operating result under the current market conditions.
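The trade-off described here (higher price, fewer sales, more income per unit) is a standard revenue-maximization problem. A toy sketch, assuming a purely hypothetical linear demand curve with made-up numbers:

```python
# Toy model, not real market data: with a linear demand curve, a seller
# maximizes revenue (price x units sold), not units sold.
def units_sold(price: float, max_units: float = 1000.0, slope: float = 0.5) -> float:
    # Demand falls linearly as the price rises, bottoming out at zero.
    return max(0.0, max_units - slope * price)

def revenue(price: float) -> float:
    return price * units_sold(price)

# Brute-force search over integer prices; analytically the optimum for
# this curve is max_units / (2 * slope) = 1000.
best_price = max(range(0, 2001), key=revenue)

# Selling cheaper moves more units but earns less in total.
assert units_sold(700) > units_sold(best_price)
assert revenue(best_price) > revenue(700)
```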
 
Joined
Sep 6, 2013
Messages
3,430 (0.83/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
That just means that upgrading every 2-3 generations isn't necessary anymore. I mean, if we assume that you're happy with a 1060, then why would you look for upgrade options? It's pointless.
Obviously today we have cheap options that can cover the needs of the vast majority better than before. A cheap 6-core/12-thread Ryzen at $100 can cover the needs of many more people than a dual-core Pentium at the same price could 10 years ago. So upgrading is becoming less of a necessity for more people year after year. That's why performance is becoming more and more expensive year after year.

But while in CPUs we still keep seeing new models at $100 or lower that bring (much) more performance at those price segments, in GPUs we don't. While in CPUs the top mainstream models maintain a price that is clearly under $1000, in GPUs we are constantly hitting over $1500. And as I pointed out before, even if we add to the CPU the cost of RAM, motherboard, and cooling, the price of that combination will still be lower than that of an equivalent graphics card. By equivalent I mean low end vs low end (an RTX 3050/3060 vs an i3-12100, for example, or a Ryzen 5 5600G), mid range vs mid range, high end vs high end.
If things continue in that direction, in 5 years the cheapest new card will start at $500, while the most expensive one will have an official MSRP of over $2000. That's a price where (semi-)pro cards were sitting before, not gaming cards.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
I had issues with RDNA for sure, but it seems the 6000 series was fine.
I've had:
Geforce 2 MX400
Geforce 4 MX440
Geforce FX 5700 Ultra
Geforce 6800 GS
Geforce RTX 3060 Ti

Radeon HD4870
Radeon HD7870
Radeon HD7970
Radeon RX 580
Radeon RX 6800 XT

Never had any driver problems. In 20+ years of gaming.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
A very good point from this professional point of view. As an AV worker mostly and a 3D artist sometimes, I couldn't agree more. Nvidia is dominating the market with CUDA. There is no alternative for us ATM.
I spent the last day exploring the other options with AMD hardware, and apart from Blender (which I don't use) there is nothing in 3D with AMD.
Two years ago we had ProRender in Cinema4D, but Maxon removed it. Now it is a CUDA monopoly. Octane and Redshift will support AMD in the future, but there is nothing yet.
I agree that the 4090 is a Titan-class card at this performance point, BUT:
1- no NVLink is very strange and disappointing for pros
2- DP 1.4 ... I cannot understand this on such a top-of-the-line product
3- too huge and power-hungry to put more than 2 in a computer case without a custom and very expensive water loop.

I was waiting for this generation to upgrade, and I am hugely disappointed. I think I will rent my cards from render farms for the next year; I don't want to pay for such a product line, even if I have the budget to buy 3.
From a gamer's perspective, the Ada line and Nvidia's policies are a disaster. If I had the choice, I would go AMD immediately.
Hello. I am using Iray from nVidia for rendering. There is an Octane plugin for around $150-200 for my app, but the plugin is developed by a small independent team, updates are sometimes late, and there is always the risk that the software will be abandoned. When I started with RT there was no Octane plugin. Changing the render engine means changing all the surface shaders and all the lighting. That's a huge amount of work. So I am using nVidia, because it works and is reliable. I would have no problem buying AMD if they provided the same support.
I will wait some time until the 4090 is available around MSRP, and after installing it, I want to play a bit with Unreal Engine 5 to test the new possibilities. Game render engines are much faster than the usual offline render engines, and they now have a lot to offer. I'm just doing this for fun. I used to play a lot; now I prefer playing with the possibilities of game engines over specific games - doing some art, modding, such things.
On the 4090: you are right, it seems nVidia wants to stop the 4090 from cannibalizing the Quadro series by cutting memory pooling/NVLink. There are ways to use two 4090s - undervolting would be necessary, but not every creative has such knowledge. For $2000 you can get two 3090s: 48 GB of VRAM, NVLink included. I don't need that much :)
 

bug

Joined
May 22, 2015
Messages
13,847 (3.95/day)
Do you mean Omega? I had those drivers for my 9600 XT. :)
No, I don't mean Omega (I miss those) - those were just tweaked official drivers. I mean that back in the Rage days, someone wrote a driver from scratch, without any official documentation, and the driver was better than ATI's. Sure, video cards were much, much simpler back then.
 

sharafa

New Member
Joined
Nov 19, 2022
Messages
2 (0.00/day)
On the 4090: you are right, it seems nVidia wants to stop the 4090 from cannibalizing the Quadro series by cutting memory pooling/NVLink. There are ways to use two 4090s - undervolting would be necessary, but not every creative has such knowledge. For $2000 you can get two 3090s: 48 GB of VRAM, NVLink included. I don't need that much :)
This is the way I will follow if the price of the 3090 finally begins to drop to a reasonable level. Even 2x or 3x 3070 is not a bad bargain with the local "out of core" memory from some renderers (but it is a no-go for render farming).
My priority is reliability, then cost, with performance last.
Octane is an awesome renderer btw :)
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
This is the way I will follow if the price of the 3090 finally begins to drop to a reasonable level. Even 2x or 3x 3070 is not a bad bargain with the local "out of core" memory from some renderers (but it is a no-go for render farming).
My priority is reliability, then cost, with performance last.
Octane is an awesome renderer btw :)
Yes, I saw impressive results with Octane. If I ever have more time again, I'll try other renderers/Octane. But I have to switch to a professional 3D suite first to avoid 3rd-party plugins. I can't promise anything. :) Hardware-wise, I am still using a 2070; it's about time to upgrade. A 4090 for MSRP, or a used 3090. I'm not in a hurry; it will depend on opportunities.

About 4080 pricing: the card seems to be available around MSRP, but it is selling badly. nVidia can lower the price or lose market share to AMD. Scalpers neutralized due to high price and low demand :laugh:

A price development comparison from https://wccftech.com/amd-radeon-nvi...-significantly-improve-gpu-availability-2022/.
I like it, as it shows that the Titan-tier segment is priced normally, while the gaming/enthusiast tier got hit by the recent pricing.
[Attached image: nvidia pricing.jpg]
 
Joined
Jun 10, 2014
Messages
2,999 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
This product line is an unmitigated disaster.
What unmitigated disaster?
The RTX 4080 and RTX 4090 have proven to be great performers; the RTX 4080 performed better than the "leaks" were projecting. The pricing can change if AMD offers some serious competition.

As for the rest of the product lineup, we actually don't know yet. But let's use any opportunity to bash Nvidia prematurely anyway!

But what if the shoe was on the other foot?
Does anyone remember the Radeon VII? It performed midway between the RTX 2070 and RTX 2080 while costing as much as an RTX 2080. People were making excuses for that one, even though there were better alternatives.

I'm trying to figure this out. The 4070 Ti (what was the 4080 12GB) is, based on the DLSS 3.0 graphs floating around out there, a little bit behind the 3090 Ti. It would be safe to say the 4070 Ti should perform at the 3090's level.
4070Ti = 3090
3090 = 10% faster than 3080 10GB
3080 10GB is about 20% faster than the 3070.

If the 4060 is to "rival" the 3070, but there's no mention of being at a 3080 level, I'd venture to guess the 4060 should be around 5 to maybe 10% faster.

Does that mean the 4070 is only going to match the 3080 10GB? I'm guessing a $599 price on the 4070.

Speculating, the mid-range for Nvidia doesn't appear to be very impressive.
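That chain of estimates can be sketched as simple relative-performance arithmetic. To be clear, every ratio below is the speculation from this thread, not benchmark data:

```python
# Rough relative-performance chain from the estimates above
# (all ratios are speculative forum guesses, not measured benchmarks).
perf_3070 = 1.00                # baseline
perf_3080 = perf_3070 * 1.20    # 3080 10GB ~20% faster than 3070
perf_3090 = perf_3080 * 1.10    # 3090 ~10% faster than 3080 10GB
perf_4070ti = perf_3090         # assume 4070 Ti lands at 3090 level

# If the 4060 "rivals" the 3070 at ~5-10% faster:
perf_4060_low = perf_3070 * 1.05
perf_4060_high = perf_3070 * 1.10

print(f"4070 Ti vs 3070: +{(perf_4070ti / perf_3070 - 1) * 100:.0f}%")  # +32%
print(f"4060 vs 3070: +5% to +10%")
```

So under these assumptions the 4060 would sit well below the 4070 Ti, which is what makes the rumored mid-range look unimpressive.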
If the actual targeted launch is next summer, then neither pricing, TDP nor clock speed would be set at this point, meaning this leak is either partially or completely nonsense.

Also, the RTX 3070 isn't 20% faster than the RTX 3060 (as the source claims); it's more like 45-50% at 1440p. So if the RTX 4060 turns out to be ~50% faster than the RTX 3060, that's not a bad improvement.

Historically, the 60-class cards have been priced well and offered good value. I think Nvidia would regret it if they didn't do the same this time too.

But for all those wondering whether to buy or wait: if something well priced shows up, like a cheap RTX 3070, then by all means buy it :)
 
Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
What unmitigated disaster?
The RTX 4080 and RTX 4090 have proven to be great performers; the RTX 4080 performed better than the "leaks" were projecting. The pricing can change if AMD offers some serious competition.

As for the rest of the product lineup, we actually don't know yet. But let's use any opportunity to bash Nvidia prematurely anyway!

But what if the shoe was on the other foot?
Does anyone remember the Radeon VII? It performed midway between the RTX 2070 and RTX 2080 while costing as much as an RTX 2080. People were making excuses for that one, even though there were better alternatives.


If the actual targeted launch is next summer, then neither pricing, TDP nor clock speed would be set at this point, meaning this leak is either partially or completely nonsense.

Also, the RTX 3070 isn't 20% faster than the RTX 3060 (as the source claims); it's more like 45-50% at 1440p. So if the RTX 4060 turns out to be ~50% faster than the RTX 3060, that's not a bad improvement.

Historically, the 60-class cards have been priced well and offered good value. I think Nvidia would regret it if they didn't do the same this time too.

But for all those wondering whether to buy or wait: if something well priced shows up, like a cheap RTX 3070, then by all means buy it :)
The products themselves aren't the disaster. Their pricing is. And so is the insult of being told that it's okay.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
This is an insult :banghead:

RTX 3080: 700$
RTX 4080: 1200$


1668964117215.png


This is a new product tier; the RTX 4080 doesn't belong here, but rather in the Ultra Enthusiast and Titan tiers.
1668964285171.png


1668964330960.png
 