
NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

Joined
Sep 6, 2013
Messages
3,306 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB (16GB G.Skill Ripjaws 3600 + 16GB G.Skill Aegis 3200) / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA storage / NVMe boot (Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Maybe FSR 3.0 works. I mean, without FSR 3.0, the RTX 4070 still enjoys a nice advantage because of Frame Generation. Even if someone doesn't care about RT performance, CUDA, power consumption or whatever, FG, no matter how we see it, does give the RTX 4070 a very nice performance advantage over Radeon cards and even RTX 3000 cards, at least on paper. But IF FSR 3.0 works, then there is no FG advantage. The RTX 4060 and RTX 4070 lose an advantage against Radeons and RTX 3000 cards; in fact, probably the main advantage Nvidia was pushing for the RTX 4000 series in games.
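To make the "at least on paper" part concrete: frame generation raises the displayed frame rate by inserting synthesized frames, but input is still only sampled on rendered frames, so responsiveness doesn't improve. Here's a minimal Python sketch of that frame accounting; real DLSS 3 / FSR 3 frame generation uses motion vectors and optical flow, not the naive blending assumed here:

```python
import numpy as np

def generate_frames(rendered):
    """Naive frame generation: insert a blended frame between each pair
    of rendered frames. This only illustrates the frame accounting;
    real FG synthesizes the in-between frame from motion data."""
    shown = []
    for prev, nxt in zip(rendered, rendered[1:]):
        shown.append(prev)
        # Synthetic in-between frame: no new game state is sampled here,
        # which is why input latency does not improve.
        shown.append(((prev.astype(np.float32) + nxt) / 2).astype(np.uint8))
    shown.append(rendered[-1])
    return shown

# 60 rendered 1080p frames in, 119 displayed frames out.
rendered = [np.full((1080, 1920, 3), i, dtype=np.uint8) for i in range(60)]
print(len(rendered), "rendered ->", len(generate_frames(rendered)), "displayed")
```

Displayed FPS nearly doubles, which is exactly the on-paper advantage being argued about.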
 
Joined
Sep 10, 2018
Messages
6,823 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage Lots of SSDs.
Display(s) A whole bunch: OLED, VA, IPS...
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
FG and DLAA gimmicks should be free. They insist on pushing them; that's not our problem.

100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
 
Joined
Mar 10, 2010
Messages
11,878 (2.22/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Suprim, GPU full cover, all EK
Memory Corsair Vengeance RGB Pro 3600 CL14, 16GB in four sticks / 16GB / 16GB
Video Card(s) PowerColor RX 7900 XT Reference / RTX 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung 28" U28E850R 4K FreeSync / a crappy Dell
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Creative X-Fi 7.1 onboard, Yamaha DTS AV setup, Corsair Void Pro headset
Power Supply Corsair HX1200i / Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 Vega 3DMark Time Spy / laptop Time Spy 6506
100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. The same is true of AMD, to be fair, though with less of a hard sell; I mean, TressFX hair, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded way before you feel you've got your worth out of the GPU you bought for it.
And I have, many times.
 
Joined
Sep 10, 2018
Messages
6,823 (3.04/day)
Location
California
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. The same is true of AMD, to be fair, though with less of a hard sell; I mean, TressFX hair, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded way before you feel you've got your worth out of the GPU you bought for it.
And I have, many times.

I like DLSS enough that an AMD alternative would have to be clearly better to win me over. Other than that, I don't care much about any other Nvidia feature.
 
Joined
Mar 10, 2010
Messages
11,878 (2.22/day)
Location
Manchester uk
I like DLSS enough that an AMD alternative would have to be clearly better to win me over. Other than that, I don't care much about any other Nvidia feature.
Yeah, I didn't.
DLSS 3 FG might be great, but I bought into 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
But it doesn't suit most games I play; or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080 spin (DiRT Rally 2.0 and others) does with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable, high FPS and accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none of them are. And before anyone says it: Reflex plus native is again faster than Reflex plus DLSS.
It's all personal taste really, so I'm not arguing for my way; I'm again saying YdY.
 
Joined
Feb 18, 2013
Messages
2,181 (0.51/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Windows 11 Professional x64 (Update 23H2)
"could" is a very loose term. I say let all of NoVideo's AIB partners suffer a little more and watch the stocks rot away when it's still more expensive than a 7800XT. (yes, AMD also gets the same treatment too.)
 
Joined
Sep 10, 2018
Messages
6,823 (3.04/day)
Location
California
Yeah, I didn't.
DLSS 3 FG might be great, but I bought into 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
But it doesn't suit most games I play; or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080 spin (DiRT Rally 2.0 and others) does with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable, high FPS and accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none of them are. And before anyone says it: Reflex plus native is again faster than Reflex plus DLSS.
It's all personal taste really, so I'm not arguing for my way; I'm again saying YdY.

Me neither. For sure, everyone should always do what's best for their hobby and grab the card that fits their use case best.
 
Joined
Oct 17, 2022
Messages
62 (0.08/day)
Maybe FSR 3.0 works. I mean, without FSR 3.0, the RTX 4070 still enjoys a nice advantage because of Frame Generation. Even if someone doesn't care about RT performance, CUDA, power consumption or whatever, FG, no matter how we see it, does give the RTX 4070 a very nice performance advantage over Radeon cards and even RTX 3000 cards, at least on paper. But IF FSR 3.0 works, then there is no FG advantage. The RTX 4060 and RTX 4070 lose an advantage against Radeons and RTX 3000 cards; in fact, probably the main advantage Nvidia was pushing for the RTX 4000 series in games.

Agree 100%.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.69/day)
Location
Ex-usa | slava the trolls
Dream on..

At least one sale down. You have to look at this talk from the perspective of a negotiation between a customer and a supplier.
You don't tell your customer "dream on", because there are other suppliers and you will lose sales :D

People are not happy at all, if you ask me.


 
Joined
Jun 21, 2021
Messages
3,092 (2.52/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. The same is true of AMD, to be fair, though with less of a hard sell; I mean, TressFX hair, say no more.
A feature has scarcely ever been worth the effort, because it'll be superseded way before you feel you've got your worth out of the GPU you bought for it.
And I have, many times.
All computer graphics are fakery. They're just mathematical tricks to get you to think something is realistically portrayed.

Better results generally come from more sophisticated formulas. That means increased computational requirements.

Person A: "Hey, there's a new shading model called Gouraud. It looks better than flat shading."
Person B: "Why do I need that? I'm happy with flat shading."



Years later.

Person A: "Hey, there's an even better shading model called Phong. It's more realistic than Gouraud."
Person B: "Nah, I upgraded to Gouraud a couple of years ago. I'm fine with that."

A few more years pass.

Person A: "A mathematician by the name of Jim Blinn has altered Phong shading."
Person B: "I'll check it out. How do you spell his name?"
Person A: "B-L-I-N-N"

DLSS, upscaling, frame generation, ray reconstruction: all part of the evolution of computer graphics.

There's a reason why we don't see flat shading in computer games anymore.
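For anyone curious what Blinn actually changed: here's a minimal NumPy sketch of the Phong and Blinn-Phong specular terms. This is illustrative only; the sample vectors and shininess value are made up, and all directions are assumed to be unit length and pointing away from the surface:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong_specular(n, l, v, shininess=32.0):
    # Phong: reflect the light direction about the surface normal,
    # then compare the reflection against the view direction.
    r = 2.0 * np.dot(n, l) * n - l
    return max(np.dot(r, v), 0.0) ** shininess

def blinn_phong_specular(n, l, v, shininess=32.0):
    # Blinn's alteration: compare the normal against the halfway
    # vector between light and view directions. It avoids computing
    # the reflection and gives a smoother, wider highlight falloff.
    h = normalize(l + v)
    return max(np.dot(n, h), 0.0) ** shininess

n = np.array([0.0, 1.0, 0.0])              # surface normal
l = normalize(np.array([1.0, 1.0, 0.0]))   # direction to light
v = normalize(np.array([0.0, 1.0, 1.0]))   # direction to viewer
print(phong_specular(n, l, v))        # ~2e-10: highlight has died off
print(blinn_phong_specular(n, l, v))  # ~0.009: still visible, broader falloff
```

Same inputs, different formula, visibly different highlight: a small example of "better results from more sophisticated formulas."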

Yes, there might not be a use case for you today for DLSS 3 Frame Generation or DLSS 3.5 Ray Reconstruction. But someday there probably will be, for a use case (e.g., a game title) that you care about. The problem is you just don't know when that will happen.

DLSS 1.0 was not embraced at launch. Now all three GPU manufacturers (Nvidia, AMD, Intel) offer upscaling as a tool for developers to tap into, often to great effect. Many now consider DLSS 2.0 to have superior results to conventional TAA.

For sure the technology is improving, often in software. And it's not just about who has better/more transistors. A lot of these implementations are heavily influenced by the quality of the developer tools used to harness this technology.
 
Joined
Jan 14, 2019
Messages
12,169 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
And that's exactly how every feature Nvidia sold me a card for has played out, Physx, DLSs1 /2 , same thing to be fair with AMD but less intense on the big sell, just I mean hair Fx, I say no more.
But a feature, ,, scarcely ever been worth the effort because it'll be superceded way before you feel you got your worth out of the GPU you Bought for it.
And I have, many times.
That's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:
 
Joined
Jun 21, 2021
Messages
3,092 (2.52/day)
That's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:

There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at the mathematical calculations behind 3D graphics. Many tasks and functions formerly handled by the CPU are now handled by specialized ASICs.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed at which the images are generated. Imagine playing a game with that rendering speed.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

When you watch YouTube on your fancy computer, it is using a decoder block on your graphics card, not the fancy Ryzen 7800X3D CPU, at a fraction of the power (and thus heat). If you forced software decoding on the CPU, you'd see a big power spike and complain about the CPU fan being too loud.

At some point, someone came up with algorithms for ray tracing. Originally this was done in software on CPUs; it took forever and was useless for real-time graphics, so it was reserved for still images or a short film (if you had the budget and time), like the early Pixar shorts.

At some point, someone said, "hey, let's build a circuit that will handle these calculations more efficiently." Today, we have smartphone SoCs with ray-tracing cores.

Someday in the not-too-distant future, we'll have some other form of differentiated silicon. MPEG-2 encoders used to be custom ASICs; today they handle a wide variety of encoding schemes, the latest being AV1. Someday something else will succeed AV1 as the next generation. Performance will suck on today's encoding architecture and will be better with silicon specially modified to speed things up.

A graphics card purchase is a singular event in time, but a use case may pop up next month that wasn't on the radar last month. We saw this with the crypto mining craze. We also found out what happens when crypto mining policies change, leaving a bunch of cards utterly useless.

I remember buying a Sapphire Pulse Radeon RX 580 new for $180 (down from the original launch MSRP of $230). Six months later, during the height of the mining craze, that card was going for 3x what I paid for it.
 
Joined
Jan 14, 2019
Messages
12,169 (5.75/day)
Location
Midlands, UK
The reason we have GPUs in the first place is that CPU cores aren't efficient at the mathematical calculations behind 3D graphics. Many tasks and functions formerly handled by the CPU are now handled by specialized ASICs.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.
DirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
 
Joined
Jun 21, 2021
Messages
3,092 (2.52/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
DirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
Not sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was the proprietary IRIS GL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there", and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a now-defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world through their near-monopolistic practices and stranglehold on the PC industry (remember that antitrust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall by the wayside, overtaken by more modern graphics technology developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within one company's products there isn't full compliance. As an AMD guy you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

And designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
 
Joined
Jan 14, 2019
Messages
12,169 (5.75/day)
Location
Midlands, UK
Not sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was the proprietary IRIS GL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there", and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a now-defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world through their near-monopolistic practices and stranglehold on the PC industry (remember that antitrust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall by the wayside, overtaken by more modern graphics technology developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within one company's products there isn't full compliance. As an AMD guy you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

And designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
Sure, things don't always (or rather, usually don't) start out universal, but there's some sort of standardisation along the way. Power connectors, car safety standards: lots of things have been put into law, or at least into some sort of general agreement. Companies have their own stuff, which gets standardised, dies out, or takes the Apple route (a closed ecosystem with a solid fanbase) over time. I'm all for standardisation and all against the Apple approach (which Nvidia seems to be following lately). I like choice when I'm buying something, and I don't want to be forced to buy Nvidia because of DLSS, or AMD because of whatever.

I'm not an AMD guy. Only my main gaming rig is fully AMD at the moment; I've also got two HTPCs that are both Intel + Nvidia, and lots of various parts lying around from every manufacturer. I generally prefer AMD's open-source approach to new technologies, but that doesn't mean I restrict my choices to one brand (although prices seem to be doing that for me anyway).

I hope that makes sense. :)
 
Joined
Sep 10, 2018
Messages
6,823 (3.04/day)
Location
California
Yep, $449 would be closer to the mark for a 12GB 3060 replacement.

We'd need a substantially more competitive market for that; the 7800 XT would likely have needed to launch at the same time as the 4070, at $399, and even then I doubt Nvidia would price that low.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,794 (1.65/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Thermalright Phantom Spirit SE
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage Nextorage NE1N 2TB, ADATA SX8200 Pro NVMe 512GB, Intel 545s 500GB SSD, ADATA SU800 SSD, 3TB spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard SteelSeries Apex 7
Software Windows 11 +startisallback
Call me when it gets to $329.
 
Joined
Jul 5, 2013
Messages
27,347 (6.61/day)
No, even at $499 the 4070 is a bad deal.
Your opinion. Clearly, not everyone agrees.

The ONLY things the 4070 does better than the RX 7800 XT are power usage (it's gonna be used in stationary desktops, so who really cares anyway?) and ray tracing. And that's it.
And there you go. Those two things, and a few others you left out, are very good reasons to go with a 4070.

I'm not saying the 7800 XT isn't a great card, because it is. I'm saying that it depends on what the user needs and wants out of their gaming experience.

Call me when it gets to $329.
Wait 2 years and buy it used.

People are not happy at all, if you ask me.
Those are whiners doing what they do best. The rest of us live in the real world.

Better results generally come from more sophisticated formulas. That means increased computational requirements.
Not always. Frequently, methods are developed to do the same work in better, more efficient ways.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.69/day)
Location
Ex-usa | slava the trolls
There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at the mathematical calculations behind 3D graphics. Many tasks and functions formerly handled by the CPU are now handled by specialized ASICs.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed at which the images are generated. Imagine playing a game with that rendering speed.

Short answer: GPUs have far more floating-point execution units than CPUs. Long answer: a GPU is designed for highly parallel, low-precision floating-point computation, such as graphics rendering.
GPUs (graphics processing units) are optimized for parallel processing, which allows them to perform many calculations at once. This is in contrast to CPUs (central processing units), which are typically optimized for sequential processing. Because of this, GPUs can perform many more floating-point operations per second (FLOPS) than CPUs. Additionally, GPUs have specialized hardware, such as thousands of small cores and wide SIMD units, optimized for the kinds of calculations common in graphics, such as matrix operations. This further increases their FLOPS.

GPU computing is faster than CPU computing for these workloads because GPUs have thousands of processing cores, while CPUs have comparatively few.
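The core-count point can be sanity-checked with spec-sheet arithmetic. A back-of-the-envelope Python sketch: the shader counts and boost clocks are public figures for the RTX 4070 and the Ryzen 7 7800X3D discussed in this thread, and the per-cycle FLOP counts assume one FP32 FMA per GPU shader and two 256-bit FMA units per Zen 4 core, ignoring memory bandwidth and real-world utilization:

```python
def peak_fp32_tflops(units, boost_ghz, flops_per_unit_cycle):
    # Peak throughput = units x clock x FLOPs issued per unit per cycle.
    return units * boost_ghz * flops_per_unit_cycle / 1000.0

# RTX 4070: 5888 shaders, ~2.48 GHz boost, 1 FMA = 2 FLOPs per cycle.
print(f"RTX 4070: ~{peak_fp32_tflops(5888, 2.48, 2):.1f} TFLOPS")

# Ryzen 7 7800X3D: 8 cores, ~5.0 GHz boost; two 256-bit FMA units per
# core = 16 FP32 lanes x 2 FLOPs = 32 FLOPs per core per cycle.
print(f"7800X3D:  ~{peak_fp32_tflops(8, 5.0, 32):.1f} TFLOPS")
```

That works out to roughly 29 TFLOPS versus about 1.3 TFLOPS: a 20x-plus gap in raw FP32 throughput, which is the whole argument in two lines of arithmetic.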

Those are whiners doing what they do best. The rest of us live in the real world.

Actually it is the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the state of the world right now.
It's called stagflation, the worst kind, if you still remember it.
 
Joined
Jul 5, 2013
Messages
27,347 (6.61/day)
Actually it is the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the state of the world right now.
It's called stagflation, the worst kind, if you still remember it.
There's a fanciful twist. Reality is accepting things the way they really are. Fantasy is wishing for what you want them to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as a context. Thus, whiners be whining.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.69/day)
Location
Ex-usa | slava the trolls
There's a fanciful twist. Reality is accepting things the way they really are. Fantasy is wishing for what you want them to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as a context. Thus, whiners be whining.

No, this is simply a capitalist, Nvidia in this case, abusing its monopolistic position in the market.
The solution is to vote with your wallet and not buy (like the "whiners" actually do), which has already resulted in all-time-low graphics card shipments; if this trend continues, Nvidia will be forced to exit the graphics card market.
 