
[Speculation] A curiously stupid crossroad

Joined
Jan 2, 2024
Messages
754 (1.82/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
I'm not sure how to preface this, other than that my experience between bargain and enthusiast/flagship cards tells me nothing is safe anymore.

Since the Super 7 era I've been an ATI guy, and the Rage XL/Mach64 was kind of the default on everything. You'll still find these chips on servers with a VGA out.
Great for movies and content, but gaming suffered. Fair 2D performance on early Direct3D hardware. I solved my low-FPS issues by switching to a Rage Magnum/Xpert 2000.
No idea if it was the bandwidth, the insane jump from 8MB SDR -> 32MB DDR, or whatever, but it worked. That experience solidified what I look for in upgrades.

Feature level lockout was never an issue or concern until I switched to a Radeon 9200 to hasten a Pentium 4 build. I eventually swapped in an AH3450 to deal with it.
Things were fine until my main system of the time, which featured a similar HD3300 IGP, lost integrity. I changed boards and switched up to an HD6570.
This was around the era when the nuisance of Bitcoin mining started to boom, and it's also where I finally made the jump to PCI-E cards. Note the features.

[Attached screenshot: HD 6570 feature/spec readout]


The feature level is DX11_0 and that was the selling point. Reasonable performance, but games would bog down hard whenever there was a bunch of stuff on screen.
I'm talking massive sprite counts of recruits in Kingdoms, or a pile of enemies/effects in Vampire Survivors, HoloCure and anything high-particle or high-poly.
This was also the first point where I started experiencing massive audio-driven STUTTERS in desktop-mode VR applications once they started chugging and swapping video memory.
Effects like cameras and mirrors would also cause hard locks when looked at too quickly. Just all-around troublesome at the turn of the current era.

So I picked up an RX 580 Red Devil. Took a few months to get one during all the ETH mining noise.

[Attached screenshot: RX 580 feature/spec readout]


All the missing features I cared about, unlocked. All the render issues, low-FPS audio stutters and everything else vanished. I finally ran games at 1080p144.
I was able to encode with this card too. Rendering jobs finally worked right, kind of unreasonably well. Vulkan stuff is still lethal but UnityVR handles great.
This is a pressing point for me because gaming performance is falling behind again and content creation suffers badly. I can usually play at 1080p60 but only encode 720p60 or 1080p30.
Neither option looks good, and at some point I need to guarantee 1080p60. That is never going to happen with this card or any nearby generation in AVC or HEVC.
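
If anyone wants to sanity-check that claim on their own card, here's a minimal sketch, assuming an ffmpeg build with AMD AMF support on PATH (h264_amf/hevc_amf are the stock ffmpeg AMF encoder names). It encodes a synthetic 1080p60 test pattern and reads back ffmpeg's reported speed multiplier; anything under 1.0x can't keep up with live 1080p60:

Code:
import re
import subprocess

def encode_speed(encoder: str, seconds: int = 30) -> float:
    """Encode a synthetic 1080p60 test pattern, return ffmpeg's speed multiplier."""
    cmd = [
        "ffmpeg", "-y",
        "-f", "lavfi", "-i", "testsrc2=size=1920x1080:rate=60",
        "-t", str(seconds),
        "-c:v", encoder, "-b:v", "8M",
        "-f", "null", "-",  # discard output; only throughput matters
    ]
    err = subprocess.run(cmd, capture_output=True, text=True).stderr
    hits = re.findall(r"speed=\s*([\d.]+)x", err)  # ffmpeg prints e.g. "speed=1.37x"
    return float(hits[-1]) if hits else 0.0

for enc in ("h264_amf", "hevc_amf"):
    s = encode_speed(enc)
    print(f"{enc}: {s:.2f}x realtime ->", "can sustain live 1080p60" if s >= 1.0 else "too slow")

Keep in mind testsrc2 is trivially compressible, so a pass here is a best-case ceiling; real gameplay captures land lower.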

Which brings me to the 7900XT:

[Attached screenshot: RX 7900 XT feature/spec readout]


On the surface it's not great. The feature level is DX12_2, which means it runs anything. OpenCL gets an update and there's a dedicated AV1 encoder, but that's about it.
AI toolkits work, there's a new generation of GDDR memory, and more than double the VRAM of the card I keep falling back to because I can't get my hands on one that works.
There's a 5-year gap between these cards. I've been trying to get a Hellhound since October, and since January it's been back and forth. 8 months in and I can't make it happen.
Yesterday a very stupid opportunity appeared, due to what seems to be tons of cards getting dumped right back onto the market: highly binned 5700XT and 6900XT cards.

I'm not familiar with how either one performs. I'm under the impression that each model is well above my 580 for gaming performance but not much else.
Either way I'd lose AV1 encode, which is the selling point of the 7000 series, along with minor improvements to AVC/HEVC. The odds and ends are probably AI-related and I don't care.

What appears to be the reasonable solution here? There is a tech principle where you stop work or delay a project because newer technologies will have greater impact.
AMD is in the middle of that with RDNA4 and is pouring massive time and resources into RDNA5. I'm not sure I expect better options from either one.
I'm already finding situations where older decode formats get cut in favor of H.264, and that's fine, but everything I've witnessed about GPU improvement is centered on antiques.
There's some possibility the next gen gives us massive high-end AVIF/AV2 support that isn't immediately useful, then minimal or no improvement where it's currently needed.

If you wonder why I favor the 7900XT for 1080p over others: I have jobs that call for that level of performance, plus I don't care to revisit this issue for another 8 years.
And with platform promises floating around about better encoder support on the table, it makes sense that there's importance in NOT missing that milestone.
It seems to be a very difficult spot to be a creator with this kind of hardware or these requirements. That's what I take from my own situation, and it doesn't look better for nVidia.
Two hours ago I watched one of my favorite streamers melt down for about an hour because a horribly unoptimized game tanks the frames and locks up her computer,
all while struggling to get something else to work after the fact. Others with a 4070 Ti and higher accelerators noted similar issues when she looked it up.
Like, there just doesn't seem to be any reasonable expectation of being able to do anything. If any of this makes sense to you guys, thanks for taking the time to read through it all.
Is there anything that fits? Did I miss something important? Am I in the wrong part of the GPU market?
 
Joined
Jun 24, 2017
Messages
192 (0.07/day)
What appears to be the reasonable solution here? There is a tech principle where you stop work or delay a project because newer technologies will have greater impact.
Yes.
And another one that states: No product can have all the features clients demand. Example: motherboard market segmentation. Smartphones.
And another: We (the marketing department) must have control of the obsolescence of the product. Example: Windows versions vs. feature implementations.
And another: Release first, fix later. Most "AAA" games, and not-so-"AAA" ones too.
And: The sum of the accessories must be more expensive than the product itself. Any workstation.
And: When you want to justify something, associate it with SECURITY or ECOLOGY. Any piece of software as a service.
And so on.
Business schools, man! The sin of our century :D

@ topic:
Old GPUs (Matrox, Number Nine, the first GeForces, Radeons, etc.) had lots of problems, incompatibilities, driver issues and individual patches. DirectX used to explode along with Windows until DX9 + 30 patches came in. Then the PC era matured and most stuff worked rather fine.

It's not only the GPU market. Game creators, and software creators in general, are under heavy pressure nowadays, so they are forced to release unfinished, unoptimized, untested products. I feel the cycle goes like this:
RELEASE = Minimum Viable Product + Marketing Budget
PATCHES = if(success and low_cost_maintenance)
DLC = if(success and low_cost_development)
Because you either want to fire people (give them the opportunity to find new goals) or move the workforce onto the next MVP.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
43,690 (6.78/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The jump from an 8500 to a 9500/9700 was huge
 
Joined
Nov 27, 2023
Messages
2,862 (6.36/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
I am honestly not sure what the question is. Is it worth it to buy whatever is currently the latest? Or will the next cards bring new features? The answer is usually yes to both, but that depends on one's circumstances. A lot of professionals don't even look at AMD cards, for example, since CUDA is an absolute must for quite a few workloads, and in something like Blender AMD is still pathetic in many ways.
 
Joined
Sep 3, 2019
Messages
3,849 (1.93/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 85C temp limit, CO -8~14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MT/s 1.38V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.3037), upgraded from Win10 to Win11 on Jan 2024
Yesterday a very stupid opportunity appeared, due to what seems to be tons of cards getting dumped right back onto the market: highly binned 5700XT and 6900XT cards.

I'm not familiar with how either one performs. I'm under the impression that each model is well above my 580 for gaming performance but not much else.
RX580 >> 5700XT (+80% perf, +25% power consumption) (had both of them)
RX580 >> 6900XT (x3.5 perf, +70% power consumption)
5700XT >> 6900XT (x2 perf, +35% power consumption)
6900XT >> 7900XT (+25% perf, no power increase)

I have a question though...
Why can't you get a working 7900XT? Have you tried multiple 7900XTs?
Is it possible that something else is wrong, like the PSU for example?

RX580 (180W)
5700XT (225W)
6900XT (300W)
7900XT (300W)
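
Those step multipliers compose consistently, for what it's worth. Quick check with the numbers straight from the list above:

Code:
# Sanity check: the step-wise (perf, power) multipliers above should
# compose into the RX580 -> 6900XT totals.
steps = {"580->5700XT": (1.80, 1.25), "5700XT->6900XT": (2.00, 1.35)}
perf = power = 1.0
for _, (p, w) in steps.items():
    perf *= p
    power *= w
print(f"RX580 -> 6900XT: x{perf:.2f} perf, +{(power - 1) * 100:.0f}% power")
# -> x3.60 perf, +69% power, matching the quoted x3.5 / +70%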
 
Joined
Jan 2, 2024
Messages
754 (1.82/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
The jump from an 8500 to a 9500/9700 was huge
Yeah. Double the shaders, double the bus width, new DX9 and OpenGL feature levels and shaders. It was tight.
That kind of jump was visible to me at the time but I didn't pay attention to it. It should have been the first tell that VRAM didn't matter.
I am honestly not sure what the question is.
So here's the thing. Some of us spend a lot of time around this nerd junk. A LOT of time (and money). We spend time here for many different reasons.
Some consume content while others create it. Gaming demands a lot of different features and so does content creation. They used to be somewhat exclusive.
Features would get split between entire computers built for creation, then add-in cards bridged the gap, and then we got a cult of purists.
These days everything gets packaged into video accelerators, and that's how some people get by with just one machine, one accelerator, which is nuts.

Content creation is fully established with team green. I get that. They dominate the playing field. I'm not nVidia's customer, not in their camp, but I'm still a creator.
I can deal with editing, making announcement cards, encoding videos and so on, but certain forms of recording and encoding are already at an impasse.
Some of us combine these things with other technologies like VR and streaming, which is a direction that demands a lot.
The current era favors NVENC and I don't. My options are H.264/AVC, H.265/HEVC, and the possibility of AV1 getting adopted soon™ somewhere between these and H.266/VVC.
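
For anyone comparing notes on the codec side, a quick sketch (same assumption as before: an AMF-enabled ffmpeg build) that lists which AMD hardware encoders the build knows about and then test-fires each one, since the -encoders listing alone reflects build support, not what the installed card/driver will actually accept:

Code:
import subprocess

def ffmpeg(*args):
    return subprocess.run(["ffmpeg", "-hide_banner", *args],
                          capture_output=True, text=True)

# Encoders compiled into this build (expect h264_amf, hevc_amf, av1_amf)
built = [ln.split()[1] for ln in ffmpeg("-encoders").stdout.splitlines()
         if "_amf" in ln]
print("Build supports:", built)

# One-second test encode per codec to see what the hardware accepts
for enc in built:
    r = ffmpeg("-y", "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=30",
               "-t", "1", "-c:v", enc, "-f", "null", "-")
    print(enc, "->", "accepted by this GPU" if r.returncode == 0 else "rejected")

On Polaris you'd expect h264_amf and hevc_amf to pass and av1_amf to be rejected; a working 7900XT should pass all three.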

I have no problem choosing hardware for consumption, but as a creator I suffer issues like getting robbed of mission-critical features for years at a time.
Ages ago it was clock speed, pixel shaders, DirectX 9 feature level, VRAM... Today it's DirectX 12 feature level and the encoder.
I have no idea how much AV1 encode should play a part in my purchase decision, but I've been sold on it twice and had no luck even getting to the POST screen.
It doesn't seem likely to happen. So I wonder if others have had QA issues, lethal shipping damage, board compatibility problems, a bunch of this and that.

Last week I was presented with some rather high-risk alternatives: older but extremely highly binned cards that could be an all-or-nothing difference with TONS of variables.
So rather than chasing 7000-series cards that are now more trouble than they're worth, I'm considering options like the 5700XT and 6900XT.
No idea if they're valid alternatives or how long they'll last, but I can't keep doing this RMA dance forever.

The other day I found a JTC video giving a solid demo of the 6900XT Ultimate:

So that all looks like a very fun time. Then I found another 6900XT video on something I never considered:

Is it possible, or even likely, that both VBIOS chips can outright fail on these 7000-series cards? On multiple cards? How likely is it on the 5000/6000 series?
I never bothered to tear down either card and opted to return them as possible DOA, but it sucks that I can't even conclude what's wrong.
Is this our new hardware hell?
I have a question though...
Why can't you get a working 7900XT? Have you tried multiple 7900XTs?
Is it possible that something else is wrong, like the PSU for example?
The numbers look really fun.

I have no idea what's wrong and wonder if others have any hint of what it could be. I never got any other hardware-fail beeps from the mobo; lights come on and everything, just the VGA problem lamp + beeps, with no fan spin and no picture. I even attempted to power on the 7900XT and the RX 580 together and could only get detection out of the 580. So no power-draw issues here, but this is already a horrific nuisance. The main appeal of the 7000 series is power efficiency. I translate that as better voltage control and less risk of extreme spikes that trigger OCP or something on an already minimal build.

These cards aren't going to pull max draw just getting to POST, but I wonder how much worse the power spikes of the 6900XT would be compared to the 7900XT.
Triple 8-pin totally doesn't look threatening at all. Should I give up and try the 6900XT? It looks like a pain to set up, but the features and performance look promising.
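
Here's the napkin math I'd run against the EA-750 before committing. Board-power numbers are the ones quoted above; the transient multipliers are rough assumptions in the spirit of the usual reviewer scope tests, not measured figures for these exact cards:

Code:
# Napkin math: worst-case millisecond transients vs. PSU capacity.
# TBP values quoted earlier in the thread; spike multipliers are
# ASSUMED ballparks, not measured figures for any specific card.
PSU_W = 750
CPU_W = 90   # Ryzen 5 3600 under load, approx.
REST_W = 60  # pump, fans, drives, board

cards = {"RX 580": (180, 1.3), "5700 XT": (225, 1.6),
         "6900 XT": (300, 2.0), "7900 XT": (300, 1.6)}

for name, (tbp, mult) in cards.items():
    worst = CPU_W + REST_W + tbp * mult
    verdict = "fine" if worst < PSU_W else "could trip OCP on a spike"
    print(f"{name}: ~{worst:.0f} W transient vs {PSU_W} W PSU -> {verdict}")

By that math the 6900XT sits right at the edge of the 750W budget on a millisecond spike while the 7900XT keeps comfortable headroom, which is exactly why the triple 8-pin makes me nervous.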

[Attached screenshot: 6900 XT spec/feature readout]
 
Joined
Nov 27, 2023
Messages
2,862 (6.36/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
@DaemonForce
Being brutally honest and meaning no offense - I feel like you are massively overthinking this whole thing. If you are a content creator, a serious one, then you should swallow whatever weird hang ups you have and buy the fastest and latest NV GPU you can afford comfortably. That’s the truth. It might be an uncomfortable one, but it is what it is.

As for everything else - you are being frustratingly vague. What do you mean by “high risk alternatives”? Used cards? What do you mean by “highly binned”? That usually refers to ASIC quality and, unless you are into xOC, that is completely irrelevant to your use case. You mentioned you couldn’t get a 7900XT that works. Again, used cards? Were those issues with the cards themselves or something else in your system?
 
Joined
Jan 2, 2024
Messages
754 (1.82/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
@DaemonForce
Being brutally honest and meaning no offense - I feel like you are massively overthinking this whole thing. If you are a content creator, a serious one, then you should swallow whatever weird hang ups you have and buy the fastest and latest NV GPU you can afford comfortably. That’s the truth. It might be an uncomfortable one, but it is what it is.
You're probably right. I don't like hearing it, but somebody needed to point it out to me.
If I'm unable to move forward with the current options then I'll have to make the jump to nVidia hardware, and that's going to be a much bigger headache.
I'm not interested in ~$1000 of additional investment just to have an $800 card play well with everything, only to find that it still won't.
As for everything else - you are being frustratingly vague. What do you mean by “high risk alternatives”? Used cards? What do you mean by “highly binned”? That usually refers to ASIC quality and, unless you are into xOC, that is completely irrelevant to your use case. You mentioned you couldn’t get a 7900XT that works. Again, used cards? Were those issues with the cards themselves or something else in your system?
An older card with high power draw is a high risk. Of course they're going to be used; they're old. Used or not, they're unknown territory.
I have no idea how they behave or if they'll even behave.

What is the difference between a flagship card with poor silicon vs. the flagship with very good silicon?
A lot? A little? I don't know. Clear enough?

I don't know what the issues are with the 7900XT cards. I have tried every possible hardware configuration, down to the bare minimum. They struggle with detection. As in, the computer doesn't SEE the hardware attached to the system and throws a VGA lamp + beep code. 1L3S = VGA error.
 
Joined
Nov 27, 2023
Messages
2,862 (6.36/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
What is the difference between a flagship card with poor silicon vs. the flagship with very good silicon?
A lot? A little? I don't know. Clear enough?
Potentially some OC headroom. That’s it. All of them will still hit advertised clocks; they’re validated for them. As I said - unless extreme OC is your jam I would not care about “binning”.

I don't know what the issues are with the 7900XT cards. I have tried every possible hardware configuration, down to the bare minimum. They struggle with detection. As in, the computer doesn't SEE the hardware attached to the system and throws a VGA lamp + beep code. 1L3S = VGA error.
This doesn’t seem normal. Huh. I assume the specs in your profile are correct and up to date for the system you tried those 7900XTs on?
 
Joined
Jan 2, 2024
Messages
754 (1.82/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
Yeah. I even tried pulling everything back to the bare minimum, BIOS update and all. AMD doesn't know. ASUS doesn't know. No bueno, x2.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
43,690 (6.78/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Yeah. Double the shaders, double the bus width, new DX9 and OpenGL feature levels and shaders. It was tight.
That kind of jump was visible to me at the time but I didn't pay attention to it. It should have been the first tell that VRAM didn't matter.
The card had a 128 MB framebuffer
 