
I switched from LGA1700 to AM5, here are my thoughts

Joined
Nov 16, 2023
Messages
1,481 (3.68/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Shit, I sidegraded once from 1366 (actually a downgrade) from an EVGA x3 SLI board to like a socket 1150(??) i5 because #clockspeeds. In retrospect that was some dumb shit, but it's what I wanted.
I think a lot of us have done this at some point. I used FX processors daily for a while. I should have just stuck with the Phenom II X6 for a while, but I wanted 8 cores. I should have gone Intel at the time, but that was a lot more money than just a ~$120 CPU upgrade.

But tomorrow comes and then 9000 series and Core Ultra!! And all this will be behind us again.
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I find it weird when people use the MSRP tactic; it's irrelevant for people who don't buy at launch.
And yes, I agree that Nvidia cutting their die sizes while raising prices is a reason why I'd rather not buy their cards. It's gotten egregious with the 4000 series: everything except the 4090 is cut down a tier but priced up a whole tier.
The aggressiveness from some users is disappointing, especially when it comes from staff members. I would expect staff to be a bit more open-minded towards others' preferences.

Just read this: MSRP stands for "Manufacturer's Suggested Retail Price", and thus what a product is assumed and presumed to cost. It is not a cheap argumentation tactic; since we cannot know the conditions and timing of one's purchase, it is a nominal value to go by. For all we know, OP could have purchased it back when it was new in late 2020... and back then it was priced at the exact same $999 as the 4080S is today. Nothing malicious about it. $500 for a 6900 XT was, and if you ask me still is, a good deal. It's one of the few Radeon cards that aren't completely hopeless as of late, and the 4080S, too, will decrease in price as newer-generation products come and replace it. Still, it is unable to do many things that the RTX 40 series cards can do, and AMD's support of RDNA 2 has not been as forthright as everyone would have liked; they clearly give preferential treatment to RDNA 3 at this point in time.

Die sizes don't have to be large on purpose; in fact, in the interest of controlling costs, it's better that dies are smaller. That's why AMD can build a processor like Ryzen or Epyc: yield is much higher and sorting is much easier, whereas those credit-card-sized Intel monolithic CPUs are very difficult to manufacture, and thus very expensive as well. What Nvidia is guilty of is selling lower-tier products, often with disabled execution units, as higher-tier parts and taking home the profits - but AMD has been unable to deliver a product line that prevents them from doing this, so there is little we can do.
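To put rough numbers on that yield argument, here's a minimal back-of-the-envelope sketch using the textbook dies-per-wafer approximation and a simple Poisson yield model; the defect density and die areas below are assumed round figures, not data for any specific process:

```cpp
// Illustration of why smaller dies are cheaper to produce: more candidates per
// wafer AND a higher fraction of them come out defect-free.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979;
    const double wafer_diameter_mm = 300.0;   // standard 300 mm wafer
    const double defects_per_mm2   = 0.001;   // assumed defect density (0.1 per cm^2)
    const double die_areas_mm2[]   = {80.0, 300.0, 600.0};  // chiplet vs. mid-size vs. big monolithic die

    for (double area : die_areas_mm2) {
        const double r = wafer_diameter_mm / 2.0;
        // Gross dies per wafer: usable area divided by die area, minus an edge-loss term.
        const double gross = (kPi * r * r) / area - (kPi * wafer_diameter_mm) / std::sqrt(2.0 * area);
        // Poisson yield model: probability that a die has zero defects.
        const double yield = std::exp(-defects_per_mm2 * area);
        std::printf("%4.0f mm^2 die: ~%4.0f candidates/wafer, yield ~%4.1f%%, ~%4.0f good dies\n",
                    area, gross, yield * 100.0, gross * yield);
    }
    return 0;
}
```

With these made-up but plausible inputs, the 80 mm^2 die comes out to roughly 15x as many good chips per wafer as the 600 mm^2 one, which is the chiplet cost argument in a single number.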
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Just read this: MSRP stands for "Manufacturer's Suggested Retail Price", and thus what a product is assumed and presumed to cost. It is not a cheap argumentation tactic; since we cannot know the conditions and timing of one's purchase, it is a nominal value to go by. For all we know, OP could have purchased it back when it was new in late 2020... and back then it was priced at the exact same $999 as the 4080S is today. Nothing malicious about it. $500 for a 6900 XT was, and if you ask me still is, a good deal. It's one of the few Radeon cards that aren't completely hopeless as of late, and the 4080S, too, will decrease in price as newer-generation products come and replace it. Still, it is unable to do many things that the RTX 40 series cards can do, and AMD's support of RDNA 2 has not been as forthright as everyone would have liked; they clearly give preferential treatment to RDNA 3 at this point in time.

Die sizes don't have to be large on purpose; in fact, in the interest of controlling costs, it's better that dies are smaller. That's why AMD can build a processor like Ryzen or Epyc: yield is much higher and sorting is much easier, whereas those credit-card-sized Intel monolithic CPUs are very difficult to manufacture, and thus very expensive as well. What Nvidia is guilty of is selling lower-tier products, often with disabled execution units, as higher-tier parts and taking home the profits - but AMD has been unable to deliver a product line that prevents them from doing this, so there is little we can do.

I'd argue, and I feel like you'd agree, that even the 4090 looks more like an 80 Ti-tier product when you look at how cut down it is. Which is funny, because before I bought it I actually compared it to my 3080 Ti, which was also somewhat meh in retrospect, although it was the only card I could even get at MSRP when it launched - even a 3080 10G was like $1,800 on eBay, and the 6900 XT was around $1,400-1,500 for meh models.....

Hardware is hardware, you win some and you lose some, but I feel the more you get to play around with from both sides, the better..... I really like the 6700 XT much more than the low-end Nvidia GPUs around its price that I've had hands-on with. I like the 4070 even though it feels like a 3060 Ti successor; that is just the way the cookie crumbles in 2023/24.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,042 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
View attachment 360211

877 cash nowwwwwwwww

But seriously, good advice. I remember the gripes and moans back and forth during the 4090 and 7900 XTX launches, and shit, every major launch for the past 20 or so years I've been here, people see the performance deltas and try to clown those that bought them. Maybe more so in the past few years. Maybe it's because so much of the old mentality and members are gone.

IDGAF what any of you think about the purchases I make with the money I wake up and earn.

And neither should anyone reading this. Are you happy? Awesome. That's worth more than anything anyone can say in any post here or anywhere else.

Shit, I sidegraded once from 1366 (actually a downgrade) from an EVGA x3 SLI board to like a socket 1150(??) i5 because #clockspeeds. In retrospect that was some dumb shit, but it's what I wanted.

Is that you driving that bus?

Where you heading to??

Edit: Sry, couldn't resist. Trying to lighten up this thread.

Back to OP.
 
Last edited:

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
27,082 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Razer Viper mini signature edition (mercury white)
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
Is that you driving that bus?

Where you heading to??

Edit: Sry, couldn't resist. Trying to lighten up this thread.

Back to OP.

If you're not from the US or surrounding countries, you probably don't get that meme or know the commercial; everyone else hates me because they'll be sleeping to it tonight.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Maybe more so in the past few years. Maybe it's because so much of the old mentality and members are gone.
Ahh the old days, modded BIOSes, multi-GPU setups and some insane overclocking gains. :peace:

Maybe part of the change in mentality goes hand in hand with the growing divide in wealth between the top %'ers and everyone else, because like virtually any hobby you can think of, if you've got deep pockets then boy do companies have some absolutely bonkers cool products to sell to you.
 
Last edited:
Joined
Nov 16, 2023
Messages
1,481 (3.68/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Ahh the old days, modded BIOSes, multi-GPU setups and some insane overclocking gains. :peace:

Maybe part of the change in mentality goes hand in hand with the growing divide in wealth between the top %'ers and everyone else, because like virtually any hobby you can think of, if you've got deep pockets then boy do companies have some absolutely bonkers cool products to sell to you.
You can still do multi-GPU. In fact, that's pretty much what AMD calls it these days: mGPU.

So you can mGPU pretty much all the RX cards of the same or similar part number. I have mGPU'd an RX 6700 non-XT and an RX 6700 XT, as in the example below. Interestingly, the display driver did not like the XT installed with the non-XT as the backup for some odd reason - it was completely unstable.

Only works on DX12 and up though....

View attachment 360232
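For anyone curious what that looks like from the API side, here's a minimal sketch (assuming Windows with the stock DXGI/D3D12 headers and MSVC) that enumerates the adapters an explicit multi-adapter title would see and checks how many linked nodes the driver exposes for mGPU with matched cards:

```cpp
// Enumerate GPUs the way a DX12 title does: each hardware adapter is a candidate for
// explicit multi-adapter, while GetNodeCount() > 1 indicates the driver has linked
// matching cards into one device (the mGPU case discussed above).
#include <dxgi.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("Adapter %u: %ls, linked nodes: %u\n",
                        i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}
```

Two identical Radeons linked by the driver show up as one adapter with two nodes, while mismatched cards appear as two separate adapters - the unlinked case that, as noted below, only a handful of titles actually handle.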
 
Last edited:
Joined
Jun 7, 2020
Messages
7 (0.00/day)
Processor AMD Ryzen 7 7800X3D - 8C/16T @ 5.1 GHz (103 BCLK, 2166 FCLK)
Motherboard ASUS ROG Crosshair X670E HERO
Cooling Corsair iCUE H170i Elite Capellix 420mm AIO w/ 4 Corsair AF140s
Memory 64 GB Corsair Dominator Titanium DDR5 @ 6000 MT/s (1.5V) 26-35-35-47-1T GDM off
Video Card(s) MSI GeForce RTX 4080 SUPER Gaming X Slim 16 GB G6X @ 2970 MHz/1625 MHz (26 Gbps)
Storage x2 Sabrent Rocket NVMe PCIe 4.0 SSDs 3 TB, WD Black 10 TB
Display(s) LG UltraGear 32GP750 1440p 165Hz, Samsung UN225003BF 1080p 60Hz, TCL 50S435 4K 60Hz
Case Corsair 7000D AIRFLOW
Audio Device(s) Creative Sound BlasterX G6 - Beyerdynamic DT 990 Pro LE 250 ohm & Blue Yeti X Mic
Power Supply 2021 Corsair RM1000x 80+ Gold (1000W)
Mouse Corsair Sabre RGB PRO
Keyboard Corsair K70 MAX
Software Windows 11 Pro
When you consider how Intel has handled the entire Raptor Lake degradation debacle, I don't see going to a 7800X3D as a "downgrade" from a 13700K. Intel went from blaming board partners for their CPUs being unstable, to finally admitting that there are indeed issues with CPUs degrading (and also the oxidation issue they decided never to admit until recently, despite it happening in 2023), to then releasing a BIOS update with the microcode fix that ended up introducing new bugs in the BIOS. For instance, disabling the baseline BIOS settings completely disables the voltage limit for some god-forsaken reason, even though Intel's exact reason for this update was to implement a hard limit on the voltage that can be requested. Those issues were documented on Buildzoid's channel. It took Intel forever to finally admit there was an actual issue when there were documented crashes occurring on Raptor Lake as early as late 2022; they didn't even bother recalling any CPUs that were potentially affected by the oxidation issue, and all they did was extend the warranty by two years. Sure, that's great, but considering how Intel has tried to squirm their way out of this and sweep it under the rug until it got really bad for them when Wendell and GN Steve called them out, I wouldn't trust that they'll honor RMAs for every affected CPU. I've also read about people saying that Intel would basically ghost them whenever they inquired about a replacement CPU due to experiencing crashes.

Yeah, Ryzen 7000 CPUs were blowing up in the past due to the SoC voltage being pumped way too high by default when enabling EXPO. However, AMD made absolutely sure to have it fixed ASAP and made it right for those who were affected. It's not about companies having issues with faulty products; that happens. What matters is the response to it and what actions said company takes to alleviate those issues.

Besides, AM5 will also have future CPU releases, whereas LGA1700 is completely finished as of this fall/winter when Arrow Lake arrives. And I'm not very keen on trusting that Arrow Lake won't have issues with degradation, considering that clock speeds still seem to be 5.5+ GHz for ST workloads, but we'll have to see.
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Doing an entire platform sidegrade to save 56 watts under gaming load while adding 25 watts on idle.

Uneasy about undervolting/disabling HT/disabling E-cores (or using Win 10 for gaming and Process Lasso, or Win 11 for the scheduler) since ~10% performance is lost. But then running a 300-600 W GPU at 180 W to save power (also resulting in a performance loss).

View attachment 360100

This entire concept is... confusing.

Nothing about my comment is aggressive; it's simply confused questioning instead of the praise you may have expected.

View attachment 360097

Note these CPU gaming tests are done with a 4090, so they represent the worst-case scenario for gaming power draw (when paired with the fastest GPU).
Saving watts isn't necessarily about saving watts, but also about saving heat. I'm not running my RAM at bog-standard 4800 MHz because it saves 10 W on my CPU (which isn't bad when idle, but that's beside the point), but because it also saves 5-7 °C, and some time on boot.
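Just to put the quoted 56 W / 25 W figures in perspective, here's a quick back-of-the-envelope sketch; the daily gaming and idle hours and the electricity price are assumptions of mine, not numbers from this thread:

```cpp
// Net energy impact of the quoted deltas: -56 W while gaming, +25 W at idle.
#include <cstdio>

int main() {
    const double saved_gaming_w = 56.0;   // W saved under gaming load (from the quoted post)
    const double added_idle_w   = 25.0;   // W added at idle (from the quoted post)
    const double gaming_h_day   = 3.0;    // assumed hours of gaming per day
    const double idle_h_day     = 5.0;    // assumed hours of idle/desktop use per day
    const double price_per_kwh  = 0.30;   // assumed electricity price in $/kWh

    const double net_wh_per_day = saved_gaming_w * gaming_h_day - added_idle_w * idle_h_day;
    const double yearly_kwh     = net_wh_per_day * 365.0 / 1000.0;

    std::printf("Net: %.0f Wh/day saved, %.1f kWh/year (~$%.2f/year)\n",
                net_wh_per_day, yearly_kwh, yearly_kwh * price_per_kwh);
    // With these assumptions: 56*3 - 25*5 = 43 Wh/day, ~15.7 kWh and ~$4.70 per year.
    return 0;
}
```

Which is the point: with assumptions like these, the money saved is pocket change, but 56 W less dumped into the case and the room every gaming session is something you actually feel.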
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
You can still do multi-GPU. In fact, that's pretty much what AMD calls it these days: mGPU.

So you can mGPU pretty much all the RX cards of the same or similar part number. I have mGPU'd an RX 6700 non-XT and an RX 6700 XT, as in the example below. Interestingly, the display driver did not like the XT installed with the non-XT as the backup for some odd reason - it was completely unstable.

Only works on DX12 and up though....

View attachment 360232

Problem is that DX12 explicit multi-adapter is only supported in 3DMark and a single game (Ashes of the Singularity Benchmark) :oops:
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
When you consider how Intel has handled the entire Raptor Lake degradation debacle, I don't see going to a 7800X3D as a "downgrade" from a 13700K. Intel went from blaming board partners for their CPUs being unstable, to finally admitting that there are indeed issues with CPUs degrading (and also the oxidation issue they decided never to admit until recently, despite it happening in 2023), to then releasing a BIOS update with the microcode fix that ended up introducing new bugs in the BIOS. For instance, disabling the baseline BIOS settings completely disables the voltage limit for some god-forsaken reason, even though Intel's exact reason for this update was to implement a hard limit on the voltage that can be requested. Those issues were documented on Buildzoid's channel. It took Intel forever to finally admit there was an actual issue when there were documented crashes occurring on Raptor Lake as early as late 2022; they didn't even bother recalling any CPUs that were potentially affected by the oxidation issue, and all they did was extend the warranty by two years. Sure, that's great, but considering how Intel has tried to squirm their way out of this and sweep it under the rug until it got really bad for them when Wendell and GN Steve called them out, I wouldn't trust that they'll honor RMAs for every affected CPU. I've also read about people saying that Intel would basically ghost them whenever they inquired about a replacement CPU due to experiencing crashes.

Yeah, Ryzen 7000 CPUs were blowing up in the past due to the SoC voltage being pumped way too high by default when enabling EXPO. However, AMD made absolutely sure to have it fixed ASAP and made it right for those who were affected. It's not about companies having issues with faulty products; that happens. What matters is the response to it and what actions said company takes to alleviate those issues.

Besides, AM5 will also have future CPU releases, whereas LGA1700 is completely finished as of this fall/winter when Arrow Lake arrives. And I'm not very keen on trusting that Arrow Lake won't have issues with degradation, considering that clock speeds still seem to be 5.5+ GHz for ST workloads, but we'll have to see.
If you're gaming, it's not a downgrade...

But if you're gaming at 1440p or above, it's not a real upgrade either.

If you're gaming and your old chip and mobo setup was causing grief in terms of temps, heat etc., then it's actually a pretty huge upgrade just from the mental side - your FPS are a bit smoother, you feel better, you're happier with the build.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I turned off HT and the E-cores on my 13700KF, and it runs much smoother (higher 1% low FPS in online competitive games).

The 7800X3D is better, but not to a degree that's worth upgrading to, even with a 4090, much less a 6900 XT.

I'm waiting for the outcome of the battle of next-gen CPUs, 9800X3D vs Arrow Lake-S, to make my decision for the next upgrade :D
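As an aside, the Process Lasso route mentioned earlier in the thread achieves a similar effect per game without touching the BIOS: it just restricts the process to the P-cores. Below is a minimal sketch of that idea; the 0xFFFF mask assumes the first 16 logical CPUs are the hyper-threaded P-cores (typical Raptor Lake enumeration), so treat it as an illustration rather than a drop-in tool:

```cpp
// Pin an already-running game (by PID) to the P-cores only, so the Windows scheduler
// never places its threads on E-cores. This mimics what per-process affinity tools do.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::printf("usage: pin_pcores <pid>\n"); return 1; }
    const DWORD pid = static_cast<DWORD>(std::atoi(argv[1]));

    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION, FALSE, pid);
    if (!proc) { std::printf("OpenProcess failed: %lu\n", GetLastError()); return 1; }

    // Bits 0..15 set -> logical CPUs 0-15 (assumed P-core threads on an 8P + E-core hybrid part).
    const DWORD_PTR pCoreMask = 0xFFFF;
    if (SetProcessAffinityMask(proc, pCoreMask))
        std::printf("Process %lu restricted to logical CPUs 0-15.\n", pid);
    else
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());

    CloseHandle(proc);
    return 0;
}
```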
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I turned off HT and the E-cores on my 13700KF, and it runs much smoother (higher 1% low FPS in online competitive games).

The 7800X3D is better, but not to a degree that's worth upgrading to, even with a 4090, much less a 6900 XT.

I'm waiting for the outcome of the battle of next-gen CPUs, 9800X3D vs Arrow Lake-S, to make my decision for the next upgrade :D

I mean, they are both great CPUs that you'd have a hard time telling apart at reasonable settings.... Although I get the impression performance isn't the main reason the OP switched.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
As for the on topic part of this thread. big sad :c I idle at like 150w.
I feel you, but mostly because Ampere is shit at idle, and even worse on Linux. Each of my 3090s idles at 35~45W.
 
Joined
Nov 16, 2023
Messages
1,481 (3.68/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Problem is that DX12 explicit multi-adapter is only supported in 3DMark and a single game (Ashes of the Singularity Benchmark) :oops:
Correct. That one game and a few 3DMark benchmarks can utilize non-identical cards.

In all other DX12 (and newer) games, mGPU is supported, but that means both cards need to be identical.

I think the Night Raid benchmark only supports identical RX GPUs, if I remember right.
 
Joined
Sep 11, 2019
Messages
276 (0.14/day)
I put together a 7800X3D system this week. It's a peach. There will always be emotionally stunted people chugging haterade and trying to get a rise out of you on the interwebz. Water off a duck's back.
 
Joined
Jul 26, 2013
Messages
438 (0.11/day)
Location
Midlands, UK
System Name Electra III
Processor AMD Ryzen 5 3600 @ 4.40 GHz (1.3 V)
Motherboard ASUS PRIME X570-PRO with BIOS 5003
Cooling Cooler Master Hyper 212 EVO V1 + 4× ARCTIC P12 PWM
Memory 32 GiB Kingston FURY Renegade RGB (DDR4-3600 16-20-20-39)
Video Card(s) PowerColor Fighter RX 6700 XT with Adrenalin 24.7.1
Storage 1 TiB Samsung 970 EVO Plus + 4 TB WD Red Pro
Display(s) Dell G3223Q + Samsung U28R550Q + HP 22w
Case Fractal Design Focus G (Black)
Audio Device(s) Realtek HD Audio S1220A
Power Supply EVGA SuperNOVA G3 750 W
Mouse Logitech G502 X Lightspeed + Logitech MX Master 2S
Keyboard MSI VIGOR GK71 SONIC Blue
Software Windows 10 22H2 Pro x64
Benchmark Scores CPU-Z = 542/4,479 — R15 = 212/1,741 — R20 = 510/3,980 — PM 10 = 2,784/19,911 — GB 5 = 1,316/7,564
Saving watts isn't necessarily about saving watts, but also about saving heat. I'm not running my RAM at bog-standard 4800 MHz because it saves 10 W on my CPU (which isn't bad when idle, but that's beside the point), but because it also saves 5-7 °C, and some time on boot.

When it gets uncomfortably hot in the summer here, the ambient temperature in my room soars into the 30s and 40s. To make matters worse, A/C is expensive, and the electricity to power it is expensive. I can't win either way, so going with a machine that runs cooler and dumps less heat into the room is actually better.

If you're gaming, it's not a downgrade...

And if you're moving from a platform with no further upgrades to one with future products planned for at least the next 3 years, that shouldn't be considered a downgrade either. A sidegrade, perhaps, but things can only improve from that point. And sometimes a sidegrade makes sense, as in my situation above.
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
When it gets uncomfortably hot in the summer here, the ambient temperature in my room soars into the 30s and 40s. To make matters worse, A/C is expensive, and the electricity to power it is expensive. I can't win either way, so going with a machine that runs cooler and dumps less heat into the room is actually better.
Not to mention the rest of your system parts which also soak up some of their surrounding heat.
 
Joined
Apr 14, 2022
Messages
758 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
It matters, as Nvidia keeps raising prices while giving consumers less of the die; the 4070 Ti should've been a 4060 Ti at best, with its 192-bit bus and only 12 GB of VRAM.
Why does the die size matter, and not the number of transistors, for example?
Nvidia could say they sell you more transistors regardless of the smaller die size. The die size argument is a joke, and I hope you understand that.

On topic: stick to the positives. You can upgrade to the next 3D chip with minimum effort.
And this is why I'm thinking of doing the crime and upgrading to AM5, although my 5800X3D is decent enough for gaming.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I get it, sometimes you live with a CPU and just don't like it..... I had a 7800X3D for a week and was like nah.

Although, as alternatives go, I could see my 7950X3D bugging a lot of people as well.

Kinda reminds me of the 9900K I had a love/hate relationship with, although part of that was the Z390 Code just being kinda crap for a $450 board....

Switching hardware only needs to make sense to the one purchasing it. OP gave it a year, did a ton of tweaks, and was like nah. Sometimes you just need a change.

I don't mind Raptor Lake, it's fine; I've only had issues with 2 out of the box, and one of those was just crappy default BIOS settings. It's not hard to get it down to 125-140 W while only losing a small amount of performance in gaming, which I find acceptable.

I care more about what my CPU consumes than my GPU, but that's only because I like to keep total system power around 600 W in worst-case scenarios, so the highest all-core power draw I'm willing to live with is 140 W.... People should never use their own use case to judge someone else's hardware purchase decisions.

We have plenty of "Intel is the greatest" or "AMD is the greatest" threads, so somebody switching to or from them isn't a big deal, honestly, or at least shouldn't be.

Also, reviews are very good as a guide, but they give you zero insight into what living with a piece of hardware is going to be like, so unless you can buy everything and test it yourself you'll never have the full picture.

At the end of the day it's just silicon. I'm not attached to one brand more than the other, but I am glad I typically get hands-on with a lot of different hardware prior to making a purchasing decision. Sometimes you just have to take a chance; sometimes it works out, sometimes it doesn't.

The 6900 XT is fine; nothing AMD has is worth upgrading to from it (the 7900 XTX is borderline if you hate Nvidia for whatever reason), and until the 4070 Ti Super dropped, no card from Nvidia that was worth a shite had more than 12 GB under $1,200. The OP obviously wanted 16 GB of VRAM, and it's not mine or anyone else's place to tell them they shouldn't want it...

My 2 cents anyway, and that ain't worth jack either lol.
You got that right. A few people on here that never used my 7900X3D tell me I'm lying about the performance of that chip. When I tell them that not every game supports V-cache, they don't understand how 6 cores running at 5.6 GHz mean the 7900X3D is faster in games that don't support V-cache, but the 7800X3D argument got raised to the point where I bought one.

Compared to the 7900X3D, the 7800X3D feels sluggish in regular use. So, like you, after 1 week of use I returned it. The OP got real pushback from pro-Intel users that love to bash AMD.
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
You got that right. A few people on here that never used my 7900X3D tell me I'm lying about the performance of that chip. When I tell them that not every game supports V-cache, they don't understand how 6 cores running at 5.6 GHz mean the 7900X3D is faster in games that don't support V-cache, but the 7800X3D argument got raised to the point where I bought one.

Compared to the 7900X3D, the 7800X3D feels sluggish in regular use. So, like you, after 1 week of use I returned it. The OP got real pushback from pro-Intel users that love to bash AMD.
What do you mean by "sluggish in regular use"? Personally, I can barely feel any difference between my 7800X3D and an old i5 in regular use as long as they're both booting from an SSD.

I'm not trying to nitpick, and I don't disagree with you, I'm just curious.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
What do you mean by "sluggish in regular use"? Personally, I can barely feel any difference between my 7800X3D and an old i5 in regular use as long as they're both booting from an SSD.

I'm not trying to nitpick, and I don't disagree with you, I'm just curious.
8 cores running at 5.0 GHz vs 6 cores running at 5.6 GHz. Then there is more L3 cache vs that as well. For me it is the same as the 5800X vs the 5900X.
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
8 cores running at 5.0 GHz vs 6 cores running at 5.6 GHz. Then there is more L3 cache vs that as well. For me it is the same as the 5800X vs the 5900X.
Sure, but these are just arbitrary numbers. How does one feel more "sluggish" than the other in everyday use?
 
Joined
Apr 13, 2023
Messages
314 (0.51/day)
System Name Can it run Warhammer 3?
Processor 7800X3D @ 5Ghz
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Enermax Liqmax III 360mm
Memory Teamgroup DDR5 CL30 6000Mhz 32GB
Video Card(s) Gigabyte 4090
Storage Silicon Power XS70, Corsair T700
Display(s) BenQ EX2710Q, BenQEX270M
Case NZXT H7 Flow
Audio Device(s) AudioTechnica M50xBT
Power Supply SuperFlower Leadex III 850W
Who's kidding who?

It's a downgrade, period. OP got what he wanted and shared some thoughts... sloppily. Perhaps it should have been a profile update instead. I'm one who wants to see screenshots, FPS differences, load v-cores... any kind of comparison with actual data.

Nope, it's a feelings thread lol. XD.
You’re kidding yourself.

Going to a faster, more efficient processor is not a downgrade, even if it's a small margin of performance improvement. We don't need OP to post benches to know that.
 
Joined
Jan 25, 2014
Messages
2,094 (0.53/day)
System Name Ryzen 2023
Processor AMD Ryzen 7 7700
Motherboard Asrock B650E Steel Legend Wifi
Cooling Noctua NH-D15
Memory G Skill Flare X5 2x16gb cl32@6000 MHz
Video Card(s) Sapphire Radeon RX 6950 XT Nitro + gaming Oc
Storage WESTERN DIGITAL 1TB 64MB 7k SATA600 Blue WD10EZEX, WD Black SN850X 1Tb nvme
Display(s) LG 27GP850P-B
Case Corsair 5000D airflow tempered glass
Power Supply Seasonic Prime GX-850W
Mouse A4Tech V7M bloody
Keyboard Genius KB-G255
Software Windows 10 64bit
Jeez guys, what is up with all the attacks? As long as she is happy with the switch, it's all good.
As for bashing the 6900 XT: I have a 6950 XT and got it for $620. Compare that to what some of the cards you're holding it against cost: the 4070 was around $590-650 for the cheaper, shittier models, the 4070 Ti was $900, the 4080 was $1,200 at the time, and that's not to mention the 4090 at $2,000-2,200 depending on the model. No one buys a high-end card for its efficiency, but for its performance.

PS: Some of the comments on the first page felt like an attack, but further down the line I'm happy to see the thread settle down and become a normal discussion.
 