
(Anti) SFF fun house

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
7,675 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6000CL30┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
That could be related to the Discord bug Nvidia has right now, or possibly overheating VRAM - which happened to a loooooot of 30-series cards as well. It only takes one VRAM thermal pad out of place for things to get weird there (each VRAM module has an internal temp sensor, but they aren't exposed to the user - 100% fan speed was the usual key clue on Nvidia).

Interesting. I thought VRAM junction on both Nvidia and AMD (at least in current HWInfo) is taken from the highest reading of all the VRAM packages? At least, that's what the description says.

On the topic of VRAM, these 4070 Ti coolers are positively bonkers. The GDDR6X is the biggest surprise. Something to do with these new 16Gb-density, 21Gbps G6X packages and all the attention the AIBs are now paying to VRAM cooling with baseplates. Still haven't broken 60C; if this were the 19Gbps G6X on the 3070 Ti, I would have been creeping up on 80C long ago.
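(For a quick sanity check on those speeds - a back-of-envelope bandwidth sketch in Python; the per-pin rates and bus widths are the cards' published specs, everything else is just arithmetic:)

```python
# GDDR6/6X bandwidth rule of thumb:
# bandwidth (GB/s) = per-pin rate (Gbps) * bus width (bits) / 8
def mem_bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(mem_bandwidth_gbs(21, 192))  # 4070 Ti, 21Gbps on 192-bit -> 504.0 GB/s
print(mem_bandwidth_gbs(19, 256))  # 3070 Ti, 19Gbps on 256-bit -> 608.0 GB/s
```

Which is also why, despite the faster packages, the 4070 Ti's memory subsystem ends up with less raw bandwidth than the 3070 Ti's.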

Meanwhile, the GDDR6 on the RDNA3 cards is just going full steam in the opposite direction - absolutely scorching. 58C at idle on a good day.

Genshin runs fanless.
War Thunder runs fanless.
I can barely hear the fans at 1400rpm.
GPU edge temp idles at 30C flat.
VRAM idles at 30C flat.

That one's fascinating. DirectStorage ain't out yet, and in theory it shouldn't be any slower if it has the same PCIe bandwidth to load up the VRAM.
I don't have any AMD GPUs on hand to test, but if I do get one I'll happily try and find out (so far I've seen no changes between 9, 10 and 30 series Nvidia when my ITX rig died and parts got moved, so it's not a VRAM amount/age thing)

Might tie in with your VRAM downclocking issues (especially if triggered by the multi-monitor problems - they clock 'em up to avoid issues, get backlash on the power, downclock, get backlash for the issues, etc.)
Nvidia weren't immune to those issues either; they still have HDMI 2.1 problems galore and seizure-inducing flickers with multi-monitor setups and any of their scaling settings enabled

It is a very interesting thought. It must be something driver-related to be such a night-and-day difference between the 7900XT and the other 3 cards I've run in the past month (2060S/3070Ti/4070Ti). But what could it be? No one knows.

I'm the same as you - I've never seen this phenomenon from any GPU before. It's the fact that it's the game launch itself that's slow that stumps me. The worst offender was Sniper Elite 5 - I'd regularly wait more than a minute for it to launch.

It's just baffling to me that these two competitors are constantly so wildly different when it comes to multi-monitor.
  • 2060 Super: 165Hz + 165Hz, 40W idle
  • 3070 Ti: 165Hz + 165Hz, 20W idle
  • 4070 Ti: 165Hz + 165Hz, 101MHz VRAM clock, 11W idle
  • 7900XT: 165Hz + 60Hz, sub-200MHz VRAM clock, 17-30W idle
  • 7900XT: 165Hz + 75/90/120/144/165Hz, 2487MHz VRAM clock, 80-90W idle
Frequently it felt like the power metrics didn't add up on the 7900XT. The same way core + SoC power doesn't account for all of PPT on chiplet Ryzens, there's always 10-20W lost to the void. Maybe it's an inevitability of the chiplet design, but AMD also made so much fanfare about how much more efficient and aggressively power-gated the fanout links can be compared to IFOP in CPUs.
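(A minimal sketch of that accounting, in Python - the rail names and wattages below are made-up illustrations, not real telemetry from either card:)

```python
# Estimate the "lost to the void" power on a chiplet part: total board/package
# power minus the sum of the individually reported rails. Hypothetical numbers.

def unaccounted_power(total_w: float, rails_w: dict[str, float]) -> float:
    """Power not attributable to any reported rail."""
    return total_w - sum(rails_w.values())

# Illustrative idle readings, loosely shaped like the 7900XT numbers above
tbp = 30.0                                        # total board power (W)
rails = {"core": 8.0, "soc": 6.0, "vmem": 4.0}    # per-rail readings (W)
print(f"unaccounted: {unaccounted_power(tbp, rails):.1f} W")  # -> 12.0 W
```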

All I know is that AMD doesn't pay me nearly enough money to continue to care about their bugs, so I won't
 
Joined
Jan 14, 2019
Messages
9,971 (5.13/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.

Talking about Radeon drivers always hurts, because the hardware itself is well-built. The XT reference PCB is stout enough that all the AIBs are still using some variation of it. Compare the 4070 Ti PCBs - they look like friggin 3060 Tis.

Spent 29 days with a 7900XT. 3 driver versions, 22.12.2, 23.1.1, 23.1.2. Can't tell me I didn't give Radeon a fair chance.
  • Multi monitor idle power consumption (still acknowledged and unfixed in the release notes), probably will never be fixed
  • Multi monitor stutter during video playback (still acknowledged and unfixed in the release notes)
  • The oscillating core/VRAM clock/power bug during video playback that I reported and collected all that data for
  • Multi monitor stutter while in-game and simultaneously displaying anything with a changing picture anywhere else (still acknowledged and unfixed in the release notes)
  • Incredible stuttering due to VRAM downclocking while in-game
  • Insane framepacing issues in MW19 due to Anti-Lag being completely incompatible with the game
Somehow, in my time with the 7900XT, all games were always slow to launch. The 5800X3D performed just fine, nothing was wrong with the SSDs, and the 7900XT had the benefit of a completely clean Windows install made just for it. Swapped in the 4070 Ti and app launches went right back to being lightning fast. Bizarre. Maybe Radeon drivers were interfering with the AM4 chipset drivers in some way.

Yes, I hate having to give up 20GB VRAM. Yes, I hate the fact that the memory subsystem is a literal downgrade from 3070 Ti. Yes, I hate not having a super compact card. Yes, I don't like 12VHPWR.

But I also LIKE not having to gimp or turn off my own monitors, not doubling my idle draw from the wall, fan stop that actually works, VRAM that doesn't idle at 60C and game at 90C, and a card that doesn't make me babysit and troubleshoot it all day long.

Maybe one day, eventually, Radeon Technologies Group will come under [actually] new management.

Sorry to hear your bad experiences. :(

I think buying a new generation of anything is always a gamble. Maybe it works, maybe it doesn't. And like you said, AMD might pull their act together in the future.

If it makes you feel better, my MSi motherboard (in my profile) was absolute garbage with the factory BIOS. It wouldn't even run EXPO without 1.36 VSOC, would only boot with FCLK 1:2 but still hung up sometimes, and it would freeze and bluescreen even on JEDEC 4800 MHz. Now, after a couple of updates, it is rock solid with EXPO 6000, FCLK at 1:1 and 1.2 VSOC.

As for VRAM downclocking, is your GPU overclocked? I've seen a video somewhere saying that the GPU and VRAM share the same power budget - if the GPU eats more, there's less left for the VRAM.
 

tabascosauz

Sorry to hear your bad experiences. :(

I think buying a new generation of anything is always a gamble. Maybe it works, maybe it doesn't. And like you said, AMD might pull their act together in the future.

As for VRAM downclocking, is your GPU overclocked? I've seen a video somewhere saying that the GPU and VRAM share the same power budget - if the GPU eats more, there's less left for the VRAM.

The main issue is that these aren't even new problems. If anything, Navi31 is a surprisingly solid product on its own. The entire multi-monitor headache collection, the issues with MPO and ULPS - some have been around since Polaris and Vega. For every promise AMD made to fix these issues, there was another AMD employee or affected last-gen owner confirming that AMD is, in fact, bullshitting again.

I never touched VRAM. The downclocking in games that AMD deems unworthy is "working as intended" behaviour in RDNA2 as well. The only even remotely effective solution is to increase render resolution or other means of increasing GPU load. And if that's not enough, well, Jesus take the wheel.

Like I said before, it's unwinnable. Not enough VRAM clock and you get black screens/flicker and stuttering in game. Too much VRAM clock and you get 90W idle. Years and years of the same dilemma, always unchanging. Yet, generation after generation, Nvidia pulls the necessary bandwidth out of its ass to power 165Hz+165Hz?

Anyways, not worth any more of my time. I tried to be positive about FineWine, but I don't live in the future. Ada is smooth sailing.

Did I mention that custom resolutions in Radeon Software always force the display into 6-bit color, without exception? Complaints about this one go back years and years. Not that complicated, AMD.
 
Joined
Jan 14, 2019
I never touched VRAM. The downclocking in games that AMD deems unworthy is "working as intended" behaviour in RDNA2 as well. The only even remotely effective solution is to increase render resolution or other means of increasing GPU load. And if that's not enough, well, Jesus take the wheel.
Not the VRAM, I mean the GPU. If you ask the GPU to draw more power (without touching the VRAM), your VRAM will suffer from less power headroom being available.
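(A tiny sketch of that shared-budget idea, with made-up numbers - the function and wattages are purely illustrative:)

```python
# Core and VRAM draw from one total board power limit, so raising the core's
# share shrinks what's left for memory. Numbers are invented for illustration.

def vram_headroom(board_limit_w: float, core_draw_w: float,
                  other_draw_w: float = 0.0) -> float:
    """Power left for VRAM after the core (and misc rails) take their cut."""
    return max(0.0, board_limit_w - core_draw_w - other_draw_w)

print(vram_headroom(300.0, core_draw_w=250.0, other_draw_w=20.0))  # 30.0 W
print(vram_headroom(300.0, core_draw_w=270.0, other_draw_w=20.0))  # 10.0 W
```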

Sorry to hear your troubles. I guess not every product is meant for everyone.
 

tabascosauz

Not the VRAM, I mean the GPU. If you ask the GPU to draw more power (without touching the VRAM), your VRAM will suffer from less power headroom being available.

Sorry to hear your troubles. I guess not every product is meant for everyone.

Nah, power limit was stock. The MBA XT cooler is small, so I wasn't keen on asking it to do the XTX's job. It can cope fine with 355W in Timespy, but Timespy doesn't run very hot.

When the 3D load is demanding (where the GPU might conceivably be hogging the power), the VRAM is always happy to sit at 2487MHz. It's the games that run at 100W where the VRAM gets lazy.
 
Joined
Feb 20, 2019
Messages
7,394 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Spent 29 days with a 7900XT. 3 driver versions, 22.12.2, 23.1.1, 23.1.2. Can't tell me I didn't give Radeon a fair chance.
  • Multi monitor idle power consumption (still acknowledged and unfixed in the release notes), probably will never be fixed
  • Multi monitor stutter during video playback (still acknowledged and unfixed in the release notes)
  • The oscillating core/VRAM clock/power bug during video playback that I reported and collected all that data for
  • Multi monitor stutter while in-game and simultaneously displaying anything with a changing picture anywhere else (still acknowledged and unfixed in the release notes)
  • Incredible stuttering due to VRAM downclocking while in-game
  • Insane framepacing issues in MW19 due to Anti-Lag being completely incompatible with the game
Somehow, in my time with the 7900XT, all games were always slow to launch. The 5800X3D performed just fine, nothing was wrong with the SSDs, and the 7900XT had the benefit of a completely clean Windows install made just for it. Swapped in the 4070 Ti and app launches went right back to being lightning fast. Bizarre. Maybe Radeon drivers were interfering with the AM4 chipset drivers in some way.

Yes, I hate having to give up 20GB VRAM. Yes, I hate the fact that the memory subsystem is a literal downgrade from 3070 Ti. Yes, I hate not having a super compact card. Yes, I don't like 12VHPWR.

But I also LIKE not having to gimp or turn off my own monitors, not doubling my idle draw from the wall, fan stop that actually works, VRAM that doesn't idle at 60C and game at 90C, and a card that doesn't make me babysit and troubleshoot it all day long.

Maybe one day, eventually, Radeon Technologies Group will come under [actually] new management.

You should forward a link to this post to AMD support.
It *is* hurting their bottom line, and I'm currently displeased about the 90-day-old driver for 6000-series cards. 2022.11.2 means there have been no game profiles or fixes since that date.

Their 9% dGPU market share doesn't need threads like this showing that Nvidia "just works" and AMD has issues. They fumbled the 5700-series press drivers and launch drivers when RDNA launched too, which hurt public perception of AMD GPUs. It doesn't matter that they fixed the 5700-series drivers in a timely manner; the damage was already done.

9% market share. For GPUs that offer significantly better performance/$ in most games, that can only mean a lack of public faith in AMD's ability to give them a smooth experience.
 

tabascosauz

You should forward a link to this post to AMD support.
It *is* hurting their bottom line, and I'm currently displeased about the 90-day-old driver for 6000-series cards. 2022.11.2 means there have been no game profiles or fixes since that date.

Their 9% dGPU market share doesn't need threads like this showing that Nvidia "just works" and AMD has issues. They fumbled the 5700-series press drivers and launch drivers when RDNA launched too, which hurt public perception of AMD GPUs. It doesn't matter that they fixed the 5700-series drivers in a timely manner; the damage was already done.

Last week I put a detailed writeup of the fullscreen video playback bug up on the AMD forums and r/AMD, replete with charts and data. Nobody cared. All of these issues I've submitted via the bug reporter as well. Every one of them, except for maybe MW19 and the video playback issue, they should already know about from years and years of tears. Judging from the [lack of] progress, I don't think anyone reads those. So if the AMD team really wants to do their job, they know where to find it.

In hindsight, of course r/AMD would behave that way. It's a festering circlejerk of people who know nothing about how Ryzen or Radeon work, with the occasional green or blue troll sprinkled in.

Perhaps I should feel lucky that at least I wasn't downvoted and lambasted into oblivion on r/AMD.

When I showed up at the store to pick up an HX1000 for troubleshooting the 7900XT, someone remarked "there's plenty of 4080s here, why not grab one?". I'm sure I said 7900XT and he just assumed I had an XTX. At the time I smiled when he made the usual remark about bad AMD drivers... but much as it's a crude oversimplification, it's hard to argue against it if it's true.

9% market share. For GPUs that offer significantly better performance/$ in most games, that can only mean a lack of public faith in AMD's ability to give them a smooth experience.

Generation after generation, the hardware is strong. Generation after generation, the drivers transport you back in time to the Wild West to fend for yourself.

Like damn, I know down in my bones that Nvidia is indescribably exploitative; I just can't find any more pity left to offer AMD. Treat your customers like garbage, and what goes around comes around.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
7,622 (3.69/day)
Location
Winnipeg, Canada
Processor AMD R9 5900X
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Aqua Elite 360 V3 1x TL-B12, 2x TL-C12 Pro, 2x TL K12
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, Asus Hyper M.2, 2x SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact RGB
Audio Device(s) JBL 2.1 Deep Bass
Power Supply EVGA SuperNova 750w G+, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I wonder what ATi thinks of all this :(
 

tabascosauz

I wonder what ATi thinks of all this :(

Rolling in their graves, probably

One of ATI's children is on the road to world domination (Adreno) and the other is the butt of a joke (Radeon)
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.16/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Interesting. I thought VRAM junction on both Nvidia and AMD (at least in current HWInfo) is taken from the highest reading of all the VRAM packages? At least, that's what the description says.
It's meant to - but that can still go wrong if the temp sensor isn't located at the hotspot, e.g. because of a misaligned thermal pad - the part with the sensor might be cooled, while 1cm over may not be

We get some readings - but not all of 'em, and not all cards make them user/software visible
And then some cards just cut corners and remove some of the sensors to save money
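(A minimal sketch of why that single reported number can miss a hotspot - every value here is invented:)

```python
# Software reports the max across the sensors that exist; a hot spot sitting
# between sensors (say, under a misaligned pad) never shows up in the reading.

per_module_sensors_c = [58, 60, 57, 61]   # what the card actually exposes
true_hotspot_c = 84                       # unsensed spot under a bad pad

reported_tjunction = max(per_module_sensors_c)
print(f"reported: {reported_tjunction}C, real worst spot: {true_hotspot_c}C")
```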
 

tabascosauz

4070 Ti

AD104 cannot idle its core or VRAM nearly as low as Navi31 can, but idle board power is down in the single digits. "GPU Power" is TBP, as it's equivalent to PCIe slot power + 12VHPWR power. There is no practical difference between running 60Hz or 165Hz on the second monitor (60Hz at left, 165Hz at right); both end up around the 9W mark.

There's an argument to be made that HWInfo isn't properly interpreting Nvidia cards' sleep states, but with the behaviour identical to the previous 3 generations (1070, 2060S, 3070Ti), I'm pretty confident that they are just simpletons - a steady and static idle clock, but delivering relatively low idle power figures.
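(For anyone who wants to log this kind of idle behaviour themselves, a minimal sketch using NVML via the nvidia-ml-py Python bindings - assumes that package is installed; the loop and sample count are arbitrary, and this shows the method only, not the source of the numbers above:)

```python
# Poll board power and clocks once a second over NVML.
# nvmlDeviceGetPowerUsage() returns milliwatts where the board supports it.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    for _ in range(10):                          # ten 1-second samples
        mw = pynvml.nvmlDeviceGetPowerUsage(handle)
        sm = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        print(f"{mw / 1000:5.1f} W  SM {sm} MHz  MEM {mem} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```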

4070 ti idle 60hz.png
4070 ti idle full.png


When only a single 165Hz monitor is turned on, the 4070 Ti idles at just 5W. The "2W" readings come from times when the computer is awake but the monitors are sleeping.

4070 ti idle single.png


----------------------------------------------------------------------------------------------------------------------------------------------------------------------------

7900XT


By contrast, Navi31 power-gates its GCD nearly as aggressively as modern Ryzen parts (where core power regularly approaches zero watts). If the Radeon overlay and HWInfo are to be believed, the VRAM clock can also dip far below what AD104 can do at idle (double-digit clocks). Unfortunately, even at ideal idle (15-20W), the GCD's savings are diluted by 10W of mystery power.

I wonder if making Navi31 monolithic would have cut down on that number; that's the relationship between Vermeer and Cezanne - Vermeer CPUs always have 10-15W of this "none of the above" power draw, while monolithic Cezanne parts won't even see 5W of miscellaneous power even when Fabric is heavily overclocked.

23.1.1 power.png


But that's with the second monitor at 60Hz. As expected, idle power goes flying out the window when both are turned up to 165Hz. Single-monitor 165Hz is identical to the 165Hz+60Hz results.

23.1.2 power v2.png


It's interesting to observe that the power isn't drawn from any one particular rail but is distributed across all 3 rails that might remotely have something to do with VRAM function. Strictly speaking, for 2500MHz, Vmem itself isn't all that bad at 12W... unfortunately, when I still had access to the 7900XT, it didn't cross my mind to look at GCD vs. MCD temperatures under these circumstances, and which die (GCD or MCD1-MCD5) was really accountable for the higher core/hotspot temps.



At the end of the day, as is typical for RDNA, there's a lot of cool innovation here from the engineers: the aggressive sleep states of the GCD, the low idle memory clock, the super low idle Vcore, and the Ryzen-esque granularity of clock control on every rail. Unfortunately, the drivers render much of it pretty pointless.
 
Joined
Jan 14, 2019
4070 Ti

AD104 cannot idle its core or VRAM nearly as low as Navi31 can, but idle board power is down in the single digits. "GPU Power" is TBP, as it's equivalent to PCIe slot power + 12VHPWR power. There is no practical difference between running 60Hz or 165Hz on the second monitor (60Hz at left, 165Hz at right); both end up around the 9W mark.

There's an argument to be made that HWInfo isn't properly interpreting Nvidia cards' sleep states, but with the behaviour identical to the previous 3 generations (1070, 2060S, 3070Ti), I'm pretty confident that they are just simpletons - a steady and static idle clock, but delivering relatively low idle power figures.
What do you mean "it cannot idle as low as Navi 31"? 210 MHz on the GPU and 101 on the VRAM and a single-digit power consumption seem pretty idle to me. :)

The difference is that Nvidia operates with a minimum 2D clock that it cannot go under, while AMD doesn't.

It looks like your problem is solved, although by buying another graphics card, which I guess wasn't the ideal choice, but still. :)
 

tabascosauz

The difference is that Nvidia operates with a minimum 2D clock that it cannot go under, while AMD doesn't.

And maybe the solution could really be that simple: more thresholds at which the VRAM clock goes up a notch to accommodate more displays. If Nvidia's very discrete style of clock handling can handle at least 2 in-between VRAM clock states, surely it should be no sweat for AMD's design. They already use 'em for video playback!

Say, a small monitor gets VRAM that falls to its usual near-zero clock. Maybe a single 4K gets 200MHz to be safe. Then two monitors up to 1440p165 get a bump up to 900MHz. Then maybe another tier around 1500MHz to keep the triple-monitor crowd happy. Then full 2500MHz for everyone else.
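(A minimal sketch of that tiering, keyed on total pixel throughput - the cutoffs and clocks are the hypothetical ones from the paragraph above, not anything AMD actually implements:)

```python
# Hypothetical VRAM idle-clock tiers based on total display demand.
# Pixel rate = width * height * refresh, summed over attached monitors.

def pixel_rate(w: int, h: int, hz: int) -> float:
    return w * h * hz

def idle_vram_clock_mhz(monitors: list[tuple[int, int, int]]) -> int:
    total = sum(pixel_rate(*m) for m in monitors)
    if total <= pixel_rate(1920, 1080, 60):
        return 10        # near-zero idle clock for a small single monitor
    if total <= pixel_rate(3840, 2160, 60):
        return 200       # single 4K
    if total <= 2 * pixel_rate(2560, 1440, 165):
        return 900       # two monitors up to 1440p165
    if total <= 3 * pixel_rate(2560, 1440, 165):
        return 1500      # the triple-monitor crowd
    return 2487          # full speed for everything beyond

print(idle_vram_clock_mhz([(2560, 1440, 165), (2560, 1440, 165)]))  # -> 900
```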

I think I'll go to bed dreaming that the Adrenalin driver maintainers actually took this feedback and implemented it :)
 
Joined
Jan 14, 2019
And maybe the solution could really be that simple: more thresholds at which the VRAM clock goes up a notch to accommodate more displays. If Nvidia's very discrete style of clock handling can handle at least 2 in-between VRAM clock states, surely it should be no sweat for AMD's design. They already use 'em for video playback!

Say, a small monitor gets VRAM that falls to its usual near-zero clock. Maybe a single 4K gets 200MHz to be safe. Then two monitors up to 1440p165 get a bump up to 900MHz. Then maybe another tier around 1500MHz to keep the triple-monitor crowd happy. Then full 2500MHz for everyone else.

I think I'll go to bed dreaming that the Adrenalin driver maintainers actually took this feedback and implemented it :)
I think the basic idea is there, only there's some grit in the machinery on the practical implementation side of things.

I mean, AMD GPUs can clock as high or low as they want, but when you hook up your two 165 Hz displays, the card doesn't seem to know what to do with them.
 

Mussels

Benchmark Scores Nyooom.
And maybe the solution could really be that simple: more thresholds at which the VRAM clock goes up a notch to accommodate more displays. If Nvidia's very discrete style of clock handling can handle at least 2 in-between VRAM clock states, surely it should be no sweat for AMD's design. They already use 'em for video playback!

Say, a small monitor gets VRAM that falls to its usual near-zero clock. Maybe a single 4K gets 200MHz to be safe. Then two monitors up to 1440p165 get a bump up to 900MHz. Then maybe another tier around 1500MHz to keep the triple-monitor crowd happy. Then full 2500MHz for everyone else.

I think I'll go to bed dreaming that the Adrenalin driver maintainers actually took this feedback and implemented it :)
I think the problem is that they pick their aggressive default for an assumed common audience - say, a single-monitor 4K120Hz setup (based on monitor bandwidth more than anything).

When it goes out of that comfort zone, they crank the shite up to max - and they don't seem to have a middle state for this. Nvidia's got a low 3D state for when you take the monitors to extremes, and AMD seems to lack that.
 

tabascosauz

Still waiting for the Caselabs parts production to start...............

In the meantime, got some more peripherals upgrades:

FiiO K7: new DAC. Big size. Upgrade from the E10K. Night-and-day difference. Not much else to say - it's great.

fiio k7 size comparison resize.jpg
fiio k7 and e10k.jpg


MCG Ultimate: replaced the SCG (Kosmosima). This marks "endgame" for any VKB setup, buuuuuuut... this one's a weird one. Yes, the MCGU is a 600g absolute unit and makes the TM Warthog embarrassed to exist. Yes, the MCGU is always a delight to use. Yes, the MCGU is one of the best grips on the entire market. But the size, shape and weight make cam and spring choice an endless struggle. It's been 3 months of non-stop tweaking and I'm still not 100% sure about things (right now on Space-S cams on both axes, with a single #40 spring on each). Maybe I do need a short extension after all? Anyways, I have decided to hold off on making any more changes before the Strike Eagle releases next month.

MCGU june 2023.jpg


Foxx mount: Foxx makes great desk mounts and has great customer support. Originally I wasn't too happy with the standard mount, so I opted for some parts upgrades. The new assembly method with the pitch adapter allows canting the stick forward by up to 30 degrees, and is also easier to assemble and sturdier than the standard mount, which uses 2 anchor fasteners to hold the two beams together. At the moment the stick is still flat, but I'm already appreciating the superior mounting system.

Additionally, I requested a custom horizontal beam about 50mm shorter than standard, to minimize wasted space in front of the Gunfighter base. Every inch of extra length pushes my chair further out and further away from my already-small-for-sims 32" panel.

Finally, to round out the package: a silver top plate for some added flair, and a Blackbox mount that attaches to the reverse side, so that everything the stick needs can be consolidated on the mount itself.

foxx mount june 2023.jpg
foxx mount top plate.jpg
 
Joined
Jun 14, 2021
Messages
2,107 (1.99/day)
Location
UK
It always feels weird to name these inanimate objects, but maybe it's the constant hanging out on F150forum.com that has gotten me used to the practice. :laugh: In all seriousness, this ol' girl has a name because we've been up to 30,000ft like it's a second home, to all different places over yonder, and she helped me see through the (not particularly enjoyable) events of the past few years. Sometimes being able to come back and fire up a game or movie or music, or just sit there and stare at a webpage, was all there was between me and something very, very rash.

Now, after years of hauling a Pelican 1510 that would weigh between 25-30lb through distant airports, all while sweating bullets (in addition to sweating normally, because it's damn hot lugging that thing around) and fearing what the gate agents on shift that day would say about my 1510, I think it may be time to wrap up that life. Yes, I do have a short stint remaining at school, which means I will need to fly there and back again with something that is not a laptop, but that will most likely be something along the lines of a Hades Canyon NUC; whether it's an HNK or HVK is something I have not yet decided. All I know is that you will not find me twiddling my thumbs waiting for the debut of Intel's new dGPU, which will most likely be the centerpiece of the Hades' replacement - the Ghost Canyon NUC. That would also mean the retirement of my 1510 in its current role, as my new and miniature (in relative terms) Pelican 1300 in screaming yellow fits the bill far better for such a compact machine; I would just be putting the padded dividers back in the 1510 and using it to house my camera gear.

This frees up Ol' Beastie to move to a roomier case, without the constraints of the 1510. While a Caselabs Mercury S3 or Nova X2M would have been an impeccable choice at this time, as my luck would have it, Caselabs no longer exists. With that in mind, the choice of case for the near future is still very much in question. I'm also having trouble deciding if I really want to give up the ability to take Ol' Beastie, fully protected, around town to friends' houses.

I think it's an elitist thing, lmao. The M1 is a little more premium than most mass-market ones, and I'm having some trouble accepting the prospect of going back to an NZXT (which still makes damn fine cases, by the way) or a Silverstone, kek. It's also a nice little case, and if I'm gonna be honest, nothing else is really pulling me in and giving me a real reason to ditch the M1 as of yet.

--------

In the beginning, I was just tossing ideas around. In that warm spring of 2015, I had long since come to realize that I could not survive without the familiar feel of a desktop PC. :laugh: So at that point, the conclusion was that whatever I would use, I’d have to bring with me.

So I started with the Pelican 1510, ubiquitous amongst photographers everywhere. I picked pluck foam to start.

But what to make, and how to make it? There were no guides on building a PC specifically to endure the rigors of air travel. What protection would I use? What case could I fit? Would air coolers and GPUs be safe, or would they transform into unrestrained weapons ready to tear through their sockets and slots at the slightest shock? Most AIOs might make it through security on a round trip... but "mostly" wasn't going to cut it; I was going to have to make it through CATSA/TSA/Border Force every single time, without fail, on my 4-6 flights a year.

Hell, even if you queried a forum for suggestions, they’d mostly just laugh and tell you to buy a laptop or build one at the destination. With the rise of SFF and uSFF, followed by the continuous revision of the M1 design and introduction of the SM550 and DAN-A4, it’s incredibly easy nowadays. There’s even a dedicated shoulder bag for the M1 and A4. But that sure wasn’t the case back then.

The SG08B-Lite popped up as a reasonable candidate, given its ability to fit the D9L while still being under 15L in volume. The build came together as an i3-4160, H81I and R7 265, with an XFX Core Edition (Seasonic S12II) providing power, as for much of the time before my departure it would only function as a secondary HTPC to my main rig. Sadly, Dropbox had a big fustercluck about a year later and deleted nearly all the photos I had of that setup.


September came, and off we went. It was the first time I was flying with such a thing. I very quickly discovered what many a photographer had discovered to their chagrin - the stock plastic wheels on the 1510 are horrible. If air travel wouldn't break my PC, those wheels would make sure it was shattered by the time it reached its destination. But carrying it was also heavy as hell (the SG08B-Lite has a full 10mm-thick front panel made of solid aluminium), so I did this awkward combination of wheeling it gingerly over smooth surfaces and carrying it the rest of the way.

I was so jumpy going through CATSA - not good vibes to have at airport security. The guy was nice and all, and curious about the components I had chosen, but out of the subsequent dozen opportunities, that first flight marked the one and only time my rig was ever swabbed for explosives. :eek:

The next challenge was the size of the carry-on. Air Canada had introduced these stupid "check your carry-on size for fitment compliance" crates constructed from steel tubing, and it just so happened that the Pelican 1510, despite complying with literally every major airline's standards, was slightly out of spec for Air Canada's on one axis. Furthermore, it was also a couple of pounds too heavy. The gate agent was less than understanding at first, but after I explained everything at stake and that I simply could not accept checking it into the hold, he put a red tag on it and I got on. There were no further hiccups on that flight.


Things went well until 1) I got tired of carrying around 28lb through Heathrow Terminal 2, and 2) my H81I started giving up on life. Given how the board looked by the end, I wagered that the horizontal motherboard placement and the weight of the D9L on top of it had done it in. As to the first point, little did I know it was about to get a lot worse. Air Canada flights are generally ushered into Terminal 2 gates, a large, modern complex that manages to be airy and refreshing; most importantly, the floors are smooth tile. I switched to BA for their standard dimensions and generous 51lb carry-on allowance. British Airways flights to Western Canada come into Terminal 3, a literal fucking hellhole on Earth, where the floors were designed to destroy PCs like mine, the walk from check-in to the lounge is a Long March in itself, and the lounge to whatever gate they announce is another Long March. The gate is always fluid and changing, only announced an hour prior to departure. Have fun getting 25lb to the gate within 20 minutes!

Over the next half dozen flights, I quickly learned to arrive at the gate early and get on early. Some lady might be trying to keep her Coach purse from contacting the filthy floor of the cabin, but my stakes were much, much higher - sorry, ma'am.

And thus, next in line was an SG05. This drastically reduced Ol' Beastie's size and weight, but meant a downgrade to a Silverstone 450W SFX unit as well as an L9x65. By this time I had swapped in my 4790K, so major thermal problems were to be expected. From then until my Ryzen 3000 upgrade in August 2019, that 4790K ran underclocked to 3.5GHz.



The R7 265 also departed, in favour of a GTX 750 Ti from EVGA. It was technically a downgrade in graphics performance, but Maxwell held two significant advantages: it was incredibly efficient in thermals and power consumption, and, being EVGA, it supported a backplate, which I purchased for $15 extra. Given the nature of this computer, I take any extra rigidity and strength I can get. As for the dying H81I, an H97N-WIFI was substituted, with integrated 802.11ac MIMO WiFi being a breath of fresh air compared to the shockingly terrible USB 802.11n adapters of old.



--------

to be continued
Long, interesting post. :)
 


tabascosauz

Another reminder that the road to SFF is fraught with unforeseen peril......

finished l5 top.jpg
finished l5 with a2000.jpg


In the middle of a game, the RTX A2000 suddenly decides to limit itself to below 30W of total board power and gets stuck there. As expected, massive lag and panic ensue. Never happened before when running Timespy or games.

The card just refused to exceed 30W. Core and mem were both constantly spiking to 3D clocks, then immediately dropping back again. %TDP Power Consumption in GPU-Z would start out normal, then as soon as a 3D load began it would blank out and display nothing. Constant Pwr perfcap. Nothing showed as amiss in software for the A2000. What I tried:
  • Tried clean installing a bunch of different drivers
  • DDU'd
  • Removed leftover AMD driver garbage (thanks ddu NOT)
  • Reflashed BIOS
  • fTPM off and on
  • Updated to 22H2
  • HAGS off and on
  • PCIe link speed
In the end it was some combination of the following that got the card back to normal:
  • Cleaned out the card
  • Cleaned out x16 slot
  • Reseated the card
  • Made a 76W PBO profile for the 5700G
  • Cleared CMOS
  • Unplugged and discharged mobo power
Since it is a slot-powered card, after all, it stands to reason that gunk in the PCIe slot could have had something to do with it.
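(If it ever acts up again, a minimal sketch of watching for that power perfcap from a script, using NVML's throttle-reason bitmask via the nvidia-ml-py bindings - the polling window is arbitrary and the whole thing is illustrative:)

```python
# Poll the clocks-throttle-reason bitmask to catch a card pinned at its
# software power cap, like the 30W-stuck A2000 described above.
import time
import pynvml

pynvml.nvmlInit()
h = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(30):                               # watch for ~30 seconds
    reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(h)
    power_w = pynvml.nvmlDeviceGetPowerUsage(h) / 1000
    capped = bool(reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap)
    print(f"{power_w:5.1f} W  SW power cap active: {capped}")
    time.sleep(1)

pynvml.nvmlShutdown()
```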
 

tabascosauz

Oh yeah, let's go baby

AMD proving once again that AGESA is not a friend of the people

20230825_105114~01.jpg
 
Joined
Feb 20, 2019
Oh yeah, let's go baby

AMD proving once again that AGESA is not a friend of the people

RAM timings after clearing CMOS?
It's the worst thing about XMP - they're basically Intel-only timings that AMD has to work with (and they don't do a great job).
 

tabascosauz

RAM timings after clearing CMOS?
It's the worst thing about XMP - they're basically Intel-only timings that AMD has to work with (and they don't do a great job).

No, I flashed to the F18b BIOS (120A) because I'm still struggling with the A2000 30W issue, and also want to get fTPM functionality back for updates.

The new BIOS is actual flaming garbage. It is just literally allergic to high Fabric clocks. I came back down to 2000 FCLK for now; the instability instantly broke my Windows install, and it was so bad it prevented the Windows installer from continuing. All the settings I punched in are identical to what I ran for more than a year on F13l, including my usual 4333CL16 profile.

I've still got a couple of ideas for the A2000, so I'll leave it here while I work on that. I was planning on taking advantage of the A2000 to push past 2200 FCLK (which I can do, just not with the iGPU active), but that plan went in the bin.
 
Joined
Feb 20, 2019
No, I flashed to the F18b BIOS (120A) because I'm still struggling with the A2000 30W issue, and also want to get fTPM functionality back for updates.
Ugh, I'm assuming this is the Aorus?
Gigabyte have never been my choice for tinkering with overclocks or undervolts. I buy them more than any other brand since they're the least problematic/most reliable for "comfortable" clocks/timings, but they never push well, and 2166 FCLK is more than I'd have expected from an Aorus TBH - you're lucky you got that far!
 

tabascosauz

Ugh, I'm assuming this is the Aorus?
Gigabyte have never been my choice for tinkering with overclocks or undervolts. I buy them more than any other brand since they're the least problematic/most reliable for "comfortable" clocks/timings, but they never push well, and 2166 FCLK is more than I'd have expected from an Aorus TBH - you're lucky you got that far!

Yeah, it has a lot of stupid quirks that I've gotten used to over 3 years, but I cannot stand bad BIOSes. The F13 release was a bad BIOS, so I stayed on the elusive F13l beta all the way until now. But having fTPM disabled prevents me from getting major Win 11 feature updates, so I thought I'd just try something new instead of doing the ol' switcheroo and waiting for Windows Update to react.

It is an 8-layer ITX board, so I know it can do better (and I know my RMA'd 5700G can do better too). I did some 4800 runs with the 4650G on this board, but stayed at 4333 because of the iGPU.

The most annoying quirk is BIOS settings sometimes not saving, or AX200 WiFi and/or BT being broken until you completely drain power from the board.

I don't really have any alternatives, though. I have a B550M-ITX/ac, but that board is kinda sucktastic - a 6-layer ITX that bends easily, and the rear I/O doesn't line up properly. I also have a B550 Strix-I, but RAM OC kinda sucks on it and Asus' BIOS auto-recovery is legit awful, so it's just been running the super-damaged 4650G at stock for a while now.
 
Joined
Feb 20, 2019
Yeah, it has a lot of stupid quirks that I've gotten used to over 3 years, but I cannot stand bad BIOSes. The F13 release was a bad BIOS, so I stayed on the elusive F13l beta all the way until now. But having fTPM disabled prevents me from getting major Win 11 feature updates, so I thought I'd just try something new instead of doing the ol' switcheroo and waiting for Windows Update to react.

It is an 8-layer ITX board, so I know it can do better (and I know my RMA'd 5700G can do better too). I did some 4800 runs with the 4650G on this board, but stayed at 4333 because of the iGPU.

The most annoying quirk is BIOS settings sometimes not saving, or AX200 WiFi and/or BT being broken until you completely drain power from the board.

I don't really have any alternatives, though. I have a B550M-ITX/ac, but that board is kinda sucktastic - a 6-layer ITX that bends easily, and the rear I/O doesn't line up properly. I also have a B550 Strix-I, but RAM OC kinda sucks on it and Asus' BIOS auto-recovery is legit awful, so it's just been running the super-damaged 4650G at stock for a while now.
I've not worked out a reliable pattern for which vendors/models are best run from the AGESA menu versus the vendor's own settings.

The fact that there are often duplicate menus in most BIOSes is bad enough, but it's usually possible to set two conflicting values in the BIOS - you can enable PBO on one page and disable it on another, then go back to the first page and it's still showing enabled...

IMO the AGESA menu should never be exposed to end users; it would stop AMD blaming vendors for bugs, and vice versa.
 