FurMark massively downclocks the card.
At a 400 W power limit, a 3090 or 3080 that would normally run at 1920 MHz will run at around 1300 MHz in FurMark.
If the card didn't downclock, you could destroy it.
Proof? Try renaming furmark.exe to Quake3.exe or UnrealTournament.exe
Don't blame me for what happens.
A few issues with your argument:
1) It's not Catalyst from 13 years ago. Renaming FurMark won't make a difference. Just in case I'm too old and not keeping up with modern things, I ran the latest v1.26 on my 1070Ti, and.... [drumroll] .... no difference whatsoever. It's not FurMark that downclocks your card, it's your card that downclocks your card for one reason or another (if you want to check your own card, there's a quick clock-logging sketch at the end of this post).
2) I think you misunderstand how Nvidia boost works. Just because you get 1900+ MHz in one application doesn't mean you'll get the same boost in another. What's more, unless you're doing strictly manual OC in a tightly controlled environment with a decent warm-up, there's no guarantee you'll get the same clocks in the same app across consecutive runs. Boost 4.0 is so temperature-dependent that you may drop a few MHz just by farting in the room with your PC.
3) I'm neither rich nor lucky enough to torture a 3090 personally, but I can find at least a few screenshots/videos of an RTX 3090 running FurMark at well above 1300 MHz even at a 400 W power limit.
4) Core boost clock depends on many factors, including temperature, power, voltage and, consequently, load type (i.e. load intensity). It's the same reason a stock 9700K won't boost as high in Linpack as in the CPU-Z stress test.
Once again, it's not a software problem, it's a hardware problem. Could be a bad VRM, could be PCB defects like on EVGA's 10-series cards, but either way it's on EVGA.
@buildzoid did a video about this not too long ago; it goes into more detail on what's actually happening here.
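For anyone who wants to verify this on their own card instead of arguing about it: here's a minimal Python sketch (my own, not from FurMark or anyone above) that just polls nvidia-smi once a second. Run it in a terminal while FurMark, a game, or any other load is running and compare the logged clocks, power draw and temps yourself. It assumes nvidia-smi is on your PATH and that the GPU you care about is index 0.

```python
import subprocess
import time

# Standard nvidia-smi query fields: SM clock, board power draw, core temperature.
FIELDS = "clocks.sm,power.draw,temperature.gpu"

while True:
    # One sample per second; Ctrl+C to stop.
    sample = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), sample)
    time.sleep(1)
```

If the clocks really drop to ~1300 MHz only in FurMark, you'll see it in the log along with whether it's power, temperature or something else that's pulling them down.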