
AMD Radeon RX 6600 XT PCI-Express Scaling

Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Now there's a strawman argument. Where did I say any of that? I didn't. The only thing I said was that AMD has a habit of kneecapping themselves when they start catching Nvidia: rebranding cards, too little memory (4GB 580), or an x8 bus that impacts performance in some games (6600 XT; the 5500 XT was hit by BOTH of these issues).
Not a straw man, just highlighting the flawed logic behind your argument.
You know, outside of software that did show a performance difference. Of course:

So no real world consequences. Outside of real world consequences, but who counts those?
...sigh. I and several others have argued why considering those outliers precisely as outliers is reasonable. I have yet to see an actual argument for the opposite.
Right, so any time performance doesn't line up with expectations, there are excuses.
No. Perspective and context is not the same as an excuse.
Using an x16 bus like Nvidia would fix that problem, but the GPU isn't gimped. Everyone knows that buggy console port games NEVER sell well or are popular, ever. Right?
Hey, look at that, a first attempt at an argument for why these games matter. However, it is once again deeply flawed. If a port is buggy and performs poorly, on whom is the onus to rectify that situation? The developer, engine developer/vendor, GPU (+driver) maker, OS vendor/platform owner, all of the above? Depending on the issue, I would say some balance of all of the above. You are arguing as if the only responsible party for improving things is AMD. Hitman's highly variable performance is widely documented, as are DS's and HZD's issues. AMD has done work towards improving things with driver updates, as have developers, but at this point, outside of possible effects of unidentified early driver bugs that typically get fixed in 1-2 releases after the launch of a new SKU, the remaining responsibility falls squarely on the developer and engine vendor.
If you have to come up with excuses for why examples of an x8 bus hurting performance don't actually matter, you've answered your own question.
But that's the thing: we have no proof of that. We know that a 3.0 x8 bus on a current-gen high-end CPU sees a performance drop. We have no way of knowing whether a 4.0 x16 bus would perform better than an x8 one - it might be identical. A 3.0 x16 bus would likely perform better than a 3.0 x8 one (as it matches 4.0 x8 in bandwidth), but that still leaves another issue with this 'fault': essentially nobody is going to use this GPU on PCIe 3.0 with a CPU as fast in gaming as the 5600X. Nobody who bought a 9900K or 10700K is going to buy a 6600 XT - most likely they already have something equivalent or better. And if your CPU is holding you back, well, then you won't have the performance overhead to actually see that bandwidth bottleneck at all.

So the scope of your issue is progressively shrinking:
It's not that PCIe 4.0 x8 is a bottleneck; it only shows up on older 3.0 systems.
It's not a bottleneck on all older systems, only those with CPUs close to the gaming performance of the 5600X.
It's not only on fast CPUs, but only in a highly selective range of titles with known issues unfixed by developers.
The Venn diagram of all of these caveats is tiny. Hence the 'issue' is overblown.
You've constructed your own argument here that you can never lose, because you immediately discredit anything that goes against your narrative. I don't know what it is about the modern internet where any criticism of a product has to be handwaved away. The 6600 XT is already a gargantuan waste of money; why defend AMD further screwing with it by doing this x8 bus thing that Nvidia would get raked over the coals for doing?
Because criticism should be considered, relevant, accurate, useful, applicable, and so on. This criticism is none of those. It is pointing to a spec-sheet deficiency that has negative consequences in an extremely restrictive and unlikely set of circumstances. Making that out to be a wholesale criticism of the product is irrational and deeply flawed logic, which speaks either to bias (implicit or explicit) or just poor reasoning.

You say the 6600 XT is a 'gargantuan waste of money' - in your opinion, is it more so than the 3060? Or the 3060 Ti? At MSRP, or at street prices? I could understand that argument in the current market if leveled against literally every GPU sold, as they all are. But you're singling out one specific card. That requires more reason than "it has the potential to slightly underperform if you play a highly specific selection of titles on a highly specific hardware configuration".

And that's the point here. People aren't defending the 6600 XT specifically as much as we are arguing against any singling out of it. The arguments for doing so do not stand up to scrutiny. Sure, there are (very) niche cases where it will be a poor choice, and if you're in that niche, then it really isn't the GPU you should be looking at buying. But extrapolating that into supposedly generally applicable advice? That's some really, really poor reasoning.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
"where in this review outside of cases where it matters can you find examples of it mattering"

Well, if you're going to immediately throw out evidence you don't like, this conversation will go nowhere.
You’re the one who’s trying to ignore the evidence to the contrary and claiming it’s “kneecapped” over the 2%. It still beats its competition with your “disability” regardless.
 

Fast Turtle

New Member
Joined
Aug 24, 2021
Messages
6 (0.01/day)
I'm curious how the card compares against the 5600 XT, as this is supposedly a direct replacement/upgrade for it.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
I'm curious how the card compares against the 5600 XT, as this is supposedly a direct replacement/upgrade for it.
Check one of the many reviews? They all include comparisons to a heap of other GPUs.
ASRock Phantom Gaming D
Asus ROG Strix OC
MSI Gaming X
XFX Speedster Merc 308
Sapphire Pulse XT

Tl;dr: going from the Pulse XT (which is the closest to stock clocks of the cards above), it's about 33% faster at 1080p and 1440p and 22% faster at 2160p (not that you'd use this GPU for 2160p gaming), with ~10W higher power draw.
 
Joined
Jul 24, 2009
Messages
1,002 (0.18/day)
Some time ago I read an article (maybe even here) about whether any card can saturate PCIe 3.0 x16. If I remember right, the answer was that it can't. And since that wasn't very long ago, I'm fairly positive this kind of mainstream GPU is perfectly fine with x8.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Some time ago I read an article (maybe even here) about whether any card can saturate PCIe 3.0 x16. If I remember right, the answer was that it can't. And since that wasn't very long ago, I'm fairly positive this kind of mainstream GPU is perfectly fine with x8.
IIRC the 2080 Ti was the first tested GPU to show significant (i.e. more than 1-2%) performance limitations when going from PCIe 3.0 x16 to 2.0 x16. And the 3080 Ti still doesn't meaningfully differ between 4.0 x16 and 3.0 x16 (and only drops ~4% at 2.0 x16). If anything, the different PCIe scaling between these two GPUs in outlier games like Death Stranding points to something else besides PCIe bandwidth being the cause of the drop, as there's no bandwidth-related reason why the 6600 XT should lose more performance moving from 4.0 x8 to 3.0 x8 than the 3080 Ti does from 3.0 x16 to 2.0 x16 - those are the same bandwidth, after all. This indicates that the cause for the drop in performance in those titles isn't the bandwidth limitation itself, but some other bottleneck (driver issue? Some convoluted game bug? The engine for some reason transferring far more data to VRAM on RDNA2 than on Ampere?) that can't be identified through this testing. It's too bad the 3080 Ti wasn't tested with Hitman 3, as that would have been another interesting comparison.
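The bandwidth equivalences leaned on here are easy to sanity-check. A minimal sketch, with per-direction figures derived from each generation's transfer rate and line encoding (real-world throughput runs a bit lower due to protocol overhead):

```python
# Per-lane PCIe link parameters: (transfer rate in GT/s, encoding efficiency)
RATES = {"2.0": (5.0, 8 / 10), "3.0": (8.0, 128 / 130), "4.0": (16.0, 128 / 130)}

def gb_per_s(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    gt, eff = RATES[gen]
    return gt * eff / 8 * lanes  # GT/s -> GB/s per lane, scaled by lane count

# 6600 XT native (4.0 x8) vs. a hypothetical 3.0 x16 link: identical (~15.75 GB/s)
assert gb_per_s("4.0", 8) == gb_per_s("3.0", 16)

# 6600 XT on 3.0 x8 vs. 3080 Ti on 2.0 x16: near-identical (~7.9 vs. 8.0 GB/s)
assert abs(gb_per_s("3.0", 8) - gb_per_s("2.0", 16)) < 0.2
```

With those numbers, the card's native link and a 3.0 x16 slot carry the same data per second, which is why the comparison treats the two drops as equivalent.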
 
Joined
Aug 13, 2009
Messages
3,221 (0.58/day)
Location
Czech republic
Processor Ryzen 5800X
Motherboard Asus TUF-Gaming B550-Plus
Cooling Noctua NH-U14S
Memory 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC
Video Card(s) Sapphire Radeon Rx 580 Nitro+ 8GB
Storage HP EX950 512GB + Samsung 970 PRO 1TB
Display(s) HP Z Display Z24i G2
Case Fractal Design Define R6 Black
Audio Device(s) Creative Sound Blaster AE-5
Power Supply Seasonic PRIME Ultra 650W Gold
Mouse Roccat Kone AIMO Remastered
Software Windows 10 x64
Is there any explanation for why the card is limited to x8?
 
Joined
Feb 20, 2019
Messages
8,283 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Is there any explanation for why the card is limited to x8?
Cost savings.
x8 is fewer traces on the PCB, simpler PCB layout, less expensive gold and copper used etc.

I know these cards are being scalped at $600 and have a high MSRP of $380, but realistically this is a budget/entry-level design where cost-effectiveness is more important than outright maximum performance. If they can shave 3% off the price and it only costs 2% in performance, that's a worthwhile tradeoff.
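That tradeoff is just performance-per-dollar arithmetic. A quick sketch using the 3%/2% figures from the post above (the baseline price is illustrative, not a BOM figure):

```python
price, perf = 380.0, 100.0       # illustrative MSRP (USD) and normalized performance
price_x8 = price * (1 - 0.03)    # board is ~3% cheaper with an x8 layout
perf_x8 = perf * (1 - 0.02)      # and ~2% slower in the bandwidth-bound cases

# Value per dollar still improves, so the cut is rational for a budget design.
assert perf_x8 / price_x8 > perf / price
```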
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Cost savings.
x8 is fewer traces on the PCB, simpler PCB layout, less expensive gold and copper used etc.

I know these cards are being scalped at $600 and have a high MSRP of $380, but realistically this is a budget/entry-level design where cost-effectiveness is more important than outright maximum performance. If they can shave 3% off the price and it only costs 2% in performance, that's a worthwhile tradeoff.
Yeah, there's also the (small, but existent) die area savings from a smaller PCIe PHY and minute power savings from the same. Not an unreasonable thing for what this GPU is clearly designed to be - an upper midrange budget+ GPU (likely in the ~$300 range like the 5600 XT). Of course the market has made pricing like that unrealistic from many directions (increasing material prices, fab capacity shortages, silicon wafer shortages, SMD component shortages, etc., plus crypto, plus demand from gamers sitting on 2-generations-old hardware, +++), but this design was likely started long before this situation started making itself known in a major way. IIRC the 5600 XT was x8 as well, no? Alongside the RX 460/560 too. It's probably designed to scale downwards with 1-2 cut-down SKUs using the same reference board design (just not fully populated with VRMs etc.), which would make cost savings important as margins on those cheaper products would inevitably be lower.
 
Joined
Oct 7, 2018
Messages
118 (0.05/day)
Location
Pennsylvania, USA
Processor AMD Ryzen 5900X
Motherboard MSI MAG B550 Mortar
Cooling ARCTIC COOLING Liquid Freezer II 240
Memory G.SKILL Flare X Series 32GB (4 x 8GB) 288-Pin DDR4 SDRAM DDR4 3200
Video Card(s) EVGA GeForce GTX 2080 FTW3 Ultra, 08G-P4-2287-KR, 8GB GDDR6
Storage 1 x Samsung 980 PRO 500G | 1 x Mushkin Enhanced Pilot-E M.2 2280 2TB | 2 x 1TB WD10EADS
Display(s) 1 x ASUS ROG PG259QNR, 1 x Dell ST2421L
Case Lian Li O11D MINI-X
Audio Device(s) SteelSeries Arctis 5
Power Supply Seasonic FOCUS SGX-650, 650W
Mouse Mionix NAOS QG
Keyboard SteelSeries Apex Pro
Software Windows 10 Pro 21H1
Thank you @W1zzard for a very informative review. I was very curious about how much of an impact it would be between pci gen 3 vs 4.
 
Joined
Apr 14, 2021
Messages
56 (0.04/day)
System Name Too many rads
Processor Intel Core i9-13900KS @ 5.7-6.2GHz
Motherboard Asus Rog Strix Z690-A Gaming
Cooling Thermalright Frozen Magic 360 SCENIC V2
Memory 4X8 GSkill Trident Z 4133MHz CL16
Video Card(s) Sapphire TOXIC 6900XT Limited @2.765 GHz
Storage Samsung 990 Pro 1TB
Display(s) Asus Rog XG27WQ
Case Phanteks P500A
Power Supply Cooler master V750 Gold
Mouse ASUS ROG Harpe ACE
Keyboard Custom QK65
VR HMD Quest 2
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/9nlxi2
Joined
Dec 8, 2011
Messages
22 (0.00/day)
There is probably nothing tragically wrong with this card. It all comes down to price and availability. MSRP is irrelevant in most cases. Of course this card comes with compromises, like ray tracing capabilities that are practically non-existent, but used 1070s and 980 Tis sell for $350+ on eBay and 6700 XTs go for $850, so I guess if one could easily buy a 6600 XT for $400, it wouldn't be such a bad deal. Again, in current market conditions. If all cards were readily available at MSRP it would be a whole different ball game. AMD knew they'd sell these like hot cakes regardless of reviews, so they figured they wouldn't have to design a top value product. If only availability improved, but we all know it'll take many more months for this nonsense market to stabilize.

I personally will just keep torturing my GTX 980 until I can buy something decent for $400-$450.
 
Joined
Sep 17, 2014
Messages
22,452 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
There is probably nothing tragically wrong with this card. It all comes down to price and availability. MSRP is irrelevant in most cases. Of course this card comes with compromises, like ray tracing capabilities that are practically non-existent, but used 1070s and 980 Tis sell for $350+ on eBay and 6700 XTs go for $850, so I guess if one could easily buy a 6600 XT for $400, it wouldn't be such a bad deal. Again, in current market conditions. If all cards were readily available at MSRP it would be a whole different ball game. AMD knew they'd sell these like hot cakes regardless of reviews, so they figured they wouldn't have to design a top value product. If only availability improved, but we all know it'll take many more months for this nonsense market to stabilize.

I personally will just keep torturing my GTX 980 until I can buy something decent for $400-$450.

This - and let's be honest, it's not like Nvidia is producing a top-level product across the entire lineup right now, or at any sort of reasonable price.
 
Joined
Nov 14, 2011
Messages
74 (0.02/day)
Yeah, I haven't checked here, but in AU, HUB found that the cheapest Nvidia card at the same price as the 6600 XT is the 1660 Super.
 
Joined
Apr 18, 2019
Messages
2,369 (1.16/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Only a <1% loss @ PCIe 3.0 x8; these cards seem perfect to run off PCIe 4.0 x4 if your workstation or multifunction appliance needs the lanes for other devices. Something like a 100 Gbps NIC, HBA, or RAID card might need the bandwidth more.
If it weren't for the signal integrity issues, you could run your GPU off the CPU M.2 slot, riser'd to wherever you wanted, and retain your full PCIe x16 slot. Say, for a bifurcated quad-M.2 card?
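The lane shuffle being floated here can be written out as simple budgeting. A sketch, assuming the typical AM4 split of 16 CPU lanes for the GPU slot plus 4 for the CPU-attached M.2 slot (actual boards vary):

```python
# CPU-direct PCIe 4.0 lanes on a typical AM4 platform (assumed split)
stock = {"x16 GPU slot": 16, "CPU M.2 slot": 4}

# Proposed shuffle: GPU takes the M.2 slot's x4 link (4.0 x4 carries the same
# bandwidth as the review's near-lossless 3.0 x8 case), freeing the x16 slot
# for a bifurcated quad-M.2 carrier running 4x x4.
shuffled = {"GPU @ 4.0 x4": 4, "quad-M.2 carrier (4x x4)": 16}

# Same lane budget either way; only the allocation changes.
assert sum(stock.values()) == sum(shuffled.values()) == 20
```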
 
Joined
Jul 28, 2014
Messages
191 (0.05/day)
Location
Denmark
System Name NorthBlackGoldDream
Processor Ryzen 7600X
Motherboard Gigabyte B650M-DS3H
Cooling Arctic Freezer II 240
Memory 16 GB DDR5-5200C40
Video Card(s) GTX 1080 Ti 11 GB
Storage 1 TB NVMe PCIe 3.0
Display(s) 24.5" 240 Hz TN
Case Fractal North Black Mesh
Power Supply 650W
So, strangely enough - I am measuring a significant performance loss in none other than CS:GO, on the map Ancient. Very niche, I know, but I am still taking it up with AMD right now...
 

Fast Turtle

Interesting thought there LabRat, but I'd go and get a Zotac GT 730 PCIe x1 card for a lot cheaper. Yes, they're out there; originally OEM-only, but Newegg has a listing. Very useful for a server/HTPC and other media playback with very light (Freecell) gaming.
 
Joined
Jul 23, 2018
Messages
37 (0.02/day)
Location
Durban, South Africa
Processor i5-14600KF
Motherboard GA-H610M-H-DDR4
Cooling NH-D15S
Memory F4-3200C16D-16GIS
Video Card(s) Palit 4060Ti 16GB
Display(s) VG249Q
The underlying reason we're seeing these effects in some games is that nearly all titles are developed for consoles first, which have just one kind of memory that's shared between CPU and GPU. This also means that moving data between the CPU and GPU is incredibly fast and doesn't incur the latency or bandwidth penalties we're seeing on the PC platform. Remember, consoles are basically using something similar to IGP graphics, which has CPU and GPU integrated as a single unit, unlike the PC where you have discrete GPUs sitting on the PCI-Express bus. The onus is now on the game developers to make sure that their games not only run the best on consoles, their cash cow, but also on the PC platform.
Thank you for this. Great reasoning. Those PCIe 1.1 x8 numbers are good for naming and shaming. Comparing the AMD numbers to the NVIDIA numbers in terms of the scaling and relative performance is quite revelatory. My guess is it's the AMD hardware GPU scheduler showing off.
 
Last edited:
Joined
Jul 20, 2020
Messages
71 (0.04/day)
So, strangely enough - I am measuring a significant performance loss in none other than CS:GO, on the map Ancient. Very niche, I know, but I am still taking it up with AMD right now...
Any updates? How much was the difference in performance? Did they fix it?
 
Joined
Jul 28, 2014
Messages
191 (0.05/day)
Any updates? How much was the difference in performance? Did they fix it?
They didn't fix jack. They didn't answer - as always. But I found a fix: ReBAR.

 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Don't buy.

Lowest prices by type of card:

RX 6600 XT:


AMD Radeon RX 6600 XT Grafikkarte (2022) Preisvergleich | Günstig bei idealo kaufen

RX 6600:

1665079917982.png

AMD Radeon RX 6600 Grafikkarte (2022) Preisvergleich | Günstig bei idealo kaufen

RX 6650 XT:


AMD Radeon RX 6650 XT Grafikkarte (2022) Preisvergleich | Günstig bei idealo kaufen


Whoever decides these prices is either mad or super stupid, having lost all connection with the physical world :D
 

newuser78

New Member
Joined
Oct 2, 2022
Messages
16 (0.02/day)
@MagnuTron: he pretty much doubles his framerates... really? That's something Trump would say. No benchmark you can find out there gives ReBAR even 10% more performance.

I can even tell you how he faked that performance jump in the video: you enable multicore rendering in the CS:GO settings. On my 5800X that takes you from around 180 fps to 400 fps (the default CS:GO cap), and even more with fps_max 0, but the frametime isn't stable.
You should enable that setting anyway, limit to your monitor's refresh rate, and let it run in fullscreen windowed; otherwise you get some kind of weird lag, especially when turning in-game.

Tested on a 6900 XT
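For reference, the settings described above map onto Source-engine console variables; a sketch of an autoexec.cfg fragment (the cap value is illustrative, pick one to match your monitor):

```
// autoexec.cfg
mat_queue_mode 2   // force multi-threaded ("multicore") rendering
fps_max 144        // cap to your refresh rate; fps_max 0 removes the cap entirely
```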
 
Last edited:

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
@MagnuTron: he pretty much doubles his framerates... really? That's something Trump would say. No benchmark you can find out there gives ReBAR even 10% more performance.

I can even tell you how he faked that performance jump in the video: you enable multicore rendering in the CS:GO settings. On my 5800X that takes you from around 180 fps to 400 fps (the default CS:GO cap), and even more with fps_max 0, but the frametime isn't stable.
You should enable that setting anyway, limit to your monitor's refresh rate, and let it run in fullscreen windowed; otherwise you get some kind of weird lag, especially when turning in-game.

Tested on a 6900 XT
Bugs exist, dude.


Go look at how Intel's GPUs perform without ReBAR.
 

Mussels

Freshwater Moderator
this thread is about 6000 series
And? Unless you own the same hardware as the person who posted about his problem, you can't know whether his report was accurate or not.

A user had a problem and posted a solution and you've decided to attack them and pretend they made the whole thing up.
 