
ASUS ROG Blitz Extreme and Blitz Formula - Sneak Peek

WarEagleAU

Bird of Prey
Joined
Jul 9, 2006
Messages
10,812 (1.61/day)
Location
Gurley, AL
System Name Pandemic 2020
Processor AMD Ryzen 5 "Gen 2" 2600X
Motherboard AsRock X470 Killer Promontory
Cooling CoolerMaster 240 RGB Master Cooler (Newegg Eggxpert)
Memory 32 GB Geil EVO Portenza DDR4 3200 MHz
Video Card(s) ASUS Radeon RX 580 DirectX 12 DUAL-RX580-O8G 8GB 256-Bit GDDR5 HDCP Ready CrossFireX Support Video C
Storage WD 250 M.2, Corsair P500 M.2, OCZ Trion 500, WD Black 1TB, Assorted others.
Display(s) ASUS MG24UQ Gaming Monitor - 23.6" 4K UHD (3840x2160) , IPS, Adaptive Sync, DisplayWidget
Case Fractal Define R6 C
Audio Device(s) Realtek 5.1 Onboard
Power Supply Corsair RMX 850 Platinum PSU (Newegg Eggxpert)
Mouse Razer Death Adder
Keyboard Corsair K95 Mechanical & Corsair K65 (Wired, Wireless, Bluetooth)
Software Windows 10 Pro x64
While not entirely true, they may not be able to make a chipset that supports two full PCIe x16 slots.

With first- and second-generation PCIe cards, x16 wasn't necessary, as they couldn't take advantage of the extra bandwidth. Nowadays, however, the newer cards thrive on that extra bandwidth. I guess Intel feels a small performance hit won't hurt anyone.


And for the life of me, I would like to know why the Bearlake chipset isn't capable of two full x16 PCIe links. Any Intel experts out there able to shed light on this?
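A quick back-of-envelope calculation shows why slot width mattered less on early cards: per-lane throughput doubled between PCIe 1.x and 2.0, so an x8 Gen1 slot offers only a quarter of the bandwidth of an x16 Gen2 slot. This is a rough sketch using the standard per-lane figures (250 MB/s for 1.x, 500 MB/s for 2.0, per direction, after 8b/10b encoding); the function name is just for illustration.

```python
# Rough per-direction PCIe bandwidth figures (after 8b/10b encoding):
# PCIe 1.x: 2.5 GT/s -> 250 MB/s per lane; PCIe 2.0: 5.0 GT/s -> 500 MB/s per lane.
PER_LANE_MB_S = {"1.x": 250, "2.0": 500}

def slot_bandwidth(gen: str, lanes: int) -> int:
    """Approximate usable bandwidth (MB/s, one direction) of a slot."""
    return PER_LANE_MB_S[gen] * lanes

for gen in ("1.x", "2.0"):
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: {slot_bandwidth(gen, lanes)} MB/s")
```

So an x16 Gen1 slot tops out around 4 GB/s each way, which period graphics cards rarely saturated; later cards closed that gap, which is why the x8/x8 split started to matter.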
 
Joined
Feb 5, 2007
Messages
191 (0.03/day)
Processor AMD A10-6800k @4.8GHz
Motherboard GIGABYTE G1.Sniper A88X
Cooling SCYTHE Katana 3 Type A
Memory 4GB DDR3/1600 Exeleram (for now)
Video Card(s) AMD HD8670D (APU)
Storage WDC WD10EALX / WDC WD6401AALS / Seagate ST3320620AS / Seagate ST3160812AS
Display(s) Iiyama Pro Lite E2200WSV B1 22"
Case Antec
Audio Device(s) GIGABYTE AMP-UP Audio
Power Supply Antec Earthwatts 500W
I guess everybody complaining about "why not x16" is talking about two PCIe graphics cards in CrossFire, not one card. A single card will run at x16. Intel's upcoming chipset, the one due this autumn, will support 2 x x16.

And I hope you remember that the P965, the chipset P35 replaces, supported only x16 + x4...

What I don't understand is how on Earth ASUS managed to get 2 x x8 (electrical!) when P35 supports only x16 + x4 (electrical)...
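One possible reading of the 2 x x8 puzzle (an assumption, not a statement of how ASUS actually wired the board): the northbridge exposes a single x16 link, and board-level lane switching could split that one port into two electrical x8 links without exceeding the chipset's lane budget, while the x4 link hangs off the southbridge separately. The arithmetic is trivial but makes the point; `fits_budget` is a hypothetical helper, not a real hardware API.

```python
# Hypothetical lane-budget check: can one 16-lane root port be split
# into the requested electrical slot widths? (Pure arithmetic.)
def fits_budget(root_lanes: int, slots: list[int]) -> bool:
    return sum(slots) <= root_lanes

print(fits_budget(16, [16, 4]))  # x16 + x4 exceeds one port; the x4 must come from elsewhere
print(fits_budget(16, [8, 8]))   # two x8 links fit within a single x16 port
```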
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
It just means that SLI users will wait for nForce 7, and ATI CrossFire users... well, they're screwed.
 