Thursday, August 4th 2022

ASUS Unveils the ROG Crosshair X670E Hero and ROG Crosshair X670E Extreme

During AMD's Meet the Experts event, ASUS revealed more details about its ROG Crosshair X670E Extreme, a board the company showed during Computex without revealing its rear I/O. ASUS also unveiled the ROG Crosshair X670E Hero, a board it hadn't shown off prior to the AMD event. Both boards will feature a pair of USB4 ports, with both ports supporting DisplayPort Alt Mode. Both boards feature a further two USB-C ports, plus at least seven USB-A ports. The Extreme features 10 Gbps and 2.5 Gbps Ethernet, whereas the Hero has to make do with 2.5 Gbps Ethernet, although it gains an HDMI output. Both boards have a full set of audio jacks and WiFi 6E support, as well as rear-mounted clear-CMOS and BIOS FlashBack buttons.

Taking a closer look at the Hero, it has two PCIe x16 slots plus a single, open-ended PCIe x1 slot. The board has four M.2 slots for NVMe SSDs and comes with a PCIe 5.0 add-in card for a fifth drive. It also has what appears to be six SATA ports, as well as a front header for a 20 Gbps USB 3.2 Gen 2x2 USB-C port that supports up to 60 W USB PD and Qualcomm Quick Charge 4+. The Hero will be kitted out with an 18-phase power design and the Extreme with a 22-phase design, both using 110 A power stages. ASUS has moved its audio solution to the USB-based ALC4082 codec, and at least the Extreme will also have an ESS ES9218 DAC. ASUS is also bringing its Q-Release solution for graphics cards over to these boards, as well as the Q-Latch for M.2 SSDs.

Source: ASUS

57 Comments on ASUS Unveils the ROG Crosshair X670E Hero and ROG Crosshair X670E Extreme

#26
Valantar
kapone32From what I have seen of the X670 boards, you will need at least an 850 W PSU with at least three 8-pin CPU cables. 1000 W will likely be more feasible though. I counted up to six PSU connections on these boards. I like the concept of the Extreme allowing you to use that DRAM-like module. From what I can see it looks like it is connected to the CPU (likely 4.0), which means you don't have to share 8 lanes with the GPU when using that adapter card. Not that that is going to make a difference right now, but the bandwidth of the hardware is getting faster.
Where are you getting that math from? Let's see:
CPU: 170W sustained, let's say 300W with an OC.
2x PCIe x16: 150W
4x USB-C: 60W (4x15W)
8x USB-A: 80W
5x m.2: 50W
Fans, RAM, various controllers and other stuff: let's say 50W.
That sums to 690W, not counting whatever the GPU draws beyond its 75W of slot power - but, crucially, it assumes every single component and port is loaded to 100% power draw at once, with a very power-hungry OC. That never, ever happens in a PC. Ever. And, of course, dual GPU is dead, so that 2x75W for PCIe x16 is completely unrealistic (very few non-GPU AICs draw 75W). Nor will you ever fully load five SSDs at once, or draw full power from every USB port at once. It just isn't happening.
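To spell the arithmetic out, here's a minimal sketch - the per-item wattages are the rough, rounded assumptions above, not measured board figures:

```python
# Illustrative worst-case sum using the (rounded) figures above; the
# per-port numbers are assumptions for the sketch, not board specs.
loads_w = {
    "CPU (heavy OC)":        300,
    "2x PCIe x16 slots":     150,  # 2 x 75 W slot power
    "4x USB-C":               60,  # 4 x 15 W (5 V / 3 A)
    "8x USB-A":               80,  # 8 x 10 W
    "5x M.2 SSDs":            50,  # 5 x 10 W
    "Fans/RAM/controllers":   50,
}

total = sum(loads_w.values())
print(f"Worst-case board load, excluding GPU draw beyond slot power: {total} W")  # 690 W
```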

Is total power creeping up from these faster interfaces, and is potential total power creeping up from more power outputs? Sure! Does that matter? Not much. A baseline build will still draw 250-300W under normal loads; an upper midrange build will still draw 400-500W under normal loads (with each of these peaking at 25-50% higher under unrealistic torture loads).

Also, crucially, most boards with more than one EPS connector don't actually need more than one to be connected. A single EPS cable is rated for 336W after all. The extras are for XOC or to look cool.
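And for where that 336W rating comes from, a quick sketch, assuming the commonly cited 7A-per-contact rating for the Mini-Fit-style terminals EPS connectors use (vendors rate them anywhere from 7A to 9A depending on wire gauge and housing):

```python
# An 8-pin EPS12V connector carries four 12 V supply contacts (plus four
# grounds); at 7 A per contact that gives the usual quoted rating.
circuits = 4            # 12 V contacts in an 8-pin EPS connector
amps_per_contact = 7.0  # conservative per-contact rating (assumption)
volts = 12.0
print(f"{circuits * amps_per_contact * volts:.0f} W")  # 336 W
```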
kapone32Some of these boards report 60W charging for two or more of the rear USB-C ports. There is one board that has two 6- or 8-pins for the USB-C ports at the back. If you have your smartphone and your partner's smartphone charging and an external GPU running, it will suck board power. That can't be for the PCIe lanes, as TRX40 has way more lanes and fewer 6- or 8-pin motherboard connectors.
USB hosts generally support 5V/3A, i.e. 15W. There are exceptions, but they are rare. Also, who on earth charges their phone from a rear USB port? :kookoo: Most fast-charging smartphones also don't support their peak rating with PD, but use some proprietary fast charging and step down significantly when on PD.


This board is pretty insane though. Outside of people using tons of AICs, I can't think of anything it doesn't have in spades. Looking forward to the >$1000 price tag!
Posted on Reply
#27
RedelZaVedno
What does the added E stand for? Extreme pricing?
Posted on Reply
#28
Mysteoa
RedelZaVednoWhat does the added E stand for? Extreme pricing?
Yes, it's for Extreme. But nobody is forcing you to buy it when you can get the non-E model.
Posted on Reply
#29
TheoneandonlyMrK
ValantarWhere are you getting that math from? Let's see:
CPU: 170W sustained, let's say 300W with an OC.
2x PCIe x16: 150W
4x USB-C: 60W (4x15W)
8x USB-A: 80W
5x m.2: 50W
Fans, RAM, various controllers and other stuff: let's say 50W.
That sums to 690W, not counting whatever the GPU draws beyond its 75W of slot power - but, crucially, it assumes every single component and port is loaded to 100% power draw at once, with a very power-hungry OC. That never, ever happens in a PC. Ever. And, of course, dual GPU is dead, so that 2x75W for PCIe x16 is completely unrealistic (very few non-GPU AICs draw 75W). Nor will you ever fully load five SSDs at once, or draw full power from every USB port at once. It just isn't happening.

Is total power creeping up from these faster interfaces, and is potential total power creeping up from more power outputs? Sure! Does that matter? Not much. A baseline build will still draw 250-300W under normal loads; an upper midrange build will still draw 400-500W under normal loads (with each of these peaking at 25-50% higher under unrealistic torture loads).

Also, crucially, most boards with more than one EPS connector don't actually need more than one to be connected. A single EPS cable is rated for 336W after all. The extras are for XOC or to look cool.

USB hosts generally support 5V/3A, i.e. 15W. There are exceptions, but they are rare. Also, who on earth charges their phone from a rear USB port? :kookoo: Most fast-charging smartphones also don't support their peak rating with PD, but use some proprietary fast charging and step down significantly when on PD.


This board is pretty insane though. Outside of people using tons of AICs, I can't think of anything it doesn't have in spades. Looking forward to the >$1000 price tag!
I mean, you say never ever, but I did run quadfire Polaris plus two GTX 460s, that's six GPUs, in an FX-8350 ASUS Crosshair system.
There are idiots out there who do, just because.
F£$€, I would again.

In a way they have limited the chaos I could cause with only two PCIe 5.0 x16 slots.
Posted on Reply
#30
Mussels
Freshwater Moderator
Thank F they labelled the USB ports


Oh good, four USB-C.
Two 40Gb, then a 10Gb, then a 20Gb?

Oh, but on the Extreme let's spice that up and go 10-40-40-20.


Front panel USB-C port doing 60W of PD/QC, that's great to see. Can actually charge a phone from the PC at last.
Posted on Reply
#31
TheLostSwede
News Editor
RedelZaVednoWhat does the added E stand for? Extreme pricing?
No, most X670E boards will be priced similarly to their equivalent X570 models. See my Computex coverage of Gigabyte for some ballpark figures.
Posted on Reply
#32
Mussels
Freshwater Moderator
TheLostSwedeNo, most X670E boards will be priced similarly to their equivalent X570 models. See my Computex coverage of Gigabyte for some ballpark figures.
Will the non-E chipsets be cheaper, so more like B550 prices, or somewhere in between?

Ballpark figures get hard when you're not used to that country's pricing and currency... American pre-tax pricing, for example, confuses things.
Posted on Reply
#33
TheLostSwede
News Editor
MusselsWill the non-E chipsets be cheaper, so more like B550 prices, or somewhere in between?

Ballpark figures get hard when you're not used to that country's pricing and currency... American pre-tax pricing, for example, confuses things.
No idea about X670, there doesn't seem to be a whole bunch of them. B650 boards, which are launching next year, will be sub-$200 for sure, even with current inflation, maybe even a few around $150 or less. MSRP pricing without tax, obviously.
Posted on Reply
#34
Valantar
TheLostSwedeNo idea about X670, there doesn't seem to be a whole bunch of them. B650 boards, which are launching next year, will be sub-$200 for sure, even with current inflation, maybe even a few around $150 or less. MSRP pricing without tax, obviously.
Any whispers of any ITX boards so far?
Posted on Reply
#35
TheLostSwede
News Editor
ValantarAny whispers of any ITX boards so far?
B650/B650E only so far, but there was a weird poll during the AMD event where they asked about X670/X670E mATX/mini-ITX boards, which I don't see the point of.
Posted on Reply
#36
Valantar
TheLostSwedeB650/B650E only so far, but there was a weird poll during the AMD event where they asked about X670/X670E mATX/mini-ITX boards, which I don't see the point of.
Yeah, that sounds weird and unnecessary. ITX with a full complement of I/O would be nice, but there's just no space to make use of all of those PCIe lanes, let alone fitting dual chipsets on there. Current ITX boards are cramped already; I can't imagine fitting two chipsets would help any.
Posted on Reply
#37
sgunes
Does anybody know anything about the bottom PCIe slot on the ROG Crosshair X670E Extreme? Is it a PCIe 3.0 or 4.0 x4 slot? I need something for a SATA card for additional hard drives.
Posted on Reply
#38
asdkj1740
TheLostSwedeNo, most X670E boards will be priced similarly to their equivalent X570 models. See my Computex coverage of Gigabyte for some ballpark figures.
Maybe for Gigabyte only.
The ASRock Pro RS and MSI Pro P WiFi are all using 8-layer low-loss PCBs, while the Gigabyte Aorus Elite is just 6.
Also, the Godlike and Ace have PCIe switches everywhere on the PCB, crazy.
Posted on Reply
#39
Mussels
Freshwater Moderator
sgunesDoes anybody know anything about the bottom PCIe slot on the ROG Crosshair X670E Extreme? Is it a PCIe 3.0 or 4.0 x4 slot? I need something for a SATA card for additional hard drives.
Going by how they've been doing it in previous generations, it's going to be an x4 PCIe 4.0 slot sharing lanes with the final NVMe slot
(or one version of the board has an extra NVMe slot, and that one has the lanes assigned there instead).
Posted on Reply
#40
Valantar
MusselsGoing by how they've been doing it in previous generations, it's going to be an x4 PCIe 4.0 slot sharing lanes with the final NVMe slot
(or one version of the board has an extra NVMe slot, and that one has the lanes assigned there instead).
Remember that X670E has a significant increase in PCIe lanes compared to X570 though. 24 vs. 12-16 IIRC?
Posted on Reply
#41
asdkj1740
ValantarRemember that X670E has a significant increase in PCIe lanes compared to X570 though. 24 vs. 12-16 IIRC?
X670/X670E = 20
X570 = 16

AM4 CPU = 20
AM5 CPU = 24

Alder Lake CPU = 20
Z690 = 28
Posted on Reply
#42
Asni
asdkj1740X670/X670E = 20
X570 = 16

AM4 CPU = 20
AM5 CPU = 24

Alder Lake CPU = 20
Z690 = 28
Intel: DMI is 8 PCIe Gen4 lanes.
AMD: 4 PCIe Gen5 lanes from the CPU to the chipset.

The number of lanes the chipset generates is useless.
Posted on Reply
#43
asdkj1740
AsniIntel: DMI is 8 PCIe Gen4 lanes.
AMD: 4 PCIe Gen5 lanes from the CPU to the chipset.

The number of lanes the chipset generates is useless.
AM5 is Gen5 x4 from the CPU, yes.
But the AM5 B650 chipset can't take Gen5 x4, only Gen4 x4.
So....
Posted on Reply
#44
Valantar
AsniIntel: DMI is 8 PCIe Gen4 lanes.
AMD: 4 PCIe Gen5 lanes from the CPU to the chipset.

The number of lanes the chipset generates is useless.
It absolutely isn't. The number of lanes to the chipset determines peak combined throughput from the devices connected to it. The number of lanes from the chipset determines how many devices can be connected to it, and how fast they can be. The latter is far more interesting - and important, and likely to be a bottleneck. No consumer use case is ever going to meaningfully saturate either PCIe 5.0 x4 or 4.0 x8 from the chipset for any significant amount of time, regardless of the number of devices connected. But if the chipset doesn't have the lanes to connect your SSD, capture card, or whatever, then you literally can't use it. Working with the theoretical but highly unrealistic potential for performance hiccups > not working at all.
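To put rough numbers on those uplinks, a quick sketch, using the usual approximations of ~1.97 GB/s per PCIe 4.0 lane and ~3.94 GB/s per 5.0 lane (raw link rate after 128b/130b encoding, before protocol overhead):

```python
gen4, gen5 = 1.97, 3.94  # GB/s per lane, approximate raw link rate

print(f"Z690 DMI 4.0 x8 uplink:                    {8 * gen4:.1f} GB/s")  # ~15.8 GB/s
print(f"AM5 uplink if Gen5 x4:                     {4 * gen5:.1f} GB/s")  # ~15.8 GB/s
print(f"AM5 uplink at Gen4 x4 (per the post above): {4 * gen4:.1f} GB/s")  # ~7.9 GB/s
```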
Posted on Reply
#45
Mussels
Freshwater Moderator
ValantarUSB hosts generally support 5V/3A, i.e. 15W. There are exceptions, but they are rare. Also, who on earth charges their phone from a rear USB port? :kookoo: Most fast-charging smartphones also don't support their peak rating with PD, but use some proprietary fast charging and step down significantly when on PD.


This board is pretty insane though. Outside of people using tons of AICs, I can't think of anything it doesn't have in spades. Looking forward to the >$1000 price tag!
The 60W power output isn't just for phones, it's for VR headsets and monitors powered by USB-C.

Welcome to 2022.
Posted on Reply
#46
Valantar
MusselsThe 60W power output isn't just for phones, it's for VR headsets and monitors powered by USB-C.

Welcome to 2022.
PD is absolutely very useful, and it's great that they're starting to add real PD support on some of these, though from what I can tell that's only one front port on a few high-end motherboards? Most USB-C host ports still only provide 5V/3A, as that's the base requirement.

I'll be interested in seeing the power management for that PD port though - I guess it must have its own buck/boost converter, running off 12V, to deliver the 5~20V needed for PD support? Unless they're being really shady and only delivering 12V5A through that port, in which case it's far less useful - that won't charge most laptops, or most other power hungry PD devices. (Also, isn't 12V PD deprecated since ... 2.1 or something?)
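For reference, a sketch of the fixed-voltage rungs the USB PD power rules define for a 60W source, as I understand them (12V is optional in recent spec revisions):

```python
# USB PD power rules for a 60 W source: each fixed voltage offered at up
# to 3 A. Note the port only reaches its full 60 W rating at 20 V.
pd_fixed_volts = [5, 9, 15, 20]
max_amps = 3.0

for v in pd_fixed_volts:
    print(f"{v:>2} V @ {max_amps:.0f} A = {v * max_amps:>4.0f} W")
# A port that only offered 12 V / 5 A would not match any PD fixed profile,
# which is why a proper buck/boost stage seems necessary here.
```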

Also, which VR headsets need 60W of power? That sounds ... unsustainable. Venting 60W of heat off of a device mounted 5cm off your eyeballs isn't going to be comfortable for long. USB-C PD monitors also tend to peak around ~15-25W, and many are happy to run off host port power.

Still, useful? Absolutely.
Posted on Reply
#47
Mussels
Freshwater Moderator
Agreed, but it looks like the era of 60W ports has begun - front panel first, and USB4 rear ports.
Posted on Reply
#48
Valantar
MusselsAgreed, but it looks like the era of 60W ports has begun - front panel first, and USB4 rear ports.
I really don't see the point of 60W rear ports though. Accessories literally can't rely on that, as only 15W is mandated as host power by the standard, so either they stay within that power envelope or come with their own power supply, and ... who is going to fast charge their laptop or phone from the rear ports on their motherboard? This just seems like a gimmick to me. Front ports have a tad more utility, but still ...
Posted on Reply
#49
Mussels
Freshwater Moderator
ValantarI really don't see the point of 60W rear ports though. Accessories literally can't rely on that, as only 15W is mandated as host power by the standard, so either they stay within that power envelope or come with their own power supply, and ... who is going to fast charge their laptop or phone from the rear ports on their motherboard? This just seems like a gimmick to me. Front ports have a tad more utility, but still ...
If it's got the DP output, I absolutely see the purpose.

We are in the era of VR headsets and regular old displays both using USB-C DP inputs and being powered over the same USB-C cable.
It needs to exist first, before devices use it more often.


I've gotta use full-sized DP and a powered USB 3 cable for my Rift S; the Quest 2 uses compressed data and goes flat as you use it. The next gen could/should use regular old USB4 with 60W of power, and higher quality compressed data or pure DisplayPort.
Posted on Reply
#50
Valantar
MusselsIf it's got the DP output, I absolutely see the purpose.

We are in the era of VR headsets and regular old displays both using USB-C DP inputs and being powered over the same USB-C cable.
It needs to exist first, before devices use it more often.
But again: they can't design these displays expecting more than 15W from the host port, or they won't work on the vast majority of host devices. No laptop in the world supplies more than 15W from its USB-C ports, and extremely few desktop motherboards or GPUs can deliver more. If the device needs more than 15W, it needs to be designed around a secondary power input. It could of course be designed around either >15W PD or external power, but that's a layer of design complexity that I've so far never seen.

USB-C displays can indeed need more than 15W, but are similarly then designed around either a low-power mode with only 15W host power, or external power. Or, as in some cases, just don't work without external power, or have an internal battery that is slowly drained if only receiving 15W host power. (Also, how many people globally use a USB-C-powered portable monitor with their desktops for any significant amount of time?)

Why? Because up until this point, AFAIK not a single mass market host device delivering more than 15W and display+data over the same cable has actually existed.

Could these things be simplified through adoption of 60W PD output from host devices? Sure! But that would kill backwards compatibility, and there's no way laptops can deliver that reliably without absolutely killing battery life (or needing stupidly overpowered AC adapters).
MusselsI've gotta use full-sized DP and a powered USB 3 cable for my Rift S; the Quest 2 uses compressed data and goes flat as you use it. The next gen could/should use regular old USB4 with 60W of power, and higher quality compressed data or pure DisplayPort.
AFAICT there's no requirement for USB4 hosts to deliver 60W, meaning that compatibility would be a major minefield if this was necessary. The USB4 spec only refers to the USB-C and USB-PD specs for its power requirements, without specifying further, and USB-C and USB-PD mandate 5V3A output for host devices.

It would have the potential to be useful if this was standardized, but this happening seems highly unlikely, simply due to the amount of USB4 ports present on mobile devices that could never live up to such a power output requirement. Which would, in turn, kill the standard. Of course it's great if client devices start adapting to accept >15W PD input if available, and that might be possible, but it'll only solve the problem you're describing in a very few cases.
Posted on Reply