Friday, June 5th 2015

ASRock Z170 Extreme7 Motherboard Pictured Up Close

Here's one of the first pictures of ASRock's flagship socket LGA1151 motherboard from its mainline Z170 Extreme series, the Z170 Extreme7. This board offers an exhaustive feature set, and should appeal to gamers and overclockers alike. The board draws power from a combination of 24-pin ATX and 8-pin EPS connectors, with no other auxiliary inputs. The LGA1151 socket is wired to four DDR4 DIMM slots, supporting up to 64 GB of dual-channel DDR4-3200 memory, and to three PCI-Express 3.0 x16 slots (x16/NC/NC, x8/x8/NC, or x8/x4/x4); a fourth PCIe x16 slot is electrical x4, wired to the PCH. There's also an mPCIe slot, which will seat a WLAN+Bluetooth card on the /ac variant of this board.

The Z170 Extreme7 is the first motherboard to feature three M.2 slots, all of which are PCIe gen 3.0 x4 (32 Gb/s). Other storage options include three SATA Express 16 Gb/s ports and ten SATA 6 Gb/s ports. The board features two USB 3.1 ports, one each of Type-C and Type-A, and eight USB 3.0 ports, four on the rear panel and four via headers. Display outputs include one each of dual-link DVI, HDMI 2.0, and DisplayPort 1.2. The onboard audio is an ASRock-designed Purity Sound III solution, which combines a Realtek ALC1150 CODEC with ground-layer isolation, a headphone amplifier, and audio-grade capacitors. Other features include dual BIOS (with manual switching) and a high-quality plastic sheath running along the I/O area.

24 Comments on ASRock Z170 Extreme7 Motherboard Pictured Up Close

#1
shhnedo
Don't mean to sound like a jerk or something, but DVI... on a high-end Skylake mobo? Seriously? Check your calendar, ASRock...
Why won't they learn that ancient crap needs to be phased out already... I don't think they make that much of a profit selling high-end boards to people with pre-historic monitors. I mean no one in their right mind will buy this and use the DVI port...
#2
Caring1
Normally their designs are well thought out, but this time, if you want to put a card in the mPCIe slot, you can't populate all three M.2 slots because they cross paths, or so it appears.
#3
Caring1
shhnedo: Don't mean to sound like a jerk or something, but ... I mean no one in their right mind will buy this and use the DVI port...
Mission failed.
There are lots of people still using DVI, there is nothing wrong with it.
#4
GorbazTheDragon
Have to admit I chuckled a little when I counted three M.2 slots...
#5
shhnedo
Caring1: Mission failed.
There are lots of people still using DVI, there is nothing wrong with it.
Then I'd be damned if I ever buy a board like this and plug DVI into it. I simply can't imagine spending that much money on a board like this and not being able to afford a decent GPU and monitor (maybe a DP cable to go along). I would understand a budget board (let's say H100) having the above-mentioned display outs, even only DVI and HDMI, but then I'd go for an i3/Pentium and use the iGPU to watch my movies and what not...
To me it just doesn't make any sense.
#6
rtwjunkie
PC Gaming Enthusiast
shhnedo: Don't mean to sound like a jerk or something, but DVI... on a high-end Skylake mobo? Seriously? Check your calendar, ASRock...
Why won't they learn that ancient crap needs to be phased out already... I don't think they make that much of a profit selling high-end boards to people with pre-historic monitors. I mean no one in their right mind will buy this and use the DVI port...
Really? So everyone in the world, in your own warped version of it, can afford to go out and replace a high-quality monitor just because YOU think it's "prehistoric" and that anyone who has one is out of their mind?! Have I got that about right?

See, in the real world, we have mortgages, raise families, pay car payments, insurance, and every other bill under the sun. GPUs or motherboards rate far higher to upgrade with the limited funds remaining, because they get outclassed. If a monitor is still very good, its connection plug is NOT a reason to upgrade.
#7
RCoon
shhnedo: Then I'd be damned if I ever buy a board like this and plug DVI into it. I simply can't imagine spending that much money on a board like this and not being able to afford a decent GPU and monitor (maybe a DP cable to go along). I would understand a budget board (let's say H100) having the above-mentioned display outs, even only DVI and HDMI, but then I'd go for an i3/Pentium and use the iGPU to watch my movies and what not...
To me it just doesn't make any sense.
I use my iGPU output as well as my GTX 970 output. Primary monitor is in the GPU, secondary is plugged into the iGPU. Keeps resources off the GPU for secondary monitor purposes, means fullscreen applications don't black out the secondary monitor, saves more power, and is also a cunning way of forcing QuickSync to enable itself. (I use QuickSync a lot these days).
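As a rough illustration of the QuickSync point above, here's a minimal sketch of what handing an encode to the iGPU can look like. It assumes an ffmpeg build with the h264_qsv encoder available; the file names and bitrate are hypothetical placeholders, not anything from this thread.

```python
# Hypothetical sketch: offload an H.264 encode to the Intel iGPU via ffmpeg's
# QuickSync encoder (h264_qsv). Assumes ffmpeg is built with QSV support;
# the file names below are placeholders.
import subprocess

def quicksync_encode(src: str, dst: str, bitrate: str = "5M") -> None:
    """Transcode src to H.264 on the iGPU, leaving the CPU mostly free."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,           # input file
            "-c:v", "h264_qsv",  # Intel QuickSync H.264 encoder
            "-b:v", bitrate,     # target video bitrate
            "-c:a", "copy",      # pass the audio stream through untouched
            dst,
        ],
        check=True,              # raise if ffmpeg exits with an error
    )

if __name__ == "__main__":
    quicksync_encode("input.mkv", "output.mp4")
```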

DVI is a perfectly acceptable digital standard. Murdering VGA makes sense, it's analogue. DVI however doesn't, because it's digital. It's not like it's bad quality or anything; it can support resolutions up to 1600p. For most people this is ample.
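As a back-of-the-envelope check on the "up to 1600p" claim: each DVI link carries a TMDS clock of up to 165 MHz, and dual-link DVI gangs two links together. The sketch below assumes approximate CVT reduced-blanking totals of 2720x1646 for 2560x1600 at 60 Hz; those figures are an assumption for illustration, not something stated in the thread.

```python
# Back-of-the-envelope check that dual-link DVI has headroom for 2560x1600 @ 60 Hz.
# The 2720x1646 totals are approximate CVT reduced-blanking figures (an assumption).

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock (MHz) needed to drive the given total timing."""
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_MHZ = 165.0              # max TMDS clock per DVI link
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ  # two links ganged together

needed = pixel_clock_mhz(2720, 1646, 60)  # ~268.6 MHz
print(f"2560x1600 @ 60 Hz needs ~{needed:.0f} MHz; "
      f"dual-link DVI budget is {DUAL_LINK_MHZ:.0f} MHz")
```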

I also know an awful lot of people with Z97 boards and i5s and i7s that have zero need for a dedicated GPU. Most offices with high CPU requirements run these kinds of setups, and a fair number of them use two monitors. That's why motherboards have HDMI + DVI output ports, as well as a DP output. Some people run three monitors, and Intel's iGPUs support that. Hell, the guy that's sat next to me in my office is running an i5 and a Z board without a dedicated GPU. He encodes videos all day and uses a multi-monitor setup using DVI + HDMI. All makes sense to me.
#8
newtekie1
Semi-Retired Folder
Caring1: Normally their designs are well thought out, but this time, if you want to put a card in the mPCIe slot, you can't populate all three M.2 slots because they cross paths, or so it appears.
I've seen this on some other boards, I believe by ASRock as well. They are set up in a way that one goes over the other, so both can be used at the same time.
RCoon: means fullscreen applications don't black out the secondary monitor, saves more power
I certainly agree with everything else you said, but when you have multiple monitors on a single dedicated GPU, full-screen apps don't black out the second monitor. Also, enabling the iGPU may actually use more power than plugging the second monitor into your 970. Intel's iGPUs are gated, so when they aren't enabled they aren't consuming any power, and plugging a second monitor into a 970 only raises its power consumption by about 2 W.
rtwjunkie: Really? So everyone in the world, in your own warped version of it, can afford to go out and replace a high-quality monitor just because YOU think it's "prehistoric" and that anyone who has one is out of their mind?! Have I got that about right?

See, in the real world, we have mortgages, raise families, pay car payments, insurance, and every other bill under the sun. GPUs or motherboards rate far higher to upgrade with the limited funds remaining, because they get outclassed. If a monitor is still very good, its connection plug is NOT a reason to upgrade.
I guess I should throw out this new 1440p monitor I just bought because it uses DVI...
#9
bogami
The board looks very lavishly equipped, but we have to realize that comes at the expense of inputs; we have to decide what will benefit us and what is possible. There is no PLX chip to allow us to expand, and many options are shared on the PCH. The chipset brings some new opportunities, mainly in connectivity for memory and storage elements (DDR4, SAS, M.2, SATA Express, USB 3.1). The only bad thing about these new standards is that they are too expensive for the normal buyer.
#10
swirl09
shhnedo: Why won't they learn that ancient crap needs to be phased out already... I don't think they make that much of a profit selling high-end boards to people with pre-historic monitors.
But... I like my pre-historic monitor :cry:
#11
Assimilator
Caring1: Mission failed.
There are lots of people still using DVI, there is nothing wrong with it.
rtwjunkie: Really? So everyone in the world, in your own warped version of it, can afford to go out and replace a high-quality monitor just because YOU think it's "prehistoric" and that anyone who has one is out of their mind?! Have I got that about right?

See, in the real world, we have mortgages, raise families, pay car payments, insurance, and every other bill under the sun. GPUs or motherboards rate far higher to upgrade with the limited funds remaining, because they get outclassed. If a monitor is still very good, its connection plug is NOT a reason to upgrade.
RCoon: DVI is a perfectly acceptable digital standard. Murdering VGA makes sense, it's analogue. DVI however doesn't, because it's digital. It's not like it's bad quality or anything; it can support resolutions up to 1600p. For most people this is ample.
Good luck running a 1600p screen off an Intel motherboard DVI port.
RCoon: I also know an awful lot of people with Z97 boards and i5s and i7s that have zero need for a dedicated GPU. Most offices with high CPU requirements run these kinds of setups, and a fair number of them use two monitors. That's why motherboards have HDMI + DVI output ports, as well as a DP output. Some people run three monitors, and Intel's iGPUs support that. Hell, the guy that's sat next to me in my office is running an i5 and a Z board without a dedicated GPU. He encodes videos all day and uses a multi-monitor setup using DVI + HDMI. All makes sense to me.
Buying a Z-series board and not using a discrete graphics card with it is just plain stupid. The only thing that the Z-series chipsets offer over H- and Q-series is overclocking, which is completely unnecessary in a corporate environment. If you see a Z-series board in such an environment, it's either because the IT guys are clueless, or someone's getting a kickback, or the user in question is a "power user" who demanded the best of the best. None of those are valid reasons to pay more for a Z-series board and not use its functionality.
newtekie1: I guess I should throw out this new 1440p monitor I just bought because it uses DVI...
So what are y'all gonna do when the AMD Fury comes out, with 0 DVI connectors? Blame AMD because you bought monitors that use legacy connectors, or because you're too cheap to buy a DVI-to-DisplayPort adapter?

DVI needs to go not because it is limited in functionality, but because it takes up a massive amount of space on the motherboard's IO panel. You can fit 4 USB ports in the amount of space a DVI connector takes, which is a far better use of that space.
#12
rtwjunkie
PC Gaming Enthusiast
@Assimilator none of us said we are too cheap to buy an adapter. We are countering that pompous ass who apparently thinks all DVI monitors are dinosaurs, without any regard for people having real lives.

I upgrade what needs upgrading when I have money. Which means I, and most of the real world he apparently doesn't live in, have to make hard choices about equipment upgrades. A high-quality monitor which is operating perfectly but happens to have a DVI connector is NOT a reason to upgrade.

When it dies, or I need a different resolution, of course I will buy a DP monitor.
#13
newtekie1
Semi-Retired Folder
Assimilator: Good luck running a 1600p screen off an Intel motherboard DVI port.
I've done it, it works just fine.
Assimilator: Buying a Z-series board and not using a discrete graphics card with it is just plain stupid. The only thing that the Z-series chipsets offer over H- and Q-series is overclocking, which is completely unnecessary in a corporate environment. If you see a Z-series board in such an environment, it's either because the IT guys are clueless, or someone's getting a kickback, or the user in question is a "power user" who demanded the best of the best. None of those are valid reasons to pay more for a Z-series board and not use its functionality.
There are applications people use that require raw CPU power and don't need GPU power. Or, as others have pointed out, maybe they're using the onboard connectors for second or third monitors.
Assimilator: So what are y'all gonna do when the AMD Fury comes out, with 0 DVI connectors? Blame AMD because you bought monitors that use legacy connectors, or because you're too cheap to buy a DVI-to-DisplayPort adapter?
Use the HDMI to DVI connector that comes with the card.
Assimilator: DVI needs to go not because it is limited in functionality, but because it takes up a massive amount of space on the motherboard's IO panel. You can fit 4 USB ports in the amount of space a DVI connector takes, which is a far better use of that space.
That is why VGA went. Now there are already plenty of USB ports on the boards. My Z97 has 6 USB ports on the back, and that is more than enough. This board is going to have 2x USB 3.1 and 4x USB 3.0, so six total. Do you have more than 6 permanently attached USB devices? Oh, and even if you do, USB 3.0 hubs are like $12 now...

Though, I will say I'd rather see two HDMI ports, which will fit in the same space as a single DVI.
#14
Delta6326
Wow, with 3 M.2 ports there's no need to have SATA wires going anywhere; time for some really clean builds.
#15
Caring1
Case sizes could really be reduced if more devices utilized the M.2 slots, like M.2 GPUs for example.
I suppose it will happen one day.
#16
RCoon
Assimilator: If you see a Z-series board in such an environment, it's either because the IT guys are clueless, or someone's getting a kickback, or the user in question is a "power user" who demanded the best of the best. None of those are valid reasons to pay more for a Z-series board and not use its functionality.
We do actually do mild overclocks on all our office machines (~400 MHz), and fit them with Hyper 212 EVOs. All the MHz count for video converting/rendering.

But thanks for calling me clueless.
#17
john_
shhnedo: Don't mean to sound like a jerk or something, but DVI... on a high-end Skylake mobo? Seriously? Check your calendar, ASRock...
Why won't they learn that ancient crap needs to be phased out already... I don't think they make that much of a profit selling high-end boards to people with pre-historic monitors. I mean no one in their right mind will buy this and use the DVI port...
Many, many years ago, a company named Abit looked at their calendar (2002) and decided to get rid of all those legacy ports, like PS/2 for example. That decision was a disaster.
#18
Aquinus
Resident Wat-man
RCoon: We do actually do mild overclocks on all our office machines (~400 MHz), and fit them with Hyper 212 EVOs. All the MHz count for video converting/rendering.

But thanks for calling me clueless.
It's a different story if you're talking about production servers or hardware for medical equipment. Sometimes performance is more important than stability or absolute correctness, depending on the application. However, if the machine isn't stable, that's lost time, so it's only a good thing if they're stable. I'm sure you stress test them before unleashing them on an employee, though.
#19
RCoon
Aquinus: It's a different story if you're talking about production servers or hardware for medical equipment. Sometimes performance is more important than stability or absolute correctness, depending on the application. However, if the machine isn't stable, that's lost time, so it's only a good thing if they're stable. I'm sure you stress test them before unleashing them on an employee, though.
Obviously for servers we don't, just on workstations. We always stress test them for days at a time too before handing them over.
#20
deemon
First proper Skylake board!
Proper M.2 slots ... USB-C present ... now why is Thunderbolt 3.0 missing?
shhnedo: Don't mean to sound like a jerk or something, but DVI... on a high-end Skylake mobo? Seriously? Check your calendar, ASRock...
Why won't they learn that ancient crap needs to be phased out already... I don't think they make that much of a profit selling high-end boards to people with pre-historic monitors. I mean no one in their right mind will buy this and use the DVI port...
And you mean "the people" who use this board will use the integrated GPU and the motherboard ports in the first place? Even if they do, DVI is still good to have... plenty of good, not-so-outdated monitors use DVI.
#21
deemon
Assimilator: Buying a Z-series board and not using a discrete graphics card with it is just plain stupid. The only thing that the Z-series chipsets offer over H- and Q-series is overclocking, which is completely unnecessary in a corporate environment. If you see a Z-series board in such an environment, it's either because the IT guys are clueless, or someone's getting a kickback, or the user in question is a "power user" who demanded the best of the best. None of those are valid reasons to pay more for a Z-series board and not use its functionality.
RCoon: We do actually do mild overclocks on all our office machines (~400 MHz), and fit them with Hyper 212 EVOs. All the MHz count for video converting/rendering. But thanks for calling me clueless.
Not to mention the price. Q boards usually are way more expensive than Z boards (at least around here). And H97 and Z97 boards cost about the same ... so why not take Z over H if the price is the same?
#22
Uplink10
rtwjunkie: When it dies, or I need a different resolution, of course I will buy a DP monitor.
Too bad DP monitors are hard to get if you are not buying a 4K display; most of them have HDMI and VGA/DVI.
newtekie1: That is why VGA went.
VGA is still present on new monitors.
newtekie1: Do you have more than 6 permanently attached USB devices? Oh, and even if you do, USB 3.0 hubs are like $12 now...
Depends on where you live, but from my observations USB 3.0 hubs are costlier than a USB 3.0 PCIe card.
newtekie1: Though, I will say I'd rather see two HDMI ports, which will fit in the same space as a single DVI.
Let's make that DP ports.
Delta6326: Wow, with 3 M.2 ports there's no need to have SATA wires going anywhere; time for some really clean builds.
I would rather buy an HDD because I get more space for the same price. What is the point of buying a small-capacity SSD on which you cannot even fit a 4K video (because of its size), when that video can only be watched from an SSD (or RAID) because of the higher read speed needed? And buying high-capacity SSDs is too bloody expensive.
RCoon: We do actually do mild overclocks on all our office machines (~400 MHz), and fit them with Hyper 212 EVOs. All the MHz count for video converting/rendering.
Overclocking is just raising the frequency to an unspecified level. In the past it gave us additional performance, but today it just gives manufacturers an opportunity to raise prices, and that is why buying Z-series motherboards and overclockable CPUs is pointless.
john_: Many, many years ago, a company named Abit looked at their calendar (2002) and decided to get rid of all those legacy ports, like PS/2 for example. That decision was a disaster.
That was too soon, but they said legacy video ports would be phased out in 2013 and then 2015, and nothing is happening. What is the point of saying they will phase something out if they are not going to?
How is a company supposed to make a decision when buying new hardware when manufacturers are not sure about their future plans for hardware support and make false predictions?
deemon: Not to mention the price. Q boards usually are way more expensive than Z boards (at least around here). And H97 and Z97 boards cost about the same ... so why not take Z over H if the price is the same?
This depends on the reseller, but generally a Z motherboard, if not discounted, costs more.
#23
newtekie1
Semi-Retired Folder
Uplink10: VGA is still present on new monitors.
Yes, but it doesn't need to be on the motherboard. At most, DVI-I would be all you need.
Uplink10: Let's make that DP ports.
That could work too.
#24
yotano211
Delta6326: Wow, with 3 M.2 ports there's no need to have SATA wires going anywhere; time for some really clean builds.
I need more storage than speed. M.2 SSDs are still more expensive than normal SSDs and even mSATA ones. M.2 SSDs are still very limited on storage space: the highest-capacity one is 512 GB, while mSATA or 2.5-inch SSDs are 1 TB or more.