
AMD "Fiji" Block Diagram Revealed, Runs Cool and Quiet

Tetsudo77

New Member
Joined
Sep 14, 2014
Messages
7 (0.00/day)
Still too expensive, slow, and with bad driver support... Better buy a 980 Ti for that amount of money.
Nvidia still wins.
And next year, when Nvidia starts using HBM memory, AMD will surely die, because it can't keep up with the advancements in technology that Nvidia's R&D team has.
Still, this is a good card... But not as good as the 980 Ti.

AMD developed this card with many times less budget than Nvidia, mind you. Technically speaking, GCN is superior to anything Nvidia can develop. AMD drivers tend to increase their cards' performance over time, while NV drivers don't.

This comes from an owner of a GTX 970.
 

Amgal

New Member
Joined
May 18, 2013
Messages
6 (0.00/day)
Love it.

"Cool and quiet"

...when there's a CLC waterblock attached to it.
 
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
And? My point still stands that the Titan could be using less power even if it runs at a higher temperature.

Your point is true, but I was just adding to it. Not to mention, a cooler-running chip should be more stable, last longer, and not throttle, like the Titan X or 290X do.
 
Joined
Jun 22, 2015
Messages
76 (0.02/day)
Processor AMD R7 3800X EKWB
Motherboard Asus Tuf B450M-Pro µATX +MosfetWB (x2)
Cooling EKWB on CPU + GPU / Heatkiller 60/80 on Mosfets / Black Ice SR-1 240mm
Memory 2x8GB G.Skill DDR4 3200C14 @ ----
Video Card(s) Vega64 EKWB
Storage Samsung 512GB NVMe 3.0 x4 / Crucial P1 1TB NVMe 3.0 x2
Display(s) Asus ProArt 23" 1080p / Acer 27" 144Hz FreeSync IPS
Case Fractal Design Arc Mini R2
Power Supply SeaSonic 850W
Keyboard Ducky One TKL / MX Brown
Still too expensive, slow, and with bad driver support... Better buy a 980 Ti for that amount of money.
Nvidia still wins.
And next year, when Nvidia starts using HBM memory, AMD will surely die, because it can't keep up with the advancements in technology that Nvidia's R&D team has.
Still, this is a good card... But not as good as the 980 Ti.

What driver problems is your Fury/X experiencing?
I'm sure we'd all like to know.

~36 hours left and the reviews will tell all....
 
Joined
Dec 3, 2014
Messages
348 (0.09/day)
Location
Marabá - Pará - Brazil
System Name KarymidoN TitaN
Processor AMD Ryzen 7 5700X
Motherboard ASUS TUF X570
Cooling Custom Watercooling Loop
Memory 2x Kingston FURY RGB 16gb @ 3200mhz 18-20-20-39
Video Card(s) MSI GTX 1070 GAMING X 8GB
Storage Kingston NV2 1TB| 4TB HDD
Display(s) 4X 1080P LG Monitors
Case Aigo Darkflash DLX 4000 MESH
Power Supply Corsair TX 600
Mouse Logitech G300S
Cheaper than the Titan X... let's see if this is faster...
*grabs popcorn*
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
because it can't keep up with the advancement in technology
Really? AMD appears they could this time... on a shoestring... but yeah, maybe that might still come to fruition sometime in Q1 of, what, 2017...
 
Joined
Jan 13, 2011
Messages
221 (0.04/day)
I still wonder if water cooling was the only way around HBM heat management. I'm not saying the card runs extra hot, just that the chip is more heat-sensitive because of HBM and this was a slapped-together solution to the problem.
 
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
This looks awesome!!!! Can't wait for the reviews!!!
 
Joined
May 9, 2012
Messages
8,545 (1.85/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb DDR4 3600/8gb DDR3 1600/2gbLPDDR3/8gbLPDDR5x/16gb(10 sys)LPDDR5 6400
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/Radeon 780M 6gb LPDDR5
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/4tb SN850X
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/7" FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/Gorilla Glass Victus 2/front-stock back-JSAUX RGB transparent
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Moondrop Chu II + TRN BT20S
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/USAMS GAN PD 33w/USAMS GAN 100w
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75%/Lofree Edge/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 14/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
Wow, that's a very comfortable operating temperature while gaming!
Just like my 290 ... rarely seen above 48° :D (ok ok ... not the same level of water cooling tho ...)
 
Joined
Apr 16, 2010
Messages
2,070 (0.39/day)
System Name iJayo
Processor i7 14700k
Motherboard Asus ROG STRIX z790-E wifi
Cooling Peerless Assassin
Memory 32 gigs Corsair Vengeance
Video Card(s) Nvidia RTX 2070 Super
Storage 1tb 840 evo, Itb samsung M.2 ssd 1 & 3 tb seagate hdd, 120 gig Hyper X ssd
Display(s) 42" Nec retail display monitor/ 34" Dell curved 165hz monitor
Case O11 mini
Audio Device(s) M-Audio monitors
Power Supply LIan li 750 mini
Mouse corsair Dark Saber
Keyboard Roccat Vulcan 121
Software Windows 11 Pro
Benchmark Scores meh... feel me on the battle field!
Should the title not be "AMD Does not trust its own hardware and has to desperately resort to watercooling otherwise their heater might melt" ?


.......naaaah: AMD does not trust its customers' water-cooling skills ........so they do it for them!!


 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Love it.

"Cool and quiet"

...when there's a CLC waterblock attached to it.

Well, the fact is, it'll be cooler and quieter. So it's not a lie no matter how you turn it. Fiji is not an R9-290X, mind you. And FYI, the Titan X is not exactly a quiet card either. So why not just give the card an AiO and solve two problems entirely, while giving users extra headroom for overclocking? I've been using an AiO on my CPU for like 2 years now, and I wouldn't trade it for any air cooler. If I had a decently sized case (but I love my cute miniATX), I'd be using an AiO on my graphics card already. Fury might even become a reason to invest time into modding my case for it...
 
Joined
May 9, 2012
Messages
8,545 (1.85/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb DDR4 3600/8gb DDR3 1600/2gbLPDDR3/8gbLPDDR5x/16gb(10 sys)LPDDR5 6400
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/Radeon 780M 6gb LPDDR5
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/4tb SN850X
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/7" FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/Gorilla Glass Victus 2/front-stock back-JSAUX RGB transparent
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Moondrop Chu II + TRN BT20S
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/USAMS GAN PD 33w/USAMS GAN 100w
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75%/Lofree Edge/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 14/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
.......naaaah: AMD does not trust its customers' water-cooling skills ........so they do it for them!!
Say that to my 290 ... :cry: (that was my 1st water-cooling attempt ... )

I wonder if I can fit 2 AIO rads in my Air 540 ... in the front, in place of the Phobya G-Changer 240 V2 (60mm)

Pfah! I'd prefer 2 Fury X or Nano with a custom block and a single-slot shield, so I can reuse my loop and put my 290 with its Kryographics block + backplate to rest on my shelf as "the best card ever for 2 years and still going strong" (or just "the best bang-for-buck card")
 
Last edited:
Joined
May 2, 2013
Messages
170 (0.04/day)
Should the title not be "AMD Does not trust its own hardware and has to desperately resort to watercooling otherwise their heater might melt" ?


Well, water cooling in this instance has nothing to do with the TDP of the chip. Water cooling is used to cool the memory chips, which have a much smaller footprint than GDDR5, and thus the rate at which heat needs to be transferred out of them has to be a lot higher than it was with the previous GDDR5 chips.
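
The footprint point can be put in rough numbers. A minimal sketch with made-up but plausible figures (all values here are illustrative assumptions, not measured data for any actual card):

```python
# Heat flux: the same memory power over a smaller footprint means heat
# must be removed at a higher rate per unit area.
# All figures below are illustrative assumptions, not measured values.

def heat_flux_w_per_cm2(power_w: float, area_cm2: float) -> float:
    """Power dissipated per unit of package area."""
    return power_w / area_cm2

# Hypothetical: ~30 W of memory power over twelve GDDR5 packages
# (~1.5 cm^2 each) versus four compact HBM stacks (~0.9 cm^2 each).
gddr5 = heat_flux_w_per_cm2(30, 12 * 1.5)  # ~1.7 W/cm^2
hbm = heat_flux_w_per_cm2(30, 4 * 0.9)     # ~8.3 W/cm^2
print(f"GDDR5: {gddr5:.1f} W/cm^2, HBM: {hbm:.1f} W/cm^2")
```

Even at the same total wattage, the per-area flux comes out several times higher, which is the argument for putting the stacks under the same cold plate as the GPU.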

As for the claim that AMD doesn't trust its own CPUs, which sounds true, I don't think you're getting the whole point. Under a good multi-core-aware graphics API, their 8-core FX CPUs do better than Intel's 4-core CPUs. DX11 is not that... and all we have now are DX11 games, so they had to use their competitor's CPUs to drive two Fury X graphics cards in their custom Fury system.

Whether you're a shill, a troll or just a misinformed person, I hope I have taught you something, son )
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.79/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Well, water cooling in this instance has nothing to do with the TDP of the chip. Water cooling is used to cool the memory chips, which have a much smaller footprint than GDDR5, and thus the rate at which heat needs to be transferred out of them has to be a lot higher than it was with the previous GDDR5 chips.

As for the claim that AMD doesn't trust its own CPUs, which sounds true, I don't think you're getting the whole point. Under a good multi-core-aware graphics API, their 8-core FX CPUs do better than Intel's 4-core CPUs. DX11 is not that... and all we have now are DX11 games, so they had to use their competitor's CPUs to drive two Fury X graphics cards in their custom Fury system.

Whether you're a shill, a troll or just a misinformed person, I hope I have taught you something, son )
There is some logic to what he's saying. The general trend for many years now has been to throw power consumption under the bus for the sake of performance. There is no reason to believe that isn't still the case as AMD has not demonstrated anything to the contrary in quite some time. However, I do have to say that ZoneDymo needs to tone down the bias a bit.

Side note: Cool 'n Quiet? What the hell is this, a CPU? I think the Athlon 64 in my attic wants its technology back. :)
 
Joined
May 2, 2013
Messages
170 (0.04/day)
There is some logic to what he's saying. The general trend for many years now has been to throw power consumption under the bus for the sake of performance. There is no reason to believe that isn't still the case as AMD has not demonstrated anything to the contrary in quite some time. However, I do have to say that ZoneDymo needs to tone down the bias a bit.

Side note: Cool 'n Quiet? What the hell is this, a CPU? I think the Athlon 64 in my attic wants its technology back. :)

I did not find his post consistent, to say the least. Nvidia obviously spent some time after "Kepler", which is a marketing term BTW, ironed out their chip fabrication to reduce power leakage, and the result was "Maxwell", another marketing term. Anyhow... an extra $15 on a whole year's power bill isn't much to talk about.

AMD were more efficient in their 3000, 4000, 5000 and 6000 series. They were slightly less efficient with their 7000 series, but now they've caught up and seem to have done quite a good job with Fiji.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.79/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
AMD were more efficient in their 3000, 4000, 5000 and 6000 series. They were slightly less efficient with their 7000 series
Hardly. I would say GCN helped in that respect. The problem is that AMD kept pushing it harder, so the power savings simply got turned into performance while maintaining the same power envelope. Also, for 1080p, I think there is a clear winner on which camp is more power-efficient. With respect to Fury's power consumption, that has yet to be seen. We don't know how power-hungry it is because there is still no info on the GPU's performance. We should know tomorrow though. :) Good thing too, I want to replace these 6870s.

I just took the 390X review because it was right up front, but it's another great example of how AMD would rather suck down the amps, which is why I'm not overly optimistic that there will be anything groundbreaking. Also consider the AIO water cooler: that's a sign that the GPU's TDP might be a bit on the high side, which would be consistent with what AMD has been doing.
 
Joined
May 2, 2013
Messages
170 (0.04/day)


Don't give these performance-per-watt charts when a couple of Gameworks/Nvidia-poisoned games can swing things to their side by a great deal. Here's a chart of load power consumption. Keep in mind that on average your graphics card is under full load less than 20% of the day.
If you look at the chart, you'll see that the Titan X consumes about the same as the 290X while being ~40% faster.

I personally wouldn't care about 40% more power under load for the same performance if the product is considerably cheaper. Again, an extra $10 - $15 on power bills over a one-year period isn't much to talk about.
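
That ballpark checks out with simple arithmetic. A sketch using assumed figures (100 W of extra draw, 3 hours of gaming a day, $0.12/kWh — all hypothetical, not from the thread):

```python
def annual_cost_usd(extra_watts: float, load_hours_per_day: float,
                    usd_per_kwh: float) -> float:
    """Extra yearly electricity cost of a higher-draw graphics card."""
    kwh_per_year = extra_watts / 1000 * load_hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Hypothetical inputs: 100 W extra at load, 3 h/day under load, $0.12/kWh.
print(f"${annual_cost_usd(100, 3, 0.12):.2f} per year")  # ~$13
```

At those assumptions the figure lands right in the quoted $10-15 range; a heavier gaming habit or pricier electricity scales it linearly.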
 
Joined
May 2, 2013
Messages
170 (0.04/day)
Fiji is not the same as Hawaii. That's like saying Maxwell is the same as Kepler...

Those are marketing terms. Fiji is a GCN chip with optimizations here and there to improve performance where it matters. The same goes for Nvidia's chips.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
What's more important is idle power consumption. Something AMD doesn't have in check for multi monitor setups. I don't care since I have just one monitor, but still...
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.79/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
What's more important is idle power consumption. Something AMD doesn't have in check for multi monitor setups. I don't care since I have just one monitor, but still...
...but I like my 200-watt idle power consumption. :laugh:
How much of that do you think is my 6870s? :p Actually, #2 is in ULV, so I suspect most of it is the primary, which is the non-reference one.


Don't give these performance/watt charts when a couple of Gameworks/Nvidia-poisoned games can swing things to their side by a great deal. Here's a chart of load power consumption. Keep in mind that on average your graphics card is under full load less than 20% of the day.
If you look at the chart, you'll see that TitanX consumes about the same as the 290X while being ~40ish% faster.

I personally wouldn't care about 40% more power under load for the same performance if the product is considerably cheaper. Again, an extra $10 - $15 for power bills in a one year period isn't much to talk about.

You see, the problem with that is the way Hilbert at Guru3D calculates power consumption versus how W1zz actually measures it. Hilbert does some math to approximate the GPU's power usage. If you actually read his reviews, he explains it:
Hilbert said:
Let's have a look at how much power draw we measure with this graphics card installed. The methodology: We have a device constantly monitoring the power draw from the PC. We simply stress the GPU, not the processor. The before and after wattage will tell us roughly how much power a graphics card is consuming under load. Our test system is based on an eight-core Intel Core i7-5960X Extreme Edition setup on the X99 chipset platform. This setup is clocked to 4.40 GHz on all CPU cores. Next to that we have energy saving functions disabled for this motherboard and processor (to ensure consistent benchmark results). We'll be calculating the GPU power consumption here, not the total PC power consumption.

Power consumption Radeon R9-390X
  1. System in IDLE = 94W
  2. System Wattage with GPU in FULL Stress = 342W
  3. Difference (GPU load) = 248W
  4. Add average IDLE wattage ~10W
  5. Subjective obtained GPU power consumption = ~ 258 Watts
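
The approximation in that quote is just wall-power subtraction. The same arithmetic, written out as a sketch:

```python
def approx_gpu_load_watts(system_idle_w: float, system_load_w: float,
                          gpu_idle_estimate_w: float = 10.0) -> float:
    """Hilbert's method: subtract the idle system draw from the loaded
    draw, then add back an estimate of the GPU's own idle consumption."""
    return (system_load_w - system_idle_w) + gpu_idle_estimate_w

# The R9-390X numbers from the quote: 94 W system idle, 342 W under GPU stress.
print(approx_gpu_load_watts(94, 342))  # 258.0
```

The weakness is everything this hides: PSU efficiency changes with load, and the CPU isn't perfectly idle while feeding the GPU, which is why measuring DC power at the card itself is more trustworthy.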

Then you have @W1zzard, who has gone through painstaking work to figure out the actual power draw of the GPU from the PCI-E slot and PCI-E power connectors, as described in his reviews:
W1zzard said:
For this test, we measure the power consumption of only the graphics card via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, the values here only reflect the card's power consumption as measured at its DC inputs, not that of the whole system.

We use Metro: Last Light as a standard test representing typical 3D gaming usage because it offers the following: very high power draw; high repeatability; is a current game that is supported on all cards; drivers are actively tested and optimized for it; supports all multi-GPU configurations; test runs in a relatively short time and renders a non-static scene with variable complexity.

Our results are based on the following tests:
  • Idle: Windows 7 Aero sitting at the desktop (1920x1080) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
  • Multi-monitor: Two monitors connected to the tested card, both using different display timings. Windows 7 Aero sitting at the desktop (1920x1080+1280x1024) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable. When using two identical monitors with same timings and resolution, power consumption will be lower. Our test represents the usage model of many productivity users, who have one big screen and a small monitor on the side.
  • Blu-ray Playback: Power DVD 9 Ultra is used at a resolution of 1920x1080 to play back the Batman: The Dark Knight Blu-ray disc with GPU acceleration turned on. Measurements start around timecode 1:19, which has the highest data rates on the BD with up to 40 Mb/s. Playback keeps running until power draw converges to a stable value.
  • Average: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen). In order to heat up the card, we run the benchmark once without measuring power consumption.
  • Peak: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Highest single reading during the test.
  • Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high no-game power-consumption that can typically be reached only with stress-testing applications. We report the highest single reading after a short startup period. Initial bursts during startup are not included, as they are too short to be relevant.
Power consumption results of other cards on this page are measurements of the respective reference design.

In all candor, who do you think has gone through more work to answer this question? Hilbert used a Kill-A-Watt and did some math. W1zz used meters on the GPU itself. I think some credit goes to W1zz for going to such lengths to give detailed information to those of us (i.e. most of us) who don't have the hardware to get a real number, instead of simply doing some math with a Kill-A-Watt; the amount of data W1zz provides shows that in detail.

Lastly, Hilbert doesn't even describe what kind of test he uses to figure out power consumption. All he says is that he "stresses the GPU to 100%." All things considered, that's a bit more shady than W1zz giving us the full lowdown.
 

fullinfusion

Vanguard Beta Tester
Joined
Jan 11, 2008
Messages
9,909 (1.60/day)
I would take that cooling right off and put a full-coverage block on it right away. Water cooling the way it's meant to be done.
That comment made me kinda chuckle, well almost, but whatever :rolleyes:

Now, without you going apeshit on me, let me ask why you think it needs a full-coverage block. The memory is right beside the GPU, so why full coverage? Ahh, but it is full coverage.. just not the coverage that's meant to be, see what I just did there :rockout: Does the water cooler just cover the GPU and not the 4 memory modules? :wtf: ... I'm about 99.99999% sure the block covers the entire GPU/memory..

Tbh I can't wait 3 more weeks to get my new card... I want it now, gaming evolved, see what I did there :laugh:

Ok, fun times over :peace:
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
It has been a long time since I've seen GPU temps for a stock card this low and this quiet. I must say it's a job well done by AMD engineers for at least making the card cool and quiet under load.
Any GPU can be made to run cool when a water cooler is applied. Shoot, a GTX 980 I had a water cooler on topped out at 43°C while clocked at 1600 MHz.

AMD developed this card with many times less budget than Nvidia, mind you. Technically speaking, GCN is superior to anything Nvidia can develop. AMD drivers tend to increase their cards' performance over time, while NV drivers don't.
This comes from an owner of a GTX 970.

AMD drivers increase performance while NV drivers don't? That comment is so full of stupidity I don't even know where to start. As for GCN being more advanced: most of their cards didn't even support some of their own new tech, FreeSync for example.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,013 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
That comment made me kinda chuckle, well almost, but whatever :rolleyes:

Now, without you going apeshit on me, let me ask why you think it needs a full-coverage block. The memory is right beside the GPU, so why full coverage? Ahh, but it is full coverage.. just not the coverage that's meant to be, see what I just did there :rockout: Does the water cooler just cover the GPU and not the 4 memory modules? :wtf: ... I'm about 99.99999% sure the block covers the entire GPU/memory..

Tbh I can't wait 3 more weeks to get my new card... I want it now, gaming evolved, see what I did there :laugh:

Ok, fun times over :peace:

It's not about need; water cooling, to me, is mostly for the aesthetic (seeing how hardware doesn't run all that hot anymore to warrant it, no one needs it), and I find AIOs, unless it's the Swiftech kits, rather ugly. If I already have a water-cooling loop, why would I want a part that just adds more clutter in terms of tubes and another radiator to mount, when I can just strip off the stock cooling and throw a nice full waterblock on it to integrate into my already awesome water-cooling setup?

Also, I am more worried about the GPU + VRM temps than the memory (maybe HBM is different, but I know the VRMs on my 780 run hotter than the memory), which is where full cover comes into play.

Also, temps will be even better in a real water-cooling loop. My 360 + 240 rad setup laughs at that dinky 120mm.
 