# The Ampere owners club - 30xx series



## 15th Warlock (Oct 24, 2020)

Hi everyone, I don't think there's a club for Ampere based cards yet, so let's get the ball rolling!

I'll go first, here's my RTX 3090 FE, ordered from Best Buy:


Hope more of you can post pics of your Ampere based cards!

As more cards become available, I hope this club will grow in size soon! Happy hunting!


----------



## ebivan (Oct 26, 2020)

Ampere availability is so bad, it seems you're the only proud owner here. 
Congratulations anyways


----------



## 15th Warlock (Oct 27, 2020)

ebivan said:


> Ampere availability is so bad, it seems you're the only proud owner here.
> Congratulations anyways



Thank you, I double checked before opening this thread to make sure there wasn't one for Ampere owners yet, and was honestly surprised not to see a thread for this topic.

 I'm sure I've seen a few people in the forum post about their 30xx cards, I hope they come join the club and post pics of their cards soon!


----------



## JalleR (Oct 27, 2020)

My card's first delivery date was two days ago; it has been pushed back twice and is now late December


----------



## 15th Warlock (Oct 27, 2020)

JalleR said:


> My cards first delivery date was 2 days ago it has been pushed 2 times and is now late December



Where did you order your cards from? Hope you get them soon! Don't forget to post pics here!


----------



## theonek (Oct 27, 2020)

Late December, just in time for a Christmas miracle


----------



## bug (Oct 27, 2020)

15th Warlock said:


> Where did you order your cards from? Hope you get them soon! Don't forget to post pics here!


You didn't have to twist the knife...


----------



## 15th Warlock (Oct 27, 2020)

bug said:


> You didn't have to twist the knife...





I just asked where he ordered from and wished him luck getting them soon, how's that twisting the knife?

Back to topic: now that the 3070 has launched, I expect more people will be joining the club soon, that card has a great price/performance ratio.

I'm also excited to see what AMD is bringing to the table. I'm already fully committed to an AMD build; my Asus TUF X570-Plus is ready and waiting for Zen 3 to launch.

It's been a while since I've been excited about the PC upgrade scene, as I didn't see any reason to upgrade the 2080 Ti or the 8700K currently driving my build, but both AMD and Nvidia have delivered in terms of performance. I'm looking forward to my first AMD CPU ever, and can't wait to pair it with my 3090!

Now we just need availability to be less of an issue. 2020 has been atrocious when it comes to bots seizing all the goods and preventing enthusiasts from getting access to these new cards; hope that trend dies as more and more retailers start enforcing captchas for purchases as well as limits on how many cards can be ordered per account.


----------



## ebivan (Oct 28, 2020)

15th Warlock said:


> Back to topic, now that the 3070 has launched, I expect more people will be joining the club soon, that card has a great price/performance ratio.


Yeah, sure. Looking at Proshop's 3070 statistics: they ordered 4,620 AIB cards. Of those, a whopping 206 have been received and another 578 are allegedly on the way. Seems like there are loads of future 3070 owners out there

Source: https://www.proshop.de/RTX-30series-overview


----------



## Calmmo (Oct 28, 2020)

Cancelled mine last week. (order placed on sept 17)


----------



## Deleted member 24505 (Oct 28, 2020)

ebivan said:


> Yeah, sure. Looking at Proshops 3070 statistics. They ordered 4620 AIB cards. Of those a whopping 206 have been received and another 578 are allegedly on the way. Seems like there are loads of future 3070 owners out there
> 
> Source: https://www.proshop.de/RTX-30series-overview



Them figures are terrible


----------



## 15th Warlock (Oct 28, 2020)

ebivan said:


> Yeah, sure. Looking at Proshops 3070 statistics. They ordered 4620 AIB cards. Of those a whopping 206 have been received and another 578 are allegedly on the way. Seems like there are loads of future 3070 owners out there
> 
> Source: https://www.proshop.de/RTX-30series-overview





Better than zero cards, right?

That card will be in high demand, and supply won't be able to meet that demand for many months, unfortunately. I created this thread for owners of 30xx cards to share their pics and experiences related to Ampere cards. I realize it'll be a while before that happens, but please let's not let it devolve into a bashing contest about availability; there are already plenty of other threads on this forum dedicated to that.

Even if it takes some time for more people to join in, I wanted Ampere owners to have a common place to talk about their cards.


----------



## bug (Oct 28, 2020)

I'm really tempted to join this club. I usually buy into the x60 range, but 3070 looks like a great buy to get a feel of RTRT. I have a 4k monitor, so I'd have to lower some settings here and there (no worries, I know how to tweak them). But I'm in no rush to buy, if I don't find a great Christmas offer, I'll let these (and AMD's counterparts) settle for a few more months.


----------



## okbuddy (Oct 28, 2020)

could we test it with lg oled hdmi2.1

and ps5 hdmi 2.1 with oled too


----------



## bug (Oct 28, 2020)

okbuddy said:


> could we test it with lg oled hdmi2.1
> 
> and ps5 hdmi 2.1 with oled too


Wrong thread?


----------



## 15th Warlock (Oct 28, 2020)

bug said:


> I'm really tempted to join this club. I usually buy into the x60 range, but 3070 looks like a great buy to get a feel of RTRT. I have a 4k monitor, so I'd have to lower some setting here and there (no worries, I know how to tweak a game's settings). But I'm in no rush to buy, if I don't find a great Christmas offer, I'll let these (and AMD's counterparts) settle for a few more months.



Nice, hope you decide to join the club. With performance that's close to the 2080 Ti, I can tell you from personal experience that it should be able to run most current games at 4K with maxed-out settings, especially if DLSS can be turned on.

Also, with AMD about to release more information regarding Navi in only a few more hours, I feel like many people who are on the fence regarding what card to get will finally be able to make a decision. No matter which camp you pick, these are exciting times to be a PC gamer!


----------



## bug (Oct 28, 2020)

We'll see. If I give this more time, the 3060 Ti will also enter the picture. And the gap between the 3070 and 3080 is pretty big.


----------



## ratirt (Oct 28, 2020)

bug said:


> We'll see. If I give this more time, the 3060Ti will also enter the picture. And the gap between 3070 and 3080 is pretty big


Maybe it's a matter of time till the 3070 Ti shows up. I'd go with the 3070 Ti just to have an NV card in my other system.


----------



## Vayra86 (Oct 28, 2020)

bug said:


> I'm really tempted to join this club. I usually buy into the x60 range, but 3070 looks like a great buy to get a feel of RTRT. I have a 4k monitor, so I'd have to lower some settings here and there (no worries, I know how to tweak them). But I'm in no rush to buy, if I don't find a great Christmas offer, I'll let these (and AMD's counterparts) settle for a few more months.



I'd refrain from 8GB with a 4K panel, tbh. It's the exact edge case you shouldn't be on with this sort of capacity and res. But I'm sure you've considered the ifs.

1440p is really its sweet spot. Other than that... I find myself in the exact same boat: waiting it out for AIB reviews, good availability and early-adopter driver nonsense to pass, plus Navi's similar path to shelves.


----------



## bug (Oct 28, 2020)

Vayra86 said:


> I'd refrain from 8GB with a 4K panel, tbh. Its the exact edge case you shouldn't be on with this sort of capacity and res. But I'm sure you've considered the ifs.
> 
> 1440p sweetspot really. Other than that... I find myself in the exact same boat. Waiting it out for AIB reviews, good availability and early adopter driver nonsense to pass.


Meh, what's the worst that can happen? Turn down the texture quality a notch 2-3 years from now?
And I don't really game anymore. But I do have a backlog of a few outstanding titles I want to try. So, not playing the latest and greatest, I think I'll be fine 
I went 4k because it's great for picture retouching and IDE real-estate. If gaming can't keep up with that, I'll live with it.


----------



## Vayra86 (Oct 28, 2020)

bug said:


> Meh, what's the worst that can happen? Turn down the texture quality a notch 2-3 years from now?
> And I don't really game anymore. But I do have a backlog of a few outstanding titles I want to try. So, not playing the latest and greatest, I think I'll be fine
> I went 4k because it's great for picture retouching and IDE real-estate. If gaming can't keep up with that, I'll live with it.



Oh yeah, that's definitely a big factor. It is for me too. I find myself quickly burned out on new titles and releases. Most of it is shallow stuff, and the non-shallow stuff generally isn't cutting edge in graphics. Most of what I play is from years past...

But my experience is also that gaming hype goes through ups and downs. Some new title might just be the aha moment you've wanted for ages... and with RT development probably intertwined, I think a margin is wise.


----------



## Fouquin (Oct 31, 2020)

Didn't pay MSRP but also didn't pay out to a scalper. Local warehouse I contacted after launch finally got around to me with a card offer and I took it.



It's not the best but it's a functional RTX 3080. Slaps the old 2080 SUPER around for sure.









3DMark result: www.3dmark.com


----------



## 15th Warlock (Oct 31, 2020)

Fouquin said:


> Didn't pay MSRP but also didn't pay out to a scalper. Local warehouse I contacted after launch finally got around to me with a card offer and I took it.
> 
> View attachment 173902
> 
> ...



Congratulations! And welcome to the club! Yes, that's a huge upgrade over a 2080 Super.


----------



## Hyderz (Oct 31, 2020)

I got a Galax RTX 3090 EX Gamer White OC. Roughly MSRP after converting US dollars and customs tax,
so I didn't pay full price. I'll hook it up to my 6700K and 9900K and post some ultrawide benchmark results later.
I'm upgrading from a GTX 1070 Ti.


----------



## P4-630 (Oct 31, 2020)

Hyderz said:


> ex gamer white




I will buy an RTX 3080, end of this year probably.


----------



## kurosagi01 (Nov 1, 2020)

Here's my Palit RTX 3080 GamingPro non-OC that I got two weeks after launch.


----------



## 15th Warlock (Nov 1, 2020)

kurosagi01 said:


> Here's my Palit rtx 3080 gamingpro non OC that I got 2 weeks after launch.
> View attachment 174039
> View attachment 174040


That’s a nice looking card, too bad we can’t get Palit cards here in America. Congrats mate!


----------



## bug (Nov 1, 2020)

15th Warlock said:


> That’s a nice looking card, too bad we can’t get Palit cards here in America. Congrats mate!


You can, but I think they're called Galax or KFA2 over there


----------



## kurosagi01 (Nov 2, 2020)

bug said:


> You can, but I think they're called Galax or KFA2 over there


Ironically, you can get KFA2 in the UK too, but not Gainward or Galax. Sad; I remember the fancy artwork on Gainward cards back in the 9000 series.


15th Warlock said:


> That’s a nice looking card, too bad we can’t get Palit cards here in America. Congrats mate!


Appreciate it mate. They're my go-to brand apart from EVGA for Nvidia GPUs, as they're a generally "underrated" brand compared to others, and Wizzard always gives them a positive review.


----------



## JustAnEngineer (Nov 4, 2020)

My RTX 3080, purchased directly from EVGA, arrived this evening. It was a tight fit in the Define 7 Compact with the front-mounted radiator.


----------



## kurosagi01 (Nov 11, 2020)

There seems to be more availability popping up for Founders Edition cards the past few weeks.
My older brother initially bought the 3070, but he wasn't too pleased to not have any LEDs on the GeForce text, and he wanted just a bit more fps playing Monster Hunter, so he bought a 3080 when they became available and got it the following week (ordered on Thursday last week).
I've decided to buy his 3070, which will be going into my GF's PC; it will be a very nice bump from a 1660 Super.


----------



## X800 (Nov 13, 2020)

I did finally get my MSI 3080 Ventus 3X 10G OC. It did take 57 days to get... The card is great; it boosts to 2130MHz, so I'm happy.


----------



## ThisMayBeYou (Nov 13, 2020)

Didn't pay MSRP, but I'm still happy with it.
Bought it for my 1440p monitor. Love these frames. Clocks boost up to 1995MHz.


----------



## Mussels (Dec 9, 2020)

I feel poor now - should be here within 48 hours






This thread's so quiet due to lack of owners, lol


----------



## kurosagi01 (Dec 9, 2020)

Here is a pic of the 3070 FE now in my GF pc.


----------



## Mussels (Dec 9, 2020)

that top intake is just getting yeeted right back outside, can you move that front top fan back further?


----------



## kurosagi01 (Dec 9, 2020)

Mussels said:


> that top intake is just getting yeeted right back outside, can you move that front top fan back further?


I can, yeah, just haven't got round to it yet. I'll probably do it after replacing the CPU cooler, as it's too tall for the Carbide 240, which is my own fault for not checking clearance when I bought it lol.


----------



## the54thvoid (Dec 9, 2020)

Mussels said:


> I feel poor now - should be here within 48 hours
> 
> 
> 
> ...



Yeah. I can get fleeced (not for me), buy a 'picture' of a 3080 for MSRP, or easily buy a 3090 (which I feel is a waste of money, for me).

You all enjoy those cards, though.

Edit: Scan in the UK has all of these RTX 3090s. Clearly, they're not getting as much love as Nvidia had hoped.

All sold out... in a few hours. Insane.


----------



## 15th Warlock (Dec 10, 2020)

Yay! The thread has been brought back from the dead! Congrats to all the new club members! Hope we get more people in soon!


----------



## jlewis02 (Dec 10, 2020)

I get to join a club early for once.


----------



## Mussels (Dec 10, 2020)

Ugh, I paid for 1-day shipping and it's been most of a day, whyyyyyy isn't it here

(1-day shipping is a joke here in Aus; it's very likely NOT to arrive in one day, but on a $2k order the extra $5 was trivial)
The back of my GPU has a solid plate over it; wonder what resistor setup it'll have


----------



## nguyen (Dec 12, 2020)

Joining the club


----------



## Mussels (Dec 12, 2020)

3080 arrived and installed

PCI-E power plugs are inverted and REALLY tight together, so I gotta be artistic to make the RGB cables fit (this is pre-wiring tidy-up, FYI. It's triggering my OCD, even if the missing front panel was a big clue)


----------



## nguyen (Dec 13, 2020)

Wooo... the waterblock doesn't fit inside my Lian Li O11 case, it seems; the soft tubes are there temporarily while I test out the GPU.

The Time Spy score seems good; this TUF 3090 OC might be a good clocker


----------



## 15th Warlock (Dec 13, 2020)

Nice pics everyone! Love all of your builds. Now that more cards are hitting the channel, I'm glad more people are enjoying Ampere; it's a monster of an architecture, and with Cyberpunk finally out, it's breathtaking to finally experience what next gen feels like.

Here are pics of my updated rig. I tried for the longest time to get a Ryzen 5900X; my PC has been long overdue for an upgrade, and I was really hoping to go team AMD this time, but alas, it was not to be. Between it selling out in seconds on release and bots grabbing what little stock has trickled in since, I got tired of trying to get a 5900X, and I wouldn't have been able to update my PC in time for Cyberpunk.

At 4K resolutions, though, this CPU is not a bottleneck at all. I found an awesome deal at Newegg for Black Friday: buying it in a combo with the Asus mobo, I saved over $200 off MSRP for both after MIRs, and it really allows the 3090 to stretch its legs. The new 360mm AIO keeps the CPU chill at 63°C even after hours of gaming, so it boosts to 5.2GHz on all cores pretty much all the time. My card is also hovering in the lower 60s and boosting to 1910MHz pretty much the whole time.


Keep posting pictures of your awesome rigs, it's nice to see new members joining the club almost every day now!


----------



## Caring1 (Dec 14, 2020)

Mussels said:


> 3080 arrived and installed
> 
> PCI-E power plugs are inverted and REALLY tight together, so i gotta be artistic to make the RGB cables fit (this is pre-wiring tidy up, FYI. It's triggering my OCD, even if the missing front panel was a big clue)
> View attachment 179211


The Radiator positioning triggers my OCD, I'd swap the two around and have the GPU's Radiator hoses at the bottom front, with the CPU's Radiator at the top.


----------



## Mussels (Dec 14, 2020)

Caring1 said:


> The Radiator positioning triggers my OCD, I'd swap the two around and have the GPU's Radiator hoses at the bottom front, with the CPU's Radiator at the top.



GPU puts out triple the wattage of the CPU, it's going exhaust


----------



## wolf (Dec 14, 2020)

Can't believe it's taken me this long to find the club, I've had my TUF 3080 since the 21st of September!

Frikken love the card; it's an absolute powerhouse with any task I can throw at it, plus it runs cool and quiet. Blows me away how much of an improvement it is in all these regards over my GTX 1080.

Current setup pictured below, with more context around the build if you want it here


----------



## nguyen (Dec 14, 2020)

wolf said:


> Can't believe it's taken me this long to find the club, I've had my TUF 3080 since the 21st of September!
> 
> Frikken love the card, it's an absolute powerhouse with any task I can throw at it, plus it runs cool and quiet. Blows me away how much of an improvement it is in all of these regards over my GTX1080.
> 
> Current setup pictured below, with more context around the build if you want it here



Was about to post some undervolting results with my 3090 in your Ampere undervolting experiment.
I set up four undervolt profiles, at 800mV, 850mV, 900mV and 950mV.

Tested in Watch Dogs Legion
Clock - Voltage - Power usage - FPS
1860MHz - 800mV - 255W - 83fps
1950MHz - 850mV - 292W - 86fps
2010MHz - 900mV - 313W - 87fps
2070MHz - 950mV - 342W - 89fps

850mV seems to be my go-to undervolt for the best performance at reasonable power consumption
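Out of curiosity, the perf-per-watt of those four profiles works out like this (a throwaway Python sketch, numbers copied straight from the table above):

```python
# Perf-per-watt for the undervolt profiles above:
# (clock MHz, voltage mV, power W, fps) from the Watch Dogs Legion runs.
profiles = [
    (1860, 800, 255, 83),
    (1950, 850, 292, 86),
    (2010, 900, 313, 87),
    (2070, 950, 342, 89),
]

for clock, mv, watts, fps in profiles:
    print(f"{mv} mV: {fps / watts:.3f} fps/W")
```

800mV actually wins on pure efficiency (~0.33 fps/W vs ~0.26 fps/W at 950mV); 850mV just trades a few watts back for 3 extra fps.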


----------



## wolf (Dec 14, 2020)

nguyen said:


> Was about to post some undervolting with my 3090 in your Ampere undervolting experiment


Damn looks like you did fairly well in the silicon lottery! How's it treating you for noise and thermals?


----------



## nguyen (Dec 14, 2020)

wolf said:


> Damn looks like you did fairly well in the silicon lottery! How's it treating you for noise and thermals?



Well, I have a full-cover waterblock (Bitspower) and the 3090 seems to behave the same as my 2080 Ti did: core temp is 6°C higher than water temperature.
I can't close my glass side panel because the WB sticks out, so I have to live with a little pump noise until I change to a new case.

My guess is the 3090 will always be of higher silicon quality than the 3080. Is your TUF 3080 the normal or OC version? I bought the OC version and it's around $100 more expensive than the non-OC version in my country.


----------



## wolf (Dec 14, 2020)

nguyen said:


> well I have full cover waterblock (Bitspower) and the 3090 seems to behave the same as my 2080 Ti did, core temp is 6C higher than water temperature.
> I can't close my glass side panel because the WB stick out, now I have to live with a little pump noise until I change into a new case .
> 
> My guess is 3090 will always be of higher silicon quality than 3080, is your TUF 3080 the normal or OC version ? I bought the OC version and it's around 100usd more expensive than the non OC version is my country.


Yeah, good point, I didn't think of that; by virtue of being a 3090, it's likely a bit better. Nah, I got the non-OC. For me it was a battle with crashing websites etc. in the precious minutes after launch, so I took the first one I could get.

No regrets tho, the TUF is simply outstanding.


----------



## Mussels (Dec 14, 2020)

What's the best way to undervolt these beasties?


----------



## nguyen (Dec 14, 2020)

Mussels said:


> Whats the best way to undervolt these beasties?



With Afterburner:

1. Apply a core clock and memory clock overclock
2. Open the Curve Editor (the tiny bar graph left of the core clock) and left-click on the curve at the voltage you want to undervolt to (I recommend 850mV)
3. Highlight all of the area from 850mV to the end with Shift + left-click and drag
4. Hit Shift + Enter twice so that the curve flattens at 850mV, then hit Apply in Afterburner before closing the Curve Editor
5. Save the profile
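What those steps do, in effect, is clamp the upper part of the voltage/frequency curve. A toy sketch of the transform (hypothetical curve points, not the real Afterburner API):

```python
# Clamp every point at or above the chosen voltage to the clock picked there,
# which is the "flatten the curve" effect of the steps above.
def flatten_curve(curve, target_mv, target_mhz):
    """curve: list of (voltage_mV, clock_MHz) pairs in ascending voltage order."""
    return [(mv, target_mhz if mv >= target_mv else mhz) for mv, mhz in curve]

stock = [(750, 1700), (800, 1800), (850, 1900), (900, 1980), (1000, 2100)]
undervolted = flatten_curve(stock, 850, 1900)
# every point from 850 mV up now sits at 1900 MHz, so the card
# never requests more than 850 mV under load
```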


----------



## wolf (Dec 14, 2020)

Personally, before opening the curve editor I slide the core clock all the way down in the main window and apply; then, when you open the curve editor, the whole line is so low that when you pick your point (say 850mV @ 1950MHz) and apply, everything to the right is flat in one go.


----------



## Mussels (Dec 14, 2020)

Found the WCCFTECH guide




806mV / 1800MHz seems pretty damn efficient, taking over 100W off the card, and it's a little overpowered for me right now, so a 10% loss for that kind of heat reduction is a winner

Oh, anyone want an anime tiddies 3080?


----------



## kurosagi01 (Dec 14, 2020)

Mussels said:


> Found the WCCFTECH guide
> 
> 
> 
> ...


Asus has a Gundam edition of the 3080 and 3090 which I would be tempted by, but it's only available in Asia.


----------



## wolf (Dec 15, 2020)

Mussels said:


> 806mv / 1800MHz seems pretty damn efficient taking over 100W off the card, and its a little overpowered for me right now so a 10% loss for that kind of heat reduction is a winner


Vs an FE card (so the vast majority of 3080 results shown in reviews and the like), ~1800MHz won't even be 10% slower; I just checked a few and they seem to level off when heat-soaked at ~1830MHz. My card runs 1830MHz @ 806mV with +500MHz memory for 20Gbps effective, so that should be good for roughly stock performance with much better power draw and thermals.
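For anyone wondering where "20Gbps effective" comes from: Afterburner reports the 3080's GDDR6X at 9500MHz stock (19Gbps effective, since the effective rate is twice the reported clock), so a +500MHz offset lands at 20Gbps. A quick check, assuming that 9500MHz baseline:

```python
# Effective GDDR6X data rate as a function of the Afterburner memory offset.
# Assumption: stock reported memory clock is 9500 MHz (19 Gbps effective),
# and the effective rate is twice the reported clock.
def effective_gbps(base_mhz: float, offset_mhz: float) -> float:
    return (base_mhz + offset_mhz) * 2 / 1000

print(effective_gbps(9500, 0))    # stock
print(effective_gbps(9500, 500))  # the +500 MHz offset above
```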

I'm in the same boat of overpowered so if I want that extra 5-10% OC horsepower down the track I can muscle in for it, or just go for it in winter? lol


----------



## Mussels (Dec 15, 2020)

wolf said:


> Vs an FE card, so the vast majority of 3080 results shown in benchmarks in reviews and the like, ~1800mhz won't even be 10% slower, just checked a few and they seem to level off when heat soaked at ~1830mhz. My card runs 1830mhz@806mv with +500mhz memory for 20gbps effective, so that should be good for roughly stock performance with much better power draw and thermals.
> 
> I'm in the same boat of overpowered so if I want that extra 5-10% OC horsepower down the track I can muscle in for it, or just go for it in winter? lol



Good point about comparing to an FE card. I have 240mm water cooling on mine, so I'm the one person who SHOULD be overclocking the tits out of it, but hell yeah to 40°C GPU loads!


----------



## nguyen (Dec 15, 2020)

5 undervolt profiles:
1845MHz - 800mV
1920MHz - 850mV
2010MHz - 906mV
2070MHz - 950mV
2130MHz - 1025mV

Well, for 10% more FPS, the power consumption rises by 50% (220W -> 330W). AC Valhalla seems to load the GPU very lightly even at 3440x1440; the GPU can actually reach max clock before it reaches the power limit.
But ouch... 60fps. I can get higher FPS in Cyberpunk at 3440x1440 Ultra with RT/DLSS on


----------



## kurosagi01 (Dec 15, 2020)

I've got my undervolt currently at 850mV/1920MHz with stock memory speed, which is better than my older bro's FE.
I also have a custom fan curve which runs at a peak 65% speed; highest temp so far is around 60°C.


----------



## xrobwx71 (Dec 15, 2020)

I'll join the club when I can find one. I'm not paying scalper prices; I'd go back to my 3dfx Voodoo3 before I did that.


----------



## Mussels (Dec 15, 2020)

After spending heaps of time screwing with the overclocking curve and finding out my original information was pretty spot on: anyone not after benchmarking records should give this a shot in Afterburner
(don't forget to save to profile 1 and start with Windows!)

1. Reset to defaults
2. -300 on the GPU core
3. Ctrl-F
4. Click on 806mV, drag to 1800MHz
5. Apply in the main window; the curve will flatten
(the -300 is purely to make it flatten the curve for you with one adjustment, saves clicking/effort)

Roughly 100W less power consumed, usually less than a 5FPS drop (zero, if you were CPU limited)
It's an amazing heat and noise difference


----------



## nguyen (Dec 16, 2020)

made a quick video on how to overclock + undervolt


----------



## Ominence (Dec 16, 2020)

Please halp me increase the PL on my card (Aorus Master 3070). I'm hitting a peak of 303W in the Time Spy Extreme stress test @ 68-70°C (custom fan curve, 100% @ 70°C) and 1995-1965MHz core. The auto-tune curve went to +58MHz in Aorus Engine 1.94. The card is fully stable. I had very occasional random stability issues at a manual +60MHz OC (over the 1845MHz factory base OC), which 'stabilised' the card at '2040-2010', but only in the Port Royal stress test. Methinks the +11 power budget became an issue every now and then (270W base factory OC + 30W user OC = 300W). VMEM clock is stock, voltage slider is stock.


----------



## nguyen (Dec 16, 2020)

Ominence said:


> Please halp me increase the PL on my card (Aorus Master 3070). I'm hitting peak 303w on Time Spy Extreme Stress Test @ 68-70C (custom fan curve 100% @ 70C) and 1995-1965Mhz core. Auto tune curve went to +58Mhz in Aorus Engine 1.94. Card is fully stable. Had very occasional random stability issues at manual +60Mhz OC (over the 1845 factory base OC) which 'stabilised' the card at '2040-2010' but only on Port Royal Stress Test. Me thinks the +11 power budget became an issue every now and then (270 base factory OC+30 user OC=300w). VMEM clock is stock, voltage slider is stock.




300W is already the highest PL you can find on any RTX 3070; you are not getting any additional performance with a higher PL anyway.


----------



## Ominence (Dec 16, 2020)

nguyen said:


> 300W is already the highest PL you can find for any RTX 3070, you are not getting any additional performance with any higher PL anyways.


But I am, mate. In my head I'll 'gain', or is it 'gane'... see. I want that 2GHz number rock-solid stable... if only that Palit 3070 card (330W) had 2x HDMI for my all-HDCP 2.2 setup; I think that one would show me 2GHz. But it's not available in Oz. And it's also 2% moar performance!!!!!


----------



## nguyen (Dec 16, 2020)

Ominence said:


> But I'm  mate. In my head i'll 'gain' or is is it gane... see . I wanna that 2GHz number rock solid stable.... if only that Palit 3070 card (330w) had 2x HDMI for my all HDCP 2.2 setup. i think that will show me 2GHz. But also it's not available in Oz. But it's also 2% moar performance!!!!!



Try a lighter-load game like AC Valhalla and you will see your GPU constantly running in the 2GHz range; stress tests just load the GPU too heavily.


----------



## Ominence (Dec 16, 2020)

nguyen said:


> Try lighter load game like AC Valhalla and you will see your GPU constantly running in the 2Ghz, stress test just load the GPU too heavily.


That is fair. Or fare. Sorry, cannot stop being silly for some stupid reason. But now I'm also very .


----------



## Mussels (Dec 17, 2020)

This 3080 is a totally different beast with that 100W power reduction, and it's how Nvidia should have launched them:

a standard high-efficiency 1.8GHz mode, then a 2GHz turbo mode (BIOS switch?)


----------



## wolf (Dec 17, 2020)

Mussels said:


> This 3080 is a totally different beast with that 100W power reduction, and how Nvidia should have launched them.


I tend to agree. I have to assume there are other factors at play: like they couldn't get a high enough binning rate at these sorts of clock/volt targets, or they were chasing that last ~2-5% so hard because they had a set performance target? I don't know nearly enough, as a mere mortal who is just a hardware enthusiast.

It's a bloody brilliant GPU once tuned, I'd wager most people can get higher than stock performance with less than stock power consumption too.


----------



## bug (Dec 17, 2020)

wolf said:


> I tend to agree, I have to assume there are other factors at play, like they couldn't get a high enough binning rate at these sort of clock/volt targets, or they were really shooting for the last ~2-5% that hard because they had a set performance target? I don't know nearly enough as a mere mortal who is just a hardware enthusiast.



Either that or they _really_ wanted to push that 12-pin connector.
In the end, they did what AMD has done with every card since Polaris. And since for every card since Polaris I've said it's not OK to make the user undervolt to get the card working in optimum shape, I'm not going to give Nvidia a pass on these either.


----------



## HammerON (Dec 17, 2020)

Joining the club this coming Friday! Been waiting since release to get my hands on the EVGA 3080 FTW3 Ultra.
@Mussels I would have preferred the card you got, though.


----------



## Mussels (Dec 17, 2020)

I get the feeling they knew how AMD was going to perform and NEEDED that 5% boost.
Imagine if the 3080 and 3090 were 5% slower and AMD beat them in raster benchmarks?

I just want to flash this in a BIOS and make it permanent

Edit: Hammer, the EVGA is what I originally ordered, but there is STILL no stock here in Au... so I moved to the crazy expensive one that lasted more than 3 seconds


----------



## wolf (Dec 17, 2020)

bug said:


> Either that or they _really_ wanted to push that 12 pin connector
> In the end, they did what AMD did with every card since Polaris. And since for every card since Polaris I've said it's not ok to make the user undervolt to get the card working in optimum shape, I'm not going to give Nvidia a pass on these either.


In terms of board power, though, 2x 8-pin would have been enough, so I don't really see the correlation between a high TDP and pushing the 12-pin? You may be right tho... I think it's just a Founders thing to push the engineering, and if I'm honest I see it catching on as the years pass and the connector coming standard on high-end power supplies.

It's a tough one with undervolting, because they do need to ship a card that is stable and performant for 100% of buyers, but it seems like Polaris, Vega, Ampere etc. are quite a bit further from optimum than most.


Mussels said:


> I get the feeling they knew how AMD was going to perform and NEEDED that 5% boost
> Imagine if the 3080 and 3090 were 5% slower and AMD beat them in raster benchmarks?
> 
> I just want to  flash this in a BIOS and make it permanent


Yeah, maybe that was the case after all, and I suspect we will never really know. My best bet is a combination of things, such as that and perhaps hedging bets on Samsung 8N node performance.

Interesting idea. I'd probably bump up to ~825mV, make a BIOS for that and call it a day; just a tad more buffer to be sure it was rock solid in any workload, given the workloads coming up are likely to be even more punishing than today's.


----------



## HammerON (Dec 19, 2020)

New GPU arrived!


----------



## Mussels (Dec 19, 2020)

HammerON said:


> New GPU arrived!View attachment 180154



That's what I originally wanted!


----------



## T3RM1N4L D0GM4 (Dec 25, 2020)

Here I am!


----------



## FireFox (Dec 25, 2020)

It felt weird not to be part of this Club 







I guess I was lucky: the previous owner paid €1,000 and was selling it for €850, and because I had to pick it up, I paid just €800


----------



## Mussels (Dec 31, 2020)

So, warranty resulted in an upgrade






It's only single colour RGB and not ARGB, but its synced to razer (as are the PCI-E power cables) so i can do some fun lighting stuff


----------



## wolf (Dec 31, 2020)

Mussels said:


> So, warranty resulted in an upgrade


I missed the bit where something went wrong! Much changeover money?


----------



## witkazy (Dec 31, 2020)

Kinda by accident here: planned to get a 6800 XT but 2020 happened and I wound up with a Palit 3070 OC, and I can't say a bad word about it. Playing Cyberpunk at 1440p no worries, so cheers.


----------



## nguyen (Dec 31, 2020)

witkazy said:


> Kinda by accident here: planned to get a 6800 XT but 2020 happened and I wound up with a Palit 3070 OC, and I can't say a bad word about it. Playing Cyberpunk at 1440p no worries, so cheers.



Enjoying the best game of the year on New Year's Eve is bliss, isn't it


----------



## HammerON (Dec 31, 2020)

wolf said:


> I missed the bit where something went wrong! Much changeover money?


Yeah - we want details @Mussels !!!


----------



## Mussels (Jan 1, 2021)

wolf said:


> I missed the bit where something went wrong! Much changeover money?



3080 AIO sprung a leak, and they only had 3090s in stock so i got sale prices on what they had left
Forgive image quality, my pixel 4XL battery died and i'm on a spare until the warranty arrives (they dont ship weekends or public holidays here, so.... fook)


3080 said nope





3090 said "hah i'm eating tempered glass and i love it"






Horizontal mount made system look very clean


----------



## wolf (Jan 1, 2021)

Mussels said:


> 3080 AIO sprung a leak, and they only had 3090s in stock so i got sale prices on what they had left
> Forgive image quality, my pixel 4XL battery died and i'm on a spare until the warranty arrives (they dont ship weekends or public holidays here, so.... fook)


Damn not having much luck, a Pixel and a 3080 

Still, 3090 for dayzzzz


Mussels said:


> 3080 said nope





Mussels said:


> 3090 said "hah i'm eating tempered glass and i love it"
> 
> Horizontal mount made system look very clean


Which have you stuck with? my money is on horizontal


----------



## HammerON (Jan 1, 2021)

I like being able to see those fans!


----------



## Mussels (Jan 1, 2021)

horizontals worked out, seeing 60C load in most games

In the future, i'll get a case with triple slot vertical mount support and show off the bling
(The RGB lighting cables genuinely look better in horizontal mount too, and they're individually controlled for fancy lighting)


----------



## Mussels (Jan 4, 2021)

Got a BIOS update from Galax support for the 3090, in response to a problem with the fan speed at idle. Uploaded it to TPU; will wait and see if it helps or not.


----------



## cst1992 (Jan 4, 2021)

Newest Ampere owner!





Got it for a good price too - INR 35,900 ($491) including shipping, handling and taxes.
That's very cheap here in India, cheaper even than the AMD RX 5700XT!
Check this post out for my impressions!



T3RM1N4L D0GM4 said:


> Here I am!


That box and tagline, as well as the card design, make me sure of one thing: Galax and KFA2 are the same company.
They're cheap where I live, so I've been avoiding them. Do let me know how yours performs!


----------



## T3RM1N4L D0GM4 (Jan 4, 2021)

cst1992 said:


> That box and tagline, as well as card design make me sure of one thing: Galax and KFA2 are the same company.
> They're cheap where I live, so I've been avoiding them. Do let me know how yours performs!


You are right, KFA2 is the ITA/EU brand name for Galax.
Card assembly is pretty solid and it's very quiet under load. ATM I'm playing Rise of the Tomb Raider and Need for Speed Heat, both at 70-100fps @ Ultra settings (3440x1440 34" freesync monitor).


----------



## cst1992 (Jan 4, 2021)

T3RM1N4L D0GM4 said:


> ATM I'm playing Rise of the Tomb Raider and Need for Speed Heat


If you get Control and/or Red Dead Redemption 2, also post results for that.
Also, FS2020


----------



## Mussels (Jan 4, 2021)

T3RM1N4L D0GM4 said:


> You are right, KFA2 is the ITA/EU brand name for Galax.
> Card assembly is pretty solid and it's very quiet under load. ATM I'm playing Rise of the Tomb Raider and Need for Speed Heat, both over 70-100fps @Ultra setting (3440x1440 34" freesync monitor).



I've used and sold a few galax cards lately

They cheap out in a few ways (RGB and not ARGB for example - and they seem to have just the one cooler design)
but they're cheaper than the competition, quiet, and not too ugly (a 3090 air cooled will NEVER be silent... but its faaaaaaar better than the 1070ti i just upgraded from)


----------



## cst1992 (Jan 6, 2021)

T3RM1N4L D0GM4 said:


> ATM I'm playing Rise of the Tomb Raider and Need for Speed Heat


How do you like Heat? I just bought NFS: 2015 over Steam and I don't really like it that much.
I've heard Payback is worse, but Heat is actually decent.


----------



## thesmokingman (Jan 9, 2021)

Just got mine in, installed it then had to do stuff all day. It's so damn big. I ordered a bykski block and a 12pin adapter, so now gotta rough it on water for a while. Hopefully I'll get some more time to play with it this weekend.


----------



## the54thvoid (Jan 9, 2021)

thesmokingman said:


> Just got mine in, installed it then had to do stuff all day. It's so damn big. I ordered a bykski block and a 12pin adapter, so now gotta rough it on water for a while. Hopefully I'll get some more time to play with it this weekend.


 Get pics once the block is on!


----------



## thesmokingman (Jan 9, 2021)

the54thvoid said:


> Get pics once the block is on!


Oh yea, I will. Frankly speaking, it was really strange when I was researching blocks. There are only 4 blocks available, and one brand has a bad history of leaks, so that one's off the list. That just leaves Aqua, BP and Bykski. The Aqua and BP are out of stock everywhere. The supposed EK beauty... well, looks like EK is not making it, just threw away the whole idea lol. Hello my friend Bykski!

And then on top of that, I was hoping to find other users who had posted their experiences or results, and I could not find one. I hope I just goofed the search parameters, but man, I could not find any. When I get mine on I'll def share what I learn.


----------



## dyonoctis (Jan 15, 2021)

I finally joined that _exclusive_ club, I was initially going for the humble 3060Ti, but finding the 3070 at MSRP (519€) was a much better deal than all the overpriced aftermarket 3060Ti. LDLC (french retailer) doesn't list the FE on their website, you can only access them with a specific link, they stayed in stock for a good 10min. Once it's done the link will just show you a 404 page.


----------



## nguyen (Jan 15, 2021)

dyonoctis said:


> I finally joined that _exclusive_ club, I was initially going for the humble 3060Ti, but finding the 3070 at MSRP (519€) was a much better deal than all the overpriced aftermarket 3060Ti. LDLC (french retailer) doesn't list the FE on their website, you can only access them with a specific link, they stayed in stock for a good 10min. Once it's done the link will just show you a 404 page.



Sounds like the elusive preorder link that AMD sent to their Team Red promoters there


----------



## cst1992 (Jan 15, 2021)

519 EUR is very nice for 3070.

Post more pics of the card!


----------



## dyonoctis (Jan 15, 2021)

cst1992 said:


> 519 EUR is very nice for 3070.
> 
> Post more pics of the card!


I'm running out of zip ties, I cannot post pictures of my PC in that state


----------



## bug (Jan 15, 2021)

dyonoctis said:


> I finally joined that _exclusive_ club, I was initially going for the humble 3060Ti, but finding the 3070 at MSRP (519€) was a much better deal than all the overpriced aftermarket 3060Ti. LDLC (french retailer) doesn't list the FE on their website, you can only access them with a specific link, they stayed in stock for a good 10min. Once it's done the link will just show you a 404 page.


The card is still expensive, but a good deal, all things considered. Congrats on your luck and enjoy!


----------



## sepheronx (Jan 17, 2021)

Well, I kinda made a mistake buy.

So at our local Memory Express they had a bunch of RTX 3060 Tis and 3070s arrive. I told my wife to flip a coin while we were in the car (I only went there to pick up an SSD for another build I'm doing to sell); I got heads (instead of tails, obviously), so I purchased an ASUS RTX 3070 Dual.


----------



## cst1992 (Jan 17, 2021)

Why? It's a great card and it's the more powerful of the two.
Unless you think it's a bad value for money it's not a mistake buy.


----------



## sepheronx (Jan 17, 2021)

cst1992 said:


> Why? It's a great card and it's the more powerful of the two.
> Unless you think it's a bad value for money it's not a mistake buy.


$800 CAD total is expensive.  Far from MSRP even if you do the exchange rate


----------



## cst1992 (Jan 17, 2021)

Agree.


----------



## bug (Jan 17, 2021)

sepheronx said:


> $800 CAD total is expensive.  Far from MSRP even if you do the exchange rate


Asus has little to do with MSRP. They're overpriced by default. Sometimes for a good reason, but most times... not.


----------



## sepheronx (Jan 17, 2021)

bug said:


> Asus has little to do with MSRP. They're overpriced by default. Sometimes for a good reason, but most times... not.


Believe it or not, it was the cheapest one. It was ~$760 before taxes. The cheapest model in Canada is the Gigabyte Eagle at ~$730 before taxes.


----------



## bug (Jan 17, 2021)

sepheronx said:


> Believe it or not, it was the cheapest one. It was ~$760 before taxes. The cheapest model in Canada is the Gigabyte Eagle at ~$730 before taxes.


What I meant is: don't fret too much about not being close to MSRP, Asus usually isn't even without factoring in supply issues 

That said, enjoy your new purchase


----------



## Mussels (Jan 19, 2021)

I still love that i got a 3090, and barely manage to max it out in any game.

tasty irony.


----------



## nguyen (Jan 19, 2021)

Mussels said:


> I still love that i got a 3090, and barely manage to max it out in any game.
> 
> tasty irony.



Time to dial in some tasty overclock there , 5-10% more perf helps especially when fps <60.


----------



## Mussels (Jan 19, 2021)

nguyen said:


> Time to dial in some tasty overclock there , 5-10% more perf helps especially when fps <60.


nah i underclocked to 1800MHz at 0.8v to annoy the people who have to overclock to reach my performance


and then i play rimworld to let the GPU run without starting the fans


----------



## wolf (Jan 19, 2021)

Mussels said:


> and barely manage to max it out in any game.


I know that feeling even with the 3080, it just breezes through any task I can throw at it, bar say CP2077 with all the bells.

Picked up Battlefront 2 for free on EPIC, turned all the dials to 11, and need to add in well over 100% render scale to bring it down from frame capping on my monitor, and that's undervolted at 1830mhz/806mv too. I love being in a position to add gratuitous super-sampling.


----------



## Dellenn (Jan 19, 2021)

Got mine from Best Buy back in October when they started selling them.


----------



## toilet pepper (Jan 19, 2021)

So I got an MSI 3080 Ventus OC and it runs very hot. At stock it reaches around 75C at 1845MHz during Port Royal. I have since undervolted it to 0.81V @ 1830MHz and used a custom fan curve to get the temps right with no perceivable impact. I now get around 65C in the same test and during gaming.

What I found odd was when I was doing some mining on the side. The card was thermal throttling on my custom curve even with the core downclocked to 1100MHz at 0.71V. I sort of fixed it by running the fan at 80%.

I realized this shouldn't be the case. The GDDR6X was cooking itself. The thermal pads are there and I can see them from the side. Even the "graphene" (plastic) backplate was getting hot, and the thermal pads visibly connect it to the board.

From the side it is hard to tell what kind of thermal pads they used, but it looks somewhere near 2 or 3mm. So I bought a 3mm Gelid thermal pad, squeezed it a little and replaced the existing pad.

The GPU temp increased by 2C during Port Royal and gaming, which is actually good news: it means the VRAM is now transferring heat to the heatsink properly. To validate this I ran the mining application again; there is no more thermal throttling with my curve, and the GPU heats up accordingly.

There are few reviews of this card and it's hard to get decent information about it. This was the cheapest and most decent 3080 I could find at the time, and I am not that surprised by the things I found out about it myself.

TLDR

There are no VRAM temp sensors. I'm sure this is an NVIDIA thing; if they could be read, I'm sure wiz would have included them in GPU-Z.

GDDR6X can get really hot and will cook itself.

Cheap coolers are cheap. The Ventus cooler can barely handle the stock 3080. I imagine MSI intentionally cheaped out on the thermal pads to make the GPU core temp look a little better, since you can't check VRAM temps.

I know this problem isn't isolated to the Ventus, as I have seen people with cards from other brands complaining about the same problem.

PS: I know mining is not its intended use, but this PSA might help people in the future when the second-hand market gets flooded.


----------



## cst1992 (Jan 19, 2021)

toilet pepper said:


> So I got an MSI 3080 Ventus OC and it runs very hot. At stock it reaches around 75C at 1845MHz during Port Royal. I have since undervolted it to 0.81V @ 1830MHz and used a custom fan curve to get the temps right with no perceivable impact. I now get around 65C in the same test and during gaming.
> 
> What I found odd was when I was doing some mining on the side. The card was thermal throttling on my custom curve even with the core downclocked to 1100MHz at 0.71V. I sort of fixed it by running the fan at 80%.
> 
> ...


Do you have the power draw recorded with temps in these tests?

Anyone owning a reference 3080 care to do a comparison?


----------



## nguyen (Jan 19, 2021)

toilet pepper said:


> So I got an MSI 3080 Ventus OC and it runs very hot. At stock it reaches around 75C at 1845MHz during Port Royal. I have since undervolted it to 0.81V @ 1830MHz and used a custom fan curve to get the temps right with no perceivable impact. I now get around 65C in the same test and during gaming.
> 
> What I found odd was when I was doing some mining on the side. The card was thermal throttling on my custom curve even with the core downclocked to 1100MHz at 0.71V. I sort of fixed it by running the fan at 80%.
> 
> ...



Some other user complained that his 3090's VRAM is thermal throttling during mining and his GPU-Z screenshot indicated that the memory is pulling 187W, while during gaming it's only around 80-100W.
Seems like you really need to underclock the GDDR6X for mining 



wolf said:


> I know that feeling even with the 3080, it just breezes through any task I can throw at it, bar say CP2077 with all the bells.
> 
> Picked up Battlefront 2 for free on EPIC, turned all the dials to 11, and need to add in well over 100% render scale to bring it down from frame capping on my monitor, and that's undervolted at 1830mhz/806mv too. I love being in a position to add gratuitous super-sampling.



4K is really tough on the 3090 though, I have to settle for DLSS Performance in CP2077 and lower some settings in Watch Dogs Legion (DLSS Balanced since Performance looks bad in WD Legion) to get stable 60fps .


----------



## cst1992 (Jan 19, 2021)

nguyen said:


> 4K is really tough on the 3090 though, I have to settle for DLSS Performance in CP2077 and lower some settings in Watch Dogs Legion (DLSS Balanced since Performance looks bad in WD Legion) to get stable 60fps .


And here NVIDIA is, advertising the 3090 for 8K gaming


----------



## Hyderz (Jan 19, 2021)

Mussels said:


>



galax card?


----------



## cst1992 (Jan 19, 2021)

It is.


----------



## Mussels (Jan 19, 2021)

Tis indeed a galax

it's RGB and not ARGB, and other than that its actually really nice quality. at stock the fans ramp up louder than i'd like, but quieter than the MSI 1070ti/1080 it replaced

heres rainbow mode activated


----------



## cst1992 (Jan 19, 2021)

Mussels said:


> at stock the fans ramp up louder than i'd like


This should help with fan noise: https://www.techpowerup.com/forums/threads/3060ti-undervolting.276449/#post-4426953
Achieved really close results with noticeably lower power draw.


----------



## toilet pepper (Jan 19, 2021)

cst1992 said:


> Do you have the power draw recorded with temps in these tests?
> 
> Anyone owning a reference 3080 care to do a comparison?



@stock it is 320 watts according to GPU-Z. Undervolted to 1830MHz at 0.81V it is 260 watts. When mining it is 220 watts at 1100MHz +500 mem at 0.72V.


----------



## cst1992 (Jan 19, 2021)

toilet pepper said:


> @stock it is 320 watts according to GPU-Z. Undervolted to 1830MHz at 0.81V it is 260 watts. When mining it is 220 watts at 1100MHz +500 mem at 0.72V.


Look who you tagged


----------



## Mussels (Jan 19, 2021)

toilet pepper said:


> @stock it is 320 watts according to GPU-Z. Undervolted to 1830MHz at 0.81V it is 260 watts. When mining it is 220 watts at 1100MHz +500 mem at 0.72V.



I've been using the undervolt trick since day 1 on both my deceased 3080 and this 3090, that's why i mentioned its noise at stock

at 1800/0.8v i dont notice it over the room fan here (it's summer)


----------



## toilet pepper (Jan 20, 2021)

1830 at 0.81V is my sweet spot. Any higher clocks and Port Royal will just crash.

I just checked, the vram was sucking 150 watts of power when mining. If there is poor thermal conductivity with the cooler it will really cook itself.


----------



## FireFox (Jan 20, 2021)

What is the point of under-clocking the card?


----------



## wolf (Jan 20, 2021)

Knoxx29 said:


> What is the point under-clocking the card?


So these cards (and a lot of others these days) have a voltage/frequency curve that is designed around the lowest common denominator and then some, basically ultra safe (more voltage than needed for any given frequency) so that 100% of cards will be stable no matter what you get in the silicon lottery. Undervolting  is tweaking the card so that you're optimising for your unique silicon 'win' or 'loss'.

In GA102's case (3080 and 3090), it seems the FE cards settle in the low 1800MHz range with a lot more volts, 0.950v to 1.025v or thereabouts if memory serves, but it is very common to run them right around ~1800MHz @ 0.800v.

In practice, what this achieves is roughly FE/stock performance, but with drastically reduced power consumption (50+ watts) and, as a consequence, heat too, due to the lower voltage used to maintain that frequency.

The same methodology of undervolting can be used using the MSI Afterburner curve editor to maximise clock speeds and effectively overclock too, say 1950+ mhz but at lower than stock volts, like maybe ~.950v or so - thus giving you better than stock performance for roughly the same power and heat. The other extreme is also possible for those chasing crazy efficiency, like ~1600-1700mhz at even lower volts like maybe 0.700v or so, thus giving you ~90% of stock performance for ~70% of the power.

Hope I've done the explanation some justice, I'm sure others will help fill gaps.
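To put rough numbers on that, here's a quick sketch using the usual first-order dynamic power model (power scales with f·V²). The operating points below are illustrative guesses in the ranges mentioned above, not measurements from any real card:

```python
# Rough sanity check of the undervolting trade-offs, using the
# first-order dynamic power model: P is proportional to f * V^2.
# Operating points are illustrative, not measured from a real card.

def rel_power(freq_mhz, volts, base_freq=1800.0, base_volts=0.95):
    """Power draw relative to a stock-ish ~1800MHz @ 0.95v point."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

points = [
    ("stock-ish FE", 1800, 0.95),   # low-1800s at ~0.95v
    ("undervolt",    1800, 0.80),   # same clocks, less voltage
    ("efficiency",   1650, 0.70),   # chasing perf-per-watt
]
for label, f, v in points:
    print(f"{label:12s} perf ~{f / 1800:4.0%}  power ~{rel_power(f, v):4.0%}")
```

The model ignores static/leakage power and the VRAM, so treat the percentages as directional only; it still shows why cutting voltage at the same clock saves a disproportionate amount of power (the voltage term is squared).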


----------



## Mussels (Jan 20, 2021)

Knoxx29 said:


> What is the point under-clocking the card?


100W more for 5% performance isn't worth it?

Some people play games, not fight for benchmarks


----------



## nguyen (Jan 20, 2021)

toilet pepper said:


> 1830 at .81 is my sweet spot. Any higher clocks and port royal will just crash.
> 
> I just checked, the vram was sucking 150 watts of power when mining. If there is poor thermal conductivity with the cooler it will really cook itself.



Yeah, my 3090's VRAM uses like <90W; these GDDR6X modules are capable of sucking down a tremendous amount of juice.
The Asus TUF design with a separate VRAM heatsink would be much better suited for mining, no wonder they are all sold out.


----------



## wolf (Jan 20, 2021)

nguyen said:


> The Asus TUF design with a separate VRAM heatsink would be much better suited for mining


Indeed, been using my TUF 3080 for mining. Still can't really tell what the mem temps are, just peace of mind that they're well cooled; even the back of the card, where there are no memory chips, has thermal pads contacting the backplate behind the memory.


----------



## FireFox (Jan 20, 2021)

wolf said:


> So these cards (and a lot of others these days) have a voltage/frequency curve that is designed around the lowest common denominator and then some, basically ultra safe (more voltage than needed for any given frequency) so that 100% of cards will be stable no matter what you get in the silicon lottery. Undervolting  is tweaking the card so that you're optimising for your unique silicon 'win' or 'loss'.
> 
> In GA102's case (3080 and 3090), it seems the FE cards settle in the low 1800MHz range with a lot more volts, 0.950v to 1.025v or thereabouts if memory serves, but it is very common to run them right around ~1800MHz @ 0.800v.
> 
> ...


Thanks for the explanation.

However, I know what under-volting/under-clocking means; I just asked the wrong question to @toilet pepper. I meant that we used to overclock cards as high as we could with the least voltage possible, but these days cards aren't just under-volted but under-clocked too.

I was curious and tried 0.800v for 1800MHz but nothing happened; in Afterburner I set a curve of 0.800v for 1800MHz, saved the profile and ran Heaven, but my card reverted to its default settings.


----------



## bug (Jan 20, 2021)

Knoxx29 said:


> Thanks for the explanation.
> 
> However, I know what under-volting/under-clocking means; I just asked the wrong question to @toilet pepper. I meant that we used to overclock cards as high as we could with the least voltage possible, but these days cards aren't just under-volted but under-clocked too.
> 
> I was curious and tried 0.800v for 1800MHz but nothing happened; in Afterburner I set a curve of 0.800v for 1800MHz, saved the profile and ran Heaven, but my card reverted to its default settings.


The thing is highest clocks and highest voltages aren't sustained for very long. If you lower those, you don't lose much, but in some cases you can lower the power draw significantly.

To me, this is not a feature, but rather a sign the manufacturers push their cards into unreasonable territory, because they need to look good in benchmarks.


----------



## FireFox (Jan 20, 2021)

bug said:


> The thing is highest clocks and highest voltages aren't sustained for very long. If you lower those, you don't lose much, but in some cases you can lower the power draw significantly.
> 
> To me, this is not a feature, but rather a sign the manufacturers push their cards into unreasonable territory, because they need to look good in benchmarks.



i have my card at 0.918v for 1950MHz and it won't downclock when gaming, and for me that is fine. Now, could you explain the last bit of my previous post?


----------



## toilet pepper (Jan 20, 2021)

wolf said:


> Indeed, been using my TUF 3080 for mining, still, can't really tell what the mem temps are, just piece of mind that they're well cooled, even the back of the card with no memory chips has thermal pads on the backside of the memory contacting the backplate.


I just saw a teardown of the Asus TUF and it looks good! The Ventus has thermal pads on the back as well and it gets hot too. Honestly, I never considered mining when I bought the Ventus, and I couldn't justify the TUF costing around $100 more when 3080s already cost around $1000 here.


----------



## bug (Jan 20, 2021)

Knoxx29 said:


> i have my card at 0.918v for 1950Mhz and it wont downclock when Gaming and for me that is fine, now could you explain me the last bit of my previous post?


Not really. I only buy mid-range cards, so I don't have first-hand experience with undervolting&friends.


----------



## Deleted member 205776 (Jan 20, 2021)

My 2070 died on me in December (that's what you get for buying Zotac), I needed to use my PC immediately, and I found a 3070 Gaming X Trio in stock very close to MSRP. So I went for it. Got the card like 2 weeks ago but figured I'd still post this here. Never goes past 64C, not even in FurMark; MSI did a great job on the cooling for this card.












(note: I have since installed a GPU support mount from uphere since it was sagging a little lol)

Bit bummed about the only 8 gigs of VRAM but it'll do I guess. Playing at 1440p144. My next upgrade will be RTX 4000.


----------



## toilet pepper (Jan 20, 2021)

Knoxx29 said:


> Thanks for the explanation.
> 
> However, I know what under-volting/under-clocking means; I just asked the wrong question to @toilet pepper. I meant that we used to overclock cards as high as we could with the least voltage possible, but these days cards aren't just under-volted but under-clocked too.
> 
> I was curious and tried 0.800v for 1800MHz but nothing happened; in Afterburner I set a curve of 0.800v for 1800MHz, saved the profile and ran Heaven, but my card reverted to its default settings.



Here's what I noticed with the Ampere cards and also my 2070S: if left at stock, the card will constantly try to hit the highest boost clock and pump as much voltage into it as possible, then downclock because it's getting hot. This cycle just soaks the heatsink and causes unnecessary power draw for that split-second 1-5fps increase. Keep in mind that if you don't fix the voltage curve, it will do this all the time; that's why it runs hot.


----------



## biffzinker (Jan 20, 2021)

Knoxx29 said:


> I was curious and tried 0.800v for 1800MHz but nothing happened; in Afterburner I set a curve of 0.800v for 1800MHz, saved the profile and ran Heaven, but my card reverted to its default settings.


What happens if you lower the Power Limit below 100%? Say 85/80%, should be the same as an undervolt. I'm unable to change the voltage for my RTX 2060; the only recourse is the power limit.


----------



## bug (Jan 20, 2021)

toilet pepper said:


> Here's what I noticed with the Ampere cards and also my 2070S: if left at stock, the card will constantly try to hit the highest boost clock and pump as much voltage into it as possible, then downclock because it's getting hot. This cycle just soaks the heatsink and causes unnecessary power draw for that split-second 1-5fps increase. Keep in mind that if you don't fix the voltage curve, it will do this all the time; that's why it runs hot.


That's the whole idea behind "boosting". It's the same for CPUs.
It works well for bursty workloads (e.g. loading a complex web page, performing some calculations in Excel, short game benchmark runs). For steady loads, it's supposed to stabilize at lower clocks. But if those lower clocks are not low enough, they will still generate too much heat and cause thermal throttling.

Also keep in mind these values are not always set in stone. CPUs at least will boost as high as your cooling can sustain and with good cooling you can go quite a bit above nominal TDP. I wouldn't be surprised if Ampere did the same, but so far I don't have enough data on that.
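The boost-then-settle behaviour described above can be sketched with a toy thermal model. Every constant here (heat-in scale, cooling coefficient, throttle step, temperature limit, operating points) is invented for illustration; it just shows how an aggressive boost point see-saws down under a sustained load, while a fixed undervolt can sit below the throttle point indefinitely:

```python
# Toy model of boost vs. thermal throttle under a sustained load.
# All thermal constants and clock/voltage points are made up for illustration.

def simulate(clock_mhz, volts, steps=300, temp=40.0, limit=83.0):
    """Heat in scales with f*V^2; cooling scales with delta-T above ambient."""
    clocks = []
    for _ in range(steps):
        heat_in = (clock_mhz / 1800.0) * (volts / 0.95) ** 2 * 1.2
        temp += heat_in - 0.02 * (temp - 25.0)
        if temp > limit:                      # thermal throttle: drop a bin
            clock_mhz = max(clock_mhz - 15, 1500)
        clocks.append(clock_mhz)
    return sum(clocks) / len(clocks), temp

avg_boost, t_boost = simulate(1950, 0.95)  # stock-style aggressive boost
avg_uv, t_uv = simulate(1800, 0.80)        # fixed undervolt point
print(f"boost:     avg {avg_boost:.0f} MHz, final {t_boost:.0f} C")
print(f"undervolt: avg {avg_uv:.0f} MHz, final {t_uv:.0f} C")
```

In this sketch the boosted run spends its first stretch at 1950MHz, crosses the limit, and throttles down, while the undervolted run holds its clock for the whole window at a much lower temperature — the same "short benchmark vs. steady load" distinction made above.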


----------



## FireFox (Jan 20, 2021)

biffzinker said:


> What happens if you lower the Power Limit below 100%? Say 85/80%, should be the same as a undervolt. I’m unable to change the voltage for my RTX 2060 only recourse is the power limit.



I tried and it was working when running Heaven Benchmark





Stopped working right after i started playing COD


----------



## Mussels (Jan 20, 2021)

biffzinker said:


> What happens if you lower the Power Limit below 100%? Say 85/80%, should be the same as a undervolt. I’m unable to change the voltage for my RTX 2060 only recourse is the power limit.


the undervolt works a lot better


the power limit still lets it boost up really high for brief periods, then it has to lower to cool down, so it's see-sawing - whereas the undervolt gives you more performance per watt and stays at a steady level

in a quick short benchmark that doesn't build up heat, being at 2GHz would give a better score - but in sustained gaming where it might dip to 1900 due to heat anyway, i'm gunna get the same experience locked to 1800 with a curve - and the reduced heat in the case could help CPU performance too, in many systems


----------



## FireFox (Jan 20, 2021)

Mussels said:


> the undervolt works a lot better


but i can't get it at 0.800v, the card reverses the voltage that i set; i don't know if this is a card issue


----------



## Mussels (Jan 20, 2021)

Are you using a curve?
make sure to uninstall anything that isn't afterburner, and run the latest beta


----------



## FireFox (Jan 20, 2021)

Mussels said:


> Are you using a curve?
> make sure to uninstall anything that isn't afterburner, and run the latest beta


yes i am, and Afterburner 4.6.2. Uninstall EVGA Precision too?


----------



## Mussels (Jan 20, 2021)

Knoxx29 said:


> yes i am using and afterburner 4.6.2, uninstall evga precision too?


Does 4.6.2 even support ampere? you need the 4.6.3 betas for ampere OCing


----------



## wolf (Jan 20, 2021)

toilet pepper said:


> Honestly, I never considered mining when I bought


Me neither but when I saw Ampere going beast mode on mining, and I have solar power....


Knoxx29 said:


> but i cant get it at 0.800v the card reverse the voltage that i set, i don't know if this is a card issue


There are a few methods, but looking at your pic there are still values above your target.

What I do in MSI AB is click reset, then drag the core clock all the way down in the main window (-300 or something, I think). Then open the curve editor, drag the point at your target voltage up to your target clock, and hit apply; everything to the right should be a flat line now.


----------



## FireFox (Jan 20, 2021)

Mussels said:


> Does 4.6.2 even support ampere? you need the 4.6.3 betas for ampere OCing


i will download 4.6.3 in a few minutes, you are right, the 4.6.2 doesn't support ampere


wolf said:


> What I do in MSI AB is click reset, then drag core clock all the way down in the main window, -300


-502 says here


----------



## Mussels (Jan 20, 2021)

As others have said, its simple enough

reset
-300 on core clock
ctrl-f to bring up the window
click and drag the clock you want to the voltage you want (in my case, 0.8v 1800mhz) click apply, save profile - le done

if you want to tweak it, reset and try again - this stops you from needing to adjust every single step along the way


----------



## toilet pepper (Jan 21, 2021)

Knoxx29 said:


> but i cant get it at 0.800v the card reverse the voltage that i set, i don't know if this is a card issue


0.800 is too low and can only be achieved by a very good sample. Try checking what your default clocks and voltages are, and go for progressively lower voltage until it crashes.


----------



## FireFox (Jan 21, 2021)

My main concern was that i couldn't undervolt because the card was reversing the settings i applied. Thanks @Mussels and @wolf for your advice, now it is working



toilet pepper said:


> 0.800 is too low and can only be achieved by a very good sample. Try checking what your default clocks and voltages are, and go for progressively lower voltage until it crashes.


I didn't try 0.800v, I went for 0.813v 1815MHz, and so far it's stable running Heaven


----------



## cst1992 (Jan 21, 2021)

Mussels said:


> Does 4.6.2 even support ampere? you need the 4.6.3 betas for ampere OCing


I was able to raise my card's power limit via 4.6.2, but I wasn't able to set a minimum fan speed.
Haven't tried undervolting or overclocking yet.


----------



## sepheronx (Jan 21, 2021)

Man, the ASUS RTX 3070 Dual is very loud and very hot. I get up to about 83C during gameplay even though the GPU normally isn't maxed out (guess Tropico 6 isn't stressing it too much), and yet it's very audible.

It makes the ASUS GTX 1070 Turbo sound silent.


----------



## cst1992 (Jan 21, 2021)

sepheronx said:


> Man, the RTX 3070 ASUS Dual is very loud and very hot.  I get up to about 83C during gameplay and normally the GPU isn't being maxed out (guess Tropico 6 isn't stressing it too much) and yet it is very audible.
> 
> Makes the ASUS GTX 1070 Turbo sound silent.


How much power draw is there? Could you post a GPU-Z screenshot?


----------



## sepheronx (Jan 21, 2021)

cst1992 said:


> How much power draw is there? Could you post a GPU-Z screenshot?


I'll have to do it when I get home.


----------



## FireFox (Jan 21, 2021)

cst1992 said:


> I was able to raise my card's power limit via 4.6.2, however not able to set a minimum fan speed.
> Not tried undervolting or overclocking yet.


Yes, you can do that via 4.6.2, but undervolting is a PITA; make things easy and use the 4.6.3 beta


----------



## sepheronx (Jan 21, 2021)

OK, so this is rather interesting.

I tested Cyberpunk with the 3070 Dual OC and I'm surprised how well it handled it. So far. Temps didn't get crazy, nor did the fan speed. The noise issue seems to be isolated to Tropico 6.




The game is set to 1080p (my monitor), max everything with Ray Tracing set to medium and DLSS at Balanced.


----------



## qubit (Jan 21, 2021)

Can't get these damned things for love nor money without going to some dodgy seller at inflated prices on eBay. Therefore, I'm gonna enjoy my virtual 3080 and check out just how good those virtual benchmarks are!


----------



## ThrashZone (Jan 21, 2021)

Hi,
Yeah 2080s I wouldn't be in any hurry either.


----------



## biffzinker (Jan 21, 2021)

qubit said:


> Can't get these damned things for love nor money without going to some dodgy seller at inflated prices on eBay. Therefore, I'm gonna enjoy my virtual 3080 and check out just how good those virtual benchmarks are!


How about sneakers instead?  







			https://www.techpowerup.com/forums/threads/general-nonsense.232862/page-492#post-4441409


----------



## thesmokingman (Jan 21, 2021)

It's about to get wet here!


----------



## mouacyk (Jan 21, 2021)

thesmokingman said:


> It's about to get wet here!View attachment 184968


Let me know what load temps you get and how much rad space you have. I have Bykski's block for the Gigabyte Eagle OC card and I think temps are a little high, although the internet seems to be fine with 50C+ temps.


----------



## thesmokingman (Jan 21, 2021)

Man, this is the biggest pain in the ass aircooler to remove that I've ever dealt with. Beware of the crappy star screws... and those magnetic covers. The aircooler works great but the way it's put together sucks ass.





mouacyk said:


> Let me know what load temps you get and how much rad space you have. I have Bykski's block for the Gigabyte Eagle OC card and I think temps are a little high, although the internet seems to be fine with 50C+ temps.


I just got it in and it's idling at 21C with water temp at 22C. That's around 10C lower than on air, give or take 1C. I suspect load will be around 40C. I've currently got two HWL Nemesis L-series 480mm rads with AP15s at 1k RPM or less. Did you orient the in/out ports?

Just ran FSU, GPU peaked at 46C. Just got a 5900X in...









I scored 13 497 in Fire Strike Ultra
AMD Ryzen 9 3900XT, NVIDIA GeForce RTX 3090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## biffzinker (Jan 21, 2021)

thesmokingman said:


> It's about to get wet here!View attachment 184968


Looks even nicer at home on the big monitor.


----------



## mouacyk (Jan 22, 2021)

thesmokingman said:


> Man, this is the biggest pain in the ass aircooler to remove that I've ever dealt with. Beware of the crappy star screws... and those magnetic covers. The aircooler works great but the way it's put together sucks ass.
> View attachment 184993
> 
> I just got it in and it's idling at 21c and water temp is 22c. It's idling around 10c lower than air give or take 1c. I suspect load will be around 40c. I've currently got two HWL Nemesis L series 480mm rads with AP15s at 1krpm or less. Did you orient the in/out ports?
> ...


Yeah, I've got the right inlet/outlet ports. See, even you are getting closer to 50C for load temp and that's on 960mm of rad space!  Super clean build though, nice work.


----------



## thesmokingman (Jan 22, 2021)

mouacyk said:


> Yeah, I've got the right inlet/outlet ports. See, even you are getting closer to 50C for load temp and that's on 960mm of rad space!  Super clean build though, nice work.


Thanks. Though I have to add that while my rads total 960mm of space, they're thin 30mm rads. That, and the fact that this is a 350W TDP GPU, so you should expect a higher load temp. In my case the GPU runs up to 2200MHz with power maxed out, yet GPU temp stayed in the 46C range. Not bad IMO.


----------



## toilet pepper (Jan 22, 2021)

I'm planning a custom loop with my 3080 and a 5800X inside the NR200P using the tempered glass panel. Any idea what radiator size is suggested to cool these beasts?


----------



## mouacyk (Jan 22, 2021)

Max it out for that case. I have 120x37mm and 360x60mm, and the GPU heats up to 52C from an idle of 24C.


----------



## 15th Warlock (Jan 31, 2021)

It's nice to see so many new members in the club! Welcome everyone!

I added a new Ampere card to my repertoire, this time inside a new thin-and-light gaming laptop, a Gigabyte Aorus 15G YC.

For the specs, it packs an 8-core/16-thread Intel i7-10870H, an RTX 3080, 32GB of 2933MHz RAM, a 1920x1080 240Hz IPS panel, a 1TB NVMe SSD and a 99Wh Li-ion battery (the upper capacity limit airlines allow in a laptop), all crammed into a chassis weighing only 2kg.

Last night I added a 2TB XPG SX8200 Pro NVMe 1.3 SSD, for a total of 3TB of solid-state storage running at PCIe x4.

Here are some pics:















I'm running Cyberpunk 2077 with all details and ray tracing maxed out, and so far I haven't hit any hiccups; loading times are also insanely fast, taking only a few seconds to load my current saved game from the main menu.

All in all, I'm very impressed with this little laptop. I pre-ordered it as soon as it became available on Newegg and it was delivered the day after launch; for $1,999 before tax, I was impressed by everything Gigabyte was able to cram into this system. I can wholeheartedly recommend it!

Please keep adding pictures of your Ampere systems! It's so awesome to see this clubhouse grow so fast in the last couple of months!


----------



## cst1992 (Jan 31, 2021)

Damn, that's a monster laptop.
Also, it seems the days of 4kg gaming laptops are behind us now!
How are your temps? Are those two fans really enough to cool a 45W CPU and 320W GPU?
Also, it seems your battery is actually 94Wh (15.2V × 6.2Ah), not 99.
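For reference, watt-hours are just pack voltage times amp-hour capacity; a quick check of that math (figures taken from the battery label mentioned above):

```python
# Battery capacity check: Wh = V * Ah
volts, amp_hours = 15.2, 6.2  # figures from the battery label
watt_hours = volts * amp_hours
print(round(watt_hours, 1))  # 94.2 - a bit under the advertised 99Wh
```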


----------



## 15th Warlock (Jan 31, 2021)

cst1992 said:


> Damn, that's a monster laptop.
> Also, it seems the days of 4kg gaming laptops are behind us now!
> How are your temps? Are those two fans really enough to cool a 45W CPU and 320W GPU?
> Also, it seems your battery is actually 94Wh(15.2V * 6.2 Ah), not 99.



Yes, I agree. I remember having an old 17” laptop that tipped the scales at almost 5kg; it was never fun traveling with all that weight in my backpack.

The CPU tops out at 89 °C, while the GPU reaches 72 °C when playing Cyberpunk for extended periods. Also, you gotta remember, this is not a full desktop card but a mobile part rated to top out at around 100W if I'm not mistaken; heck, the power brick is rated at 230W for the whole system. Can't cram a 320W video card into a laptop of any reasonable size or weight lol.

As for the battery, I'm going off the rating in the official specs, as well as the one on the battery label. Either way, I tested it to last a bit over 6 hours with medium use, like browsing the net and streaming videos. I haven't tested it while gaming, but I'm sure doing so while not plugged into an outlet would greatly reduce the performance of both the CPU and the GPU.

My previous gaming laptop, another 15.6” with an i7-7700HQ and a GTX 1070, lasted less than 2 hours on battery under light load, so I consider this a huge improvement.


----------



## cst1992 (Jan 31, 2021)

It seems to be the mobile RTX 3080 then: https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-mobile.c3684
That card is the big brother of the 3070, with 48 SMs enabled instead of 46, but clocked lower to reduce power consumption.
The fact that they managed to uppercut the 3060 Ti at 110W of power consumption is impressive.


----------



## P4-630 (Jan 31, 2021)

15th Warlock said:


> Yes, I agree, I remember having an old 17” laptop that tipped the scales at almost 5Kg, it was never fun traveling with all that weight in my backpack.
> 
> The CPU tops at 89 °C, while the GPU reaches 72 °C when playing Cyberpunk for extended periods of time. Also, you gotta remember, this is not a full desktop card, but a mobile part rated to top at around 100W if I’m not mistaken, heck the power brick is rated for 230W for the whole system, can’t cram a 320W video card in a laptop of any reasonable size or weight lol.
> 
> ...



So do you know which one you got?












						Nvidia RTX 30xx GPUs for laptops have 28 different configurations
					

The series of Nvidia GeForce RTX 30xx cards for laptops actually consists of 28 different models. Nvidia introduced three: the RTX 3060, 3070 and 3080, but there are dozens of different configurations for clock speeds and tgp, which affects performance.  Laptop manufacturers receive a long list...




					www.techpowerup.com


----------



## 15th Warlock (Jan 31, 2021)

P4-630 said:


> So you know which one you got?
> 
> View attachment 186332
> 
> ...



Oh, I'm sure it's one of the Max-Q variants, although so far it's boosting to over 1600MHz, so I'm not complaining 

I think the models with higher wattage are used in much bigger laptops, with beefier cooling solutions.


----------



## Mussels (Feb 8, 2021)

Have you ever been so bored you stuck an intel stock cooler onto your 3090?

I'm testing bitcoin mining, and this keeps the GPU fan down about 15% since it's cooling the RAM... I want a better slim CPU cooler to permanently stick here now


----------



## biffzinker (Feb 8, 2021)

Wonder if that would work on my card? The metal bracket gets pretty toasty during long gaming sessions.


----------



## Mussels (Feb 8, 2021)

biffzinker said:


> Wonder if that would work on my card? The metal bracket gets pretty toasty during long gaming sessions.


102C memory junction down to 96C, and this is as ghetto as it comes.

I'm looking into a 92mm/100mm aluminium heatsink off the net to slap on with thermal tape.

There are 2.5" SATA "heatsinks" on Amazon and eBay that look like they'd add some nice surface area, and could have a 92mm fan slapped on top...


----------



## nguyen (Feb 8, 2021)

How about putting a spare AIO on the backside of your 3090?


----------



## thesmokingman (Feb 8, 2021)

Throw a waterblock on it.


----------



## Mussels (Feb 8, 2021)

Look, look, look.


i have a spare 280mm AIO and i'd totally do something stupid with it, if i could secure it.

2x copper heatsinks have been ordered, say they'll arrive before may lol. we'll see how big a difference they make with a 92mm fan on top.

dont judge me


----------



## cst1992 (Feb 8, 2021)

Mussels said:


> Have you ever been so bored you stuck an intel stock cooler onto your 3090?
> 
> I'm testing bitcoin mining and this keeps the GPU fan down about 15% since its cooling the RAM... i want a better slim CPU cooler to permanently stick here now


That looks kind of cute actually...

Are those pads conductive enough to take full advantage of a 280mm AIO?
You might want to consider a carbon-based pad instead.


----------



## Mussels (Feb 8, 2021)

I'm literally throwing crap from around this room on top of it to see what happens.

I burned one of my fingers on something that was over 100C; I have a blister the size of a pinhead now.

I'm not seeing any real temp differences, but using bitcoin mining as a benchmark (garbage idea) I see about a 20-cent improvement. STONKS.

(this was with the AIO sitting in place but unplugged, then connecting the SATA cable and seeing what happened)



the fan drops somewhere between 5% and 10%, so it IS moving heat


----------



## cst1992 (Feb 8, 2021)

$13.5/day! You only have to run it 24/7 for 200 days for it to pay for itself...
May I recommend a waterblock? That might take a few days off...
(j/k ofc)
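The break-even math behind the joke, sketched out (the $2,700 outlay is hypothetical, implied by the 200-day figure at $13.5/day):

```python
import math

# Days of 24/7 mining before the card pays for itself
def payback_days(card_cost, daily_earnings):
    return math.ceil(card_cost / daily_earnings)

print(payback_days(2700, 13.5))  # 200, matching the joke above
```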


----------



## Mussels (Feb 8, 2021)

I have now undervolted to 0.725v 1600MHz because nothing I do needs the full 350W of this card

It now runs at only 290W while bitcoin mining (waaaaay less in games) at 50% fan speed, 60C
(of course, it still has the Intel heatsink and the AIO sitting on its back helping a little)


----------



## cst1992 (Feb 8, 2021)

How much of a performance difference does the undervolting make? Based on what I've read in other threads, it's quite possible to reduce power draw a lot with minimal impact to performance.


----------



## sepheronx (Feb 8, 2021)

So I want to undervolt/overclock my ASUS 3070 Dual OC. It currently maxes out at around 1815MHz on the core but goes to about 70C (it was heating up more before, but now seems to balance out at around 71 or so).

My options seem to be MSI afterburner and ASUS OC tool.


----------



## phill (Feb 8, 2021)

Guys, I was just wondering: has anyone had any luck cooling the memory on these cards down, or is it not such a problem?? I've seen quite a few ghetto mods on these cards, which leads me to wonder if it's a weak spot or if the manufacturers just didn't plan well enough for mining 

Has anyone got a card under water? Does the memory temp drop a lot compared to air?


----------



## bug (Feb 8, 2021)

cst1992 said:


> How much of a performance difference does the undervolting make? Based on what I've read in other threads, it's quite possible to reduce power draw a lot with minimal impact to performance.


When cards are pushed beyond their sweet spot, performance increases come at an exponential cost in power draw. Depending on how far the card is pushed, you can reduce power draw by 20 (or maybe even 30) percent in exchange for 5-10 percent of performance.

But the thing is, you shouldn't have to undervolt. Undervolting needs thorough testing and software that's usually only available for Windows. As a customer, you shouldn't be subjected to that.
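To put numbers on that sweet-spot argument: a hypothetical card trading 7% of performance for 25% less power comes out well ahead on perf-per-watt (all figures illustrative, not measured):

```python
# Perf-per-watt before and after a hypothetical undervolt
stock_score, stock_watts = 100, 350.0
uv_score, uv_watts = 93, 262.5        # -7% performance, -25% power

efficiency_gain = (uv_score / uv_watts) / (stock_score / stock_watts) - 1
print(f"{efficiency_gain:.0%} better perf/watt")  # 24% better perf/watt
```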


----------



## P4-630 (Feb 8, 2021)

bug said:


> But the thing is, you shouldn't have to undervolt.



Agree.
One of the reasons I'm reluctant to buy AMD hardware:
people with AMD hardware always talk about undervolting.


----------



## FireFox (Feb 8, 2021)

phill said:


> has anyone had any luck cooling these cards memories down


I don't even know how to check memory temp  . Mine is still on air because I haven't found the tubes I want for my watercooling project


----------



## phill (Feb 8, 2021)

I have a card at home (sadly not a 3000 series) with a very high memory temp (might be a faulty sensor or an incorrect reading, I guess?), but I've read a lot about the 3080/3090 having similar issues... I just wondered what owners of these cards have noticed with theirs


----------



## nguyen (Feb 8, 2021)

Mussels said:


> I have now undervolted to 0.725v 1600Mhz because nothing i do needs the full 350W of this card
> 
> It now only runs at 290W bitcoin mining (waaaaay less in games) at 50% fan speed, 60C
> (of course, it still has the intel heatsink and the AIO sitting on its back helping a little)
> ...



I bet putting thermal paste between the backplate and the AIO would decrease VRAM temp even further, leaving more room for a memory overclock.


----------



## FireFox (Feb 8, 2021)

phill said:


> I read a lot about the 3080/3090 having similar issues...


This is after 5 hours and 30 minutes of Gaming


----------



## Mussels (Feb 8, 2021)

cst1992 said:


> How much of a performance difference does the undervolting make? Based on what I've read in other threads, it's quite possible to reduce power draw a lot with minimal impact to performance.


piss all really - I was seeing the GPU sit around 1750 in regular benchies once it was heat-soaked, so I've lost almost nothing except some bursty clocks. I could of course aim for a higher static OC, but it's summer here so I'm in the mood for minimum voltages.


My backplate gets HOT - you can't hold your finger on it. The IR gun says 70C; I can't help but wonder if it's misreading due to age, but the heat transfer makes me confident the card has decent stock pads.


----------



## wolf (Feb 10, 2021)

Bought a 3d printer, so de-shrouded the 3080 again and went duct-mad, plus some tiny cute heatsinks on the backplate for the memory... might add more.


----------



## xkm1948 (Feb 10, 2021)

thesmokingman said:


> It's about to get wet here!



Wait you got a 3090? I thought you only do red flavored GPUs 

jk, congrats on the purchase. That looks like a killer setup


----------



## toilet pepper (Feb 10, 2021)

wolf said:


> Bought a 3d printer, so de-shrouded the 3080 again and went duct-mad, plus some tiny cute heatsinks on the backplate for the memory... might add more.


What are your temps like? I just downsized to an NR200P and ghettoed the deshroud mod as the temps were abysmal. I'm using the 3080 Ventus Gaming OC and getting an average of 62C on the core and 93C on the memory junction during several hours of mining. I replaced the stock thermal pads with Gelid's 12 W/mK pads and they made a big difference. It was getting to 100C+ before.


----------



## phill (Feb 10, 2021)

I'd put a few fans over the card to help it breathe    There's a reason miners never put GPUs into cases and such


----------



## Mussels (Feb 10, 2021)

I reckon I need new memory pads on mine, seeing up to 102C on the junction

anyone know what size pads my Galax 3090 SG1 would use, or will I have to open it up to measure?


----------



## thesmokingman (Feb 10, 2021)

102c? Holy carp man, that's not gonna be good long-term.


----------



## bug (Feb 10, 2021)

thesmokingman said:


> 102c? Holy carp man, that's not gonna be good long-term.


It's ok, GDDR6X is known to run at very high temps.
Well, ok in the sense that it can take it, not that you want that in your case.


----------



## Calmmo (Feb 10, 2021)

84C here, can't complain


----------



## Mussels (Feb 10, 2021)

So would I need 1mm, 2mm?

and what's worth it? hit me up with the good brands

I think heat transfers well because holy shit, the backplate sits at 70C+ (which is why I ghetto-cooled it), but I'm up for making this thing live longer


----------



## FireFox (Feb 11, 2021)

Which one is more important, max temp or average?



Mussels said:


> hit me up with the good brands


I use Arctic


----------



## Mussels (Feb 11, 2021)

Knoxx29 said:


> Which one is more important max temp or average?
> 
> 
> I use Arctic


yeah but what size do i need


----------



## FireFox (Feb 11, 2021)

Mussels said:


> yeah but what size do i need


Using EKWB as a reference, the waterblock compatibility list for your card says 1mm


----------



## Mussels (Feb 11, 2021)

Knoxx29 said:


> Using EKWB as a reference, the waterblock compatibility list for your card says 1mm


it has compatible EK waterblocks? siiiiick

i know what i'll throw money at one day

Time to get some 1mm thermal pads and, at the same time, some fancy Thermal Grizzly stuff I don't need, like their varnish and liquid metal remover


----------



## FireFox (Feb 11, 2021)

Mussels said:


> it has compatible EK waterblocks? siiiiick


----------



## Mussels (Feb 11, 2021)

been browsing the EK site and a few compatible blocks, awesome news for the future.

where the heck did you find the memory pad info? that, i cannot see


----------



## FireFox (Feb 11, 2021)

Mussels said:


> where the heck did you find the memory pad info? that, i cannot see


In the GPU's installation manual


----------



## Mussels (Feb 11, 2021)

I found it mentions 1mm for the front, but nothing for the back (which is what I want to replace)

some of these pads are $50 AUD and above, so getting the wrong size would be an oof - and 1mm seems to be rare locally, odd
Ugh, EK pads are like 1/4 the price of TG, and there's no Arctic locally. choosing is hard yo.


----------



## FireFox (Feb 11, 2021)

A full waterblock comes with 1mm pads, so I assume it's the same for the back and front.


----------



## Mussels (Feb 11, 2021)

that's logic i can work with

Arctic's cheap enough on Amazon; I'll grab an overkill amount and use any leftovers on my other GPUs as needed (in all these years, I've NEVER replaced a thermal pad with a new one - just paste. time to fix that)


----------



## FireFox (Feb 11, 2021)

Mussels said:


> some of these pads are $50Au and above


Expensive.
I paid 15€ for a 145x145mm 1mm sheet


----------



## Mussels (Feb 11, 2021)

$58 for two of those sheets shipped here - it would have been $100 or more for a single sheet of the other brands in 1mm

i'll just slap both sheets whole on top of the GPU, it'll be fine


----------



## wolf (Feb 11, 2021)

toilet pepper said:


> What's your temp like?


GPU mid-50s @ 270W load, memory tops out in the mid-80s gaming, high 90s mining... I would very much consider better/more thermal pads too, not sure what thickness/brand to buy though.


----------



## Mussels (Feb 11, 2021)

wolf said:


> GPU mid 50's @ 270w load, memory tops in the mid 80's gaming, high 90's mining... I would very much consider better/more thermal pads too, not sure what thickness/brand to buy though.


apparently we just buy lots of 1mm and hope for the best, according to the wise person who posted above you


----------



## dyonoctis (Feb 11, 2021)

It's much cleaner with the bequiet! cable:


----------



## cst1992 (Feb 11, 2021)

Apparently that'll void your warranty with NVIDIA.


----------



## dyonoctis (Feb 11, 2021)

cst1992 said:


> Apparently that'll void your warranty with NVIDIA.


technically they say "use of third-party power dongles", not "use of third-party power cables". 
If it's coming from a shady vendor on Wish or AliExpress it's a fair concern, but Corsair and Seasonic also make 12-pin cables for their power supplies. IMO, if Nvidia doesn't like that, they're basically saying that the people who make power supplies don't know how to do their jobs.


----------



## cst1992 (Feb 11, 2021)

NVIDIA are just covering their asses, considering the 12-pin connector was developed in-house by them.
I must say, I actually like the little connector. If not for the stupid central placement, it'd have been a hit in reviews too.


----------



## bug (Feb 11, 2021)

cst1992 said:


> NVIDIA are just covering their asses, considering the 12-pin connector has been developed in-house by them.
> I must say, I actually like the little connector. If not for the stupid central placement of the connector, it'd also have been a hit in reviews.


Tbh, that's not stupid placement, it's where the PCB ends. It would have been really hard to put the connector at the end of the board.
But it is awkward placement, that's for sure.


----------



## toilet pepper (Feb 11, 2021)

Mussels said:


> i found it mentions 1mm for the front ,but nothing for the back (which is what i want to replace)
> 
> some of these pads are $50Au and above so getting the wrong size would be an oof - and 1mm seems to be rare locally, odd
> Ugh, EK pads are like 1/4 the price of TG, and theres no arctic locally. choosing is hard yo.




I eyeballed the thermal pads in the 3080 Ventus: took a piece of paper, marked the pad thickness on it, and measured that against the only ruler I had. It came to about 3mm on paper, so I bought the 3mm Gelid pads. Turns out I bought the wrong size; the front ones were around 2.5-2.8mm. The Gelid pads are stiff and hard to squish, but I managed to compress them by hand to make them fit.

I can tell they're working though. The main heatsink shares cooling between the core and memory, and the core was getting hotter when I mined. (I checked the heatsink contact with the core and changed the thermal paste between applications)

There are limited reviews out there about these pads, but I can say they work great. The downside is they're expensive, costing around $15, and that was just enough to cover the front chips. 

The "graphene" backplate already had thermal pads, and it gets hot too. The problem is, since there are no fins or anything, it basically just acts as a heat soak. A flow-through cooler design connecting the backplate and the main heatsink is ideal for cards with GDDR6X.


----------



## Mussels (Feb 11, 2021)

I look forward to making my 3090 look uglier with all these heatsinks and fan mods


----------



## cst1992 (Feb 11, 2021)

You still have hope... Ask these guys to design a cooler for you.
That thing is enormous, but it can keep a 300W Threadripper at 60C (at least LTT says so)


----------



## Mussels (Feb 11, 2021)

wow that thread went poorly


----------



## cst1992 (Feb 11, 2021)

Tell me about it. I got a week-long post ban.
My own fault though; the conversation changed very quickly from page 2 onward, with me just kind of shoving in after that.
Checking the LTT video though, it's actually a good innovation. It's honestly not what I expected given the thread contents on the first page.


----------



## thesmokingman (Feb 11, 2021)

I saw that thread... the thoughts that came to mind were "moronic OP and message", so I avoided it. You have to pay to be in it - shipping costs, time - and you get nothing out of it. WTF??


----------



## Mussels (Feb 11, 2021)

I used to do reviews for 3dchipset.com before the owner just up and vanished and everything went offline (taking the crappy reviews I made as a 17-year-old with it)

This was standard practise for expensive items - home users just aren't used to it. Lots of reviews out there are genuinely done with the same prototype or early-production unit, sent from one reviewer to another to another.
Upside: you can catch faults and flaws early on that one reviewer may miss (see the extra thermal pads 30x0 FE cards got)
Downside: everyone gets the same unit, so it could be the one good unit in a shit batch of 100 - cherry-picking samples is bad


----------



## cst1992 (Feb 11, 2021)

Mussels said:


> that could be the one good unit in a shit batch of 100


Even if that angle is ignored entirely, testing a used unit is simply not the same as testing a new unit.
I can kinda understand why they were reluctant to give away the units to reviewers though, the cooler is simply too exclusive and expensive for that.


----------



## thesmokingman (Feb 12, 2021)

Mussels said:


> $58 for two of those sheets shipped here - would have been $100 or more for a single sheet of the other brands in 1mm
> 
> i'll just slap both sheets whole on top of the GPU, it'll be fine



That's more than a third of the cost of a waterblock.

Was bored, ran a couple of CPU Time Spy Extreme runs back to back; memory got up to 72C.


----------



## Mussels (Feb 12, 2021)

thesmokingman said:


> That's more than a third of a waterblock in costs.
> 
> Was bored ran a couple cpu timespy extreme runs back to back, memory got up to 72c.


1. i can use those heatsinks on a new backplate too
2. i can use them on other GPUs
3. a single waterblock is one part of dozens you need for a custom loop


----------



## phill (Feb 12, 2021)

How do you guys install the heatsink on the back of the GPU?   Do you use just the pads and that's it?


----------



## Mussels (Feb 12, 2021)

phill said:


> How do you guys install the heatsink on the back of the GPU?   Do you use just the pads and that's it?


The cards come with them stock, i'm pretty sure some of us just replace the pads with better ones and thats it


----------



## nguyen (Feb 12, 2021)

huh, why are people spending so much on thermal pads? Also, it's not safe when you have an incorrect pad thickness, as it can compress the memory modules and cause solder failure in the long run (it happened with the first batch of Turing cards)

If I were mining I would manually set the fan speed to 80-100% first; don't leave the fans on auto, because they're tuned to GPU thermals, not VRAM thermals.


----------



## phill (Feb 12, 2021)

Mussels said:


> The cards come with them stock, i'm pretty sure some of us just replace the pads with better ones and thats it


I was wondering what people are doing to improve the temps?  I'll do a bit more testing and report back when I can 



nguyen said:


> huh why are people spending so much on thermal pads . Also it is not safe when you have incorrect pad thickness as it would compress on the memory modules causing solder failure in the long run (it happened with the first batch of Turing cards)
> 
> If I were mining I would manually set the fan speed to 80-100% first, don't leave the fans in auto because they are tuned to GPU thermal and not VRAM thermal.


It's the back of the card that's the problem, not the core. The memory chips on the back get massively hot and they're not easy to cool; simply putting a fan over the backside of the card does literally nothing.  
In normal use this isn't really a problem, but when it comes to mining it is. This is why I'm trying to do all I can to prolong my cards' lives.



Mussels said:


> Have you ever been so bored you stuck an intel stock cooler onto your 3090?
> 
> I'm testing bitcoin mining and this keeps the GPU fan down about 15% since its cooling the RAM... i want a better slim CPU cooler to permanently stick here now
> View attachment 187495
> ...


This is what I'm needing to do!!


----------



## FireFox (Feb 12, 2021)

Watercooling will solve all your problems


----------



## phill (Feb 12, 2021)

Knoxx29 said:


> Watercooling will solve all your problems


I'll be trying that with my Strix card, but the downside is that it's still only cooling the core and the VRAM on the face of the card; nothing gets cooled on the backside... that's the battle at the moment


----------



## nguyen (Feb 12, 2021)

phill said:


> I'll be trying that with my Strix card but the downside with that is that it's still only cooling the core and VRAM on the face of the card, nothing is getting cooled on the backside of the card...  that's the battle at the moment



Heat travels through the PCB, which is why you can cool front-mounted components (GPU, VRAM and VRM) with the help of the backplate, and the opposite is also true: if you keep the front-mounted memory modules cool, it keeps the backside modules cool too.
Remember the EVGA VRM-catching-fire fiasco? They solved it by placing thermal pads on the backside.
I bet increasing the fan speed is much more effective at cooling the backside modules than replacing the thermal pads on the back anyway.


----------



## phill (Feb 12, 2021)

Well, I've given it a boost to 80% and I'm keeping an eye on it now to see if that temp drops....


----------



## toilet pepper (Feb 12, 2021)

I experimented with the 3080 Ventus cooling while mining inside the NR200P. The standard bottom-up airflow works but isn't ideal, since the GPU just blows around 300 watts of heat into the case and onto the CPU. Running the fans at 100% doesn't help; the mem temp hits around 98C and the CPU idles at 70C. 

I changed the airflow to top-to-bottom, with the deshrouded 3080 exhausting air out the bottom. This shed 3C and I can run the fans at 80% now. The overall temp inside the case is cooler and the backplate doesn't get as warm, since it's getting fresh air from the top. I slapped an M.2 heatsink on it and it helps, so I ordered 12 more little heatsinks; they're on the way.








I'm highly considering watercooling everything and calling it a day. I'll probably use my mining gains to pay for that.


----------



## thesmokingman (Feb 12, 2021)

Mussels said:


> 1. i can use those heatsinks on a new backplate too
> 2. i can use them on other GPUs
> 3. a single waterblock is one part of dozens you need for a custom loop


You have a $1.5K+ GPU, dude. It's OK to invest in a waterblock for sustained maximum performance that will keep said overpriced card from long-term degradation. And for me the cost is nothing in the scope of things; pay to play, or in this case, pay to be cool.


----------



## mouacyk (Feb 12, 2021)

thesmokingman said:


> You have a $1.5K+ GPU, dude. It's OK to invest in a waterblock for sustained maximum performance that will keep said overpriced card from long-term degradation. And for me the cost is nothing in the scope of things; pay to play, or in this case, pay to be cool.


With the shortage problem as it is, I hope many of these careless mods burn out their GDDR6X...


----------



## ThrashZone (Feb 12, 2021)

Hi,
Yeah I pondered even adding a water block on the back plate too lol


----------



## Mussels (Feb 12, 2021)

nguyen said:


> huh why are people spending so much on thermal pads . Also it is not safe when you have incorrect pad thickness as it would compress on the memory modules causing solder failure in the long run (it happened with the first batch of Turing cards)
> 
> If I were mining I would manually set the fan speed to 80-100% first, don't leave the fans in auto because they are tuned to GPU thermal and not VRAM thermal.


i spent $2600 on a 3090, it's worth a few thermal pads. i'm seeing 92C-102C on Tjunction while mining, so i'd like to lower that.

Ambient here is going from 17C to 35C every few days, so... it's fun. but i'm not giving up $140 a week while i'm unemployed and in lockdown again.
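For anyone weighing pad money and fan wear against the payout, the break-even arithmetic is simple. The revenue figure is the one mentioned above; the wall draw and electricity price are example assumptions, not anyone's actual bill:

```python
# Back-of-envelope weekly mining profit: revenue minus electricity cost.
# Inputs are illustrative assumptions.

revenue_per_week = 140.0   # USD, the figure mentioned above
card_power_w = 350         # assumed wall draw of the whole rig while mining
price_per_kwh = 0.25       # USD, assumed residential rate

hours = 24 * 7
energy_kwh = card_power_w / 1000 * hours   # kWh consumed per week
power_cost = energy_kwh * price_per_kwh    # weekly electricity bill
profit = revenue_per_week - power_cost

print(f"energy: {energy_kwh:.1f} kWh, cost: ${power_cost:.2f}, profit: ${profit:.2f}/week")
```

Even at a fairly high power price, electricity eats only around a tenth of that weekly figure, which is why people keep the cards hashing despite the VRAM temps.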


----------



## phill (Feb 13, 2021)

I'm really surprised that the small heatsink on the back does so much, but I've also noticed the VRAM on the back is still bloody hot...  Even with that and a few fans over it, it's getting too warm for my liking, so I've ordered a few NVMe heatsinks which were marked down. My plan is simple: wherever I feel a mass of heat off the backplate, I'm going to put a thermal pad plus one of these NVMe heatsinks...  

I'm not sure why MSI decided to put a copper heatpipe through the backplate but not connect it in some way to the main heatsink, or add an extra heatsink on the back; I think their cooling design wasn't quite the best...  Still, when I receive it all and have it properly tested, I'll post up some results...

@Mussels just out of massive interest, what hash rates are you getting with your 3090?


----------



## Mussels (Feb 13, 2021)

phill said:


> I'm really surprised that the small heatsink on the back does so much but what I have also noticed, the VRAM on the back is still bloody hot...  Even with that and a few fans over it, it's getting too warm for my liking so I've ordered a few NVME heatsinks which where knocked down, so my plan is simple.  Wherever I feel a mass of heat off the back plate, I'm going to put a thermal pad plus one of these NVME heatsinks...
> 
> I'm not sure why MSI decided to put a copper heatpipe through the back plate but not in some way connect it too the main heatsink or have an extra heatsink on the back, whatever, but think there cooling design wasn't quite the best....  Still when I receive it all through and I've got it properly tested, I'll post up some results...
> 
> @Mussels just out of massive interest, what are your hash rates your getting with your 3090?


110 on high, 100 on medium (nicehash)
medium is silent without the fans ramping up, so i stick with that mostly


----------



## nguyen (Feb 13, 2021)

Mussels said:


> i spent $2600 on a 3090, its worth a few thermal pads. i'm seeing 92C-102C on Tjunction while mining, so i'd like to lower that.
> 
> Ambient here is going from 17C to 35C every few days, so.... its fun. but i'm not giving up $140 a week while i'm unemployed and in lockdown again.



Yeah, but according to this Gelid thermal pad testing you are only getting less than a 2C difference with an expensive pad.




How about fixing your fan speed to at least 70% to reduce the VRAM temp and avoid the fans ramping up and down? It's kind of a waste of cooling potential to let the 3 fans run silently while your VRAM is cooking.

Also, a superior option to thermal pads would be these copper shims.


----------



## Mussels (Feb 13, 2021)

when the goods arrive, i will let you know the differences


----------



## toilet pepper (Feb 13, 2021)

nguyen said:


> Yeah, but according to this Gelid thermal pad testing you are only getting less than a 2C difference with an expensive pad
> 
> 
> 
> ...


I was already running the fans at 85% to begin with and it was reaching 102C at stock. I noticed it when checking GPU-Z: it was thermal throttling even though the core was around 50C. This is a Ventus 3080, meaning it's the bottom of the barrel for 3080s. Funnily enough, Gigabyte didn't even put thermal pads on their cards, from what I've seen on Reddit. 

I changed the thermal pads and everything changed. No throttle, and I can set the fans to lower speeds.

Those copper shims are electrically conductive. A lot can go south if someone uses them on their unobtainium cards.


----------



## nguyen (Feb 13, 2021)

toilet pepper said:


> I was already running the fans at 85% to begin with and it was reaching 102c at stock. I noticed it before when checking gpuz and it was thermal throttling even though the core is around 50c. This is a Ventus 3080 meaning its the bottom of the crop for 3080s. Funny enough Gigabyte didnt even put thermal pads on their cards from what I've seen in reddit.
> 
> I changed the thermal pads and everything changed. No throttle and I can set the fan to lower fan speeds.
> 
> Those copper pads are conductive. A lot of things can go south if someone uses that on their unobtanium cards.



Well, you can only place these copper shims between the memory modules and the backplate, where they should work better than any thermal pad; other places on the back side of the PCB should still use thermal pads as usual. 

My guess is that your 3080 wasn't tightened down enough at the factory, or the old pads didn't make proper contact.

I don't think you should let the fans go to lower speeds though; just keep your card as cold as possible, better safe than sorry, these are unobtainium after all. Besides the VRAM temperature there is also the VRM for the memory; the memory on cheap 3080s only has a 3-phase VRM and you are pulling a lot of current through those phases.

Another possible point of failure with mining for fun is thermal cycling leading to early death (solder failure); if you let these cards mine 24/7 there would be a lower failure rate.


----------



## toilet pepper (Feb 13, 2021)

nguyen said:


> Well you can only place these copper shim in between the memory modules and the backplate and it should work better than any thermal pads, other places on the back side of PCB should only use thermal pads as usual.
> 
> My guess it that your 3080 wasn't tighten down enough at the factory or the old pads did not make proper contact.
> 
> ...


Old pads were just cheap, soft chewing gum.

You might be right as well, as the Ventus was known to sag and might not make proper contact. However, mine is mounted vertically with no side panel, so that wasn't the problem.

For the past few days the rig has been mining mostly 24/7. I'm even mining while working from home and can still do Zoom meetings.

It works, but I know it can be better temp-wise.


----------



## Mussels (Feb 13, 2021)

nguyen said:


> Well you can only place these copper shim in between the memory modules and the backplate and it should work better than any thermal pads, other places on the back side of PCB should only use thermal pads as usual.
> 
> My guess it that your 3080 wasn't tighten down enough at the factory or the old pads did not make proper contact.
> 
> ...


you would still need pads or paste on both sides of those shims, so you're adding more and more layers
we have backplates, we just want to get the heat from the VRAM into the backplate so we can cool it externally - that's where lumps of copper work instead of plain old flat aluminum


----------



## nguyen (Feb 13, 2021)

Mussels said:


> you would still need pads or paste on both sides of those shims, so you're adding more and more layers
> we have backplates, we just want to get the heat from the VRAM into the backplate, so we can cool it externally - thats where lumps of copper works instead of flat plain old aluminum



Well, I tried out some ETHminer but it doesn't work, so I can't test the memory junction temp during mining, only gaming. 
Playing CP2077, the memory Tjunction on my 3090 gets up to 66C; placing a fan on top of the backplate does nothing (here is my backplate).
Reading around, the Tjunction is the temp of the hottest module, so if you neglect to cool the front-side modules (by lowering the fans), it will negatively affect the memory Tjunction.


----------



## Caring1 (Feb 14, 2021)

Remove the back plate and add copper heatsinks directly to the affected areas.


----------



## toilet pepper (Feb 14, 2021)

It arrived. Let's see what the point of diminishing return is.


----------



## wolf (Feb 15, 2021)

Dropped a few more of the 30x40x5mm pi4 heatsinks on the back (5 total) and dropped in a ghetto fan for mining funzies.

Max Junction temp mining all yesterday was 96c, core 44c, and the room got to 28c ambient (cheers summer). Seems sustainable for ~90mh/s.


----------



## toilet pepper (Feb 15, 2021)

wolf said:


> Dropped a few more of the 30x40x5mm pi4 heatsinks on the back (5 total) and dropped in a ghetto fan for mining funzies.
> 
> Max Junction temp mining all yesterday was 96c, core 44c, and the room got to 28c ambient (cheers summer). Seems sustainable for ~90mh/s.
> 
> View attachment 188340



I put a bunch of those heatsinks on and I can tell it's working; it shaved off 2 degrees. My 3080 is now hashing at 90MH/s at 94 degrees. 




http://imgur.com/a/SU51g0L


I've realized the best airflow for my scenario is top-to-bottom. Fresh air comes from the top and the 3080 exhausts hot air at the bottom. This flow also helps cool the backplate, and it warms my fingers when the AC is on while I'm working. The tempered glass helps direct the flow downwards.

No more 250 watts of heat being dumped inside my NR200, which means no need to worry about a beefy cooler for the CPU.

I've thought about doing the inverted mod so the GPU is at the top exhausting heat upwards, but it's too much work for me.


----------



## wolf (Feb 15, 2021)

toilet pepper said:


> I've realized the best airflow for my scenarios is top to bottom.



Interesting, I may well have to experiment with reverse airflow direction. Did your GPU temps drop too?

I ask because my next 3d printed CPU duct was actually going to be a 90-degree elbow to pull fresh air from the back panel, then throw the TG panel on the side, so the 3080's heat would be fairly inconsequential and I can keep the natural hot air rising flow direction going.


----------



## toilet pepper (Feb 15, 2021)

wolf said:


> Interesting, I may well have to experiment with reverse airflow direction. Did your GPU temps drop too?
> 
> I ask because my next 3d printed CPU duct was actually going to be a 90-degree elbow to pull fresh air from the back panel, then throw the TG panel on the side, so the 3080's heat would be fairly inconsequential and I can keep the natural hot air rising flow direction going.


They helped a lot. My temps are almost the same, ranging from 92-96C, but my CPU is now colder; it was idling at 60C+ when mining, and everything else inside the case was a lot hotter too.

The normal orientation would have worked if I had an exhaust fan at the back (which is on the way), but the air just doesn't escape fast enough to keep the internals cool, even with the top fans at full speed.

No one is to blame but us for mining inside an "SFF" case.


----------



## HuLkY (Feb 15, 2021)

It was a huge adventure buying that card from the EVGA US site: I was in the queue for exactly 127 days, then I had to find a friend in the US to ship it to, buy from a US account, buy from a US IP!!! A huge adventure for me. I applied the associate discount and so bought it for the good old $799 MSRP; I hope nV can fix this situation fast. I also ordered a GPU support bracket and I'm moving the whole build to a new case, gonna post photos once done. BTW, that's a big freaking card compared to my arm xD


----------



## Mussels (Feb 16, 2021)

Well, my card uses 1.5mm pads and not 1.0

good news is, 90% of the stock ones survived taking the cooler off, leaving one where i had to double up two 1mm on top of each other short term


----------



## 15th Warlock (Feb 16, 2021)

HuLkY said:


> It was a huge adventure buying that card, to buy it from EVGA US site, been in the queue for exactly 127 days, then I had to find a friend in US to ship it for him, buy from a US account, buy from US IP!!! It was a huge adventure for me, applied associate discounts and so, bought it for the good old 799USD MSRP, I hope nV can fix that situation fast. I ordered also a GPU support bracket and moving the whole build to a new case, gonna post photos once done. BTW that's a big freaking card compared to my arm xD
> 
> View attachment 188369
> 
> View attachment 188368


Congrats, and welcome to the club!


----------



## Mussels (Feb 16, 2021)

200 RPM lower, same GPU and junction temps - because i basically replaced two pads with doubled-up Arctics and replaced the TIM, and that's all :/

$40 later for 1.5mm shipped, and i might be able to actually improve the pads next week


----------



## wolf (Feb 16, 2021)

Mussels said:


> 200 RPM lower, same GPU and junction temps - because i basically replaced two pads with doubled-up Arctics and replaced the TIM, and that's all :/
> 
> $40 later for 1.5mm shipped, and i might be able to actually improve the pads next week


Very cool, dude. I would love a definitive answer on what thickness pads the Asus TUF uses (front and back) so I could replace the lot, perhaps add even more on the rear, and do the TIM too.


----------



## purecain (Feb 17, 2021)

good evening all!!!! Here's my new card!!!


----------



## 15th Warlock (Feb 17, 2021)

purecain said:


> good evening all!!!! Here's my new card!!! View attachment 188726View attachment 188731View attachment 188735View attachment 188733


Nice! Did you sort out the problems with the screen going pink? Welcome to the club, it was long overdue!


----------



## purecain (Feb 17, 2021)

15th Warlock said:


> Nice! Did you sort out the problems with the screen going pink? Welcome to the club, it was long overdue!


No I didn't, I'm still having issues with video after sleep mode and need to restart. Today I turned on the PC and little blue squares started appearing. My heart jumped into my throat as I restarted. 

Luckily the card worked normally again, but these little issues don't half get me worrying. 

I'm going to buy a new cable over the next few days. If that helps, we might have a solution for anyone else having this issue in the future.

Can you suggest an HDMI 2.1 cable brand?


----------



## 15th Warlock (Feb 18, 2021)

Monoprice has always given me a good reliability-to-price ratio; I've never had any of their cables crap out on me.



			https://www.amazon.com/Monoprice-Ultra-High-Speed-Cable/dp/B07WNM2NJ5/ref=mp_s_a_1_30?dchild=1&keywords=hdmi+2.1+cable&qid=1613607372&sprefix=hdmi&sr=8-30


----------



## wolf (Feb 18, 2021)

Looks like I'll go all out on Thermalright Odyssey pads; the TUF needs 1mm, 2mm, and 3mm to do the lot, plus some Thermal Grizzly Kryonaut for the TIM.

No half measures! With the de-shroud in place, temps and acoustics are going to be bloody wild.


----------



## purecain (Feb 18, 2021)

@15th Warlock  Thanks buddy!!!! Checking them out now. Is there a version with connectors that let a monitor/TV sit flush to the wall? The TV is leaning on the connector atm and I don't think that's helping. I'll post what I find, searching from your link as the starting point. If there isn't one available yet, I'll get the one you suggested.


----------



## phill (Feb 18, 2021)

I thought I'd update everyone with a few developments  

The MSI card I have: the temps are fairly steady under mining, about 90 to 98C at most depending on what I'm doing with the card (overclocking etc.), and I've got some more heatsinks coming today plus some thin thermal pads. I'm hoping I can put the pads and heatsinks over where the heatpipes are on the backplate to help dissipate the heat from the RAM.  I highly doubt it will do much at all, but for a few quid of effort it's worth a try 

And as for my Strix card, that was put under water and the temps, surprise surprise, are no different under mining, or even gaming; they're the same as with the air cooler...  So, long and short: probably don't bother with a block...


----------



## Caring1 (Feb 18, 2021)

purecain said:


> good evening all!!!! Here's my new card!!! View attachment 188731


Isn't that the type they had issues with the first batch due to the layout of chips on the back?
I  think they changed it to this layout.


----------



## toilet pepper (Feb 18, 2021)

phill said:


> I thought I'd update everyone with a few developments
> 
> The MSI card I have, the temps are fairly steady under mining, about 90 to 98C at most depending on what I'm doing with the card (overclocking etc.) and I've got some more heatsinks coming today and some thin thermal pads so I'm hoping I can put the pads and the heatsinks over where the heatpipes are on the back plate to help with dissipating the heat from the RAM.  I highly doubt it will do anything much at all but for a few quid of effort and such, it's worth a try
> 
> And as for my Strix card, that was put under water and the temps, surprise surprise, are no different under mining, even with gaming as well, they are the same as the air cooler...  So long and short, probably don't bother with a block...



Did your block have a cooling plate at the back? I think the junction sensor checks the hottest chip and throttles from there. If yours didn't have anything at the back, it would still show hot temps.


----------



## phill (Feb 18, 2021)

The Strix had a full-cover block and a backplate that was another $40 on top of the block lol  My mate was gutted as it hasn't helped a thing, and I believe the heatsinks I've ordered won't help much either, if at all...  We'll see when I get them through


----------



## jesdals (Feb 18, 2021)

HuLkY said:


> View attachment 188368


Please don't tell me that you paid an arm and a leg for that 3080


----------



## nguyen (Feb 18, 2021)

Tried mining Ethash for 10 mins: getting 100MH/s, and the max memory junction temp is 90C, or 86C with a fan blowing over the backplate. 3090 undervolted to 1740MHz/750mV (mem +100MHz, so 19.7Gbps), averaging 290W during mining.






The model is an Asus TUF 3090 with a Bitspower full-cover waterblock.
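Those hash rates line up with a simple ceiling: Ethash is memory-bound, with each hash needing roughly 8 KiB of DAG reads, so the theoretical maximum is about memory bandwidth divided by 8 KiB. A quick check using the figures quoted above (the 8 KiB-per-hash model is a standard approximation, not a measurement):

```python
# Theoretical Ethash hashrate ceiling from memory bandwidth (rough model).
# Each Ethash hash performs 64 DAG accesses of 128 bytes = 8192 bytes of reads.

gbps_per_pin = 19.7        # effective data rate quoted above
bus_width_bits = 384       # RTX 3090 memory bus width
bytes_per_hash = 64 * 128  # 8 KiB of DAG traffic per hash

bandwidth_bytes = gbps_per_pin * 1e9 * bus_width_bits / 8   # bytes per second
ceiling_mhs = bandwidth_bytes / bytes_per_hash / 1e6        # MH/s

print(f"bandwidth: {bandwidth_bytes / 1e9:.0f} GB/s, ceiling: {ceiling_mhs:.0f} MH/s")
```

That works out to roughly 115 MH/s, so the ~100-110 MH/s reported in this thread is close to the bandwidth limit once latency and refresh overhead are accounted for; it also explains why memory clock matters and core clock barely does.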


----------



## toilet pepper (Feb 18, 2021)

phill said:


> Strix had a full cover block and a back plate that was another $40 on top of the block lol  My mate was gutted as it's not helped a thing and I believe that these heatsinks I've ordered as well won't help much if at all either....  We'll see when I get them through



The 3090 has actual chips at the back, and GDDR6X needs actual cooling; I'm thinking a simple backplate won't cut it. I think EK has some sort of Quantum thing where the backplate is kind of water-cooled as well.

With how hot GDDR6X gets, traditional graphics card cooling won't cut it; it should have separate cooling from the GPU core. These things soak up most of the heat capacity of any heatsink. The 3090 should have had HBM or fewer chips to ease the cooling problem at the back.


----------



## Mussels (Feb 18, 2021)

nguyen said:


> Tried mining Ethash for 10 mins, getting 100MH/s and max memory Junction temp is 90C and 86C with a fan blowing over the backplate. 3090 undervolted to 1740mhz/750mV (mem +100mhz so 19.7Gb/s) and using an avg of 290W during mining.
> 
> View attachment 188872
> 
> model is Asus TUF 3090 with Bitspower full covered waterblock


I get 105-110MH/s at 1600MHz/0.7V, 250W

1.5mm pads arrived today, looking forward to doing both sides later
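Putting the two undervolt results in this thread side by side as hashes per watt makes the difference concrete (figures as reported above; the 105-110 range is taken at its midpoint, and software power readings versus wall draw is an obvious caveat):

```python
# Mining efficiency in MH/s per watt for the two reported configurations.
configs = {
    "1740 MHz / 750 mV": (100.0, 290),   # (MH/s, watts) as reported
    "1600 MHz / 700 mV": (107.5, 250),   # midpoint of the 105-110 MH/s range
}

for name, (mhs, watts) in configs.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")
```

The lower clock/voltage setup comes out roughly 25% more efficient per watt, which fits the bandwidth-bound nature of Ethash: dropping the core clock sheds power without shedding much hashrate.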


----------



## wolf (Feb 19, 2021)

Mussels said:


> I get 105-110MH at 1600/0.7v, 250W


Memory speed? That's a bloody good hash rate; here I was thinking ~90 was pretty good, but it seems even with a 3080 I could get 97 or so, I just need to find the magic settings.


----------



## nguyen (Feb 19, 2021)

Mussels said:


> I get 105-110MH at 1600/0.7v, 250W
> 
> 1.5mm pads arrived today, looking forward to doing both sides later



I only ran the miner without setting up a wallet or finding the closest pool, so the hashrate could have been better, I guess.


----------



## wolf (Feb 19, 2021)

Anyone seen this yet? Mining GPUs, and Nvidia gimping drivers for the 3060's mining ability... let's see how long that all lasts!


----------



## nguyen (Feb 19, 2021)

phill said:


> Strix had a full cover block and a back plate that was another $40 on top of the block lol  My mate was gutted as it's not helped a thing and I believe that these heatsinks I've ordered as well won't help much if at all either....  We'll see when I get them through



Tell your mate to put the old Strix cooler on top of the backplate? It could drop the temp by a few degrees.
The TUF model has a separate VRAM heatsink, which could have looked more elegant.


----------



## Mussels (Feb 19, 2021)

Okay: looks like my 3090 needs 1mm 1.5mm and 2mm

I kept most of the stock 2mm ones, so strangely I've seen almost no change to the junction temp - 98C with my ghetto fan mod on top, 102C without

That said, the GPU is now 57C while mining, so hell yeah for the new thermal paste







Update: ugh, it's hell getting the right sized thermal pads; one(?) of my chips hits thermal throttle, so I've gotta rip it apart, guess, and try again with different thicknesses


----------



## purecain (Feb 20, 2021)

It's the cable! I reconnected it and now the sleep problem is gone. How about that??? I'd be happy if I hadn't just been criminalised by the next-door neighbour. Meh!!
@wolf Those mining GPUs will be worthless as soon as the mining ends.


----------



## nguyen (Feb 20, 2021)

Mussels said:


> Okay: looks like my 3090 needs 1mm 1.5mm and 2mm
> 
> I kept most stock 2mm, so strangely i've seen almost no change to the junction temp - 98C with my ghetto fan mod on top, 102C without
> 
> ...



Well, I tried to warn you that replacing thermal pads is a fruitless endeavor; you'd get better thermals running the fans at higher speed anyway.


----------



## Mussels (Feb 20, 2021)

nguyen said:


> Well, I tried to warn you that replacing thermal pads is a fruitless endeavor; you'd get better thermals running the fans at higher speed anyway.



I'm seeing <60C while gaming, so there were some big upsides to replacing the non-VRAM pads

I caved and ordered Gelid 3mm ones at nuts prices, so after wasting all this money I'll finally have it working as intended.


----------



## FireFox (Feb 20, 2021)

This is what i will be buying for the GPU: GPU Block



Mussels said:


> Okay: looks like my 3090 needs 1mm 1.5mm and 2mm


Yes, and the 3080 1.5mm 2mm


----------



## nguyen (Feb 21, 2021)

Knoxx29 said:


> This is what i will be buying for the GPU: GPU Block
> 
> 
> Yes, and the 3080 1.5mm 2mm
> ...



You can't use the manual from a specific waterblock as a thermal pad thickness reference, because different heatsinks/waterblocks have different designs; some have taller protruding surfaces that require thinner thermal pads.

From that manual alone I suspect EKWB only focuses on GPU thermals and not the rest of the PCB; my Bitspower WB has thermal pad placements for all the capacitors and inductors (which produce some heat too). 



And the Bitspower WB uses 4 different pad thicknesses (0.5, 1, 2 and 2.5mm).


----------



## thesmokingman (Feb 21, 2021)

Wait, what... you guys were getting your pad thicknesses from different products as a reference? That's the wrong way to do it! Measure the relative thickness of your own stock pads, then add in variance. I'd also add that stock GPU pads usually have a crapton of clearance because they have big variations in their manufacturing. Waterblocks, on the other hand, have much tighter clearances as a result of their tighter tolerances, especially in the case of Aqua.


----------



## Mussels (Feb 21, 2021)

thesmokingman said:


> Wait, what... you guys were getting your pad thicknesses from different products as a reference? That's the wrong way to do it! Measure the relative thickness of your own stock pads, then add in variance. I'd also add that stock GPU pads usually have a crapton of clearance because they have big variations in their manufacturing. Waterblocks, on the other hand, have much tighter clearances as a result of their tighter tolerances, especially in the case of Aqua.


honestly i f*cked up and removed some of mine before measuring, because at an eyeball glance they all seemed the same (then i found out that 2mm and 2.25mm exist)


----------



## nguyen (Feb 21, 2021)

Mussels said:


> honestly i f*cked up and removed some of mine before measuring, because at an eyeball glance they all seemed the same (then i found out that 2mm and 2.25mm exist)


wait, what? 2.25mm thermal pads exist?


----------



## thesmokingman (Feb 21, 2021)

Mussels said:


> honestly i f*cked up and removed some of mine before measuring, because at an eyeball glance they all seemed the same (then i found out that 2mm and 2.25mm exist)


I dunno about that thickness, but if it were me I would get the brown Fujipoly pads slightly thicker and be done with it. The Fujis will squish down a bit, filling the variance you find yourself in. Note that once the Fujis squish down they eventually break apart when removed, i.e. they can only be reused once or twice if you're lucky.
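The "slightly thicker and let it squish" approach is easy to sanity-check with one division. As a rough rule of thumb (my assumption; check the pad maker's datasheet), soft pads are typically happy somewhere around 10-50% compression:

```python
# Compression ratio when a pad of a given thickness fills a smaller gap.

def compression(pad_mm, gap_mm):
    """Fraction of the pad's thickness that must squish away."""
    return (pad_mm - gap_mm) / pad_mm

print(f"3.0 mm pad in a 2.0 mm gap:  {compression(3.0, 2.0):.0%} compression")
print(f"2.25 mm pad in a 2.0 mm gap: {compression(2.25, 2.0):.0%} compression")
```

Squishing a 3mm pad into a 2mm gap is ~33% compression, at the aggressive end, and it raises mounting pressure on the modules, which is exactly the solder-stress concern raised earlier in the thread about over-thick pads.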


----------



## Mussels (Feb 21, 2021)

thesmokingman said:


> I dunno... about that thickness, but if it were me I would get the brown Fujipoly pad slightly thicker and be done with it. The Fujis will squish down a bit filling the variance that you find yourself in. Note once the Fujis squish down they eventually break apart when removed, ie. they can only be reused once or twice if yer lucky.


i got gelid extremes, so now i've got 1.0, 1.5 and 3mm - i'll make this combination work, damnit



nguyen said:


> wait what, 2.25mm thermal pad exist ?


came up in an image earlier in the thread, theres a bunch of weird sizes that exist

i just need larger than my 1.5mm, can't find 2mm for sale - so 3mm and squishing it is


----------



## FireFox (Feb 21, 2021)

nguyen said:


> You can't use manual from a specific waterblock for thermal pad thickness reference because different heatsink/waterblock have different design, some have taller protruding surfaces that requires thinner thermal pads.


My mistake, I didn't point out that I was talking about EKWB blocks.


nguyen said:


> From that manual alone I suspect the EKWB only focus on GPU thermal and not the rest of the PCB, my Bitspower WB has thermal pads placements for all the capacitors and inductors (which produce some heat too)



The Vector water block directly cools the GPU, VRAM, and the VRM (voltage regulation module) as a cooling liquid is channeled right over these critical areas. The water block is in contact with the power stages of the VRM.



Mussels said:


> then i found out that 2mm and 2.25mm exist)


I bought some 2.25mm years ago, i think it was for my Evga 1080 Classified or 1080ti FTW3


----------



## phill (Feb 22, 2021)

phill said:


> I thought I'd update everyone with a few developments
> 
> The MSI card I have, the temps are fairly steady under mining, about 90 to 98C at most depending on what I'm doing with the card (overclocking etc.) and I've got some more heatsinks coming today and some thin thermal pads so I'm hoping I can put the pads and the heatsinks over where the heatpipes are on the back plate to help with disapating the heat from the RAM.  I highly doubt it will do anything much at all but for a few quid of effort and such, it's worth a try
> 
> And as for my Strix card, that was put under water and the temps, surprise surprise, are no different under mining, even with gaming as well, they are the same as the air cooler...  So long and short, probably don't bother with a block...


Well guys, just to confirm the status of the extra pads and heatsinks: whilst they're removing heat and doing what I'd hoped, the temps are the same.  There's no drop in them at all.  So I'm guessing the temp probe is either reading wrong, is in a place that can't be cooled better, or something else; either way, it's not cooling down any better at all.

The Strix with the water block and backplate is working, but temps are still on the higher side; 96C I believe is the maximum my mate has seen, but otherwise it's all much of a muchness.  I believe EK are releasing a new backplate, so my mate is getting that to see if the temps go lower; otherwise I think it'll just be a case of suck it and see...


----------



## nguyen (Feb 22, 2021)

phill said:


> Well guys just to confirm the status of the extra pads and heatsinks, whilst it's removing the heat and doing what I had hoped it would do, the temps are the same.  There's no drop in them at all.  So I'm guessing the probe for the temp is either reading wrong, in a place that can't be cooled better or any other reason, but it's not cooling down any better at all.
> 
> The Strix with the water block and back plate are working but temps are still in the higher side than we'd like 96C I believe is the maximum my mate has seen but otherwise, it's all a bit of a muchness.  I believe EK are releasing a new backplate so I believe my mate is getting that to see if the temps go lower, otherwise I think it'll just be a case of suck it and see...



Expensive thermal pads and backplates are not going to help much, since the surface of the memory modules has horrible heat transfer properties; maybe someone should try sanding down the memory modules.


----------



## phill (Feb 22, 2021)

nguyen said:


> Expensive thermal pads and backplates are not going to help much, since the surface of the memory modules has horrible heat transfer properties; maybe someone should try sanding down the memory modules.


Lapping, it'll work!!    Maybe I need to get an iFixit kit, that fixes everything, right??


----------



## purecain (Feb 22, 2021)

Fixed the sleep issue with a new cable... I kept second-guessing myself, but I just installed the new cable, put the PC to sleep, and it woke back up with no corruption issues at all. The card is working perfectly.


----------



## Mussels (Mar 9, 2021)

New thermal pads in, they were too thick at first so i gave em the old FUCK YOU SQUISH and then they fit just fiiiine

It's like these memory chips just keep ramping themselves up and never give up on their goal of running at 100C at all times under load

GPU: 55C
TJunction: 94C


----------



## nguyen (Mar 9, 2021)

Mussels said:


> New thermal pads in, they were too thick at first so i gave em the old FUCK YOU SQUISH and then they fit just fiiiine
> 
> It's like these memory chips just keep ramping themselves up and never give up on their goal of running at 100C at all times under load
> 
> ...



What load though, mining or gaming?


----------



## Mussels (Mar 9, 2021)

Mining - gaming's fine. I wanted to be able to mine without the fan screeching; it auto-ramps up when the Tjunction passes 100C.


----------



## nguyen (Mar 12, 2021)

Oh boy, crypto mining has ruined Nvidia GPU prices in my country (Vietnam):
3060 - 800usd
3060Ti - 1000usd
3070 - 1200usd
3080 - 1700usd
3090 - 2000usd+
Meanwhile AMD GPUs stay reasonable: 6800XT - 1000usd, 6900XT - 1300usd
Prices include VAT already.

How about GPU prices in other places?


----------



## Mussels (Mar 12, 2021)

The pads have settled in, and i sit at 92C while mining in "low" mode
When it goes to high, the heat builds up and it eventually needs the fan to go overboard... i need bigger heatsinks on the backplate for high end mining since the fan only cools the FRONT of the card and doesnt alter the temps of these back memory chips...


Oh god i qualify for cheaper electricity due to a medical condition

They're subsidising my mining


----------



## thesmokingman (Mar 12, 2021)

Why don't ya throw a 60mm ish fan (or whatever size) on the back?


----------



## nguyen (Mar 12, 2021)

Mussels said:


> The pads have settled in, and i sit at 92C while mining in "low" mode
> When it goes to high, the heat builds up and it eventually needs the fan to go overboard... i need bigger heatsinks on the backplate for high end mining since the fan only cools the FRONT of the card and doesnt alter the temps of these back memory chips...
> 
> 
> ...



Which medical condition is that? minnilingus? now don't be licking your mining equipment too much . Aren't you using an AIO to cool the backplate?


----------



## toilet pepper (Mar 12, 2021)

nguyen said:


> oh boy Crypto mining has ruin Nvidia GPU prices in my country (Vietnam)
> 3060 - 800usd
> 3060Ti - 1000usd
> 3070 - 1200usd
> ...


It was very high to begin with here in the Philippines. I bought my 3080 last December for 1.1k USD on impulse.

The 3080 has already paid for itself at this point. Now I'm thinking about getting additional cards for mining and it's all effed up. A 3080 now costs 1.8k USD for a GB Eagle.

You wouldn't be surprised about it. A 3080 nets more than what you would get working a minimum wage job at the current crypto prices.


----------



## nguyen (Mar 12, 2021)

toilet pepper said:


> It was very high to begin with here in the Philippines. I bought my 3080 last December for 1.1k USD as an impulse.
> 
> The 3080 already paid itself off at this point. Now I'm thinking about getting additional cards for mining and its all effed-up. A 3080 now costs 1.8k USD for a GB Eagle.
> 
> You woildnt be surprized about it. A 3080 nets more than what you would get working a minimum wage job with the current crypto prices.



Well, if Ethereum hard forks to proof of stake in April, I doubt a 3080 bought at 1700usd can pay for itself anytime soon. Miners flooding to other alt coins would just make those unprofitable very quickly.
Let's hope GPU prices come back to normal again soon.
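To put rough numbers on the payback question, here's a quick back-of-envelope sketch; the price, daily earnings, power draw, and electricity rate below are all made-up assumptions for illustration, not real market data:

```python
# Back-of-envelope mining payback estimate. Every number used here is an
# illustrative assumption, not real market data.
def payback_days(card_price_usd, daily_revenue_usd, daily_kwh, usd_per_kwh):
    """Days until net mining income covers the card's purchase price."""
    daily_profit = daily_revenue_usd - daily_kwh * usd_per_kwh
    if daily_profit <= 0:
        return float("inf")  # card never pays for itself
    return card_price_usd / daily_profit

# e.g. a 1700usd 3080 earning $8/day gross, drawing 320W around the clock
# (7.68 kWh/day) at $0.10/kWh:
print(round(payback_days(1700, 8.0, 0.32 * 24, 0.10)))  # 235 days
```

And of course the whole estimate collapses if revenue drops after a proof-of-stake switch, since the payback horizon stretches toward infinity as daily profit approaches zero.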


----------



## FireFox (Mar 12, 2021)

nguyen said:


> oh boy Crypto mining has ruin Nvidia GPU prices in my country (Vietnam)


Don't say that too loud, because there are people that still claim it's not crypto mining's fault.


----------



## Hachi_Roku256563 (Mar 12, 2021)

Knoxx29 said:


> Don't say that too loud because there are people that still claim that it's not crypto mining fault.


It's not just crypto mining,
there are also stock issues


----------



## Mussels (Mar 12, 2021)

nguyen said:


> Which medical condition is that? minnilingus? now don't be licking your mining equipment too much . Aren't you using an AIO to cool the backplate?



two stock intel coolers with a noctua 92mm fan on top now, lol


----------



## FireFox (Mar 12, 2021)

Isaac` said:


> stock issues


Stock issues because of mining; all the cards that should be in someone's gaming PC are now in a mining farm.


----------



## nguyen (Mar 12, 2021)

Knoxx29 said:


> Stock issues because mining, all the cards that should be in someone Gaming PC are now in a Mining farm.



The 3080 currently sits at 0.77% of all GPUs in the Steam hardware survey, and Steam's active player base is estimated at around 120 million in 2021, meaning >900K units are sitting inside players' computers even with the mining craze going on. 
Yeah, "stock issue" 
Nvidia probably produced more 3080 stock than all of AMD's 6800, 6800XT and 6900XT combined.
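For anyone who wants to check, that ">900K" figure is just the two numbers from the post multiplied together:

```python
# Sanity-checking the ">900K units" claim from the two figures above.
steam_players = 120_000_000  # estimated active Steam players in 2021
share_3080 = 0.0077          # 3080's share in the Steam hardware survey

units = steam_players * share_3080
print(f"{units:,.0f} units")  # 924,000 units
```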


----------



## Hachi_Roku256563 (Mar 12, 2021)

Knoxx29 said:


> Stock issues because mining, all the cards that should be in someone Gaming PC are now in a Mining farm.


And, you know, the silicon shortage


----------



## nguyen (Mar 12, 2021)

Isaac` said:


> And you know silicon shortage



You mean we are having a sand shortage right now?


----------



## Dinnercore (Mar 13, 2021)

I just joined the club today. Got a watercooled 3090 and I'm in the process of figuring it out. This reported memory temp seems to be always the hottest chip yeah? It reached ~74°C and the backplate is getting really warm, pointed a fan on the back to cool the backside chips and it dropped to ~70°C. I wonder if I void my warranty by sticking some heatsinks to it.


----------



## toilet pepper (Mar 13, 2021)

Dinnercore said:


> I just joined the club today. Got a watercooled 3090 and I'm in the process of figuring it out. This reported memory temp seems to be always the hottest chip yeah? It reached ~74°C and the backplate is getting really warm, pointed a fan on the back to cool the backside chips and it dropped to ~70°C. I wonder if I void my warranty by sticking some heatsinks to it.


That warranty sticker they always put over the screw holes... I get around that with a hobby knife. 

You're most likely using a Gigabyte 3090. They don't put thermal pads on the back of the chips, or they use crappy ones. You can change them, but if you're just gaming, 70C is good.


----------



## Caring1 (Mar 13, 2021)

Dinnercore said:


> I wonder if I void my warranty by sticking some heatsinks to it.


Depends what country you are in, in most it is a bluff and you don't void warranty.


----------



## jboydgolfer (Mar 13, 2021)

nguyen said:


> How about GPU prices in other places?


wow. thats unreal

& the AMD cards are reasonable??


----------



## witkazy (Mar 13, 2021)

Mussels said:


> The pads have settled in, and i sit at 92C while mining in "low" mode
> When it goes to high, the heat builds up and it eventually needs the fan to go overboard... i need bigger heatsinks on the backplate for high end mining since the fan only cools the FRONT of the card and doesnt alter the temps of these back memory chips...
> 
> 
> ...


I might be talking out of my ass, but why not get rid of the backplate altogether and try to blow some air directly on the hot area with an extra fan? Mod version: replace the backplate with a mesh plate of similar size, add some spacers/washers, then you could mount a fan directly to the mesh plate. Idk, just bored. Cheers.


----------



## Mussels (Mar 13, 2021)

witkazy said:


> I might talking out of my ass , but why not get ride of back plate all together and try to blow some air directly on hot area with extra fan? Mod version ,replace back plate with mesh plate of similar size ,add some spacers /washers then you could mount fan directly to mesh plate .Idk ,just bored.Cheers.


the backplate is used to hold the front plate in tighter


----------



## Dinnercore (Mar 14, 2021)

Caring1 said:


> Depends what country you are in, in most it is a bluff and you don't void warranty.


Sadly it's Germany and things are pretty strict. Some manufacturers like EVGA are very kind and accept tinkering to some degree, but most just completely refuse the RMA if a seal or sticker is broken. I will try to just glue something to the existing backplate.



Mussels said:


> the backplate is used to hold the front plate in tighter


Yeah, I face a similar issue; I need my backplate to hold the waterblock on tight. I know these OEM blocks can be very tricky with getting the correct mounting pressure. Had this issue with my 2080, don't need it again.

I'll try some things, I've got a few heatsinks lying around from scrapped TV PCBs. If you find something that works decently, please share.


----------



## nguyen (Mar 14, 2021)

Dinnercore said:


> I just joined the club today. Got a watercooled 3090 and I'm in the process of figuring it out. This reported memory temp seems to be always the hottest chip yeah? It reached ~74°C and the backplate is getting really warm, pointed a fan on the back to cool the backside chips and it dropped to ~70°C. I wonder if I void my warranty by sticking some heatsinks to it.



If you are only gaming then there is no need to cool the backplate; 74C Tjunction is a very good temperature already, stock 3080 FE and 3090 FE can reach like 90-100C Tjunction. 
You can increase the radiator fan speed and it will lower the VRAM Tjunction along with GPU temp too. 
Overall just enjoy your 3090, there is no need to do anything else besides playing games


----------



## thesmokingman (Mar 14, 2021)

Dinnercore said:


> I just joined the club today. Got a watercooled 3090 and I'm in the process of figuring it out. This reported memory temp seems to be always the hottest chip yeah? It reached ~74°C and the backplate is getting really warm, pointed a fan on the back to cool the backside chips and it dropped to ~70°C. I wonder if I void my warranty by sticking some heatsinks to it.


What block are you using? My watercooled (bykski) 3090 memory junction temp maxes out around 75c ish after a four hour+ session. It's been rainy here so nothing to do but game I guess.


----------



## Dinnercore (Mar 14, 2021)

nguyen said:


> If you are only gaming then there is no need to cool the backplate, 74C Tjunction is a very good temperature already, stock 3080FE and 3090FE can reach like 90C-100C Tjunction.
> You can increase the radiator fans speed and it would lower VRAM Tjunction along with GPU temp too.
> Overall just enjoy your 3090, there is no need to do anything else beside playing games


It's even lower in games, but I have not reached 100% load in games yet. I play at 1440p and I think the most stressful game I play at the moment would be Star Citizen.

It got to 74°C tops with mining (daggerhashimoto). My radiator fans don't need to be cranked, I use a MoRa 360 in push/pull with 8x 180mm fans. It has its own loop, I got three separate loops now for my CPU, my 2080 and the 3090.
And yes, these temps are fine but I'm the kind of guy who chases every improvement possible for hardware even if it makes no sense at all.

Which is why I did this:





I placed some heatsinks on the spots where it does not create clearance issues. Glue is your typical thermal silicone stuff which I found works well enough and has a very nice and strong bond while not leaving residue when you remove it.

I'm letting it set at the moment, will update later if it changed my thermals at all.

EDIT: Dropped the memory mining load temp by ~2°C to 68°C. The backplate feels noticeably cooler to the touch. I know direct-contact heatsinks would improve things a lot more, but I don't want to void my warranty yet.
Comparing this to regular air-cooled models and some of the other watercooled ones, I'm happy. Seems to be a very healthy temperature.



thesmokingman said:


> What block are you using? My watercooled (bykski) 3090 memory junction temp maxes out around 75c ish after a four hour+ session. It's been rainy here so nothing to do but game I guess.


I got the inno3D Frostbite card, comes with a block made by alphacool. Once it runs out of warranty I will probably replace it with a block from watercool or aquacomputer.


----------



## toilet pepper (Mar 16, 2021)

My 2mm Thermalright Odyssey pad arrived and the tinkering began. Here's what happened to the GELID 12.8W/mK pad after 2 months. It fried up and crumbled.







Some repaste with an MX4 and alcohol cleaning later. The oily leak was from the stock thermal pads that came with the MSI 3080 Ventus






Here's some pics of the pads. The Thermalright is slightly softer.





The Thermalright made a lot of difference. I got 6C less on Mem Junction Temp. Previously, when mining, the temps were 96C and the fans were at 90%. I can now run the fans at 75% and the temps stay at 90C.


----------



## Mussels (Mar 16, 2021)

Wait, so all the money i just spent on GELID pads was a bad idea? woops

I'm down to 92C when mining on the low TDP profile, but it still has the fans ramp to crazy levels on full wattage due to VRAM temps


----------



## thesmokingman (Mar 16, 2021)

toilet pepper said:


> My 2mm Thermalrite Odyssey pad arrived and the tinkering began. Here's what happened to the GELID 12.8wmk pad after 2 months. It fried up and crumbled.


That's what the clay-like pads do over time. There's a trick to getting them off without breaking them apart, but really they only have one or two applications before they crumble.


----------



## toilet pepper (Mar 16, 2021)

Mussels said:


> Wait, so all the money i just spent on GELID pads was a bad idea? woops
> 
> I'm down to 92C when mining on the low TDP profile, but it still has the fans ramp to crazy levels on full wattage due to VRAM temps


You're using a 3090, so that might be the reason for the high temps. The chips on the back need active cooling.

I'm running the deshrouded 3080 inside an NR200p with 2 sickleflow fans blowing air to it from the bottom. 

Mining in a mini itx case with a 3080 is not ideal but it works.


----------



## nguyen (Mar 16, 2021)

Mussels said:


> Wait, so all the money i just spent on GELID pads was a bad idea? woops
> 
> I'm down to 92C when mining on the low TDP profile, but it still has the fans ramp to crazy levels on full wattage due to VRAM temps



Next let's try some low-power TEC cooling, something like a 50W TEC in between the backplate and the crappy Intel stock cooler? Only plug in the TEC when the GPU is fully loaded to prevent condensation, and keep some paper towels handy too.


----------



## Mussels (Mar 16, 2021)

toilet pepper said:


> You're using a 3090 so that might be the reason of the high temps. The chip at the back needs active cooling.
> 
> I'm running the deshrouded 3080 inside an NR200p with 2 sickleflow fans blowing air to it from the bottom.
> 
> Mining in a mini itx case with a 3080 is not ideal but it works.



i have a frikkin intel stock cooler sitting on the backplate to dissipate some of the heat, its nuts how hot they get


----------



## FireFox (Mar 16, 2021)

Shouldn't Watercooling the GPU solve the heating problem?


----------



## Deleted member 205776 (Mar 16, 2021)

Just dropping in here to say that you all should undervolt your Ampere cards... stock was 1965-1980 MHz @ 1050mv.


----------



## nguyen (Mar 16, 2021)

Alexa said:


> Just dropping in here to say that you all should undervolt your Ampere cards... stock was 1965-1980 MHz @ 1050mv.



All Ampere owners should under-mine, you mean. Yeah, we all undervolt our 3080s and 3090s to somewhere around 800-850mV for games and lower for mining. I have 5 undervolt profiles:
1740mhz/750mV
1860mhz/806mV
1950mhz/875mV
2040mhz/931mV
2130mhz/993mV
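For fun, here's a first-order sketch of why the lower profiles save so much power: dynamic power scales roughly with f·V², so (ignoring leakage, which this crude model can't capture) each profile's relative draw works out to:

```python
# Rough first-order estimate of relative dynamic power (P ~ f * V^2)
# for the five undervolt profiles above. Ignores leakage/static power,
# so treat the numbers as a trend, not a measurement.
profiles = [  # (core clock in MHz, voltage in mV)
    (1740, 750),
    (1860, 806),
    (1950, 875),
    (2040, 931),
    (2130, 993),
]

base = profiles[-1][0] * profiles[-1][1] ** 2  # top profile as reference
for mhz, mv in profiles:
    rel = mhz * mv ** 2 / base
    print(f"{mhz} MHz @ {mv} mV -> ~{rel:.0%} of the top profile's power")
```

At these numbers the 1740MHz/750mV profile comes out to roughly half the dynamic power of the 2130MHz/993mV one for about 18% less clock, which matches the general experience that the last bit of frequency is wildly expensive.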


----------



## FireFox (Mar 16, 2021)

Mine is 1830mhz/850mV


----------



## Deleted member 205776 (Mar 16, 2021)

Yeah, could undervolt more for lower temps, but honestly at 950mv I found the best balance between frequencies and temps. At 925mv, the card would not boost past 2040 MHz but as soon as I set it to 950mv, it jumped to 2085-2100 MHz. Voltages lower than 950mv honestly didn't make much of a difference in temps and even overall power draw.

The temps are great as it is -- 59-61C max after undervolt, 66C max stock. I'll take the reduced heat output and performance uplift please


----------



## FireFox (Mar 16, 2021)

I am building a custom Loop so i am eager to see how much temps will drop.


----------



## Deleted member 205776 (Mar 16, 2021)

I was very surprised at the temps of this card, a 3070 Gaming X Trio, on air. Even more surprised at undervolted temps and performance. Shaved off 30-50W.


----------



## FireFox (Mar 16, 2021)

What's your temperature with those voltages and clock?


----------



## Deleted member 205776 (Mar 16, 2021)

Knoxx29 said:


> What's your temperature with those voltages and clock?


Mentioned above, with 2085-2100 @ 950mv my temps are 59-61C while gaming. I think 63C max during 3DMark, Superposition, you name it, all while keeping 2085 MHz or higher.

On stock it's ~1980 MHz @ 1050mv, and reaches 63C while gaming, 65-66C in benchmarks.

All while being extremely quiet in either configuration -- the GPU went from being the loudest part in my system to the quietest. Fans don't need to spin past 42% at all, even lower on the undervolt. MSI did a great job on the cooler.


----------



## FireFox (Mar 16, 2021)

Alexa said:


> Mentioned above, with 2085-2100 @ 950mv my temps are 59-61C while gaming.


I was editing my post 

Those are pretty amazing temps, i should check mine too.


----------



## Deleted member 205776 (Mar 16, 2021)

Knoxx29 said:


> I was editing my post
> 
> Those are pretty amazing temps, i should check mine too.


Edited mine too, they are pretty amazing coming from a 82-85C Zotac Mini card. The nightmares


----------



## nguyen (Mar 16, 2021)

Knoxx29 said:


> I am building a custom Loop so i am eager to see how much temps will drop.



Depends on how much rad space and which GPU waterblock you are putting in, but on average GPU load temp should be around ambient +20C


----------



## FireFox (Mar 16, 2021)

nguyen said:


> Depend on how much rad space and GPU Waterblock you are putting in, but on avg GPU load temp should be around Ambient temp +20C


2 x 480 and 1 x 360


----------



## nguyen (Mar 16, 2021)

Knoxx29 said:


> 2 x 480 and 1 x 360



Maybe ambient +15C is pretty realistic for your case



Alexa said:


> Mentioned above, with 2085-2100 @ 950mv my temps are 59-61C while gaming. I think 63C max during 3DMark, Superposition, you name it, all while keeping 2085 MHz or higher.
> 
> On stock it's ~1980 MHz @ 1050mv, and reaches 63C while gaming, 65-66C in benchmarks.
> 
> All while being extremely quiet on either configurations -- the GPU went from being the loudest part in my system to the quietest. Fans don't need to spin past 42% at all, even lower on the undervolt. MSI did a great job on the cooler.



Well, I tested the performance across different undervolt profiles here, and past the 800mV mark the 3080 and 3090 become inefficient real quick; the last 10% of performance requires 30% of total TDP (100W for 10% extra FPS)


----------



## FireFox (Mar 16, 2021)

Alexa said:


> my temps are 59-61C while gaming.


Just curious, which Games?


----------



## Deleted member 205776 (Mar 16, 2021)

Knoxx29 said:


> Just curious, which Games?


Warzone and PUBG specifically, those are on the 61C side, same with Metro Exodus and any RTX on game


----------



## FireFox (Mar 16, 2021)

I don't remember which temps i am getting on Warzone but FIFA for some unknown reason doesn't push the GPU that much so i get 42c/45c, will post later about Warzone.


----------



## Deleted member 205776 (Mar 16, 2021)

I get up to 66C on Warzone on stock. Soon as I apply my undervolt while ingame, drops to 61C slowly and remains there. Ultra everything including DXR on. Performance doesn't take a hit regardless of the VRAM usage hitting 10 GB and it starting to use shared memory.


----------



## toilet pepper (Mar 16, 2021)

Mussels said:


> i have a frikkin intel stock cooler sitting on the backplate to dissipate some of the heat, its nuts how how they get


You might not be ramping up your fans high enough. By default, before I did all the mods to my 3080, I had to make the fan spin at least 80% for the mem temp to stay at 100C. Since I deshrouded it and put it in a case, I use 2 120mm fans at 90% speed for sanity. I could go with a quieter or lower speed, but I can't hear it over the noisy air conditioner in our room anyway.

The RGB adds +5MH/s and lowers my temps by 5C. I could go with the color blue for better temps but it draws too much power.


----------



## Mussels (Mar 16, 2021)

As to the undervolting peeps: Well, summer was hot here and then 99% of my games were fine at this clock level. I'm happy for better efficiency until i need the juice.







Okay silly copper heatsinks have arrived:


Nicehash mining on low TDP (it heats the ram FAAAAAST)

Stock: 96C
Intel stock heatsink on top: 92C
Copper slabs (no adhesive/pads yet): 90C

It took no time at all for that backplate to roast these copper plates to 60C according to my temp probe; finger says "ah shit, it's hotter than that"


----------



## FireFox (Mar 16, 2021)

Mussels said:


> summer was hot here


And we still have winter here, that's why I'm not rushing to build the custom loop


----------



## thesmokingman (Mar 16, 2021)

Knoxx29 said:


> and we still have winter here that is why i am not rushing on building the Custom Loop


For those with loops, winter is benching season!


----------



## FireFox (Mar 16, 2021)

thesmokingman said:


> For those with loops, winter is benching season!


Unfortunately I can't finish the loop right now because I'm having a hard time finding the last rad, which is a Hardware Labs BLACK ICE NEMESIS GTX 360 RADIATOR - BLACK.


----------



## Mussels (Mar 16, 2021)

Hmmm, so the copper slabs heated up over time and back to 92-94C we go (ambients went up 1C)

Needs more fans? theres a friggin 92mm noctua on there


----------



## R-T-B (Mar 16, 2021)

Knoxx29 said:


> Don't say that too loud because there are people that still claim that it's not crypto mining fault.


I mean, it's several factors and mining certainly isn't the chief one, but it is a factor.


----------



## Mussels (Mar 17, 2021)

Okay so take three:

1. The thermal pads i used at the top were too thick and warping the backplate, so i used my fingers and crushed them flatter to un-warp the backplate
2. ADHESIVE TIME
3. wow it worked
4. (i think the bottom facing ones need better sized pads to work fully at high TDP mining... i fucked up getting 3mm pads instead of 2mm i think? maybe i need to combine my other ones and make 1.5mm i dunno)


----------



## FireFox (Mar 17, 2021)

Honestly, why all this when things can be solved by just adding water?


----------



## Mussels (Mar 17, 2021)

Knoxx29 said:


> Honestly why all this when things can be solved just adding water?


I literally got this card because my AIO 3080 sprung a leak. I wanna stay dry for a while... but yes, long term i'd like one of those new double-sided EK blocks


----------



## thesmokingman (Mar 17, 2021)

Knoxx29 said:


> Unfortunately i couldn't/cant finish the Loop right now because i am having a hard time trying to find the last Rad which is a Hardwarelabs BLACK ICE NEMESIS GTX 360 RADIATOR - BLACK.


Oh that sucks. HWL rad production is spotty, especially now. I went with the primered ready rads instead of their crinkle-coated ones because they were available. That said, I dislike their crinkle finish, which is so 80's, lol. Their stock will return, it just could take some time unfortunately. These new GTX rads are really great though. I used two GTX 480s in a Threadripper build and they tame an 1100W load with just the two rads.


----------



## FireFox (Mar 17, 2021)

thesmokingman said:


> I used two GTX 480s in a threadripper build and they tame a 1100w load just the two rads.


It took me 2 months to find the GTX 480s; I ordered 2 a few days ago from France, they should be delivered tomorrow.


----------



## Mussels (Mar 18, 2021)

Turns out one of the BOTTOM thermal pads slipped out of position - did a tear down to replace the VRM pad and noticed it had done a woopsie, temps are heaps better now

96C VRAM while mining at full power MAKES MUSSELS.... well it makes it "okay" since the fans stay at low speeds


----------



## toilet pepper (Mar 18, 2021)

Mussels said:


> Turns out one of the BOTTOM thermal pads slipped out of position - did a tear down to replace the VRM pad and noticed it had done a woopsie, temps are heaps better now
> 
> 96C VRAM while mining at full power MAKES MUSSELS.... well it makes it "okay" since the fans stay at low speeds


Which pad slipped? The VRM or the Mem pads? You doing 120mhs now?


----------



## Mussels (Mar 18, 2021)

toilet pepper said:


> Which pad slipped? The VRM or the Mem pads? You doing 120mhs now?


mem pad, but i noticed it on the way for VRM


94MH but silently, works for me


----------



## Mussels (Mar 23, 2021)

3090 owners: what temp does your GPU and Tjunction idle at?

I'm finding my VRAM/TJ takes a loooooong time to cool down after mining or gaming, lowest it gets is around 45C idle (which for such a huge cooler and no load, seems hot)


----------



## jboydgolfer (Mar 23, 2021)

Mussels said:


> what temp does your GPU and Tjunction idle at?


what are you using to monitor?


----------



## Mussels (Mar 23, 2021)

jboydgolfer said:


> what are you using to monitor?


hwinfo


----------



## jboydgolfer (Mar 23, 2021)

i was wondering why i didn't have those options. i have Open Hardware Monitor.


----------



## Mussels (Mar 23, 2021)

jboydgolfer said:


> i was wondering why i didnt have those options. i have openHW monitor.


well you're silly, give HWinfo a shot


----------



## nguyen (Mar 23, 2021)

Mussels said:


> 3090 owners: what temp does your GPU and Tjunction idle at?
> 
> I'm finding my VRAM/TJ takes a loooooong time to cool down after mining or gaming, lowest it gets is around 45C idle (which for such a huge cooler and no load, seems hot)



Well the Tjunction sensor is inside the module after all, it should be 20C hotter than the surface of the module, which is sitting at ambient temp during idling.

More info here: GDDR6X at the limit? Over 100 degrees measured inside of the chip with the GeForce RTX 3080 FE! | Investigative | igor'sLAB (www.igorslab.de)


----------



## Dinnercore (Mar 23, 2021)

Mussels said:


> 3090 owners: what temp does your GPU and Tjunction idle at?
> 
> I'm finding my VRAM/TJ takes a loooooong time to cool down after mining or gaming, lowest it gets is around 45C idle (which for such a huge cooler and no load, seems hot)


Since I'm on water for the front of the GPU my data is probably not super useful for you. But this is what I get when dropping from an X-hour mining load (with slight memory OC, getting 111MH/s):

Instantly drops from 70°C down to ~50°C and then continues on to settle at 32°C after a minute. I did glue some heatsinks to the backplate and point a fan at it to try and improve the backside thermals.


----------



## Mussels (Mar 23, 2021)

I need to save the mining dosh as a watercooling fund

Front and back blocks are gunna cost $200Au each! from EK, before i look at anything else - and i honestly have no idea what i need, so i'd end up coughing up for one of the kits like this (another $450 au)



Oh my god aliexpress has 3090 AIO units

Bykski AIO GPU Water Cooling Kit For NVIDIA RTX 3080 3090 AIC Reference Graphics Card VGA Radiator 5V A RGB B FRD3090H RBW|Fluid DIY Cooling| - AliExpress


Wait, if my card's reference and the EVGA XC3 cards are reference, can i use their AIO kit?
EVGA - Products - EVGA HYBRID Kit for EVGA GeForce RTX 3090/3080 XC3, 400-HY-1978-B1, ARGB - 400-HY-1978-B1
The EVGA manual stock photo vs the photo of my card mid-pad replacement


----------



## Caring1 (Mar 23, 2021)

Looks different to me.


----------



## nguyen (Mar 23, 2021)

The Galax 3090 SG is indeed using the reference PCB; however, all EVGA models use custom PCBs.
More info here
So you can't use the waterblock for the EVGA XC3 model, only WBs for the reference PCB.


----------



## wolf (Mar 23, 2021)

FWIW, I dissected my Asus TUF 3080 with Thermal Grizzly Kryonaut and Thermalright Odyssey 12.8W/mK pads on hand, fully ready to improve the situation, and I'm willing to wager Asus has used either those exact same pads or ones with the same thermal transfer properties. I have the same temps as before across the board and basically took the risk of dissecting the card for nothing. Asus has put together an absolutely top-notch product with the TUF 3080.


----------



## Mussels (Mar 23, 2021)

nguyen said:


> Galax 3090 SG is indeed using reference PCB, however all EVGA models are using custom PCB
> More info here
> So you can't use waterblock for the EVGA XC3 model, only WB for Reference PCB.


well... expensive EK it is.

i'll need to make a wishlist or find a good kit (or have faith in that ali express AIO)


----------



## Caring1 (Mar 23, 2021)

You gotta have faith


----------



## Dinnercore (Mar 23, 2021)

Mussels said:


> well... expensive EK it is.
> 
> i'll need to make a wishlist or find a good kit (or have faith in that ali express AIO)





https://www.alphacool.com/download/compatibility%20list%20Nvidia.pdf
https://www.aquacomputer.de/tl_files/aquacomputer/downloads/rtx_3080_3090_product_compatibility_list.pdf
http://gpu.watercool.de/WATERCOOL_HEATKILLER_GPU_Compatibility.pdf
		


You can check if any of these make a waterblock that fits your model; maybe you can get them cheaper. EK with a block for front and back would be ideal, though I don't think water on the back is necessary. My temps are ok, I think.
Your card seemed to be reference at first glance, though that RGB header and fan header placement looked wonky; ok, it's already confirmed yours is reference too. Your card does look very similar to mine, but I cannot take my card apart and check without losing warranty.

I got the same cap placement as you, same connector placement, same coil and fuse positions close to the power connectors, and I also got some empty headers in the area yours are in. My block clears them. This is what's on my card: https://www.aquatuning.us/water-coo...-gpx-n-rtx-3090/3080-with-backplate-reference


----------



## nguyen (Mar 23, 2021)

Mussels said:


> well... expensive EK it is.
> 
> i'll need to make a wishlist or find a good kit (or have faith in that ali express AIO)



Well, I made a list for you 
1x Galax 3090 WB
1x 280mm radiator
1x D5 pump and reservoir combo
4x 1M soft tube 10x13mm
7x Compression fittings for soft tube 10x13mm
6x Stop plugs
1x Temperature stop plug
1x Valve
1x Pump bracket
2x 90-degree fittings

Loop order: Pump outflow --> Radiator --> GPU --> Reservoir top inlet (or outflow --> GPU --> Radiator --> inlet)

Use the bottom inlet for the drain valve (it can be used for filling the loop also)
Add 2x compression fittings for each additional component (2x for a CPU WB and 2x for an additional radiator)
The temperature stop fitting can go into the GPU WB to monitor water temp, which can be used to regulate fan speed.

Here is an example from my spare build
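As a sketch of how that water-temp-driven fan control might look (the temperature endpoints and duty range here are my own made-up values, tune them to your loop):

```python
# Minimal sketch of the idea above: drive fan duty from loop water temp
# instead of GPU core temp. All endpoint values are illustrative.
def fan_duty(water_c, idle_c=28.0, max_c=40.0, min_duty=30, max_duty=100):
    """Linearly map water temperature (C) to a fan duty percentage."""
    if water_c <= idle_c:
        return min_duty
    if water_c >= max_c:
        return max_duty
    span = (water_c - idle_c) / (max_c - idle_c)
    return round(min_duty + span * (max_duty - min_duty))

print(fan_duty(28.0))  # 30  (idle floor)
print(fan_duty(34.0))  # 65  (halfway up the ramp)
print(fan_duty(41.0))  # 100 (clamped at the top)
```

Because water temperature moves much more slowly than core temperature, a curve like this keeps the fans from surging on short load spikes, which is the whole appeal of sensing the loop instead of the die.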


----------



## FireFox (Mar 23, 2021)

Love how simple it looks


----------



## Mussels (Mar 23, 2021)

nguyen said:


> Well I made a list for you
> 1x Galax 3090 WB
> 1x 280mm radiator
> 1x D5 Pump and Resevoir combo
> ...



mmmm ali expresssss

Many edits later, ali wants $90 shipping and will take a month, and i really want that active backplate for this card's madly hot VRAM.

One local store sells EK products and watercooling stuff, i can skip $90 shipping and get it quicker... when they get the right waterblock in next month which gives me time to save.

PC Case Gear (direct link to wishlist)

(remember, aussie prices suck. the tube cutter and fill bottle are because i have issues with my hands at times, and they seem like they'll make things easier)





I could also just get one of the EK kits (currently no stock) and slap the GPU block in with it?
Buy EK Classic P360 D-RGB Liquid Cooling Kit Black Nickel [3831109834169] | PC Case Gear Australia


----------



## nguyen (Mar 24, 2021)

Looks good, however the EK SPC pump is a little weak at 250 L/h. You might have a problem later when you want to add a CPU block and another rad. 
How about the Corsair pump/res combo with a D5 pump?


----------



## Mussels (Mar 24, 2021)

I have a problem with Corsair and iCUE; as long as it can be controlled without their software and proprietary ARGB connector it's an option


----------



## nguyen (Mar 24, 2021)

Mussels said:


> I have a problem with corsair and icue, as long as it can be controlled without their software and proprietary ARGB connector its an option



well the EK Quantum Kinetic TBE does look better than the corsair, much stronger pump too.
For high static pressure 120mm fans, I recommend some Arctic P12


----------



## Mussels (Mar 24, 2021)

Guh, fans. At least if i get the EK kit it comes with some.
(I'd need ARGB fans, these will be top mounted in a blingy system)


----------



## nguyen (Mar 24, 2021)

Mussels said:


> Guh, fans. At least if i get the EK kit it comes with some.
> (I'd need ARGB fans, these will be top mounted in a blingy system)



I tried some Vardar fans before and the noise/perf ratio was kinda sucky, Corsair fans are the same.
Maybe get some Phanteks RGB fan frames?


----------



## Mussels (Mar 24, 2021)

Edited to all heck and back

The kits are sold out everywhere, even though a kit would help a beginner not miss anything

This is the front and backplate i want and will mix in with that PCCG order
EK-Quantum Vector RE RTX 3080/3090 D-RGB - Nickel + Plexi
Active Backplate for the reference design RTX 3080 and RTX 3090 EK water blocks


----------



## nguyen (Mar 24, 2021)

Mussels said:


> Edited to all heck and back
> 
> The kits are sold out everywhere despite being a help for a beginner to not miss anything
> 
> ...



Problem with that kit is that the pump is too weak, almost AIO level, and there's no room for loop expansion in the future.

Remember you will need 2x classic fittings for the active backplate also, so order more of them (hell, are they $1 a piece in Aus? they are $5 a piece here)


----------



## Mussels (Mar 24, 2021)

nguyen said:


> Problem with that kit is that the pump is too weak, almost like AIO level, and has no room for loop expansion in the future.
> 
> Remember you will need 2x classic fittings for the active backplate also, so order more of them (hell they are 1$ a piece in Aus?, they are 5$ a piece here)


what? no, that's 7x$7

i'm not 100% sure on how that backplate connects to the front, so you think it'll need more?
(this is why i was after a kit, because without it in front of me i dont know how many of each piece i need)

this is gunna motivate me to get a new job


----------



## nguyen (Mar 24, 2021)

Mussels said:


> what? no thats 7x$7
> 
> i'm not 100% sure on how that backplate connects to the front, so you think it'll need more?
> (this is why i was after a kit, because without it in front of me i dont know how many of each piece i need)
> ...



Just connect the GPU WB and the backplate in series: WB outlet goes into the backplate inlet, backplate outlet goes to the reservoir or radiator depending on loop order. 

9x Classic fittings in total, 1 extra fitting for the drain valve (to connect a tube for easier draining)
1x Male to male extender for the drain valve
3x Stop plugs are enough, no need for 6 of them (some multi-port reservoirs just don't come with stop plugs, so I was just being safe)
2x EK CryoFuel 1000ml (for when shit happens and your loop is leaking )


----------



## Mussels (Mar 24, 2021)

nguyen said:


> Just connect GPU WB and the Backplate in series, WB outlet go into backplate inlet, backplate outlet go to resevoir or radiator depending on loop order.
> 
> 9x Classic fittings in total, 1 extra fitting for the drain valve (to connect a tube for easier draining)
> 1x Male to male extender for the drain valve
> ...



cheers. i emailed them and asked them to stock the blocks i need as well

you can see the realtime wishlist here btw
PC Case Gear


----------



## toilet pepper (Mar 25, 2021)

Has anybody gotten their new bios for resize BAR yet?


----------



## Dinnercore (Mar 25, 2021)

toilet pepper said:


> Has anybody gotten their new bios for resize BAR yet?


Thanks for reminding me. I updated my drivers and checked with EVGA whether they would enable this feature on the X299 DARK, and sure enough they did.

So my board and CPU are compatible, driver is too but I still need a new VBIOS for the 3090. I contacted support and asked if they can provide me one, will see if I get it.


----------



## Deleted member 205776 (Mar 25, 2021)

toilet pepper said:


> Has anybody gotten their new bios for resize BAR yet?


Still waiting for my ReBar BIOS update, NVIDIA driver, and VBIOS for my card.



Dinnercore said:


> Thanks for reminding me. I updated my drivers and checked EVGA if they would enable this feature on the X299 DARK, sure enough they did.
> 
> So my board and CPU are compatible, driver is too but I still need a new VBIOS for the 3090. I contacted support and asked if they can provide me one, will see if I get it.


Driver is compatible? What driver ver has ReBar support for other Ampere cards? I'm on 461.92.


----------



## Dinnercore (Mar 25, 2021)

Alexa said:


> Driver is compatible? What driver ver has ReBar support for other Ampere cards? I'm on 461.92.


You are correct it is not yet compatible. I'm on the same version. Just read this from Nvidia:
"For desktops, you will need a compatible CPU, compatible motherboard, motherboard SBIOS update, GPU VBIOS update (the GeForce RTX 3060 already ships with the necessary BIOS) and new GeForce Game Ready driver to enable Resizable BAR."

And I thought it meant the current latest driver would already support it. So yeah still have to wait.
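Once SBIOS, VBIOS, and driver all line up, one way to double-check that ReBAR actually kicked in, even without GPU-Z: with Resizable BAR active, the GPU's BAR1 region covers the whole VRAM instead of the classic 256 MiB window, and `nvidia-smi -q` lists the BAR1 size. A quick parsing sketch - the sample text below is hand-written to mimic nvidia-smi's layout, so treat it as an assumption; real output can vary by driver version:

```python
import re

# With Resizable BAR active, BAR1 spans the whole VRAM (e.g. 24576 MiB on
# a 3090) instead of a small 256 MiB window. `nvidia-smi -q` prints a
# "BAR1 Memory Usage" section we can parse. SAMPLE is illustrative, not
# captured from real hardware.
SAMPLE = """\
    BAR1 Memory Usage
        Total                             : 24576 MiB
        Used                              : 5 MiB
"""

def bar1_total_mib(smi_output):
    """Return the BAR1 'Total' size in MiB from `nvidia-smi -q` text, or None."""
    m = re.search(r"BAR1 Memory Usage\s*\n\s*Total\s*:\s*(\d+)\s*MiB", smi_output)
    return int(m.group(1)) if m else None

def rebar_likely_active(smi_output, vram_mib):
    """True if BAR1 covers at least the full VRAM size."""
    total = bar1_total_mib(smi_output)
    return total is not None and total >= vram_mib

print(rebar_likely_active(SAMPLE, 24576))  # prints: True
```

In practice you would feed it the output of `subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout` instead of the sample string.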


----------



## toilet pepper (Mar 25, 2021)

Dinnercore said:


> You are correct it is not yet compatible. I'm on the same version. Just read this from Nvidia:
> "For desktops, you will need a compatible CPU, compatible motherboard, motherboard SBIOS update, GPU VBIOS update (the GeForce RTX 3060 already ships with the necessary BIOS) and new GeForce Game Ready driver to enable Resizable BAR."
> 
> And I thought it meant the current latest driver would already support it. So yeah still have to wait.


They said March but they didn't say what year. Guess we'll have to wait.


----------



## Dinnercore (Mar 26, 2021)

Alright, got an answer from support. New VBIOS version will come soon™©. ReBAR will be enabled with the next VBIOS revision for my card and they plan to release a specific tool for this update. I guess since a lot more people than usual want to update their VBIOS they will attempt to make it idiot-proof.


----------



## Mussels (Mar 26, 2021)

i'm assuming it'll be an .exe file we flash like they did for the displayport fix they had a while back

Someone just sold me their old EK watercooling setup for $100
(intel) CPU block, pump, 280mm rad, EK fans

It's pre-RGB stuff but i can combine it with a new GPU block and away i go


----------



## Mussels (Mar 27, 2021)

Second hand pile of goodies has arrived

280mm EK rad
(damaged) CPU block (meh)
bunch of fittings and small amount of hose for 1/2ID 3/4OD (ordered more hose, this is my jam now)
D5 pump and honky res

Hose ordered from amazon cause that size isnt common in clear, all thats left is to wait for the active backplate to be ready to order and i'm greenlit


----------



## BarbaricSoul (Mar 27, 2021)

I have been trying to get an RTX 3000 card since Christmas. I have entered every Newegg Shuffle for the past 3 weeks (since I found out about it), selecting any and every card from a 3060 Ti to a 3080, as long as the card is under $1000. I'm also doing the same thing for AMD 6000 cards. 15 shuffles so far, and still no damn luck


----------



## Mussels (Mar 27, 2021)

BarbaricSoul said:


> I have been trying to get a RTX 3000 card since Christmas. I have entered every Newegg shuffle for the past 3 weeks (when I found out about it) selecting any and every card between a 3060ti to a 3080, as long as the card is under $1000. I'm also doing the same thing for AMD 6000 cards. 15 shuffles so far, and still no damn luck


just get on a plane and come down here, the local centrecom has stock


----------



## Deleted member 205776 (Mar 27, 2021)

BarbaricSoul said:


> I have been trying to get a RTX 3000 card since Christmas. I have entered every Newegg shuffle for the past 3 weeks (when I found out about it) selecting any and every card between a 3060ti to a 3080, as long as the card is under $1000. I'm also doing the same thing for AMD 6000 cards. 15 shuffles so far, and still no damn luck


Do what I did and refresh store pages until 6 AM lol


----------



## BarbaricSoul (Mar 27, 2021)

Mussels said:


> just get on a plane and come down here, the local centrecom has stock


Pick me one up and send it to me, I'll pay you for it, plus shipping and $50 for your time.

On second thought, never mind. I can buy from a scalper here for what the cards there retail for.


----------



## Mussels (Mar 27, 2021)

BarbaricSoul said:


> Pick me one up and send it to me, I'll pay you for it, plus shipping and $50 for your time.
> 
> On second thought, never mind. I can buy from a scalper here for what the cards there retail for.


the secret is the free healthcare helps me recover from the lack of kidneys


----------



## dgianstefani (Mar 27, 2021)

I've had a good experience with my setup - an Aqua Computer ULTITUBE D5 PWM pump. Very quiet with a nice metal body; the res combo is also glass, not plexi, and it has a nice preinstalled filter. Glass/metal imo is hard to beat.

As for fittings EK Quantum Torque Black Nickel are very nice, and go with my XSPC black chrome 10k temp sensor, although I was tempted by the Optimus compression fittings when I bought my CPU block from them.


----------



## BarbaricSoul (Mar 27, 2021)

Mussels said:


> the secret is the free healthcare helps me recover from the lack of kidneys


Hmmmm, I don't drink, maybe I can get a 6900XT for my liver? Or would I have to give a kidney also?


----------



## Mussels (Mar 27, 2021)

BarbaricSoul said:


> Hmmmm, I don't drink, maybe I can get a 6900XT for my liver? Or would I have to give a kidney also?


probs a testicle


i cant wait for the hoses to arrive so i can start messing around with this stuff, this is my first custom water loop ever - and the last one i helped build was on a Q6600


----------



## Mussels (Mar 30, 2021)

So, updates

1. some of the fittings leaked, the 90-degree ones with rotating joins had dead washers inside
2. the straight fittings are from a different brand and dont like that hose, too easy to pull loose
3. still super educational, and all-new hose and fittings in an easier-to-manage size are now ordered. within 2 weeks i'll be one of those snobs with a custom-water 3090


----------



## nguyen (Mar 31, 2021)

Good news, even my old Z370 board got the Re-BAR BIOS support LOL, Intel probably doesn't like that there is now zero reason to upgrade from a 9900K to 10th or 11th gen. 






Performance improvement is very noticeable in CP2077 too, something like 10-15% at 4K RT Ultra, sweet lord. 
AC Valhalla only gets about 5% improvement from Re-BAR.
Testing other games now...


----------



## Dinnercore (Mar 31, 2021)

ReBAR confirmed for X299 too.

Did not run any of the supported games yet to see if there is a big difference. Funny how Inno3D just linked to the Nvidia update tool, and it overwrote the subvendor back to Nvidia.
Don't mind the low clocks, it's in mining efficiency mode.


----------



## phanbuey (Mar 31, 2021)

I can't seem to get MSI Live Update to update my Ventus BIOS... did you guys have any issues with it?


----------



## Mussels (Mar 31, 2021)

galax page has gone from a 404 to a page with no downloads, woooo


----------



## nguyen (Mar 31, 2021)

phanbuey said:


> I cant seem to get the msi live update to update my ventus bios... did you guys have any issues with it?



I just used the Asus .exe flashing tool, too lazy to use nvflash even though I have used it extensively before   .
You can use nvflash if MSI Live Update gives you problems


----------



## Dinnercore (Mar 31, 2021)

Mussels said:


> galax page has gone from a 404 to a page with no downloads, woooo


Wasn't yours a reference PCB as well? If you are brave and have an emergency backup card for reflashing the old BIOS, you could try Nvidia's tool for the reference parts.
I got it from Inno3D and it worked for my card, and our PCBs are the same.

But that is just a bad idea, don't brick your card  (Still, the tool does scan for compatibility according to Nvidia)


----------



## Mussels (Mar 31, 2021)

Dinnercore said:


> Wasn't yours a reference pcb as well? If you are brave and have an emergency backup card for reflashing the old bios you could try nvidias tool for the reference parts.
> I got it from Inno3D and it worked for my card and our pcbs are the same.
> 
> But that is just a bad idea, don't brick your card  (Still the tool does scan for compatibilty according to nvidia)


reference isnt founders so i'd rather not - and i dont want to get the wrong fan speed curve from a different card

edit: F5 and they're present now

edited edit: ah its like the displayport one in the past, its not a full BIOS just a patcher


----------



## Dinnercore (Mar 31, 2021)

Mussels said:


> reference isnt founders so i'd rather not - and i dont want to get the wrong fan speed curve from a different card
> 
> edit: F5 and they're present now
> 
> edited edit: ah its like the displayport one in the past, its not a full BIOS just a patcher


Yeah, I don't have a Founders either. It's just an updater for your card and seems to work on reference PCBs, not just Founders ones.


----------



## thesmokingman (Mar 31, 2021)

Here's my FE. It's flashed in a flash, doh.


----------



## FireFox (Apr 2, 2021)

It was a nightmare to clean all that mess






On air, temps when playing games were 60C and VRAM 76C.
On water, ambient temp 25C, the card idles at 25C; after 5 hours playing games, temp is 37C and VRAM 50C.

I made a mistake: I thought I could reinstall the original backplate and I was wrong (and I didn't order one from EKWB), but lucky me the GPU block cools the VRAM


----------



## Mussels (Apr 2, 2021)

i want more pics of your shiz under water so i can try and outdo you when mine arrives next week
i had to order the active backplate with a 3090, i knew i needed that extra cooling


----------



## FireFox (Apr 2, 2021)

Mussels said:


> i want more pics of your shiz under water so i can try and outdo you when mine arrives next week




I dont like how it looks 





Mussels said:


> i had to order the active backplate with a 3090, i knew i needed that extra cooling


i am wondering how much the VRAM temp would decrease if i add the backplate? it decreased 27C just with the GPU block.


----------



## Mussels (Apr 2, 2021)

on a 3080 you should get up to a 10C drop, if there's airflow over whatever backplate you have there

Oh semi related, i showed my son UV reactive dye in the test loop (using to leak test before the blocks arrive)






my first thought was "jesus look at those rads, what is that an intel build"? (system specs confirmed it)
I'm thinking about a single 280mm for CPU and GPU here, with eco mode set to 87W or so i think it'd be fine tbh


----------



## FireFox (Apr 2, 2021)

Mussels said:


> my first thought was "jesus look at those rads, what is that an intel build"? (system specs confirmed it)




I know that what i built is overkill and a single 480mm would be enough, but the poor case was so empty.
A good thing is that the fans, 10x 120mm and 3x 140mm, are inaudible, and the water loop temp is 28C, and 32C when playing games.


----------



## thesmokingman (Apr 2, 2021)

Mussels said:


> I'm thinking about a single 280mm for CPU and GPU here, with eco mode set to 87W or so i think it'd be fine tbh


You ain't beating anything with a tiny bit of rad. Look at his amount of radage again, and they look to be HWL too.



FireFox said:


> I know that what i built is overkill and a single 480mm would be enough but poor case was so empty.
> A good thing is that the Fans 10x 120m and 3x 140 are inaudible and the water loop temp is 28c and 32c when playing games.


Nice job, looks like it came together well. And for the record it's ALWAYS better to have more rad surface area than just enough.


----------



## FireFox (Apr 2, 2021)

thesmokingman said:


> they look to be HWL too.


They are Hardware Labs, 2x 480mm, 1x 420mm


----------



## Mussels (Apr 2, 2021)

TBF i wanted to get a slim 360 but got that 280 rad, pump and res for $100 (with vardar fans) second hand so screw it, the extra rad can wait


----------



## FireFox (Apr 2, 2021)

Got Resizable BAR working, i didn't need any weird flashing tool, *EVGA Precision X1* did the job.





There is a motherboard BIOS available that improves Resizable BAR compatibility for NVIDIA RTX 30 series graphics cards, but it is beta


----------



## Durhamranger (Apr 11, 2021)

Aint had this card long...




Will be going under water tomorrow......


----------



## Mussels (Apr 12, 2021)

i'm still crying i got everything for my watercooling loop... except a CPU block or GPU block

GPU block wont ship til the 21st


----------



## Durhamranger (Apr 12, 2021)

Mussels said:


> i'm still crying i got everything for my watercooling loop... except a CPU block or GPU block
> 
> GPU block wont ship til the 21st


Gutted for ya, Nowt worse than having to wait.....


----------



## R-T-B (Apr 17, 2021)

I just got my ticket / 4hr order window to order an evga ftw3 3070.  Gosh, been waiting basically since waitlists started, was about to give up...

sweet!  I would've loved a 3080 but I'll take this given it has true hdmi 2.1 vs my present card!


----------



## jboydgolfer (Apr 17, 2021)

has anyone who owns a 3090FE noticed the card entering Zero fan speed mode?
ive owned 2 3090FE cards, & to my knowledge they cant be manually or otherwise set below 30% fan speed, yet i have seen with both my own eyes, as well as sensors, that the fans are at Zero Speed.
i cant replicate this on demand, so it must be a bug, its not tied to temp, or usage. the fans DO come on if the temp rises enough.

i posted a thread a while back,


----------



## KainXS (Apr 17, 2021)

R-T-B said:


> I just got my ticket / 4hr order window to order an evga ftw3 3070.  Gosh, been waiting basically since waitlists started, was about to give up...
> 
> sweet!  I would've loved a 3080 but I'll take this given it has true hdmi 2.1 vs my present card!


Are they selling it to you for the launch price or the current price


----------



## R-T-B (Apr 17, 2021)

KainXS said:


> Are they selling it to you for the launch price or the current price


I'm getting new tariff price as far as I can tell, as it's the new post april 16 invoicing cost / the website cost.  I could try to fight it as I am literally one day out, but I already feel like I won the lottery and I think they know that lol.  My bargaining power may be limited.



jboydgolfer said:


> has anyone who owns a 3090FE noticed the card entering Zero fan speed mode?
> ive owned 2 3090FE cards, & to my knowledge they cant be manually or otherwise set below 30% fan speed, yet i have seen with both my own eyes, as well as sensors, that the fans are at Zero Speed.
> i cant replicate this on demand, so it must be a bug, its not tied to temp, or usage. the fans DO come on if the temp rises enough.
> 
> i posted a thread a while back,


oof.  If it's a bug one hopes it won't get tied to runaway thermals or something bad like that.


----------



## Dinnercore (Apr 18, 2021)

After using my card for a while I'm more and more happy with it. I got back into CG thanks to it, as it speeds up previews and final renders so much that the workflow becomes smooth enough not to annoy me.
I think I can realize some ideas I wanted to try a few years ago but gave up on because it just took far too long to render high-res images. These next-gen RT cores are the best thing that could have happened to ray-/path-tracing renderers.


----------



## P4-630 (Apr 18, 2021)

Dinnercore said:


> After using my card for a while I'm more and more happy with it. I got back into CG thanks to it, as it speeds up previews and final render times so much that the workflow becomes smooth enough to not annoy me.
> I think I can realize some ideas I wanted to try a few years ago but gave up on because it just took far to long to render high res images. These next gen RT cores are the best thing that could have happened to ray-/path-tracing renderers.



I see you got a freesync monitor with Nvidia GPU's....
Hope it's at least G-Sync compatible?


----------



## Dinnercore (Apr 18, 2021)

P4-630 said:


> I see you got a freesync monitor with Nvidia GPU's....
> Hope it's at least G-Sync compatible?


Yes it is, no problems so far. In reality it rarely matters for me anyway; I stopped playing competitive shooters. These days I find myself spending a lot of time with childhood classics, and I have not found a benefit to running any kind of sync when playing e.g. Diablo 2  (maybe with the Resurrected version later this year)

When I got this monitor I was on a Vega 64. I sold and bought a lot of GPUs in the past few years; I wanted to experience what it's like to have card XYZ. I started on a 1080 Ti after my GTX 670 died, then sold that 1080 Ti when prices were inflated, for a profit over the original price I paid, and got the Vega 64. The extra money got me into full custom watercooling. After the Vega 64 I switched to a 1050 Ti for a while, then picked up a 2080 Super later. In between I switched on the CPU side from a Ryzen 1800X to a Threadripper 1950X, later a dual-socket SR-2 Xeon build, and finally sold that to go X299 workstation. I also had an RTX 2060 for a short period and considered playing with an RX 580.

What I have in my specs is what's left currently. As you can see, I don't really follow any logical path, I just get hardware that interests me and fund that by selling what I have at the moment. It has been an interesting journey, I learned a lot, and most important of all it was a ton of fun to skim through all sides of modern hardware.

So, Freesync with G-Sync compatibility was the 'obvious' choice for me. I considered selling the 3090 soon and taking a look at the RX 6900, but as it stands, with my re-ignited passion for CGI art, I might skip that. Most renderers have not yet fully implemented AMD's side of path tracing; experimental builds are the best you could hope for there.


----------



## Mussels (Apr 19, 2021)

I gave up on my adaptive sync as i got sick of the brightness flickering issues... third monitor with it, so meh (they're all freesync designed)


GPU block has reached melbourne, so its in country


----------



## wolf (Apr 19, 2021)

ReBAR'd my TUF 3080; that and the latest drivers bring some wild gains in games. CP2077 and Q-RTX have both picked up sizeable performance chunks. I can run Q-RTX @ 3440x1440 targeting 100fps with dynamic resolution scaling and it runs at 96-100% render scale all the time; previously it was 90-100% render scale but targeting 90fps. Solid increase in my books. It feels like RT performance has increased 2x since Turing debuted, let alone what can be done leveraging Ampere's superior hardware.

Also, it's starting to get cold at night here in Aus, so I've revisited a fun little experiment where I have a few metres of 4" ducting with a fan on one end; I pop the fan end out the dog door and then feed that nice cool air straight to the 3080. The other night it was 24.5C inside and 14C outside, so the results can be quite pleasing. I was able to run a ~330W load and maxed at 58C, and played an hour or so of Bulletstorm Full Clip Edition, 100-120W load, maxed at 28C


----------



## R-T-B (Apr 19, 2021)

I'm going to be playing with my ASRock z390 Taichi to get Rebar running, but since I mod it to scrub the Intel management engine out, it's likely to be a bit more of an adventure for me... lol.


----------



## R-T-B (Apr 22, 2021)

I got rebar working, heh.  Wasn't even hard.

Anyhow, card is here and I am happy frog.  Pics available in "show off tech purchase" thread.


----------



## sepheronx (Apr 23, 2021)

So I have purchased a Dell Alienware R10 with an RTX 3080 (that is all I really wanted actually, but I will still end up putting the rest of the system to good use). Now I understand Dell's 3080s are very similar to the 3080 FE but with dual fans directly at the front.

I also understand these run hot. So I will be pulling the GPU out and putting it in an open-air setup. I am aware of the thermal paste mod: taking it apart, removing the stickers and applying thermal paste around where the VRAM and VRM are. But this will void the warranty for me.

So I was thinking of the previous mods - placing heatsinks on the backplate and a fan pulling heat away from the heatsinks. I am also aware some members here have tried this with their other 3080s. Any recommendations? Like which heatsinks I should use? Does it really work / is it beneficial?

Other than that, was thinking of doing this: 



http://imgur.com/kunIvMK


but worried I may void warranty.


----------



## toilet pepper (Apr 23, 2021)

sepheronx said:


> So I have purchased a Dell Alienware R10 with a RTX 3080 (That is all I really wanted actually but I will still end up putting the rest of the system to good use).  Now I understand Dell's 3080's are very similar to the 3080 FE but with dual fans directly at the front.
> 
> I also understand these run hot.  So I will be pulling the GPU out and putting it in an open air setup.  I am aware of the thermalpaste mod by taking it apart, removing the stickers and applying thermalpaste around where the vram and vrm are.  But, this will void warranty for me.
> 
> ...


What are you going to use the 3080 for? If you are just gaming you don't need to do anything to it. Just let it be.


----------



## sepheronx (Apr 23, 2021)

toilet pepper said:


> Wha are you going to use the 3080 for? If you are jus gaming you dont need to do anything to it. Just let it be.


negative.  I will be mining with it.


----------



## toilet pepper (Apr 23, 2021)

I've seen that reddit post a few days ago and I suggest you follow what they did there if you are up to it.

About the warranty, I'm not sure whether your local laws make that sticker enforceable. Check it against your consumer laws.

In my area we have poor consumer protection and the stickers are enforceable. What I did was carefully remove those stickers with an X-Acto knife, make the mods, and place the stickers back on. I made sure that the mods I did were not permanent and could be put back to stock.


----------



## sepheronx (Apr 23, 2021)

toilet pepper said:


> I've seen that reddit post a few days ago and I suggest you follow what they did there if you are up to it.
> 
> About the warranty, I'm not sure what your local laws are about that sticker being enforcable. Check if with your consumer laws.
> 
> In my area we have poor consumer protection and the stickers are enforcable. What I did was carefully remove those stickers with an exacto knife make the mods and place the sticker back in. I made sure that he mods I did was not permanent and can be put back to stock.



I don't really know who to talk to regarding my local laws. I will check around, maybe I can find answers. Canada differs between provinces, much like the US with its states.


----------



## R-T-B (Apr 25, 2021)

So I've been force-enabling ReBAR support on unsupported games on my 3070 with rather impressive results. On the games that get a performance penalty, it isn't enough of a penalty for me to care. Most games don't care at all. Some, though, seem to like it. A lot. Tropico 6 for some reason loves it. Don't ask me why, but it seemingly garners nearly 20% in the limited "off the top of my head" benching I did.

Also, Kerbal Space Program sort of likes it, but given the scales involved in that game I'm not surprised. I'm actually surprised it does not benefit more; it's only getting a moderate boost. Probably CPU-bound from my mods. I mean, it's not exactly a graphically mouthwatering adventure, but it is an... inefficient one if nothing else. Throw even moderate visual mods at it and that game becomes a pig.


----------



## Mussels (Apr 27, 2021)

wow looks like my EK package was stolen in the warehouse, they're investigating now

I bet someone saw "EK quantum 3090" and thought they were stealing a full GPU


----------



## kapone32 (Apr 27, 2021)

I wonder if you would consider my 3060M as an entry point to the club?


----------



## xkm1948 (Apr 27, 2021)

R-T-B said:


> So I've been force-enabling rebar support on unsupported games on my 3070 with rather impressive results.  On the games that get a performance penalty, it isn't enough of a real penalty for me to care.  Most games don't care at all.  Some though seem to like it.  A lot.  Tropico 6 for some reason loves it.  Don't ask me why, but it seemingly garners nearly 20% in my limited "off the top of my head" benching I did.
> 
> Also Kerbal Space Program sort of likes it, but given the scales involved in that game I'm not surprised.  I'm actually surprised it does not benefit more.  It's only getting a moderate boost.  Probably CPU bound from my mods.  I mean it's not exactly a graphical mouthwatering adventure, but it is a...  inefficient one if nothing else.  Throw some even moderate visual mods at it and that game becomes a pig.


Wish I could enable it on my X99. I am still cruising the win-raid forum to see if I can somehow enable it by editing my own BIOS


----------



## R-T-B (Apr 27, 2021)

xkm1948 said:


> Wish i can enable it for my X99. I am still cruising win-raid forum to see if i can somehow enable it by editing my own bios


Oof.  X99.  I don't know if a module exists for that or not.  It's like, the forgotten chipset.


----------



## thesmokingman (Apr 28, 2021)

Mussels said:


> wow looks like my EK package was stolen in the warehouse, they're investigating now
> 
> I bet someone saw "EK quantum 3090" and thought they were stealing a full GPU


Oh damn, that sucks. That Quantum is darn pricey but it's still a helluvalot cheaper than the FE block, omg. Where'd it get stolen from, customs?

Unrelated, I think I might finally open a steam link. I bought a few of them when they were dirt cheap. Gonna setup CP2077 and that ups delivery game to stream to my tv so I can game in bed lol, put that 3090 to more use.


----------



## Mussels (Apr 28, 2021)

thesmokingman said:


> Oh damn that sucks. That quantum is darn pricey but its still a helluvalot cheaper than the FE block, omg. Where'd it get stolen from customs?
> 
> Unrelated, I think I might finally open a steam link. I bought a few of them when they were dirt cheap. Gonna setup CP2077 and that ups delivery game to stream to my tv so I can game in bed lol, put that 3090 to more use.


DHL express warehouse, before the handover to the people that deliver to my area

I'm meant to get a callback tomorrow once they've found footage of where it was


----------



## Durhamranger (Apr 28, 2021)

Mussels said:


> wow looks like my EK package was stolen in the warehouse, they're investigating now
> 
> I bet someone saw "EK quantum 3090" and thought they were stealing a full GPU


Dude, that's shocking to say the least....


----------



## Mussels (May 2, 2021)

Package turned up in Amsterdam - it was on a cancelled flight due to snow, and the tracking thought it continued on instead of being unloaded.

pics!


----------



## Space Lynx (May 2, 2021)

Mussels said:


> Package turned up in amsterdam - it was on a cancelled flight due to snow, and the tracking thought it continued on instead of being unloaded.
> 
> pics!
> 



That's a heck of a nice setup you've got there. I do hope you have some demanding games you are playing with that beast.


----------



## Mussels (May 2, 2021)

lynx29 said:


> that's a heck of a nice setup you got there. I do hope you have some demanding games you are playing with that beast.


ehhh... Forts and Dead rising 3?


----------



## nguyen (May 2, 2021)

Metro Exodus Enhanced edition coming out May 5th boys


----------



## R-T-B (May 4, 2021)

Mussels said:


> ehhh... Forts and Dead rising 3?


And crypto, amirite?

I'm tempted to set up a mining screensaver for my 3070 but not sure it's worth it wear-wise, considering how hard they are to get...


----------



## Mussels (May 4, 2021)

R-T-B said:


> And crypto, amirite?
> 
> I'm tempted to setup a mining screensaver for my 3070 but not sure it's worth it wear wise, considering how hard they are to get...


Oh hell yes, NiceHash has already paid for the watercooling on the GPU, and this month's electricity bill

I'll never be a hardcore miner, but if you own a 30 series card you might as well get some of the cost back


----------



## Deleted member 205776 (May 4, 2021)

I'd rather keep my GPU intact for the next 5 years. I don't trust anything anymore after my 2070 died after 1 year with mild usage.


----------



## wolf (May 4, 2021)

I've stopped mining on my 3080 now that the solar is in the deepest lull of the year. If I have +1 kW being generated when I leave for work and when I get home from work, I'll get the 3080 mining; otherwise, it's just the GTX 1080 box that's on Ubuntu and a solar schedule.


----------



## toilet pepper (May 4, 2021)

Mussels said:


> Package turned up in amsterdam - it was on a cancelled flight due to snow, and the tracking thought it continued on instead of being unloaded.
> 
> pics!
> 


You're not gonna leave us wondering about your temp update. lol


----------



## Mussels (May 4, 2021)

toilet pepper said:


> You're not gonna leave us wondering about your temp update. lol


full load while mining


----------



## wolf (May 4, 2021)

Mussels said:


> full load while mining


Oh dayum, that's nice! Is the backplate actively liquid-cooled too, then?


----------



## toilet pepper (May 4, 2021)

Mussels said:


> full load while mining


Dayumn. You pushed it to 120 MH/s now? How many radiators are you using? I'm kinda tempted to do a custom loop, but my wife's gonna kill me as I have been tinkering non-stop for the past few months.


----------



## Mussels (May 4, 2021)

toilet pepper said:


> Dayumn. You pushed it to 120mhs now? How many radiators are you using? I'm kinda tempted to do a custom loop but my wife's gonna kill me now as I have been tinkering non-stop for the past few months.


getting 110 if nothing else is open

Chrome and VLC drop it to 90


----------



## Dinnercore (May 4, 2021)

Mussels said:


> full load while mining


Ah, I've been wondering if I should try a block with an active backplate. But it seems like it wouldn't do much for me; I'm already sitting at 72°C to 74°C with my heatsink mod.


----------



## xkm1948 (May 4, 2021)

Mussels said:


> getting 110 if nothing else is open
> 
> Chrome and VLC drop it to 90




Mine sits at 106 MH/s with -400 core, no VRAM offset and a 62% power target (260 W). Funny thing is it does not bounce around at all with light use, like Zoom or just word processing. Temps are nowhere near water-cooled levels though. My VRAM junction is constantly at 96°C during mining.


----------



## nguyen (May 8, 2021)

Got bored today so I tried some higher TDP BIOS on my Asus TUF 3090 









www.3dmark.com


----------



## wolf (May 8, 2021)

nguyen said:


> Got bored today so I tried some higher TDP BIOS on my Asus TUF 3090


Tell me more, what bios, what tdp, did it affect stock performance or just allow you to stretch its legs more with a nice custom oc?


----------



## nguyen (May 8, 2021)

wolf said:


> Tell me more, what bios, what tdp, did it affect stock performance or just allow you to stretch its legs more with a nice custom oc?



I flashed the Kingpin XOC BIOS that has a 1000 W PL, pretty much allowing my 3090 to maintain whatever clocks it can reach. I haven't tried higher clocks than 2130 MHz/1.0 Vcore since the 3090 is already pulling ~480 W at those clocks, but there's still a lot of legroom remaining.


----------



## FireFox (May 8, 2021)

nguyen said:


> I flashed the Kingpin XOC BIOS that has 1000W PL, pretty much allowing my 3090 to maintain whatever clocks it can reach. I haven't tried higher clocks than 2130mhz/1.0Vcore since the 3090 is already pulling ~480W at that clocks , there are still lots of leg room remaining though.


You people don't learn that it is not the right time to be messing with the GPUs, ah, and don't tell me you know what you are doing or that it is safe



Mussels said:


> full load while mining


what is your memory temp when gaming?


----------



## Mussels (May 8, 2021)

FireFox said:


> You people dont learn that it is not the right time to be messing with the GPUs, ah, and dont tell me you know what you are doing or that it is safe
> 
> 
> what is your memory temp when gaming?


I dunno, it hasn't stopped mining since the water went on; it's around 80°C at all times now


----------



## nguyen (May 8, 2021)

FireFox said:


> You people dont learn that it is not the right time to be messing with the GPUs, ah, and dont tell me you know what you are doing or that it is safe



Oh, I know what I'm doing actually, been playing around with unlocked BIOSes since the Titan X Maxwell, 1080 Ti and 2080 Ti, and they all still live to this day. 
It's kinda jaw-dropping how much power the 3090 can suck at 1.0 Vcore though; all the GPUs before the 3090 would be running at 1.2 Vcore to use this much power.


----------



## FireFox (May 8, 2021)

Mussels said:


> I dunno it hasnt stopped mining since the water went on, its around 80C at all times now


Is there any GPU stress test that puts the same kind of load on the card as mining? 
I am curious because I would like to try it.


----------



## Mussels (May 8, 2021)

FireFox said:


> Is there any GPU stress test that put the same kind of load like when mining?
> I am curious because i would like to try it.


Not that I've seen; it's super stressful on GPU and memory like nothing else.

You could just grab NiceHash and try the benchmark in it without linking an account?


----------



## ThrashZone (May 8, 2021)

FireFox said:


> Is there any GPU stress test that put the same kind of load like when mining?
> I am curious because i would like to try it.


Hi,
Blender.
The Classroom demo file download has a GPU render file in it along with a CPU render file; just scroll to the bottom of the links page for it.
All you need to have installed is Blender, then point it to the extracted demo file of your choice.

Demo Files — blender.org
Home of the Blender project - Free and Open 3D Creation Software
www.blender.org

Blender Open Data also has an option to use CPU or GPU render testing; this one is a beast and easy to use.
Blender Open Data — blender.org


----------



## FireFox (May 8, 2021)

Mussels said:


> you could just grab nicehash and try the benchmark in it without linking an account?


Will try that.


----------



## Mussels (May 8, 2021)

FireFox said:


> Will try that.


When you get to the benchmark tab, you can just disable anything that isn't "DaggerHashimoto - Excavator", as that's the one that mines ETH best on Nvidia cards, and then just bench/demo that one


Or I can PM you my wallet QR code and you can leave it running for me


----------



## FireFox (May 8, 2021)

ThrashZone said:


> Classroom


It didn't put much stress on the GPU.

NiceHash is more like a GPU killer (it ran for 20 minutes).

This wouldn't pay the electricity bill.


----------



## Mussels (May 8, 2021)

FireFox said:


> It didn't put much stress on the GPU
> 
> Nicehash it is more like a killer GPU ( it ran for 20 minutes )
> ...


If you put your electricity rates in on the settings page, that's $9 USD profit *after* electricity cost.

The idea is that you leave it running for a month: $9 x 28 ≈ $250 after costs, out of nowhere (I leave mine running and gaming takes priority, but I get a good 75% of the estimates)
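The back-of-the-envelope math in this thread is easy to sketch. This is only an illustration: the 300 W draw and $0.25/kWh rate below are made-up example numbers, not anyone's actual figures.

```python
# Hypothetical mining profitability sketch: gross daily revenue minus
# 24 hours of electricity at a given rate. Example numbers only.

def daily_profit(revenue_usd: float, watts: float, usd_per_kwh: float) -> float:
    """Net daily profit after electricity cost."""
    electricity_cost = watts / 1000 * 24 * usd_per_kwh
    return revenue_usd - electricity_cost

# A card drawing 300 W at $0.25/kWh, earning $10.80/day gross:
net = daily_profit(10.80, 300, 0.25)   # 10.80 - 1.80 = 9.00 per day
four_weeks = net * 28                  # ~252 over four weeks
print(round(net, 2), round(four_weeks, 2))
```

As noted above, real earnings tend to land below the estimate, so scaling the net figure by ~75% is closer to practice.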


----------



## ThrashZone (May 8, 2021)

FireFox said:


> It didn't put much stress on the GPU
> View attachment 199649
> 
> Nicehash it is more like a killer GPU ( it ran for 20 minutes )
> ...


Hi,
Try Open Data, might be different.


----------



## FireFox (May 8, 2021)

Mussels said:


> thats $9USD profit *after* electrical cost


€7.39

forgot to mention, 400W


----------



## Mussels (May 8, 2021)

You can see why I might leave my 3090 running.


Uhhh, taking a screenshot kicked up the speed? Maybe these results are not accurate and should be taken with a grain of salt









(I think I need to let it settle down to a real average; TPU capture glitches it)


----------



## FireFox (May 8, 2021)

Mussels said:


> You can see why i might leave my 3090 running


You have tweaked yours for better profit, right?


----------



## ThrashZone (May 8, 2021)

FireFox said:


> €7.39
> 
> forgot to mention, 400W


Hi,
Most of what I've read says you need to undervolt the core and push only the memory clock.


----------



## FireFox (May 8, 2021)

ThrashZone said:


> Hi,
> Most I've read you need to under volt the core clock and push only the memory clock.


Read that too.

Let's see if mining on SSDs comes true, then I might try


----------



## Mussels (May 8, 2021)

FireFox said:


> You have tweaked your for better profit. right?


1.55 GHz on the core at low voltages, with the RAM yeeted up






I can run faster on the core easily, but I see no benefits in the games I play or in the mining results - so for now I'm enjoying the higher efficiency.
I honestly remember setting 1600 MHz and 0.825 V as the voltage, so I'm not sure when I changed to this? But it's working and working well, so I left it alone.

Edit: Oh, I had a "how fast can it go at min voltage" contest with my brother and his 3080, and this was the result; I must have saved over my regular profile


----------



## FireFox (May 8, 2021)

This was my mining setup:
1x GTX 1080 Classified
1x GTX 1080 SC
1x GTX 1070 FTW2
1x GTX 1060 SC
701W


----------



## Mussels (May 9, 2021)

Apparently I was just taking screenshots as ETH was booming overnight


----------



## nguyen (May 9, 2021)

Mussels said:


> Apparently i was just taking screenshots as Eth was booming overnight



Convert to Dogecoin and sell them when they reach 1usd


----------



## Mussels (May 9, 2021)

nguyen said:


> Convert to Dogecoin and sell them when they reach 1usd


got 200 doges sitting there waiting to increase


----------



## nguyen (May 9, 2021)

le wife just ruined my 2nd PC with the 2080 Ti, which I was planning to do some mining on


----------



## R-T-B (May 9, 2021)

Guys, this is the ampere clubhouse, not the "look at literally any gpu cause it can mine" clubhouse.


----------



## Mussels (May 9, 2021)

yeah that broken 2080ti deserves it, go back to the turing clubhouse!



Make a thread about the smashy smashy tho, TPU loves that stuff (and i'm curious about the water loop)


----------



## Hachi_Roku256563 (May 9, 2021)

I'm really hyped for something to actually replace the 1060 in terms of price and performance; maybe the 3050 can do it


----------



## the54thvoid (May 9, 2021)

R-T-B said:


> Guys, this is the ampere clubhouse, not the "look at literally any gpu cause it can mine" clubhouse.



What R-T-B said. If I remember, I'll go back through and LQ the excess mining posts. Don't take it personally but this is the Ampere thread, there is a proper mining one.


----------



## trog100 (May 9, 2021)

Mussels said:


> not that i've seen, its super stressful on GPU and memory like nothing else
> 
> you could just grab nicehash and try the benchmark in it without linking an account?



i dont think that is entirely true.. for mining the memory needs clocking up but the gpu and power limits are normally clocked down.. 

furmark is a good stress test but i think its taxing a graphics card  way more than mining ever is.. 

from personal experience i mined 24/7 on 10 x 1070 cards for over a year.. at the end of this every single card performed exactly as it did when new.. no signs of mining use at all.. 

one card could be a fluke 10 cards most certainly not.. 

trog


----------



## Mussels (May 9, 2021)

trog100 said:


> i dont think that is entirely true.. for mining the memory needs clocking up but the gpu and power limits are normally clocked down..
> 
> furmark is a good stress test but i think its taxing a graphics card  way more than mining ever is..
> 
> ...


As long as you replace the thermal pads (which seems to be a given for all 3080 and 3090 owners), you'll be fine.

At least THIS part's Ampere-specific.


----------



## the54thvoid (May 9, 2021)

I'm just jealous I couldn't get one. Gonna punish you all.


----------



## trog100 (May 9, 2021)

Mussels said:


> as long as you replace the thermal pads (which seems to be a given for all 3080 and 3090 owners) you'll be fine
> 
> at least THIS parts ampere specific



i have two 3080 cards, a gigabyte and a palit version.. the palit is in my desktop along with a 3070 card.. the fan speed is set at 55% and it runs nice and quiet and cool with no modifications.. 

the gigabyte card on the other hand is a thermal disaster.. its had extra thermal pads added between the pcb and back-plate and needs its fans running at 95%..  its too noisy to be in my desktop machine so its now upstairs in my open air mining rig with an extra industrial case fan blowing directly at the back-plate.. it still runs hot on the memory but doesnt throttle.. 

i have no rational explanation for how these two makes of 3080 cards differ.. but they sure do..

both are large three fan cards the cooler running palit being the smaller of the two..

trog


----------



## Dinnercore (May 9, 2021)

Some random thoughts about mining on these poor 3090s. I'd love to see a graph of how many 3090s died in a year's time compared to other models. With the backside memory and mining, I will not be buying another one used later down the line.

Not everyone knows about or even monitors Tj on the memory; hell, Afterburner does not even show that temperature. All you see is a nice and comfy 75°C with an air-cooled card on a mining profile, but once you open GPU-Z the memory sits at 10X°C+. 
They might survive the next few months, but no one can convince me that this does not hurt the life expectancy of the cards. My GTX 670 is the only card I owned that died, and it was a cheap blower model with no memory cooling and obstructed airflow over two memory modules, go figure. And the 3090 has 24 chips; with all the solder joints added up, there are a few thousand more possible points of failure on a 3090 than on the 8GB / 10GB cards.

On top of that, mining is incredibly heavy on the MVDDC rail, consistently pulling 40W more on that rail than other workloads. And from looking at it, I think this is the reference model's weak point in the VRM design. Another point of concern for me that only really applies to the 3090 with its huge memory.

If this craze ever stops, I would avoid used 3090s like the plague. I'm only sitting at 76°C for my memory temp when my room warms up, and it still slightly concerns me. I think I'll stop mining as soon as I've got back the difference between MSRP and what I paid, which will be like two weeks from now if ETH stays at the current level.


On another note, I just got a chance to grab a new notebook. Got the MSI GP66 with a 3070 (130W max). Since it was cheaper than a single regular 3070 card (€1400 for the 3070 desktop card atm), I figured it's a good deal. I am blown away by the options I can use from the extended BIOS settings. I'm used to XOC overclocking desktop mainboards with a ton of settings, but this MSI laptop BIOS has cranked that to 11. You can actually set usually-locked registers for the CPU; you can unlock nearly everything. I'm only limited by the fused 50x ratio for CPU speed. Coming from a locked-out HP notebook, this is such a nice thing to have.


----------



## R-T-B (May 9, 2021)

the54thvoid said:


> What R-T-B said. If I remember, I'll go back through and LQ the excess mining posts. Don't take it personally but this is the Ampere thread, there is a proper mining one.


I think if you are asking about mining with Ampere I'd let it slide personally, but we were just posting anything for a bit there.


----------



## nguyen (May 11, 2021)

Let my 3090 join in on the fun 





The 3090 gets 110 MH/s at 270W; meanwhile the 2080 Ti gets 60 MH/s at 160W.

Couple of tips to get the best efficiency:
- Use the iGPU if available so that the hash rate won't be affected when you browse the web
- Go to the Curve editor in Afterburner and press Ctrl + L at somewhere around 1200 MHz; the minimum voltage applied will be 737 mV, but a lower frequency can reduce power consumption without affecting hashrate
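One way to compare the two cards here is hashes per watt. A quick sketch using the 110 MH/s @ 270 W and 60 MH/s @ 160 W figures from this post:

```python
# Mining efficiency comparison (MH/s per watt) using the figures above.

def mhs_per_watt(mhs: float, watts: float) -> float:
    """Hashrate delivered per watt of board power."""
    return mhs / watts

eff_3090 = mhs_per_watt(110, 270)    # ~0.41 MH/s per watt
eff_2080ti = mhs_per_watt(60, 160)   # ~0.38 MH/s per watt
print(round(eff_3090, 3), round(eff_2080ti, 3))
```

So at these settings the 3090 is only modestly more efficient per watt; its real advantage is raw throughput.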


----------



## Mussels (May 11, 2021)

Ctrl-F is the curve, Ctrl-L is lock


----------



## R-T-B (May 12, 2021)

Just a shoutout to those mining while using their cards for gaming (I figure a lot of Ampere users are): my mining screensaver was basically designed for this use case.

It's in the crypto forum.


----------



## Mussels (May 12, 2021)

R-T-B said:


> Just a shoutout to those mining while using their cards for gaming (figure a lot of Ampere users are), my mining screensaver was basically designed for this usecase.
> 
> It's in the crypto forum.


I saw it and it's fantastic; I just found that NiceHash plays well with gaming and doesn't hurt my FPS.
Currently making an ABSURD $65 a day




What clocks have people got on their 3090s? I know the active backplate gets me cooler than everyone, but I don't see many people with such high RAM clocks

Mining clocks and temps:





Thing is, I'm sure this RAM can go higher; I'm just adding +50 a day and it's shown no issues yet


----------



## toilet pepper (May 12, 2021)



Mussels said:


> I saw it and its fantastic, i just found that nicehash plays well with gaming and doesnt hurt my FPS.
> Currently making an ABSURD $65 a day
> 
> ...


How many MH/s is that? Is that 65 AUD with a single 3090? I'm not home right now; I wish I could check what a 3080 gets.


I think your core clock's quite high, but I'm checking your voltage and wowza, you got a golden chip


----------



## budgetgaming (May 12, 2021)

15th Warlock said:


> Hi everyone, I don't think there's a club for Ampere based cards yet, so let's get the ball rolling!
> 
> I'll go first, here's my RTX 3090 FE, ordered from Best buy:
> 
> ...


Nice... can't afford a 3090; my max is a 3080


----------



## nguyen (May 12, 2021)

toilet pepper said:


> I
> 
> How many mhs is that? Is that 65AUD with a single 3090? I'm not home right now I wish I can check what a 3080 gets.
> 
> ...



No need for high clocks when mining; I locked my 3090 to 1170 MHz and it still gets 110 MH/s. Only VRAM clocks matter.
Core voltage on the 3090 can't go lower than 730 mV, so the power saving comes from reducing the core clocks.


----------



## wolf (May 12, 2021)

Considering mining again on my 3080 (stopped due to solar power being pretty shit in winter), but leaving it overnight to keep that corner of my tiny house warm... hmmmmm

Still convinced this mental ETH peak will crash, but I'm in it for the long haul.

Tell me again what GDDR6X junction temps I should be OK with 24/7?


----------



## toilet pepper (May 12, 2021)

wolf said:


> Considering mining again on my 3080 (stopped due to solar power being pretty shit in winter), but leaving it overnight to keep that corner of my tiny house warm...hmmmmm
> 
> Still convinced this mental ETh peak will crash, but I'm in it for the long haul.
> 
> Tell me again what GDDR6X junction temps I should be OK with 24/7?


I mine using a pool and have been getting ETH directly and hodling it. Awesome Miner plus MiningPoolHub is quite easy to set up.


As per the GDDR6X data sheet, it is 95°C for sustained use.


----------



## Mussels (May 12, 2021)

toilet pepper said:


> I
> 
> How many mhs is that? Is that 65AUD with a single 3090? I'm not home right now I wish I can check what a 3080 gets.
> 
> ...


3090 getting 120, then a 1080 and 1070ti getting 50 between them


----------



## budgetgaming (May 12, 2021)

Never tried mining; since everybody is bringing it up, just 2 hours ago I tried to mine with a single RTX 3080 on NiceHash... my accepted mining speed is only 62.95 MH/s... slow?


----------



## Mussels (May 12, 2021)

budgetgaming said:


> Never tried mining since everybody is bring it up, just 2 hors ago tried to mine with single rtx3080 and with nice has.....my accepted speed of mining is only 62.95 MH/s.....slow?


Should be around 80 for a 3080?

Check your Tjunction (VRAM) temps in HWiNFO


edit: holy crap active backplate for the WIN


----------



## budgetgaming (May 12, 2021)

Mussels said:


> should be around 80? for a 3080
> 
> check your Tjunction (VRAM) temps in Hwinfo
> 
> ...


My GPU temp is around 49-52, but the VRAM temp goes from 96 to 102 degrees... is that what makes the MH/s slow?


----------



## nguyen (May 12, 2021)

budgetgaming said:


> my GPU temp is around 49-52 but the cram temp goes from 96 to 102 degree......is that what makes it slow MH/S?



You lose around 20 MH/s when you are using the PC (like web surfing, etc...); leave the PC alone for like a minute and then you will see what your actual hashrate is.


----------



## Dinnercore (May 12, 2021)

Tested my mobile 3070 (MSI GP66 Leopard) with a mining load.
I have to give credit to MSI's thermal design on this one. Even when gaming with stock settings and no undervolting it holds up. The VRAM is covered by the heatsink too; I checked when I upgraded the RAM. Otherwise I would not have dared to run a miner.





Next I have to see what it can do in 3DMark. And I wonder how well the laptop 3070 handles 3D-rendering compared to desktop cards. I hope I can work on my blender projects with this notebook.


----------



## toilet pepper (May 12, 2021)

budgetgaming said:


> my GPU temp is around 49-52 but the cram temp goes from 96 to 102 degree......is that what makes it slow MH/S?


You are thermal throttling. It depends on the card that you have, but Gigabyte is known to have used cheap thermal pads, or no thermal pads, at the back of the card. You can increase the fan speed as well.


----------



## jjnissanpatfan (May 12, 2021)

I guess I should have joined the craze sooner. The EVGA 3060 Ti FTW3 edition that I bought from Best Buy online in February was the most expensive at $500, and $550 with tax... I wanted the 3070 for $500, or a 6800 to go all RED, but bought the 3060 Ti because it was that or nothing. Sunday was the first day I started mining and all seemed good. Monday morning I looked at my temps and progress: the VGA said 56°C and I never looked at anything else. When I looked later that evening it was still at 56°C, so I looked at some settings and the fans were off. I had somehow shut off Precision X1 on Sunday and that stopped the fans. The hot spot was at 70°C with no fans for almost 24 hours. I think I was lucky, because if this was a basic 3060 Ti there might have been bad consequences. I am liking this generation of cards... great gaming, and they pay for themselves when not in use. One last thing: I know the memory on these cards can go a lot higher... would that be of any benefit, or do the optimized settings work the best?


----------



## toilet pepper (May 13, 2021)

jjnissanpatfan said:


> I guess I should of joined the craze sooner. The EVGA 3060 TI FTW3 edition that I bought from BestBuy online in February. The most expensive 500$ and 550$ with tax...I wanted the 3070 for 500 or a 6800 to go all RED but bought the 3060TI because it was that or nothing. Sunday was the first day I started mining and all seemed good, Monday morning I looked at my temps and progress. The VGA said 56C and I never looked anything else. When I looked later that evening still at 56C so I looked at some settings and the fans were off. I somehow Sunday shutoff precision X1 and that stopped the fans. The hot spot was at 70C with no fans for 24HRS almost. Think I was lucky because if this was a basic 3060TI there might have been bad consequences.  I am liking this generation of cards....great gaming and pay for them self's when not in use.  One last thing I know the memory on these card can go allot higher...would that be of any benefit...or the optimized settings work the best?
> 


That looks reasonable for the 3060 Ti. Make sure the fans are always on unless you want the VRAM chips to fry themselves. That fan-off thing on cards is usually not good for the VRAM at some point.

The default fan curve follows the GPU core; then on some cards it will ramp up if the VRAM chips get too hot.


----------



## FireFox (May 16, 2021)

Sorry but i couldn't resist


----------



## R-T-B (May 16, 2021)

FireFox said:


> Sorry but i couldn't resist


We all have Amperes in here, so kinda the wrong thread, dude...


----------



## FireFox (May 16, 2021)

R-T-B said:


> We all have ampere's in here so kinda the wrong thread dude...


Well, it wasn't said specifically 
which 3080; it could be an Ampere one


----------



## R-T-B (May 16, 2021)

FireFox said:


> Well it wasn't said specifically
> which 3080, could be an Ampere one


Aren't we supposed to believe we never got one from the 2021 frame?

And what 3080 isn't an Ampere?  That ancient amd thing?

No worries just might fit better somewhere else...  that's all.


----------



## the54thvoid (May 16, 2021)

R-T-B is correct. This is the Ampere clubhouse. For Ampere owners and discussion of that chip. Please stay on topic.


----------



## toilet pepper (Jun 1, 2021)

I just ordered this GPU AIO from an online shop (an AliExpress clone). The shop I ordered from can change the block to fit a 3080 Ventus, but it will take roughly 2 weeks. It is also a lot cheaper, costing less than $200, and it comes with a backplate. One good thing is the radiator also houses the pump, so I can squeeze this in and bottom-mount it in my NR200.

I'm thinking this is a good starting point for me before going full custom loop.


----------



## phill (Jun 2, 2021)

Well guys, it's been a while since I've looked in this thread, but I can see all the excitement about the temps, mining and the like, so I thought I'd ask about something else - sit down though, please... Gaming!!

I hear all about the spikes with the cards and the temps on the memory (this MSI card is terrible, and I monitor it every time it's turned on - desktop, mining, gaming or anything - just to be safe). But what I was wondering: I'm pretty sure a few of you in here have been testing your cards with lower power limits for gaming, and the FPS differences have been between 5 to 10 FPS if I recall correctly?? I was just wondering about people's mileage with it... How do games fare for you all?? Well, for those that game


----------



## R-T-B (Jun 2, 2021)

My EVGA RTX 3070 FTW3 games flawlessly and the VRAM temp is even pretty low. It can even mine and stay cool on a hot day for brief spurts of a few hours with no real hard temp spikes; it could probably go longer, but I am only screensaver mining with my CMS crypto screensaver package.

That's a 3070 though, and it uses plain GDDR6, not GDDR6X.


----------



## Mussels (Jun 2, 2021)

phill said:


> Well guys, it's been a while since I've looked in this thread but I can see all the excitement about the temps, mining and the like, so I thought I'd ask something about, sit down though please....  Gaming!!
> 
> I hear all about the spikes with the cards and the temps on the memory (this MSI card is terrible and I monitor every time it's turned on, desktop, mining, gaming or anything just to be safe) but what I was wondering about and I'm pretty sure that someone or a few of you in here, have been testing your cards with lower power limits for gaming and the FPS differences have been between 5 to 10 FPS if I recall correctly??  I was just wondering peoples mileage with it...  How do games fair for you all??    Well for those that game


I'm at 1550 for max efficiency, and notice no change at all in the games I play at 1440p 165Hz.
A clock speed curve is faaaar better than a power slider limit


----------



## toilet pepper (Jun 2, 2021)

phill said:


> Well guys, it's been a while since I've looked in this thread but I can see all the excitement about the temps, mining and the like, so I thought I'd ask something about, sit down though please....  Gaming!!
> 
> I hear all about the spikes with the cards and the temps on the memory (this MSI card is terrible and I monitor every time it's turned on, desktop, mining, gaming or anything just to be safe) but what I was wondering about and I'm pretty sure that someone or a few of you in here, have been testing your cards with lower power limits for gaming and the FPS differences have been between 5 to 10 FPS if I recall correctly??  I was just wondering peoples mileage with it...  How do games fair for you all??  Well for those that game


I did not change the power limit of my 3080 Ventus, as it isn't efficient that way. When you lower the power limit it just lowers the freq/volt curve. What happens is the GPU will still try to boost until it reaches the next limit, which would be temps. It will cycle, lowering temps and clocks while power spiking at the same time.

What I did is find the lowest voltage for an acceptable frequency and flatten the curve from there in Afterburner. Mine does 1830 MHz @ 818 mV and it stays there, no power/wattage spikes. I may be losing 3-5 fps during gaming, but it stays around 60°C.
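A rough first-order way to see why an undervolt like this saves so much: CMOS dynamic power scales roughly with frequency times voltage squared. This is only a rule-of-thumb sketch, and the stock operating point used below (1950 MHz at 1.081 V) is an assumed example, not a measured value:

```python
# Rule-of-thumb dynamic power scaling: P is roughly proportional to f * V^2.
# Static/leakage power is ignored, so treat the result as a rough estimate.

def relative_dynamic_power(f_mhz: float, v: float, f0_mhz: float, v0: float) -> float:
    """Power at (f, V) as a fraction of power at the reference point (f0, V0)."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# 1830 MHz @ 0.818 V versus an assumed stock 1950 MHz @ 1.081 V:
ratio = relative_dynamic_power(1830, 0.818, 1950, 1.081)
print(round(ratio, 2))   # roughly 0.54, i.e. ~46% less dynamic power
```

That ballpark lines up with why a small frequency sacrifice buys a large drop in heat and wattage.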


----------



## sepheronx (Jun 2, 2021)

I got two 3080s: a Dell RTX 3080 and an MSI Ventus 3X.

The Dell one is terrible. VRAM temps are around 104°C average when mining; the MSI Ventus is at 100°C when mining, which isn't too bad.

Temps are high but below 110°C. The Dell one is bothering me the most though, but I don't want to open it and void the warranty. Plus I will eventually be putting it in my main system (MSI Trident PC) and gaming with that. When gaming, the Dell 3080 gets to about 103°C. I know people say to open it up and put thermal paste around where the VRAM plate is, but it has the manufacturer stickers on there and I am not sure what the laws are in Calgary regarding warranty with these. If it doesn't void the warranty, then I will modify it.


----------



## phill (Jun 2, 2021)

Mussels said:


> I'm at 1550 for max efficiency, and notice no change at all in the games i play at 1440p 165hz
> clock speed curve is faaaar better than a power slider limit


I'll give it a go if I can tonight @Mussels, thanks  

What sort of frame drops do you get with it?


----------



## ratirt (Jun 2, 2021)

Mussels said:


> I'm at 1550 for max efficiency, and notice no change at all in the games i play at 1440p 165hz
> clock speed curve is faaaar better than a power slider limit


I would go a locked FPS and same outcome


----------



## Mussels (Jun 2, 2021)

phill said:


> I'll give it a go if I can tonight @Mussels, thanks
> 
> What sort of frame drops do you get with it?


Whats a frame drop?


----------



## Dinnercore (Jun 2, 2021)

The only 'demanding' title I play is Star Citizen. With that one the 3090 and mobile 3070 both perform the same. I even undervolted the 3070 to 0.7V, so it runs at a relaxed 54°C when gaming.

Besides that I can tell you that in Diablo 2 (the original with addon, 800x600) I get the same 25fps with and without undervolting / downclocking the cards.


----------



## sepheronx (Jun 2, 2021)

So if anyone cares.

I followed this guy's video (I'll post it below) to help reduce the VRAM temps on my Dell RTX 3080. I mentioned earlier it was getting upwards of 106C in mining and gaming, and that's with 100% fan speed.

After applying the thermal paste (Noctua NT-H1; I applied more than he did by mistake, oh well, it's non-conductive), I'm now seeing memory temps fluctuate between 88-90C with fan speeds at 57%.

I recommend this modification to anyone who has heat issues on their Dell RTX 3080.










Now I'm waiting on my Gelid thermal pads to do the same fix on my MSI Ventus 3X RTX 3080.


----------



## mouacyk (Jun 4, 2021)

Where the Tie owners at?  Don't be shy.  Let us look at you.


----------



## R-T-B (Jun 4, 2021)

mouacyk said:


> Where the Tie owners at?  Don't be shy.  Let us look at you.


What's a tie?  Do you mean Ti?


----------



## toilet pepper (Jun 5, 2021)

mouacyk said:


> Where the Tie owners at?  Don't be shy.  Let us look at you.


With dismay or approval? YES.


----------



## R-T-B (Jun 5, 2021)

toilet pepper said:


> With dismay or approval? YES.


Especially if it's a bow tie.

Sorry, I'll stop...


----------



## Mussels (Jun 5, 2021)

Well, the FE cards do resemble a TIE fighter...


----------



## Mussels (Jun 15, 2021)

Yes this is my 1550Mhz underclock

But these load temps should be illegal








enjoy the reflection of my ear


----------



## wolf (Jun 15, 2021)

Mussels said:


> Yes this is my 1550Mhz underclock
> 
> But these load temps should be illegal


Epic. Can I see the MSI AB curve? I thought I was doing great on air, loading at sub-60 degrees 99.9% of the time even at ~300W loads, but damn, that is awesome. Gotta love winter, aye.


----------



## Mussels (Jun 15, 2021)

It's a flat earth curve







I'm happy as F because that front res has no LEDs; that's purely lighting from the front fans diffusing into the EK Mystic Fog.


----------



## nguyen (Jun 15, 2021)

This dude is getting 128MH/s on his Strix 3090 with a Bykski block + active backplate:









Seems like some thermal putty on top of the VRAM really helps too; a 56C Tjunction temp is very good.


----------



## Mussels (Jun 15, 2021)

Thermal putty you say...
After watching the video, he never shows what it is, and you gotta pay to get on his Discord to find out??

Mine slooooooowly heats up and eventually heat soaks the rads, but that's cause I've also got the fans at slow-ass speeds in a restrictive case.

Just gaming loads it never heats up and it's glorioooous.


----------



## nguyen (Jun 15, 2021)

Mussels said:


> Thermal putty you say...
> after watching the video he never shows what it is, and you gotta pay to get on his discord to find out??
> 
> Mine slooooooowly heats up and eventually heat soaks the rads, but thats cause i've also got the fans at slow ass speeds in a restrictive case
> ...



He is probably using this thermal putty; at 10W/mK that's a little better than standard thermal pads, and the putty also increases the contact surface area where it surrounds the VRAM modules, leading to better thermal transfer.
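For a feel of what the 10 W/mK rating buys, here's a simple one-dimensional conduction sketch (Q = k·A·ΔT/t). Only the 10 W/mK putty rating comes from the post; the 6 W/mK pad figure, the module face area, and the 1mm layer thickness are assumptions for illustration.

```python
# 1-D conduction through a thermal interface layer: Q = k * A * dT / t.
# Only the 10 W/mK putty rating is from the thread; pad conductivity,
# module area and layer thickness below are assumed values.
def conduction_watts(k_w_per_mk, area_m2, delta_t_k, thickness_m):
    """Heat moved through a flat layer at a given temperature delta."""
    return k_w_per_mk * area_m2 * delta_t_k / thickness_m

area = 14e-3 * 12e-3          # ~face of one GDDR6X package (assumed)
dt, thickness = 20.0, 1.0e-3  # 20 K across a 1 mm layer (assumed)
pad = conduction_watts(6.0, area, dt, thickness)
putty = conduction_watts(10.0, area, dt, thickness)
print(f"pad: {pad:.1f} W, putty: {putty:.1f} W per module")
```

Same geometry, ~1.7x the heat moved per degree of delta; the putty filling in around the package edges (extra contact area) would stretch that further.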


----------



## Mussels (Jun 15, 2021)

nguyen said:


> He is probably using this thermal putty, at 10W/mK that's a little better than standard thermal pads, though the thermal putty also increase the surface area when it surround the VRAM module, leading to better thermal transfer.


if someone wants to pay for it, i'll guinea pig it


----------



## wolf (Jun 15, 2021)

Mussels said:


> if someone wants to pay for it, i'll guinea pig it


Ampere cards with good cooling just run so cold already, it's really blown me away. I'm all about chasing those last 10ths, but I'm not going to throw disproportionate money at it to gain them.

Not sure I said it here (posted on reddit), but one extra very small piece of thermal pad where there was none at stock on my TUF 3080 lowered VRAM temps by 10-14C, an epic result for such a small amount more pad. Just remove the sticker and pop a pad there, same size and thickness as the one symmetrically to the left of the GPU.


----------



## toilet pepper (Jun 21, 2021)

I just finished installing the Bykski GPU AIO cooler for my RTX 3080 Ventus. The kit includes a 240mm aluminum rad with the pump in it, 2 fans, a GPU block of your choice, an aluminum backplate, and a bunch of screws and thermal pads, for roughly $200 USD.















Installation was relatively easy considering they did not include instructions. I checked their website for instructions and it was garbage as well. I just have a problem with the hot spot temps: there's around a 15-20C delta between the GPU and the hot spot.

Here is the build.








I call it "*Custom loop with extra steps.*"

Gaming temps:





Mining temp:





It ain't bad considering I was getting 92C on memory when mining before. If it still seems high to you, keep in mind my ambient is 35C.


----------



## Mussels (Jun 21, 2021)

Even my EK block has 10C+ up on the hotspot temps, so that aint too bad


----------



## toilet pepper (Jun 21, 2021)

Mussels said:


> Even my EK block has 10C+ up on the hotspot temps, so that aint too bad


It was worse with my first fit because the manual said to use washers; the hotspot had around a 40C delta. Removed them and did a repaste, and this is the best I could get.


----------



## MxPhenom 216 (Jun 21, 2021)

Anyone here with a 3080 Ti attempted an undervolt yet? I am currently at 0.893V at 1900MHz on mine.

Got a waterblock coming for mine in about a week. How are you guys doing your TIM application? It's been a while for me. Do you do the dot in the middle and let the block spread it, or do you spread it yourself since there's no IHS?


----------



## toilet pepper (Jun 21, 2021)

MxPhenom 216 said:


> Anyone here with a 3080Ti attempt to undervolt yet? I am currently at 0.893v at 1900mhz on mine.


You're the first person to say they have one.


----------



## MxPhenom 216 (Jun 21, 2021)

toilet pepper said:


> You're the first person to say they have one.


Oh well, lucky me. I won the Newegg Shuffle on the 4th, the day after the 3080 Tis launched. Wasn't super excited that I won the Zotac card out of the three available to sign up for, but I wasn't going to pass it up, as it was really the only chance I've gotten to get a card.



Mussels said:


> It's a flat earth curve
> 
> 
> 
> ...


What are your thoughts on the Mystic Fog? I have heard it fades after a few months and just becomes a clear fluid. My new full CPU and GPU hardline build will probably use a red or blue dye for the fluid. Just curious what you think of that fog.


----------



## Mussels (Jun 21, 2021)

Mystic fog so far looks reaaaaaally cool in the tubes and res, but i'm seeing particle buildup in my GPU block despite flushing the loop and trying again with new coolant 

Like seriously that res has no lighting. Thats all from the front fans.


----------



## MxPhenom 216 (Jun 22, 2021)

Mussels said:


> Mystic fog so far looks reaaaaaally cool in the tubes and res, but i'm seeing particle buildup in my GPU block despite flushing the loop and trying again with new coolant
> 
> Like seriously that res has no lighting. Thats all from the front fans.


Yeah that mystic fog stuff is sweet. Now if only you were running hard tubing you could get frosted tubing. 

I'm going to end up running EK Cryofuel Clear with their blue or red dye pack.

Just need:
5900x
Asus X570 Hero Dark board 
Rest of the watercooling stuff to rebuild my loop

And I'll be able to start the new build


----------



## Raendor (Jun 25, 2021)

Hello clubbers. Finally an RTX 3080 FE owner since earlier this week as well. Put it in an Ncase M1 and it works like a charm. I'm running with an undervolt, and after running Port Royal for stability I'm deciding whether to keep it at .856@1905 or .875@1920, both of which run stable through a series of benchmarking sessions. However, I notice quite a difference in wattage between .856 and .875: the former never even breaks 300W (sits around 270ish at the highest), while the latter is just above it.

Here're some scores:
NVIDIA GeForce RTX 3080 video card benchmark result - Intel Core i7-11700 Processor,ASUSTeK COMPUTER INC. ROG STRIX B560-I GAMING WIFI (3dmark.com) at 1920
NVIDIA GeForce RTX 3080 video card benchmark result - Intel Core i7-11700 Processor,ASUSTeK COMPUTER INC. ROG STRIX B560-I GAMING WIFI (3dmark.com) at 1905


----------



## nguyen (Jun 25, 2021)

Raendor said:


> Hello clubbers. FInally an RTX 3080 FE owner since earlier this week as well. Put it in an Ncase M1 and works like a charm. I'm running with undervolt and after running port royale for stability I'm deciding whether I'd keep it at .856@1905 or .875@1920, which both run stable through series of benchmarking sessions. However I notice quite a bit difference in wattage between .856 and .875, where the former never even breaks 300w (sits more around 270ish at highest), while latter is just above it.
> 
> Here're some scores:
> NVIDIA GeForce RTX 3080 video card benchmark result - Intel Core i7-11700 Processor,ASUSTeK COMPUTER INC. ROG STRIX B560-I GAMING WIFI (3dmark.com) at 1920
> NVIDIA GeForce RTX 3080 video card benchmark result - Intel Core i7-11700 Processor,ASUSTeK COMPUTER INC. ROG STRIX B560-I GAMING WIFI (3dmark.com) at 1905



Nice, you can try multiple undervolt profiles depending on the game/need. Most of the time I run 812mV/1860MHz, and when a game runs just below 60fps (CP2077) I push 875mV/1965MHz.
Although with an air-cooled GPU I would undervolt to around 800mV for the best efficiency/fan noise and just run games at lower settings to maintain playable framerates.


----------



## Raendor (Jun 25, 2021)

nguyen said:


> Nice, you can try multiple undervolt profiles depending on games/need. Most of the time I run 812mV/1860Mhz and when the game run at just below 60fps (CP2077) then I push 875mV/1965mhz.
> Although with air cooled GPU I would undervolt to around 800mV for best efficiency/fan noise and just run games at lower setting to maintain playable framerates.


My custom fan curve tops out at 60% at 80C, and when I run Port Royal at .856@1905 the temperature sits in the mid 60s, which tbh I found quite low for an FE model considering the power draw.


----------



## MxPhenom 216 (Jun 25, 2021)

nguyen said:


> Nice, you can try multiple undervolt profiles depending on games/need. Most of the time I run 812mV/1860Mhz and when the game run at just below 60fps (CP2077) then I push 875mV/1965mhz.
> Although with air cooled GPU I would undervolt to around 800mV for best efficiency/fan noise and just run games at lower setting to maintain playable framerates.



You got a good 3090. I need at least 875mV on my 3080 Ti to do 1845-1860 or games crash (well, AC Valhalla does). My card bounces hard off the power limiter in Time Spy at anything above 875mV (not sure why Zotac limited it to only +10% when the design could clearly handle more, maybe with another 8-pin). I can run those benchmarks at about 1850 @ 850mV, but games are another story.

Though I have wondered if it might be me using PCIe 3.0 vs 4.0. I think 4.0 allows for more power from the slot. I never see my card pull more than 50W from it.

Games seem to not demand nearly as much as 3DMark, so I never see my card get much over 300W. But it'll bounce off the power limiter a lot in the benchmarks, which is kind of annoying since it skews results a bit here and there.

My Gaming profile is 1900 @ 912mV

I have a waterblock on the way for the card so I will probably run 0.9-0.925v and whatever clock its stable at for everything.

@everyone

Are your cards bouncing pretty hard off power limiter too?

What are you guys seeing for power being pulled from the PCIe slot?

Getting closer....

Ordered straight from the EKWB shop last Sunday and got it today. Their estimated delivery date was July 1st or something like that. Pleasantly surprised, coming all the way from Slovenia.


----------



## Mussels (Jun 26, 2021)

MxPhenom 216 said:


> You got a good 3090. I need at least 875mv on my 3080Ti to do 1845-1860 or games crash (well AC Valhalla does). My card bounces hard off the power limiter (not sure why Zotac limitted it to only +10% when the design could clearly handle more, maybe with another 8 pin). in time spy at really anything above 875mV. I can run those benchmarks at about 1850 @ 850mV, but games are another story.
> 
> Though I have wondered if it might be me using PCIe 3.0 vs 4.0. I think 4.0  allows for more power from the slot. I never see my card pull more then 50w from it.
> 
> ...


sheeeeit booooooi, you're copying me


----------



## MxPhenom 216 (Jun 26, 2021)

Mussels said:


> sheeeeit booooooi, you're copying me


Except im going acrylic tubing


----------



## nguyen (Jun 26, 2021)

MxPhenom 216 said:


> You got a good 3090. I need at least 875mv on my 3080Ti to do 1845-1860 or games crash (well AC Valhalla does). My card bounces hard off the power limiter (not sure why Zotac limitted it to only +10% when the design could clearly handle more, maybe with another 8 pin). in time spy at really anything above 875mV. I can run those benchmarks at about 1850 @ 850mV, but games are another story.
> 
> Though I have wondered if it might be me using PCIe 3.0 vs 4.0. I think 4.0  allows for more power from the slot. I never see my card pull more then 50w from it.
> 
> ...



Oh, my 3090 bounces off any power budget I can throw at it alright. It can reach clocks as high as 2190MHz/1.025V when I use the XOC BIOS without a power limiter (500W load); on the stock 375W BIOS the clocks remain around 1950MHz/875mV average in Time Spy and heavy games (CP2077, Metro Exodus EE, etc.).

When the card is under water, I guess you can extract around 45MHz more out of it, so around 1950MHz.

Max power draw from the PCIe slot is around 64W on my 3090 when TDP is around 350W.


----------



## MxPhenom 216 (Jun 26, 2021)

Mussels said:


> sheeeeit booooooi, you're copying me


When you put your block on, did you use the thermal pads EK sends with it or did you use Gelids or something? EK pads have a terrible thermal conductivity rating.



nguyen said:


> Oh my 3090 bounces off any power budget I can throw at it alright , my 3090 can reach clocks as high as 2190mhz/1.025V when I use XOC BIOS without power limiter (500W load), at stock 375W BIOS the clocks remain around 1950mhz/875mV average in Timespy and load heavy games (CP2077, Metro Exodus EE, etc..).
> 
> When the card is under water, I guess you can extract around 45mhz outta it, so around 1950mhz.
> 
> Max power draw from PCIe slot to be around 64W on my 3090 when TDP is around 350W


Yeah, gaming isn't an issue for me yet, but benchmarks like Time Spy and Port Royal murder the card at anything above 0.875V. I basically have a gaming profile and a benching profile now.


----------



## Mussels (Jun 27, 2021)

MxPhenom 216 said:


> When you put your block on did you use the thermal pads EK sends with it or did you use Gelids or something? EK pads have a terrible thermal conductivity rating
> 
> 
> Yeah gaming isnt a issue for me yet but bench marks like time spy and port royal murder the card at anything above 0.875v. I basically have a gaming profile and a benching profile now.


I used EK's. I have better ones here, but I had a friggin nightmare with my stock cooler due to the compression being wrong and causing bad contact issues.

I mean, I went from 110C on the VRAM mining at stock to 80C with +1500MHz on the VRAM, so a 30C+ drop was good enough for EK's pads to stay.


----------



## MxPhenom 216 (Jun 27, 2021)

Mussels said:


> I used EK's, i have better here but had a friggin nightmare with my stock cooler due to compression being wrong and causing bad contact issues
> 
> I mean i went from 110C on the VRAM mining at stock to 80C with +1500MHz on the VRAM, so 30C+ drop was good enough for EK's pads to stay



I just ordered some Gelid Extreme pads, 1mm thick, to use instead. I just need to order the rest of the watercooling stuff and then a motherboard (hoping X570S boards start hitting US markets early July), and then I'll be doing the build.


----------



## Mussels (Jun 29, 2021)

So i bought 4 bottles of EK mystic fog, which clog up in the 3090 RE block 

EK offered to send me new bottles of coolant in other colours without the particles... should be 4 bottles in random colours

WEEEEEEE


----------



## nguyen (Jun 29, 2021)

Mussels said:


> So i bought 4 bottles of EK mystic fog, which clog up in the 3090 RE block
> 
> EK offered to send me new bottles of coolant in other colours without the particles... should be 4 bottles in random colours
> 
> WEEEEEEE



No fancy coolant for me, just distilled water. Colored coolant leaves stains that you have to take the WB apart to clean... just too much work.


----------



## MxPhenom 216 (Jun 29, 2021)

nguyen said:


> No fancy coolant for me, just distilled water, colored coolant leave stains that you have to take apart the WB to clean....just too much work.


Some dyes certainly do, but if you flush every 6 months it shouldn't be bad, and the staining on tubing doesn't really happen if you use hard tubing.

My current idea for a loop plan when i get mine put together:


----------



## FireFox (Jun 29, 2021)

MxPhenom 216 said:


> but if you flush every 6 months it shouldnt be bad,


Only if you really are a fancy-coolant fan is it worth it; otherwise, not many people are willing to flush their loops every 6 months, at least not me.



nguyen said:


> just too much work.


----------



## nguyen (Jun 29, 2021)

MxPhenom 216 said:


> Some dyes certainly do, but if you flush every 6 months it shouldnt be bad, and the staining on tubing doesnt really happen if you use hard tubing.
> 
> My current idea for a loop plan when i get mine put together:
> 
> View attachment 205865



That literally is my build; the pump/reservoir combo is the Bitspower Sedna 011D.





Built this in 2018 and it still looks timeless 3 years later; I just swapped my 2080 Ti for a 3090 and the 8700K for a 9900K.


----------



## Mussels (Jun 30, 2021)

You people planned your loops? I just slapped the hardware in, and did the hoses one by one in order until it looked nice

and then threw a poop emoji and an octopus in there


----------



## MxPhenom 216 (Jun 30, 2021)

Mussels said:


> You people planned your loops? I just slapped the hardware in, and did the hoses one by one in order until it looked nice
> 
> and then threw a poop emoji and an octopus in there



Oh yeah. I have a ton of drawings. I also was looking on builds.gg for idea for months now.


----------



## Mussels (Jul 8, 2021)

Just noticed that since the BIOS mod to enable REBAR, my 3090 seems to be acting like its got a 300W power limit, not the regular 350W?
Bios says Board power limit  Target: 350.0 W  Limit: 371.0 W - but i'm getting power limit issues at just 300W...
edit: Changed to the KFA2 3090 BIOS with a 390W power limit, now i can see peaks of 350-370W at stupid clock settings

I was hitting the power limit when mining just from the VRAM chewing it all up, with the GPU above 1500MHz which just... what? BIOS flash fixed.

So with the other BIOS (which i havent confirmed REBAR works yet) i see 303W (a whole 3W more!) but the core runs at 1605Mhz instead of 1500MHz with dips down to like 1200, at the same voltage. Weird.


----------



## nguyen (Jul 8, 2021)

Mussels said:


> Just noticed that since the BIOS mod to enable REBAR, my 3090 seems to be acting like its got a 300W power limit, not the regular 350W?
> Bios says Board power limit  Target: 350.0 W  Limit: 371.0 W - but i'm getting power limit issues at just 300W...
> edit: Changed to the KFA2 3090 BIOS with a 390W power limit, now i can see peaks of 350-370W at stupid clock settings
> 
> ...



Well, you can flash the Kingpin XOC BIOS to bypass the thermal and power limits; the sandwich WB can handle a 500W 3090 just fine, no doubt.


----------



## MxPhenom 216 (Jul 8, 2021)

I wish there was a better BIOS I could put on my 3080 Ti, but I don't think there is. I also think I'm limited by the two 8-pins.


----------



## nguyen (Jul 8, 2021)

MxPhenom 216 said:


> I wish there was a better BIOS i could put on my 3080Ti, but I dont think there is. I also think im limitted by the 2 8 pins.



The highest TDP BIOS for 3080Ti with 2 x 8 pins is the Zotac AMP one with 385W Max, so you can extract a little bit more clocks outta your 3080 Ti if you so wish, or does your 3080Ti already come with 385W TDP?


----------



## MxPhenom 216 (Jul 8, 2021)

nguyen said:


> The highest TDP BIOS for 3080Ti with 2 x 8 pins is the Zotac AMP one with 385W Max, so you can extract a little bit more clocks outta your 3080 Ti if you so wish, or does your 3080Ti already come with 385W TDP?



That's the card I have, but it bounces off the power limit even at +10% (the max) and I don't see it get much over 350W. Then again, I also haven't seen what it can pull at a voltage above 0.95V for an extended period, so that may be why the power isn't as high as it could be. I thought it might also be due to the card being in a PCIe 3.0 slot, since I only see it pull 40-50W from the slot when it could be 75W, right?


----------



## nguyen (Jul 8, 2021)

MxPhenom 216 said:


> That's the card I have, but it bounces off power limit even at +10% (max) and it dont see if get much over 350w, but then again I also havent seen waht it can pull when at a voltage above 0.95v for extended period of time so that maybe why the power isnt as high as it could be. I thought it might also be due to the card being in a PCIe 3.0 slot, since I only see it pull 40-50w from the slot when it could be 75w right?



Yeah when you remove the undervolt and let the card boost as high as it can, it will bounce off the max TDP all the time (and will probably pull 75W off the PCIe slot too)
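For context on the slot-power question, the usual budget math for a 2x8-pin card: the PCIe spec allows roughly 75W from an x16 slot and 150W per 8-pin connector. This is a sketch of the nominal allowances only (the function is my own, and cards rarely draw the full slot allowance):

```python
# Nominal power budget: 150 W per 8-pin connector + ~75 W from the
# x16 slot per the PCIe CEM allowances. Spec ceilings, not actual draw.
def board_budget_watts(eight_pin_count, slot_watts=75):
    return eight_pin_count * 150 + slot_watts

print(board_budget_watts(2))  # 375 -> a common stock ceiling for 2x8-pin cards
```

The slot allowance is the same for PCIe 3.0 and 4.0 slots, which is why the generation alone shouldn't change how much the card can pull from it.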


----------



## ShiBDiB (Jul 8, 2021)

New club member here, finally got my hands on a 3060Ti


----------



## Mussels (Jul 9, 2021)

nguyen said:


> well you can flash the Kingpin XOC BIOS to bypass thermal and power limit , the sandwich WB can handle 500W 3090 just fine no doubt.


It's a 2x8 card, so I need a BIOS for 2x8 power plugs, according to le research.
What I'm hitting is that the extra power of my VRAM at +1500 (it can go even higher!) is chewing up the TDP, leaving piss all for the core itself.

I moved to this one, being the same brand as mine:
VGA Bios Collection: GALAX RTX 3090 24 GB | TechPowerUp

Galax, 370-390W
Seems to be the best 2-plug BIOS I can find, and it has matching outputs, so I'm not missing a DP port or anything.


----------



## nguyen (Jul 9, 2021)

Mussels said:


> its a 2x8 card, so i need BIOS with 2x8 power plugs according to le research
> 
> What i'm hitting is the extra power of my VRAM at 1500+ (it can even go higher!) is chewing up the TDP, leaving piss all for the core itself.



Nah, my Asus TUF also has 2x8 power plugs and it works just fine with the Kingpin XOC BIOS, since there is no power limit on either 8-pin input.

I think you are hitting the maximum current allowable on the memory, since your 3090 only has a 3-phase VRM for the memory (200W on the memory is almost 50A through each phase at GDDR6X's 1.35V).
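The per-phase figure checks out arithmetically. A quick sketch using only the numbers in the post (200W, 1.35V GDDR6X, 3 phases):

```python
# Per-phase VRM current: total power / rail voltage / phase count.
# All three inputs (200 W, 1.35 V, 3 phases) are from the post above.
def amps_per_phase(watts, volts, phases):
    return watts / volts / phases

per_phase = amps_per_phase(200, 1.35, 3)
print(f"{per_phase:.1f} A per phase")  # ~49.4 A -> "almost 50 A"
```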


----------



## Mussels (Jul 9, 2021)

Perhaps, but the wattage on that BIOS is hoooge anyway; 390W is enough for me.
I'm diagnosing some black screen issues where on a reboot I'd have no display, but the system was working (Windows logon sounds, etc.)

It's either:
1. PCI-E riser
2. undervolted SoC
3. Curve undervolted CPU

Trying putting 2 and 3 closer to stock, and locking the riser to 3.0

Ah. well.


It could also be that my ARGB extension leads were burning.


----------



## MxPhenom 216 (Jul 14, 2021)

Does anyone know if the temperature of the RAM, or the hot spot, affects how the card clocks? I noticed that my card will throttle even if it's not at the power limit, but memory temperature is basically right at the temp limit. Core temperature is about 20C away from its limit.


----------



## Mussels (Jul 14, 2021)

MxPhenom 216 said:


> Does anyone know if temperature of RAM, or hot spot effects how the card clocks? I noticed that my card will throttle even if its not at the power limit, but memory temperature is basically right at the temp limit. Core temperature is about 20c away from the limit.


yeah, they have limits around the VRAM temp for sure - i know at 110C the fans ramp to 100% no matter what, as an example


----------



## the54thvoid (Jul 14, 2021)

I'm jealous. One of the guys I game with got an alert and bagged an RTX 3080 Founders for £650. He's a member here. 

You know who you are! I want pics.


----------



## MxPhenom 216 (Jul 14, 2021)

Mussels said:


> yeah, they have limits around the VRAM temp for sure - i know at 110C the fans ramp to 100% no matter what, as an example


It'll be interesting to see what getting this waterblock on the card does. Maybe it'll stop the throttling I see even when core temp and power aren't at their limits.

We got one bois! Time to finish my build


----------



## arni-gx (Aug 3, 2021)

MxPhenom 216 said:


> Does anyone know if temperature of RAM, or hot spot effects how the card clocks? I noticed that my card will throttle even if its not at the power limit, but memory temperature is basically right at the temp limit. Core temperature is about 20c away from the limit.





Mussels said:


> yeah, they have limits around the VRAM temp for sure - i know at 110C the fans ramp to 100% no matter what, as an example



Is there any option to show that VRAM temperature in real time on the OSD from MSI AB?


----------



## Mussels (Aug 3, 2021)

I got no idea about AB, go ask the devs to add it to AB/RTSS


----------



## nguyen (Aug 3, 2021)

arni-gx said:


> is there any option to show that vram temperature in real time, on OSD from MSI AB ??



You can import sensor data from HWinfo into Afterburner OSD


----------



## arni-gx (Aug 3, 2021)

nguyen said:


> You can import sensor data from HWinfo into Afterburner OSD
> 
> View attachment 211024View attachment 211025


Does that mean I must install the HWinfo app too? Because I can't see that option in my MSI AB...


----------



## nguyen (Aug 3, 2021)

arni-gx said:


> is that i must install hw info app too ?? because i can not see that option in my MSI AB.......



You can use the portable version of HWinfo; it will be started along with Afterburner when Windows starts.


----------



## oxrufiioxo (Aug 3, 2021)

Picked this up a couple weeks ago but forgot about this thread.


----------



## Mussels (Aug 3, 2021)

With stock PSU cables, mine's been rock solid again

slowly learning how to use the watercooling stuff like extenders and right angle fittings to tweak the aesthetics
(dusty glass ruined this shot partially, but still looks great)


----------



## MxPhenom 216 (Aug 4, 2021)

Mussels said:


> With stock PSU cables mines been rock solid again
> 
> *slowly learning how to use the watercooling stuff like extenders and right angle fittings to tweak the aesthetics*
> (dusty glass ruined this shot partially, but still looks great)
> View attachment 211147


Really? PSU cables affected it? EDIT: Nevermind, forgot your power cables were burning

Just run hard tubes. 5head

Meanwhile working on this lately:


----------



## Mussels (Aug 4, 2021)

hard tubes are a future goal, after i master soft


----------



## Cheese_On_tsaot (Aug 4, 2021)

What are people's thoughts on Nvidia's large driver overhead?


----------



## RealKGB (Aug 4, 2021)

I got myself a 30xx card, for less than MSRP too!


Spoiler






It's a Radeon 3000 iGPU. I don't own it, but it was a good opportunity for a joke, since it is a 30xx card!


----------



## MxPhenom 216 (Aug 4, 2021)

Cheese_On_tsaot said:


> What are peoples thoughts on Nvidia's large driver overhead?


I've seen people mention it. I thought it was the other way around: AMD had high overhead while Nvidia didn't. But maybe that was for DX11 titles several years ago, and now we're into DX12.


----------



## Cheese_On_tsaot (Aug 4, 2021)

MxPhenom 216 said:


> Ive seen people mention it. I thought it was the other way around. Amd had high overhead while nvidia didnt. But maybe that was for dx11 titles several years ago and now we are into DX12.


Plenty of people are complaining that their cards are not giving the performance they should, even in DX11 games like GTA V, but that goes against what you just mentioned.
I don't know either, I'm just wondering is all. I own a 5600X with an RTX 2060, there is zero bottlenecking from my CPU and I have no issues, but the complaints are coming from everything from a 2060S to a 3090...


----------



## oxrufiioxo (Aug 4, 2021)

Cheese_On_tsaot said:


> What are peoples thoughts on Nvidia's large driver overhead?



The CPU performance hit due to the lack of a hardware scheduler, I'm guessing, is what you're referring to. It doesn't really affect me on either of my primary machines, so I couldn't say.

This really isn't the thread for that, though; making your own thread about it makes more sense if you're actually curious.


----------



## mrthanhnguyen (Aug 4, 2021)

Don't have time to play with it, so I turned it into a mining rig: 2 FTW3s and 1 Kingpin Hydro Copper (the Optimus block will be installed when they ship it to me). It makes $600-$700/month. Halfway to breakeven.

When I'm bored, I also benchmark it. Currently #9 in PR.


----------



## R-T-B (Aug 4, 2021)

Cheese_On_tsaot said:


> What are peoples thoughts on Nvidia's large driver overhead?


I've tried both sides. AMD's DX11 overhead is 10x worse. Nvidia gets shit because the driver is multithreaded and thus will use your CPU more... to better benefit your fps ceiling.

So in short: what issue?



oxrufiioxo said:


> For cpu performances due no hardware scheduler I'm guessing you're referring to.


Nvidia GPUs have a hardware scheduler. He's referring to the "Nvidia uses more CPU" charts that have been making the rounds (ignoring the fact that a multithreaded DX11 driver is a good thing).



MxPhenom 216 said:


> Ive seen people mention it. I thought it was the other way around. Amd had high overhead while nvidia didnt. But maybe that was for dx11 titles several years ago and now we are into DX12.


It is; the AMD driver is just single-threaded, so it looks better in usage charts while performing worse.


----------



## nguyen (Aug 4, 2021)

R-T-B said:


> I've tried both sides.  AMDs DX11 overhead is 10x worse.  Nvidia gets shit because the driver is multithreaded and thus will use your cpu more...  to better benefit your fps ceiling.
> 
> So in short:  what issue?
> 
> ...



Well, some biased tech channels like Hardware Unboxed just like to attack Nvidia while glossing over AMD's shortfall with DX11.
HUB even uses games that run like shit in DX12 (particularly Unreal Engine 4 games) like Borderlands 3, Fortnite and BF:V (Frostbite) as their main talking points.
In short, HUB is looking at the 1% of gamers who play AAA games in DX12 on a crappy CPU and thinks it's Nvidia that has the problem; no wonder Nvidia wanted to cut ties with them.


----------



## the54thvoid (Aug 4, 2021)

Ampere thread guys. Generic Nvidia discussion can go in the Nvidia sub threads.

Thanks.


----------



## arni-gx (Aug 4, 2021)

nguyen said:


> You can import sensor data from HWinfo into Afterburner OSD
> 
> View attachment 211024View attachment 211025



Shared memory support has a 12-hour limit, but if you buy HWinfo it works 24/7 forever... well, skip that...

anyway.....









NVIDIA GeForce RTX 3090 Ampere Alone Has Higher GPU Share Than AMD's Entire Radeon RX 6000 RDNA 2 GPU Lineup (wccftech.com): NVIDIA's GeForce RTX 30 Ampere GPUs reign supreme over AMD's Radeon RX 6000 RDNA 2 GPUs according to Steam's latest hardware survey.

As usual, Nvidia still has a large fan base in the Steam hardware survey... not bad.


----------



## nguyen (Aug 4, 2021)

arni-gx said:


> share memory support has limit 12 hours, but if u buy that HWinfo, it will be forever 24x7 hours...... well, skip.........
> 
> anyway.....
> 
> ...



Is that from the latest version of HWinfo? You can just download an older version, where shared memory support is always there. 

The 3090 outsold the entire RX 6000 series? Now that's some news.


----------



## venturi (Aug 4, 2021)

On 3090s:

GravityMark Leaderboard (gravitymark.tellusim.com): GravityMark GPU Benchmark

*70,664* score


----------



## MxPhenom 216 (Aug 4, 2021)

nguyen said:


> Is that from the latest version of HWinfo? you can just download an older version and share memory support is always there.
> 
> 3090 outsold the entire RX6000 series? now that's some news.


I just don't think AMD is able to match the volume Nvidia has been shipping, even with the shortage. Every single AMD product right now is on TSMC 7nm, with wafer priority going to console SoCs.


----------



## FireFox (Aug 5, 2021)

arni-gx said:


> share memory support has limit 12 hours


Meaning that the imported sensor data from HWinfo into Afterburner will be disabled?


----------



## arni-gx (Aug 22, 2021)

FireFox said:


> Meaning that the imported sensor data from HWinfo into Afterburner will be disabled?



I don't know about that... but after installing, there is no option like that...


nguyen said:


> Is that from the latest version of HWinfo? you can just download an older version and share memory support is always there.
> 
> 3090 outsold the entire RX6000 series? now that's some news.



Where is the option for VRM temperature in MSI AB??

anyway.......

Suddenly there is no GPU core clock option in my MSI AB monitoring tab. Why is that, and how do I fix it?


----------



## FireFox (Aug 22, 2021)

arni-gx said:


> where is that option for VRM temperature on MSI AB ??





Follow his steps








The Ampere owners club - 30xx series (www.techpowerup.com): Just noticed that since the BIOS mod to enable REBAR, my 3090 seems to be acting like its got a 300W power limit, not the regular 350W? Bios says Board power limit  Target: 350.0 W  Limit: 371.0 W - but i'm getting power limit issues at just 300W... edit: Changed to the KFA2 3090 BIOS with a...
				






arni-gx said:


> share memory support has limit 12 hours


Maybe it's a bug, but for some unknown reason it doesn't reset even after a week


----------



## MxPhenom 216 (Aug 22, 2021)

Well, my rig is done:

Current temps idle right now on my 3080ti with EK block and backplate (not active). Used GELID extreme pads instead of the ones EK provides.





Load temps:
GPU: ~50-55c
Hot spot: 60c
Memory: 60c

This is roughly a 10c delta between water temp and card temps at load:

Idle, the delta is 0 between water and component.

Before at load I had memory and hotspot temps in the 80s and GPU (undervolted) would bounce around from 70-75c


----------



## FireFox (Aug 23, 2021)

MxPhenom 216 said:


> This is roughly a 10c delta between water temp and card temps at load:


Does that means that the water temp is 40c/45c?


----------



## MxPhenom 216 (Aug 23, 2021)

FireFox said:


> Does that means that the water temp is 40c/45c?


Roughly, yes. A little hotter than I'd like, but I can't really fix that with the current fan/rad setup in my case. It becomes a bit of a hot box in this setup, it seems. It would probably be a lot better if I had fans on the side as intake, but that space is occupied by my EK FLT res.

I think I can keep it under 40c, actually. Been playing more with fan curves.


----------



## FireFox (Aug 23, 2021)

The load temps you posted above were from doing what, and what was the ambient temp? 
Considering that you are using a backplate, I expected to see a lower memory temp.


----------



## MxPhenom 216 (Aug 23, 2021)

FireFox said:


> Those load temps you posted above was doing what, and what was the ambient temps?
> Considering that you are using a Backplate i expected to see lower memory temp.


Ambient temp is about 25c.

It's not the active backplate, and there are no RAM chips on the back side of the PCB on a 3080 Ti.

I was able to keep the coolant temps closer to 40c; they bounced from 39-41c yesterday with slightly more aggressive fan curve settings. I tweaked it again; we'll see if that'll keep it under 40c now.


----------



## arni-gx (Aug 23, 2021)

nguyen said:


> You can import sensor data from HWinfo into Afterburner OSD
> 
> View attachment 211024View attachment 211025



There is no option like that in my MSI AB, even with HWinfo running...


----------



## FireFox (Aug 23, 2021)

arni-gx said:


> there is no option like that with MSI AB and HWinfo is running....


Oh boy, you again 


In HWiNFO, tick the box next to shared memory support.
After that, let's move to *MSI AB*:

1. Open *MSI AB*, go to settings, then monitoring, and *click step 1*; a new window will appear *(Hardware monitoring plugins)*.
2. Tick *HWiNFO.dll*. When done, *click step 3 (setup)*; a new window will appear *(HWiNFO plugin settings)*.
3. From there, look for *GPU Memory Junction* and click *add*.









After you have followed all the steps, go to *MSI AB*, open settings, then monitoring, and you should be able to find *GPU Memory Junction*.






My English is crap; I did my best.


----------



## arni-gx (Aug 24, 2021)

FireFox said:


> Oh boy, you again
> 
> 
> In HWiNFO tick the box next to shared memory support.
> ...



Sorry, I can't find it in my MSI AB and HWinfo...


----------



## FireFox (Aug 24, 2021)

MxPhenom 216 said:


> Ambient temp is about 25c
> I was able to keep the coolant temps closer to 40c. bounced from 39-41c yesterday with a little more aggressive fan curve settings. I tweaked it again. see if thatll keep it under 40c now.


15c delta between water and ambient temp isn't that bad after all + GPU temps under load are fine.



arni-gx said:


> sorry, i cant find it in my MSI AB and HWinfo......


What exactly is it that you can't find, and what versions of MSI AB and HWinfo are you using?

Scroll down the page; there's a video that might help.








MSI Afterburner gets plugin support - VideoCardz.com (videocardz.com): Yesterday Alexey Nicolaychuk updated MSI Afterburner with tons of new features. The MSI Afterburner 4.4.0 Beta 17 delivers support for external hardware monitoring plugins, such as AIDA64 or HWINFO. This is Afterburner going back to its Rivatuner roots.


----------



## FireFox (Aug 26, 2021)

Nvidia RTX 30-Series Prices Are Falling Faster Than AMD RX 6000 GPUs in Germany (www.tomshardware.com): RTX cards get more affordable, at least in Germany.


----------



## freeagent (Aug 26, 2021)

Yes indeed. My 3070 Ti is still cheaper than every 3060 on newegg.ca, and I had to buy a PSU with mine too. The 6700 XT, which is what I was looking at 3 days ago, is also much more expensive. Newegg scalps almost as hard as their 3rd party sellers. Brutal.


----------



## oxrufiioxo (Aug 26, 2021)

FireFox said:


> Nvidia RTX 30-Series Prices Are Falling Faster Than AMD RX 6000 GPUs in Germany​
> 
> 
> 
> ...



What I find odd is that RDNA2 cards are more expensive while being much more available from legit non-scalper sources. If I could have sourced a 6900 XT for less than my 3080 Ti I would likely have grabbed it, but most cost at least 200 USD more, with the models I would actually want costing 600 USD more pre-tax. 

Also, Asus, MSI and Gigabyte are currently charging quite a bit more than EVGA. I'm guessing Nvidia is giving EVGA a bigger allotment.


----------



## wolf (Oct 7, 2021)

Been back on the mining bandwagon coming out of winter into spring here in Aus, getting back in at a time when my solar power is on before I leave for work and after I get home. I'm fairly strict about only mining on my own excess solar power.

I'm getting either ~95MH/s @ 71% power target with +750 mem, with max GDDR6X junction temps of about 86-88c, but ambient temps are still low; at the height of summer, even with a/c, that will be in the mid 90s I'd wager.

The other profile is ~80MH/s @ 65% power target with -250 mem, with max GDDR6X junction temps of about 80-82c, so a saving of about ~6c on memory temp; that's the only reason I have this profile. It's there more as a backup for hotter days, to really ensure the memory isn't cooking.

I definitely prefer getting 95MH/s, but I also want this card to run for years and years, so we'll see how I go over the height of summer.
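Out of curiosity, the trade-off between those two profiles can be sketched as hashrate per watt. The wattages below are estimated from an assumed 350 W reference board power scaled by the power-target percentage; actual draw per card will differ:

```python
# Compare the two mining profiles by efficiency (MH/s per watt).
# Wattages assume a 350 W reference board power; real draw varies per card.
REFERENCE_TDP_W = 350

profiles = {
    "95 MH/s @ 71% PL": (95, 0.71 * REFERENCE_TDP_W),
    "80 MH/s @ 65% PL": (80, 0.65 * REFERENCE_TDP_W),
}

efficiency = {name: mhs / watts for name, (mhs, watts) in profiles.items()}
for name, (mhs, watts) in profiles.items():
    print(f"{name}: ~{watts:.0f} W, {efficiency[name]:.3f} MH/s per W")
```

Interestingly, under these assumptions the faster profile is also the slightly more efficient one per watt, so the lower profile's real value is the cooler memory, exactly as described above.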


----------



## freeagent (Oct 7, 2021)

After spending just over a month with my new card, I am pretty impressed with it. It's actually pretty quiet, runs cool, and clocks alright. I think my 980 Classified was louder. It's not a 3080, but it was in stock, and I did have to buy a PSU with it as a bundle - an EVGA G1 750 I think... the PSU feels like junk. Whatever... at least I was able to retire my GTX 580. There are many similarities in power at the wall between running Fermi and Ampere. I would have gotten a stronger PSU last November when I built my rig, but PSUs were just not available.


----------



## Hyderz (Oct 7, 2021)

freeagent said:


> After spending just over a month with my new card, I am pretty impressed with it. Its actually pretty quiet, runs cool, clocks alright. I think my 980 Classified was louder. Its not a 3080, but it was in stock, and I did have to buy a PSU with it as a bundle. An EVGA G1 750 I think.. the PSU feels like junk. Whatever.. at least I was able to retire my GTX 580. Many similarities when looking at power at the wall when running Fermi and then Ampere.. I would have gotten a stronger PSU last November when I built my rig, but PSU's were just not available.



cool! enjoy your new gpu


----------



## nguyen (Oct 7, 2021)

freeagent said:


> After spending just over a month with my new card, I am pretty impressed with it. Its actually pretty quiet, runs cool, clocks alright. I think my 980 Classified was louder. Its not a 3080, but it was in stock, and I did have to buy a PSU with it as a bundle. An EVGA G1 750 I think.. the PSU feels like junk. Whatever.. at least I was able to retire my GTX 580. Many similarities when looking at power at the wall when running Fermi and then Ampere.. I would have gotten a stronger PSU last November when I built my rig, but PSU's were just not available.



Time to check out Cyberpunk 2077 then


----------



## cst1992 (Oct 7, 2021)

freeagent said:


> After spending just over a month with my new card, I am pretty impressed with it. Its actually pretty quiet, runs cool, clocks alright. I think my 980 Classified was louder. Its not a 3080, but it was in stock, and I did have to buy a PSU with it as a bundle. An EVGA G1 750 I think.. the PSU feels like junk. Whatever.. at least I was able to retire my GTX 580. Many similarities when looking at power at the wall when running Fermi and then Ampere.. I would have gotten a stronger PSU last November when I built my rig, but PSU's were just not available.


From a 580 to a 3070 Ti?! Must have felt like a catapult.

I was actually very impressed when I jumped from my 970 to my 3060 Ti. The 970 was starting to bottleneck in GTA V (I game at 1080p), so I thought the time was right - why not get a 3060 Ti? (Funny thing is, my cousin was going to get a prebuilt AMD-based rig with a 3700X and a 5700 XT, so I was explaining to him how he should rather get a 3060 Ti - it's like looking for a match for your brother but falling in love with her yourself!)


----------



## Simpleris (Oct 7, 2021)

So I have a RTX 3090 paired with an i9 10900k with a 1KW power supply and my mobo has a slot for an additional RTX 3090 (for deep learning purposes), would it be a good idea to add one more? I've read that the Super will not have SLI.


----------



## cst1992 (Oct 7, 2021)

Simpleris said:


> my mobo has a slot for an additional RTX 3090, would it be a good idea to add one more?


No. Just, no.


----------



## R-T-B (Oct 7, 2021)

SLI support on even the 3090 is nearly nonexistent. I think it only works in benchmarks.

SLI is dead.


----------



## freeagent (Oct 7, 2021)

cst1992 said:


> From a 580 to a 3070Ti?! Must have felt like a catapult.


Lol no I had a 980, but my sons rig had the 580, now he has my 980 




nguyen said:


> Time to check out Cyberpunk 2077 then


I own it and it looks great


----------



## Mussels (Oct 7, 2021)

Simpleris said:


> So I have a RTX 3090 paired with an i9 10900k with a 1KW power supply and my mobo has a slot for an additional RTX 3090 (for deep learning purposes), would it be a good idea to add one more? I've read that the Super will not have SLI.


Why even ask?
You know it has no game support already

If you have a use that isn't gaming, you already know the answer yourself


----------



## arni-gx (Oct 8, 2021)

Simpleris said:


> So I have a RTX 3090 paired with an i9 10900k with a 1KW power supply and my mobo has a slot for an additional RTX 3090 (for deep learning purposes), would it be a good idea to add one more? I've read that the Super will not have SLI.











A pure 1kW PSU is not enough... you should use a pure 1600W PSU for dual RTX 3090 24GB...
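A rough back-of-envelope check supports that. All wattages below are assumed nameplate/typical figures (two 3090s at stock 350 W board power, a heavily loaded 10900K, plus system overhead), not measurements:

```python
# Back-of-envelope PSU sizing for dual RTX 3090 + i9-10900K.
# All wattages are assumed typical figures, not measurements.
GPU_BOARD_POWER_W = 350      # stock power limit per RTX 3090
NUM_GPUS = 2
CPU_LOAD_W = 250             # i9-10900K under heavy sustained load
REST_OF_SYSTEM_W = 100       # motherboard, RAM, drives, fans, pump

sustained_w = GPU_BOARD_POWER_W * NUM_GPUS + CPU_LOAD_W + REST_OF_SYSTEM_W

# Ampere cards are known for brief transient spikes well above board power,
# which is one reason oversized PSUs get recommended for multi-GPU builds.
TRANSIENT_FACTOR = 1.5
spike_w = GPU_BOARD_POWER_W * NUM_GPUS * TRANSIENT_FACTOR + CPU_LOAD_W + REST_OF_SYSTEM_W

print(f"Sustained draw: ~{sustained_w} W")
print(f"Worst-case transient: ~{spike_w:.0f} W")
```

Sustained draw alone already exceeds a 1 kW unit, before transients are even considered.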


----------



## ThaiTaffy (Oct 20, 2021)

I know little about the M1 chip, but is this even a real question?


----------



## cst1992 (Oct 20, 2021)

Sure, if the clickbait gives you enough visitors.


----------



## ThaiTaffy (Oct 20, 2021)

I laughed and carried on reading news I was interested in


----------



## oxrufiioxo (Nov 7, 2021)

This was pretty hilarious to mess around with..... I bet there are people out there with a similar setup hella bottlenecked 









I scored 13 778 in Time Spy (www.3dmark.com): AMD Ryzen 5 2600, NVIDIA GeForce RTX 3090 x 1, 16384 MB, 64-bit Windows 10


----------



## kiddagoat (Nov 29, 2021)

I picked up an ASUS ROG Strix 3080 Ti when I took my Liquid Devil 6900 XT back.  Put a Bitspower block on it and all seems good.  Not going over 60C and keeping a boost between 1995-2010MHz.  The memory doesn't go over 70C with the passive backplate.  I can't complain.


----------



## nguyen (Nov 29, 2021)

kiddagoat said:


> I picked up an ASUS ROG Strix 3080Ti when I took back my Liquid Devil 6900XT back.  Put a Bitspower block on it and seems all good.  Not going over 60C and keeping a boost between 1995-2010Mhz.  The memory doesn't go over 70C with the passive backplate.  I can't complain.



60C is a bit high for water cooling; can you check the hotspot temp? A normal hotspot temp will be 6-14C higher than the GPU temp.


----------



## kiddagoat (Nov 29, 2021)

nguyen said:


> 60C is a bit high for watercooling, can you check the Hotspot temp? normal hotspot temp will be 6-14C higher than GPU temp



Yeah sorry, I didn't clarify: 60C is the hotspot temp.  The edge temp sometimes gets up to 53C.  I have only noticed it getting around there while playing Halo Infinite or when running 3DMark/Unigine Superposition for testing purposes.  It doesn't stay there for long - more like a blip on the chart, then it goes back down. 

Significantly better than the 6900 XT, which would hit 90C on the hotspot and up to 68C on the edge.  I definitely feel that was a warped water block and an outlier compared to what was typical with those cards.


----------



## Chomiq (Dec 21, 2021)

The 3080 Ti is coming tomorrow unless UPS messes up the delivery. Any FE owners with a Seasonic PSU tried to contact them and request the 12-pin cable? I wonder if they still offer it.

Edit.
Make that today - that's one hell of an express shipping:


----------



## Chomiq (Dec 23, 2021)

In case anyone's wondering, Seasonic no longer offers the free cable for FE, your options are now:
- Order directly from Seasonic Taiwan: $22.50

- For US: https://www.btosinte.com/Seasonic-T...ls-Micro-Fit-30-16AWG-VGA-SS-MF12P-9A-VGA.htm $10 + shipping (sadly they don't ship to EU, at least I bounced back with "incorrect shipping location" when I tried to order via PayPal)

- For EU:


> If you need our Seasonic 12P cable for NVIDIA RTX FE, you can contact our German partner Sander Computer to order it. They are in charge to sell our original cables for the whole EU. Please provide them the following info:
> 
> Exact model of your PSU.
> Type of cable (Seasonic 12P) and quantity you need.
> ...


We'll see how much they ask for it.
Update:
€19.90 with shipping, ordered.

Part number is:
SS-MF12P-9A-VGA


----------



## nguyen (Dec 23, 2021)

Chomiq said:


> 3080 Ti is coming tomorrow unless UPS messes up delivery. Any FE owners with Seasonic PSU tried to contact them and request the 12-pin cable? I wonder if they still offer it.
> 
> Edit.
> Make this today, that's one hell of express shipping:


Sweet, fire up CP2077 to commemorate you joining the RTX gang


----------



## arni-gx (Dec 24, 2021)

nguyen said:


> Sweet, fire up CP2077 to commemorate you joining the RTX gang



I barely remember it... I'm still stuck in chapter 2 of that RPG...


----------



## nguyen (Jan 9, 2022)

Seems like my idea of a copper shim for GDDR6X was solid - someone made a custom copper shim for his 3080 Ti.


----------



## Mussels (Jan 10, 2022)

It's like custom water, without the water


----------



## arni-gx (Jan 11, 2022)

https://www.nvidia.com/en-us/geforce/forums/geforce-graphics-cards/5/404418/rtx-3090-and-3080-displayport-flicker-issue-and-wo/
		


That is just one of a few examples of Nvidia RTX Ampere having issues... especially with 144Hz LCD monitors over a DP cable... is that true?


----------



## bug (Jan 11, 2022)

arni-gx said:


> https://www.nvidia.com/en-us/geforce/forums/geforce-graphics-cards/5/404418/rtx-3090-and-3080-displayport-flicker-issue-and-wo/
> 
> 
> 
> that is just some few example that nvidia RTX ampere has issue with it..... especially with LCD monitor 144hz and using DP cable..... is that true ??


Not that I know of. Flickering is usually caused by faulty cables or trying to use G-Sync with non-certified monitors (Nvidia is more strict about VRR than AMD).


----------



## Chomiq (Jan 13, 2022)

bug said:


> Not that I know of. Flickering is usually caused by faulty cables or trying to use G-Sync with non-certified monitors (Nvidia is more strict about VRR than AMD).


And it can happen on AMD if you use cheap/poor quality cables:


----------



## arni-gx (Jan 14, 2022)

bug said:


> Not that I know of. Flickering is usually caused by faulty cables or trying to use G-Sync with non-certified monitors (Nvidia is more strict about VRR than AMD).



Hmm, I doubt those bugs only come from faulty cables or from using a G-Sync Compatible monitor, because on the official Nvidia forum many RTX Ampere users say the problem appeared after updating to a new Nvidia driver...


----------



## Chomiq (Jan 14, 2022)

arni-gx said:


> hmm, i am doubt that bugs its only coming from fault cables and trying use G-sync compatible monitor, because on nvidia official forum, many nvidia rtx ampere user has say, that problem is coming after updating their new nvidia driver vga.....


I'm running mine via DP with a g-sync compatible display. Granted it's FHD 144 Hz but still - no issues.


----------



## arni-gx (Jan 14, 2022)

Chomiq said:


> I'm running mine via DP with a g-sync compatible display. Granted it's FHD 144 Hz but still - no issues.



Hmm, anyone else having this problem?

Btw...









Download: NVIDIA GeForce 511.23 WHQL driver download (www.guru3d.com): This new Game Ready Driver provides the best day-0 gaming experience for God of War, which utilizes NVIDIA DLSS to maximize performance and NVIDIA Reflex to minimize latency.

That driver, and this driver:

NVIDIA Launches GeForce 511.17 WHQL Drivers (www.techpowerup.com): NVIDIA today released the latest version of GeForce driver software. Version 511.17 debuts the 510-series version numbering for the software. This driver introduces support for the GeForce RTX 3080 12 GB graphics card the company stealthily launched today.

They never come up in the sticky thread here:

https://www.nvidia.com/en-us/geforce/forums/game-ready-drivers/13/

Why?


----------



## VeqIR (Jan 26, 2022)

EVGA cards: how is the stock TIM? Has anyone compared, to see if it makes sense to swap the pre-applied paste right away for a known well-performing one? Using the stock cooler. I've got an EVGA 3060 XC that I guess I'll keep; wondering if I should upgrade the TIM or if there's no point in bothering.


----------



## oxrufiioxo (Jan 26, 2022)

VeqIR said:


> EVGA cards—how us the stock TIM, anyone compared to see if it makes sense to swap the preapplied paste right away to a known well-performing one?  Using the stock cooler.  I’ve got an EVGA 3060 XC that I guess I’ll keep, wondering if I should upgrade the TIM or no point in bothering.



On my FTW3 3080 Ti the stock TIM seems excellent, with only about a 5c variance between the edge and hotspot temperatures: 65-70c edge vs 70-75c hotspot. Memory temps are excellent as well. 

Assuming you don't have a massive temp delta on any of these 3 temperatures, I wouldn't mess with it.


----------



## R-T-B (Jan 27, 2022)

EVGA uses Shin-Etsu TIM according to a rep.  As long as they apply it right at the factory (they have messed up in the past), it is actually not a bad TIM and has good longevity.

My 3070 FTW3 has never really run very hot, and the thermal difference between edge and hotspot is also pretty low, generally never exceeding 5C.


----------



## freeagent (Jan 27, 2022)

I see about 10c between the core and hotspot. I have not pulled the card apart yet.. but I kind of want to


----------



## R-T-B (Jan 27, 2022)

freeagent said:


> I see about 10c between the core and hotspot. I have not pulled the card apart yet.. but I kind of want to


I mean, that's totally legit too.  At least with EVGA, their warranty covers thermal paste replacement with no fuss...  like it should be.


----------



## freeagent (Jan 27, 2022)

R-T-B said:


> At least with EVGA their warranty cover thermal paste replacement with no fuss... like it should be.


Excellent.. just need a few tools for disassembly and I should be good to go then.. oh and some pads. Just gotta figure out how thick they are.


----------



## looniam (Jan 28, 2022)

VeqIR said:


> EVGA cards—how us the stock TIM, anyone compared to see if it makes sense to swap the preapplied paste right away to a known well-performing one?  Using the stock cooler.  I’ve got an EVGA 3060 XC that I guess I’ll keep, wondering if I should upgrade the TIM or no point in bothering.


if you haven't found out already:

firestrike extreme - out of the box:






aggressive fan curve and max power/temp limit:




had it since july, never got around to repasting cuz i have a uniblock that would fit and if i take that card apart . . . crazy things can happen . .


----------



## critofur (Feb 5, 2022)

nguyen said:


> Also a superior option to thermal pads would be these copper shim


What dimensions?  20x20x2mm?  

I suppose, to be safe, after putting the shims onto the RAM chips (with some thermal paste in between) we should use silicone rubber caulk around the edges to hold them in place and prevent them from sliding off and possibly touching another part of the PCB/components, causing an electrical short?


----------



## Mussels (Feb 6, 2022)

One day i'll man up and use copper shims

The conductivity scares me, but the concept is golden


----------



## nguyen (Feb 6, 2022)

critofur said:


> What dimensions?  20x20x2mm?
> 
> I suppose, to be safe, after putting the shims onto the RAM chips (w/some thermal paste in between) we should use silicon rubber caulk around the edges to hold them in place and prevent them from sliding off and possibly touching another part of the PCB/components and causing an electrical short?



You can order a custom shim from coolmygpu.

Otherwise you can use 20x20x2mm copper shims too; to be safe, you can put some masking tape on the PCB surrounding the VRAM.


----------



## Mussels (Feb 6, 2022)

nguyen said:


> You can order a custom shim from coolmygpu
> 
> othwise you can use 20x20x2mm copper shims too, i suppose to be safe you can put some masking tape on the pcb surrounding the VRAM.


Those prices hurt me inside


----------



## nguyen (Feb 6, 2022)

Mussels said:


> Those prices hurt me inside



These shims last forever; worth it when people are mining 24/7 on these GDDR6X Ampere cards.


----------



## arni-gx (Feb 6, 2022)

This game engine is so weird...



Spoiler: hitman 2 by steam













under 60 fps....










it can do 144 fps.....


----------



## heky (Feb 6, 2022)

Sooo... it's been quite a long run with my 1080 Ti. I finally bit the bullet and bought myself a 3080 Ti.
Went for the MSI Suprim X. Hope it lasts for a couple of years...


----------



## nguyen (Feb 6, 2022)

heky said:


> Sooo...its been quite a long run with my 1080ti. Finally bit the bullet and bought myself a 3080ti.
> Went for the MSI Suprim X. Hope it lasts for a couple of years...



Just in time for some sweet 4K Ultra RT 60FPS in Dying Light 2 (with DLSS Balanced)


----------



## Mussels (Feb 6, 2022)

nguyen said:


> these shims last forever, worth it when people are mining 24/7 on these Gddr6X Ampere .


You aren't wrong; the concept of using them "forever" with a waterblock on my 3090 does appeal to me.

It's just that the price of that is a week's rent once we throw dollar conversion and shipping on top.


----------



## Upgrayedd (Feb 6, 2022)

freeagent said:


> Excellent.. just need a few tools for disassembly and I should be good to go then.. oh and some pads. Just gotta figure out how thick they are.


On FTW3-style cards I use 1mm on the VRM and 2mm on the VRAM, front and back. Went from mid 90s on VRAM while mining to low 80s/high 70s with just thermal pads and an aggressive fan curve.
EVGA uses a putty on the VRM to help with its unevenness, but I've seen a lot of people use 1mm and be fine, including myself. 


VeqIR said:


> EVGA cards—how us the stock TIM, anyone compared to see if it makes sense to swap the preapplied paste right away to a known well-performing one?  Using the stock cooler.  I’ve got an EVGA 3060 XC that I guess I’ll keep, wondering if I should upgrade the TIM or no point in bothering.


For standard GDDR6 cards I wouldn't bother unless the card is throttling or the core is abnormally high. Fresh air and properly aggressive fans will do more for a card than new pads and paste; I've seen cards hit the same high temps after a repad because they still weren't getting good air.


----------



## xenosys (Mar 9, 2022)

heky said:


> Sooo...its been quite a long run with my 1080ti. Finally bit the bullet and bought myself a 3080ti.
> Went for the MSI Suprim X. Hope it lasts for a couple of years...



I'm the same.  Just bought a 3080ti Suprim X.  Will be putting it under water as well, so hoping for some good clocks and temps.  How's yours been so far?


----------



## heky (Mar 9, 2022)

xenosys said:


> I'm the same.  Just bought a 3080ti Suprim X.  Will be putting it under water as well, so hoping for some good clocks and temps.  How's yours been so far?


Nice. Congrats on the purchase. Haven't really had much time for proper gaming sessions, but in benchmarks it's fast!!!  Under load it gets up to 74°C (80°C hotspot) on the core, 84°C on the memory. I get 19183 points in 3DMark Time Spy on stock settings, which I think is not bad. And the card is not loud at all...









I scored 19 183 in Time Spy (www.3dmark.com): AMD Ryzen 9 5900X, NVIDIA GeForce RTX 3080 Ti x 1, 32768 MB, 64-bit Windows 11
				




I am also thinking of putting it under water, like I had my 1080 Ti, but at the moment there just isn't enough time for such projects. (I am the proud father of an almost 10-month-old son.)


----------



## xenosys (Mar 9, 2022)

heky said:


> Nice. Congrats on the purchase. Haven't really had much time for proper gaming sessions, but in Benchmarks its Fast!!!  Under load it gets up to 74°C (80°C Hotspot) on the core, 84°C on the memory. I get 19183 Points in 3DMark Time Spy on stock settings, which i think is not bad. And the Card is not loud at all...
> 
> 
> 
> ...



Those are some impressive results.  I'll share mine when I get it in the mail later this week.  Thanks for the update ... and yes, kids should always come first.


----------



## Mussels (Mar 10, 2022)

Guys, crossflashing advice

I'm getting a bug where if i have two monitors connected (even if #2 is disabled and powered off) the first monitors backlight never shuts down when its meant to be sleeping.

Just updated from my old BIOS on 94.02.26.C0.35 to this similar one
VGA Bios Collection: GALAX RTX 3090 24 GB | TechPowerUp on 94.02.26.C0.3F

let's see if it helps


----------



## Mussels (Mar 12, 2022)

Uhh... yes. it worked.


I solved my never ending 'monitor sleeps but backlight stays on' issue by crossflashing my 3090. Huh.


----------



## arni-gx (Mar 15, 2022)

Over the last couple of days, the average price for almost all RTX 3000 series cards in my country has decreased, even though it hasn't reached MSRP...


----------



## wolf (Mar 15, 2022)

arni-gx said:


> for the couple last days..........the price for almost all rtx 3000 series on my country, the average price has decreased, even though it hasn't reached the MSRP......


I wonder, will it fall to ~MSRP before 40 series launch, then go right back up for similar reasons? mostly crazy demand, assuming many people have reached this point and decided to hold off.


----------



## oxrufiioxo (Mar 15, 2022)

wolf said:


> I wonder, will it fall to ~MSRP before 40 series launch, then go right back up for similar reasons? mostly crazy demand, assuming many people have reached this point and decided to hold off.



RTX 3080 10G cards are still 1200-1400 USD; I'm doubtful any models will approach MSRP even when the 4000 series launches. My guess is Nvidia will cut back manufacturing the minute sales slow.


----------



## arni-gx (Mar 15, 2022)

wolf said:


> I wonder, will it fall to ~MSRP before 40 series launch, then go right back up for similar reasons? mostly crazy demand, assuming many people have reached this point and decided to hold off.





oxrufiioxo said:


> RTX 3080 10G cards are still $1200-1400, so I'm doubtful any models will approach MSRP even when the 4000 series launches. My guess is Nvidia will cut back manufacturing the minute sales slow.



Yup. The average price for the RTX 3080 10GB in my country today is between USD 1400-1600, and the RTX 3080 Ti 12GB is still USD 2000-2300. Prices for everything from the RTX 3050 8GB to the RTX 3070 Ti 8GB have fallen, but not by much.


----------



## Mussels (Mar 20, 2022)

Okay, the monitor backlight staying on issue:

It depends what's on the screen.

Edge, Steam etc. keep the backlight on. Everything minimised, or heaven forbid a forgotten game running? No problem, it goes black.


Edit: I'll forget if I don't post it here: I'm trying to disable "USB selective suspend", as there is a theory that Windows suspends the hotpluggable DP monitor rather than sleeping it.


----------



## nguyen (Mar 20, 2022)

Mussels said:


> Okay, the monitor backlight staying on issue:
> 
> It depends what's on the screen.
> 
> ...



I just turn the OLED TV off every time I leave the room.

Joking aside, I use nircmd with a shortcut to turn off the monitor; see if that completely turns off your backlight.


----------



## chrcoluk (Mar 20, 2022)

A poll was done on a UK forum to see the split of ownership between AIB and FE cards, and the result was that roughly 2/3 of 3000 series owners have an FE card and only 1/3 an AIB.  I wonder if the same poll could be done here?  It seems that in the UK, FE availability has made massive inroads into AIB sales to gamers.


----------



## Mussels (Mar 20, 2022)

chrcoluk said:


> A poll was done on a UK forum to see the split of ownership between AIB and FE cards, and the result was that roughly 2/3 of 3000 series owners have an FE card and only 1/3 an AIB.  I wonder if the same poll could be done here?  It seems that in the UK, FE availability has made massive inroads into AIB sales to gamers.


That's gunna vary by region, guess what the FE ownership is in Au?


----------



## R-T-B (Mar 20, 2022)

Mussels said:


> That's gunna vary by region, guess what the FE ownership is in Au?


You?


----------



## Mussels (Mar 20, 2022)

R-T-B said:


> You?


no. we didnt get them here.


----------



## fevgatos (Mar 20, 2022)

heky said:


> Nice. Congrats on the purchase. Haven't really had much time for proper gaming sessions, but in Benchmarks its Fast!!!  Under load it gets up to 74°C (80°C Hotspot) on the core, 84°C on the memory. I get 19183 Points in 3DMark Time Spy on stock settings, which i think is not bad. And the Card is not loud at all...
> 
> 
> 
> ...


Here is mine with Palit Gamerock 3090. Have a higher one (around 23200) but it's with a custom 550w bios

NVIDIA GeForce RTX 3090 video card benchmark result - Intel Core i9-10900K Processor,Micro-Star International Co., Ltd. MEG Z490 ACE (MS-7C71) (3dmark.com)


----------



## heky (Mar 20, 2022)

fevgatos said:


> Here is mine with Palit Gamerock 3090. Have a higher one (around 23200) but it's with a custom 550w bios
> 
> NVIDIA GeForce RTX 3090 video card benchmark result - Intel Core i9-10900K Processor,Micro-Star International Co., Ltd. MEG Z490 ACE (MS-7C71) (3dmark.com)


Cool, you ran yours overclocked though, mine was stock. Will try some overclocking when I have some time to...


----------



## droopyRO (Mar 20, 2022)

Sold my 1070 Ti and RX 6600 XT and got an RTX 3060 Ti; don't know if I will keep it though. This is what I achieved with undervolting: 1925MHz at 0.875V. Is it "good"? How are your cards performing with UV?


----------



## freeagent (Mar 20, 2022)

droopyRO said:


> how are your cards performing with UV


I haven't undervolted mine because I don't have problems with temps. But it does 1980 core on its own..


----------



## nguyen (Mar 20, 2022)

droopyRO said:


> Sold my 1070 Ti and RX 6600 XT and got an RTX 3060 Ti; don't know if I will keep it though. This is what I achieved with undervolting: 1925MHz at 0.875V. Is it "good"? How are your cards performing with UV?
> View attachment 240586



You can undervolt+overclock like this so you don't have to drag each v/f point. 









1920MHz/875mV is a pretty good bin for a 3060 Ti; my 3090 (which is a very good sample with 2190MHz max clocks) is stable at 1920MHz/850mV.


----------



## droopyRO (Mar 20, 2022)

freeagent said:


> I haven't undervolted mine because I don't have problems with temps. But it does 1980 core on its own..


I undervolt all my cards; I don't have to have a problem with a card to UV it. You get lower power consumption, lower case and room temperatures, and you prolong the life of the card. Win-win. Same with CPUs. My 12600K and RTX 3060 Ti combo went from about 370W at the wall socket to 310W with UV.
This Gigabyte stays at about 62ºC at 1800 rpm when playing DCS World, with a room temperature of about 25ºC.
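To put that 60W at the wall in perspective, here's a quick back-of-the-envelope calculation (the 4 hours/day of gaming is just an assumption; plug in your own number):

```python
# Rough yearly energy saved by the undervolt, from the wall measurements above.
watts_saved = 370 - 310       # W saved at the wall socket
hours_per_day = 4             # assumed gaming time per day
kwh_per_year = watts_saved * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh saved per year")  # 87.6 kWh
```

Not huge money, but it's free, and the lower heat dumped into the room is the real win in summer.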


----------



## R-T-B (Mar 20, 2022)

Mussels said:


> no. we didnt get them here.


Could've sworn you said somewhere yours was a founders.  Oh well...  Zero is the obvious answer then!


----------



## Mussels (Mar 21, 2022)

R-T-B said:


> Could've sworn you said somewhere yours was a founders.  Oh well...  Zero is the obvious answer then!


reference, aka RE

The better poll is did people get FE, RE, or custom PCB


----------



## arni-gx (Mar 29, 2022)

Today the prices of RTX 3000 series cards in my country look like they're going down, though not close to MSRP. The RTX 3080 Ti 12GB, however, still hasn't come down much.


----------



## Taraquin (Mar 29, 2022)

droopyRO said:


> Sold my 1070 Ti and RX 6600 XT and got an RTX 3060 Ti; don't know if I will keep it though. This is what I achieved with undervolting: 1925MHz at 0.875V. Is it "good"? How are your cards performing with UV?
> View attachment 240586


Mine does 1965@900mV, so in line with yours.

I usually run it at 1620@731mV while mining and gaming at the same time. Power consumption is max 125W vs 205W stock, and perf is about 90%.


----------



## nguyen (Mar 30, 2022)

I'm seeing some EVGA 3080 12GB models selling for a very good price at my local shop; the EVGA 3080 12GB FTW3 is going for 1000usd+VAT, making it 100usd cheaper than a 6800XT and 400usd cheaper than a 3080Ti.


----------



## wolf (Mar 30, 2022)

nguyen said:


> I'm seeing some EVGA 3080 12GB models selling for a very good price


A friend who has been ever-so-patiently waiting opted for a prebuilt yesterday. Until about six or so weeks ago, 3080 10GB LHR cards were going for ~$2000 AUD or more. This prebuilt deal with a 10700F and 3080 10GB started at $1888 AUD, and he upgraded to the 3080 12GB for $99 AUD, which I said was a steal; I'd have one in a heartbeat at that price too.


----------



## Taraquin (Mar 30, 2022)

freeagent said:


> I haven't undervolted mine because I don't have problems with temps. But it does 1980 core on its own..


A quick comparison of the 3060 Tis I have/had:
Palit Dual OC: 200W stock, avg clocks 1850-1900@950-1000mV stock, stock perf at 1900@875mV and 170W.
MSI Gaming Z Trio: 240W stock, avg clocks 1950-2000@1030-1100mV stock, stock perf at 1900@900mV and 180W.
Asus Dual OC: 205W stock, avg clocks 1900-2000@930-1030mV stock, stock perf at 1980@900mV and 180W.
Asus TUF OC: 200W stock, avg clocks 1850-1900@950-1000mV stock, 104% perf at 2070@950mV and 205W.

Scaling relative to the Palit and Asus Dual: the MSI ran 45MHz higher at the same voltages, the Asus TUF 30MHz on top of that.
1515@700mV: 100W at 75% perf, due to the VRAM downclocking
1545@731mV: 120W at 86% perf (the lowest voltage at which VRAM runs full speed)
1770@800mV: 145W at 93% perf
1920@900mV: 180W at 101% perf

At 1900MHz, cards typically run at 1000mV stock, so that's 100-150mV higher than required.

If you don't care about temps/electricity cost, stock is fine, but you can get 5-10% higher perf at the same consumption, stock performance at 10-25% lower consumption (most on 2x8-pin cards like the MSI Trio, Gigabyte Gaming Pro etc.), or 85-90% of perf at 40% lower consumption.  Worth exploring; find your sweet spot.
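To make the sweet spot comparison concrete, here's a quick perf-per-watt calculation using the voltage points listed above (ballpark numbers from my own testing, so treat them as rough):

```python
# Perf-per-watt at each undervolt point (mV, board power in W, % of stock perf).
# The 700 mV point is left out since VRAM downclocks there and skews the result.
points = [
    (731, 120, 86),
    (800, 145, 93),
    (900, 180, 101),
]

for mv, watts, perf in points:
    print(f"{mv} mV: {perf / watts:.3f} perf%/W")
```

Efficiency falls steadily as voltage rises: the 731mV point gives about 0.72 perf%/W versus about 0.56 perf%/W at 900mV, which is why the low points are such a bargain if you can live with ~86% of stock performance.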


----------



## wolf (Mar 30, 2022)

Agreed that I would undervolt any card I bought now.

Undervolting is the new overclocking.


----------



## Mussels (Mar 30, 2022)

wolf said:


> Agreed that I would undervolt any card I bought now.
> 
> Undervolting is the new overclocking.


We played Halo Infinite and R6 Siege at the LAN party I was at last weekend.

My CPU is all-core underclocked to 4.6GHz
GPU underclocked to 1.6GHz

I was flatlining 165FPS the entire time, pushing 200-230 when I turned Vsync off out of curiosity.


----------



## LifeOnMars (Mar 31, 2022)

Gulp, found a buyer for my 3060 Ti, which has been exquisite for the few months I've had it. 3080 12GB and 850W PSU incoming tomorrow; 4K smoothness a go go. Honestly, I've been playing practically everything at 4K DLSS just on my 3060 Ti, and although it can drop to 30fps at times, it's been a very smooth frametime experience, and that was with ray tracing as well on Windows 11. Too excited......ha ha


----------



## LifeOnMars (Apr 5, 2022)

Sorry for the double post, but I couldn't seem to edit my last post. The card is lovely; temps are around 65°C with an undervolt at a constant 1950MHz. Memory junction tops out at 86°C and the hotspot is always 10°C above the GPU, so around 75-76°C, all at 4K. Can't hear it either. The 3600 is holding it back only a tiny bit, but I couldn't resist, so a nice 65W 5700X is incoming tomorrow.  Then I need to focus on building up food supplies, securing the house and preparing for WWIII. Real-life Fallout is coming, with 1000 times the detail.


----------



## Mussels (Apr 6, 2022)

LifeOnMars said:


> Sorry for the double post, but I couldn't seem to edit my last post. The card is lovely; temps are around 65°C with an undervolt at a constant 1950MHz. Memory junction tops out at 86°C and the hotspot is always 10°C above the GPU, so around 75-76°C, all at 4K. Can't hear it either. The 3600 is holding it back only a tiny bit, but I couldn't resist, so a nice 65W 5700X is incoming tomorrow.  Then I need to focus on building up food supplies, securing the house and preparing for WWIII. Real-life Fallout is coming, with 1000 times the detail.


There's a time delay after which new posts don't auto-merge and you can't edit.
It's so people get notified of the new content.

From experience with that CPU upgrade, you'll find your max FPS skyrocketing. My brother, for example, with his 3080 and his 4.2GHz 3700X was stuck at 110-130FPS at 1440p in modern games; he put in his 5900X and was sitting at 200+FPS in the same titles.
(All single-threaded gains, we checked.)

TL;DR: you won't see big changes at higher res, but you WILL see big changes at high refresh rates.

Okay: my GPU has had a label on it saying 3090Ti to annoy people for months

Legit 3090Ti's are now out


What's next? 4090? 3091? 3090 Ti^2?


----------



## Taraquin (Apr 6, 2022)

In Cyberpunk with the 3060 Ti and DLSS at 1080p ultra, I rarely exceeded 100fps with my 3600, stable around 80-90fps; the 5600X on the other hand averages around 120-130fps, so quite a difference.


----------



## LifeOnMars (Apr 6, 2022)

A few things made me jump on the 5700X: 65W, cost (it was 50 quid less than the 5800X), and the fact that I only have a no-frills B450 board (which runs well, however). I chop and change between 1080p and 4K and have a huge range of games (over 1,000 installed). I've noticed that even at 4K the 3600 is sometimes missing that bottom-end grunt (albeit on badly optimised games), so that boost will be lovely. I'll report back later today when I get the chip on how it all goes.

In other news, it turns out my 3080 can easily hit a constant 2000MHz with a tiny bump to 0.956V and no change in temp. Tested it last night in Cyberpunk 2077, Control, Metro Exodus and God of War for a few hours. It's a fantastic card, but the biggest surprise is in lesser-known titles, where it now allows me a 4K Vsync-locked experience that previously would not have been possible on the 3060 Ti. Gonna play around with it more tonight and see how far it will go on the core with a memory overclock on top. All of this in a shitty case....may start a thread asking for opinions on some of the best airflow cases.


----------



## Taraquin (Apr 6, 2022)

LifeOnMars said:


> A few things made me jump on the 5700X: 65W, cost (it was 50 quid less than the 5800X), and the fact that I only have a no-frills B450 board (which runs well, however). I chop and change between 1080p and 4K and have a huge range of games (over 1,000 installed). I've noticed that even at 4K the 3600 is sometimes missing that bottom-end grunt (albeit on badly optimised games), so that boost will be lovely. I'll report back later today when I get the chip on how it all goes.
> 
> In other news, it turns out my 3080 can easily hit a constant 2000MHz with a tiny bump to 0.956V and no change in temp. Tested it last night in Cyberpunk 2077, Control, Metro Exodus and God of War for a few hours. It's a fantastic card, but the biggest surprise is in lesser-known titles, where it now allows me a 4K Vsync-locked experience that previously would not have been possible on the 3060 Ti. Gonna play around with it more tonight and see how far it will go on the core with a memory overclock on top. All of this in a shitty case....may start a thread asking for opinions on some of the best airflow cases.


The 5700X seems good; perf is only 1-2% below the 5800X in gaming, but it runs much cooler with a 76W PPT vs 142W on the 5800X. I recommend tweaking your RAM (10-20% better perf using DLSS Performance at 4K) and trying Curve Optimizer (it might boost all-core clocks by 200-300MHz and will lower single-core temps).

If your monitor has adaptive sync, turn that on and set an fps cap at 57-58fps if your max refresh rate is 60Hz; this will greatly improve input lag vs running V-sync at 60.


----------



## arni-gx (Apr 7, 2022)

NVIDIA GeForce 512.15 WHQL
Is it safe to use that new NVIDIA driver on an RTX 3080 10GB with Windows 10 64-bit Home?


----------



## Mussels (Apr 7, 2022)

arni-gx said:


> NVIDIA GeForce 512.15 WHQL
> Is it safe to use that new NVIDIA driver on an RTX 3080 10GB with Windows 10 64-bit Home?


The drivers are for both W10 and W11; I'm not sure what the question is.


----------



## LifeOnMars (Apr 7, 2022)

Got my 5700x yesterday, as suspected, it has made the world of difference in the minimums of less optimised games and has now provided an extremely smooth ride with the 3080 at all resolutions. So, bearing in mind I'm on a fairly basic B450 board, only have 3200 RAM and a cooler that cost under 30 pounds. I managed to get a very stable 4.7Ghz all core overclock at 1.284v. It's a static overclock because of how limited the board is with the proper way of overclocking ryzen and how badly implemented the fan curve is in the bios. I've settled on a very chilly 4.5Ghz at 1.2v and will leave it there probably as I don't need anymore at 4K.

Nagging voice in my head says get a nice new board,case, rams and cooler but honestly that defeats the object of why I bought a Ryzen 3600 setup in the first place.....a drop in CPU upgrade when I wanted one without shelling out loads more money. It does what I want it to do so I guess I'm extremely happy, but still...


----------



## Mussels (Apr 7, 2022)

LifeOnMars said:


> Got my 5700x yesterday, as suspected, it has made the world of difference in the minimums of less optimised games and has now provided an extremely smooth ride with the 3080 at all resolutions. So, bearing in mind I'm on a fairly basic B450 board, only have 3200 RAM and a cooler that cost under 30 pounds. I managed to get a very stable 4.7Ghz all core overclock at 1.284v. It's a static overclock because of how limited the board is with the proper way of overclocking ryzen and how badly implemented the fan curve is in the bios. I've settled on a very chilly 4.5Ghz at 1.2v and will leave it there probably as I don't need anymore at 4K.
> 
> Nagging voice in my head says get a nice new board,case, rams and cooler but honestly that defeats the object of why I bought a Ryzen 3600 setup in the first place.....a drop in CPU upgrade when I wanted one without shelling out loads more money. It does what I want it to do so I guess I'm extremely happy, but still...


That 4.7GHz is still quite high for Zen 3.


----------



## arni-gx (Apr 7, 2022)

Mussels said:


> The drivers are for both W10 and W11; I'm not sure what the question is.



Because there is a lot of trouble reported in this feedback thread:

https://www.nvidia.com/en-us/geforce/forums/game-ready-drivers/13/488704/geforce-grd-51215-feedback-thread-released-32222/

So I don't know how good that new NVIDIA driver is. Has anyone here run into problems with it? Should I install it or not?


----------



## Chomiq (Apr 7, 2022)

arni-gx said:


> Because there is a lot of trouble reported in this feedback thread:
> 
> 
> 
> ...


Unless it solves a specific issue that you're seeing on your system with your current driver, or it has some performance improvements for new titles, I see no reason to update the driver.


----------



## Taraquin (Apr 7, 2022)

LifeOnMars said:


> Got my 5700x yesterday, as suspected, it has made the world of difference in the minimums of less optimised games and has now provided an extremely smooth ride with the 3080 at all resolutions. So, bearing in mind I'm on a fairly basic B450 board, only have 3200 RAM and a cooler that cost under 30 pounds. I managed to get a very stable 4.7Ghz all core overclock at 1.284v. It's a static overclock because of how limited the board is with the proper way of overclocking ryzen and how badly implemented the fan curve is in the bios. I've settled on a very chilly 4.5Ghz at 1.2v and will leave it there probably as I don't need anymore at 4K.
> 
> Nagging voice in my head says get a nice new board,case, rams and cooler but honestly that defeats the object of why I bought a Ryzen 3600 setup in the first place.....a drop in CPU upgrade when I wanted one without shelling out loads more money. It does what I want it to do so I guess I'm extremely happy, but still...


PBO + Curve Optimizer is more beneficial.  Also, find out what RAM die you have with Thaiphoon Burner and tweak/overclock it. Many 3200 kits can run at 3800 with okay timings and boost performance by 10-20%, easy!


----------



## LifeOnMars (Apr 7, 2022)

@Taraquin Yeh I know that's more beneficial but the motherboard is pretty average in terms of implementation, I'll wait until I get a better board. As far as ram goes I'll probably leave that as well. It's honestly doing everything I need it to do at the moment and I'm a happy gamer but again, I'll probably leave playing with it until I get a better board 

Anyways, back to ampere in this thread......anyone picked up a 3090 Ti?


----------



## Chomiq (Apr 7, 2022)

LifeOnMars said:


> Anyways, back to ampere in this thread......anyone picked up a 3090 Ti?


Still saving up for my yacht.


----------



## NDown (Apr 10, 2022)

LifeOnMars said:


> @Taraquin Anyways, back to ampere in this thread......anyone picked up a 3090 Ti?


It's around $500 more here than my 3080 Ti cost when I got it during the peak price hike.

As my money isn't infinite, I have ZERO reason to get one.


----------



## LifeOnMars (Apr 10, 2022)

NDown said:


> It's around $500 more here than my 3080 Ti cost when I got it during the peak price hike.
> 
> As my money isn't infinite, I have ZERO reason to get one.


Absolutely agree, if money was no object I would still probably wait for the 4 series


----------



## Mussels (Apr 11, 2022)

arni-gx said:


> Because there is a lot of trouble reported in this feedback thread:
> 
> 
> 
> ...


You'd be better off asking over there than asking here without context like you did.
No issues here on my 3 systems with W11 and my 10/30 series cards.

I don't see the problem, however; even if a driver causes issues you can simply roll back.


----------



## R-T-B (Apr 14, 2022)

I grabbed a 3090 ti after a particularly profitable work week and a possible bout of bad judgement (that I refuse to undo...)

I will post photos when it arrives.  It's the evga ftw3 variant.


----------



## LifeOnMars (Apr 14, 2022)

R-T-B said:


> I grabbed a 3090 ti after a particularly profitable work week and a possible bout of bad judgement (that I refuse to undo...)
> 
> I will post photos when it arrives.  It's the evga ftw3 variant.


Nice, and why not? Life's too short.  I'd welcome a thread on your new card, or some benches in this one. Enjoy it, my friend.


----------



## Mussels (Apr 14, 2022)

Oh, I decided I had to outdo people like froggo:
Introducing the 3090 ShiTi edition, it's craptacular!


----------



## R-T-B (Apr 14, 2022)

Mussels said:


> outdo people like frogg


Impossible...  nothing can outdo my crappy cable management.  It's like Satan had a baby with Cthulhu.

I make up for it with brute force airflow...  and moving my rig as far from me as possible.  Currently it's not even in the same room!  Hdmi 2.1 was tricky!

Big blowers and brute force solve all problems, questing fingers included.


----------



## Chomiq (Apr 14, 2022)

R-T-B said:


> I grabbed a 3090 ti after a particularly profitable work week and a possible bout of bad judgement (that I refuse to undo...)
> 
> I will post photos when it arrives.  It's the evga ftw3 variant.


Honestly, that's exactly how I felt when I ordered my 3080 Ti.

4200 Ti -> HD 4850 -> GTX 660 -> GTX 1060 6G -> 3080 Ti
One of them doesn't fit the pattern


----------



## R-T-B (Apr 14, 2022)

Chomiq said:


> Honestly, that's exactly how I felt when I ordered my 3080 Ti.
> 
> 4200 Ti -> HD 4850 -> GTX 660 -> GTX 1060 6G -> 3080 Ti
> One of them doesn't fit the pattern


Yeah, it's not our fault insanity comes in spurts, right?


----------



## R-T-B (Apr 18, 2022)

As you probably guessed from my status, it arrived.

The card is stupid fast, smells like a new car, and I have almost no time to use it... lol.  I did get some time to play Elder Scrolls Online today, and it just chews up and spits out 4K@120Hz, no problem.  I think the minimum frame rate I saw was somewhere in the hundreds, and that was rare.

Nice.

Not so nice is that I completely forgot to photograph it in my excitement, which means you are going to have to settle for photos later on, when I feel like taking my rig apart.  I know, I feel guilty too.  But the card weighs a metric butt-ton, so the e-leash thing is actually a necessity, and I don't want to move it too much without reason.


----------



## tabascosauz (Apr 28, 2022)

R-T-B said:


> Not so nice is that I completely forgot to photograph it in my excitement, which means you are going to have to settle for photos later on, when I feel like taking my rig apart.  I know, I feel guilty too.  But the card weighs a metric butt-ton, so the e-leash thing is actually a necessity, and I don't want to move it too much without reason.



I've not actually had any triple-fan cards before, so you can probably imagine my surprise at the size of the TUF. The weight not so much, because 7970 GHz Vapor-X says hi  I like the way the FTW3 cards look, but after actually trying to fit it in the case I immediately appreciated the TUF being 2.5 slot.

Just starting to mess with the undervolt. Out of the box it was ~1950MHz @ 1.08V, 290W and 68C I think? No problem keeping it cool, but the flow-through cooler was taking a dump all over my 2x16 B-die and it was displeased..................right now it's at 1980MHz @ 0.925V, about 250W and still 65C. Maybe further, but I am keeping my expectations in check because it's G6X (which so far runs cool, 70 or so).

I did give up the whole loop just for this card though: the TUF was too big for the Core P3, I didn't like the Bykski look, I didn't know if 280mm would hold up, etc. The cheap brace is very handy because the TUF is a little saggy (the finstack does not join with the I/O plate); I'm using long-ass M3x25mm countersunk hex screws from the Cerberus 3D-printed power socket extension DIY.


----------



## LifeOnMars (Apr 28, 2022)

tabascosauz said:


> I've not actually had any triple-fan cards before, so you can probably imagine my surprise at the size of the TUF. The weight not so much, because 7970 GHz Vapor-X says hi  I like the way the FTW3 cards look, but after actually trying to fit it in the case I immediately appreciated the TUF being 2.5 slot.
> 
> Just starting to mess with the undervolt. Out of the box it was ~1950MHz @ 1.08V, 290W and 68C I think? No problem keeping it cool, but the flow-through cooler was taking a dump all over my 2x16 B-die and it was displeased..................right now it's at 1980MHz @ 0.925V, about 250W and still 65C. Maybe further, but I am keeping my expectations in check because it's G6X (which so far runs cool, 70 or so).
> 
> ...


I love a build where it all fits so neatly, that looks great man. It's the main reason I recently bought a Meshify compact. Fits my new card beautifully (a few mms to spare) cools it extremely well and it looks like all the components were made for each other.


----------



## tabascosauz (Apr 29, 2022)

LifeOnMars said:


> I love a build where it all fits so neatly, that looks great man. It's the main reason I recently bought a Meshify compact. Fits my new card beautifully (a few mms to spare) cools it extremely well and it looks like all the components were made for each other.



Thanks man. Not many of my builds I feel are "just right", but this is one of them. The bottom 2 x 140mm intakes really shine with a higher TDP GPU; I can finally properly appreciate that. 1070 was too low power to care and 2060S FE cooler was too garbo to care. It really runs like a top with the undervolt profile. I was bracing for the worst but at its usual 200-250W it's very manageable.

The 3070 Ti actually draws less power most of the time than my old 2060 Super. In lighter games it's down in double digits while my 2060S was still 150W. Most noticeably, on multi-monitor idle it only draws about 15-20 watts. My 2060 Super was always at 45 watts idle because of the two monitors, and no way to change that behaviour even when waterblocked.


----------



## Chomiq (May 9, 2022)

First attempt at undervolting my 3080 Ti, I followed some guide on YT which basically was:
- run Heaven for a while and see where boost clocks settle (around 1830MHz after 10 or so minutes)
- in Afterburner, set a -250MHz offset on the core
- set the curve to 1815MHz at 0.875V; nothing crashed, clocks were rock solid during the entire benchmark



72C on core, hotspot at 80C, memory at 92C avg. All at around 300W compared to 345W at stock.

Any tips?


----------



## nguyen (May 9, 2022)

Chomiq said:


> First attempt at undervolting my 3080 Ti, I followed some guide on YT which basically was:
> - run Heaven for a while and see where boost clocks settle (around 1830MHz after 10 or so minutes)
> - in Afterburner, set a -250MHz offset on the core
> - set the curve to 1815MHz at 0.875V; nothing crashed, clocks were rock solid during the entire benchmark
> ...



This is how I undervolt









A +120MHz core clock offset is just an example; try +180MHz. 

Undervolting this way can prevent your 3080 Ti from downclocking aggressively when it hits the power limit.
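If it helps to see the idea, here's a toy model of what that flatten-the-curve undervolt does (the curve numbers below are made up for illustration, not from a real card's V/F table):

```python
# Toy model of the Afterburner-style undervolt: shift the whole V/F curve up
# by a core offset, then hold the clock flat above the chosen voltage cap so
# the card never requests more than cap_mv under load.

def undervolt_curve(curve, offset_mhz, cap_mv):
    """curve maps voltage (mV) -> clock (MHz); returns the flattened curve."""
    shifted = {mv: clk + offset_mhz for mv, clk in curve.items()}
    cap_clock = shifted[cap_mv]  # clock the card will sit at under load
    return {mv: (clk if mv <= cap_mv else cap_clock)
            for mv, clk in shifted.items()}

stock = {850: 1700, 875: 1740, 900: 1785, 950: 1860, 1000: 1920, 1050: 1965}
uv = undervolt_curve(stock, offset_mhz=120, cap_mv=875)
print(uv[875], uv[1050])  # 1860 1860 -> everything above 875 mV is flattened
```

Because every point above the cap shares the capped clock, the card stops boosting past that voltage, which is exactly why it no longer slams into the power limit and downclocks.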


----------



## onemanhitsquad (May 10, 2022)

running completely stock...no issues


----------



## HammerON (May 10, 2022)

Upgraded from my 3080 10 GB GPU to the 3090 Ti.  Got a good deal on the Gigabyte 3090 Ti Xtreme Waterforce:










Temps have been really good.  While folding, the temp stays around 47 degrees C.  I haven't messed with undervolting.


----------



## Mussels (May 10, 2022)

HammerON said:


> Upgraded from my 3080 10 GB GPU to the 3090 Ti.  Got a good deal on the Gigabyte 3090 Ti Xtreme Waterforce:
> View attachment 246867View attachment 246866
> 
> View attachment 246865
> ...


Did they fix the VRAM temps for your revision?


----------



## Chomiq (May 10, 2022)

nguyen said:


> This is how I undervolt
> 
> 
> 
> ...


Yeah, I opted for this on my second attempt, but I'm still near that peak 300W of power consumption.


----------



## HammerON (May 10, 2022)

Mussels said:


> Did they fix the VRAM temps for your revision?


I will have to check...


----------



## nguyen (May 11, 2022)

Chomiq said:


> Yeah I opted for this on my second attempt but still near that peak 300W of power consumption.



I created multiple undervolt profiles and toggle them in game to see which one works best: 750mV, 800mV, 850mV and 900mV.


----------



## Mussels (May 11, 2022)

nguyen said:


> I created multiple undervolt profiles and toggle them in game to see which one works best: 750mV, 800mV, 850mV and 900mV.


I'm at 800mV 1600MHz, because I lost my AB profile and forgot about fine-tuning it, oops. Good thing you reminded me.
(I've been able to keep it sub-300W by not OCing the VRAM.)

Is there a way to just type the damn volts and MHz you want, rather than fart about with the stupid graph?

Edit: 1600MHz 750mV, Dying Light 1 (160FPS/capped)
Max 235W (in this game), 410W for the entire system w/ screens


----------



## nguyen (May 11, 2022)

Mussels said:


> I'm at 800mV 1600MHz, because I lost my AB profile and forgot about fine-tuning it, oops. Good thing you reminded me.
> (I've been able to keep it sub-300W by not OCing the VRAM.)
> 
> Is there a way to just type the damn volts and MHz you want, rather than fart about with the stupid graph?
> ...



Select a V/F point, press <Enter>, and input an offset value (e.g. +150); you cannot input actual core clocks though.  

You can change the V/F curve X/Y values in the MSI Afterburner config file so that the V/F points are further apart and easier to select.


----------



## FireFox (May 11, 2022)

I am pretty happy with my 3080.

In Afterburner I set a -300MHz offset on the core, then set the curve to 1875MHz at 0.850V.
It has been running with those settings for around a year and has never crashed.


----------



## HammerON (May 12, 2022)

I am happy with the VRAM temps.  While folding (Folding@home) with this GPU, the max VRAM temp was 68: 



The max temp while gaming was 58:


----------



## R-T-B (May 12, 2022)

Yeah my evga 3090 ti has pretty low vram temps too.  Core was a bit toasty till I changed my cooling setup a bit though...


----------



## SpittinFax (May 12, 2022)

Hey what's up, I'm visiting to just dip my toes in the 30xx series pool. I know the water is good but just doing research on what variant to buy and tracking prices. Currently on a GTX 1060 6GB and thinking that the RTX 3060 would be a big upgrade, particularly since I won't be going beyond 1080p 60fps gaming any time soon. I'm on a 5600X so CPU isn't a limiting factor.

Any recommendations on which model to go for? I'm thinking triple-fan cooler would be pretty nice. All of my current options are:

Gigabyte RTX 3060 Eagle OC LHR 12GB (AU$629)
MSI RTX 3060 Ventus 2X OC 12GB (AU$649)
Gigabyte RTX 3060 Gaming OC LHR 12GB (AU$659)
Gigabyte RTX 3060 Vision OC LHR 12GB (AU$659)
ASUS RTX 3060 KO Gaming LHR 12GB (AU$679)
ASUS RTX 3060 TUF Gaming OC LHR 12GB (AU$699)
Gigabyte RTX 3060 Aorus Elite LHR 12GB (AU$699)
And then the RTX 3060 Ti's (at least the decent ones) start at around AU$849 which is a bit too much of a price jump for me. But prices are still coming down right now, so we'll see how things pan out over the next few weeks.


----------



## arni-gx (May 17, 2022)

So, why do some RTX 3080 Ti 12GB owners want to undervolt their GPUs today?


----------



## FireFox (May 17, 2022)

arni-gx said:


> So, why do some RTX 3080 Ti 12GB owners want to undervolt their GPUs today?


To decrease power consumption, to lower VRAM temp, some cards perform better when undervolted? Take your pick.


----------



## Mussels (May 18, 2022)

arni-gx said:


> So, why do some RTX 3080 Ti 12GB owners want to undervolt their GPUs today?


This has been asked and answered so many times in this thread alone...

Why the hell wouldn't we want to undervolt them? You only want to overclock hardware that isn't fast enough.


----------



## nguyen (May 18, 2022)

Mussels said:


> This has been asked and answered so many times in this thread alone...
> 
> why the hell wouldnt we want to undervolt them? You only want to overclock hardware that isn't fast enough.



That's like asking why people should drive fast cars at <50 km/h in the city when those cars are capable of >300 km/h.


----------



## Mussels (May 18, 2022)

nguyen said:


> That's like asking why people should drive fast cars at <50 km/h in the city when those cars are capable of >300 km/h.




More like if the RPM gauge on the dash maxes out at 10K RPM, why would i ever change gears and make that number slower?


----------



## Hyderz (Oct 6, 2022)

I have a question, guys:
I have an RTX 3090 with an older BIOS (before Resizable BAR) that I haven't updated to the newer one.
My motherboard BIOS is updated and has the Resizable BAR option there, but I haven't enabled it.

So far my system runs fine with no problems. The reason I didn't update the BIOS is that 9th gen isn't officially supported, although I've seen benchmarks showing it does work.
Is there a noticeable difference in performance? Should I enable it?


----------



## MetalMadness (Oct 6, 2022)

A bit late to the party, but here is my system. Not the most beautiful loop, but it works great; I just have to add the 3080 Ti to it.


----------



## oxrufiioxo (Oct 6, 2022)

Hyderz said:


> I have a question, guys:
> I have an RTX 3090 with an older BIOS (before Resizable BAR) that I haven't updated to the newer one.
> My motherboard BIOS is updated and has the Resizable BAR option there, but I haven't enabled it.
> 
> ...



To me it wouldn't be worth it. The difference in most games is negligible; the majority of people who would say otherwise just have a bad case of FOMO.


----------



## Hyderz (Oct 6, 2022)

oxrufiioxo said:


> To me it wouldn't be worth it. The difference in most games is negligible; the majority of people who would say otherwise just have a bad case of FOMO.



Yeah, my gut feeling tells me not to. If it ain't broken, don't fix it  Cheers for that


----------



## Mussels (Oct 7, 2022)

Why not update it?
It definitely does make a performance difference, it's just that the 3090 has a lot of performance already.


----------



## dgianstefani (Oct 7, 2022)

Rebar eliminates a potential bottleneck. Turn it on.


----------



## Hyderz (Oct 7, 2022)

dgianstefani said:


> Rebar eliminates a potential bottleneck. Turn it on.



Fair enough... done the update and turned on ReBAR in the BIOS...


----------



## vietanh2901 (Oct 17, 2022)

several years late...


----------



## Mussels (Oct 17, 2022)

vietanh2901 said:


> several years late...
> View attachment 265748


What card is that?


----------



## arni-gx (Oct 17, 2022)

NVIDIA has released a fix for the "GeForce 522.25 driver map glitches" in Cyberpunk 2077

> Last week, NVIDIA released a new driver that introduced some map glitches in Cyberpunk 2077. Thankfully, though, there is now a hotfix for it.

www.dsogaming.com

A fix for the new NVIDIA driver... it is not a hotfix...


----------



## Mussels (Oct 18, 2022)

Ironically, it's an "SLI" profile for a card that doesn't support SLI


Why they don't use GeForce Now to cloud-sync those profiles is a mystery.


----------



## R-T-B (Oct 18, 2022)

Mussels said:


> Ironically it's an "SLI" profile for a card that doesnt support SLI
> 
> 
> Why they dont use geforce now to cloud-sync those profiles is a mystery.


There are game profiles for way more things than SLI, man.  It's been that way for a bit.  This isn't an SLI profile in any sense of the word.


----------



## Mussels (Oct 18, 2022)

R-T-B said:


> There are game profiles for way more things than SLI, man.  It's been that way for a bit.  This isn't an SLI profile in any sense of the word.


NVIDIA called it that; that's what I'm finding ironic


----------



## R-T-B (Oct 18, 2022)

Mussels said:


> Nvidia called it that, that's what i'm finding ironic
> 
> View attachment 265948


That's extremely dumb on their part, yeah.


----------



## b0uncyfr0 (Dec 16, 2022)

How does the Aorus Master 3070 stack up for modding/overclocking? Still deciding whether I'm getting the LHR or non-LHR version, but I doubt that makes a difference to performance.

I'd like to push the sucker as much as possible.


----------



## Mussels (Dec 17, 2022)

I feel offended my water-cooled 3090 and heavily tweaked 5800X3D aren't doing better




Oh right, I have a sane BIOS with a 375W limit


----------



## tabascosauz (Dec 17, 2022)

Mussels said:


> I feel offended my water cooled 3090 and heavily tweaked 5800x3D arent doing better



CPU score could use some work, you sure CO isn't working against you? I post better scores at -25 or -28 than -30, even though they're all "stable".

I went back to 4006 because neither AGESA 1207 nor 1208 is giving me any good reason to stay. Just slower, shittier, hotter.


----------



## Mussels (Dec 17, 2022)

I may be more undervolted than necessary; I went by effective clocks not dropping, rather than performance, when I undervolted


----------



## tabascosauz (Dec 17, 2022)

Mussels said:


> I may be more undervolted than necessary; I went by effective clocks not dropping, rather than performance, when I undervolted



If you went to the latest BIOS for that sweet PBO menu, then you probably didn't do anything wrong, and the BIOS itself probably accounts for 100% of the performance deficit lol

On 4303 I think I saw a 10°C increase in temps in MW2 compared to 4006? A huge yikes


----------



## Mussels (Dec 17, 2022)

Current CPU result, I'll reduce the UV and try again






tabascosauz said:


> if you went to the latest bios for that sweet pbo menu, then you probably didn't do anything wrong and the bios itself probably accounts for 100% of the performance deficit lol
> 
> 4303 I think I saw in MW2 a 10°C increase in temps compared to 4006? a huge yikes


It needs lower voltages for OCing the memory; I reduced those values and the temps are back to normal. I think they boosted certain voltages to increase RAM compatibility.


-0.500 uv





Really similar, so I won't blame the CPU on that one - yes, the BIOS could be related (I already had everything unlocked, modded BIOS)





There we go, balls-to-the-wall mode
This is with a custom OC curve to 1935MHz on the GPU; higher hits the wattage limits




I'll take it
NVIDIA GeForce RTX 3090 video card benchmark result


----------



## tabascosauz (Dec 17, 2022)

@Mussels if you saved the results file at the bottom of the page you can check your CPU clocks during the CPU test (last segment after the last dotted line)

Mine is 4450 steady through the entire test, but yours looks like it's dropping slightly as the CPU test goes on

I doubt yours runs hotter than mine, so pretty safe to say it's just 1207 doing its thing


----------



## Mussels (Dec 17, 2022)

CPU and GPU dropping at the exact same times between tests is just a loading screen





just.... wow.
achievements on a benchmark.





The lowest the CPU dipped is 50MHz, as the test was ending
Coulda been simple heat at that point with a 375W 3090 in the loop


----------



## Mussels (Dec 25, 2022)

Anyone having trouble with displays not sleeping?

In the last week or so my monitor just refuses to sleep, despite Windows being set to do so





apparently this was an issue with geforce experience in the past, anyone else seeing it?

Ugh figured it out

I've got a virtual display adapter added by the VR streaming software for my Quest 2, and disabling it broke the sleep state

I was trying to minimise background resource usage and forgot about it


----------



## TITAN RTX 4100 (Dec 25, 2022)

Mussels said:


> just.... wow.
> achievements on a benchmark.
> View attachment 274771


3DMark?


----------



## johnspack (Dec 25, 2022)

I want a 3070...  how much will it help me over a 980ti at 1080 res?  Should I spend 600can for a used one?  Or just wait a year or so and get it for the 400can max I want to spend....


----------



## Mussels (Dec 25, 2022)

johnspack said:


> I want a 3070...  *how much will it help me over a 980ti at 1080 res?*  Should I spend 600can for a used one?  Or just wait a year or so and get it for the 400can max I want to spend....


That's not a question with an answer, since about a thousand things factor into the reply

It's a lot faster than what you have, but if you run ultra low settings with a 60Hz display you'd see no change at all.


----------



## johnspack (Dec 25, 2022)

That's what I thought...  I'll just make do for another 2 years.  I'd rather have better sound quality than faster graphics at this point.
Also, all my games are 4+ years old... so I just don't need it.


----------



## Mussels (Dec 25, 2022)

If you run ultra-low settings at 720p and you have lag issues: time to update the platform and CPU

If you run settings maxed out and you want better performance, it's GPU time
(Same if you want to go to a higher res)


Sound quality can be tough, as sometimes the source material just ain't good enough. Definitely focus on stereo sound over anything spatial, since so many things (like YouTube) are only in stereo, and then people get confused when their new 10.1 sound system doesn't work right (plus 4K60 HDMI limits with their receiver)
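That low-settings test boils down to a simple rule of thumb, sketched below as a toy decision helper (the thresholds are made up for illustration, not a real diagnostic):

```python
def upgrade_hint(fps_low: float, fps_maxed: float, target_fps: float) -> str:
    """Toy version of the rule of thumb: if dropping settings barely raises
    fps, the platform/CPU is the limit; if it helps a lot, the GPU is."""
    if fps_low >= target_fps and fps_maxed >= target_fps:
        return "no upgrade needed"
    # Settings went from maxed to low but fps barely moved: the GPU
    # wasn't the bottleneck, so look at the platform/CPU.
    if fps_low - fps_maxed < 0.15 * fps_maxed:
        return "platform/CPU upgrade"
    return "GPU upgrade"

# Laggy even at low settings, and lowering settings barely helped:
print(upgrade_hint(fps_low=58, fps_maxed=55, target_fps=144))
# → platform/CPU upgrade
```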


----------



## tabascosauz (Dec 25, 2022)

johnspack said:


> I want a 3070...  how much will it help me over a 980ti at 1080 res?  Should I spend 600can for a used one?  Or just wait a year or so and get it for the 400can max I want to spend....



Really not sure how well an Ivy Bridge E5 and a Haswell E5 are going to handle a 3070 at 1080p... would probably be more palatable at 4K. Playable, sure... maybe at half of the $600 worth of performance you're paying for

Considering a lower SKU 6750XT was just on sale last week for $500, $600cad for a used 3070 just seems like highway robbery unless it's barely used and has full transferable warranty. BC GPU prices are still holding a bit too high, but not $600 used high. New 3060 Tis are starting around $600.

One of my friends was mulling a 3070 FE for $400, like 3 months ago. Not saying I would be confident buying a $400 FE (no transferable warranty), just saying that $600 used is not a deal.

The way the 30 series are priced in relation to RDNA2 in Canada, you'd better be making heavy use of DLSS in most of the games you play, which clearly doesn't sound like it's the case. May as well wait it out and plan for a platform upgrade too. Prices might improve once 40 series and RDNA3 lineups are a bit more fleshed out

I totally forgot about the $700 TUF 6800XT last week too. It's about 2 years too late to be buying into bad last gen deals.


----------



## Mussels (Dec 25, 2022)

tabascosauz said:


> Really not sure how well a Ivy Bridge E5 and Haswell E5 are going to handle a 3070 at 1080p......would probably be more palatable at 4K. Playable sure.........maybe at half  of the $600 worth of performance you're paying for


Hence my comments about running low res and seeing how the CPU fares in his games/refresh rate




tabascosauz said:


> Considering a lower SKU 6750XT was just on sale last week for $500, $600cad for a used 3070 just seems like highway robbery unless it's barely used and has full transferable warranty. BC GPU prices are still holding a bit too high, but not $600 used high. New 3060 Tis are starting around $600.
> 
> One of my friends was mulling a 3070 FE for $400, like 3 months ago. Not saying I would be confident buying a $400 FE (no transferable warranty), just saying that $600 used is not a deal.
> 
> The way the 30 series are priced in relation to RDNA2 in Canada, you'd better be making heavy use of DLSS in most of the games you play, which clearly doesn't sound like it's the case. May as well wait it out and plan for a platform upgrade too. Prices might improve once 40 series and RDNA3 lineups are a bit more fleshed out


Prices are weird between regions. I'm seeing 1080 Tis for $300 but 1650 Supers for $500 here. It's weird.
3080s for AU$800 but 2060 Supers for AU$1200


Second hand market is so unpredictable, it's the kinda thing you need to compare to new card prices that very moment


----------



## johnspack (Dec 25, 2022)

Yep, I'll stick with my 980 Ti... still a powerhouse at 1080p, especially with the older games I play. Spending my money on audio instead.
I need a TR system before I upgrade... I'm done with Intel.


----------

