
NVIDIA GeForce RTX 3080 Unboxing & Preview

I really wish AMD was not making us wait until October 28th. I'm very curious whether they can match Nvidia this round; they seem confident they will. Bleh. I don't know what to do. Wait or buy, wait or buy... I just feel like it's going to be sold out instantly, so I think I'm just going to buy the 3080 and not worry about it.
Balls, you haven't been rushed so far; I can't see the point in jumping now. Think: competition.
 
Perhaps someone can clarify it for me; but there's been images floating around saying the rear fan sucks air from the bottom and up, thus flowing the warmed air into a CPU cooler (if one is used). Yet the rear fan looks more like it sucks from the top and blows down, which would then force that warmed air to be sucked into the front fan and either out the case or across the angled fins. Which way does the rear fan actually spin?
 
Look closely at how the fan blades are bent and concave; that will hint at the direction of rotation. With other coolers, all the hot air stays in the case, pushed out from the non-PCIe-slot side of the GPU, and it all goes up. With this cooler, a big part of the heat is exhausted immediately, so how is that worse? How is this any different?
 
Sounds like a challenge.

Yes, I live for the day someone posts a topic titled: "Help... I need to reapply the thermal paste on my RTX 3080."
There are no answers for that. Who will take responsibility for their own advice?
 
Oh wow, when Nvidia copied Sapphire's cooler (or any other open-back design from years back) they got praised and hailed as geniuses. Unlike Nvidia's design, Sapphire's fans don't cook themselves to death under heavy load: https://cdn.wccftech.com/wp-content...ury-Tri-X-4-GB-HBM-Graphics-Card_Back-PCB.jpg

This is the best GPU design ever. None of the partners has done anything like it, IMO. To be clear, this is a genuinely polished end-consumer product; it isn't a "reference design" in any way.
Did you know something like this existed a few years back? Oh wow, would you look at that, the fans are at the front so they don't cook themselves to a premature death: https://cdn.wccftech.com/wp-content...ury-Tri-X-4-GB-HBM-Graphics-Card_Back-PCB.jpg
 
I was about to get into the hype, but since I already have a 2080 Super with 8GB of VRAM, I'd rather wait for the bigger 3080 (Super or Ti), hopefully with double the memory (20GB, I believe), and get that instead (hopefully for around 950€ or less). I want to play at 4K even if I have to rely on DLSS (which I also hope more games implement, because it works just fine).

I played at 1080p from 2009 (on my old HDTV, a Sharp Aquos 47") and I hate bad anti-aliasing so much that playing at 4K is a pleasure for my eyes (now on my new LG CX6 55" OLED TV). I just need more frames, of course.
The amount of memory is the big thing now. Horizon Zero Dawn (a game from 2016) uses 8GB, and with the game maxed out I can average 40 FPS at 4K, but with some stutters here and there (the game isn't particularly well optimised, I have to say); when you see that, you definitely think about the future. At 1440p (the right resolution for my GPU, no doubt about it) I have no issues at all with any game, for example. But I'm one of those little 2% on the Steam charts who want to play at 4K, and now is our time.

So I'll wait.

PS: Can't wait to see the reviews and the actual gains over the previous generation. Also, if you could test with a Ryzen CPU like mine (3900X) instead of Intel only as usual, that would be very welcome. ;)
 
Wizz, as always, getting the job done.
 
I posted these to the GN thread in the lounge, but it belongs here too.


Jay did one too!

Jay's vid is better IMO. Certainly funnier!

@W1zzard
Looking forward to your review in a few days!
 
Jay's video was fantastic, genuinely funny.

As an air-cooled CPU guy, fark. The FE isn't available in Aus, but I'd genuinely consider it for those looks, despite that clunky AF adapter.
The warning says no custom converter cables but nothing about direct-to-PSU cables, which is reasonable I guess; they may have concerns about cheap cables not handling the power draw.
 
Do I understand right that this fan spins anticlockwise?
View attachment 168255
Yes, it does.
The lower edge of each fan blade acts as a scoop, lifting the air like the front wing of an F1 car.

It's at the end of the card; the housing is much longer than the actual PCB, so the fan will always end up where it is. It could be placed near the rear instead, but then you would need to route all that power to the other side to reach the power circuitry, so it would be a daft design.
I already know where the end of the card is, and it's only a daft design in your opinion.
Obviously you have no imagination or technical ability.
 
Perhaps someone can clarify it for me; but there's been images floating around saying the rear fan sucks air from the bottom and up, thus flowing the warmed air into a CPU cooler (if one is used). Yet the rear fan looks more like it sucks from the top and blows down, which would then force that warmed air to be sucked into the front fan and either out the case or across the angled fins. Which way does the rear fan actually spin?

The rear fan sucks air through the HS and blows it into your CPU's HSF

airflow-model_LI.jpg
 
If you have a case with good airflow, CPU temps won't be affected to any measurable degree.
As most cases coming out nowadays seem to favour form over function, it looks like we'll just have to wait for reviews that take this into consideration.
 
The FE arent available in aus but i'd genuinely consider it for those looks, despite that clunky AF adaptor.

You looking at getting a 3080 in Aus too, then? With the picture looking like there may be no preorders at all, I'm considering lining up early at a store for one, if that's what it takes.
 
You looking at getting a 3080 in Aus too, then? With the picture looking like there may be no preorders at all, I'm considering lining up early at a store for one, if that's what it takes.


I've got savings for this launch; might get a 3090 if prices aren't insane... though I'll be waiting for a hybrid one with factory water cooling.
 
I've got savings for this launch; might get a 3090 if prices aren't insane... though I'll be waiting for a hybrid one with factory water cooling.

Same, money waiting, just ready to buy. Seems we both had a GTX 1080 and skipped Turing, eh? Like many others, surely.
 
Same, money waiting, just ready to buy. Seems we both had a GTX 1080 and skipped Turing, eh? Like many others, surely.

I made a rule: never upgrade without at least a 50% performance/capacity gain
1080 to 3090 smashes that, and then the 1080 smashes the 980 in the VR system
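In code terms, the rule is just a threshold check. The relative-performance numbers below are made-up illustration indices, not benchmarks:

```python
def meets_upgrade_rule(old_perf: float, new_perf: float, min_gain: float = 0.5) -> bool:
    """Return True if the new part beats the old by at least min_gain (50% by default)."""
    return new_perf >= old_perf * (1.0 + min_gain)

# Hypothetical relative-performance indices (assumed for illustration only):
gtx_1080 = 100
rtx_3090 = 220  # assuming roughly 2.2x a GTX 1080

print(meets_upgrade_rule(gtx_1080, rtx_3090))  # True: well past the 50% bar
```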
 
The rear fan sucks air through the HS and blows it into your CPU's HSF

View attachment 168324
It's interesting they chose to illustrate the airflow as mostly going around the CPU cooler, but in reality the CPU fan will pull most of that hot air into the cooler. For an AIO this may not be an issue, but for people using a massive CPU air cooler, I believe CPU temps will creep up under prolonged GPU load.
 
It's interesting they chose to illustrate the airflow as mostly going around the CPU cooler, but in reality the CPU fan will pull most of that hot air into the cooler. For an AIO this may not be an issue, but for people using a massive CPU air cooler, I believe CPU temps will creep up under prolonged GPU load.

Well, what temperature will that air actually be at? If the GPU core is at 70C, the air gets cooler as it moves away, maybe 50C at the far end of the heatsink, and then it merges with fresh intake air and drops even lower (40C or less).

So while hot air into the CPU intake sounds bad, I'm confident they mathed this out and tested how much heat actually passes through... In all honesty, I think the numbers I posted above are way too high, because even when I see 70C CPU temps, the heatpipes on my CPU cooler are usually cold to the touch.
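The "merges with fresh intake air" step is just a flow-weighted average. A back-of-the-envelope sketch, where every flow rate and temperature is an assumed illustration value, not a measurement of the actual card:

```python
def mixed_air_temp(streams):
    """Flow-weighted mean temperature of merging air streams.

    streams: list of (flow_cfm, temp_c) tuples for each air stream.
    Assumes equal air density, so volume flow stands in for mass flow.
    """
    total_flow = sum(flow for flow, _ in streams)
    return sum(flow * temp for flow, temp in streams) / total_flow

# Say the flow-through fan pushes ~30 CFM of 50C exhaust upward, and it
# mixes with ~90 CFM of 30C case intake before reaching the CPU cooler:
print(mixed_air_temp([(30, 50.0), (90, 30.0)]))  # 35.0 C
```

With assumptions like these, the CPU cooler sees air only a few degrees above ambient, which lines up with the "lukewarm at worst" guess above.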
 
I made a rule: never upgrade without at least a 50% performance/capacity gain
1080 to 3090 smashes that, and then the 1080 smashes the 980 in the VR system

Same really; the only mildly compelling Turing part was the 2080 Ti, and my lord, they wanted over 2k AUD for one for the longest time...
 