
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
That is garbage. CPU always works together with the GPU. CPU instructs the GPU to do things all the time.

I tried, yet you made sure to prove to me on every occasion that you have absolutely no idea what you are talking about. Do you realize how idiotic it would be if I wrote, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?

Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me about everything I say. May you wallow in your ignorance for as long as you want; I'm out.

 
Joined
Sep 11, 2015
Messages
624 (0.19/day)
I tried, yet you made sure to prove to me on every occasion that you have absolutely no idea what you are talking about. Do you realize how idiotic it would be if I wrote, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?

Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me about everything I say. May you wallow in your ignorance for as long as you want; I'm out.

You are just insanely inaccurate about everything you say.

I never said CPU does the computation of the GPU line by line. That is something you just invented somehow. You just can't stop lying about what I said, you really don't give a f*** it seems.

The CPU can give instructions to the GPU, and the GPU will actually compute them. The CPU doesn't need to do the computation, but it does need to orchestrate what the GPU does. That is what the program someone wrote in C is doing. The program itself doesn't do anything either; it's just instructions for the CPU, and the CPU passes those instructions on to the GPU to compute whatever the program wants it to compute.
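The host/device split being argued about here can be sketched as a toy simulation (hypothetical names, nothing like a real graphics driver): the host compiles and submits the work once, and the "device" then executes the whole kernel per element without the host stepping through it.

```python
# Toy sketch of the CPU/GPU division of labour (hypothetical names, not a
# real driver): the host "compiles" shader-style source once and submits it;
# the "device" then executes it for every element with no per-instruction
# involvement from the host.

def compile_shader(source):
    """Host side: lower 'a + b'-style source into something executable, once."""
    code = compile(source, "<shader>", "eval")
    return lambda a, b: eval(code, {"a": a, "b": b})

def dispatch(kernel, a_buf, b_buf):
    """Device side: run the compiled kernel over every element of the buffers."""
    return [kernel(a, b) for a, b in zip(a_buf, b_buf)]

shader = compile_shader("a + b")                     # CPU: compile once
result = dispatch(shader, [1, 2, 3], [10, 20, 30])   # "GPU": executes alone
print(result)  # [11, 22, 33]
```

This is only an analogy for the orchestration point: the host sets up and launches the work, but does not walk through it instruction by instruction.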

If you knew something about C, you would know it gets compiled down to assembly language and then down to machine code for x86 CPUs. Assembly targets a single instruction set, so it can't just run on anything you like.
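As a loose analogy for that lowering step (CPython rather than a C compiler, and the exact opcodes vary by Python version), you can watch source get compiled into instructions for one specific target:

```python
# Loose analogy only: CPython lowers "a + b" into bytecode for its own VM,
# much as a C compiler lowers C into machine code for one specific ISA.
import dis

code = compile("a + b", "<src>", "eval")
dis.dis(code)  # prints the lowered instructions, e.g. LOAD_NAME / BINARY_OP
```

The printed bytecode only runs on CPython's virtual machine, just as x86 machine code only runs on an x86 processor.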

You are only projecting your own lack of knowledge onto others. You don't seem to be that bright either; literally everything you have said so far has been wrong, and others have corrected you as well. It's actually hard for me to remember encountering anyone as willfully ignorant, or as purposefully deceptive, on here before.
 
Last edited:
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
Maybe you have trouble remembering what you wrote, so I will post the exact comments and the order in which you replied.

Yes, it runs purely on the GPU, instruction by instruction for each instance of the shader.

To which you responded very confidently that this is "garbage":

That is garbage. CPU always works together with the GPU. CPU instructs the GPU to do things all the time.

Now you write:

I never said CPU does the computation of the GPU line by line. That is something you just invented somehow. You just can't stop lying about what I said, you really don't give a f*** it seems.

No buddy, you thought that idea was "garbage"; you wrote that yourself, not even knowing what I was talking about.

Stop, you are the laughing stock of everyone reading this.
 
Last edited:
Joined
Sep 11, 2015
Messages
624 (0.19/day)
Maybe you have trouble remembering what you wrote, so I will post the exact comments and the order in which you replied.



To which you responded very confidently that this is "garbage":



Now you write:



No buddy, you thought that idea was "garbage", you wrote that yourself.

Stop, you are the laughing stock of everyone reading this.
How is that even supposed to be against what I said???

YOU said it runs line by line on the GPU. Then I replied that's garbage / totally wrong. I said it runs on both every time and you don't seem to get that simple principle. You have posted total nonsense over and over again.

The CPU is still always the host processor; it orchestrates what the GPU has to do. The program just doesn't purely run on the GPU, but also on the CPU. You have no sense of the abstraction happening with that C program. I'm still accurate on that, despite your silly attempts to frame this otherwise. You make extremely illogical points, so that won't happen either way. But somehow you are still trying.
 
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
I said it runs on both every time and you don't seem to get that simple principle.

It doesn't run on both; you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be?

The program just doesn't purely run on the GPU

The shader runs purely on the GPU, period. I don't know what this "program" you talk about is; I wrote exclusively about shaders.

YOU said it runs line by line on the GPU.

Which is 100% correct. Everything in the shader, the code that I posted earlier, runs on the GPU. All of it, never on both.

You have no sense of the abstraction happening with that C program.

:roll: :roll: :roll: :roll:

IT'S NOT A C PROGRAM. It's a GLSL shader, written in a language with C-style syntax. I wrote that as clearly as possible; of course you have no idea what that means. That's why you don't understand any of this either. This is what I am trying to show you.
 
Last edited:
Joined
Sep 11, 2015
Messages
624 (0.19/day)
It doesn't run on both; you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be?



The shader runs purely on the GPU, period. I don't know what this "program" you talk about is; I wrote exclusively about shaders.
The computation for the shader runs on the GPU, that's it. The C program runs on both, that's it. The discussion was about the C program, and you started it, so you're just out of your mind now, suddenly changing the facts to appear right. That's some weird, childish reasoning.

Let's sum up:
You were wrong on the TFLOPS argument.
You were wrong on the general-purpose argument.
You are still wrong on the C program argument, which, by the way, you started, and which has nothing at all to do with this whole discussion.

Or what did this have to do with anything now? You wrote this:
Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.

Of course they run on specialized hardware. It's called a Shading Unit, inside a GPU. That's definitely as specialized as it gets, my friend. And there is no such thing as a "shading language" if it's written in C. C is an all-purpose language, not a shading one... whatever that is even supposed to mean.

Now stop this completely silly discussion. But you just don't seem to know when to stop, do you?
 
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
The discussion was about the C program

IT'S NOT A C PROGRAM

You can't figure that out even now? Read back that comment.

You were wrong on the TFLOPS argument.
You were wrong on the general-purpose argument.
You are still wrong on the C program argument, which, by the way, you started, and which has nothing at all to do with this whole discussion.

I was correct about all of them.

More TFLOPS indicate more performance.
GPUs are general purpose; that's why they are used for all sorts of things besides graphics. That's also why GPGPU (general-purpose computing on graphics processing units) exists. It's in the name.


I thoroughly explained why; can you do the same?

And there is no such thing as a "shading language" if it's written in C. C is an all-purpose language, not a shading one... whatever that is even supposed to mean.


How much longer will you make me do this? Have I not proven you wrong enough times? I feel like a bully.

You're gonna use that cognitive dissonance thing and tell me that you never said that, right?
 
Last edited:
Joined
Feb 18, 2017
Messages
688 (0.24/day)
$699 3080???????????????????

OH MY GOD

Wut? This may be the same pricing as the RTX 2000 series. No Founders Edition is mentioned in the charts, so the cheapest AIB cards may start from this price range too, just like with the RTX 2080.

Nope. To me, Nvidia is only competing with itself at this point. In other words: how do they get the Pascal crowd to move on and upgrade? Jensen even mentioned that in his talk!
Competition doesn't mean only the high end. It's there at the mid and entry ranges too.
 
Last edited:
Joined
Sep 11, 2015
Messages
624 (0.19/day)
IT'S NOT A C PROGRAM

You can't figure that out even now ? Read back that comment.
You can keep changing and editing posts but that's ok.

You haven't answered the question of how that language (C or not, it doesn't matter) purely runs on the GPU. Everything has to go through the CPU at some point; the GPU and CPU constantly work together, on every process, to get the result of a game onto the screen. That's why you still need a CPU to play games. We wouldn't even need a CPU if you could just program the GPU directly. That's definitely not how it works.

And either way, what is this point even trying to accomplish? What are you trying to reach with this exact point, besides all the other points that you were wrong on? You still haven't responded to this, so I'll quote you again:
Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.
Are you actually saying Shading Units aren't specialized hardware when it's literally in the name??? :kookoo::kookoo::kookoo:
 
Last edited:
Joined
Jan 27, 2015
Messages
454 (0.13/day)
System Name Marmo / Kanon
Processor Intel Core i7 9700K / AMD Ryzen 7 5800X
Motherboard Gigabyte Z390 Aorus Pro WiFi / X570S Aorus Pro AX
Cooling Noctua NH-U12S x 2
Memory Corsair Vengeance 32GB 2666-C16 / 32GB 3200-C16
Video Card(s) KFA2 RTX3070 Ti / Asus TUF RX 6800XT OC
Storage Samsung 970 EVO+ 1TB, 860 EVO 1TB / Samsung 970 Pro 1TB, 970 EVO+ 1TB
Display(s) Dell AW2521HFA / U2715H
Case Fractal Design Focus G / Pop Air RGB
Audio Device(s) Onboard / Creative SB ZxR
Power Supply SeaSonic Focus GX 650W / PX 750W
Mouse Logitech MX310 / G1
Keyboard Logitech G413 / G513
Software Win 11 Ent
The power usage is crazy. Both the 90 and the 80 are above 300 W, and even the 70 is approaching territory that used to be reserved for the 80 Ti cards. But then, if the 3070 is actually faster than the 2080 Ti by a good margin, that means power efficiency has improved with Ampere. BTW, where is that 12-pin PCIe aux power connector?
 
Joined
Mar 23, 2016
Messages
4,841 (1.53/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB something
Memory Corsair DDR5-6000 small OC to 6200
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB,,WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v23H2
Specs are looking pretty good (on paper), perhaps except the TDP numbers.
I thought 220 W for the 3070 wasn't a bad TDP for the performance they're advertising.
 
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
You haven't answered the question of how that language (C or not, it doesn't matter) purely runs on the GPU. Everything has to go through the CPU.

You write "int a = 1;" in a shader; it's compiled, then sent to the GPU, where it's executed. Is that too much to comprehend? Do you understand what "run" means?

Are you actually saying Shading Units aren't specialized hardware when it's literally in the name??? :kookoo::kookoo::kookoo:

Both AMD and Nvidia stopped calling them that a long time ago. Anyway, you still haven't addressed how shading languages supposedly don't exist, despite this little thing:


(GLSL) is a high-level shading language with a syntax based on the C programming language.

And either way, what is this point trying to accomplish?

That you know nothing, that you are a compulsive liar, and that you are really, really stubborn. It wasn't my initial goal, but alas.
 
Last edited:
Joined
Sep 11, 2015
Messages
624 (0.19/day)
You write "int a = 1;" in a shader; it's compiled, then sent to the GPU, where it's executed. Is that too much to comprehend? Do you understand what "run" means?
So you still really think everything that code does only and purely runs on the GPU. That is completely wrong, and I won't keep repeating this point, because you don't seem to understand how computers or the Von Neumann architecture works at a basic level.
Both AMD and Nvidia stopped calling them that a long time ago. Anyway, you still haven't addressed how shading languages supposedly don't exist, despite this little thing:
No, it's still called that, because Nvidia and AMD don't determine what basic hardware components are called. Intel definitely also uses Shading Units in its iGPUs.
https://en.wikipedia.org/wiki/Unified_shader_model
Unified shader architecture (or unified shading architecture) is a hardware design by which all shader processing units of a piece of graphics hardware are capable of handling any type of shading tasks. Most often Unified Shading Architecture hardware is composed of an array of computing units and some form of dynamic scheduling/load balancing system that ensures that all of the computational units are kept working as often as possible.


That you know nothing, that you are a compulsive liar, and that you are really, really stubborn. It wasn't my initial goal, but alas.

Again, totally projecting. What was your initial goal, other than presenting how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you for going to such lengths to defend a useless point that doesn't even further the discussion. Do you just need to vent because something went wrong in your life?
 
Last edited:
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
So you still really think everything that code does only and purely runs on the GPU. That is completely wrong

It's a fucking GPU shader! Of course it runs only on the GPU, THAT'S WHAT IT'S FOR! My God, you can't be this dense; really, I hope you're just a bad troll. :roll:

you don't seem to understand how computers or the Von Neumann architecture works at a basic level.

Lmao, you're just writing some random-ass computer science thingies that you know about.

Again, totally projecting. What was your initial goal, other than presenting how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you for going to such lengths to defend a useless point that doesn't even further the discussion. Do you just need to vent because something went wrong in your life?

Damn, are you getting emotional? Want a tissue or something? You're not gonna cry on me, are you?
 
Joined
Apr 8, 2010
Messages
1,008 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Over9000.gif
 
Joined
Sep 11, 2015
Messages
624 (0.19/day)
It's a fucking GPU shader! Of course it runs only on the GPU, THAT'S WHAT IT'S FOR! My God, you can't be this dense; really, I hope you're just a bad troll. :roll:

You have no idea what a small line of higher-level code is compiled down to, or which components carry out that code. And get this: it doesn't even matter, it's called abstraction! But you have no clue what that is, do you?

Now, you still haven't answered why you even wrote that comment about languages not running on specialized hardware (which they actually can, with the help of the CPU), since it literally had nothing to do with anything being discussed. It's not relevant to the specialized hardware point that you were wrong about; you can still have specialized hardware, like Shading Units or TMUs or ROPs. It's also not relevant to the TFLOPS argument that you were also wrong about. So what was it even relevant for, you silly goose? Tell me. I've asked at least three times already, and I can keep asking. :laugh: :laugh: :laugh:
Lmao, you're just posting some random-ass computer science thingies that you know about.
Sure I am. I just have a CS degree for nothing. But you are the expert... You probably tried programming a game once, it didn't work out, and now you like to play yourself up as someone of importance here. It's not working for you; sorry to be the one who tells you the truth.
Damn, are you getting emotional? Want a tissue or something? You're not gonna cry on me, are you?
I'm not the one who needs to cry; the people around you probably do, though. And they probably also need to run from the kind of pathetic liar and wannabe you are. I won't even say troll, because you somehow manage to be way beneath that.
 
Last edited:
D

Deleted member 185088

Guest
Finally a real leap, with a focus on CUDA cores; even the 3070 has more than the 2080 Ti. A bit disappointed by the memory amounts, though: 12 GB for the 3070 and 16 GB for the 3080 would've been better.
I wonder what AMD will do now. They have to launch a big GPU; we need the power to run games at 4K.
 
Joined
Aug 27, 2011
Messages
998 (0.21/day)
Processor Intel core i9 13900ks sp117 direct die
Motherboard Asus Maximus Apex Z790
Cooling Custom loop 3*360 45mm thick+ 3 x mo-ra3 420 +Dual D5 pump and dual ddc pump
Memory 2x24gb Gskill 8800c38
Video Card(s) Asus RTX 4090 Strix
Storage 2 tb crucial t700, raid 0 samsung 970 pro 2tb
Display(s) Sammsung G7 32”
Case Dynamic XL
Audio Device(s) Creative Omni 5.1 usb sound card
Power Supply Corsair AX1600i
Mouse Model O-
Keyboard Hyper X Alloy Origin Core
Can anyone calculate how many more fps I'd get at 1440p going from a 2080 Ti to a 3090?
 
Joined
Jan 31, 2011
Messages
2,210 (0.44/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.SKill Ripjaws V 2x16GB DDR4 3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB , 1TB Crucial MX500 SSD sata,
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse Rog Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
Wait,
Oh, so from what I've read, each core can do INT+FP or FP+FP, vs. the previous generation's INT+FP only? It's still two cores inside, but only one when doing specific operations?
 
Joined
Dec 10, 2019
Messages
27 (0.01/day)
Location
United Kingdom
Processor 13700K 5.6GHz 8/16
Motherboard PRO Z690-A DDR4
Memory V10 2x16GB 4100MHz CL15
Video Card(s) 3080 Ti Gamerock
Benchmark Scores 3DMark CPU Pr. Test /No-L. Nitrogen 13700K @6.0 1300-14543 (5.)
I'm thankful and glad for the probable generational leap, and for the price of the two lower, more mainstream models.
 
Last edited:
D

Deleted member 185088

Guest
Wait,
Oh, so from what I've read, each core can do INT+FP or FP+FP, vs. the previous generation's INT+FP only? It's still two cores inside, but only one when doing specific operations?
Interesting, can someone explain this for the laymen among us.
 
Joined
Apr 30, 2008
Messages
4,897 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor Intel Core i5 12600H
Motherboard MinisForum NAB6 Lite Board
Cooling Mini PC Cooling
Memory Apacer 16GB 3200Mhz
Video Card(s) Intel Iris Xe Graphics
Storage Kingston 512GB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum NAB6 Lite Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Home 64bit
Benchmark Scores Don't do them anymore.
I know Nvidia recommends a 750 W PSU for the 3080, but I'm hoping my 650 W Gold-rated PSU will suffice; reviews/time will tell.
 
Joined
Nov 11, 2016
Messages
3,403 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Interesting, can someone explain this for the laymen among us.

Read through this concurrent FP+INT article

Basically, the 2080 Ti has 13.5 TFLOPS of FP32 and 13.5 TFLOPS of INT32; if a game fully leverages both FP32 and INT32 instructions, the 2080 Ti would effectively have 27 TFLOPS of combined FP+INT. So, depending on how many INT32 instructions are used, the 2080 Ti's effective TFLOPS range from 13.5 to 27. From the SotTR example, 38 out of 100 instructions are INT, which means the 2080 Ti effectively has 13.5 + 13.5 × (38/62) = 21.77 TFLOPS.

Meanwhile, the 20 TFLOPS of the 3070 is fixed (FP32, or FP32+INT32) and does not depend on the game engine's usage of INT instructions.
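The arithmetic in the post above can be checked with a short sketch (the 13.5 TFLOPS figure and the 38% INT mix are the poster's estimates, not official specs):

```python
# Effective throughput when separate FP32 and INT32 pipes can dual-issue
# (the Turing case described above). Figures are the poster's estimates.
def effective_tflops(fp32_tflops, int_fraction):
    """FP32 rate plus the INT32 pipe's contribution, scaled by how much
    of the instruction mix is INT."""
    fp_fraction = 1.0 - int_fraction
    return fp32_tflops + fp32_tflops * (int_fraction / fp_fraction)

# Shadow of the Tomb Raider example: 38 out of 100 instructions are INT.
print(round(effective_tflops(13.5, 0.38), 2))  # 21.77
# A 50/50 mix gives the 27 TFLOPS upper bound mentioned above.
print(effective_tflops(13.5, 0.5))  # 27.0
```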

@Vya Domus and @PowerPC You guys forgot that the 2080 Ti can do concurrent FP32+INT32, so effectively the 2080 Ti has 27 TFLOPS in rare instances.
 
Last edited:
Joined
Feb 18, 2012
Messages
2,715 (0.58/day)
System Name MSI GP76
Processor intel i7 11800h
Cooling 2 laptop fans
Memory 32gb of 3000mhz DDR4
Video Card(s) Nvidia 3070
Storage x2 PNY 8tb cs2130 m.2 SSD--16tb of space
Display(s) 17.3" IPS 1920x1080 240Hz
Power Supply 280w laptop power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores Good enough for me
I know Nvidia recommends a 750W PSU for the 3080 but I'm hoping my 650W Gold rated PSU will suffice, reviews/time will tell.
I wouldn't push it.
If I ever build a desktop again, I would never mess with or cheap out on the power supply. I would buy everything else used, but the power supply I would buy new. So I wouldn't push it.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,169 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Well, well, well. Looks like a lot of rose-coloured crystal balls were off the mark.

A 3080 is definitely the card I was waiting for!
 