
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

That is garbage. The CPU always works together with the GPU. The CPU instructs the GPU to do things all the time.

I tried, yet you made sure to prove to me you have absolutely no idea what you are talking about on every occasion. Do you realize how idiotic it would be if I were to write, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?

Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me on everything I say. May you wallow in your ignorance for as long as you want, I'm out.

 
I tried, yet you made sure to prove to me you have absolutely no idea what you are talking about on every occasion. Do you realize how idiotic it would be if I were to write, say, a + b in a GPU shader and the CPU had to instruct the GPU step by step on how to do that? What would even be the point of having a GPU?

Shaders run exclusively on the GPU, line by line, instruction by instruction, with no intervention from the CPU side; that's what GPUs do. You know precisely nothing about this subject, yet you insist on correcting me on everything I say. May you wallow in your ignorance for as long as you want, I'm out.

You are just insanely inaccurate about everything you say.

I never said CPU does the computation of the GPU line by line. That is something you just invented somehow. You just can't stop lying about what I said, you really don't give a f*** it seems.

The CPU can give instructions to the GPU, and the GPU will actually compute those instructions. The CPU doesn't need to do the computation, but it does need to orchestrate what the GPU does. That is what that program someone wrote in C is doing. The program itself doesn't do anything either; it's just instructions for the CPU, and the CPU passes those instructions on to the GPU to compute whatever that program wants it to compute.

If you knew something about C, you would know it gets compiled down to assembly language and then down to machine code for x86 CPUs. Assembly targets a single instruction set, so it can't just run on anything you would like.

You are only projecting your own lack of knowledge onto others. You don't seem to be that bright either; literally everything you have said so far has been wrong, and even others have corrected you. It's actually hard for me to remember encountering someone as willfully ignorant, or even just as purposefully deceptive, as you on here before.
 
Maybe you have trouble remembering what you wrote so I will post the exact comments and order in which you replied.

Yes, it runs purely on the GPU, instruction by instruction for each instance of the shader.

To which you responded very confidently that this is "garbage":

That is garbage. The CPU always works together with the GPU. The CPU instructs the GPU to do things all the time.

Now you write :

I never said CPU does the computation of the GPU line by line. That is something you just invented somehow. You just can't stop lying about what I said, you really don't give a f*** it seems.

No buddy, you thought that idea was "garbage", you wrote that yourself not knowing even what I was talking about.

Stop, you are the laughing stock of everyone reading this.
 
How is that even supposed to be against what I said???

YOU said it runs line by line on the GPU. Then I replied that's garbage / totally wrong. I said it runs on both every time, and you don't seem to get that simple principle. You have posted total nonsense over and over again.

The CPU is still always the host processor; it orchestrates what the GPU has to do. The program just doesn't purely run on the GPU, but also on the CPU. You have no sense of the abstraction happening with that C program. I'm still accurate on that, despite your silly attempts to frame this otherwise. You make extremely illogical points, so that won't happen either way. But somehow you are still trying.
 
I said it runs on both every time, and you don't seem to get that simple principle.

It doesn't run on both; you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be?

The program just doesn't purely run on the GPU

The shader runs purely on the GPU, period. I don't know what this "program" is that you talk about; I wrote exclusively about shaders.

YOU said it runs line by line on the GPU.

Which is 100% correct. Everything in the shader, the code that I posted earlier, runs on the GPU. All of it, never on both.

You have no sense of the abstraction happening with that C program.

:roll: :roll: :roll: :roll:

IT'S NOT A C PROGRAM. It's a GLSL shader, written in a language with C-style syntax. I wrote that as clearly as possible; of course you don't have any idea of what that means. That's why you don't understand any of this either. This is what I am trying to show you.
 
It doesn't run on both; you have no clue in the slightest how these things work. I said above how unimaginably idiotic it would be for that to be the case. A shader is compiled and runs on the GPU alone, not on both. Can you still not understand how dumb that would be?



The shader runs purely on the GPU, period. I don't know what this "program" is that you talk about; I wrote exclusively about shaders.
The computation for the shader runs on the GPU, that's it. The C programs run on both, that's it. The discussion was about the C program, and you started it, so you're just out of your mind now, suddenly changing the facts to appear right. That's some weird, childish reasoning.

Let's sum up:
You were wrong on the TFLOPS argument.
You were wrong on the general-purpose argument.
You are still wrong on the C program argument, which, btw, you started and which has nothing at all to do with this whole discussion.

Or what did this have to do with anything now? You wrote this:
Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.

Of course they run on specialized hardware. It's called a Shading Unit inside a GPU. That's definitely as specialized as it gets, my friend. And there is no such thing as a "shading language" if it's written in C. C is an all-purpose language, not a shading one... whatever that is even supposed to mean.

Now stop this completely silly discussion. But you just don't seem to know when to stop, do you?
 
The discussion was about the C program

IT'S NOT A C PROGRAM

You can't figure that out even now? Read back that comment.

You were wrong on the TFLOPS argument.
You were wrong on the general-purpose argument.
You are still wrong on the C program argument, which, btw, you started and which has nothing at all to do with this whole discussion.

I was correct about all of them.

More TFLOPS indicate more performance.
GPUs are general purpose; that's why they are used for all sorts of things besides graphics. That's also why the term GPGPU (general-purpose computing on graphics processing units) exists. It's in the name.


Thoroughly explained why, can you do the same?

And there is no such thing as a "shading language" if it's written in C. C is an all-purpose language, not a shading one... whatever that is even supposed to mean.


Just for how long will you make me do this? Have I not proven you wrong enough times? I feel like a bully.

You're gonna use that cognitive dissonance thing and tell me that you never said that, right?
 
$699 3080???????????????????

OH MY GOD

Wut? This may be the same pricing as the RTX 2000 series. No Founders Edition mentioned in the charts, so the cheapest AIB ones may start from this price range too - just like with the RTX 2080.

Nope. To me Nvidia is only competing with itself at this point. In other words, how to get the Pascal crowd to move on and upgrade? Jensen even mentioned that in his talk!
Competition doesn't mean only the high end; it's there in the mid and entry ranges too.
 
IT'S NOT A C PROGRAM

You can't figure that out even now? Read back that comment.
You can keep changing and editing posts but that's ok.

You haven't answered the question of how that language (C or not, it doesn't matter) purely runs on the GPU. Everything has to go through the CPU at some point; the GPU and CPU work together constantly, for every process they carry out, to get the result of a game onto the screen. That's why you still need a CPU to play games. We wouldn't even need a CPU if you could just program on the GPU. That's definitely not how it works.

And either way, what is this point even trying to accomplish? What are you trying to reach with this exact point, besides all the other points that you were wrong on? You still haven't responded to this, so I'll quote you again:
Shading languages don't run on specialized hardware, they can't, they need generic all-purpose processors.
Are you actually saying Shading Units aren't specialized hardware when it's literally in the name??? :kookoo::kookoo::kookoo:
 
The power usage is crazy. Both the 90 and the 80 are above 300 W, and even the 70 is approaching the territory that used to be reserved for the 80 Ti cards. But then, if the 3070 is actually faster than the 2080 Ti by a good margin, that means power efficiency has been improved with Ampere. BTW, where is that 12-pin PCIe aux power connector?
 
Specs are looking pretty good (on paper), perhaps except the TDP numbers.
I thought the 220 watts for the 3070 wasn't a bad TDP for the performance they're advertising.
 
You haven't answered the question of how that language (C or not, it doesn't matter) purely runs on the GPU. Everything has to go through the CPU.

You write "int a = 1;" in a shader; it's compiled, then sent to the GPU, where it's executed. Is that too much to comprehend? Do you understand what "run" means?

Are you actually saying Shading Units aren't specialized hardware when it's literally in the name??? :kookoo::kookoo::kookoo:

Both AMD and Nvidia stopped calling them that a very long time ago. Anyway, you still haven't addressed how shading languages don't exist despite this little thing:


(GLSL) is a high-level shading language with a syntax based on the C programming language.

And either way, what is this point trying to accomplish?

That you know nothing, you are a compulsive liar, and really, really stubborn. It wasn't my initial goal, but alas.
 
You write "int a = 1;" in a shader; it's compiled, then sent to the GPU, where it's executed. Is that too much to comprehend? Do you understand what "run" means?
So you still really think everything that code does only and purely runs on the GPU. That is completely wrong, and I won't keep repeating this point, because you don't seem to understand how computers or the von Neumann architecture work on a basic level.
Both AMD and Nvidia stopped calling them that a very long time ago. Anyway, you still haven't addressed how shading languages don't exist despite this little thing:
No, it's still called that, because Nvidia and AMD don't determine what basic hardware components are called. Intel definitely also uses Shading Units in their iGPUs.
https://en.wikipedia.org/wiki/Unified_shader_model
Unified shader architecture (or unified shading architecture) is a hardware design by which all shader processing units of a piece of graphics hardware are capable of handling any type of shading tasks. Most often Unified Shading Architecture hardware is composed of an array of computing units and some form of dynamic scheduling/load balancing system that ensures that all of the computational units are kept working as often as possible.


That you know nothing, you are a compulsive liar, and really, really stubborn. It wasn't my initial goal, but alas.

Again, totally projecting. What was your initial goal, other than presenting how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems that you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you that you had to go to this length to defend a useless point that doesn't even further anything about the discussion. You seem to just be venting because something went wrong in your life?
 
So you still really think everything that code does only and purely runs on the GPU. That is completely wrong

It's a fucking GPU shader! Of course it runs only on the GPU, THAT'S WHAT IT'S FOR! My God, you can't be this dense; really, I hope you're just a bad troll. :roll:

you don't seem to understand how computers or Von Neumann Architecture works on a basic level.

Lmao you're just writing some random ass computer science thingies that you know about.

Again, totally projecting. What was your initial goal, other than presenting how much you know about shaders but not the overall picture of how a computer works? I don't need to know how everything works, because I know how abstraction in computer architecture works; that's the point of having it. So, you still haven't told me about your initial goal. It seems that you just wanted to talk s*** at other members of the forum and nothing else. You are truthfully a sad and silly human being. I kind of pity you that you had to go to this length to defend a useless point that doesn't even further anything about the discussion. Did you just need to vent from something that went wrong in your life?

Damn, are you getting emotional? Want a tissue or something? You're not gonna cry on me, are you?
 
It's a fucking GPU shader! Of course it runs only on the GPU, THAT'S WHAT IT'S FOR! My God, you can't be this dense; really, I hope you're just a bad troll. :roll:

You have no idea what a small line of higher-level code is compiled to in the computer and what components carry out that code. And get this, it doesn't even matter; it's called abstraction! But you have no clue what that is, do you?

Now, you still haven't answered why you even wrote that comment about languages not running on specialized hardware (which they actually can, with the help of the CPU), since it literally had nothing to do with anything being discussed. It's not relevant to the specialized-hardware point that you were wrong about; you can still have specialized hardware, like Shading Units or TMUs or ROPs... It's also not relevant to the TFLOPS argument that you were also wrong about. So what was it even relevant for, you silly goose? Tell me. I asked at least 3 times already and you can't answer, but I can continue asking. :laugh: :laugh: :laugh:
Lmao you're just posting some random ass computer science thingies that you know about.
Sure I am. I just have a CS degree for nothing. But you are the expert... Probably you tried programming a game once, it didn't even work out, and now you like to play yourself up as someone of importance here. It's not working for you; sorry to be the one who tells you the truth.
Damn, are you getting emotional? Want a tissue or something? You're not gonna cry on me, are you?
I'm not the one who needs to cry, but the people who are around you. And they probably also need to run from what kind of pathetic liar and wannabe you are. I won't even say troll, because you somehow manage to be way beneath that.
 
Finally a real leap with a focus on CUDA cores; even the 3070 has more than the 2080 Ti. A bit disappointed by the memory amounts, though: 12 GB for the 3070 and 16 GB for the 3080 would've been better.
Wonder what AMD will do now; they have to launch a big GPU, we need the power to run games at 4K.
 
Can anyone calculate how many more fps for 1440p going from a 2080 Ti to a 3090?
 
Wait,
Oh, so from what I've read, each core can do INT+FP or FP+FP, vs the previous generation's INT+FP only? It's still 2 cores inside, but only one when doing specific operations?
 
I'm thankful and glad for the probable generational leap, and for the price of the two lower, more mainstream models.
 
Wait,
Oh, so from what I've read, each core can do INT+FP or FP+FP, vs the previous generation's INT+FP only? It's still 2 cores inside, but only one when doing specific operations?
Interesting, can someone explain this for the laymen among us.
 
I know Nvidia recommends a 750 W PSU for the 3080, but I'm hoping my 650 W Gold-rated PSU will suffice; reviews/time will tell.
 
Interesting, can someone explain this for the laymen among us.

Read through this concurrent FP+INT article

Basically, the 2080 Ti has 13.5 TFLOPs of FP32 and 13.5 TFLOPs of INT32. If a game fully leverages both FP32 and INT32 instructions, then the 2080 Ti would effectively have 27 TFLOPs of combined FP+INT. So, depending on how many INT32 instructions are used, the 2080 Ti's effective TFLOPs range from 13.5 to 27. From the SoTR example, 38 out of 100 instructions are INT; that means the 2080 Ti effectively has 13.5 + 13.5 × (38/62) = 21.77 TFLOPs.

Meanwhile, the 20 TFLOPs of the 3070 are fixed (FP32 or FP32+INT32) and do not depend on the game engine's usage of INT instructions.
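The arithmetic in that SoTR example can be sketched as a small script; the 38% INT share and the equal-width FP32/INT32 pipes are taken from the post above, and the function name itself is just for illustration:

```python
# Effective combined throughput for a Turing-style GPU with separate
# FP32 and INT32 pipes of equal width (e.g. 2080 Ti: 13.5 TFLOPs each).
def effective_tflops(fp32_tflops: float, int_share: float) -> float:
    """int_share = fraction of instructions that are INT32 (0.0-1.0).

    While the FP pipe chews through the FP portion, the INT pipe runs
    the INT portion concurrently, so effective throughput scales by
    1 / (1 - int_share), capped at 2x for a 50/50 mix.
    """
    fp_share = 1.0 - int_share
    return fp32_tflops * (1.0 + int_share / fp_share)

# SoTR example from the post: 38 of every 100 instructions are INT
print(round(effective_tflops(13.5, 0.38), 2))  # 21.77
```

A 50/50 mix (`int_share=0.5`) gives exactly 27 TFLOPs, the cap mentioned above; an all-FP workload stays at the base 13.5.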

@Vya Domus and @PowerPC You guys forgot that the 2080 Ti can do concurrent FP32+INT32, so effectively the 2080 Ti has 27 TFLOPs in rare instances.
 
I know Nvidia recommends a 750 W PSU for the 3080, but I'm hoping my 650 W Gold-rated PSU will suffice; reviews/time will tell.
I wouldn't push it.
If I ever build a desktop again, I would never mess with or cheap out on the power supply. I would buy everything else used, but the power supply I would buy new. So I wouldn't push it.
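For anyone doing the same 650 W math, here is a rough sanity check. Apart from the 3080's 320 W board power, every wattage below is an illustrative assumption, not a measurement:

```python
# Rough sustained-draw estimate for a 3080 build.
# Only gpu_w comes from Nvidia's spec; cpu_w and rest_w are assumptions.
gpu_w = 320    # RTX 3080 board power (Nvidia's stated figure)
cpu_w = 150    # high-end desktop CPU under gaming load (assumed)
rest_w = 80    # motherboard, RAM, fans, storage (assumed)

sustained_w = gpu_w + cpu_w + rest_w
psu_w = 650

headroom = 1 - sustained_w / psu_w
print(sustained_w, f"{headroom:.0%}")  # 550 15%
```

Sustained draw isn't the whole story, though: short transient spikes are part of why Nvidia recommends a 750 W unit, so waiting for reviews is the right call.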
 
Well well well, looks like a lot of rose coloured crystal balls were off the mark.

A 3080 is definitely the card I was waiting for!
 