Monday, July 2nd 2018

NVIDIA "GT104" Based GeForce GTX 1180 Surfaces on Vietnamese Stores

A Vietnamese online store has put up the first listing of a GeForce GTX 1180-based ASUS ROG Strix graphics card. The store even posted some specifications of the card, beginning with it being based on the "GT104" silicon from the "Turing" series. With "Turing," NVIDIA appears to be forking its GPU architectures into chips that feature DPFP (double-precision floating point) cores and Tensor cores, and chips that lack both (featuring only SPFP cores). "Turing" is probably a fork of "Volta" that drops both DPFP CUDA cores and Tensor cores and sticks to the cheaper GDDR6 memory architecture, while "Volta"-based GPUs, such as the TITAN V, implement pricier HBM2 memory.

Among the specifications of the GeForce GTX 1180 are 3,584 CUDA cores and 16 GB of GDDR6 memory across a 256-bit wide memory interface. The memory is clocked at 14 GHz (GDDR6-effective), which works out to 448 GB/s of memory bandwidth. Pre-launch prices, just like most specifications, tend to be bovine excrement; in this case the price converts to a little over USD $1,500, and isn't really relevant. What is interesting, however, is the availability date of September 28.
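To sanity-check the bandwidth figure above: peak memory bandwidth is simply the effective per-pin data rate times the bus width, divided by eight bits per byte. A minimal sketch (the function name and the GTX 1080 Ti comparison figures are ours, not from the listing):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gb/s) x bus width (bits) / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored GTX 1180: 14 Gbps effective GDDR6 on a 256-bit bus
print(memory_bandwidth_gbs(14, 256))  # 448.0 GB/s
# GTX 1080 Ti for comparison: 11 Gbps GDDR5X on a 352-bit bus
print(memory_bandwidth_gbs(11, 352))  # 484.0 GB/s
```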
Source: samcuu (Reddit)

122 Comments on NVIDIA "GT104" Based GeForce GTX 1180 Surfaces on Vietnamese Stores

#76
xkm1948
RejZoRHoly shit, I literally can't say anything about NVIDIA anymore without everyone being f**king pissy about it. Get a life ffs.
I have a very happy life. You are the one who has been going on for several pages now about a simple naming scheme that does not fit your narrative. Get help @RejZoR FFS
Posted on Reply
#77
rtwjunkie
PC Gaming Enthusiast
SDR82Well, the current GPU generation is called the "Geforce 10 Series" (read "ten" and not "one thousand", OK). Before Pascal, Nvidia incremented the number in the series by 100, starting with the Geforce 100 Series all the way up to the 900 series, where that naming scheme ended. Going forward the series numbers will increase by +1 instead of +100. Hence the next generation will be the "Geforce 11 Series".
It is still a jump by 100. Check your math. No change, and it didn't end at 900, since 900 plus 100 equals 1000, or ten hundred. It's not unusual in the business world. Just look at dates: eleven-hundred, twelve-hundred, nineteen-hundred and ninety-eight, the year ten sixty-six.
Posted on Reply
#78
bug
RejZoRHoly shit, I literally can't say anything about NVIDIA anymore without everyone being f**king pissy about it. Get a life ffs.
You went on for several pages about 1080Ti's power saving being "stupid". Now you're making a fuss about numbers on a box (leaked in Vietnam, no less). Did you really not see this coming?
Posted on Reply
#79
15th Warlock
cucker tarlsonStill not as misleading as rebranding 480 as 580 and 290 as 390.
btw AMD did the same thing with the RX 560, but cut down the cards after the initial reviews, and did it secretly. Tell me that wasn't intentional :rolleyes: Just like their fake launch MSRP for Vega to get better initial reviews. You know, the real shady stuff that AdoredTV doesn't mention on his channel.
The 580 was not a rebrand of the 480; it had more CUDA cores (512 vs 480) and a higher clock, and it also ran much quieter and at lower temps than the 480.

Also, as far as I remember there was no GTX 390; while the 3xx series were a rebrand of the 2xx series, there was never a flagship release. Cards in the 3xx series were mostly only available to OEMs, and Nvidia all but skipped the 3xx moniker and went directly to the 4xx series; they did the same thing with the 8xx series.

Back on topic: yeah, this is basically a poor photoshop of the 1080 Strix. I should know, I have two of those cards. The specs might also be off, but we'll find out soon.

I recommend extreme caution preordering any of these cards until we have official confirmation from Nvidia about the 11xx or 20xx series, whatever they decide to use.

On a side note, I have no preference for a specific numbering scheme for the new cards; all I want is an upgrade to an architecture that has been around for over 2 years already...
Posted on Reply
#80
qubit
Overclocked quantum bit
cucker tarlsonIf it was real it should have "16G" or "whatever G" on the side of the box. It's just this edited shot of 1080 strix. Even shadows are precisely the same.
Dang well spotted. :D
Posted on Reply
#81
Basard
15th WarlockThe 580 was not a rebrand of the 480; it had more CUDA cores (512 vs 480) and a higher clock, and it also ran much quieter and at lower temps than the 480.

Also, as far as I remember there was no GTX 390; while the 3xx series were a rebrand of the 2xx series, there was never a flagship release. Cards in the 3xx series were mostly only available to OEMs, and Nvidia all but skipped the 3xx moniker and went directly to the 4xx series; they did the same thing with the 8xx series.

Back on topic: yeah, this is basically a poor photoshop of the 1080 Strix. I should know, I have two of those cards. The specs might also be off, but we'll find out soon.

I recommend extreme caution preordering any of these cards until we have official confirmation from Nvidia about the 11xx or 20xx series, whatever they decide to use.

On a side note, I have no preference for a specific numbering scheme for the new cards; all I want is an upgrade to an architecture that has been around for over 2 years already...
Pretty sure he's talking about AMD, as they had 580/480 and 390/290.


Holy shit you guys all gotta settle down though. lol......
Posted on Reply
#82
15th Warlock
BasardPretty sure he's talking about AMD, as they had 580/480 and 390/290.


Holy shit you guys all gotta settle down though. lol......
Got it, his post sent me on a trip down memory lane; it triggered memories of my old GTX 480s and their big-ass heatpipes protruding out of the heatsink shroud. They were like the Harley-Davidson of video cards LMAO

I remember upgrading to the 580 a year after, and what a difference that was... Good times.

Anyway, apologies to the OP for the mistake, just an old fart talking about old technology you kids probably don't remember anything about.

As for settling down... All I can say is: get off my lawn! :laugh::kookoo:
Posted on Reply
#83
Arjai
15th Warlock... All I can say is: get off my lawn! :laugh::kookoo:
And go ring somebody else's Door Bell!!
Posted on Reply
#84
cucker tarlson
15th Warlockjust an old fart talking about old technology you kids probably don't remember anything about.

As for settling down... All I can say is: get off my lawn! :laugh::kookoo:
Lol my first card was voodoo
Posted on Reply
#85
NTM2003
Going to have to buy one probably to run the games this fall. But hope not
Posted on Reply
#86
cucker tarlson
NTM2003Going to have to buy one probably to run the games this fall. But hope not
How's your 1080 doing at 4K tho? Mine is still ripping it at 1440p. I seriously consider skipping the 1180, or maybe getting a used TX if the prices of used ones drop to 1170 levels, like what happened to the TXM after AIB 1070s beat it.
Posted on Reply
#87
15th Warlock
cucker tarlsonLol my first card was voodoo
That 3dfx loading screen before every game started, man those were the days! Uncontested 3D acceleration domination! No other card came close to a Voodoo!

I had the 6MB Canopus version of the Voodoo, man I loved that card, a big upgrade from my S3 3D "decelerator" hahaha

I tip my hat to you sir :respect:

Nowadays we have stagnation and a market driven by the whims of supply and demand, I miss the old days haha
ArjaiAnd go ring somebody else's Door Bell!!
Hahaha yeah, you get the idea :p
Posted on Reply
#88
cucker tarlson
15th WarlockThat 3dfx loading screen before every game started, man those were the days! Uncontested 3D acceleration domination! No other card came close to a Voodoo!

I had the 6MB Canopus version of the Voodoo, man I loved that card, a big upgrade from my S3 3D "decelerator" hahaha

I tip my hat to you sir :respect:

Nowadays we have stagnation and a market driven by the whims of supply and demand, I miss the old days haha
I forgot I had s3 savage 8mb too. Probably cause it was shit. Just no other word to describe it.

I still like what we have now. The blazing fast GPUs that can render unbelievable resolutions, technologies like variable refresh rate, though they both come at a hefty price.

If someone back then showed me a game like Witcher 3 running at 100fps on a 3440x1440 ultrawide curved LED panel, with everything just butter smooth and no tearing/lag thanks to g-sync, I'd crap my pants instantly. Now it's becoming standard for enthusiast PC builders.
Posted on Reply
#89
NTM2003
cucker tarlsonHow's your 1080 doing at 4K tho ? Mine is still ripping it at 1440p, I seriously consider skipping 1180 or maybe getting a used TX if the prices of used ones drop to 1170 levels like what happend to TXM after AIB 1070s beat it.
It's been holding up well, I haven't had any problems with it. I just got it last April, so really it's still new.
Posted on Reply
#90
15th Warlock
cucker tarlsonI forgot I had s3 savage 8mb too. Probably cause it was shit. Just no other word to describe it.

I still like what we have now. The blazing fast GPUs that can render unbelievable resolutions, technologies like variable refresh rate, though they both come at a hefty price.

If someone back then showed me a game like Witcher 3 running at 100fps on a 3440x1440 ultrawide curved LED panel, with everything just butter smooth and no tearing/lag thanks to g-sync, I'd crap my pants instantly. Now it's becoming standard for enthusiast PC builders.
Yes, can't deny that, we saw an amazing period of performance growth; it just sucks that us regular mortals without access to a $3,000 Titan V haven't been able to experience what a quantum leap in architecture brings to gaming.

We are way overdue for an update to what's currently available in the market; also, price fluctuations during the last generation of video cards have really left a bad taste in my mouth.

Here's hoping for more market stability once the overlords at AMD and Nvidia finally clear their inventories and decide to release their new video cards :toast:
Posted on Reply
#91
dalekdukesboy
cucker tarlsonIf it was real it should have "16G" or "whatever G" on the side of the box. It's just this edited shot of 1080 strix. Even shadows are precisely the same.
Not sure which is better, your screenname cleverly invoking Tucker Carlson, or your Trump meme where he calls Jim Acosta of CNN Fake News...
Posted on Reply
#92
cucker tarlson
15th WarlockYes, can't deny that, we saw an amazing period of performance growth; it just sucks that us regular mortals without access to a $3,000 Titan V haven't been able to experience what a quantum leap in architecture brings to gaming.

We are way overdue for an update to what's currently available in the market; also, price fluctuations during the last generation of video cards have really left a bad taste in my mouth.

Here's hoping for more market stability once the overlords at AMD and Nvidia finally clear their inventories and decide to release their new video cards :toast:
Currently the best we have is 4K/60 or 3440x1440/100 monitors, and a 1080 Ti can pretty much saturate both. Not if you consider minimum fps, but average fps is definitely going to hover around the max capabilities of modern enthusiast screens.
Posted on Reply
#93
ppn
We want a full 7 nm Volta, 5,376 cores at only 314 mm², that runs at 2.5 GHz, unlike current Volta hitting the power limit at 1.5 GHz. I don't care about an 1180 clone of the 1080 Ti with 5 GB of bonus memory and insane pricing, fake or not. The best thing I can trick myself into accepting is an 1170 with 60% more performance than the 1070 at $399.
Posted on Reply
#94
cucker tarlson
ppnWe want a full 7 nm Volta, 5,376 cores at only 314 mm², that runs at 2.5 GHz, unlike current Volta hitting the power limit at 1.5 GHz. I don't care about an 1180 clone of the 1080 Ti with 5 GB of bonus memory and insane pricing, fake or not. The best thing I can trick myself into accepting is an 1170 with 60% more performance than the 1070 at $399.
1170 at 1080Ti performance would be a good upgrade even for a 1080 owner like me, especially if the TDP is very good. I don't like those +300W monsters. Had 980Ti, no more such power hogs in my rig.


btw, if we're talking rumours and fake news here, I've seen people say they're bringing back the separate shader clock/core clock speeds, known from their earlier cards.
Posted on Reply
#95
qubit
Overclocked quantum bit
cucker tarlsonHad 980Ti, no more such power hogs in my rig.
How times change. I remember how the power usage was considered to be really good in the reviews at release, for both power efficiency and framerate performance, and now it's considered a power hog. Heck, my 780 Ti with just 3GB RAM and a previous-gen GPU was a powerhouse! :laugh:

I've also got a 1080 (see specs) and it blows both of these cards away in every way, as you'd expect.
Posted on Reply
#96
cucker tarlson
qubitHow times change. I remember how the power usage was considered to be really good in the reviews at release, for both power efficiency and framerate performance, and now it's considered a power hog. Heck, my 780 Ti with just 3GB RAM and a previous-gen GPU was a powerhouse! :laugh:

I've also got a 1080 (see specs) and it blows both of these cards away in every way, as you'd expect.
The difference I saw between 980Ti and 1080 was just amazing to me in a practical sense. Not only did I see noticeably better performance out of the box, but I could OC the card and still keep the fan speed at 50%. 980Ti (MSI Gaming 6G with twinfrozr 5) had it set 65-70% out of the box in auto mode. 1080 sits at 38-40% on auto.
Posted on Reply
#97
R0H1T
I think you need to step off the internet for a while RejZor, enjoy the World Cup watching Brazil get closer to a 6th title & if you dislike Germany it's like Christmas coming early.
Posted on Reply
#98
cucker tarlson
R0H1TI think you need to step off the internet for a while RejZor, enjoy the World Cup watching Brazil get closer to a 6th title & if you dislike Germany it's like Christmas coming early.
I still bet the French talents are gonna explode with form in the semis and the final.

I can't believe how much nicer this thread is without the screeching of the guy.
Posted on Reply
#99
Ruru
S.T.A.R.S.
cucker tarlsonHad 980Ti, no more such power hogs in my rig.
Had R9 290, didn't give a crap about power consumption, and I still don't. The reference cooler was the only truly bad thing about it.

Why did a crappy photoshop like this even make it to TPU news? o_O
Posted on Reply