
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

Joined
Apr 18, 2015
Messages
234 (0.07/day)
Viewing angles are meaningless for gaming imo. For comfortable gaming you're facing the monitor dead on anyway. Besides, even if you lean a bit to either side, trust me, in the heat of battle you'll NEVER notice tiny gradients of color that are a bit off. And with pixel response times of 1 ms (TN) compared to 5 ms (IPS), there's zero shadowing. When I first brought it home, the image was so sharp in motion that it felt weird to look at, even during insane motion (Natural Selection 2). Or the road in NFS Hot Pursuit 2010: I could actually see the road texture sharply, where on my old monitor it was just a blurry mess, and that was a 2 ms 75 Hz gaming screen. But it was an older TN panel and it showed its age a bit.

True and true.
For sure you will not notice the viewing angles during gaming. For me, I don't care about anti-aliasing, for example, and I game with it disabled most of the time, although in some games I could easily enable it without dropping below 60 fps. If you look carefully at a static image you will notice it, but during movement and action... not really.

I'm also pretty sure that the low response time does make a difference, and I do plan to move to 120 Hz myself, but somehow I still find it hard to let go of my old monitor, which still works perfectly and has served me well for so many years.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
On the anti-aliasing, I disagree. Shading techniques used in games like Doom 3 or FEAR do mask jaggy edges quite a bit, but you'll still notice them heavily on thin elements like wire fences, railings, tree branches, electrical wires hanging in the air, and first-person held guns (because they are static and not part of the environment). Things like this drive me insane, seeing stupid jaggies moving around.

I'd personally rather use FXAA or MLAA and lose a tiny bit of sharpness to get smoothed edges than have 100% sharp textures and jaggy edges. Games use heavy post-processing anyway, so even without FXAA the textures will feel blurry. And FXAA/MLAA in most cases filter edges like a 24x FSAA mode. Not in all conditions, but most of the time, and that's great. Especially since they barely affect framerate.
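For anyone curious why FXAA/MLAA are so cheap compared to MSAA/FSAA: they only operate on the finished frame. Here's a deliberately tiny Python sketch of the general idea (not real FXAA or MLAA, and the function names are just for illustration): find spots where luma jumps sharply between neighbouring pixels and blend across them.

```python
def luma(rgb):
    # Rec. 709-style luma weights.
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def toy_post_aa(row, threshold=0.25):
    """Blend the left pixel of each detected edge toward its right neighbour.
    `row` is a list of (r, g, b) tuples in 0..1; real FXAA/MLAA are far smarter."""
    out = list(row)
    for x in range(len(row) - 1):
        if abs(luma(row[x]) - luma(row[x + 1])) > threshold:
            out[x] = tuple((a + b) / 2 for a, b in zip(row[x], row[x + 1]))
    return out

# A hard black/white step (a "jaggy" edge) gets softened into a gray transition.
print(toy_post_aa([(0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1)]))
# [(0, 0, 0), (0.5, 0.5, 0.5), (1, 1, 1), (1, 1, 1)]
```

Real implementations search along the edge direction and weight the blend, but the cost profile is the same: one post-process pass over the image, which is why the framerate hit is tiny.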
 
Joined
Apr 2, 2011
Messages
2,850 (0.57/day)
I've got to ask a fundamental question here. Do more colors really matter?

Before we go on, I've got to qualify. Every single color of light is a combination of the wavelengths we perceive. Our three types of color-detecting cells respond over a relatively narrow range of energies. As such, the difference between any two colors (in CIELAB terms) can be represented roughly as deltaE = sqrt((L2-L1)^2 + (a2-a1)^2 + (b2-b1)^2), where the deltaE has to be about 2.3 or greater for a human being to notice any difference in coloration. https://en.wikipedia.org/wiki/Color_difference

The longer explanation, with a bit of background and why the measurement is still subjective, can be found here: http://zschuessler.github.io/DeltaE/learn/
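If you want to play with the threshold yourself, here's a minimal Python sketch of that formula (the CIE76 version of deltaE); the two Lab triplets below are made-up example values, not measured shades.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two CIELAB (L*, a*, b*) colors."""
    return math.sqrt(sum((c2 - c1) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical example values: two reds that differ slightly in a* and b*.
red_a = (45.0, 60.0, 40.0)
red_b = (45.5, 61.0, 41.5)

diff = delta_e_cie76(red_a, red_b)
print(round(diff, 2))   # 1.87
print(diff >= 2.3)      # False: below the ~2.3 just-noticeable-difference threshold
```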


Short answer, though, is that at some point adding more colors does not produce appreciable differences. A few years back a professional troll decided to make the point that makeup was crap by asking a simple question: was there a difference between Revlon's "Red Reinvented" and "Cherry Desirable"? The short answer is that I couldn't tell, and without the color values I wouldn't have thought them any different. http://www.thebestpageintheuniverse.net/c.cgi?u=fashion


With respect to monitors, does going from 10-bit to 12-bit produce an appreciable difference? I can't honestly say that I know, but my experience points me to the conclusion that more monitor generally trumps more accurate colors. Heck, I can't remember the last time the difference between a slightly less blue purple would have been as much of a deal breaker as not having access to a relatively cheap 1920x1080 monitor. Personally, pixel count > refresh rate (assuming a 30 Hz minimum) > color fidelity. Maybe I'm backwards, but I'd prefer Polaris to push 4K before AMD started focusing on color depth.
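For what it's worth, the raw numbers behind those bit depths are easy to work out (this says nothing about whether the extra steps are actually visible):

```python
# Shades per channel and total representable colors at common panel bit depths.
for bits in (8, 10, 12):
    per_channel = 2 ** bits          # 256, 1024, 4096
    total_colors = per_channel ** 3  # three channels: R, G, B
    print(f"{bits}-bit: {per_channel} shades per channel, {total_colors:,} colors")
# 8-bit already gives ~16.8 million colors; 10-bit ~1.07 billion; 12-bit ~68.7 billion.
```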
 
Joined
Aug 15, 2008
Messages
5,941 (0.99/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
I personally prefer resolution/performance over color. I work on calibrated IPS monitors all day but when I play a game I don't really care about that crap.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
I personally prefer resolution/performance over color. I work on calibrated IPS monitors all day but when I play a game I don't really care about that crap.

Neither does the game. It wouldn't matter whether you did or not, even if your system is capable of it. Games will be the last to adopt such standards due to time/cost.

The reason more color (10-bit/12-bit) is being talked about is that it's already part of the 4K standards:

Broadcast TV
4K Blu-ray
TV manufacturers / slowest to adopt

They all have their groups that have established the base of what is to be. The one thing all three haven't adopted as a standard yet, but which will likely be included in the future, is the luminance data.

When you take 4K-standard images and shrink them to 1440p or 1080p, they will look a lot better than a 1080p-standard image or movie. Provided your system is capable, of course. I think both AMD VSR and Nvidia DSR have proven that for gamers.
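For the curious, the core of what VSR/DSR-style downsampling does can be sketched in a few lines: render larger, then average blocks of samples down to the output resolution, so every displayed pixel carries more information than a native render would. A toy grayscale version in Python (downscale_2x is a made-up helper using a plain 2x2 box filter; real drivers use smarter filters):

```python
def downscale_2x(image):
    """Box-filter a 2D grayscale image by 2x in each dimension.
    4K (3840x2160) -> 1080p (1920x1080): each output pixel averages 4 samples."""
    h, w = len(image), len(image[0])
    return [
        [
            (image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# Toy 4x4 "render" downscaled to 2x2; jagged pixel-level detail gets averaged out.
tile = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]
print(downscale_2x(tile))  # [[0.5, 0.5], [0.5, 0.5]]
```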
 
Joined
Apr 2, 2011
Messages
2,850 (0.57/day)
Neither does the game. It wouldn't matter whether you did or not, even if your system is capable of it. Games will be the last to adopt such standards due to time/cost.

The reason more color (10-bit/12-bit) is being talked about is that it's already part of the 4K standards:

Broadcast TV
4K Blu-ray
TV manufacturers / slowest to adopt

They all have their groups that have established the base of what is to be. The one thing all three haven't adopted as a standard yet, but which will likely be included in the future, is the luminance data.

When you take 4K-standard images and shrink them to 1440p or 1080p, they will look a lot better than a 1080p-standard image or movie. Provided your system is capable, of course. I think both AMD VSR and Nvidia DSR have proven that for gamers.

I was under the impression that super sampling was already doing this in the gaming space (though not to that large of a resolution difference). Am I mistaken?
 
Joined
Aug 15, 2008
Messages
5,941 (0.99/day)
I'm already 4k native :rockout:
 
Joined
Apr 18, 2015
Messages
234 (0.07/day)
On the anti-aliasing, I disagree. Shading techniques used in games like Doom 3 or FEAR do mask jaggy edges quite a bit, but you'll still notice them heavily on thin elements like wire fences, railings, tree branches, electrical wires hanging in the air, and first-person held guns (because they are static and not part of the environment). Things like this drive me insane, seeing stupid jaggies moving around.

I'd personally rather use FXAA or MLAA and lose a tiny bit of sharpness to get smoothed edges than have 100% sharp textures and jaggy edges. Games use heavy post-processing anyway, so even without FXAA the textures will feel blurry. And FXAA/MLAA in most cases filter edges like a 24x FSAA mode. Not in all conditions, but most of the time, and that's great. Especially since they barely affect framerate.

I think we should agree to disagree, and be happy that we are all different.

A few years back I couldn't play DiRT 3 without anti-aliasing, because in the menu there were some floating boxes that had jaggies on all sides and were really annoying, but the exception doesn't make the rule. I honestly don't feel the benefit of AA while gaming. And I feel that with AA enabled there is a very slight input lag, even if the fps is pretty much the same and in general very high.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
CES 2016: AMD Shows Polaris Architecture Demo

CES 2016: AMD FreeSync working over HDMI

CES 2016: AMD Talks Polaris GPU and HDR Monitors (HDR support coming to 300 series)

CES 2016: AMD Talks Bringing HDMI Support to FreeSync (HDMI FreeSync monitor availability starting Q1 2016)
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
They said they're supporting both GDDR5 and HBM. I wonder if the same silicon will do both, or if they're making a lower-end silicon for GDDR5 (99% sure they're separate).

I still don't get the point of FreeSync on HDMI. If you want to use FreeSync, you should be buying a DisplayPort monitor. No tech in the HDMI ecosystem (except Radeon cards) will support FreeSync over HDMI. HDMI, the standard, doesn't officially support adaptive sync, whereas DisplayPort does. I doubt the HDMI standard will ever add adaptive sync because, consoles excepted, none of the home theater equipment should fall below the prescribed framerate.

I want HDR now!
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
They said they're supporting both GDDR5 and HBM. I wonder if the same silicon will do both, or if they're making a lower-end silicon for GDDR5 (99% sure they're separate).

I still don't get the point of FreeSync on HDMI. If you want to use FreeSync, you should be buying a DisplayPort monitor. No tech in the HDMI ecosystem (except Radeon cards) will support FreeSync over HDMI. HDMI, the standard, doesn't officially support adaptive sync, whereas DisplayPort does. I doubt the HDMI standard will ever add adaptive sync because, consoles excepted, none of the home theater equipment should fall below the prescribed framerate.

I want HDR now!

I suspect HDMI will be cost-efficient for lower-end panels: value series that stick to an entry-level VRR range of 35-60 Hz.
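A toy model of what that VRR window means in practice (vrr_refresh_hz is a made-up function, not the actual FreeSync-over-HDMI protocol, just the clamping behaviour): the panel refreshes when a frame is ready, as long as the implied rate stays inside its supported range.

```python
def vrr_refresh_hz(frame_time_ms, min_hz=35.0, max_hz=60.0):
    """Toy model: refresh tracks the frame rate inside the panel's VRR window,
    and falls back to the nearest window edge outside it (no LFC modelled here)."""
    instantaneous_hz = 1000.0 / frame_time_ms
    return max(min_hz, min(max_hz, instantaneous_hz))

for frame_time in (12.0, 20.0, 40.0):        # ~83 fps, 50 fps, 25 fps
    print(frame_time, "ms ->", vrr_refresh_hz(frame_time), "Hz")
# 12 ms is capped at the 60 Hz ceiling, 20 ms tracks at 50 Hz, 40 ms hits the 35 Hz floor.
```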

Radeon Technologies Group Real-Time High Dynamic Range Demo
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
You'd think DisplayPort would be cheaper to implement partly because the royalties are much lower (like $0.20 versus $1 per port).

Yeah, HDR looks like what screens should look like. Right now, looking at my taskbar, it should be pitch black, but it isn't, because my monitor is incapable of showing the white of the open browser at the same time as the black of the taskbar.
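The taskbar example is really just a contrast-ratio limit; with made-up but typical numbers:

```python
# Assumed, typical figures for a non-HDR desktop LCD: ~300 nits peak white and
# roughly 1000:1 simultaneous (static) contrast on the panel.
white_nits = 300.0
static_contrast = 1000.0

darkest_black = white_nits / static_contrast
print(darkest_black)  # 0.3 nits: next to a bright white window, the "black" taskbar glows gray
```

HDR displays attack exactly that limit with higher peak brightness plus local dimming or self-emissive pixels, which is why black next to white can finally look black.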
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super (2800 MHz @ 1.0 volt, ~60 MHz overclock, -0.1 volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
I think AMD should compare to their own cards rather than Nvidia's. Comparing their card to a 950: even if AMD claims it's the same machine, given AMD's history it wouldn't shock me if there was some trickery involved, e.g. the AMD PR slides for the Fury X vs. the 980 Ti that showed the Fury X 30% faster than a 980 Ti.

Instead of comparing to Nvidia's last-gen cards, use your own and show how much you have improved since your last gen. It won't look good if, come April, Pascal cards drop on the market and end up roasting this. Just my opinion on the matter.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
I think AMD should compare to their own cards rather than Nvidia's. Comparing their card to a 950: even if AMD claims it's the same machine, given AMD's history it wouldn't shock me if there was some trickery involved, e.g. the AMD PR slides for the Fury X vs. the 980 Ti that showed the Fury X 30% faster than a 980 Ti.

Instead of comparing to Nvidia's last-gen cards, use your own and show how much you have improved since your last gen. It won't look good if, come April, Pascal cards drop on the market and end up roasting this. Just my opinion on the matter.

I agree that these showcase demos lean towards the demonstrated hardware no matter who is showing off. Maxwell is the more power-efficient 28 nm architecture, so why not compare against it?

Is AMD supposed to hold off until Nvidia showcases their 16 nm part and then do a comparison? Is AMD supposed to ask Nvidia to lend them a card that hasn't even been announced, just to please their fans in such comparisons?

Nvidia compares their current gen to two prior; they don't compare their cards to a gen revision. I wonder if you offer the same level of criticism towards them.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Isn't the GTX 950 the lowest-power model based on Maxwell? Even if it is, the demonstration is moot because we don't know how capable that Polaris chip is. It could be R9 Nano-like with a 90 W cap, or it could be something like Tonga, which competes directly with the GTX 950 in the market.

I think AMD selected the GTX 950 to demonstrate that Polaris can be more power efficient than Maxwell under the same workload. The comparison to Maxwell makes sense if AMD gets Polaris to market before Pascal is available (which seems likely, seeing how AMD is already demonstrating chips). Maxwell loses the power efficiency argument when compared to Polaris (well, duh).
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Isn't the GTX 950 the lowest-power model based on Maxwell? Even if it is, the demonstration is moot because we don't know how capable that Polaris chip is. It could be R9 Nano-like with a 90 W cap, or it could be something like Tonga, which competes directly with the GTX 950 in the market.

I think AMD selected the GTX 950 to demonstrate that Polaris can be more power efficient than Maxwell under the same workload. The comparison to Maxwell makes sense if AMD gets Polaris to market before Pascal is available (which seems likely, seeing how AMD is already demonstrating chips). Maxwell loses the power efficiency argument when compared to Polaris (well, duh).

It's the 750, but yes, we don't know the specs of the Polaris GPU, and the 750 probably doesn't have enough muscle to do 60 fps in that scenario; if it had been chosen, then you'd have arby asking why they didn't use a 950. Sane people will wait for their respective parts to be out on the market and then compare, unless you're an impulse buyer.
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
Isn't the GTX 950 the lowest-power model based on Maxwell? Even if it is, the demonstration is moot because we don't know how capable that Polaris chip is. It could be R9 Nano-like with a 90 W cap, or it could be something like Tonga, which competes directly with the GTX 950 in the market.

I think AMD selected the GTX 950 to demonstrate that Polaris can be more power efficient than Maxwell under the same workload. The comparison to Maxwell makes sense if AMD gets Polaris to market before Pascal is available (which seems likely, seeing how AMD is already demonstrating chips). Maxwell loses the power efficiency argument when compared to Polaris (well, duh).
Not likely, since AMD has only had a prototype for what, 1.5 months or so, whereas Nvidia has had theirs for a good 7 months or so. AMD still has work to do on the chip before it's ready, so it's not likely to be out before Pascal. If it were on a mature node then maybe they could do it in 4-5 months, but with a new node, cutting corners is not really a smart move.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
When NVIDIA announced Pascal, they only presented a CGI rendering of the chip and said what it was about. NVIDIA hasn't demonstrated Pascal yet, most likely because they're waiting on TSMC (again).

AMD likely obtained this Polaris chip a long time ago from Samsung. The Samsung Galaxy S6 had a 14 nm chip, and that was announced back in March. AMD announced Polaris around the first of the year and was demonstrating it a few days later.

Samsung's 14 nm process is mature, whereas TSMC's 16 nm process is not.
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
When NVIDIA announced Pascal, they only presented a CGI rendering of the chip and said what it was about. NVIDIA hasn't demonstrated Pascal yet, most likely because they're waiting on TSMC (again).

AMD likely obtained this Polaris chip a long time ago from Samsung. The Samsung Galaxy S6 had a 14 nm chip, and that was announced back in March. AMD announced Polaris around the first of the year and was demonstrating it a few days later.

Samsung's 14 nm process is mature, whereas TSMC's 16 nm process is not.
Um, Nvidia's part was taped out like 7 months ago; the likely reason Nvidia hasn't said much is simply to keep info about it secret. There's no reason to release specs or anything about it when they don't need to. AMD talking about theirs is a bit of a double-edged sword in a sense. Just because they haven't demonstrated it yet doesn't mean anything, as there is nothing to go on. Just because a fab was used to make a small low-power ARM CPU doesn't mean it's mature and good enough for a large GPU.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Um, Nvidia's part was taped out like 7 months ago; the likely reason Nvidia hasn't said much is simply to keep info about it secret. There's no reason to release specs or anything about it when they don't need to. AMD talking about theirs is a bit of a double-edged sword in a sense. Just because they haven't demonstrated it yet doesn't mean anything, as there is nothing to go on. Just because a fab was used to make a small low-power ARM CPU doesn't mean it's mature and good enough for a large GPU.

For it being a secret you sure talk like you know a lot about it. :laugh:
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Just because they haven't demonstrated it yet doesn't mean anything, as there is nothing to go on. Just because a fab was used to make a small low-power ARM CPU doesn't mean it's mature and good enough for a large GPU.
Except that AMD already demonstrated a large GPU (large compared to ARM chips anyway, which are well under a billion transistors). The GTX 950 has 3 billion transistors, so the Polaris demo chip had to have 2+ billion to keep pace.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Except that AMD already demonstrated a large GPU (large compared to ARM chips anyway, which are well under a billion transistors). The GTX 950 has 3 billion transistors, so the Polaris demo chip had to have 2+ billion to keep pace.

Just FYI
Anandtech said:
The MXM modules in the picture are almost component-for-component identical to the GTX 980 MXM photo we have on file. So it is likely that these are not Pascal GPUs, and that they're merely placeholders.

On that note, while DRIVE PX 2 was the focus of NVIDIA’s presentation, it was GTX Titan X that was actually driving all of the real-time presentations
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
For it being a secret you sure talk like you know a lot about it. :laugh:
For someone to claim that there is an issue when they don't know a damn thing either is worse. Yeah, I don't know if they are just keeping it a secret or if they have issues, but keeping it a secret is more likely, since if there were issues we probably would have heard. Someone claiming there is a problem based on info they pulled outta their #(*@, well, they are the ones that started this all and you should be giving them crap first.
When NVIDIA announced Pascal, they only presented a CGI rendering of the chip and said what it was about. NVIDIA hasn't demonstrated Pascal yet, most likely because they're waiting on TSMC (again).
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
For someone to claim that there is an issue when they don't know a damn thing either is worse. Yeah, I don't know if they are just keeping it a secret or if they have issues, but keeping it a secret is more likely, since if there were issues we probably would have heard. Someone claiming there is a problem based on info they pulled outta their #(*@, well, they are the ones that started this all and you should be giving them crap first.

Please point to where I said there was an issue.

All I see is your usual AMD thread trolling (not just in this forum).
 