Tuesday, April 17th 2018

AMD Responds to NVIDIA's GPP: AIB Partners to Announce New Radeon-Exclusive Brands

In a blog post on its gaming website, AMD has put on the gloves for a pointed strike against NVIDIA's GPP initiative, which has already seen rivers of ink spilled and plenty of public discussion. In the post, entitled "Radeon RX Graphics: A Gamer's Choice", the company plants itself firmly on the ground NVIDIA usually leaves it to occupy: the freedom-of-choice, open-standards side of the equation.

The blog post reads as an anti-lock-in manifesto, extolling the virtues of PC gaming and the open-ended building and assembly of parts from various manufacturers that it's built upon. As a countermove to NVIDIA using its GPP initiative to push AIB partners into reserving their gaming brands exclusively for NVIDIA, AMD has come out of the gate saying the simple solution is for partners to announce new, AMD-exclusive brands as well. This is logical, was to be expected, and is really AMD's only play with the hand it has been dealt.
AMD's opinion is written on the walls of its blog post, though: "The freedom to tell others in the industry that they won't be boxed in to choosing proprietary solutions that come bundled with "gamer taxes" just to enjoy great experiences they should rightfully have access to." We've already seen one such brand announced today: ASUS's AMD-exclusive AREZ. Others will follow suit, and the only thing NVIDIA will likely be left with is users' opinions on whether this was really a necessary move from the company.


AMD's blog post follows in full:

Radeon RX Graphics: A Gamer's Choice
"Our proud pastime of PC gaming has been built on the idea of freedom. Freedom to choose. How to play the game. What to do and when to do it. And specifically, what to play it on. PC gaming has a long, proud tradition of choice. Whether you build and upgrade your own PCs, or order pre-built rigs after you've customized every detail online, you know that what you're playing on is of your own making, based on your freedom to choose the components that you want. Freedom of choice is a staple of PC gaming.

Over the coming weeks, you can expect to see our add-in board partners launch new brands that carry an AMD Radeon product. AMD is pledging to reignite this freedom of choice when gamers choose an AMD Radeon RX graphics card. These brands will share the same values of openness, innovation, and inclusivity that most gamers take to heart. The freedom to tell others in the industry that they won't be boxed in to choosing proprietary solutions that come bundled with "gamer taxes" just to enjoy great experiences they should rightfully have access to. The freedom to support a brand that actively works to advance the art and science of PC gaming while expanding its reach.
The key values that brands sporting AMD Radeon products will offer are:

A dedication to open innovation
AMD works tirelessly to advance PC gaming through close collaboration with hardware standards bodies, API and game developers, making our technologies available to all to help further the industry. Through our collaboration with JEDEC on memory standards like HBM and HBM2, Microsoft on DirectX and Khronos on Vulkan, and through the GPUOpen initiative where we provide access to a comprehensive collection of visual effects, productivity tools, and other content at no cost, we're enabling the industry to the benefit of gamers.

A commitment to true transparency through industry standards
Through industry standards like AMD FreeSync technology, we're providing the PC ecosystem with technologies that significantly enhance gamers' experiences, enabling partners to adopt them at no cost to consumers, rather than penalizing gamers with proprietary technology "taxes" and limiting their choice in displays.

Real partnerships with real consistency
We work closely with all our AIB partners, so that our customers are empowered with the best, high-performance, high quality gaming products and technologies available from AMD. No anti-gamer / anti-competitive strings attached.

Expanding the PC gaming ecosystem
We create open and free game development technologies that enable the next generation of immersive gaming experiences across PC and console ecosystems. These efforts have resulted in advancements such as AMD FreeSync adoption on TVs for Xbox One S or X, integration of forward looking "Vega" architecture features and technologies into Far Cry 5 without penalizing the competition, and inclusion of open sourced AMD innovations into the Vulkan API which game developers can adopt freely.
We pledge to put premium, high-performance graphics cards in the hands of as many gamers as possible and give our partners the support they need without anti-competitive conditions. Through the support of our add-in-board partners that carry forward the AMD Radeon RX brand, we're continuing to push the industry openly, transparently and without restrictions so that gamers have access to the best immersive technologies, APIs and experiences.

We believe that freedom of choice in PC gaming isn't a privilege. It's a right."
Source: AMD Gaming Blog

113 Comments on AMD Responds to NVIDIA's GPP: AIB Partners to Announce New Radeon-Exclusive Brands

#101
Vayra86
sith'ari: I have the Palit GTX 1080 GameRock ( www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/29.html ) in my brother's PC and never noticed any temperature problems.
As I can see from your system specs, you have MSI's GTX 1080 Aero ( www.newegg.com/Product/Product.aspx?Item=N82E16814127944 ), which uses a reference-like blower design, so it will perform similarly to the reference card.
If you look at the charts I posted, you'll see that the reference GTX 1080 can indeed reach 83°C. It's never a good idea to prefer a reference cooling design when you can choose a much better one.
It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit the thermal ceiling. It's because it was designed this way.

I have a 1080 Gaming X, and only at stock will it stay under 80°C - but even then, depending on the game, once the GPU is at 100% load and pushing high FPS or using all of its resources, you will see it go to 79-83°C over time. Kepler behaved the same, and all cards in between do it as well. They just boost until they hit that 83°C ceiling, and at 79°C they start clocking down or reducing the vcore to stay under it.

What really matters is what clocks you can push at that temp, not so much the temperature itself. That doesn't make it a hot GPU, by the way - in a relative sense, and for their performance and power usage, Kepler, Maxwell and Pascal run quite cool. But they are also designed to use that headroom to extract higher performance. Most air coolers can already push the clocks to the limit, so going cooler nets minimal gains. It's all very tightly managed, and it is exactly this refined form of GPU Boost that AMD is so sorely lacking, even today.

Charts & reviews =/= real life and performance under prolonged use and varied loads.
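
To illustrate what I mean, here's a rough Python sketch of that ride-the-ceiling behaviour. It's purely illustrative: the 83°C target is the only number taken from above, and the step sizes, voltages and thermal response are made-up values, not NVIDIA's actual GPU Boost parameters.

# Toy model of "boost until you hit the thermal ceiling" under sustained load.
# All constants here are assumptions for illustration only.

THERMAL_CEILING_C = 83.0   # assumed throttle point
BASE_CLOCK_MHZ = 1607
MAX_BOOST_MHZ = 1911
CLOCK_STEP_MHZ = 13        # assumed boost-bin granularity

def steady_state_temp(clock_mhz, ambient_c=25.0):
    """Toy steady-state temperature for a given clock (pure assumption)."""
    return ambient_c + 0.032 * clock_mhz

def boost_step(clock_mhz, temp_c):
    """Raise the clock while there is thermal headroom, back off at the ceiling."""
    if temp_c >= THERMAL_CEILING_C:
        return max(clock_mhz - CLOCK_STEP_MHZ, BASE_CLOCK_MHZ)   # throttle
    if temp_c < THERMAL_CEILING_C - 4:
        return min(clock_mhz + CLOCK_STEP_MHZ, MAX_BOOST_MHZ)    # boost
    return clock_mhz                                             # hold

clock, temp = BASE_CLOCK_MHZ, 40.0
for _ in range(200):                                    # sustained 100% load
    clock = boost_step(clock, temp)
    temp += 0.1 * (steady_state_temp(clock) - temp)     # temperature lags the clock

print(f"settles at ~{clock} MHz, ~{temp:.0f} C")        # ends up just under the ceiling

Whatever the real numbers are, the takeaway is the same: the card converges on its temperature target, and the interesting variable is the clock it can hold there.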
kanecvr: ...my GTX 1080 does 83°C and then throttles down. A lot. So it's not a cool card by any means. I replaced the blower-style cooler with an NZXT Kraken water cooler, and temps on the GPU went down... to 75°C... that is A LOT for water cooling. Worse yet, since the Kraken only covers the GPU, the whole card is almost uniformly heated to 70°C, which makes me worry. I installed VRAM heatsinks and a custom backplate, and temps went down to 64°C on the card and 72°C for the GPU - which is still a lot for a "cool card". As for power usage, the card draws 180-220 W with the stock cooler. It seems NVIDIA advertised the power usage in non-boost scenarios: at 1600 MHz it does indeed use 180 W, but at 1744 MHz (its boost clock at 83°C) it goes up to 220 W, and at 1911 MHz (its boost clock on water cooling) it goes up to a staggering 250-260 W...

So it's not a cool card, and it has pretty high power consumption - but it is pretty fast and a good product overall, though I was expecting better.
Your clocks pre and post water cooling prove my point too. Air gets these cards to max or near-max clock potential already, and water is 99% epeen value.
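
As an aside, your 180 W / 220 W / 250-260 W figures scale roughly the way the usual dynamic-power rule of thumb (P proportional to V² x f) predicts. A quick sketch - the voltages below are guesses I've plugged in for illustration, not measurements from your card:

# Rough check of how power scales with clock and voltage (dynamic power ~ C * V^2 * f).
# The voltages are assumed/typical values, not measurements of this particular card.

def scale_power(p0_w, f0_mhz, v0, f1_mhz, v1):
    """Scale a known power figure to a new clock/voltage point using P ~ V^2 * f."""
    return p0_w * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

baseline = (180.0, 1600, 0.90)             # 180 W at 1600 MHz, ~0.90 V (assumed voltage)
for f, v in [(1744, 0.95), (1911, 1.00)]:  # assumed voltages at the boosted clocks
    print(f"{f} MHz @ {v:.2f} V -> ~{scale_power(*baseline, f, v):.0f} W")

That lands around 219 W and 265 W - the same ballpark as your 220 W and 250-260 W readings - so there's nothing anomalous about the draw at 1911 MHz; it's just what V²·f scaling gives you.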
#102
bug
@Vayra86 I will add that these days chips (CPUs, GPUs) are designed to run at 80°C and above. You can try to lower that if you're pushing the transistors with extra voltage, but otherwise lowering the temps is akin to going 60 mph on a 75 mph road.
What matters when comparing cards is the TDP. That will go onto your electricity bill and determine how much cooling you need.

Edit: I will add that keeping temps in check used to be of paramount importance back when chips couldn't throttle themselves. But that was then.
#103
Adam Krazispeed
John Naylor: A lot of fluff, but it doesn't really say anything other than "We can't compete in these performance niches, so we'll spout platitudes instead." Since the subject came up, all I can remember is the old Wendy's commercial with the granny muttering "Where's the beef?". When NVIDIA came out with PhysX, AMD could have a) produced a competing technology or b) licensed it. When NVIDIA came out with G-Sync, they could have a) produced a competing technology or b) licensed it... instead they chose c) create a name similar to G-Sync, provide only part of the technology, and sell the lesser-featured package at a reduced price. AMD could have included a hardware module in FreeSync monitors, but they chose not to... some FreeSync monitor manufacturers did include such an MBR module, but those weren't well received, because once a FreeSync monitor added the hardware needed for motion blur reduction, it no longer had that big price advantage... and AMD never jumped on the MBR bandwagon because they chose instead to sell on price.

NVIDIA has been taking more and more control from its board partners - legally, driver-wise and physically - with successive generations. Now it is willing to give third-party vendors a boost by partnering with them to create high-performance model lines that customers are willing to pay for: "We will write our drivers to allow higher clocks if, during the install, they detect PCBs that meet certain criteria with regard to voltage control, cooling, etc."

And if they do so, all they are saying is: if you are using what we give you to increase mindshare and generate high margins, you can't let our competitor take advantage of the branding ***we*** helped you build. This is business as usual in America... newsflash... America is a capitalist, dog-eat-dog country... deal with it. If you own a pizza joint, Coca-Cola will give you a refrigerator to hold its products... if you want to put Pepsi in there, you violate the licensing agreement and we take back OUR fridge.

Where's the beef? If Asus calls its NVIDIA line Strix and its Radeon line Arez, so what? If AMD said that Asus could only sell AMD-based cards under the Arez name, would there be such a stink? Burger King can sell a 1/4-pound burger, but they can't call it "the Quarter Pounder". There is nothing anti-competitive, nothing more sinister about limiting the use of the name than there is about not putting our competitor's products in the free fridge we gave you. In the end, all AMD is saying is "well, we're gonna offer free fridges too"... and now when we buy pizza, we'll see two fridges... one with AMD stuff inside and one with NVIDIA... great, EXACTLY what I wanted... a way to read the logo on top of the fridge telling me this is where I can find an NVIDIA product and here's where I can find an AMD product. Nothing anti-competitive, more like truth in advertising. The NVIDIA Strix products of recent generations overclock by 14-31%. The AMD cards are in single digits for the most part. The only thing AMD loses with the name-limiting partnership agreement is that no one will buy a product thinking that because their NVIDIA Strix OCs by 25%, their AMD Strix can do the same.

I hope Intel soon does the same, as I am frustrated by confused users sending me proposed 8700K builds with X370 motherboards because they think X370 is a cheaper version of the Z370.
Wrong, buddy.

Ageia Technologies was the creator of PhysX, not NVIDIA. NVIDIA strong-armed them and acquired the tech from Ageia Technologies - do your research! The same thing they did to 3dfx back in 2002. Remember 3dfx? 3dfx was my favorite 3D accelerator of that era... then NVIDIA destroyed them, and I've never supported NVIDIA ever since.

en.wikipedia.org/wiki/Ageia

References:
  1. AGEIA Acquires Meqon Research AB, Mountain View, Calif., September 1, 2005.
  2. Smalley, Tim. "Nvidia set to acquire Ageia", bit-tech.net, 4 February 2008. Accessed at www.bit-tech.net/news/2008/02/04/nvidia_set_to_acquire_ageia/1 on 5 February 2008.
  3. NVIDIA Completes Acquisition of AGEIA Technologies, NVIDIA press release, Santa Clara, CA, February 13, 2008.
  4. Smalley, Tim. "Nvidia finalises Ageia deal, details future plans", bit-tech.net, 14 February 2008.
  5. "Overview". PhysX. GeForce. Retrieved 2 April 2013.
NVIDIA also made it so that even the Ageia PhysX cards would NOT work alongside ATI GPUs at the time - and they still don't work with ATI or AMD/RTG GPUs today (technically it was ATI back then; AMD now owns RTG, which is what ATI became).
#104
gamerman
1st,amd do ont earn 'gaming' 'ROG'markstheyr lausygpu's, and sure still it want flynviida wings of it.

all start nvidia own trademark and tts slogan 'The Way It's Meant To Be Played',and sureamd want part of it. and when we took amd,always cheat and handicap way...typical also thatcomment...just sympaty licking... huh!

2nd nvidias excellent gpuhelp us to lay 4Kgames for now,without itwe must use amd junk gpus... omggg!

amd,just satrt work..planning,building ,testingand thenrelease goodproduct as customer...for now 3 generation of amd gpus,we get ONLY old tech gpu with different namesnad withLAUSYTERRIBLE EFFICIENCY.

shame amd!
#105
Space Lynx
Astronaut
gamerman: 1st,amd do ont earn 'gaming' 'ROG'markstheyr lausygpu's, and sure still it want flynviida wings of it.

all start nvidia own trademark and tts slogan 'The Way It's Meant To Be Played',and sureamd want part of it. and when we took amd,always cheat and handicap way...typical also thatcomment...just sympaty licking... huh!

2nd nvidias excellent gpuhelp us to lay 4Kgames for now,without itwe must use amd junk gpus... omggg!

amd,just satrt work..planning,building ,testingand thenrelease goodproduct as customer...for now 3 generation of amd gpus,we get ONLY old tech gpu with different namesnad withLAUSYTERRIBLE EFFICIENCY.

shame amd!
Did you have a stroke while writing that? Or is English not your first language? My poor poor eyes...
#106
kanecvr
Vayra86: It doesn't matter. Even in a case with sufficient airflow and normal ambient temps, once you run an overclock, and regardless of cooler, Pascal will hit the thermal ceiling. It's because it was designed this way.

I have a 1080 Gaming X, and only at stock will it stay under 80°C - but even then, depending on the game, once the GPU is at 100% load and pushing high FPS or using all of its resources, you will see it go to 79-83°C over time. Kepler behaved the same, and all cards in between do it as well. They just boost until they hit that 83°C ceiling, and at 79°C they start clocking down or reducing the vcore to stay under it.

What really matters is what clocks you can push at that temp, not so much the temperature itself. That doesn't make it a hot GPU, by the way - in a relative sense, and for their performance and power usage, Kepler, Maxwell and Pascal run quite cool. But they are also designed to use that headroom to extract higher performance. Most air coolers can already push the clocks to the limit, so going cooler nets minimal gains. It's all very tightly managed, and it is exactly this refined form of GPU Boost that AMD is so sorely lacking, even today.

Charts & reviews =/= real life and performance under prolonged use and varied loads.

Your clocks pre and post water cooling prove my point too. Air gets these cards to max or near-max clock potential already, and water is 99% epeen value.
You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry - yet somehow, people say Vega is hot and hungry.
#107
bug
kanecvr: You're missing the point - the GTX 1080 RUNS HOT. The 1080 Ti even more so - and they are very power hungry - yet somehow, people say Vega is hot and hungry.
Let's see.
Temps:
1080Ti - 84C www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/34.html
Vega64 - 85C (when not holding back) www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/36.html

Avg power under load:
1080Ti - 231W
Vega64 - 292W

So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry. 60 W avg while gaming will show up on your electricity bill. Won't break the bank, but it will be there.
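
To put a rough number on it (the hours per day and electricity price below are assumptions I'm plugging in, not anything from the review):

# Back-of-the-envelope cost of a 60 W average difference while gaming.
# Hours per day and electricity price are assumptions for illustration.

extra_watts = 60          # avg extra draw under load (Vega 64 vs 1080 Ti, from above)
hours_per_day = 4         # assumed gaming time
usd_per_kwh = 0.13        # assumed electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> about ${kwh_per_year * usd_per_kwh:.0f}/year")

That works out to roughly 88 kWh, or on the order of ten dollars a year at those assumptions - real, but small.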
#108
Space Lynx
Astronaut
bug: Let's see.
Temps:
1080Ti - 84C www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/34.html
Vega64 - 85C (when not holding back) www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/36.html

Avg power under load:
1080Ti - 231W
Vega64 - 292W

So yeah, Vega isn't hotter than the 1080 Ti. But it's certainly more power hungry. 60 W avg while gaming will show up on your electricity bill. Won't break the bank, but it will be there.
:o news to me, my 1080 Ti MSI The Duke never breaks 62°C, and that's at 2060 MHz core in the Time Spy Extreme stress test - though to be fair, I do have an aggressive fan curve. So I suppose a 3-fan cooler with an aggressive fan curve is not a fair comparison :D
#109
bug
lynx29: :eek: news to me, my 1080 Ti MSI The Duke never breaks 62°C, and that's at 2060 MHz core in the Time Spy Extreme stress test - though to be fair, I do have an aggressive fan curve. So I suppose a 3-fan cooler with an aggressive fan curve is not a fair comparison :D
Yes, custom solutions work to varying degrees of effectiveness - that's why I picked reference designs for the comparison. Though at the end of the day, it's still Vega that outputs more heat.
#110
Caring1
lynx29: Did you have a stroke while writing that? Or is English not your first language? My poor poor eyes...
:roll::roll::roll::roll: