
EVGA Announces Cancelation of NVIDIA Next-gen Graphics Cards Plans, Officially Terminates NVIDIA Partnership

You lost me. I'm just saying that in all AMD material, they always use a reference card. I doubt they pay for AIBs' marketing.
... and you're still not understanding what I'm saying. All official Nvidia material also uses their Founders designs. That's not what I'm talking about at all - I'm talking about brands marketing their own products - say, EVGA promoting EVGA GPUs - and how this marketing is largely (but not entirely) paid for by GPU makers through marketing support programs. I'm not talking about chipmakers' first-party ads, I'm talking about ads from Asus, Gigabyte, MSI, EVGA, Powercolor, Sapphire, etc. All of these are significantly funded by chipmakers.

The days of GPUs as add-in cards are numbered anyway. It's a pretty open secret that the future of gaming is all cloud-based. PC will be the first to go, and we'll all be using SoCs or APUs while actual graphics "cards" all sit on cloud servers doing the work for you. Consoles will be the next thing to go, with portable units being the last. The concept of getting a graphics card for your computer, unless you are talking some multi-thousand-dollar workstation-type deal, will be as antiquated as a horse and buggy within a decade.
Sure. That's been promised for, what, a decade now? It has arguably gotten better in that time period, but it's still nowhere near a replacement for local hardware. The vast majority of the world is nowhere near having the internet connectivity necessary for this, and even for those who do, the disadvantages of cloud gaming typically far outweigh the advantages.
 
... and you're still not understanding what I'm saying. All official Nvidia material also uses their Founders designs. That's not what I'm talking about at all - I'm talking about brands marketing their own products - say, EVGA promoting EVGA GPUs - and how this marketing is largely (but not entirely) paid for by GPU makers through marketing support programs. I'm not talking about chipmakers' first-party ads, I'm talking about ads from Asus, Gigabyte, MSI, EVGA, Powercolor, Sapphire, etc. All of these are significantly funded by chipmakers.

OK, I get it now. Cross-promotion. Your initial point was that EVGA would be more supported than the rest, right? I doubt it; it's a small business, the others would find out and get pissed, and these things always come out inside the business, through employees who move, for example. But even if they have better deals, that should not be meaningful for the bottom line. Case in point: what happened to EVGA, while the others are still standing.
 


Sure. That's been promised for, what, a decade now? It has arguably gotten better in that time period, but it's still nowhere near a replacement for local hardware. The vast majority of the world is nowhere near having the internet connectivity necessary for this, and even for those who do, the disadvantages of cloud gaming typically far outweigh the advantages.
I wish I could give the conclusion of this post a thousand upvotes. APUs, by their very nature, will never be a replacement even for the midrange discrete GPUs like the RX6600. APUs made by merchant chip makers, i.e. everyone besides Apple, are limited by many economic factors from achieving their technological limits:
  1. Limited memory bandwidth
  2. Sharing of limited memory bandwidth with the CPU
  3. Limited die size (small dies for CPUs, and only a fraction of that space is devoted to the GPU)
  4. Limited power (35 to 65 W) compared to a discrete GPU (north of 100 W)
All of these are limits imposed by the necessity of these APUs being available in affordable laptops. When these limits don't apply, we can see great APUs like the various console chips that AMD has made. However, this will never be the case for the APUs that we can buy for desktops or buy in the form of a whole laptop.
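To put rough numbers on points 1 and 2, here's a quick back-of-envelope comparison using public spec-sheet values — a sketch of the bandwidth gap, not a benchmark:

```python
# Rough peak-bandwidth comparison for a typical laptop/desktop APU vs a
# midrange discrete card. Spec-sheet values; treat as ballpark figures.

def bandwidth_gbps(bus_width_bits, transfer_rate_mtps):
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers/s)."""
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

# Dual-channel DDR5-4800: 2 x 64-bit channels at 4800 MT/s
apu = bandwidth_gbps(128, 4800)       # ~76.8 GB/s, and shared with the CPU

# RX 6600: 128-bit GDDR6 at 14 Gbps, GPU-exclusive (plus 32 MB Infinity Cache)
rx6600 = bandwidth_gbps(128, 14000)   # ~224 GB/s

print(f"APU:     {apu:.1f} GB/s (shared)")
print(f"RX 6600: {rx6600:.1f} GB/s (dedicated)")
```

Roughly a 3x gap before you even account for the CPU eating into the APU's share.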
 
OK, I get it now. Cross-promotion. Your initial point was that EVGA would be more supported than the rest, right? I doubt it; it's a small business, the others would find out and get pissed, and these things always come out inside the business, through employees who move, for example. But even if they have better deals, that should not be meaningful for the bottom line. Case in point: what happened to EVGA, while the others are still standing.
... you seem to have a very ill informed idea of the relative power of the parties involved in such a negotiation. If, say, Gigabyte tells Nvidia "we heard you give more marketing support relative to sales to EVGA - we want you to give us a matching deal" - do you imagine Nvidia would just keel over and give them that? Of course not. Nvidia would say "Yes, and they are an exclusive partner, and they get more support in exchange for that exclusivity. You also work with AMD, so you'll not get the same level of support". There is literally nothing whatsoever an AIB could do about this outside of ending their deal with Nvidia outright. Which Nvidia knows, and reportedly uses to strong-arm AIB partners into very, very lopsided deals. AIB partners are pissed at Nvidia. They have been for years. But they haven't got a choice, unless they're willing to significantly scale back their business - which most companies aren't.

As for marketing support not being meaningful for their bottom lines ... what planet are you living on? Do you think they'd survive without marketing? Do you think AIB partners have the tens if not hundreds of millions of dollars necessary for a significant marketing presence in all the markets they are a part of? 'Cause they don't - outside of possibly Asus, being the juggernaut they are (but they also have way more products to market, which again eats into these funds).

When your highest revenue product segment has low single digit profit margins, even if you sell millions of that product and ASPs are high, you'll still be struggling to cover basic operating costs for your business, let alone have cash left over for marketing, R&D, etc.
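A quick illustration with entirely hypothetical numbers (these are not EVGA's actual figures) of how little a low-single-digit margin leaves over:

```python
# Illustrative only: hypothetical figures, not any AIB's real financials.
units_sold = 1_000_000   # cards per year
asp = 500                # average selling price, USD
margin = 0.03            # low single-digit gross margin

revenue = units_sold * asp         # $500M in revenue
gross_profit = revenue * margin    # only $15M left over

print(f"Revenue:      ${revenue / 1e6:.0f}M")
print(f"Gross profit: ${gross_profit / 1e6:.0f}M")
# That $15M has to cover marketing, R&D, support, and logistics
# across every market the company operates in.
```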
I wish I could give the conclusion of this post a thousand upvotes. APUs, by their very nature, will never be a replacement even for the midrange discrete GPUs like the RX6600. APUs made by merchant chip makers, i.e. everyone besides Apple, are limited by many economic factors from achieving their technological limits:
  1. Limited memory bandwidth
  2. Sharing of limited memory bandwidth with the CPU
  3. Limited die size (small dies for CPUs, and only a fraction of that space is devoted to the GPU)
  4. Limited power (35 to 65 W) compared to a discrete GPU (north of 100 W)
All of these are limits imposed by the necessity of these APUs being available in affordable laptops. When these limits don't apply, we can see great APUs like the various console chips that AMD has made. However, this will never be the case for the APUs that we can buy for desktops or buy in the form of a whole laptop.
Yeah, APUs have promise in theory (and current ones are pretty good for their power level) but the board and platform requirements to realize that promise just don't really make sense for consumer applications. Special motherboards with GDDR on board would be exorbitantly expensive and difficult to sell; on-chip memory (like HBM) is expensive and very difficult to fit in a small package, etc. DDR5 has some promise towards alleviating this, but as you say you'd still struggle to come close to an RX 6600. I for one really hope AMD pushes their APUs higher in terms of GPU power - 100-150W with a 20CU GPU would be amazing, even with just dual channel DDR5 - but I don't see them taking over PC gaming any time soon, that's for sure.
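For scale, a rough FP32 throughput estimate for that hypothetical 20 CU APU (the ~2.4 GHz clock is an assumption) against the RX 6600's published specs:

```python
# Ballpark FP32 throughput for RDNA2-style CUs:
# CUs * 64 shaders * 2 FLOPs/clock (FMA) * clock speed.
# The 2.4 GHz clock for a hypothetical big APU is an assumption;
# the RX 6600 figures are its spec-sheet values (28 CUs, ~2.49 GHz boost).

def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000

apu_20cu = tflops(20, 2.4)    # ~6.1 TFLOPS on paper
rx6600 = tflops(28, 2.49)     # ~8.9 TFLOPS

print(f"20 CU APU: {apu_20cu:.1f} TFLOPS")
print(f"RX 6600:   {rx6600:.1f} TFLOPS")
```

On paper the gap isn't huge - the bandwidth starvation discussed above is the bigger problem.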
 
It's not quite that simple - they likely get significantly better deals, pricing and marketing partnerships through being an exclusive partner.
IDK if you recall the drama over Nvidia's main stipulation of its partnership program - it resulted in partners that sold other brands being treated worse than those that sold only Nvidia cards.
 
IDK if you recall the drama over Nvidia's main stipulation of its partnership program - it resulted in partners that sold other brands being treated worse than those that sold only Nvidia cards.
Yep, the GPP was essentially an attempt at taking this type of scheme and pushing it into an even more explicitly anticompetitive form, with really draconian requirements like Geforce and Radeon cards not sharing branding (anyone remember the never-really-launched AREZ branding that was supposed to replace ROG on AMD products?) and an intensification of the inequality of funds provided. I have absolutely zero doubt that their regular marketing support programmes still differentiate between exclusive and non-exclusive partners - that is somewhat reasonable, after all - it just depends on the degree, as well as other factors such as the size and status of the AIB partner in question.
 
It's not about theory, it's about what's profitable. Point blank, AIB GPUs do not make sense. They are going to die. APUs and high-profit cloud-based GPUs, where you pay x per month for 30 fps 720p, more for 60 fps 1080p, more for 240 fps 1080p, more for 4K 120 Hz, and also get charged by detail settings, do make economic sense. And that's the future of PC gaming. You'll lease a cloud item and pay, say, $100 a month for a good one; you won't own the hardware.

Every industry player has been admitting this is where it's going, with many outright stating this is what they are going to do. The notion of you owning the hardware is as idiotic as thinking software isn't subscription-based. You're going to rent an RTX; you don't get a vote in this.
 
It's not about theory, it's about what's profitable. Point blank, AIB GPUs do not make sense. They are going to die. APUs and high-profit cloud-based GPUs, where you pay x per month for 30 fps 720p, more for 60 fps 1080p, more for 240 fps 1080p, more for 4K 120 Hz, and also get charged by detail settings, do make economic sense. And that's the future of PC gaming. You'll lease a cloud item and pay, say, $100 a month for a good one; you won't own the hardware.

Every industry player has been admitting this is where it's going, with many outright stating this is what they are going to do. The notion of you owning the hardware is as idiotic as thinking software isn't subscription-based. You're going to rent an RTX; you don't get a vote in this.
If that turns out to be the case, Nvidia will be the loser, because big cloud companies won't be beholden to their terms like ASUS or EVGA. They also won't pay Nvidia as much for the chips as the AIB partners have to. If Nvidia plays hardball, they could always throw money at AMD or Intel, and kill Nvidia's consumer gaming division forever.
 
It's not about theory, it's about what's profitable. Point blank, AIB GPUs do not make sense. They are going to die. APUs and high-profit cloud-based GPUs, where you pay x per month for 30 fps 720p, more for 60 fps 1080p, more for 240 fps 1080p, more for 4K 120 Hz, and also get charged by detail settings, do make economic sense. And that's the future of PC gaming. You'll lease a cloud item and pay, say, $100 a month for a good one; you won't own the hardware.

Every industry player has been admitting this is where it's going, with many outright stating this is what they are going to do. The notion of you owning the hardware is as idiotic as thinking software isn't subscription-based. You're going to rent an RTX; you don't get a vote in this.
What "makes economic sense" doesn't matter whatsoever unless you can convince people to actually use these things. And so far, cloud gaming isn't convincing anyone. High performance APUs are great in consoles, but are fundamentally unsuited for desktop PC use due to significant feature mismatch.

And, crucially, you'll struggle a lot to get people to pay anywhere near $100/month even for a "2160p120" cloud gaming plan - 'cause the experience will be noticeably worse and more complicated than dedicated local hardware, access to games will be more limited (and thus more complicated), and you'll be reliant on an unreliable internet connection to deliver what will always be a noticeably worse quality image. Any game with a dark look is nigh on unplayable from the cloud due to how video compression handles blacks and dark tones, and you can forget about meaningful HDR, let alone clarity and sharpness anywhere near a local game. What you're drawing up might be the fever dream of a bunch of silicon valley types, but it's unlikely to pan out in real life.
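Some back-of-envelope math behind that image-quality point. The 75 Mbps stream bitrate is an assumption (roughly what premium cloud tiers advertise); the raw figures follow directly from the video format:

```python
# How aggressively a "2160p120" cloud stream must be compressed.
# Stream bitrate of 75 Mbps is an assumed figure for a premium tier;
# the raw bitrate is just pixels * frames * bits per pixel.

width, height, fps, bits_per_pixel = 3840, 2160, 120, 24

raw_gbps = width * height * fps * bits_per_pixel / 1e9   # uncompressed video
stream_mbps = 75                                         # assumed service bitrate
ratio = raw_gbps * 1000 / stream_mbps

print(f"Raw 2160p120: {raw_gbps:.1f} Gbit/s")
print(f"Compression:  ~{ratio:.0f}:1")
# Compression ratios in the hundreds-to-one are where dark-scene
# banding and smeared gradients come from.
```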
 
LOLOLOLOL! Thanks for the laugh. You made my day! Got anything more funny?
Not funny! Read TPU reviews of the last 3 generations and you will be convinced.
 
Why all this talk about Intel ARC? It's basically dead. Buried six feet under. It's such an absolute failure.
Says who? People who want Intel to fail.
 
Seems to be something about Huang's personality at play. And it is hurting NV in major ways.
Nvidia stands for:
  • Price gouging customers and partners
  • Proprietary bullshit in hardware features, software, API, and of course physical compatibility
  • Bribery of devs with tools that are intentionally crippled on competitors' hardware.
  • Closed-source everything when they rely on open-source APIs, OSes, and frameworks.
  • Manipulation/unfair treatment of independent media who publicise flaws/truth
  • Antitrust lawsuits with numerous large-scale global entities
  • Running afoul of legislation in around half the regions they operate in
  • Attempting to monopolise segments that are critical to healthy market operation (ARM was unsuccessful but other buyouts have succeeded and more will surely come)
There's a lot more than just this list, but I think every single point on this list is individually a pretty damning black mark against Nvidia.
 
Nvidia stands for:
  • Price gouging customers and partners
  • Proprietary bullshit in hardware features, software, API, and of course physical compatibility
  • Bribery of devs with tools that are intentionally crippled on competitors' hardware.
  • Closed-source everything when they rely on open-source APIs, OSes, and frameworks.
  • Manipulation/unfair treatment of independent media who publicise flaws/truth
  • Antitrust lawsuits with numerous large-scale global entities
  • Running afoul of legislation in around half the regions they operate in
  • Attempting to monopolise segments that are critical to healthy market operation (ARM was unsuccessful but other buyouts have succeeded and more will surely come)
There's a lot more than just this list, but I think every single point on this list is individually a pretty damning black mark against Nvidia.
This is why I have been buying AMD ever since a personal experience with the GTS 450.
 
The PSUs are okay and sell, but the boards are sitting on the shelves like dead ducks. I really doubt they can survive on PSUs and will close down if they don't partner with AMD or Intel.
Given that EVGA cites the US and UK as its two primary regions, EVGA boards are super-rare over here, and they're terrible value compared to the competition. Sure, if you're after high-end, perhaps that's a different market, but isn't the ultra-flagship partly about reputation and kudos too? If true, then surely Asus/MSI have that segment tied up with ROG Maximus and MEG Godlike, and other pretenders are just that - pretenders.
 
What "makes economic sense" doesn't matter whatsoever unless you can convince people to actually use these things. And so far, cloud gaming isn't convincing anyone. High performance APUs are great in consoles, but are fundamentally unsuited for desktop PC use due to significant feature mismatch.

And, crucially, you'll struggle a lot to get people to pay anywhere near $100/month even for a "2160p120" cloud gaming plan - 'cause the experience will be noticeably worse and more complicated than dedicated local hardware, access to games will be more limited (and thus more complicated), and you'll be reliant on an unreliable internet connection to deliver what will always be a noticeably worse quality image. Any game with a dark look is nigh on unplayable from the cloud due to how video compression handles blacks and dark tones, and you can forget about meaningful HDR, let alone clarity and sharpness anywhere near a local game. What you're drawing up might be the fever dream of a bunch of silicon valley types, but it's unlikely to pan out in real life.

You don't have a choice. They will force this on you, and if you want to game, you will eat it and like it. And that's the only thing you will be able to do. That's the free market: the seller sets the terms, and if you don't like it, then fuck you, you don't get to touch anything.
 
I have a feeling this is going to backfire on Nvidia. Not so much the EVGA situation, but elevating the prices like this.

I've been on Nvidia for most of a decade now, but was using Radeons and even Athlon X4s in the 2000s (after 3dfx started to fail).

The price delta between equivalent upper midrange parts between Nvidia and AMD is growing. I've always said, it's worth an extra $50-$75 to avoid software issues, keep Nvidia's longer support for its cards, and so on. I still believe that to be true. However, the delta is getting too big.

Objectively, the 6700XT performs a little better than a 3060 Ti, and consumes a little less power. Its price is just $20 more than a vanilla 3060.

Although I had planned to wait until after new years for a GPU, I may go for a new one along with a platform upgrade in the next 4-6 weeks. I'd like to see what the A750/A770 can do first though.

For the first time in about 10 years, I'm not really even thinking about Nvidia's cards. My thinking now is all about 6700XT vs whatever Intel winds up releasing.
 
You don't have a choice. They will force this on you, and if you want to game, you will eat it and like it. And that's the only thing you will be able to do. That's the free market: the seller sets the terms, and if you don't like it, then fuck you, you don't get to touch anything.
Or man B builds an alternative and everyone buys that. Capitalism, heard of it?
There've been cries of that shit for years, yet here we are. Stadia did well.

I won't be surprised if other AIBs follow EVGA out that door.
 
Nvidia stands for:
  • Price gouging customers and partners
  • Proprietary bullshit in hardware features, software, API, and of course physical compatibility
  • Bribery of devs with tools that are intentionally crippled on competitors' hardware.
  • Closed-source everything when they rely on open-source APIs, OSes, and frameworks.
  • Manipulation/unfair treatment of independent media who publicise flaws/truth
  • Antitrust lawsuits with numerous large-scale global entities
  • Running afoul of legislation in around half the regions they operate in
  • Attempting to monopolise segments that are critical to healthy market operation (ARM was unsuccessful but other buyouts have succeeded and more will surely come)
There's a lot more than just this list, but I think every single point on this list is individually a pretty damning black mark against Nvidia.

Ooouch!! So it's more than just god-like exalted pricing for profit.

Some of the comments on this thread have been an eye-opener for me. I was looking forward to RTX 4000 ... I think you guys have dampened my eagerness. But as always, if the price is right, I'm on it like a rabbit!

Just curious, any idea when AMD will drop next-gen graphics cards?
 
You don't have a choice. They will force this on you, and if you want to game, you will eat it and like it. And that's the only thing you will be able to do. That's the free market: the seller sets the terms, and if you don't like it, then fuck you, you don't get to touch anything.
For that to happen you need a single actor or a consolidated group of actors willing to force this to happen. But the gaming market is wide open, with many actors involved - console makers in fierce competition, hardware makers across PC, console and mobile, software vendors across all kinds of platforms, +++. There is nobody with the kind of clout necessary to implement the kind of shift you're describing here. Not even close. Google have tried, and failed miserably. Microsoft is working on it, but showing zero signs of pushing for a cloud-only future - they just want you to buy Game Pass, and will happily sell you a console if that makes you do so. On the PC, nobody has that kind of power, as hardware and software sales have nothing to do with each other.

Ooouch!! So it's more than just god-like exalted pricing for profit.

Some of the comments on this thread have been an eye-opener for me. I was looking forward to RTX 4000 ... I think you guys have dampened my eagerness. But as always, if the price is right, I'm on it like a rabbit!

Just curious, any idea when AMD will drop next-gen graphics cards?
Before the end of the year, November is rumored but there's no confirmation AFAIK. (IMO Nov is late unless it's a true hard launch, given that they'll miss the holiday season by launching then - but if that's when the cards are ready I guess that's when they're ready.)
 
I wish Radeon was still a company able to be #1 against Nvidia's greed and gluttony. We need a good card now, after 10 years of no real competition, at a price we can afford.
 
Just curious, any idea when AMD will drop next-gen graphics cards?

AMD has repeatedly stated before the end of 2022. There's a window of opportunity for holiday sales so most likely sometime in October or the first half of November.

This would not be unusual behavior specific to AMD, pretty much all consumer electronics companies keep the holidays in mind. Remember that both PS5 and Xbox Series X|S launched for holiday 2020. Apple likes to have their holiday product lineup finalized by early November at latest.

AMD Radeon RX 6800 and 6800 XT debuted in mid-November 2020 with the 6900 XT following in very early December.

Without a doubt, AMD will stagger their release over months, starting with their high end models (7900, 7800). I assume Dr. Su's keynote at January's CES tradeshow will include a launch of some mid-range Radeon 70 series models.
 
Before the end of the year, November is rumored but there's no confirmation AFAIK

AMD has repeatedly stated before the end of 2022

I need to keep up with stuff... this is news to me. From what I gathered, 2022 would see AM5, RPL, and possibly RTX 4000. If we're getting next-gen AMD cards too... woohooo! hehe

Gonna wait this one out until all the reviews/feedback is widely available (esp. the TPU family's input) before pulling the trigger.
 
You don't have a choice. They will force this on you, and if you want to game you will eat it and like it. And that's the only thing you will be able to do. That's the free market, the seller sets the term, you don't like it then fuck you you don't get to touch anything.
Hehehe
Your example of a "free market" is all kinda bass-ackward.
I'll stick to reality, thanks
;)
 
Those Founders cards look pretty wimpy. They look like a boutique product that won't last the test of time, though they are probably fine, I am sure. Except the disassembly. Then you look at something from EVGA and you think, damn, that is a serious piece of tech. Then you feel the weight, its rigidity, and you think, damn, that is nice. Mmm. Someone posted that 3080 Ti shot; that is a purty card, I would touch it inappropriately.

Edit:

Got a little carried away :D
I take it you've not handled a 3000 series FE card then?

The things are heavy, the shroud is metal. Anything but wimpy. :)

With the Gigabyte 1080 Ti I owned before my 3080 FE (granted, not an EVGA), I felt I had to grip it so loosely when removing it from the case that I worried I'd snap the paper-like shroud.
 