I'm not the one claiming a Graphics API can help cool your overheating computer.
I'm not even going to respond to BS. That guy's problem with physical silicon limitations is fairly unlikely to be a problem for PC enthusiasts: throw the default cooler off, then 1. put a better one on, or 2. put watercooling on. Problem solved, no more discussion needed. If it's a problem for "teh mainstream", they will have to adopt the PC enthusiast crowd's ways, at least temporarily, until the GPU vendor replaces the fans with better default ones. But no handholding, no spoonfeeding, no babysitting; that's what Mantle is for: people who are sick of being limited by the petty mainstreamers trying to hold us back.
Am I coming down too hard? Maybe, but I'm telling you the truth, and someone has to do it. I'm definitely not trying to attack anyone with this, just pointing out these compartmentalized groups, how they behave, and all the background. But that's not important for Mantle; they won't change anything, they can just assume the moral high ground until the numbers come out. And don't come back at me, because I'm saying it again: it will definitely not be a 100% jump in performance on day one, but in time that is not such an unrealistic number. With time, though, the number will get murky, because the gains will be gradual, and we all know what that means: it's not as noticeable, and the compartmentalized groups certainly aren't going to analyze the numbers to calculate the total boost in its entirety unless we have a controlled benchmark, and those benchmarks don't represent all the other engines either.
Remember, I'm on the defensive side; I don't come out attacking the mainstream. Nobody's taking your DX and OGL away from you, AMD has made that perfectly clear. I'm taking this fight dead serious. I've waited for this for four years, since I first slowly became aware of all the problems of PC APIs. Look, if it fails, it fails, no ifs or buts. I might be taking this a bit personally, but in all these explanations I am doing my best to stay objective and scientific. I'm not perfect, so my stuff sounds a bit emotional, but all I'm doing is laying out the differences, and if I'm wrong I admit it and correct it. Besides, what's the worst that can happen? Some bozo developer running on Mountain Dew and Doritos comes out of college, tries to build a Mantle game, fails, blames AMD, and everyone in the industry takes him seriously and jumps ship?
And some people keep thinking this whole Mantle thing is some kind of massive AMD PR stunt. The announcements being made just before the GPUs ship might be strategic, but I don't see any BS spin on it, considering four other developers are involved. It's not like they were showing puppies.
Sooooo DICE makes shit games, but since they helped AMD make Mantle they deserve medals, so they can make efficient shit games?
I also wouldn't even put OpenGL and DirectX in the same league. OpenGL is used by a very small portion of game developers, whereas DirectX is essentially the de facto graphics API. Even then, almost nobody uses DirectX 11 properly. It allows for many performance gains, assuming you batch draw calls, which almost nobody does because they'd rather hack a DX11 version together just to say they have one.
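The batching point can be illustrated with a toy cost model. All the numbers below are made-up assumptions for illustration, not measurements of any real driver; the idea is just that each draw call carries a fixed CPU-side overhead, so submitting the same geometry in fewer, bigger calls is dramatically cheaper:

```python
# Toy model of CPU-side draw-call cost (all constants are hypothetical,
# chosen only to show why batching matters).
PER_CALL_OVERHEAD_US = 50.0   # assumed fixed CPU cost per draw call, in microseconds
PER_TRIANGLE_COST_US = 0.001  # assumed CPU cost per triangle submitted

def frame_cost_us(num_draw_calls, triangles_per_call):
    """CPU time for one frame: fixed overhead per call plus per-triangle work."""
    return num_draw_calls * (PER_CALL_OVERHEAD_US
                             + triangles_per_call * PER_TRIANGLE_COST_US)

# Same 10 million triangles per frame, submitted two different ways:
naive   = frame_cost_us(10_000, 1_000)       # 10,000 calls of 1,000 triangles each
batched = frame_cost_us(10, 1_000_000)       # 10 calls of 1,000,000 triangles each

print(f"naive:   {naive / 1000:.1f} ms")     # -> naive:   510.0 ms (overhead-dominated)
print(f"batched: {batched / 1000:.1f} ms")   # -> batched: 10.5 ms (triangle-work-dominated)
```

With these made-up constants the unbatched frame spends almost all its CPU time on call overhead, which is exactly the kind of cost a thinner API like Mantle (or proper DX11 batching) is supposed to cut down.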
With Mantle, it takes a bit longer to build the rendering pipeline, but it's a fixed cost: once it's built there's far less codebase to constantly maintain, they can keep improving it, and they have more time for other parts of the game. It shifts the work away from all the effort that currently goes into making sure the game runs on PC, and all the constant back-and-forth with GPU vendors.
Curious that you mention driver hacks specifically and also claim RAGE was a well-developed game and all the problems were OpenGL's fault, not id Software's or AMD's, yet only AMD cards had issues with the game; it ran fine on Nvidia from what I recall.
The Rage codebase was cleaner and more stable than almost any other game's; the drivers were the only reason the game didn't work.
1. Both drivers are hacks. "Performing" and "working" aren't the same thing, and that's the big point I want to make: if it's working, that doesn't mean the optimizations are proper; there's still a ton of overhead. Even if Nvidia's GPU driver hacks are better than AMD's, they're still hacks, capisce?
2. AMD mistakenly released the wrong beta driver; the package contained a DLL that was three weeks older than it should have been.
3. Proper AMD OpenGL drivers weren't as good as Nvidia's; it wasn't just app-specific issues, AMD's driver had core support problems.
4. Rage was more complex than any other OpenGL application at the time.
Sorry for the double post (I bet you'd have replied by now).
---------------------------------------
This should be the smoking gun of all quotes from the Q&A.
When asked "what's the benefit to consumers?", the AMD driver guy responded, then said he'd hand it over to the devs, who could explain it better:
"Increased performance means two things, right? First, it means you can run faster, naturally, on decent hardware. The other way to look at it is that if I don't need to run faster, can I run on lower-end hardware? If I can run on lower-end hardware, how does it expand my user base? Suddenly everybody with a small-form-factor, not-so-powerful notebook can run all these games, right? Extremely expanded user base. So that's one way to look at it, right? The other way to look at it is performance is nice, but to me this is only a stepping stone, because if you're looking at 20% improvement, 2x, 3x, this is purely a performance advantage. When you think about 10x or more, the question you should start asking yourself is: what is the new concept I can put on top of this, what new types of games ..."
Exact time:
http://youtu.be/sSY2KXBoro0?t=26m45s
Also good to point out:
I'm not quoting exactly, but the AMD guy also said that in his 30 years working on GPUs at ATI and AMD, no other API was developed with direct game-developer contact; it was always done in isolation.