# Ubisoft and NVIDIA Team Up On Assassin's Creed Unity, Far Cry 4 And More



## Cristian_25H (Jun 5, 2014)

Ubisoft and NVIDIA today announced the next chapter in their strategic partnership, bringing amazing PC gaming experiences to life in Ubisoft's highly anticipated upcoming titles, including Assassin's Creed Unity, Far Cry 4, The Crew and Tom Clancy's The Division.

NVIDIA's GameWorks Team is working closely with Ubisoft's development studios to incorporate cutting edge graphics technology and gaming innovations to create game worlds that deliver unprecedented realism and immersion. NVIDIA's GameWorks technology includes TXAA antialiasing, which provides Hollywood-levels of smooth animation, soft shadows, HBAO+ (horizon-based ambient occlusion), advanced DX11 tessellation, and NVIDIA PhysX technology.



 



"Working with NVIDIA has enabled us to bring an enhanced gameplay experience to our PC players," said Tony Key, senior vice president of sales and marketing, Ubisoft. "We look forward to continuing our partnership with NVIDIA on our biggest upcoming titles."

This announcement builds on the successful collaboration between Ubisoft and NVIDIA that added visually stunning effects to Tom Clancy's Splinter Cell Blacklist, Assassin's Creed IV Black Flag and Watch Dogs.

"We're excited to continue our long-term partnership with Ubisoft in bringing our latest PC technology to their games," said Tony Tamasi, senior vice president of Content & Technology at NVIDIA. "Through GameWorks, we have been able to add unique visual and gameplay innovations to deliver amazing experiences for these stellar Ubisoft games. I can't wait to play them myself."

*View at TechPowerUp Main Site*


----------



## Ronnyv1 (Jun 5, 2014)

Bringing the amazing watch dogs experience to pc kappa


----------



## EzioAs (Jun 5, 2014)

Ronnyv1 said:


> Bringing the amazing watch dogs experience to pc kappa


 
I really want to know what problem people have with Watch Dogs on the PC. I finished the game with a 3570K and a GTX 660, setting the graphics to a reasonable level (high textures, SMAA, other settings mostly high), and it seemed fine.

The only problem I have with the game is that if I alt+tab out and back in, there's a huge stutter and freeze, but it's fixed if I restart the game.


----------



## natr0n (Jun 5, 2014)

Oh great, purposely poor-performing games ahead.

The way it's meant to be paid.


----------



## Prima.Vera (Jun 5, 2014)

Still promoting the junk called TXAA, I see. How they can still push this garbage forward is beyond my understanding...


----------



## btarunr (Jun 5, 2014)

EzioAs said:


> I really want to know what problem people have with Watch Dogs on the PC. I finished the game with a 3570K and a GTX 660, setting the graphics to a reasonable level (high textures, SMAA, other settings mostly high), and it seemed fine.
> 
> The only problem I have with the game is that if I alt+tab out and back in, there's a huge stutter and freeze, but it's fixed if I restart the game.



The problem is lack of good performance with AA cranked up on Radeon. Radeon users have no TXAA, so we're left with 4x MSAA to have any hope of clean graphics, and 4x MSAA roasts the GPU, leading to throttling and rubber-banding.

The end result still doesn't end up looking like it needed so much GPU power to process.

This is what makes GameWorks suck, and incomparable to AMD Gaming Evolved. Games bearing Gaming Evolved play just as well on AMD and NVIDIA, with the same effects and features, while GameWorks gives games features that are exclusive to GeForce.

Don't bring in the Mantle argument. Mantle doesn't give a game any new eye-candy. It only makes low-end CPUs play the game better.


----------



## Prima.Vera (Jun 5, 2014)

btarunr said:


> The problem is lack of good performance with AA cranked up on Radeon. Radeon users have no TXAA, so we're left with 4x MSAA to have any hope of clean graphics, and 4x MSAA roasts the GPU, leading to throttling and rubber-banding.
> 
> The end result still doesn't end up looking like it needed so much GPU power to process.


Relax. There's no need for the blurry resource hog that is TXAA. I am playing with temporal SMAA, which is way, way better in both quality and speed than TXAA, MSAA or other resource-hungry techniques.


----------



## FrustratedGarrett (Jun 5, 2014)

I didn't see any special visual effects playing Watch Dogs on my GTX 670. The graphics look very cartoonish and flamboyantly bright. The physics effects and movement mechanics are pretty much identical to those in Assassin's Creed. I didn't finish the game; rather, I couldn't get myself to keep playing after a couple of days, or about 3.5 hours of gameplay.

I'm also opposed to Nvidia's GameWorks "middleware". Considering the game doesn't look better than BF4 and runs much worse, I don't see the point in using a bunch of specially compiled .dll files from Nvidia, with no source code for either Ubisoft or AMD/Intel to make sense of.


----------



## Razorfang (Jun 5, 2014)

btarunr said:


> The problem is lack of good performance with AA cranked up on Radeon. Radeon users have no TXAA, so we're left with 4x MSAA to have any hope of clean graphics, and 4x MSAA roasts the GPU, leading to throttling and rubber-banding.
> 
> The end result still doesn't end up looking like it needed so much GPU power to process.
> 
> ...



Yet developers are choosing to use GameWorks regardless of everything you said.


----------



## Lionheart (Jun 5, 2014)

Razorfang said:


> Yet developers are choosing to use GameWorks regardless of everything you said.



$$$$$


----------



## EzioAs (Jun 5, 2014)

Prima.Vera said:


> Still promoting the junk called TXAA, I see. How they can still push this garbage forward is beyond my understanding...



Imo, it's the best ratio of AA quality to performance hit there is that pretty much removes visible aliasing. Blurry? Yes, and it's quickly noticeable, but I think just leaving the option there for those who prefer no aliasing over a sharper image is a pretty good idea. At least we can choose whether we want to use it or not while Nvidia (or others) works on something new (hopefully).

I really hope SMAA catches on, though. It baffles me that some games still only have FXAA, since SMAA is plainly better while having around the same performance hit.


----------



## claylomax (Jun 5, 2014)

btarunr said:


> The problem is lack of good performance with AA cranked up on Radeon. Radeon users have no TXAA, so we're left with 4x MSAA to have any hope of clean graphics, and 4x MSAA roasts the GPU, leading to throttling and rubber-banding.
> 
> The end result still doesn't end up looking like it needed so much GPU power to process.
> 
> ...



Well said.


----------



## Krekeris (Jun 5, 2014)

*NOOOOO!!!* Sad thing: Nvidia GameWorks means all Ubi games will be an unoptimized, laggy mess even for Nvidia users. GG.


----------



## MustSeeMelons (Jun 5, 2014)

For me, as an AMD GPU user, this means I will download the game to check how it runs before paying anything.


----------



## Recus (Jun 5, 2014)

natr0n said:


> Oh great, purposely poor-performing games ahead.
> 
> The way it's meant to be paid.



I thought you would be happy. (GR: Future Soldier, Tomb Raider, Dirt Showdown, Far Cry 3, Hitman Absolution...)



Prima.Vera said:


> Still promoting the junk called TXAA, I see. How they can still push this garbage forward is beyond my understanding...



Why are scrubs so desperately trying to prove that FXAA/TXAA blurs, especially when they don't even use it?



btarunr said:


> This is what makes GameWorks suck, and incomparable to AMD Gaming Evolved. Games bearing Gaming Evolved play just as well on AMD and NVIDIA, with the same effects and features, while GameWorks gives games features that are exclusive to GeForce.



This reminds me of Dirt Showdown and global illumination.





So what's better: not having some features (GameWorks), or having them but not being able to turn them on because gameplay would be impossible?



FrustratedGarrett said:


> I didn't see any special visual effects playing Watch Dogs on my GTX670. The graphics look very cartoonish and flamboyantly bright. The physics effects and movement mechanics are pretty much identical to those in Assassin's Creed.  I didn't finish the game, or rather I couldn't get myself to play the game after a couple of days or 3.5 hours of game play.
> 
> I'm also opposed to Nvidia's Gameworks "middleware". Considering the game doesn't look better than BF4 and runs much worse than BF4, I don't see what the point was in using a bunch of specially compiled .dll files by Nvidia without any source code around to make sense of by both Ubisoft and AMD/Intel.



Because BF4 is based on a renamed Frostbite 2 engine from 2011. And it runs on 2014 hardware.



> specially compiled .dll files from Nvidia, with no source code for either Ubisoft or AMD/Intel to make sense of.



Where did I hear this? Oh yeah, Mantle. No sense for Nvidia, Intel, Mali, PowerVR.


----------



## Xzibit (Jun 5, 2014)

Ignorant as always



Recus said:


> Why are scrubs so desperately trying to prove that FXAA/TXAA blurs, especially when they don't even use it?



Ever occur to you that people might own both?



Recus said:


> Where did I hear this? Oh yeah, Mantle. No sense for Nvidia, Intel, Mali, PowerVR.



Mantle doesn't affect competitors' DX performance. The compiled .DLL issue with GameWorks was raised by developers through social media as bad for the industry, before Nvidia, under pressure, partially opened up certain features.


----------



## mroofie (Jun 5, 2014)

Xzibit said:


> Ignorant as always.
> 
> Mantle doesn't affect competitors' DX performance. The compiled .DLL issue with GameWorks was raised by developers through social media as bad for the industry, before Nvidia, under pressure, partially opened up certain features.


And yet what Recus mentioned above is true. Stop being a fanboy; all of the evidence shows that if AMD had more developers in their pocket, the same would happen to us. AMD is far from being classified as a "good" company; just remember that!!





----------



## midnightoil (Jun 5, 2014)

Ubisoft do seem amazingly determined to make people pirate their games.

Personally, I'll avoid anything with this GameWorks crap entirely, despite having a mix of AMD & NVIDIA.


----------



## sunaiac (Jun 5, 2014)

Awesome, the rape will go on 
nVidia, destroying PC video gaming the way it's meant to be destroyed.


----------



## trustedsource (Jun 5, 2014)

btarunr said:


> The end result still doesn't end up looking like it needed so much GPU power to process.



This is not a GameWorks-related problem. Watch Dogs only uses that library for two effects; even if you turn them off, the game still runs poorly.
The problem is the engine. Ubisoft spent a lot of money on D3D11 deferred contexts, but even Microsoft admits that deferred contexts were a failure, so spending money on that feature is the worst way to gain speed in a D3D renderer. It is hard to implement correctly and stably, and the technique simply doesn't work in a complex game.
Most publishers don't bother with it and instead research other techniques to gain speed: for example, well-known effects redesigned with compute shaders in mind, or the D3D11_MAP_WRITE_NO_OVERWRITE map flag (extended to constant buffers in D3D11.1), which is a huge help in CPU-limited scenarios.
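The NO_OVERWRITE idea is easy to sketch outside of D3D. The toy allocator below is hypothetical (real code calls `ID3D11DeviceContext::Map` on a dynamic buffer and checks the returned `HRESULT`); it just models why `D3D11_MAP_WRITE_NO_OVERWRITE` helps in CPU-limited scenarios: the CPU keeps appending into the same allocation without a sync point, and only when the buffer wraps does it pay for a `D3D11_MAP_WRITE_DISCARD`, where the driver hands out a fresh allocation ("renaming").

```cpp
#include <cstddef>

// Toy model of a dynamic D3D11 buffer (illustrative only; the real API is
// ID3D11DeviceContext::Map with D3D11_MAP_WRITE_DISCARD / _WRITE_NO_OVERWRITE).
// NO_OVERWRITE promises the driver we won't touch bytes the GPU may still be
// reading, so we can keep appending into the same allocation without a stall.
// DISCARD forces the driver to hand out a fresh allocation ("renaming").
struct DynamicBuffer {
    std::size_t capacity;
    std::size_t cursor = 0;  // first free byte
    int renames = 0;         // how many fresh allocations the driver made

    // Returns the byte offset the CPU may write `bytes` bytes at.
    std::size_t map(std::size_t bytes) {
        if (cursor + bytes <= capacity) {
            std::size_t off = cursor;  // NO_OVERWRITE path: append, no stall
            cursor += bytes;
            return off;
        }
        ++renames;                     // DISCARD path: wrap to a fresh buffer
        cursor = bytes;
        return 0;
    }
};
```

Per-frame vertex data for many draw calls can share one buffer this way, with a rename only on wrap; that is the CPU-side saving being alluded to here.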

Ubisoft just doesn't want to spend too much money on the PC. They are aware now that even if an idea is good, the standard APIs don't work correctly in complex scenarios, so the actual implementation can suck. In that case they just license the effects from NVIDIA, because it's cheap... and if a studio won't implement Mantle, the only thing that matters is to bring out the PC port as cheaply as possible.



btarunr said:


> Don't bring in the Mantle argument. Mantle doesn't give a game any new eye-candy. It only makes low-end CPUs play the game better.


There will be Mantle titles where some effects aren't possible in D3D. In those cases the effects will be Mantle exclusives. Mostly high-tech console effects.


----------



## semantics (Jun 5, 2014)

I always find it laughable; to me it just seems like butthurt people wanting everything any time _Mantle_ or _GameWorks_ comes up.

Can't do _xxx_ because I don't own a card from company _yyy_. A complaint that can easily be summed up as: it feels wrong, and it feels wrong because a person can't have both without doing something unreasonable like having two gaming computers.

Trying to blame one while absolving the other is just mud-throwing; it's unproductive and just encourages more partisan tactics.

The simple truth is that AMD pays game companies, either literally or with resources such as engineers, to get Mantle into games. It's the same deal with Nvidia and GameWorks: they give away their time and resources to get their own product showcased.

Mantle wouldn't be in anything larger than an indie title without that support from AMD, and GameWorks wouldn't be a thing if Nvidia charged companies to use it.

It's the same idea, just different vectors.


----------



## raptori (Jun 5, 2014)

I hope Nvidia cooperates more efficiently with the gaming industry and brings better SLI utilization, not like the garbage Watch Dogs. OMG, "The Division"... RIP.


----------



## trustedsource (Jun 5, 2014)

raptori said:


> I hope Nvidia cooperates more efficiently with the gaming industry and brings better SLI utilization, not like the garbage Watch Dogs. OMG, "The Division"... RIP.


NVIDIA can do nothing if an engine is too complex for efficient AFR. D3D11 and OpenGL don't provide good QoS for multi-GPU, and this won't change.


----------



## Xzibit (Jun 5, 2014)

trustedsource said:


> NVIDIA can do nothing if an engine is too complex for efficient AFR. D3D11 and OpenGL don't provide good QoS for multi-GPU, and this won't change.



Bad programming. The engine is a hybrid of Anvil and Dunia, and Ubisoft hasn't figured out how to optimize their own engines time and time again. There is little hope for PC users wanting more than a higher-texture/audio console port.

*Ubisoft: Watch Dogs’ Engine Was Originally Built for Driver*


> It's not like Watch Dogs started as Watch Dogs. The Watch Dogs project was initially another game. At some point it changed. That's at least three years ago, and then the Watch Dogs project reused some of the work that had been done on this driving engine.


----------



## trustedsource (Jun 5, 2014)

Xzibit said:


> Bad programming. The engine is a hybrid of Anvil and Dunia, and Ubisoft hasn't figured out how to optimize their own engines time and time again. There is little hope for PC users wanting more than a higher-texture/audio console port.


But the same programmers write extremely good code for consoles, so they are extremely skilled. Even the most talented programmers can't write good code against the standard APIs. There is a reason why Mantle is so popular: if you find a problem in the Mantle renderer, you just profile it and fix it. You can profile a D3D renderer too, but in some cases it is almost impossible to fix performance problems. More complex engines mean more unfixable scenarios.


----------



## spectatorx (Jun 5, 2014)

"NVIDIA's GameWorks technology includes TXAA antialiasing, which provides Hollywood-levels of smooth animation" - so do they mean all their games will be running at 23 fps? I think fps in movies is not the same as fps in games, and it leads me to the conclusion: they do not think.


----------



## Durvelle27 (Jun 5, 2014)

EzioAs said:


> I really want to know what problem people have with Watch Dogs on the PC. I finished the game with a 3570K and a GTX 660, setting the graphics to a reasonable level (high textures, SMAA, other settings mostly high), and it seemed fine.
> 
> The only problem I have with the game is that if I alt+tab out and back in, there's a huge stutter and freeze, but it's fixed if I restart the game.


The problem is:

Stuttering, bad FPS, VRAM hogging, etc.


----------



## Yellow&Nerdy? (Jun 5, 2014)

I think Ubisoft is quickly catching up to EA when it comes to being a scumbag company. I really dislike GameWorks, and I disagree with it being the same as Mantle. Mantle doesn't affect Nvidia cards' performance, nor does it give people with AMD cards better visuals. 

That being said, Ubisoft also did a pretty shitty job optimizing the game and game engine itself, which also contributed to the problems Watch Dogs has.


----------



## Prima.Vera (Jun 5, 2014)

To be honest, the graphics in Watch Dogs are not as spectacular as one would have thought. Even on ULTRA it looks cartoonish and flat.



spectatorx said:


> "NVIDIA's GameWorks technology includes TXAA antialiasing, which provides Hollywood-levels of smooth animation" - so do they mean all their games will be running at 23 fps? I think fps in movies is not the same as fps in games, and it leads me to the conclusion: they do not think.


Again, you are comparing apples with waffles.


----------



## renz496 (Jun 5, 2014)

Prima.Vera said:


> Still promoting the junk called TXAA I see. How can they still pursue this garbage forward, is beyond my understanding....



People have been calling PhysX junk for years, but did it stop Nvidia from further developing the tech and pushing it to game developers?


----------



## HM_Actua1 (Jun 5, 2014)

Let the AMD rage begin......


----------



## FrustratedGarrett (Jun 5, 2014)

Hitman_Actual said:


> Let the AMD rage begin......



People are voicing legitimate complaints. It doesn't seem like GameWorks is good for anything other than destroying performance. The game is mediocre-looking and yet it performs like crap.

Comparing Mantle to GameWorks doesn't work. Mantle can be compared to Nvidia's PhysX or CUDA as a so-far proprietary API. Having a Mantle renderer doesn't affect performance on Nvidia cards.

GameWorks is basically Nvidia-optimized compiled libraries that no other company can look into or optimize their drivers for. It's a violation of the industry's ethics and doesn't help anyone, including Nvidia. I mean, the game still runs like crap on Nvidia's cards too.


----------



## Hilux SSRG (Jun 5, 2014)

AMD hardware is in the XBOX/PS4 with Mantle optimizations. That's why NVidia is pushing GameWorks more and more. It's their way to get in the middle and make sure stuff runs well on their discrete gfx cards.


----------



## Casecutter (Jun 5, 2014)

Promoting PC gaming on proprietary hardware is a serious impairment to innovation within gaming houses; they'll all pull from the same graphics libraries... gamers should not be happy with such ideas! Gaming houses will be beholden to waiting for Nvidia's/AMD's next big library of graphics innovations running on their latest supposed hardware abilities, and then that hardware will want rehashed games to showcase it. That's backwards: gaming houses should be competing to develop innovations and pushing the hardware makers to build for them. Open source/cross-platform is in all gamers' best interests, and we should resist on a common front, unless you want nothing other than "console" PCs.


----------



## sweet (Jun 5, 2014)

renz496 said:


> People have been calling PhysX junk for years, but did it stop Nvidia from further developing the tech and pushing it to game developers?


TXAA and PhysX are junk, but developers still use them because nVidia feeds them with money, a lot of it. And thanks to nVidia's success in the professional scene, they have enough money to play dirty, and then charge a premium to consumers to get even more money.


----------



## Fluffmeister (Jun 5, 2014)

It is interesting reading views around the inter-webs about this. The goalposts do seem to have moved a bit in recent years.

I gather closed and proprietary is just fine now, as long as it doesn't affect the very fiercest rivals they are in direct competition with.


----------



## SKL_H (Jun 5, 2014)

I use Nvidia graphics, and I have to say that this "GameWorks" stuff is just Nvidia saying "hey, we sell expensive GPUs and we want people to buy them anyway" and Ubisoft helping them. And people do buy, and developers use their tech; that's just the way it is.

"THE WAY IT'S MEANT TO BE"


----------



## Eric_Cartman (Jun 5, 2014)

btarunr said:


> Radeon users have no TXAA, so we're left with 4x MSAA to have any hope of clean graphics, and 4x MSAA roasts the GPU, leading to throttling and rubber-banding.



So your problem with the game is that AMD can't manage to put out graphics cards that don't overheat and start to throttle, so demanding games run like shit on them, and this is somehow the fault of the game developers and nVidia. Wow!


----------



## FrustratedGarrett (Jun 5, 2014)

SKL_H said:


> I use Nvidia graphics, and I have to say that this "GameWorks" stuff is just Nvidia saying "hey, we sell expensive GPUs and we want people to buy them anyway" and Ubisoft helping them. And people do buy, and developers use their tech; that's just the way it is.
> 
> "THE WAY IT'S MEANT TO BE"



I don't think it's fair to force people to buy high-end graphics cards to play games that could run perfectly well on mid-range $200ish graphics cards. I don't support that and my next GFX ain't gonna be an Nvidia one.


----------



## Xzibit (Jun 5, 2014)

SKL_H said:


> I use Nvidia graphics, and I have to say that this "GameWorks" stuff is just Nvidia saying "hey, we sell expensive GPUs and we want people to buy them anyway" and Ubisoft helping them. And people do buy, and developers use their tech; that's just the way it is.
> 
> "THE WAY IT'S MEANT TO BE"



Like last time: *Nvidia Closes $5 Million Deal with Ubisoft*.

Ubisoft had been using Intel's Havok, so if switching to GameWorks brings in extra revenue instead of costing money, it's a smart business move for Ubisoft.

We've gone from companies paying for exclusive bundles, to exclusive optimization, to possibly only-me optimization on a standard API.


----------



## Fluffmeister (Jun 5, 2014)

Xzibit said:


> We've gone from companies paying for exclusive bundles, to exclusive optimization, to possibly only-me optimization on a standard API...



...to only-me optimization on a proprietary API.

*AMD paid up to $8 million for Battlefield 4 deal*

http://bf4central.com/2013/10/amdamd-paid-ea-5-million-battlefield-4-deal/


----------



## Durvelle27 (Jun 5, 2014)

Eric_Cartman said:


> So your problem with the game is that AMD can't manage to put out graphics cards that don't overheat and start to throttle, so demanding games run like shit on them, and this is somehow the fault of the game developers and nVidia. Wow!


This post is hilarious


----------



## arbiter (Jun 6, 2014)

Recus said:


> This remind me Dirt Showdown and global illumination.



I've seen enough graphs from AMD like this that are complete BS, and this looks like another one.



Xzibit said:


> Ignorant as always
> Ever occurred to you people might own both ?..
> Mantle doesn't affect competitors DX performance.  The compiled .DLL issue with GameWorks was raised by developers through social media and how it was bad for the industry before Nvidia partially opened it up to certain features after pressure.



As for saying Mantle doesn't affect it, how can you be sure? Now that devs have to work on the Mantle API in their game, debugging and optimizing it takes away time from working on and optimizing the game on DX as a whole.




FrustratedGarrett said:


> Comparing Mantle to Game Works doesn't work. Mantle can be compared to Nvidia's Physix or CUDA as a so-far proprietary API. Having a Mantle renderer doesn't affect performance on Nvidia cards.
> Game Works is basically Nvidia-optimized compiled libraries that no other company can look into or optimize their drivers for. It's a violation of the industry's ethics and doesn't help anyone, including Nvidia. I mean the game still runs like crap on Nvidia's cards too.



Unlike Mantle, the source for GameWorks is available. You do need to license it, but at least devs can get it, unlike Mantle, which is completely locked down so that no one outside a few devs and AMD can get it.



Durvelle27 said:


> This post is hilarious



It is true; AMD put out a card that has to run at 95°C and clock 200 MHz higher to even compete.


Besides all that, it's sad how much AMD whines about half the things Nvidia does, yet you don't see Nvidia whine about AMD. Nvidia calls their driver devs up and tells them to fix it; AMD just has their PR complain.


----------



## sweet (Jun 6, 2014)

arbiter said:


> I've seen enough graphs from AMD like this that are complete BS, and this looks like another one.
> 
> As for saying Mantle doesn't affect it, how can you be sure? Now that devs have to work on the Mantle API in their game, debugging and optimizing it takes away time from working on and optimizing the game on DX as a whole.
> 
> ...



Do your research. You can register with AMD to get Mantle code; only the SDK requires a subscription fee, though.

And my 290X runs at 55°C, you got a problem with that? They saved money on the reference cooler, and I'm fine with that because I put a block on it anyway. And it's still cheaper than a vanilla 780 Ti.


----------



## Xzibit (Jun 6, 2014)

arbiter said:


> As for saying Mantle doesn't affect it, how can you be sure? Now that devs have to work on the Mantle API in their game, debugging and optimizing it takes away time from working on and optimizing the game on DX as a whole.



In case you haven't noticed, Mantle is a separate API. GameWorks is libraries that cater to Nvidia hardware on the current DirectX API. If you feel that way about Mantle, you should be more pissed off at Nvidia's GameWorks.



arbiter said:


> Besides all that, it's sad how much AMD whines about half the things Nvidia does, yet you don't see Nvidia whine about AMD. Nvidia calls their driver devs up and tells them to fix it; AMD just has their PR complain.



At least read something and educate yourself on the subject.

*Extremetech - GameWorks FAQ: AMD, Nvidia, and game developers weigh in on the GameWorks controversy*



> Nvidia’s Tony Tamasi acknowledged on the phone that there are some bugs that can only be fixed by looking at the source.





> Nvidia: Developers can now license the right to see source code on GameWorks libraries in a standardized fashion (before March this was apparently handled on a case-by-case basis).





> Some pointed to the source code difference (keep in mind, you still have to buy/negotiate the right to GW library source code with Nvidia).





> Love it or hate it, no one sees GameWorks and Mantle as equivalents.


----------



## Frick (Jun 6, 2014)

Xzibit said:


> *Extremetech - GameWorks FAQ: AMD, Nvidia, and game developers weigh in on the GameWorks controversy*



I just want to make it clear I'm thanking you for the link, not anything else.


----------



## Relayer (Jun 6, 2014)

semantics said:


> I always find it laughable, to me it just seems like butt hurt people wanting everything anytime _mantel _or _gameworks _comes up.
> 
> Can't do _xxx_ because i don't own card from company _yyy_. A complaint that can easily be summed up because it feels wrong and it feels wrong because a person cannot have both without doing something unreasonable like having two gaming computers.
> 
> ...



*Please* read what the devs say about Mantle. Notice that it got Microsoft off their butts to make DX competitive with Mantle. Notice also that we now have "Metal", a new close-to-the-metal rendering API from Apple (and Plants vs. Zombies, a Mantle game (hmmm?), is the first to be ported to it). Mantle is the crest of a wave that is building in the gaming industry. Devs aren't supporting it because they are being compensated; devs have been begging anyone who would listen since DX9 for an API like Mantle. That's why they've embraced it. Someone finally listened to them.


----------



## Durvelle27 (Jun 6, 2014)

arbiter said:


> I've seen enough graph's from AMD like this that are complete BS and this looks like another one.
> 
> 
> 
> ...


Where do you get your information, buddy?


----------



## HisDivineOrder (Jun 6, 2014)

I look forward to more blur filtering by nVidia-exclusive AA modes and more lazy Ubisoft porting, which will swiftly be blamed on nVidia despite Ubisoft's long, storied history of lazy porting.


----------



## semantics (Jun 6, 2014)

Relayer said:


> *Please* read what the devs say about Mantle. Notice that it got Microsoft off their butts to make DX competitive with Mantle. Notice also that we now have "Metal", a new close-to-the-metal rendering API from Apple (and Plants vs. Zombies, a Mantle game (hmmm?), is the first to be ported to it). Mantle is the crest of a wave that is building in the gaming industry. Devs aren't supporting it because they are being compensated; devs have been begging anyone who would listen since DX9 for an API like Mantle. That's why they've embraced it. Someone finally listened to them.


You know, those totally unbiased devs that have no connections to AMD and their programs; just like the opinions of the unbiased devs who talk about GameWorks with no conflict of interest toward Nvidia.

Devs have been begging since DX9 for an API that locks them into one GPU or another? Where did you get that, AMD's PR team? Plants vs. Zombies is from EA, who have ties with AMD to push Mantle, not because they believe in it but because they are paid to. Plus, drafts of OpenGL and DirectX changes similar to Mantle, but GPU-agnostic, were public months before Mantle was, which shows this isn't more than an AMD PR move. Considering DX12 is another large rewrite of DX, as DX10 was, it's something that takes years to develop, not just a month or two after Mantle's not-really-public release. The only difference between Mantle and GameWorks, to me, is that Mantle is less sustainable. Both are piles of shit.

GameWorks is middleware, something AMD abandoned for the most part, and Mantle is just a separate code path for AMD products.

Like I said, they are both attempts at showcasing one company's product: not to put the competitor's product down, but to make theirs seem special. Both are cheats; neither is malicious towards the other, but cheats nonetheless.


----------



## Eric_Cartman (Jun 6, 2014)

Durvelle27 said:


> This post is hilarious


Hilariously accurate. I don't have any problems with throttling using MSAA on my 660s.



sweet said:


> And my 290x run at 55c, you got a problem with that? They save money on ref cooler and I'm fine with that cause I put a block on it anyways. And still cheaper than a vanilla 780Ti, well.



Yeah, you have to water-cool the 290X just to keep it from throttling. Even the aftermarket air coolers that all the AMD fanboys said would save the 290X still couldn't keep the thing cool, and they still ended up losing out to a stock 780 Ti. But if you are looking at expense, just buy a 780. It's the same price as a 290X, at stock performs within 10% of a 290X, overclocks to match or better a 290X (because you can't overclock a 290X on air cooling), and you don't have to spend $200 on a liquid cooling system to keep it from throttling and making games perform like shit!


----------



## 64K (Jun 6, 2014)

UbiSlop is a console publisher. That's their bread and butter. UbiSoft will be Nvidia's whore whenever they can profit from it. To them it's just business as usual.


----------



## TheGuruStud (Jun 6, 2014)

Eric_Cartman said:


> Hilariously accurate.  I don't have any problems with throttling using MSAA on my 660's.
> 
> 
> 
> Yeah, you have to water cool the 290x just to keep it from throttling.  Even the aftermarket air coolers, that all the AMD fanboys said would save the 290x, still couldn't keep the thing cool, and they still ended up losing out to a stock 780Ti.  But if you are looking at expense, just buy a 780.  They are the same price as a 290x, at stock perform within 10% of a 290x, overclock to match or better a 290x (because you can't overclock a 290x on air cooling), and you don't have to spend $200 on a liquid cooling system to keep the 780 from throttling and making games perform like shit!



You must be trollin. Mine is OCed to 1180 and the temp is 71C max. The fan profile isn't very aggressive (normal fan noise). Try again.


----------



## Durvelle27 (Jun 6, 2014)

Eric_Cartman said:


> Hilariously accurate.  I don't have any problems with throttling using MSAA on my 660's.
> 
> 
> 
> Yeah, you have to water cool the 290x just to keep it from throttling.  Even the aftermarket air coolers, that all the AMD fanboys said would save the 290x, still couldn't keep the thing cool, and they still ended up losing out to a stock 780Ti.  But if you are looking at expense, just buy a 780.  They are the same price as a 290x, at stock perform within 10% of a 290x, overclock to match or better a 290x (because you can't overclock a 290x on air cooling), and you don't have to spend $200 on a liquid cooling system to keep the 780 from throttling and making games perform like shit!


I have an R9 290X @ 1150/1400, fan @ 65%. The highest the temps have ever gone is 82°C, but in the majority of games it sits around 73°C.


----------



## Eric_Cartman (Jun 6, 2014)

TheGuruStud said:


> You must be trollin. Mine is OCed to 1,180 and temp is 71C max. The fan profile isn't very aggressive (normal fan noise). Try, again.





Durvelle27 said:


> I have an R9 290X @ 1150/1400, fan @ 65%. The highest the temps have ever gone is 82°C, but in the majority of games it sits around 73°C.



Oh nice, neither of you could even manage a 100MHz overclock and you're trying to brag!

And all the reviews say these cards overheat.

BTA even said these cards overheat and throttle under load.

What you fools don't want to admit is that it isn't just about the GPU temperature.

The PWMs on these cards all overheat and cause the cards to throttle.

That is why a full coverage waterblock is the only way to stop these things from throttling under load!

No one wants to admit that these things are AMD's Fermis, but they are.


----------



## Durvelle27 (Jun 6, 2014)

Eric_Cartman said:


> "*Oh nice, neither or you could even manage a 100Mhz overclock and you're trying to brag!"*
> 
> And all the reviews say these cards overheat.
> 
> ...


Ummmm, I think you confused yourself


Stock AMD R9 290X is 1000/1250


@TheGuruStud 

Core: 1180 <---- (180MHz OC)

@Durvelle27

Core: 1150 <---- (150MHz OC)
Mem: 1400 <---- ( 150MHz OC)

Nonetheless, I haven't experienced any of the throttling you speak of
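For what it's worth, the overclock deltas listed above are easy to check as percentages (clock values taken from the posts in this thread; the stock 290X clocks of 1000/1250 MHz are as stated above):

```python
# Overclock deltas for the 290X figures quoted in this thread.
STOCK_CORE_MHZ = 1000
STOCK_MEM_MHZ = 1250

def oc_percent(stock_mhz: int, overclocked_mhz: int) -> float:
    """Return the overclock as a percentage of the stock clock."""
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100

# TheGuruStud: core 1180 MHz over a 1000 MHz stock clock
print(oc_percent(STOCK_CORE_MHZ, 1180))  # 18.0
# Durvelle27: core 1150 MHz, memory 1400 MHz
print(oc_percent(STOCK_CORE_MHZ, 1150))  # 15.0
print(oc_percent(STOCK_MEM_MHZ, 1400))   # 12.0
```

This is where the "15-18%" figure argued about below comes from: both core overclocks land in that band relative to the 1000 MHz stock clock.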


----------



## SKL_H (Jun 6, 2014)

FrustratedGarrett said:


> I don't think it's fair to force people to buy high-end graphics cards to play games that could run perfectly well on mid-range $200ish graphics cards. I don't support that and my next GFX ain't gonna be an Nvidia one.



I am thinking of AMD too, cause AMD cards are affordable. I mean, I always buy entry to mid-range GPUs, and if things like GameWorks continue we will end up with games that are not fully compatible with other GPUs, so let's just thank the DirectX API for now.

"cause the feature may have zantle, tantle, natle or zodiatle.... who knows"


----------



## Eric_Cartman (Jun 6, 2014)

Durvelle27 said:


> Ummmm i think you confused yourself
> 
> 
> Stock AMD R9 290X is 1000/1250
> ...


You are correct, I thought the stock clock on the 290x was 1100MHz. 

But the point is still the same, those overclocks are rubbish.

15-18% is horrible.

What does a stock 780 do? 20-25%

And if you haven't experienced any of the throttling, and hence none of the rubberbanding, then BTA's entire argument was wrong to begin with, wasn't it, and there really wasn't any problem with Watch Dogs.

So which is it?

We seem to have two AMD users claiming there isn't any throttling and no rubberbanding, but we have BTA saying the whole problem with the game is that AMD cards throttle and cause rubberbanding.

Funny.


----------



## Xzibit (Jun 6, 2014)

Maybe he was talking about this

*MaximumPC - Ubisoft Working on a Patch to Tame Watch Dog Issues on PC*



> Unlike graphics cards for the PC, next-generation consoles like the Xbox One and PlayStation 4 don't have dedicated RAM for graphics -- both consoles share 8GB across the entire system. Apparently this is causing issues on the PC with the way Watch Dogs is coded.
> 
> There are some steps you can take while you wait for a patch. First and foremost, be sure you're running the latest graphics card drivers from AMD and Nvidia. And secondly, Viard suggests turning down texture quality, level of Anti-Aliasing, or even display resolution.


----------



## Durvelle27 (Jun 6, 2014)

Eric_Cartman said:


> You are correct, I thought the stock clock on the 290x was 1100MHz.
> 
> But the point is still the same, those overclocks are rubbish.
> 
> ...


Man, I'm not gonna go back and forth with you.


----------



## Prima.Vera (Jun 7, 2014)

Forum derailment detected.


----------



## Relayer (Jun 7, 2014)

Eric_Cartman said:


> You are correct, I thought the stock clock on the 290x was 1100MHz.
> 
> But the point is still the same, those overclocks are rubbish.
> 
> ...










Look at this chart. The column on the left is boost clocks with out-of-the-box performance. The column on the right (uber) is boost clocks with a pair of 120mm fans blowing on the cards to stop throttling. This is why sites like [H] and Hardware.fr (where this chart comes from) report lower performance for GK110 cards than sites that simply run 1-2 minute benches. They warm the cards until clocks stabilize and don't give nVidia the advantage of higher boost clocks while cool.


----------



## arbiter (Jun 7, 2014)

Relayer said:


> Look at this chart. The column on the left is boost clocks with out-of-the-box performance. The column on the right (uber) is boost clocks with a pair of 120mm fans blowing on the cards to stop throttling. This is why sites like [H] and Hardware.fr (where this chart comes from) report lower performance for GK110 cards than sites that simply run 1-2 minute benches. They warm the cards until clocks stabilize and don't give nVidia the advantage of higher boost clocks while cool.



nVidia cards do have a base clock which they will run no matter what; AMD cards have only "up to," and they can throttle down as much as 30%, maybe even more. Corsair had a demo showing their water cooler mount for a 290x: two cards side by side, one with the Corsair water cooler on it, the other with the ref cooler. The ref-cooled card was throttled all the way down to the 750-780MHz range.  And the funny thing with that chart: no AMD card listed. I wonder why?

AMD kinda made a mistake with that 95C limit; they wanted their card to match nVidia, so they killed off temp headroom to do it.


----------



## Relayer (Jun 7, 2014)

arbiter said:


> nVidia cards do have a base clock which they will run no matter what; AMD cards have only "up to," and they can throttle down as much as 30%, maybe even more. Corsair had a demo showing their water cooler mount for a 290x: two cards side by side, one with the Corsair water cooler on it, the other with the ref cooler. The ref-cooled card was throttled all the way down to the 750-780MHz range.  And the funny thing with that chart: no AMD card listed. I wonder why?
> 
> AMD kinda made a mistake with that 95C limit; they wanted their card to match nVidia, so they killed off temp headroom to do it.


1. This is just to respond to the belief that only AMD cards throttle. The truth is both brands throttle, but people have been led to believe that it's an AMD-only trait.

2. The reason they aren't showing AMD cards is that this wasn't a comparison. It was the 780 Ti review.


----------



## Eric_Cartman (Jun 7, 2014)

Relayer said:


> 1. This is just to respond to the belief that only AMD cards throttle. The truth is both brands throttle, but people have been led to believe that it's an AMD-only trait.



At this point it is only AMD cards that throttle.

Throttling is  running *BELOW THE ADVERTISED BASE CLOCK SPEED*!

Not being able to run at the full boost speed all the time is NOT throttling.

AMD cards can't even maintain their base clock speeds, that is the issue we are discussing here and the issue BTA is claiming is Ubisoft/nVidia's fault, for whatever stupid reason!


----------



## Jurassic1024 (Jun 7, 2014)

natr0n said:


> oh great purposely poor performing games ahead.



Just like Mantle.


----------



## Jurassic1024 (Jun 7, 2014)

sunaiac said:


> Awesome, the rape will go on
> nVidia, destroying PC video gaming the way it's meant to be destroyed.



Destroying it? Um, nVIDIA isn't in any consoles and their move to mobile is quite slow. All nVIDIA has is PC gaming, and you think they are destroying it? WOW. Just wow.


----------



## arbiter (Jun 7, 2014)

Eric_Cartman said:


> At this point it is only AMD cards that throttle.
> 
> Throttling is  running *BELOW THE ADVERTISED BASE CLOCK SPEED*!



AMD cards don't have a base clock speed; they say "up to xxxx MHz." It's like an ISP when they say up to 50 Mbit.


----------



## Relayer (Jun 7, 2014)

Eric_Cartman said:


> At this point it is only AMD cards that throttle.
> 
> Throttling is  running *BELOW THE ADVERTISED BASE CLOCK SPEED*!
> 
> ...



I can appreciate that that is your definition. Considering AMD doesn't advertise anything except its max boost clocks, though, I think you are painting yourself into a corner a bit. It also allows for shady marketing from both companies. I'll give you mine so you can understand what I am saying. Thermal throttling is clocks being reduced to stop the chip from running over its temp limit. Both companies have thermal limits built into their cards, and both companies' top chips run into their limits with reference coolers.
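That definition of thermal throttling can be illustrated with a toy model. This is only a sketch of the general idea, not either vendor's actual boost algorithm, and all the clock and temperature numbers here are made up for illustration:

```python
# Toy thermal-throttle controller: step clocks down when over the thermal
# limit, step them back up toward max boost when there is headroom.
# All figures are hypothetical, not real firmware behavior.
BASE_MHZ = 1000    # advertised base clock
BOOST_MHZ = 1200   # maximum boost clock
TEMP_LIMIT_C = 95  # thermal limit
STEP_MHZ = 25      # clock adjustment per control step

def next_clock(current_mhz: int, temp_c: float) -> int:
    """Return the next core clock given the current die temperature."""
    if temp_c > TEMP_LIMIT_C:
        # Over the limit: shed clocks, potentially even below the base
        # clock if the cooler can't keep up.
        return current_mhz - STEP_MHZ
    # Headroom: climb back up, capped at the maximum boost clock.
    return min(current_mhz + STEP_MHZ, BOOST_MHZ)

clock = BOOST_MHZ
for temp in [80, 90, 96, 97, 98, 99, 100]:
    clock = next_clock(clock, temp)
print(clock)  # 1075: five over-limit readings pulled the clock down
```

The point of the sketch is that "throttling" in this sense is symmetric: any chip with a thermal limit will shed clocks under sustained load with an inadequate cooler, and whether it dips below the advertised base clock depends on the cooler, not the brand.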

I think we should drop the off topic though. This isn't really about the game at all. It's just another AMD/nVidia pissing contest.


----------



## sweet (Jun 8, 2014)

Jurassic1024 said:


> Destroying it? Um, nVIDIA isn't in any consoles and their move to mobile is quite slow. All nVIDIA has is PC gaming, and you think they are destroying it? WOW. Just wow.


The most profitable market for nVidia is the professional scene, FYI.
And if nVidia keeps these GameWorks stories running, they will destroy PC gaming for sure.

Imagine the day 50% of PC games only run well on nVidia and the others only well on AMD. What should a consumer do? They will buy a PS4.  And then the studios? They will only optimize the console versions and give scraps to PC gamers.


----------



## arbiter (Jun 9, 2014)

sweet said:


> The most profitable market for nVidia is the professional scene, FYI.
> And if nVidia keeps these GameWorks stories running, they will destroy PC gaming for sure.
> 
> Imagine the day 50% of PC games only run well on nVidia and the others only well on AMD. What should a consumer do? They will buy a PS4.  And then the studios? They will only optimize the console versions and give scraps to PC gamers.



On optimizing the console version: that has been the case for the last few years; it runs good on console, then they half-ass port it to PC.  Even now, nVidia has over 50% of the PC gaming market. For numbers I will use the Steam hardware survey, since that is a rather large pool, so it should be pretty close to the actual market: 52.4% is nVidia, AMD has 30.7%. As for people only buying consoles, that is unlikely, cause the GPU in a console is what I would barely label a mid-range part, low mid-range mostly.


----------



## btarunr (Jun 10, 2014)

Razorfang said:


> Yet developers are choosing to use GameWorks regardless of everything you said.



That doesn't in any way invalidate my argument.

NVIDIA is vomiting GameWorks around, because it wants more people to buy GeForce. Sadly, people won't base their next GPU purchase decision over GameWorks, they'll base it around how many FPS a GPU is offering, and at what price, like they always have.

With Xbox One and PS4, an increasing number of games will be inherently optimized for Radeon. All GameWorks does is ruin that optimization with pointless code that runs slow on GCN.


----------



## HM_Actua1 (Jun 11, 2014)

Eric_Cartman said:


> So your problem with the game is that AMD can't manage to put out graphics cards that don't overheat and start to throttle, so demanding games run like shit on them, and this is some how the fault of the game developers and nVidia.  Wow!


lol right....

AMD never ceases to amaze me with their finger-pointing BS.


----------



## Xzibit (Jun 11, 2014)

Hitman_Actual said:


> lol right....
> 
> AMD never ceases to amaze me with their finger-pointing BS.



The real BS is the idea that middleware that favors one hardware vendor will clean up engine code in the first place, or somehow make console ports more efficient.  In the end you just get small visual enhancements for one hardware vendor, with its library bolted onto the current code/port, making it worse.

They are getting the middleware for free with a big dollar amount on top of it just to add a few effects on the PC version.

This just fuels big studios/publishers to be even lazier when it comes to PC games.


----------



## sweet (Jun 11, 2014)

btarunr said:


> That doesn't in any way invalidate my argument.
> 
> NVIDIA is vomiting GameWorks around, because it wants more people to buy GeForce. Sadly, people won't base their next GPU purchase decision over GameWorks, they'll base it around how many FPS a GPU is offering, and at what price, like they always have.
> 
> With Xbox One and PS4, an increasing number of games will be inherently optimized for Radeon. All GameWorks does is ruin that optimization with pointless code that runs slow on GCN.


Finally someone points this out.
Actually, if it were just a bad PC port, AMD would have an advantage thanks to their higher VRAM and the similarity of micro-architecture (GCN). But a bad port with GameWorks is somehow a different story.


----------



## omnimodis78 (Jun 12, 2014)

sweet said:


> Finally someone points this out.
> Actually, if it were just a bad PC port, AMD would have an advantage thanks to their higher VRAM and the similarity of micro-architecture (GCN). But a bad port with GameWorks is somehow a different story.


Their higher video RAM?  I didn't know AMD had a monopoly on higher video memory solutions in the GPU market.  A better port thanks to GCN?  I believe there have been Gaming Evolved titles that ran better on comparable NVIDIA GPUs.  I might be one of those people that understands what GameWorks is, so I'm not defending it, but why should end-users be battling this out like it's some big conspiracy?  You want to run a game that is NVIDIA "supported", get an NVIDIA card, just like those who have NVIDIA cards have to settle for Gaming Evolved games.  We're not talking about UNICEF here, people; it's corporations using marketing tools.


----------



## Relayer (Jun 13, 2014)

omnimodis78 said:


> Their higher video RAM?  I didn't know AMD had a monopoly on higher video memory solutions in the GPU market.  A better port thanks to GCN?  *I believe there have been Gaming Evolved titles that ran better on comparable NVIDIA GPUs.*  I might be one of those people that understands what GameWorks is, so I'm not defending it, but why should end-users be battling this out like it's some big conspiracy?  You want to run a game that is NVIDIA "supported", get an NVIDIA card, just like *those who have NVIDIA cards have to settle for Gaming Evolved games.*  We're not talking about UNICEF here, people; it's corporations using marketing tools.



Aren't these contradictory?


----------



## Ebo (Jul 2, 2014)

Well, i still think that the close relation between Ubisoft and Nvidia will hurt games on pc as a whole.

1. Nvidia fucked up with PhysX; not that many games support it after a decade. Most games use Intel's Havok, which can do almost the same, or the difference isn't worth mentioning.

2. Now Nvidia has gone to bed with Ubisoft, which might hurt owners of Radeon cards performance-wise on Ubisoft titles in the future. Or not, since AMD delivers to the Xbox One and also the PS4; that has to be taken into account, since the consoles are where the money is, not the PC game market, which is only a niche now.

I would like studios to embrace Mantle, since it only makes DX11 look bad, no matter which GFX card you have in your machine. If that gets Microsoft off their feet, and maybe makes DX12 function much better than DX10, DX10.1 and DX11, then I'm a happy camper.

So far Mantle gets support from more studios, and I reckon, since AMD sits in all the consoles, Nvidia shits themselves that they're missing out.

I don't care if my setup runs 3 frames less on AMD than on an Nvidia card; as long as my gaming experience is solid and enjoyable, I'm happy.

I, for one, hate monopoly, so I always back the little guy. That's why I've shifted from Intel/Nvidia to purely AMD for my main computer. It can run everything I want it to do, so I'm happy with the change, end of story.


----------



## HM_Actua1 (Jul 2, 2014)

Ebo said:


> Well, i still think that the close relation between Ubisoft and Nvidia will hurt games on pc as a whole.
> 
> 1. Nvidia fucked up with PhysX; not that many games support it after a decade. Most games use Intel's Havok, which can do almost the same, or the difference isn't worth mentioning.
> 
> ...



You must have been fed retard sandwiches your entire life. Ever heard of Unreal engine?


----------



## Ebo (Jul 2, 2014)

Hitman_Actual said:


> You must have been fed retard sandwiches your entire life. Ever heard of Unreal engine?


Unreal WHO? Epic has done nothing to push their engine to the limits in the last 2 years (I hope); otherwise Frostbite and the refurbished Dunia will run them over, like they already have.


----------



## HM_Actua1 (Jul 3, 2014)

Ebo said:


> Unreal WHO? Epic has done nothing to push their engine to the limits in the last 2 years (I hope); otherwise Frostbite and the refurbished Dunia will run them over, like they already have.


lol uhhh forums are funny. It's like speaking to the mythical unicorns of tards.


----------



## Relayer (Jul 5, 2014)

Hitman_Actual said:


> You must have been fed retard sandwiches your entire life. Ever heard of Unreal engine?





Hitman_Actual said:


> lol uhhh forums are funny. It's like speaking to the mythical unicorns of tards.



Do you have anything to say about the topic, or does calling people names give you some sort of cheap thrill?


----------



## Jurassic1024 (Jul 5, 2014)

sweet said:


> The most profitable market for nVidia is the professional scene, FYI.
> And if nVidia keeps these GameWorks stories running, they will destroy PC gaming for sure.
> 
> Imagine the day 50% of PC games only run well on nVidia and the others only well on AMD. What should a consumer do? They will buy a PS4.  And then the studios? They will only optimize the console versions and give scraps to PC gamers.



When I said gaming I also meant workstation.
GameWorks won't destroy gaming. What game studio in their right mind would continue to use it if it hurts gamers, aka their customers and potential customers? Especially now that the practices nVIDIA is using (DLLs vs source code) are out in the open as bad form that hurts more than it helps. GameWorks is bad, but I'm not calling it the end of the world, because that would just be insanely foolish.


----------



## Jurassic1024 (Jul 5, 2014)

Relayer said:


> I think we should drop the off topic though. This isn't really about the game at all. It's just another AMD/nVidia pissing contest.



It's far from just a pissing contest when it hurts gamers. Don't get it twisted.


----------



## MxPhenom 216 (Jul 6, 2014)

Hilux SSRG said:


> AMD hardware is in XBOX/PS4 with Mantle optimizations.  That's why NVidia is pushing more and more Gameworks.  It's there way to get in the middle and make sure stuff runs well on their discrete gfx cards.


 
Xbox One and PS4 don't use Mantle. AMD has already said Mantle is not intended to be used on consoles; it's strictly a PC API. Consoles don't need Mantle because of the way developers are able to code to the metal already.


----------



## MxPhenom 216 (Jul 6, 2014)

Eric_Cartman said:


> Oh nice, neither or you could even manage a 100Mhz overclock and you're trying to brag!
> 
> And all the reviews say these cards overheat.
> 
> ...


 
You are making the world's biggest fool of yourself.


----------



## eidairaman1 (Jul 7, 2014)

mroofie said:


> And yet what recus mentioned above is true; stop being a fanboy. All of the evidence shows that if AMD had more developers in their pocket, the same would happen to us. AMD is far from being classified as a "good" company!!



Yeah, and nvidia is worse, so stop being a fanboy


----------

