# AMD Beats NVIDIA's Performance in the Battlefield V Closed Alpha



## Raevenlord (Jul 4, 2018)

A report via PCGamesN points to some... interesting performance positioning when it comes to NVIDIA and AMD offerings. Battlefield V is being developed by DICE in collaboration with NVIDIA, but there seems to be some sand in the gears of performance improvements as of now. I say this because, according to the report, AMD's RX 580 8 GB graphics card (the only red GPU to be tested) bests NVIDIA's GTX 1060 6 GB... and by quite a considerable margin at that.

The performance difference across both 1080p and 1440p scenarios (with Ultra settings) hovers around the 30% mark, and, as has usually been the case, AMD's offerings better NVIDIA's when a change of renderer - to DX12 - is made: AMD's cards teeter between consistent and worsening performance under DX12, while NVIDIA's GTX 1060 consistently delivers worse performance levels. Perhaps we're witnessing some bits of AMD's old collaboration efforts with DICE? Still, it's too early to cry wolf right now - performance will only likely improve between now and the October 19th release date.



 

 



*View at TechPowerUp Main Site*


----------



## mtcn77 (Jul 4, 2018)

I bet - after signing _the treaty of Versailles_ - you couldn't 'cry wolf' if Nvidia were ahead.


----------



## dj-electric (Jul 4, 2018)

AMD beats NVIDIA at something, let's make a news piece. This is big, you guys.

_What? A patch and a driver update might iron it out? Damn..._


----------



## Vya Domus (Jul 4, 2018)

Meh, because of the way Nvidia's hardware works, they always need more driver work. Can't say this means much.


----------



## kastriot (Jul 4, 2018)

I smell green jealousy


----------



## dj-electric (Jul 4, 2018)

kastriot said:


> I smell green jealousy



Literally thousands of people who own geforce cards are marching in the street for their pre-patched BFV closed alpha performance.


----------



## Raevenlord (Jul 4, 2018)

dj-electric said:


> AMD beats NVIDIA at something, let's make a news piece. This is big, you guys.
> 
> _What? A patch and a driver update might iron it out? Damn..._



AMD beats NVIDIA in a "The Way It's Meant to be Played" game (worthy of news already), with a graphics card that usually delivers the same performance as its NVIDIA counterpart.

And your "damn..." revelation was mentioned at the end of the piece, so it's not even that we (I) tried to make it sound more than it is.

I mean, this isn't anywhere even remotely going after NVIDIA, I don't understand your attitude.


----------



## dj-electric (Jul 4, 2018)

Raevenlord said:


> I mean, this isn't anywhere even remotely going after NVIDIA, I don't understand your attitude.



Have you followed the previous BF games' performance progression?
Do you honestly think the situation will remain like this for the final game?


----------



## HD64G (Jul 4, 2018)

+60-70% in minimum FPS for the 580 vs. the 1060 is a lot for an nVidia-sponsored game. The Crew 2 also shows good results for AMD vs. nVidia, without any driver update from AMD's side. Weird times we live in...


----------



## the54thvoid (Jul 4, 2018)

Vya Domus said:


> Meh, because of the way Nvidia's hardware works, they always need more driver work. Can't say this means much.



Yup, that's the compute for you.

Once the game is closer to release, I'm quite sure Nvidia's driver team will be working their usual magic.


----------



## Raevenlord (Jul 4, 2018)

dj-electric said:


> Have you followed the previous BF games' performance progression?
> Do you honestly think the situation will remain like this for the final game?



"Perhaps we're witnessing some bits of AMD's old collaboration efforts with DICE? Still, it's too early to cry wolf right now - performance will only likely improve between now and the October 19th release date."

That's literally in the article.


----------



## Xzibit (Jul 4, 2018)

Doesn't Nvidia tout working with game devs from day 1?

FB3 was one of the engines being shown off with RTX.


----------



## Blueberries (Jul 4, 2018)

Damn... comparing a 120W card to a 185W card (that's more power draw than a GTX 1080). The bias is real.


----------



## Jism (Jul 4, 2018)

Blueberries said:


> Damn... comparing a 120W card to a 185W card (that's more power draw than a GTX 1080). The bias is real.



60 watts... what are we talking about here? People waste more power these days leaving their systems on day in and day out.

Radeon cards have a lot of potential and uncorked performance. This probably does it the right way.


----------



## Nkd (Jul 4, 2018)

Blueberries said:


> Damn... comparing a 120W card to a 185W card (that's more power draw than a GTX 1080). The bias is real.



lol. As long as it delivers that much performance, it's delivering damn good performance per watt. At the current performance level, your statement makes no damn sense for this game.


----------



## Fluffmeister (Jul 5, 2018)

This does seem to be clutching at straws in the extreme, but hey, go AMD!

If the picture doesn't change in four months' time... when the game is actually out, then AMD can certainly pop open the champagne bottles.


----------



## Mistral (Jul 5, 2018)

"AMD's RX 580 8 GB graphics card (the only red GPU to be tested) bests NVIDIA's GTX 1060 6GB"

That seems in line with how the cards perform. I'm sure, though, that nVidia will do something to "fix" it; it's a game in "collaboration with NVIDIA", after all.


----------



## Litzner (Jul 5, 2018)

Is this that surprising? I thought it was always pretty common for AMD to beat Nvidia when it comes to low-level hardware pipelines like DX12 and Vulkan? We have just hardly seen any games built on DX12 or Vulkan.


----------



## AsRock (Jul 5, 2018)

Until the nVidia junk is added to the game... I mean, I wonder if it will be on release.


----------



## TheGuruStud (Jul 5, 2018)

AsRock said:


> Until the nVidia junk is added to the game... I mean, I wonder if it will be on release.



Yeah, this is nothing gimpworks can't fix.

It'll be ballsy to kill AMD perf, since these numbers are out.


----------



## HTC (Jul 5, 2018)

Mistral said:


> "AMD's RX 580 8 GB graphics card (the only red GPU to be tested) bests NVIDIA's GTX 1060 6GB "
> 
> That seems in line with how the cards perform. *I'm sure though that nVidia will do something to "fix" it, it's a game in "collaboration with NVIDIA" after all.*



Simple: just have the publisher remove the DX12 version, and nVidia leads again.

They did something similar with a game (I don't recall its name) that had DX10.1 support, which ran much better on AMD's cards, but nVidia "managed" to get the game's publisher to remove DX10.1 in favor of the already-supported DX10, which of course ran much better on nVidia's cards.


----------



## FordGT90Concept (Jul 5, 2018)

The RX 580 8 GiB has significantly more compute resources than the GTX 1060 6 GiB.  This was inevitable as optimizations are made for Polaris, and not just in Battlefield.  The RX 580's average comes out ahead of the GTX 1060's at all resolutions now.


----------



## lewis007 (Jul 5, 2018)

Well, at least it's good to know that my RX 470 should be good enough for this title.


----------



## AsRock (Jul 5, 2018)

TheGuruStud said:


> Yeah, this is nothing gimpworks can't fix.
> 
> It'll be ballsy to kill AMD perf, since these numbers are out.



I was just having a dig at nVidia is all. For all we know, nVidia might not have done any/many performance tweaks.

In fact, the thread is pretty pointless when you think about it.


----------



## prnsforum (Jul 5, 2018)

Good news: it's only 1 game, and it's still in alpha.


----------



## Fluffmeister (Jul 5, 2018)

prnsforum said:


> Good news: it's only 1 game, and it's still in alpha.



After Vega let them down, they need something... anything. If it's a closed alpha of a game months away... I say let them have their win.


----------



## TheinsanegamerN (Jul 5, 2018)

Litzner said:


> Is this that surprising? I thought it was always pretty common for AMD to beat Nvidia when it come to low level hardware pipelines like DX12 and Vulkan? We have just hardly seen any games built in DX12 or Vulkan.


That was the case originally, but over time more DX12 games have started favoring Nvidia (because, surprise surprise, when one company owns 70+% of the market, you optimize for that OEM first). Battlefield has been favoring AMD for a while though, since at least Battlefield 4, when Mantle was implemented into the Frostbite engine. 

And with so many games either a dead tie or favoring Nvidia, news of a game heavily favoring AMD performance-wise is pretty unusual.


----------



## Xzibit (Jul 5, 2018)

Fluffmeister said:


> After Vega let them down they need something... anything, if it's a closed alpha of a game months away... i say let them have their win.



Yeah, you tell them!!! It's not like they are gaining market share or anything when they haven't released a new arch to compete with the mighty Pascal, right?


----------



## Fluffmeister (Jul 5, 2018)

Xzibit said:


> Yeah, you tell them!!! It's not like they are gaining market share or anything when they haven't released a new arch to compete with the mighty Pascal, right?



Chill. When I go fishing, I need to use bait. You're my first bite.


----------



## Space Lynx (Jul 5, 2018)

lulz. a rx 580 and a gtx 1060...  yawnfest 2018.


----------



## sergionography (Jul 5, 2018)

I doubt it's anything to do with DX12, since AMD performs much better in DX11 as well. AMD in general performs much better than Nvidia when a game isn't optimized; it's how AMD's architectures are designed: to be more general-purpose and require less software/driver work. Nvidia, on the other hand, makes streamlined designs that heavily rely on their robust driver/software team; in doing so, they can also quickly phase out older products to push for more sales of newer products.


----------



## TheGuruStud (Jul 5, 2018)

AsRock said:


> I was just having a dig at nVidia is all. For all we know, nVidia might not have done any/many performance tweaks.
> 
> In fact, the thread is pretty pointless when you think about it.



It's the same engine. What's there to do? Except Nvidia tricks...

Slow news day, I guess lol


----------



## TheinsanegamerN (Jul 5, 2018)

Xzibit said:


> Yeah, you tell them!!! It's not like they are gaining market share or anything when they haven't released a new arch to compete with the mighty Pascal, right?


LOLcalmdown.

GPU shipments != gaming market share. Yes, AMD is selling more cards, but cryptomining made up a huge number of their shipments. If that market falters, like it is starting to do, AMD would be in trouble.

Check out the Steam GPU numbers, from a platform that well north of 90% of PC gamers use. While AMD has managed to go from 8.9% to 15% of the GPU market, Nvidia dominates with 74.3% of the market. That means for every AMD user, there are roughly 5 Nvidia users. Perhaps even more damning, the 7700 series has more active GPUs than the RX 480. A 6-year-old low-end GPU pair has more users than AMD's 2-year-old mid-range flagship. And notice that Vega, a very useful mining chip, is completely absent from the list of active GPUs, suggesting that Vega 56 and 64 cards have less than 0.16% of the market on Steam. By comparison, the GeForce 1080 has 2.43% of the market, and the 1070 has 3.84%.

AMD shipments are up, as is Steam market share, but not by nearly as much as you think. When Nvidia dominates 3/4 of the market, and _EVERY SINGLE PASCAL CARD INDIVIDUALLY_ (with the exception of the latecomer 1030) has more active users than the most popular AMD card, and that AMD card is 6 years old, game developers are not going to pay AMD much attention.


----------



## Xzibit (Jul 5, 2018)

TheinsanegamerN said:


> LOLcalmdown.
> 
> GPU shipments != gaming market share. Yes, AMD is selling more cards, but cryptomining made up a huge number of their shipments. If that market falters, like it is starting to do, AMD would be in trouble.
> 
> ...



Am I not calm? I'm not the one pointing to opt-in polls.



			
Steam Hardware & Software Survey said:

> *Participation in the survey is optional*, and anonymous.



The numbers I linked to are from JPR's Q1 2018 report.

We've yet to hear of 300,000 GPUs being returned to AMD due to the mining slowdown. If they have, they are better at hiding it than Nvidia.

*Seeking Alpha: Nvidia Appears To Have A GPU Inventory Problem*


----------



## FordGT90Concept (Jul 5, 2018)

AMD has had a perpetual problem of not being able to keep up with demand for the last year.  I think JPR reflects that.


----------



## cucker tarlson (Jul 5, 2018)

Well, I guess it's official: the RX 580 aged better.


----------



## mtcn77 (Jul 5, 2018)

Rationalizations were made.


----------



## ZoneDymo (Jul 5, 2018)

Blueberries said:


> Damn... comparing a 120W card to a 185W card (that's more power draw than a GTX 1080). The bias is real.



Well, you compare based on the price they are sold at; performance matters per price category.


----------



## Fierce Guppy (Jul 5, 2018)

Here's another considerable margin.  The cheapest GTX 1060 is $399(NZD).

https://pricespy.co.nz/product.php?q=gtx 1060&p=3853855

 The cheapest AMD RX 580 is $499(NZD).

https://pricespy.co.nz/product.php?q=rx 580&p=4445327


----------



## sergionography (Jul 5, 2018)

Fierce Guppy said:


> Here's another considerable margin.  The cheapest GTX 1060 is $399(NZD).
> 
> https://pricespy.co.nz/product.php?q=gtx 1060&p=3853855
> 
> ...




Lol, cash me outside:
MSI Radeon RX 580 DirectX 12 RX 580 ARMOR MK2 8G OC 8GB 256-Bit GDDR5 Video Card, https://www.newegg.com/Product/Product.aspx?Item=14-137-290


EVGA GeForce GTX 1060 GAMING, ACX 2.0 (Single Fan), 06G-P4-6161-KR, 6GB GDDR5, DX12 OSD Support (PXOC), Only 6.8 Inches, https://www.newegg.com/Product/Product.aspx?Item=14-487-260


In the USA*, just noticed the NZD.
In other words, you are in the wrong country, lol. But I'm not gonna hate, because New Zealand is freaking awesome.


----------



## kastriot (Jul 5, 2018)

RX 580 8GB: MSRP $250
GTX 1060 6GB: MSRP $250


So forget the artificial prices made by the mining craze.


----------



## londiste (Jul 5, 2018)

Performance in an alpha? That is still several stages from release; this will likely change.
Anyone else wonder why they only compared the GTX 1060 and the RX 580? How about the 1070/1070 Ti/1080/1080 Ti/Vega 56/Vega 64?

DX12 is a red herring. Even the RX 580 performs worse in it than, or at best the same as, in DX11. At the same time, people on reddit/forums are still reporting stuttering in DX12 with both AMD and Nvidia GPUs.

Ultra settings were used, and the cards were apparently ASUS ROG Strix versions:
https://www.techpowerup.com/gpudb/b4418/asus-rog-strix-rx-580-gaming
https://www.techpowerup.com/gpudb/b4226/asus-rog-strix-gtx-1060-gaming


----------



## Tsukiyomi91 (Jul 5, 2018)

Just one game... 1 game where AMD has beaten Nvidia has made the whole world go mad... really, guys... and the game isn't even in its polished state.


----------



## looniam (Jul 5, 2018)

That Polaris card is at ~BF1 performance, but that Pascal... well, NV will have 3 game-ready and 8 hotfix drivers out within the first week.


----------



## nguyen (Jul 5, 2018)

Perhaps the RX 580 is not rendering something as it should? This happens all the time when games are in an alpha state. Hardly newsworthy, or is it?


----------



## huguberhart (Jul 5, 2018)

There are disadvantages for AMD GPU owners in the alpha - see the thread.


----------



## nemesis.ie (Jul 5, 2018)

@prnsforum How is it "good news" that it's only one game? It sounds like you want AMD users to have lower performance, or at least to always be behind NV. How is that good for anyone/the market/card prices, etc.? Competition is good for us consumers, even if you prefer one company over another.

@huguberhart Interesting, it appears AMD cards are even rendering more stuff (snow) while performing better (at the moment).

This also applies to @nguyen's comment; it seems the opposite may be true, and the NV cards are maybe not rendering something/as much?



TheinsanegamerN said:


> Check out the Steam GPU numbers, from a platform that well north of 90% of PC gamers use. While AMD has managed to go from 8.9% to 15% of the GPU market,



By your own numbers, that means AMD has nearly doubled their market share for (Steam) gaming, then?


----------



## las (Jul 5, 2018)

HD64G said:


> +60-70% in minimum FPS for the 580 vs. the 1060 is a lot for an nVidia-sponsored game. The Crew 2 also shows good results for AMD vs. nVidia, without any driver update from AMD's side. Weird times we live in...



Nvidia-sponsored, maybe, but DICE has been working closely with AMD for years.


----------



## PerfectWave (Jul 5, 2018)

TheinsanegamerN said:


> That was the case originally, but over time more Dx12 games have started favoring Nvidia (because surprise surprise, when one company owns 70+% of the market, you optimize for that OEM first). Battlefield has been favoring AMD for awhile though, since at least Battlefield 4, when Mantle was implemented into the Frostbite engine.
> 
> And with so many games either a dead tie or favoring nvidia, news of a game heavily favoring AMD performance wise is pretty unusual.



Those titles you mentioned are not full DX12 titles. Full DX12 titles run better on AMD, and that's it.

The Steam survey is bullshit. It asked me when I was running Steam on my VM... and it asks randomly, so it does not make a lot of sense to me.


----------



## londiste (Jul 5, 2018)

PerfectWave said:


> Those titles you mentioned are not full DX12 titles. Full DX12 titles run better on AMD, and that's it.


Why would a full DX12 title run better on AMD? The closest we have to full DX12 titles are AotS and Frostbite-based games (especially the Battlefields). They do not seem to have a more noticeable bias than some DX11 games, or some earlier DX9 games that do have a native bias toward one GPU architecture or another.

AFAIK the only DX12 game showing a noticeable and constant boost for DX12 over DX11 is Sniper Elite 4.

From more recent stories: The Crew 2 has the RX 580 beating the GTX 1060 by a large margin:
https://www.hardocp.com/article/2018/06/27/crew_2_video_card_performance_iq_preview/4


----------



## MuhammedAbdo (Jul 5, 2018)

londiste said:


> Why would a full DX12 title run better on AMD? The closest we have to full DX12 titles are AotS and Frostbite-based games (especially the Battlefields). They do not seem to have a more noticeable bias than some DX11 games, or some earlier DX9 games that do have a native bias toward one GPU architecture or another.
> 
> AFAIK the only DX12 game showing a noticeable and constant boost for DX12 over DX11 is Sniper Elite 4.
> 
> ...


A 6 fps difference (13%) is a large margin now?

I can show you a couple of games where the 1060 wipes the floor with the 580, starting with AC:O.

Like with Forza 7 and Destiny 2 before it, NVIDIA will release a driver that fixes the fps drop. Nothing to worry about, mates.


----------



## Boosnie (Jul 5, 2018)

What's the fuss? They will add tessellation.


----------



## Vya Domus (Jul 5, 2018)

TheinsanegamerN said:


> GPU shipments /= gaming market share. Yes, AMD is selling more cards, but cryptomining made up a huge number of their shipments. If that market falters, like it is starting to do, AMD would be in trouble.



Same old mantra.

Come on, dude, get real: Nvidia's stock has been snapped up by miners just as much. If you feel like those numbers are skewing AMD's market share, then the same is true for Nvidia as well. Otherwise, if you think that all of that 65.1% market share was driven purely by sales to gamers, you are being delusional.

To your absolute horror, AMD will be around for quite a while even if the crypto market falls flat. Hey, if you were to look back in history, you'd realize it has happened before, and guess what: they are still around, with increasing sales and market share. How about that.


----------



## Tsukiyomi91 (Jul 5, 2018)

A handful of "AMD optimized" games only yielding a 5-6 fps gain for an RX 580 8GB over a mid-range Pascal card (GTX 1060 6GB) is kinda laughable.

I wanna see how the frame times compare between those 2 GPUs & see which one has better overall latency.


----------



## Vya Domus (Jul 5, 2018)

Tsukiyomi91 said:


> A handful of "AMD optimized" games only yielding a 5-6 fps gain



As someone else pointed out, it's not a handful anymore; it's the majority. 

https://www.techpowerup.com/reviews/ASRock/RX_580_Phantom_Gaming_X/31.html


----------



## FordGT90Concept (Jul 5, 2018)

Yeah, those games MuhammedAbdo posted are the ones the 1060 6 GiB has a fairly significant lead in at 1920x1080. At 3840x2160, that lead evaporates to tenths of an fps, or the RX 580 even gets ahead.
Origins: https://www.techpowerup.com/reviews/ASRock/RX_580_Phantom_Gaming_X/7.html
GTAV: https://www.techpowerup.com/reviews/ASRock/RX_580_Phantom_Gaming_X/17.html

It's the aggregate that matters, because most people play a variety of games.

The RX 580 and GTX 1060 are both midrange cards.  In performance per dollar, the RX 580 actually beats all cards except the 3 GB version of the GTX 1060.  That said, 3 GB isn't enough in a growing number of games.


----------



## Th3pwn3r (Jul 5, 2018)

Best clickbait I've seen in a while. What a title. Instead of "RX 580 beats 1060", it's "AMD beats Nvidia". Good job showing bias and trying to start a war, haha.


----------



## las (Jul 5, 2018)

I'd rather see high-end GPU performance; these are 1080p cards.


----------



## INSTG8R (Jul 5, 2018)

las said:


> I'd rather see high-end GPU performance; these are 1080p cards.


Well, mainstream results are really the better thing to see. That's the majority of the market. Do I wanna see how my Vega does? Sure, but 1080p/midrange is the majority.


----------



## ssdpro (Jul 5, 2018)

Raevenlord said:


> "Perhaps we're witnessing some bits of AMD's old collaboration efforts with DICE? Still, It's too early to cry wolf right now - performance will only likely improve between now and the October 19th release date."
> 
> That's literally in the article.


Then again, the title of the article is literally "AMD Beats NVIDIA's Performance in the Battlefield V Closed Alpha".  The title is literally a blanket statement.


----------



## las (Jul 5, 2018)

INSTG8R said:


> Well, mainstream results are really the better thing to see. That's the majority of the market. Do I wanna see how my Vega does? Sure, but 1080p/midrange is the majority.



Better for whom? Most people who care about alpha performance and GPU benchmarks probably don't use mid-range cards.

They could have tested the Vega 64 and the 1080 / 1080 Ti too.


----------



## nemesis.ie (Jul 5, 2018)

Maybe they did; maybe the results were even more in AMD's favour... Vega is a newer architecture than the RX 580's. The 1060 uses the same one as the 1070 and 1080/Ti, right?

It would be nice to know for sure.


----------



## londiste (Jul 5, 2018)

nemesis.ie said:


> Maybe they did; maybe the results were even more in AMD's favour... Vega is a newer architecture than the RX 580's. The 1060 uses the same one as the 1070 and 1080/Ti, right?
> 
> It would be nice to know for sure.
> 
> It would be nice to know for sure.


In most comparisons where the RX 580 does well against the GTX 1060, Vega performs as usual against the GTX 1080 - more or less equal.


----------



## Vya Domus (Jul 5, 2018)

las said:


> Most people who care about alpha performance and GPU benchmarks probably don't use mid-range cards



Who says that's the case? You?


----------



## MuhammedAbdo (Jul 5, 2018)

FordGT90Concept said:


> Yeah, those games MuhammedAbdo posted are the ones that 1060 6 GiB has a fairly significant lead in at 1920x1080. At 3840x2160, that lead evaporates to tenths of an fps or RX 580 even gets ahead.


Who buys these cards for 4K anyway? Stop posting crap to justify AMD's underperformance in the most famous games.



FordGT90Concept said:


> It's the aggregate that matters because most people play a variety of games.


Apparently the aggregate doesn't matter anymore, according to this article; AMD fanboys are celebrating an alpha-build victory!

Anyway, like with Forza 7 and Destiny 2 before it, NVIDIA will release a driver that fixes the fps drop. Nothing to worry about, mates.


----------



## INSTG8R (Jul 5, 2018)

las said:


> Better for whom? Most people who care about alpha performance and GPU benchmarks probably don't use mid-range cards.
> 
> They could have tested the Vega 64 and the 1080 / 1080 Ti too.


Again, we're a minority with the halo cards and 1440p monitors. We're the "1%".


----------



## I No (Jul 5, 2018)

Wow... color me shocked... a game in *alpha* leaking some crap benches... how wonderful... clickbait, nothing else to see here. Both AMD and nVidia will release drivers when the game hits the stores...


----------



## Vya Domus (Jul 5, 2018)

MuhammedAbdo said:


> Anyway like Forza 7 and Destiny 2 before that, NVIDIA will release a driver that fixes the fps drop.



Destiny 2 ran significantly better on Nvidia *before and after release*. Guess it's hard to keep track with those green-tinted glasses on all the time, making up stuff that never happened.

You can spot a game that has been optimized through drivers on Nvidia during development from miles away; Destiny 2 was one of them, and oddly enough, BF5 isn't.


----------



## FordGT90Concept (Jul 5, 2018)

MuhammedAbdo said:


> Who buys these cards at 4K anyway? stop posting crap to justify AMD's underperformance in most famous games.


Which burdens the GPU more?  4K or 1080p?  4K.  AMD is not held back by its GPUs; it's held back by its drivers needing more CPU time to churn out a frame.  Remove the CPU time (by going to a higher resolution, Direct3D 12, or Vulkan) and AMD offers a lot more bang per dollar than NVIDIA does.


----------



## MuhammedAbdo (Jul 5, 2018)

FordGT90Concept said:


> Which burdens the GPU more? 4K or 1080p? 4K. AMD is not held back by its GPUs; it's held back by its drivers needing more CPU time to churn out a frame. Remove the CPU time (by going to a higher resolution, Direct3D 12, or Vulkan) and AMD offers a lot more bang per dollar than NVIDIA does.


That's crap; a customer buys a GPU as a whole package. He cares not for drivers or whatever; he cares about final performance. At 4K these cards offer no playable fps whatsoever; people buy them for 1080p alone. If you can't deliver suitable fps at that resolution, then you lose. Which is what happened: the GTX 1060 outsold the RX 580 several times over.


----------



## FordGT90Concept (Jul 5, 2018)

Look at the pictures you posted again.  Which of those games does the RX 580 run at unacceptable framerates?


----------



## Vya Domus (Jul 5, 2018)

Seeing people trying to deny the fact that the RX 580 overtook the 1060 is hilarious.


----------



## MuhammedAbdo (Jul 5, 2018)

Vya Domus said:


> Seeing people trying to deny the fact that the RX 580 overtook the 1060 is hilarious.


Yeah, in an alpha build. Enjoy your short-lived imaginary victory.


----------



## Vya Domus (Jul 5, 2018)

MuhammedAbdo said:


> Yeah, in an alpha build. Enjoy your short-lived imaginary victory.



You mean this imaginary alpha-build victory?


----------



## I No (Jul 5, 2018)

Vya Domus said:


> You mean this imaginary alpha-build victory?


The horror... a 1% difference between a reference card and an AIB one...


----------



## londiste (Jul 5, 2018)

With that TPU review picture, do you mean to say that equal cards are equal?


----------



## las (Jul 5, 2018)

Vya Domus said:


> Who says that's the case ? You ?



Yeah, I don't care about low- and mid-range GPUs.


----------



## Vya Domus (Jul 5, 2018)

I No said:


> The horror ....1% difference from a Reference card vs an AIB one.....



Well, I would have shown the 4K one, but apparently, in the virtual world he lives in, where anything outside 1080p does not exist, that's irrelevant.


----------



## I No (Jul 5, 2018)

Vya Domus said:


> Well, I would have shown the 4K one, but apparently, in the virtual world he lives in, where anything outside 1080p does not exist, that's irrelevant.


No one in their right mind would spend $250 and expect 4K out of that GPU...


----------



## Vya Domus (Jul 5, 2018)

I No said:


> No one in their right mind would spend $250 and expect 4K out of that GPU...



Which is relevant because of what? GPU grunt is GPU grunt, and the 580 has more of it.

Today's game at 4K is tomorrow's game at 1080p in terms of load; things evolve.


----------



## I No (Jul 5, 2018)

Vya Domus said:


> Which is relevant because of what? GPU grunt is GPU grunt, and the 580 has more of it.


Dude, if you have the money for a 4K panel, your GPU budget must be over $250. Otherwise it's just like buying a Mercedes with a VW Polo engine... Wtf... They can render those resolutions, but hell no, they can't play them... Seriously, that's a 1080p card, it will die as a 1080p card, and at that resolution it's on par with the competition. Give it a rest.


----------



## Vya Domus (Jul 5, 2018)

I No said:


> Dude, if you have the money for a 4K panel, your GPU budget must be over $250. Otherwise it's just like buying a Mercedes with a VW Polo engine... Wtf... They can render those resolutions, but hell no, they can't play them... Seriously, that's a 1080p card and it will die as a 1080p card



It's telling because it shows which card is more powerful in absolute terms. This BF5 alpha benchmark or whatever isn't relevant now, but it's going to become the norm once support for Pascal weakens.


----------



## I No (Jul 5, 2018)

Vya Domus said:


> It's telling because it shows which card is more powerful in absolute terms. This BF5 alpha benchmark or whatever isn't relevant now, but it's going to become the norm once support for Pascal weakens.


And what is the guarantee that AMD's support will stay the same?... Those charts and graphs are the way they are because some games favor AMD or nVidia by a long margin; that's what skews the results. The cards are the same; it just comes down to price... Get whichever is cheaper; they both perform the same overall. 1% isn't groundbreaking, and that might well be within the margin of error...


----------



## Vya Domus (Jul 5, 2018)

I No said:


> And what is the guarantee that AMD's support will stay the same?



That's the thing: it doesn't need to stay the same. Due to architectural design choices, GCN suffers less from a lack of optimizations. "Fine wine", or whatever fanboys called it, was wrongfully attributed to "improvements over time from drivers"; nothing actually improves. It's rather about the lack of improvement on Nvidia's side after a while.



I No said:


> 1% isn't ground breaking



No, it's not groundbreaking, but it's there, to the absolute horror of the few who scramble to find explanations for how their dear color is still superior.


----------



## MuhammedAbdo (Jul 5, 2018)

Vya Domus said:


> You mean this imaginary alpha-build victory?


No, I mean this:
A week ago, and across 27 games, the 1060 is 3% faster overall, despite the mystical delusions of FineWine and despite DX12 and Vulkan games.

Vya Domus said:


> whatever isn't relevant now but that's going to become the norm once support for Pascal weakens.


Pascal support weakens? LOL! That's the most pathetic justification I've heard in a while!


----------



## Vya Domus (Jul 5, 2018)

MuhammedAbdo said:


> Pascal support weakens? LOL! That's the most pathetic justification I've heard in a while!



Justification? Of what? What am I trying to justify, exactly?


----------



## dozenfury (Jul 5, 2018)

It seems odd to run comparison benchmarks on a closed alpha, or to give much credence to them.  Performance numbers for games this far before launch, and without launch drivers, aren't really indicative of much, imo.  And I would say that for either red or green numbers for any game this far before release.  About the only takeaway I'd have is that BF V seems to be coded to like more memory, and it seems to be working well with DX12.

Also, keep in mind that DX12 was beyond buggy in BF1, and most players, including me, went back to DX11 just to get the game to be stable.  It's an example of why numbers this early are kind of like taking a bite of half-cooked food and trying to decide how good it will be when it's finished.


----------



## Xaled (Jul 5, 2018)

This just proves once again that nVidia's leadership is fake.
No problem, nVidia will soon bribe DICE so that AMD cards run slower.


----------



## I No (Jul 5, 2018)

Xaled said:


> This just proves once again that nVidia's leadership is fake.
> No problem, nVidia will soon bribe DICE so that AMD cards run slower.



Oh you must have some inside information, please share with the class.


----------



## FordGT90Concept (Jul 5, 2018)

MuhammedAbdo said:


> No I mean this:
> A week ago and in 27 games, the 1060 is 3% faster overall. Despite the mystical delusions of FineWine and despite DX12 and Vulkan games.


2% of that is because of Frostpunk (which clearly has problems on AMD hardware).  When they remove it, the difference is 1%.  Then if we compare specific titles between HardwareUnboxed compared to TechPowerUp, like Hitman, TPU shows a 4% advantage RX 580 versus HardwareUnboxed's 1%.  How can that be?  They use the same processor with TPU at 4.8 GHz versus HU at 5.0 GHz.   Only answer I can come up with is the test GTX 1060 has higher clockspeeds.
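A quick back-of-the-envelope sketch of that point, in Python, with made-up placeholder deltas rather than the actual HU or TPU numbers: a single outlier title can swing a small per-game average by a lot.

```python
# Hypothetical per-game leads of the GTX 1060 over the RX 580, in percent.
# These are illustrative placeholders, NOT measured data; only the outsized
# Frostpunk entry mirrors the "one broken title" scenario discussed above.
per_game_delta = {
    "Game A": 1.0,
    "Game B": -2.0,
    "Game C": 0.5,
    "Frostpunk": 55.0,  # assumed outlier with driver issues on one vendor
}

def mean_delta(deltas):
    """Plain average of the per-game percentage deltas."""
    return sum(deltas.values()) / len(deltas)

with_outlier = mean_delta(per_game_delta)
without_outlier = mean_delta(
    {k: v for k, v in per_game_delta.items() if k != "Frostpunk"}
)
print(f"with outlier:    {with_outlier:+.1f}%")   # the outlier dominates
print(f"without outlier: {without_outlier:+.1f}%")
```

With these placeholders the headline average flips from +13.6% to roughly zero once the outlier is dropped, which is the same shape as the 3%-vs-1% argument above.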


----------



## Xaled (Jul 5, 2018)

I No said:


> Oh you must have some inside information, please share with the class.


You don't need an inside guy when games being developed with nVidia turn shitty, after they were amazing while being developed with AMD. For instance: Battlefield.


----------



## R0H1T (Jul 5, 2018)

We need more tiki torches, let's burn'em at the stake!


----------



## cucker tarlson (Jul 5, 2018)

FordGT90Concept said:


> 2% of that is because of Frostpunk (which clearly has problems on AMD hardware).  When they remove it, the difference is 1%.  Then if we compare specific titles between HardwareUnboxed compared to TechPowerUp, like Hitman, TPU shows a 4% advantage RX 580 versus HardwareUnboxed's 1%.  How can that be?  They use the same processor with TPU at 4.8 GHz versus HU at 5.0 GHz.   Only answer I can come up with is the test GTX 1060 has higher clockspeeds.


Frostpunk is a great new game, it should be included, and it's AMD's problem that they can't optimize for it. The 580 is a better card overall (extra +2GB VRAM), regardless of the 2% advantage of the 1060. Stop cherry-picking games, you children. As for the power draw discussion: it's a valid point that the 1060 has noticeably lower power draw, but as long as you're below 200W, AIB cooling solutions make this a moot point; they can deal with 190-200W and not even break a sweat. It starts to be a real issue when comparing a 200W card like the 1080 against a 300W card, because apart from extreme air coolers like the Lightning Z, ordinary AIB coolers start to become insufficient as you approach the +250W mark.


----------



## Krzych (Jul 5, 2018)

Closed Alpha... That's digging very deep for a cheap drama. Although that's understandable when there are no new products and no big releases, so there are no real topics to cover and get views from. There are no good leaks even, let alone products.


----------



## cucker tarlson (Jul 5, 2018)

Krzych said:


> Closed Alpha... That's digging very deep for a cheap drama. Although that's understandable when there are no new products and no big releases, so there are no real topics to cover and get views from. There are no good leaks even, let alone products.


I bet adoredtv is already preparing a video on this


----------



## HD64G (Jul 5, 2018)

To make things clear about DX12: its main purpose is to take the weight of the game engine off the CPU (as much as possible) and offload it to the GPU. So when the CPU is very powerful, there are fewer FPS gains to be had. It usually benefits AMD GPUs more because they have more raw power and are more easily bottlenecked by the CPU. Their drivers also used to have more CPU overhead, so that's another possible gain, at least for older games that aren't well optimized. So for a new DX11-based game to run much faster on AMD GPUs is a sign of poor optimization for nVidia, which, as I said before, is weird for a game sponsored (= partly paid for) by the green team.



FordGT90Concept said:


> 2% of that is because of Frostpunk (which clearly has problems on AMD hardware).  When they remove it, the difference is 1%.  Then if we compare specific titles between HardwareUnboxed compared to TechPowerUp, like Hitman, TPU shows a 4% advantage RX 580 versus HardwareUnboxed's 1%.  How can that be?  They use the same processor with TPU at 4.8 GHz versus HU at 5.0 GHz.   Only answer I can come up with is the test GTX 1060 has higher clockspeeds.


In fact, Steve from HU, who reviewed them, said that both cards were overclocked, if I remember correctly.


----------



## cucker tarlson (Jul 5, 2018)

HD64G said:


> So, for *a new game based on DX11 to have much faster performance on AMD GPUs*, it is a sign of not good optimisation for nVidia which is weird as I previously said for a game sponsored (=partly paid) by the green team.


Lol, except it doesn't, because the game isn't even in its beta stage atm.
Here's what's gonna happen: the game launches in October, nvidia wins by +5% in dx11, amd wins by +5% in dx12, and both the gtx 1060 and rx 580 run 1080p/60 no problem. Amd fanboys keep complaining about gameworks effects even though they're so demanding that the rx 580 gets 35 fps and the 1060 gets 40 fps, and neither 580 nor 1060 owners seriously think about enabling them, but adoredtv has new "material" to work with.



Xaled said:


> you dont need to have an inside guy when games that are being developed with nVidia become shitty after were amazing when were being developed with AMD. for instance: Battlefield.



Here's a soothing tune for you


----------



## Xaled (Jul 5, 2018)

cucker tarlson said:


> ..
> Here's what's gonna happen - the game launches in october, nvidia wins by +5% in dx11, amd wins by +5% in dx12, both gtx 1060 and rx580 run 1080p/60 no problems. ..


it's 2018/H2 and nVidia fans are still thankful for getting 1080p/60 from a $300-400 card!!..


----------



## cucker tarlson (Jul 5, 2018)

Xaled said:


> it's 2018/H2 and nVidia fans are still thankful for getting 1080p/60 from a $300-400 card!!..


Well I'm glad AMD fans cherish the fact that they at least have Vega for that.


----------



## Divide Overflow (Jul 5, 2018)

Coming soon:  Battlefield V now with nVidia Hairworks mandatory implementation!


----------



## looniam (Jul 5, 2018)

cucker tarlson said:


> I bet adoredtv is already preparing a video on this


i think jim is still editing 18 hours of "content" about some NDA stuff.


----------



## FordGT90Concept (Jul 5, 2018)

cucker tarlson said:


> Frostpunk is a great new game, it should be included, and it's AMD's problem that they can't optimize for it.


Just because a game is "great" doesn't mean it is a good candidate for hardware benchmarking.  For benchmarking, it has to have a reliable, repeatable render scene so test runs are consistent.  If there are also serious performance issues, the benchmarker needs to contact the hardware manufacturer to verify that the numbers they see are correct.  NVIDIA put out a Game Ready driver for it back in April; prior to that, performance was awful.  AMD hasn't yet, and without contacting AMD for information on their plans for optimizing the game, it shouldn't be used for benchmarks (HU made no statement that they did).


----------



## cucker tarlson (Jul 5, 2018)

All I can say is you're 100% wrong. Whatever is being released and played needs to be included, especially a game like this

https://www.google.com/search?q=frostpunk+review&ie=utf-8&oe=utf-8&client=firefox-b-ab


You're complaining about Frostpunk in a thread where the closed alpha performance of a game that launches in 4 months is suddenly important.


----------



## FordGT90Concept (Jul 5, 2018)

You're forgetting the type of game it is: strategy.  30 fps is typically enough in those kinds of games so the performance issues aren't hugely reflected in reviews.


----------



## cucker tarlson (Jul 5, 2018)

FordGT90Concept said:


> You're forgetting the type of game it is: strategy.  30 fps is typically enough in those kinds of games so the performance issues aren't hugely reflected in reviews.


I didn't play it and neither did you; neither of us knows what performance is needed, except you think you know it's 30 fps.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> I didn't play it and neither did you, neither of us know what performance is needed except you think you know it's 30 fps.


And I have played it, maxed out at 1440p. I'm getting 75 to 85 FPS in the opening scenario. The only oddity is it absolutely eating all 8GB of VRAM.


----------



## cucker tarlson (Jul 5, 2018)

INSTG8R said:


> And I have played it, maxed out at 1440p. I'm getting 75 to 85 FPS in the opening scenario. The only oddity is it absolutely eating all 8GB of VRAM.


Is it in line with other benches ? Do other scenarios play worse ? Seems very good for 1440p. Early benches have Vega with absolutely miserable performance, just like in the last video of 1060 vs 580.

btw how are you liking the game ? I was going to get it on launch but I'm completely occupied with division and battlefront 2.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> Is it in line with other benches ? Do other scenarios play worse ? Seems very good for 1440p.
> 
> btw how are you liking the game ? I was going to get it on launch but I'm completely occupied with division and battlefront 2.


I just grabbed it because a mate put some money in my Steam for my birthday last week. I’m really just at that opening scenario and confused. It’s pretty and runs well is my only real opinion right now. 
I’ve no reference points against any published benches just my own experience and the FPS counter.


----------



## MuhammedAbdo (Jul 5, 2018)

FordGT90Concept said:


> AMD hasn't yet and without contacting AMD for information on their plans for optimizing the game, it shouldn't be used for benchmarks (HU made no statement that they did).


If only you would apply that same logic to BFV alpha!


----------



## FordGT90Concept (Jul 5, 2018)

AMD has issues too:


----------



## cucker tarlson (Jul 5, 2018)

FordGT90Concept said:


> AMD has issues too:


Lol. Are the blind amd fanboys in this thread going to open their eyes now and see that you can't judge performance by an alpha? This is absolutely unplayable on amd.

btw I follow a few gaming youtube channels and I've seen people getting 100+ fps @1440 ultra on their 1080 Tis, so I knew the 30-32 fps on the 1060 must have been fake.


I suggest two things: either close this thread, or rename it "BF5 unplayable on AMD hardware, fine on nvidia".


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> Lol. Are the blind amd fanboys in this thread going to open their eyes and see you can't judge performance by alpha now ? This is absolutely unplayable on amd.
> 
> btw I follow a few gaming youtube channels and I've seen people getting +100 fps @1440 ultra on their 1080Ti's, I knew the 30-32 fps on 1060 must have been fake.
> 
> ...


Do you really think the underlying engine is going to be any different? It's Frostbite; it's not going to change. Are Nvidia fanboys gonna open their eyes and just accept that an AMD product runs it better? I'm sorry your precious crown has been tilted for once, but please stop trying to downplay an AMD "win" at any cost. Frostbite was developed pretty closely with AMD, so I'm not sure why it's really a big surprise it runs well on their products. It's just funny watching the Green Team make every effort to downplay it, despite the fact that as a whole the difference between the two cards is minimal but just happens to favour AMD slightly.


----------



## cucker tarlson (Jul 5, 2018)

INSTG8R said:


> Do you really think the underlying engine is going to be any different? It's Frostbite; it's not going to change. Are Nvidia fanboys gonna open their eyes and just accept that an AMD product runs it better? I'm sorry your precious crown has been tilted for once, but please stop trying to downplay an AMD "win" at any cost. Frostbite was developed pretty closely with AMD, so I'm not sure why it's really a big surprise it runs well on their products. It's just funny watching the Green Team make every effort to downplay it, despite the fact that as a whole the difference between the two cards is minimal but just happens to favour AMD slightly.



lol did you watch the video ?
and did you see my post saying that it's going to be the same thing as bf1, +5% for nvidia on dx11, +5% for amd on dx12.

apparently you did neither of them.


----------



## Xaled (Jul 5, 2018)

cucker tarlson said:


> I suggest two things: either close this thread, or rename it "BF5 unplayable on AMD hardware, fine on nvidia".



Or you can either unfollow this thread or logout from tpu forums


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> lol did you watch the video ?
> and did you see my post saying that it's going to be the same thing as bf1, +5% for nvidia on dx11, +5% for amd on dx12.
> 
> apparently you did neither of them.


I’m just parroting your “fanboy” BS back at you, funny how that works...Like your meaninglessness drivel is any more important than mine...


----------



## cucker tarlson (Jul 5, 2018)

I'm not the one spreading fake alpha benchmarks and making a meal of them. Don't drag me down to Xaled's level of stupidity.

Pcgh uploaded the video, hardwareluxx posted their results too.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> I'm not the one spreading fake alpha benchmarks and making a meal of them. Don't drag me down to this level of stupidity.
> 
> Pcgh uploaded the video, hardwareluxx posted their results too.
> 
> View attachment 103555


Then cut the fanboy BS and you won’t look so ignorant...


----------



## cucker tarlson (Jul 5, 2018)

I say we leave this thread altogether. It's been meaningless since the beginning, as we're talking about an alpha. Yet it was fun to see an amd-biased source get crushed by reputable sites. They're the reason for this whole fuss, since they think they can poop out any result they want and fanboys will believe them.


----------



## 8tyone (Jul 5, 2018)

nvidia should step in and cripple the game so that rx580 gets the same fps as that of the gtx1060.


----------



## Fluffmeister (Jul 5, 2018)

FordGT90Concept said:


> AMD has issues too:



Wow, that is horrible on the RX 580. I guess looking at figures on graphs doesn't paint the full picture after all.

If that is the victory some people want, let them have it... I can see why it's a closed alpha!



8tyone said:


> nvidia should step in and cripple the game so that rx580 gets the same fps as that of the gtx1060.



Looking at the vid the 580 gets a higher FPS, enjoy!


----------



## Tatty_One (Jul 5, 2018)

Xaled said:


> it's 2018/H2 and nVidia fans are still thankful for getting 1080p/60 from a $300-400 card!!..


I can't speak for your country but in mine the Vega 56 is a fair bit pricier than the GTX 1070.


----------



## INSTG8R (Jul 5, 2018)

Tatty_One said:


> I can't speak for your country but in mine the Vega 56 is a fair bit pricier than the GTX 1070.


Well, I just purchased the hands-down best Vega out there (Nitro+) for 7500kr; a 1080 Ti from any big name is 9000kr+. So I think I'm getting the right performance for what I paid.
Edit: checked current pricing


----------



## cucker tarlson (Jul 5, 2018)

INSTG8R said:


> Well I just purchased the hands down best Vega out there(Nitro+) for 7500kr a 1080Ti from any big name is 9000kr+ So I think I’m getting the right performance for what I paid for.
> Edit: checked current pricing


How much is a decent 1080 ? Not talking most expensive aib, talking a decent one. 
btw Sapphire are amazing, wish they made coolers for nvidia too. my 290 trixx was whisper quiet despite the power draw.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> How much is a decent 1080 ? Not talking most expensive aib, talking a decent one.


6000kr average.


----------



## cucker tarlson (Jul 5, 2018)

INSTG8R said:


> 6000kr average.


Then it's a pretty bad deal, paying 25% more for the V64 with the same performance, same amount of VRAM and much higher power consumption.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> Then it's a pretty bad deal, paying 25% more for the V64 with the same performance, same amount of VRAM and much higher power consumption.


Sapphire and my FreeSync monitor dictate my purchasing decisions. I had a Fury, so my only AMD upgrade path is Vega. I don't use more than 240W; I don't consider that "huge". Idle is most impressive: 10-11W and 26MHz. I'm not disappointed.


----------



## cucker tarlson (Jul 5, 2018)

Yup, it's a good deal with freesync. I heard this monitor only goes to like 90hz in freesync mode tho. Dunno if I'm being 100% correct here.


----------



## INSTG8R (Jul 5, 2018)

cucker tarlson said:


> Yup, it's a good deal with freesync. I heard this monitor only goes to like 90hz in freesync mode tho. Dunno if I'm being 100% correct here.


It does and I’m fine with that. I run most stuff at 90 but it does 144hz so the option for high refresh gaming is an option when permitted. 
Also a Strix Vega is 6500. So I’m happy paying the Sapohire premium for a better version


----------



## cucker tarlson (Jul 5, 2018)

Yeah, I like Sapphire. And their support is good. Had two 290s die; both got replaced with a new one quickly.


----------



## MuhammedAbdo (Jul 5, 2018)

Fluffmeister said:


> Wow that is horrible on the RX 580, I guess looking at figures on graphs doesn't paint the full picture after all.
> 
> If that is the victory some people want, let them have it... i can see why it's a closed alpha!


This is a mess of stutters, totally unplayable on the RX 580.


----------



## 5150Joker (Jul 5, 2018)

Great, AMD's low-end cards do a bit better in one source's early alpha benchmarks. What a win! Must be a really slow news day to have posted this garbage.


----------



## efikkan (Jul 5, 2018)

Looks like alpha quality software to me…


----------



## Th3pwn3r (Jul 6, 2018)

Vya Domus said:


> Seeing people trying to deny the fact that RX 580 overtook the 1060 is hilarious.



I wish I thought it was funny. I basically don't care, because neither of those cards is at a performance level I would install in one of my machines. I'm only here for entertainment, and got baited by the title of course...


----------



## MuhammedAbdo (Jul 6, 2018)

NVIDIA's performance seems fine to me here
https://www.sweclockers.com/test/25...-v-closed-alpha-sju-grafikkort-i-snoig-batalj


----------



## cucker tarlson (Jul 6, 2018)

MuhammedAbdo said:


> NVIDIA's performance seems fine to me here
> https://www.sweclockers.com/test/25...-v-closed-alpha-sju-grafikkort-i-snoig-batalj


but 580 beats 1060 at 4K......


----------



## yeeeeman (Jul 6, 2018)

If the game has some extra stuff only for geforce cards, then this difference in performance can be explained.


----------



## cucker tarlson (Jul 6, 2018)

yeeeeman said:


> If the game has some extra stuff only for geforce cards, then this difference in performance can be explained.


what difference in performance ?


----------



## MuhammedAbdo (Jul 6, 2018)

cucker tarlson said:


> but 580 beats 1060 at 4K......


By a very small margin that is within the statistical variance between the two cards, and not the same as the outrageous previous result.
I wonder if TPU will modify its news piece to include this benchmark now...


----------



## Vya Domus (Jul 6, 2018)

Why would they change it when hardly anything actually changed? Statistical variance here comes from the variable nature of the game. The ranking, as expected, is still the same. Funny, it's 4K as well, how about that. The resolution that's "irrelevant".

Do you people have nothing better to do than endlessly spam this thread with charts and wishful thinking?


----------



## cucker tarlson (Jul 6, 2018)

Vya Domus said:


> Why would they change it when hardly anything actually changed? Statistical variance here comes from the variable nature of the game. The ranking, as expected, is still the same.
> 
> Do you people have nothing better to do than endlessly spam this thread with charts ?


What the hell do you think you people are doing, spamming this fake news thread with facts from reputable sites like pcgh, hwluxx or sweclockers?

lol, I was joking about the 4K performance, neither of them can run 4K.


----------



## MuhammedAbdo (Jul 6, 2018)

Vya Domus said:


> Why would they change it when hardly anything actually changed. Satistical variance here is signified by the variable nature of the game.


So now statistical variance is important? where was that when AMD was ahead according to one site?



Vya Domus said:


> it's 4K as well , how about that. The resolution that's "irrelevant".


Want some 1080p numbers? Here you go, just don't feel sad because they are now dead even!


----------



## Vya Domus (Jul 6, 2018)

MuhammedAbdo said:


> So now statistical variance is important? where was that when AMD was ahead according to one site?
> 
> 
> Want some 1080p numbers? Here you go, just don't feel sad because they are now dead even!
> ...



*The ranking is still the same.* The game has no built-in benchmark, therefore results are highly variable when testing. However, if there is a faster card, on average that card should eventually emerge on top. And it does.

You can spam us all you want with your charts; it paints the same picture in this context, and I am astonished that you can't see that. It's like you have shutters on.
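A toy simulation of that argument (assumed numbers, not measurements): give two cards true averages about 2% apart, add run-to-run noise because there is no built-in benchmark, and the genuinely faster card only reliably emerges once many runs are averaged.

```python
import random

random.seed(42)  # deterministic for illustration

TRUE_FPS = {"card_a": 61.2, "card_b": 60.0}  # assumed true means, ~2% apart
NOISE = 3.0  # +/- fps of run-to-run variation in a manual benchmark pass

def run_benchmark(card):
    """One noisy manual benchmark run."""
    return TRUE_FPS[card] + random.uniform(-NOISE, NOISE)

def average_over(card, runs):
    """Average FPS over many repeated runs."""
    return sum(run_benchmark(card) for _ in range(runs)) / runs

single_a, single_b = run_benchmark("card_a"), run_benchmark("card_b")
many_a, many_b = average_over("card_a", 1000), average_over("card_b", 1000)

# A single run can rank the cards either way; the long average cannot.
print("single run says card_a is faster:", single_a > single_b)
print("1000-run averages:", round(many_a, 1), round(many_b, 1))
```

The noise level and FPS figures are placeholders; the only point is that a one-run chart from any single site can order two closely matched cards either way.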


----------



## MuhammedAbdo (Jul 6, 2018)

Vya Domus said:


> *The ranking is still the same.* The game has no built-in benchmark, therefore results are highly variable when testing. However, if there is a faster card, on average that card should eventually emerge on top. And it does.


I say you've had enough hair-splitting and straw debating for today.


----------



## Vya Domus (Jul 6, 2018)

MuhammedAbdo said:


> I say you've had enough hair-splitting and straw debating for today.


Basically you can't come up with anything to prove me wrong, huh? It's fine, buddy, don't worry.


----------



## cucker tarlson (Jul 6, 2018)

Well, can you come up with anything to prove yourself right? Unless by being right you mean grasping at straws and bending whatever puny evidence you have to your liking. I said this thread should be over after the pcgh video, and I was right.


----------



## JoniISkandar (Jul 6, 2018)

prnsforum said:


> good news is only 1 game and still alpha



1 game, an Nvidia "The Way It's Meant to Be Played" sponsored title.


----------



## 5150Joker (Jul 7, 2018)

JoniISkandar said:


> 1 game, an Nvidia "The Way It's Meant to Be Played" sponsored title.



Just because it's an NVIDIA-partnered title doesn't mean Frostbite is all of a sudden slanted towards NVIDIA GPUs. Frostbite has always been a balanced engine that gives good performance for both red and green.


----------



## MuhammedAbdo (Jul 7, 2018)

Wait for it: another game where the 1060 is 40% faster than the RX 580 @4K! Will the media write about this now?



https://www.overclock3d.net/reviews/software/crash_bandicoot_n_sane_trilogy_pc_performance_review/8


----------



## Vya Domus (Jul 7, 2018)

MuhammedAbdo said:


> Will the media write about this now?



Dude, you seriously need to find something better to do with your time than scrounging the net for benchmarks to post on here. Let this thread that's full of nonsense die already.


----------



## cucker tarlson (Jul 7, 2018)

It's a good game, but I don't think anyone is interested in benching non-AAA titles. That's why they're making a big deal out of the BF5 alpha while smaller games get no coverage. No one seems to care that amd is getting absolutely destroyed in Frostpunk, even though it's a fantastic game. One chart about the bf5 alpha and the internet is on fire. It's a shame, though, that people took the bait, because those numbers were absolutely fake.


----------



## efikkan (Jul 7, 2018)

Finally, the evidence we need to prove AMD is better in Direct3D 12. Oh wait, the numbers don't show that…

This type of news is what happens when people go looking for evidence of a predetermined conclusion, sometimes forgetting to verify the source.


----------



## Tom_ (Jul 7, 2018)

That is embarrassing for NVIDIA.


----------



## cucker tarlson (Jul 8, 2018)

You know what I find funny? When the game comes out and nvidia matches/beats amd, the people who took the bait on this fake benchmark will say that nvidia broke performance on amd with gameworks.
I'm still seeing new headlines quoting this article today, and morons who don't read the thread are just going by the title and OP.


----------



## INSTG8R (Jul 8, 2018)

cucker tarlson said:


> You know what I find funny? When the game comes out and nvidia matches/beats amd, people who took the bait with this fake benchmark will say that nvidia broke performance on amd with gameworks.
> I'm still seeing new headlines quoting this article today and morons who don't read into the thread and are just going by the title and op.


You know what I find funny? The Green team digging up every benchmark they can to try to reassert their "dominance". Can't let AMD get ahead, even if it is just 2 FPS...


----------



## cucker tarlson (Jul 8, 2018)

Read this thread again,from the OP to the last post of mine.


----------



## HD64G (Jul 8, 2018)

Whoever doubts the DX12 and Vulkan dominance of GCN over Pascal should have a look here (recent tests on the latest drivers):

http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,12.html (BF1@DX12: 580 gets 106FPS vs 1060's 82FPS @1080P)
http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,10.html (SF2@DX12: 94 vs 85)
http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,11.html (ROTR@DX12: 81 vs 78)
http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,13.html (DE MD@DX12: 71 vs 63)
http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,14.html (W2C@Vulkan: 86 vs 66)

So, BFV being on the same engine as BF1, it SHOULD play much better on the 580 vs the 1060. If it goes on sale and results show otherwise, it will be proof that the collaboration with nVidia caused the difference. For anyone willing to debate this: only arguments and facts on topic, please, or else ignoring you will be my only answer.
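Turning the guru3d pairs listed above into percentage leads (just arithmetic on the quoted FPS numbers, nothing new assumed) makes the size of each gap explicit:

```python
# (RX 580 fps, GTX 1060 fps) as listed in the guru3d pages above.
results = {
    "BF1 DX12":   (106, 82),
    "SF2 DX12":   (94, 85),
    "ROTR DX12":  (81, 78),
    "DE:MD DX12": (71, 63),
    "W2C Vulkan": (86, 66),
}

# Percentage lead of the RX 580 over the GTX 1060 in each title.
leads = {
    game: round((rx - gtx) / gtx * 100, 1)
    for game, (rx, gtx) in results.items()
}

for game, lead in leads.items():
    print(f"{game}: RX 580 ahead by {lead}%")
```

The leads range from roughly 4% (ROTR) to roughly 30% (BF1 and the Vulkan title), so the per-game spread matters as much as the average.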


----------



## cucker tarlson (Jul 8, 2018)

lol at "much better". The 580 is like 5% faster than the 1060 in BF1: 78 fps for the 580 in dx12 vs 73 fps for the 1060 in dx11.

https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,8
https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,7

but I guess it seems like a huge difference looking through your red glasses.


yes, *very generally* speaking, gcn is better in vulkan and dx12, but the difference usually comes down to a few percent. The 1080 still manages to match the V64 in BF1, and that's a stock 1080 FE (10gbps) vs an AIB vega. An AIB 1080 with 11gbps memory would pull ahead.


https://www.purepc.pl/karty_graficz...x_vega_64_strix_gaming_oc_red_is_bad?page=0,7
https://www.purepc.pl/karty_graficz...x_vega_64_strix_gaming_oc_red_is_bad?page=0,6

btw this thread was never about gcn vs pascal in dx12, why did you make it about that? Discussing 580 vs 1060 performance is splitting hairs. Whatever you can play on a 580, you can play at the same level of smoothness and visual quality on a 1060, and vice versa. The 1060 does it with a smaller chip and less powerful hardware, cause on paper the rx580 should be noticeably faster with its bigger die, 14nm process and 8 gigs of 256-bit memory.


----------



## INSTG8R (Jul 8, 2018)

cucker tarlson said:


> lol at "much better". 580 is like 5% faster than 1060 in BF1. 78 fps for 580 in dx12, 73 fps for 1060 in dx11.
> 
> https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,8
> https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,7
> ...


Yet here you are, just proving my point again... Can't let Red get one up on you at any cost. Just stop with the self-righteousness when you just continually prove my point...


----------



## efikkan (Jul 8, 2018)

Yeah, every game which doesn't favor AMD is obviously proof of foul play by Nvidia.

There will always be some games where one architecture performs slightly better or worse; sometimes it comes down to small details in the shader design, and it doesn't mean one hardware architecture is better than the other. This is why good reviews rely on a wide selection of representative games, unlike some people who dig up the edge cases to push their own agenda.

There is nothing in either Direct3D 12 or Vulkan which inherently benefits GCN, and the games we've seen so far don't even use Direct3D 12 natively without an abstraction layer, so judging architectures based on edge cases is just ridiculous.


----------



## MuhammedAbdo (Jul 8, 2018)

HD64G said:


> http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,12.html (BF1@DX12: 580 gets 106FPS vs 1060's 82FPS @1080P)
> http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,10.html (SF2@DX12: 94 vs 85)
> http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,11.html (ROTR@DX12: 81 vs 78)
> http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,13.html (DE MD@DX12: 71 vs 63)
> http://www.guru3d.com/articles_pages/geforce_gtx_1050_3gb_review,14.html (W2C@Vulkan: 86 vs 66)


All of these games have awful DX12 implementations; DX11 runs better than DX12 on all hardware.

You want a better comparison? Factor in all of the DX12 games: Forza 6, Forza Horizon 3, Halo Wars 2, Gears 4, Gears Ultimate, Civ 6, etc. The GTX 1060 is on top in all of them.


----------



## Vya Domus (Jul 8, 2018)

INSTG8R said:


> Yet here you are, just proving my point again... Can't let Red get one up on you at any cost. Just stop with the self-righteousness when you just continually prove my point...



I always wondered why these people are so annoyed that they just can't let go. We have 6 pages of comments calling the news story fake and irrelevant, but paradoxically they find themselves coming back endlessly with the puniest justifications.

I typically try to stay away from maymays on TPU but I can't contain myself this time.


----------



## HD64G (Jul 8, 2018)

cucker tarlson said:


> lol at "much better". 580 is like 5% faster than 1060 in BF1. 78 fps for 580 in dx12, 73 fps for 1060 in dx11.
> 
> https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,8
> https://www.purepc.pl/karty_graficz...geforce_gtx_1060_9_gbps_msi_gaming_x?page=0,7
> ...



The Frostbite engine works better for AMD in general (as you said) and helps their arch even more when DX12 is activated (which lets slower CPUs not lose many FPS, since part of the CPU and RAM usage is loaded onto the GPU and VRAM). As it SHOULD be in BFV, which IS THE TOPIC OF THIS THREAD. So my post is very much on topic, while your answer isn't at all. After all, not all gamers own top CPUs; DX12 and Vulkan are made mainly for the ones who don't, and work much better for them.


----------



## Slavic_Un (Jul 8, 2018)

Spoiler: :)


----------



## FordGT90Concept (Jul 8, 2018)

efikkan said:


> Yeah, every game which doesn't favor AMD is obviously proof of foul play by Nvidia.


The vast majority of games out there are created on NVIDIA hardware.  The fact AMD manages to get a victory at all in price/performance is significant.



MuhammedAbdo said:


> All of these games have awful DX12 implementation, DX11 runs better than DX12 on all hardware.







Worst case scenario for D3D12 is that it uses D3D11 calls, so even though it's technically D3D12, it performs like D3D11.  A lot of early D3D12 games did this--especially games that released as D3D11 and were updated to support D3D12 (The Division comes to mind).  Software that's developed for D3D12 from the ground up will see a significant improvement in performance and reduced CPU load across GCN-based cards and Pascal-based cards (because of async compute).  Maxwell and Kepler cards see a minor improvement (mostly because of reduced CPU load).
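To make the async compute part concrete, here's a toy timing model (purely illustrative; the function and all numbers are made up, this is not real D3D12 code):

```python
# Toy frame-time model. Without async compute, graphics and compute
# passes execute back-to-back on one queue; with it, independent
# compute work overlaps the graphics pass. Illustrative numbers only.

def frame_time_ms(graphics_ms: float, compute_ms: float, async_compute: bool) -> float:
    if async_compute:
        # Compute queue runs concurrently with the graphics queue,
        # so the frame is bound by the longer of the two passes.
        return max(graphics_ms, compute_ms)
    # Serial execution: the passes simply add up.
    return graphics_ms + compute_ms

print(frame_time_ms(12.0, 4.0, async_compute=False))  # 16.0 ms per frame
print(frame_time_ms(12.0, 4.0, async_compute=True))   # 12.0 ms per frame
```

Which is why architectures that can actually overlap the queues see the bigger win.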



MuhammedAbdo said:


> You want a better comparison? factor in all of the DX12 games, games like Forza 6, Forza Horizon 3, Halo Wars 2, Gears 4, Gears Ultimate, Civ 6 .. etc. GTX 1060 is on top in all of them.


The aggregate is that the RX 580 is a little bit faster, with 2 GiB extra VRAM.  Both cards will drive 1920x1080 at 60 fps fine.


----------



## cucker tarlson (Jul 8, 2018)

You can't get any point past a person's mind when all they wanna hear is confirmation of what they believe. We come back to disprove the story? Sure, cause the pcgh video left so much unexplained....



HD64G said:


> The Frostbite engine works better for AMD in general (as you said) and helps their arch even more when DX12 is activated (which allows slower CPUs to lose fewer FPS, since part of the CPU and RAM load is moved to the GPU and VRAM). As SHOULD be the case in BFV, which IS THE TOPIC OF THIS THREAD. So my post is very much on topic while your answer isn't at all. After all, not all gamers own top CPUs. DX12 and Vulkan are made mainly for the ones who don't, and work much better for them.



Again, only theoretically. Look at Hitman, the game that was dx12 from the ground up. dx12 does nothing for a much faster rx480 paired with a slower i3/i5 CPU; it's only after they use an overclocked 4690K that the faster card pulls away.
https://www.purepc.pl/karty_graficz...e_gtx_970_test_na_kilku_procesorach?page=0,10
In RotR, even in dx12 mode with a 4790K, the rx480 cannot perform as well as it should.
https://www.purepc.pl/karty_graficz...e_gtx_970_test_na_kilku_procesorach?page=0,13
The more I look at the results, the more I think there has to be some sort of flaw in the design or driver support for gcn cards that dx12 can theoretically help with, but doesn't always, cause it's not able to. They can look great tested on i7's and i9's, but the story about dx12 being able to carry those compute monsters on slow cpus is not true in many cases.


----------



## INSTG8R (Jul 8, 2018)

Vya Domus said:


> I always wondered why these people are so annoyed that they just can't let go. We have 6 pages of comments calling out that the news story is fake and irrelevant, but paradoxically they find themselves coming back endlessly with the puniest justifications.
> 
> I typically try to stay away from maymays on TPU but I can't contain myself this time.


Pretty much bang on mate! Usual suspects that just won’t give up. Almost Whataboutism...


----------



## John Naylor (Jul 8, 2018)

Why is it that when there's an agenda to promote, the published sites that show the exact opposite are not listed?

https://www.sweclockers.com/test/25...-v-closed-alpha-sju-grafikkort-i-snoig-batalj


----------



## INSTG8R (Jul 8, 2018)

John Naylor said:


> Why is it that when there's an agenda to promote, the published sites that show the exact opposite are not listed ?
> 
> https://www.sweclockers.com/test/25...-v-closed-alpha-sju-grafikkort-i-snoig-batalj


Not like I’m on board with this game but I’d get 72fps @1440 so I have nothing to complain about. Oh and I’d have a higher minimum FPS than a 1080Ti.


----------



## MuhammedAbdo (Jul 8, 2018)

INSTG8R said:


> Not like I’m on board with this game but I’d get 72fps @1440 so I have nothing to complain about. Oh and I’d have a higher minimum FPS than a 1080Ti.


Did you test the same area as Sweclockers, or the same sequence? Or is it the AMD mentality to brag about numbers pulled out of thin air every time benchmarks are posted?


----------



## cucker tarlson (Jul 8, 2018)

INSTG8R said:


> Not like I’m on board with this game but I’d get 72fps @1440 so I have nothing to complain about. Oh and I’d have a higher minimum FPS than a 1080Ti.


Complains about people pulling various charts and splitting hairs about nvidia's performance, then brags about how his amd card has better min. fps in alpha charts.


good grief....


----------



## Vya Domus (Jul 8, 2018)

Also, something that bugs me: "Alpha".

The game is a mere 2-3 months from release, built on the same engine as the previous iteration with no obvious big changes. Whoever seriously believes the end product is going to come with major performance shifts from either of the two vendors is deluding himself.



MuhammedAbdo said:


> or is it the AMD mentality to brag about numbers pulled out of thin air every time benchmarks are posted?



Is it the Nvidia mentality to give a damn?

I don't get you. If you feel like you have provided us with irrefutable proof of whatever the hell it is that you are trying to prove, why are you still bothered by that? You are borderline trolling, to say the least.


----------



## cucker tarlson (Jul 8, 2018)

Vya Domus said:


> Also, something that bugs me: "Alpha".
> 
> The game is a mere 2-3 months from release, built on the same engine as the previous iteration with no obvious big changes. Whoever seriously believes the end product is going to come with major performance shifts from either of the two vendors is deluding himself.


Took you long enough to realize that.

the game looks amazing,even in alpha. Will probably be my first BF game


----------



## INSTG8R (Jul 8, 2018)

cucker tarlson said:


> complains about people pulling various charts and splitting hairs about nvidia's performance,brags about how his amd card has better min. fps in alpha charts.
> 
> 
> good grief....


Well why don’t you just post another link to another green victory for me....you know, keep up your fight against those 2 FPS...It really seems to bother you


----------



## MuhammedAbdo (Jul 8, 2018)

Vya Domus said:


> The game is a mere 2-3 months from release , built on the same engine as the previous iteration with no obvious big changes. Whoever seriously believes the end product is going to come with major shifts in terms of performance from any of the two vendors is deluding himself.


I am sorry if you have the memory of a goldfish, but NVIDIA always hits its performance targets, even long after a game is released; examples: Forza 7, Destiny 2, and Hitman DX12. So don't worry, pretty soon NVIDIA will be ahead, come final release. And NO, there are big changes in the engine now: more particles and more destruction. Optimizations will be made for both, and wait, there is more: NVIDIA will introduce some of those optional GameWorks things you are always so butthurt about. Probably some PCSS+ and HFTS shadows like Star Wars Battlefront II, and some Ansel is also likely.


----------



## efikkan (Jul 8, 2018)

FordGT90Concept said:


> The vast majority of games out there are created on NVIDIA hardware.  The fact AMD manages to get a victory at all in price/performance is significant.


The majority of top PC games are console ports, and console sales still make up much of the revenue for many of these developers, which is why there are more AMD partner games than Nvidia partner games in this segment.



FordGT90Concept said:


> Worst case scenario for D3D12 is that it uses D3D11 calls so even though it's technically D3D12, it performs like D3D11.  A lot early D3D12 games did this--especially games that released as D3D11 and were updated to support D3D12 (The Division comes to mind).  Software that's developed for D3D12 from the ground up will see a significant improvement in performance and reduced CPU load across GCN-based cards and Pascal-based (because of async compute) cards.  Maxwell and Kepler cards see a minor improvement (mostly because of reduced CPU load).


At this point games should be developed for Direct3D 12 or Vulkan exclusively; there is no point in supporting pre-Fermi and pre-GCN for new top titles, and pre-GCN cards dropped out of driver support a while ago anyway. Windows 7/8 support is probably the only reason to keep legacy support, but if by doing so you have to design a bad engine, then you should only support the new API.

It's pointless to simulate the old API through an abstraction layer. Sometimes these abstractions can perform worse than the old API, because the new APIs are designed around a different approach. The point of the new APIs was to leverage lower-level control over the hardware, and re-adding wrappers to abstract that away defeats the whole purpose. Utilizing these APIs properly requires entirely new engines built from the ground up to leverage this new level of control, and since this is painstakingly hard, we probably won't see widespread proper adoption of these APIs anytime soon. Most developers will continue to use abstractions to deal with the new APIs, and may never move past them. Games are unfortunately mass-produced trash these days, not only in terms of recycling the same concepts, but also in terms of code. They are usually stitched together right before the shipping date, and then they move on to the next title.
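The wrapper point can be sketched with a toy CPU-cost model (the function and all overhead numbers below are invented for illustration, not measured):

```python
# Toy model of per-frame CPU cost. An old-style API (or an old-style
# wrapper sitting on top of a new API) pays driver validation on every
# draw call, every frame; a properly used low-level API validates once
# at command-list record time and replays cheaply. Made-up overheads.

VALIDATE_US = 5.0   # hypothetical per-call cost when validating every frame
REPLAY_US = 0.25    # hypothetical per-call cost when replaying a recorded list

def cpu_cost_us(draw_calls: int, prerecorded: bool) -> float:
    per_call = REPLAY_US if prerecorded else VALIDATE_US
    return draw_calls * per_call

# Same draw-call count, very different CPU bills:
assert cpu_cost_us(10_000, prerecorded=False) == 50_000.0  # 50 ms: CPU-bound
assert cpu_cost_us(10_000, prerecorded=True) == 2_500.0    # 2.5 ms
```

Wrap the new API in an old-style layer and you are back on the expensive path, which is the whole problem.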


----------



## INSTG8R (Jul 9, 2018)

MuhammedAbdo said:


> Did you test the same area as Swesclocker or the same sequence??  or is it the AMD mentality to brag about numbers pulled out of thin air every time benchmarks are posted?


Dude, seriously? You’re the one spamming link after link trying to discredit results and make sure your precious Green Team is winning. I just looked at one set of benches and that was my takeaway. So if anyone is attempting to pull results out of thin air here, it’s been you... Oh, and guess who came out on top in that bench between the 1060 and 580? Or do you need to post a dozen more links to try to discredit those results too?
Give it up, man, you’re really trolling at this point and it’s sad. Your NVIDIA mentality is rather pathetic.


----------



## FordGT90Concept (Jul 9, 2018)

efikkan said:


> The majority of top PC games are console ports, and the console sales still makes up much of the sales for many of these developers, which is why there are more AMD partner games than there are Nvidia partner games in this segment.


Those console ports are still developed on NVIDIA hardware.  As a result, they are well optimized for Windows from the start.  They have to optimize for the target consoles in order to get qualified.  AMD desktop cards rarely get an optimization pass unless there are major problems.  There are exceptions, like Deus Ex: Mankind Divided and Hitman, which were developed in collaboration with AMD.

Bear in mind that even though consoles tend to sell more copies of a game, publishers make more money per sale on PC because profit margins are much better (no qualifying, distributors take a smaller cut, no need to produce and ship physical media to stores, patches are free to push, etc.).



efikkan said:


> At this point games should be developed for Direct3D 12 or Vulkan exclusively; there is no point in supporting pre-Fermi and pre-GCN for new top titles, and pre-GCN cards dropped out of driver support a while ago anyway. Windows 7/8 support is probably the only reason to keep legacy support, but if by doing so you have to design a bad engine, then you should only support the new API.


Neither Unreal Engine 4 nor Unity officially supports D3D12 or Vulkan yet.  The majority of PC games are built on those engines.  The reason why Vulkan/D3D12 support is sparse is because they are a huge paradigm shift from OpenGL/D3D11: the entire renderer has to be rewritten.  I'd argue most games out today that support Vulkan/D3D12 are half-assed implementations of it.  It'll be years yet before we see games that fully exploit the technology.


----------



## cucker tarlson (Jul 9, 2018)

Lol, I remember people claiming the 290 would catch up with or even beat the 980Ti back in 2016 cause they saw a 3DMark draw calls benchmark. Looks like dx11 ain't going anywhere. There's one thing I'm convinced about - once nvidia wants to really invest in dx12/vulkan, it's going to boom. They've been comfortable sitting on dx11 cause their performance is good. We need to wait for nvidia to decide they need to tap into dx12/vulkan to make new gen cards faster before we see a breakthrough.


----------



## efikkan (Jul 9, 2018)

FordGT90Concept said:


> Those console ports are still developed on NVIDIA hardware.  As a result, it's well optimized for Windows from the start.  They have to optimize for the target consoles in order to get qualified.  AMD desktop cards rarely get an optimization pass unless there's major problems.  There are exceptions like Deus Ex: Mankind Divided and Hitman which were developed in collaboration with AMD.


Console games are developed and debugged on AMD hardware. By the time they are tested on Nvidia hardware, the games are already implemented, and the rendering pipeline is not redesigned unless it has major problems.



FordGT90Concept said:


> Bare in mind that even though consoles tend to sell more copies of a game, publishers make more money per sale on PC because profit margins are much better (no qualifying, distributors take a smaller cut, don't need to produce and ship physical media to stores, patches are free to push, etc.).


That depends; top titles generally sell much more on console, and many top publishers earn much more from console sales, despite high fees from console makers. The PC market, on the other hand, is much more diversified, and for the most part less focused on a few popular titles.



FordGT90Concept said:


> Neither Unreal Engine 4 nor Unity officially supports D3D12 or Vulkan yet.  The majority of PC games are built on those engines.  The reason why Vulkan/D3D12 support is sparse is because they are a huge paradigm shift from OpenGL/D3D11: the entire renderer has to be rewritten.  I'd argue most games out today that support Vulkan/D3D12 are half-assed implementations of it.  It'll be years yet before we see games that fully exploit the technology.


You're right about many titles using Unreal, but Unity is mostly used for "shovelware" titles, and very few or none of those are relevant when it comes to good performance.

Unity will probably never benefit properly from Direct3D 12 or Vulkan, since the rendering pipeline has to be tailored to the game to fully utilize the potential in the new APIs. Support will probably arrive, but it will suck as much as before.



cucker tarlson said:


> Lol, I remember people claiming the 290 would catch up with or even beat the 980Ti back in 2016 cause they saw a 3DMark draw calls benchmark. Looks like dx11 ain't going anywhere. There's one thing I'm convinced about - once nvidia wants to really invest in dx12/vulkan, it's going to boom. They've been comfortable sitting on dx11 cause their performance is good. We need to wait for nvidia to decide they need to tap into dx12/vulkan to make new gen cards faster before we see a breakthrough.


Yes, we are always promised that AMD hardware is "better", you just don't see it yet. Well, most people waiting for their 2xx/3xx series to unveil their benefits have already moved on, or will when the next Nvidia cards arrive shortly.

This is also the problem with purely synthetic benchmarks, even more so with benchmarks that only measure one tiny aspect of rendering. These benchmarks are just misleading to buyers; what does the average buyer know about "draw calls"? And displaying an edge case is just ridiculous, especially since dummy draw calls have little to do with what the hardware can do in an actual scene.


----------



## mtcn77 (Jul 9, 2018)

cucker tarlson said:


> Lol, I remember people claiming the 290 would catch up with or even beat the 980Ti back in 2016 cause they saw a 3DMark draw calls benchmark. Looks like dx11 ain't going anywhere. There's one thing I'm convinced about - once nvidia wants to really invest in dx12/vulkan, it's going to boom. They've been comfortable sitting on dx11 cause their performance is good. We need to wait for nvidia to decide they need to tap into dx12/vulkan to make new gen cards faster before we see a breakthrough.


GCN needs scalarization of code in order to launch multiple wavefronts. So it could be anything in between, unless the path that consoles take is available on Windows.


----------



## Th3pwn3r (Jul 11, 2018)

Instead of this being RX580 vs 1060 you fan boys made it AMD vs Nvidia. BOTH cards suck! Let's just leave it at that. I still blame the title for starting all of this mess.


----------



## MuhammedAbdo (Jul 11, 2018)

Ryzen CPUs (as usual ) are the cause of the disparity in results:

See, PCGamesN and Hardwareluxx were both using processors less powerful than the one used by Sweclockers. *PCGamesN was using a Ryzen 2700X* while *Hardwareluxx used an AMD Threadripper 1950X* processor – which is decidedly not clocked for gaming purposes. Sweclockers, on the other hand, *used the king of all gaming CPUs: the Core i7-8700K*. They even benchmarked the processors to further elaborate on this reasoning:







As you can see, the difference between the Ryzen 7 2700X (which PCGamesN used) and the Core i7-8700k is very significant. In fact, this is probably the sole reason why we see AMD cards pushing ahead of the GTX 1080 Ti against all odds and why we see the 1080 Ti maintaining a clear lead in the Sweclocker results of the same settings and same resolution. In other words, once you remove the CPU bottleneck from the equation, it looks like the GTX 1080 Ti is still king.



https://wccftech.com/battlefield-v-closed-alpha-benchmarks/
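The bottleneck argument boils down to a min() (a toy sketch; the fps numbers below are hypothetical, not taken from any of the linked benchmarks):

```python
# Toy model: the frame rate a benchmark reports is capped by whichever
# of the CPU or GPU takes longer per frame. Hypothetical numbers only.

def observed_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

# With a slower CPU, two very different GPUs can post the same score...
assert observed_fps(cpu_fps_cap=90, gpu_fps_cap=140) == 90
assert observed_fps(cpu_fps_cap=90, gpu_fps_cap=105) == 90
# ...and the GPU gap only reappears once the CPU cap is lifted.
assert observed_fps(cpu_fps_cap=160, gpu_fps_cap=140) == 140
```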


----------



## cucker tarlson (Jul 11, 2018)

I think the cause of the disparity was the author's stupidity, not Ryzen. I can't believe how many small-time reviewers are shilling for AMD these days; that's why it's always best to take info from big, reputable sites like pcgh, computerbase or TPU (if they do their own testing).


----------



## INSTG8R (Jul 11, 2018)

Yet here you and Muhammed are shilling for Green constantly... Keep up the good “Green Fight”! Funny to see so much propaganda over 2 FPS...


----------



## medi01 (Jul 12, 2018)

dj-electric said:


> AMD beats nvidia at something, lets make a news piece.


FFS, AMD regularly "beats nvidia at something", but it doesn't happen that often that *AMD beats nVidia by a whopping 30% in an nVidia-sponsored game*.



efikkan said:


> Yes, we are always promised that AMD hardware are "better", you just don't see it yet.


It's hard to see with eyes wide shut.



MuhammedAbdo said:


> and Hitman DX12. So don't worry, pretty soon NVIDIA will be ahead come final release.


So, nVidia got ahead in DX12 Hitman?

Why does a minor alpha game benchmark force you to start spitting nonsense, pretty please? Why so butthurt for Huang at all?


----------



## MuhammedAbdo (Jul 12, 2018)

medi01 said:


> So, nVidia got ahead in DX12 Hitman?


I didn't say ahead, I said they improved their DX12 performance to the point they are matching AMD.







https://techreport.com/review/32766/nvidia-geforce-gtx-1070-ti-graphics-card-reviewed/7
https://www.hardware.fr/articles/971-9/benchmark-hitman.html

It must be a shock to you, isn't it? Here is another shock: remember Ashes of the Singularity?









https://www.hardware.fr/articles/971-11/benchmark-ashes-of-the-singularity.html
https://www.anandtech.com/show/11987/the-nvidia-geforce-gtx-1070-ti-founders-edition-review/5

Too bad for you right? Remember Forza 7?






https://techreport.com/review/32766/nvidia-geforce-gtx-1070-ti-graphics-card-reviewed/3

What's that? The 1060 is 25% faster than the RX 580 in a DX12 game? No way! Let's make this a news piece!

Stay in your land of denial please ..
Mic dropped ..


----------



## cucker tarlson (Jul 12, 2018)

Well actually 1080 beats V64 in Hitman dx12 @1440p

https://www.purepc.pl/karty_graficz...tx_1070u_chinski_przepis_na_pascala?page=0,15

although it's splitting hairs about which one is faster.

Muhammed is on point here. First and foremost, there are a lot of dx12 games that run really well on nvidia. Can't tell specifically, but I'd be surprised if, gathering up all dx12 games on the market, 1080 to V64 was more than +/- 5%. Second of all, nvidia did make reasonable progress optimizing typically-AMD games like hitman and doom, to the point that amd no longer wins clearly in any of them, if they win at all. Most of it came from reducing cpu overhead, cause geforce cards used to get worse performance on launch in games that use dx12 features. Since amd has a hardware implementation, it ran fine on launch. Nvidia had to take time for optimizations; it took longer in the past, but nowadays they can deal with it pretty quickly.


----------



## MuhammedAbdo (Jul 12, 2018)

cucker tarlson said:


> although it's splitting hairs about which one is faster.


Exactly ..

Though move up to 1440p and the 1080 is ahead of the Vega 64 by a small margin, after driver upgrades:

https://www.purepc.pl/karty_graficz...70_ti_nemezis_amd_radeon_rx_vega_56?page=0,15

Use a newer review and you will find the 1080 is 7% faster at 1080p:

https://www.purepc.pl/karty_graficz..._vega_64_nitro_niereferencyjna_vega?page=0,15


----------



## cucker tarlson (Jul 12, 2018)

Good find, the newer one uses 388.59, the older one uses 388.13.


----------



## MuhammedAbdo (Jul 12, 2018)

cucker tarlson said:


> Nvidia had to take time for optimizations, took longer in the past, but nowadays they can deal with it pretty quickly.


The fruits of such efforts can be found in games like Ashes of Singularity Escalation, where a 1080 now easily beats a Vega 64!









https://www.anandtech.com/show/11987/the-nvidia-geforce-gtx-1070-ti-founders-edition-review/5


----------



## cucker tarlson (Jul 12, 2018)

They did a lot for vulkan and dx12 in 2017. Funny how the people who say nvidia buyers are ignorant cause AMD gets better with time are the same ones who didn't even know nvidia caught up to AMD in hitman/doom/aots a long time ago. You can tell them a thousand times; next day they'll be out again with the same arguments.



----------



## mtcn77 (Jul 12, 2018)

Again, when did Nvidia stop making frame times the headline and put the fps counter back in the spotlight? I guess it gets rephrased however it fits the purpose...


----------



## wiak (Jul 26, 2018)

Vya Domus said:


> Meh , because of the way Nvidia's hardware works they always need more driver work. Can't say this means much.


nah, they need to add gimpworks, then nvidia is ahead and everything goes 50% slower


----------



## StrayKAT (Jul 26, 2018)

FordGT90Concept said:


> Those console ports are still developed on NVIDIA hardware.  As a result, it's well optimized for Windows from the start.  They have to optimize for the target consoles in order to get qualified.  AMD desktop cards rarely get an optimization pass unless there's major problems.  There are exceptions like Deus Ex: Mankind Divided and Hitman which were developed in collaboration with AMD.
> 
> Bear in mind that even though consoles tend to sell more copies of a game, publishers make more money per sale on PC because profit margins are much better (no qualifying, distributors take a smaller cut, no need to produce and ship physical media to stores, patches are free to push, etc.).
> 
> ...



Really? Mankind Divided is still the most demanding game I have. I can't see it running at max, except with a 1080ti.


----------



## seb777 (Jul 27, 2018)

*AMD Beats NVIDIA's Performance????? Mmmmmmm... bit of a dramatic title for this article. How can a GTX 1080 (Ti) not smoke both of these cards? I don't get it???*


----------



## Vya Domus (Jul 27, 2018)

How about we let this cancer of a thread die already.


----------



## Caring1 (Jul 28, 2018)

seb777 said:


> *AMD Beats NVIDIA's Performance????? Mmmmmmm... bit of a dramatic title for this article. How can a GTX 1080 (Ti) not smoke both of these cards? I don't get it???*


Reading or subtlety not your specialties either?


----------



## x86overclock (Aug 24, 2018)

the54thvoid said:


> Yup, that's the compute for you.
> 
> Once the game is closer to release, I'm quite sure Nvidia driver team will be working their usual magic.


I'm positive they will work their magic, but I bet when they do, the Radeon 580 will drop in performance. After all, this is Nvidia working with Dice, and this always happens.



MuhammedAbdo said:


> The fruits of such efforts can be found in games like Ashes of Singularity Escalation, where a 1080 now easily beats a Vega 64!
> 
> 
> 
> ...


The Vega 64 loses because Oxide removed Microsoft DX12's Asynchronous Compute instruction and replaced it with Nvidia's proprietary Simulated Asynch Compute instruction. This was done a month after the launch of the first Ashes of the Singularity, because the Geforce cards did very poorly.

Nvidia had asked Oxide to implement their instruction because their hardware was incompatible with DX12's Asynchronous Compute, and Nvidia insisted that it also improved Radeon's performance, which ended up being completely false. After the Nvidia Asynch Compute implementation, Radeon cards saw a 17% loss in performance while the Geforce cards improved performance by 8%.

DX12's Asynchronous Compute is not implemented in most games; instead most games use Nvidia's simulated Asynch Compute, because those games are developed on Nvidia hardware and they have to use Cuda, which automatically implements Nvidia optimizations, including their simulated Asynch Compute.


----------



## JRMBelgium (Aug 25, 2018)

MuhammedAbdo said:


> Ryzen CPUs (as usual ) are the cause of the disparity in results:
> 
> See, PCGamesN and Hardwareluxx were both using processors less powerful than the one used by Sweclockers. *PCGamesN was using a Ryzen 2700X* while *Hardwareluxx used an AMD Threadripper 1950X* processor – which is decidedly not clocked for gaming purposes. Sweclockers, on the other hand, *used the king of all gaming CPUs: the Core i7-8700K*. They even benchmarked the processors to further elaborate on this reasoning:
> 
> ...



Actually, I would love to see more reviewers using mid-range budget CPUs. It would reflect a more realistic gaming experience for the customer. The reality is that more gamers have a CPU that comes close to Ryzen 2700X performance than have a CPU with 8700K performance.

People see a GTX 1060 review. They see it getting 60+ fps in many games. Then they put it in their PC and encounter frame drops to 40 fps, because all reviewers test these cards on one of the fastest CPUs.


----------



## MuhammedAbdo (Aug 25, 2018)

x86overclock said:


> The Vega 64 loses because Oxide removed Microsoft DX12's Asynchronous Compute instruction and replaced it with Nvidia's proprietary Simulated Asynch Compute instruction. This was done a month after the launch of the first Ashes of the Singularity because the Geforce cards did very poorly. Nvidia had asked Oxide to implement their instruction because their hardware was incompatible with DX12's Asynchronous Compute and Nvidia insisted that it also improved Radeon's performance which ended up being completely false. After the Nvidia Asynch Compute implementation Radeon cards saw a 17% loss in performance while the Geforce cards improved performance by 8%. DX12's Asynchronous Compute is not implemented in most games, instead most games use Nvidia's simulated Asynch Compute because those games are developed on Nvidia hardware and they have to use Cuda which automatically implements Nvidia optimizations including their simulated Asynch Compute.


That's a load of horsecrap; none of this happened. Oxide challenged NVIDIA and refused their involvement in any way in their demos, which are heavily subsidized by AMD, and remain so to this day.

No developer implements Async Compute because it's a pain in the ass to get it working and supported on most architectures, and the gains are limited most of the time anyway.


----------



## jabbadap (Aug 25, 2018)

MuhammedAbdo said:


> That's a load of horsecrap; none of this happened. Oxide challenged NVIDIA and refused their involvement in any way in their demos, which are heavily subsidized by AMD, and remain so to this day.
> 
> No developer implements Async Compute because it's a pain in the ass to get it working and supported on most architectures, and the gains are limited most of the time anyway.



Afaik even nvidia implements directx async compute in their Nvidia FleX particle-based simulations. But it's not the silver bullet for anything. Oxide's AotS is the best-case scenario for async compute; with a different type of game, the performance benefit from async compute is much, much less.


----------



## x86overclock (Aug 26, 2018)

MuhammedAbdo said:


> That's a load of horsecrap; none of this happened. Oxide challenged NVIDIA and refused their involvement in any way in their demos, which are heavily subsidized by AMD, and remain so to this day.
> 
> No developer implements Async Compute because it's a pain in the ass to get it working and supported on most architectures, and the gains are limited most of the time anyway.


It did happen: https://tech4gamers.com/nvidia-acti...-games-to-implement-directx-12-async-compute/
In Asynchronous Compute games like Forza 7 and Wolfenstein 2: The New Colossus, the 1080ti gets its butt handed to it by both the Vega 56 and 64. In Nvidia DX12 titles Nvidia has the upper hand because they do not use DX12's Asynchronous Compute; instead they use Nvidia's Asynch Compute, and the two are not to be confused. Asynchronous Compute is a pain in the butt to implement, but when developers develop DX12 titles using Cuda, it is automatically implemented.
https://digiworthy.com/2017/11/03/wolfenstein-2-benchmarks-amd-vs-nvidia/ and this https://www.guru3d.com/news-story/forza-7-pc-graphics-performance-benchmarks.html


----------



## londiste (Aug 27, 2018)

Oxide has taken help from both/all vendors. While AotS's Nitrous engine got its start as a showcase for Mantle - a massive amount of draw calls as well as async shaders being the main selling points - it did get worked over for both DX12 and, later, Vulkan. It is a fairly objective take on engine development with the new lower-level APIs in mind. Nvidia cards have a different take on Async Shaders; whether that is better or worse, real or fake, is somewhat irrelevant, as there is a way to achieve pretty much the same result, and Oxide implemented that at some point. It is worth noting that while the engine is a good showcase, at least initially it did not use some of the usual optimizations, for example to reduce reliance on excessive draw calls.

AMD/Nvidia have gone back and forth over time when it comes to AotS performance with Nvidia cards barely edging out the comparative AMD cards right now. And by that, I mean DX12 (and Vulkan). AMD's DX11 performance in AotS has been outright atrocious all the time allowing them to claim huge performance increase as DX12 benefit. At the same time, Nvidia cards run DX11 AotS with respectable enough results.

*x86overclock*, Forza 7 had problems on Nvidia cards that were resolved rather quickly with a driver update. Wolfenstein 2 does have a nice boost for the Vega (not all AMD) architecture. The main cause is that that iteration of idTech6 uses Rapid Packed Math (2xFP16 instead of FP32) for some shaders, leading to some performance benefit. The implementation of some AA modes also tends to favor AMD cards.

DX12 games are not implementing CUDA nor have they ever.
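The Rapid Packed Math point above can be sketched numerically: two FP16 values occupy the same 32 bits as one FP32 value, which is why an ALU with RPM support can retire two half-precision operations in the slot of one single-precision operation. A minimal illustration (purely a CPU-side sketch using NumPy, not GPU shader code):

```python
import numpy as np

# Two FP16 operands - the values are arbitrary examples chosen to be
# exactly representable in half precision.
pair = np.array([1.5, -2.25], dtype=np.float16)

# The pair fits in one 32-bit word, i.e. the same storage as a single FP32.
packed = pair.view(np.uint32)[0]
assert pair.nbytes == 4

# Unpacking the 32-bit word recovers both FP16 values intact.
unpacked = np.array([packed], dtype=np.uint32).view(np.float16)
assert (unpacked == pair).all()
```

This is only the storage half of the story; the throughput benefit comes from the hardware issuing packed FP16 math, which no amount of CPU-side code can demonstrate.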


----------



## x86overclock (Aug 27, 2018)

londiste said:


> Oxide has taken help from both/all vendors. While AotS's Nitrous engine got its start as a showcase for Mantle - a massive amount of draw calls as well as async shaders being the main selling points - it did get worked over for both DX12 and, later, Vulkan. It is a fairly objective take on engine development with the new lower-level APIs in mind. Nvidia cards have a different take on async shaders; whether that is better or worse, real or fake, is somewhat irrelevant, as there is a way to achieve pretty much the same result, and Oxide implemented that at some point. It is worth noting that while the engine is a good showcase, at least initially it did not use some of the usual optimizations, for example to reduce reliance on excessive draw calls.
> 
> AMD/Nvidia have gone back and forth over time when it comes to AotS performance, with Nvidia cards barely edging out the comparable AMD cards right now. And by that, I mean in DX12 (and Vulkan). AMD's DX11 performance in AotS has been outright atrocious the whole time, allowing them to claim a huge performance increase as a DX12 benefit. At the same time, Nvidia cards run DX11 AotS with respectable enough results.
> 
> ...


CUDA is the software developer tool used by developers when they develop a title on Nvidia hardware. What I was referring to was Nvidia's async compute, which is different from DX12 asynchronous compute. https://developer.nvidia.com/cuda-zone


----------



## londiste (Aug 27, 2018)

CUDA is Nvidia's compute API. Games use a graphics API - in this context DX12 or Vulkan - both of which allow asynchronous compute.
It is still DX12 asynchronous compute; the vendors just want it set up differently for optimal use.


----------



## jabbadap (Aug 27, 2018)

x86overclock said:


> CUDA is the software developer tool used by developers when they develop a title on Nvidia hardware. What I was referring to was Nvidia's async compute, which is different from DX12 asynchronous compute. https://developer.nvidia.com/cuda-zone



No, CUDA is not used for that. Nvidia uses standard DirectX DirectCompute for async compute.


----------



## MuhammedAbdo (Sep 3, 2018)

x86overclock said:


> In asynchronous compute games like Forza 7 and Wolfenstein 2 The New Colossus, the 1080 Ti gets its butt handed to it by both the Vega 56 and 64. In Nvidia DX12 titles Nvidia has the upper hand because they do not use DX12's asynchronous compute; instead they use Nvidia's own async compute, and the two are not to be confused. Asynchronous compute is a pain in the butt to implement, but when developers develop DX12 titles using CUDA, it is automatically implemented.


Nope. The regular 1080 and Vega 64 are close in both FC5 and W2 after driver and game updates.
https://www.patreon.com/posts/radeon-rx-vega-20791677


----------



## x86overclock (Sep 3, 2018)

14 fps and 17 fps are not close.


----------



## MuhammedAbdo (Sep 3, 2018)

x86overclock said:


> 14 fps and 17 fps are not close.


Are you blind? The OC'ed 1080 is faster than OC'ed Vega 64 in both games.
At stock the Vega 64 is ahead by 3 or 6 fps. A very minor difference.


----------



## x86overclock (Sep 3, 2018)

MuhammedAbdo said:


> Are you blind? The OC'ed 1080 is faster than OC'ed Vega 64 in both games.
> At stock the Vega 64 is ahead by 3 or 6 fps. A very minor difference.


OC'ed, not stock. And the Vega 64 is 1 FPS faster in Far Cry 5. For that matter you might as well compare it to the Vega 64 Liquid Edition. The Liquid Edition has stock clocks of 1750/945 and can be overclocked to 1850/1000. The air-cooled Vega 64s do not overclock well because their high voltage causes higher temperatures, which in turn makes the clocks throttle below their defaults once they hit the thermal limit. The Liquid Editions do not have this issue.


----------



## StrayKAT (Sep 3, 2018)

The difference is negligible imo. Barring optimization/software issues, Vega and 1080 seem roughly equivalent. Ignore anyone who says otherwise. AMD should take criticism seriously when it comes to 1080Ti and above though... they have no answer for it (but perhaps it was intentional.. I guess? I wish they had larger GPU ambitions).


----------



## hat (Sep 4, 2018)

StrayKAT said:


> The difference is negligible imo. Barring optimization/software issues, Vega and 1080 seem roughly equivalent. Ignore anyone who says otherwise. AMD should take criticism seriously when it comes to 1080Ti and above though... they have no answer for it (but perhaps it was intentional.. I guess? I wish they had larger GPU ambitions).



Intentional? That would be silly. If you were running a company, why would you just let your competitor have a superior product?


----------



## StrayKAT (Sep 4, 2018)

hat said:


> Intentional? That would be silly. If you were running a company, why would you just let your competitor have a superior product?



It makes no sense to me either. I'm just kind of repeating what a lot of people say: that AMD intentionally targets the midrange market. I'd prefer they aimed higher, but whatever.

edit: Actually Zen breaks this mold CPU-wise.


----------



## hat (Sep 4, 2018)

Meh, I think they just do what they can with what they currently have. Hopefully Navi will be a much better architecture, and with 7nm, they can fit more of it in a smaller space.


----------



## MuhammedAbdo (Sep 4, 2018)

x86overclock said:


> OC'ed, not stock. And the Vega 64 is 1 FPS faster in Far Cry 5. For that matter you might as well compare it to the Vega 64 Liquid Edition. The Liquid Edition has stock clocks of 1750/945 and can be overclocked to 1850/1000. The air-cooled Vega 64s do not overclock well because their high voltage causes higher temperatures, which in turn makes the clocks throttle below their defaults once they hit the thermal limit. The Liquid Editions do not have this issue.


You might as well compare that to liquid-cooled 1080 cards; they hit 2100 MHz easily as well.


----------



## londiste (Sep 4, 2018)

hat said:


> Intentional? That would be silly. If you were running a company, why would you just let your competitor have a superior product?


A $700 GPU is a high-end, if not ultra-high-end, product. Historically the margins there are excellent, but the volume just isn't there.
When money is tight, they may not want to design and manufacture a large and expensive GPU.


----------



## MuhammedAbdo (Sep 5, 2018)

And the 580 is 6 fps faster than the 1060 in the BFV beta. So much for the hyperbolic trash in this thread.


----------



## x86overclock (Sep 5, 2018)

MuhammedAbdo said:


> You might as well compare that to 1080 Liquid cooled cards, they hit 2100Mhz easily as well.


----------



## MuhammedAbdo (Sep 5, 2018)

These are OLD benchmarks dating back several months; the ones I posted are only 10 days old, with the latest drivers and game patches.


----------



## x86overclock (Sep 5, 2018)

MuhammedAbdo said:


> You might as well compare that to liquid-cooled 1080 cards; they hit 2100 MHz easily as well.


That would be a fair comparison, although the liquid-cooled 1080s are not reference cards. I would like to see how they compare, considering a stock Vega 64 with a 1200 MHz base/1536 boost beats a stock 1080 with a 1607/1733 boost.


----------

