# Battlefield V Benchmark Performance Analysis



## W1zzard (Nov 9, 2018)

Battlefield V, the most anticipated title this year, has just launched. We test the game with the latest game-ready drivers, using the whole NVIDIA RTX lineup and important graphics cards from AMD. We present results for all major resolutions, including 4K, and look at memory usage, too.



----------



## darkangel0504 (Nov 9, 2018)

Nice job, *W1zzard*


----------



## L33t (Nov 9, 2018)

Great job. 

@W1zzard, if possible, consider including ultrawide 3,440 x 1,440 testing; it is an increasingly popular resolution within the PC gaming community. Not quite as demanding as 4K, but more so than standard 1440p. 

Thx


----------



## erixx (Nov 9, 2018)

Veeery nice, W1zzard! Magnificent screenshots of a game I have yet to play myself (working.....)


----------



## W1zzard (Nov 9, 2018)

L33t said:


> Ultrawide


Don't have an ultrawide monitor, and no plans to buy one until support for this tech improves significantly


----------



## Rahmat Sofyan (Nov 9, 2018)

"The GTX 1060 6 GB is significantly faster than the RX 570 4 GB .." ??? 

Only ~5 fps average across all resolutions... I don't think that's really that significant.


----------



## L33t (Nov 9, 2018)

W1zzard said:


> Don't have an ultrawide monitor, and no plans to buy one until support for this tech improves significantly



Fortunately it works quite well in BF and most recent titles. I only play BF and CoD, so... 

But it's understandable that my tastes are not that general  

Keep it up!


----------



## Dante Uchiha (Nov 9, 2018)

Was the test done at DX11?

I missed a CPU test or at least core-count scaling...


----------



## droopyRO (Nov 9, 2018)

> especially with its in-game cutscenes that depicted a female hero charging men into battle, with some even calling it "historically inaccurate." If you're one of them, grow up.


A prosthetic-arm woman who overpowers men in hand-to-hand combat, on the Western Front, and we have to grow up? Adding women to a video game, fine.
Adding cyborg-women and calling anyone who disagrees a child or uneducated, not fine.
My grandfather was an officer, fought in WW2 on the Eastern Front, and got a lot of "recognition" from the communists for it after August 1944. I wonder what he would have said about this... re-writing of history.


----------



## kastriot (Nov 9, 2018)

I think the 1060 6GB should have more or less the same performance as the RX 580 8GB on Ultra at 1080p/1440p


----------



## W1zzard (Nov 9, 2018)

Dante Uchiha said:


> Was the test done at DX11?


DX12.

Couldn't do CPU testing this time because of the horrible 5 activations limit.. and I even bought three memberships, so 15 activations total


----------



## L33t (Nov 9, 2018)

droopyRO said:


> A prosthetic-arm woman who overpowers men in hand-to-hand combat, on the Western Front, and we have to grow up? Adding women to a video game, fine.
> Adding cyborg-women and calling anyone who disagrees a child or uneducated, not fine.
> My grandfather was an officer, fought in WW2 on the Eastern Front, and got a lot of "recognition" from the communists for it after August 1944. I wonder what he would have said about this... re-writing of history.



It's a game, not a documentary.

Not knowing your grandfather but knowing quite a few present in such times and living in Europe myself, if I asked such a silly question they would just tell me to fuck off and grow up. That's about it.


----------



## ShurikN (Nov 9, 2018)

Damn, that 3GB 1060 got hit hard. A card that's ahead of the 470/570 99% of the time. 
Guess 4GB of VRAM should be the minimum going forward.


----------



## droopyRO (Nov 9, 2018)

L33t said:


> It's a game, not a documentary.
> 
> Not knowing your grandfather but knowing quite a few present in such times and living in Europe myself, if I asked such a silly question they would just tell me to fuck off and grow up. That's about it.


Have you lived a day through communism? If so, then you might understand why I "don't grow up" or "educate myself". If you lived in the West, or in Europe after 1990, then I understand why you see it that way.
But yeah, it's just a game; I will not buy it.
Came here only for the technical part of it, and found politics. I'm out.


----------



## Pewzor (Nov 9, 2018)

Yea, that 2.5 fps from the 1060... people on /r/Nvidia were swearing 3GB is all you need for 1080p gaming because NVIDIA has great memory compression and some other BS... so NVIDIA's 3GB is better than AMD's 6GB etc... It can't even handle a 4GB RX 570; guess those NVIDIA guys trolled me good.  
I mean even at 1080p that junk is 30% slower than a 570 (and 3 times and 11 times slower at 1440p and 4K).


----------



## neatfeatguy (Nov 9, 2018)

droopyRO said:


> A prosthetic-arm woman who overpowers men in hand-to-hand combat, on the Western Front, and we have to grow up? Adding women to a video game, fine.
> Adding cyborg-women and calling anyone who disagrees a child or uneducated, not fine.
> My grandfather was an officer, fought in WW2 on the Eastern Front, and got a lot of "recognition" from the communists for it after August 1944. I wonder what he would have said about this... re-writing of history.



Wonder what my grandpa would think - being a WWII vet..... then again, I'm sure he's got other things on his mind than what is done in a video game. All I'm hoping is he's still around for his 96th birthday in February - unfortunately I can't find time away from work for the holidays to get out to visit him. He's finally starting to talk about his WWII experiences and has written out some memoirs about them. I'm excited to read them and see if he's willing to talk about anything else. 

As for the game - seems like if your GPU has less than 4GB you're kind of screwed if you want eye candy. I can't say the game interests me, but it's always nice to read up on performance reviews because it gives me something to do while I sift through papers at work.


----------



## M2B (Nov 9, 2018)

Pewzor said:


> Yea, that 2.5 fps from the 1060... people on /r/Nvidia were swearing 3GB is all you need for 1080p gaming because NVIDIA has great memory compression and some other BS... so NVIDIA's 3GB is better than AMD's 6GB etc... It can't even handle a 4GB RX 570; guess those NVIDIA guys trolled me good.
> I mean even at 1080p that junk is 30% slower than a 570 (and 3 times and 11 times slower at 1440p and 4K).



Better color compression means less load on memory bandwidth, not less memory usage. That's why the GTX 1060 6GB is slower than the RX 580 at 1080p but faster at 4K; that's mostly due to NVIDIA's superior color compression. It's not bullshit.
The RTX 2070 is also slower than the Vega 64 at 1080p and faster at higher resolutions.
The GTX 1060 3GB can still maintain very playable framerates at 1080p if you lower some memory-related settings. That's not a huge deal; it's a last-gen entry-level gaming GPU after all.
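The bandwidth-versus-capacity distinction described above can be sketched with a toy delta encoder. This is purely conceptual Python, not how NVIDIA's hardware color compression actually works; the point is only that the buffer allocation stays the same size while far less data has to move:

```python
# Toy sketch (NOT real GPU delta color compression): compression shrinks
# the data moved over the bus, while the buffer allocation (the VRAM
# footprint) stays the same size.

def delta_encode(pixels):
    """Store the first value, then only the differences between neighbors."""
    out = [pixels[0]]
    for prev, cur in zip(pixels, pixels[1:]):
        out.append(cur - prev)
    return out

def delta_decode(deltas):
    pixels = [deltas[0]]
    for d in deltas[1:]:
        pixels.append(pixels[-1] + d)
    return pixels

# A smooth gradient compresses well: most deltas are tiny.
tile = list(range(100, 164))            # 64 "pixels" of a gradient
encoded = delta_encode(tile)

assert delta_decode(encoded) == tile    # lossless round trip
# The allocation is still 64 entries either way; in a real scheme the
# small deltas would be packed into a few bits each, cutting bus traffic.
small = sum(1 for d in encoded[1:] if -2 <= d <= 2)
print(small)  # 63 of 63 deltas are tiny
```

In real hardware the deltas are packed into fewer bits per block, which is why compression saves bandwidth but a 3 GB card still runs out of capacity regardless.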


----------



## ArbitraryAffection (Nov 9, 2018)

Aha. So it's safe to say my OC 56 is probably around ~2070 performance in this game I'm never going to buy or play.  This still makes me happy though.

As for the game itself; what's the big deal, Sure it's not historically accurate but then at the end of the day it's a _video game._

I think some people are getting triggered because they think it's some form of aggressive feminism, lol. *shrug*


----------



## the54thvoid (Nov 9, 2018)

BF's military historical accuracy has always been shite. I'm talking about BF1 planes that were resistant to AA fire when in reality, they were as flimsy as airborne paper tissues. Horses with more oomph than cannons, that could charge across sand-dunes (camels people!), impossibly armoured tanks that never broke down, snipers that could run and gun.... The list goes on. If you want accuracy in a mil-sim, there are other options, ARMA is one of them. But BF has always been about gorgeous visuals, 'mostly' stable gameplay and reliability. 

I have bought it and I'm thinking, do I get EA access to play it early? Because I'm sick to death of PUBG... And too slow for CoD.


----------



## TheGuruStud (Nov 9, 2018)

ArbitraryAffection said:


> Aha. So it's safe to say my OC 56 is probably around ~2070 performance in this game I'm never going to buy or play.  This still makes me happy though.
> 
> As for the game itself; what's the big deal, Sure it's not historically accurate but then at the end of the day it's a _video game._
> 
> I think some people are getting triggered because they think it's some form of aggressive feminism, lol. *shrug*



Aggressive pandering. It's a joke. The devs are a joke.


----------



## Vya Domus (Nov 9, 2018)

Being on the same engine, dropping the settings a notch down likely gives considerable performance increase with little impact on visuals.


----------



## gamerman (Nov 9, 2018)

wehh, if there is rx 580 i think there should be gtx 1070 ti gpu, why not?

too much nvidia winners..

anyway, i cant ever wonder how lausy junk gpu amd vega gpu series are, vega64 loose 2070 average gpu...well its nothing but vega64 eat almost 100%... yes you read right 100% more power than nvidia 2070
..and for dot.. vega64 eat more than 2080 ti gpu... i think every1 can imagine how lausy junk gpus amd offer as.. i call it piiing eye and lie...

also this kind test should included gpu powerdraws.


----------



## INSTG8R (Nov 9, 2018)

gamerman said:


> wehh, if there is rx 580 i think there should be gtx 1070 ti gpu, why not?
> 
> too much nvidia winners..
> 
> ...


Besides the fact that most of that is painful to read it’s also mostly nonsense...


----------



## W1zzard (Nov 9, 2018)

gamerman said:


> if there is rx 580 i think there should be gtx 1070 ti gpu, why not?


As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.


----------



## INSTG8R (Nov 9, 2018)

W1zzard said:


> As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.


Sad Devs don’t cooperate with reviewers better anymore. Just about the bottom line rather than helping themselves get their products reviewed in their case “for free” you’re stuck with the bill and the activation limit alone is a dick move.


----------



## ArbitraryAffection (Nov 9, 2018)

W1zzard said:


> As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.


Damn, well thanks for doing it so we can see how the cards perform. Much appreciated.


----------



## SIGSEGV (Nov 9, 2018)

For me, AMD Radeon did a good job and is the clear winner here.
Thanks for the review.


----------



## bug (Nov 9, 2018)

Playable at Ultra up to 1440p is quite a nice showing from both 1060 6GB and 580.


----------



## TRIPTEX_CAN (Nov 9, 2018)

W1zzard said:


> As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.



I'd be willing to offer my account for a 24h period to allow you more flexibility in testing. Just take these stars away and we can make a deal.


----------



## cadaveca (Nov 9, 2018)

W1zzard said:


> DX12.
> 
> Couldn't do CPU testing this time because of the horrible 5 activations limit.. and I even bought three memberships, so 15 activations total


 Wait whut!?! Only 5 activations? How...

That definitely makes testing their games harder... but maybe they don't care about people reviewing their games for performance.


----------



## the54thvoid (Nov 9, 2018)

Given that I'm probably going to buy Anthem and another game in the next year, I just paid £90 for EA Premium for the year. Cheaper than buying games, even at discounted price. Might even install ME Andromeda. Downloading the deluxe edition now...


----------



## MercJ (Nov 9, 2018)

Ran into the stupid 5-system activation limit when trying to find out CPU differences in the beta. Wanted to see if I could get away with a Haswell i5 for some friends' builds, or even an i3. Couldn't fully test.

From what I remember though - and like most Battlefields - core count >= 8 is nice, and CPU architecture doesn't matter too much, the rest is GPU...  Ryzen 2700X was right up there with my i7-7700K @ 5GHz too I think.

Anyway, thanks for the GPU breakdown!  Sorry to hear there wasn't any way around the activation limit, I thought for sure reviewers would be provided a way around this but no DICE I guess...


----------



## TRIPTEX_CAN (Nov 9, 2018)

MercJ said:


> Ran into the stupid 5-system activation limit when trying to find out CPU differences in the beta.  Wanted to see if I could get away with a Haswell i5 for some friends' builds, or even an i3.  Couldn't fully test
> 
> From what I remember though - and like most Battlefields - core count >= 8 is nice, and CPU architecture doesn't matter too much, the rest is GPU...  Ryzen 2700X was right up there with my i7-7700K @ 5GHz too I think.
> 
> Anyway, thanks for the GPU breakdown!  Sorry to hear there wasn't any way around the activation limit, I thought for sure reviewers would be provided a way around this but no DICE I guess...



I didn't play the beta, but I read the Ryzen 2700X was underperforming and underutilized. I was impressed to see the game using 60-70% on all 16 cores yesterday.


----------



## B-Real (Nov 9, 2018)

Vega 64 on par with the RTX 2070 in 1440P and beating it on FHD, LOL!  Nice job.


----------



## newtekie1 (Nov 9, 2018)

W1zzard said:


> as mentioned before in the comments and in the review, there is only 5 hardware changes possible within 24 hours. how did i bench more than 5 cards? i bought *three* subscriptions and had to prioritize cards



So does that mean that after 24 hours you can go back and bench more cards if you wanted to? Obviously I know you only have a certain amount of time to dedicate to each thing, and you're very busy with other reviews. But just thinking it might be cool to come back and revisit this review and maybe add in a few more cards. For example, I'd really like to see an RX 580 4GB to see exactly how much the 4GB really affects the performance.

The reason I say that is because the RX 570 4GB doesn't really seem to suffer much from only having 4GB. Its performance is about what I would expect from the RX 570 compared to the RX 580. So while the game seems to use more than 4GB, even at 1080p, only having 4GB of memory doesn't seem to be an issue. However, only having 3GB of memory is obviously a major issue, because the GTX 1060 3GB really suffers compared to the GTX 1060 6GB, way more than it should if memory amount weren't an issue.


----------



## W1zzard (Nov 9, 2018)

newtekie1 said:


> So does that mean that after 24 hours you can go back and bench more cards if you wanted to?


Yup, that's what I'm waiting for


----------



## Bjørgersson (Nov 9, 2018)

W1zzard said:


> As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.


Nevermind, you just replied while I was writing my comment.


----------



## newtekie1 (Nov 9, 2018)

W1zzard said:


> Yup, that's what I'm waiting for



Awesome!


----------



## Sasqui (Nov 9, 2018)

gamerman said:


> wehh, if there is rx 580 i think there should be gtx 1070 ti gpu, why not?
> 
> too much nvidia winners..
> 
> ...



I think this:



> Battlefield V's DRM won't allow more than five hardware changes per 24 hours, which includes switching the graphics card. We will add more graphics cards and suitably revise our conclusion within the next 24 hours.



Crazy stupid DRM mechanism


----------



## B-Real (Nov 9, 2018)

gamerman said:


> wehh, if there is rx 580 i think there should be gtx 1070 ti gpu, why not?
> 
> too much nvidia winners..
> 
> ...


Pretty little lies.

1. RX 580 against a 1070 Ti? What the...?
2. Vega 64 has been on par with the 1080 going back to a 1.5-year-old TechPowerUp review. Since then, in most games they are on par, or the Vega 64 is a bit faster in more games than the 1080 is. The exceptions are games sponsored by NV, or series that have just always run better on NV (the Assassin's Creed series). I do NOT say that Vega 64 is better on its own, but if you consider a Sync monitor (and when we speak about a GPU this expensive, that's not a rare thing), it's absolutely the better price-performance option.
3. Vega 64 has ~270W power consumption without manual undervolting; the 2070 draws ~190W. That's 40-45% more power consumption, not the 100% you claimed. With an undervolt you can reach around 210-230W without losing performance.

Why should these benchmarks include power draws? You can see them in every GPU review here and on other sites.
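For what it's worth, the arithmetic in point 3 checks out. A quick sketch using the post's own wattage figures (these numbers are the comment's claims, not measurements):

```python
# Sanity-check the percentages above, using the post's own figures:
# ~270 W for a stock Vega 64, ~190 W for an RTX 2070.
vega64_w = 270
rtx2070_w = 190

extra = (vega64_w - rtx2070_w) / rtx2070_w * 100
print(f"Vega 64 draws {extra:.0f}% more power")  # ~42%, not 100%

# The claimed undervolt range, relative to the 2070:
for uv_w in (210, 230):
    delta = (uv_w - rtx2070_w) / rtx2070_w * 100
    print(f"undervolted {uv_w} W -> {delta:.0f}% more")
```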


----------



## Shatun_Bear (Nov 9, 2018)

I'm all set with my EVGA FTW2 1070 Ti. Or maybe not, as I don't buy games new, will wait a month or two and pick this up cheaper.


----------



## M2B (Nov 9, 2018)

B-Real said:


> Pretty little lies.
> 
> 1. RX 580 against a 1070 Ti? What the...?
> 2. Vega 64 is on par with the 1080 back to a 1,5 year old Techpowerup review. Since then in most games they are on par or the Vega 64 is a bit faster in more games than the 1080 is than the Vega 64. Those are games supported by NV or just being always better than NV (Assassin's Creed Series). I do NOT say that Vega 64 is better on its own, but if you consider a Sync monitor (and when we speak about a GPU this expensive, it's not a rare thing), it's absolutely the better price-performance option.
> ...



You made me laugh.
The GTX 1080 is a better card in all aspects.
It's faster, way more efficient, and cheaper at the same time. The cheapest custom-design Vega 64 I could find on Newegg was $500, while you can find custom 1080s for less. (Buying a reference-design Vega 64 is a horrible idea because it's utter garbage.)
The GTX 1080 is also a much better overclocker and gains significantly more performance when overclocked compared with the Vega 64. 

B-Real


----------



## Octopuss (Nov 9, 2018)

TRIPTEX_CAN said:


> I'd be willing to offer my account for a 24h period to allow you more flexibility in testing. Just take these stars away and we can make a deal.


I could do that as well.
Hit me up @W1zzard if you want.
(unless I misunderstood and the account needs to own the damn game, heh)


----------



## Pewzor (Nov 9, 2018)

M2B said:


> You made me laugh.
> GTX 1080 is a better card in all aspects.
> 
> 
> B-Real



This made me laugh.

The GTX 1080 struggles against the Vega 64 in well-made games that aren't sponsored by NVIDIA, like Wolfenstein, and in many cutting-edge games that properly utilize Vulkan and DX12.

The GTX 1080 does not support FreeSync, so fans are screwed out of an extra $200 to $300 for the same thing.

The GTX 1080 is a far weaker compute card than any of the Vega cards. Actually, the Vega 64 trades blows with the GTX 1080 Ti and RTX 2080 in most compute scenarios, and will kill any Pascal card other than the P100 in FP16 compute by a landslide.  

Just a few things Vega has over the 1080.

B-Real


----------



## W1zzard (Nov 9, 2018)

I'm not sure if Origin account sharing will get you banned, so better not take risks. I'll have more activations soon

edit: added results for vega 56 and 1070


----------



## Pewzor (Nov 9, 2018)

W1zzard said:


> I'm not sure if account sharing will get you banned, so better not take risks. I'll have more activations soon
> 
> edit: added results for vega 56 and 1070



This is the only account I have for TPU, I am just pointing out the poster I replied to made some comments that made me laugh.  
And I would love him to B-Real.


----------



## INSTG8R (Nov 9, 2018)

Pewzor said:


> This is the only account I have for TPU, I am just pointing out the poster I replied to made some comments that made me laugh.
> And I would love him to B-Real.


It wasn’t directed at you. W1z couldn’t finish his testing because of BFV’s activation limits. People were offering him their accounts so he could.


----------



## M2B (Nov 9, 2018)

Pewzor said:


> This made me laugh.
> 
> The GTX 1080 struggles against the Vega 64 in well-made games that aren't sponsored by NVIDIA, like Wolfenstein, and in many cutting-edge games that properly utilize Vulkan and DX12.
> 
> ...



Who gives a fuck about the compute performance of these cards? These are both gaming GPUs, try to understand that, dude.
And again, the GTX 1080 is a better card in all aspects that actually matter (more overclocking headroom, better overall gaming performance, and better efficiency).
Did you even watch the video?


----------



## Readlight (Nov 9, 2018)

The price will drop 80%; my hardware can not run it. PS4 at 4K, definitely not.


----------



## Shatun_Bear (Nov 9, 2018)

W1zzard said:


> edit: added results for vega 56 and 1070



Vega 56 is a top card. If you managed to snag one of those around launch for close to MSRP, you'd be a happy customer. It performs significantly faster than the 1070 here.


----------



## SniperHF (Nov 10, 2018)

Would like to see the 970 in this seeing how the memory situation works out.


----------



## Lionheart (Nov 10, 2018)

gamerman said:


> wehh, if there is rx 580 i think there should be gtx 1070 ti gpu, why not?
> 
> too much nvidia winners..
> 
> ...



Ummm What?

Apart from that, thanks for the benchmark, W1zzard.


----------



## W1zzard (Nov 10, 2018)

SniperHF said:


> Would like to see the 970 in this seeing how the memory situation works out.


added 970 results just for you

also added 1080, 1070 ti, fury x


----------



## TheHunter (Nov 10, 2018)

W1zzard said:


> added 970 results just for you
> 
> also added 1080, 1070 ti, fury x


A custom or reference 980 Ti would be nice too.. thanks


----------



## Pewzor (Nov 10, 2018)

M2B said:


> Who gives a fuck about the compute performance of these cards? These are both gaming GPUs, try to understand that, dude.
> And again, the GTX 1080 is a better card in all aspects that actually matter (more overclocking headroom, better overall gaming performance, and better efficiency).
> Did you even watch the video?



Just saying how stupid your comment was; it made me laugh very hard. Thanks, and B-Real.

Thanks for the GTX 970 and Vega 56 numbers. Weird how it's barely faster than the 1060 3GB at 1080p, but at 1440p it's significantly faster than the 1060.  
Good to see the Vega 56 trading blows with the RTX 2070 at 1440p and 4K.


----------



## W1zzard (Nov 10, 2018)

Pewzor said:


> Weird how it's barely faster than the 1060 3GB at 1080p, but at 1440p it's significantly faster than the 1060.


1060 3 GB still does ok in 1080p, but lack of memory is starting to affect it on 1440p, whereas the 4 GB cards can still handle 1440p


----------



## efikkan (Nov 10, 2018)

W1zzard said:


> 1060 3 GB still does ok in 1080p, but lack of memory is starting to affect it on 1440p, whereas the 4 GB cards can still handle 1440p


Thanks for doing your best within the restrictions.

Can we expect more testing at a later date? E.g. how does various 3 GB and 4 GB cards handle lower detail settings?


----------



## cucker tarlson (Nov 10, 2018)

Why do you insist on dx12 while dx11 runs better for both amd and nvidia ? Explain.


----------



## INSTG8R (Nov 10, 2018)

cucker tarlson said:


> Why do you insist on dx12 while dx11 runs better for both amd and nvidia ? Explain.


Well, if NVIDIA wants to show off their RTX, they'd better get it running well on DX12; they shouldn't get a gimme.


----------



## cucker tarlson (Nov 10, 2018)

INSTG8R said:


> Well, if NVIDIA wants to show off their RTX, they'd better get it running well on DX12; they shouldn't get a gimme.


Yes, this is another downside I just realized today: in order to use DXR, the 20-series cards have to run on this broken API. I ran BF1 on DX12 and the experience was smooth, but it still delivered more fps on DX11.


----------



## FreedomEclipse (Nov 10, 2018)

1080ti master race 

Looking forward to seeing the CPU scaling review


----------



## Liviu Cojocaru (Nov 10, 2018)

Played a bit yesterday and it looks really good. I get between 75-110 fps, mostly in the 90s. I will have a 1080 Ti in a couple of days and I hope I can hold a constant 100+ fps


----------



## rtwjunkie (Nov 10, 2018)

droopyRO said:


> Prosthetic arm woman that overpowers men in hand to hand combat, on the Western Front and we have to grow up ? adding womans to a video game, fine.
> Adding cyborg-women and calling someone who disagrees with, a child or uneducated, not fine.
> My grandfather was an officer, fought in WW2, Eastern Front, got a lot of "recognition" from the communists for it after August 1944, wonder what he would have said about this ... re-writing of history.


I’m quoting droopyRO because I agree and it’s the same topic.

I’ve got to say @W1zzard, for those of us educated in college and beyond with a plethora of history classes, and well versed in serious historians like Stephen Ambrose, being so dismissive of us and our knowledge of history is rather off-putting.  

EA took complete “poetic” or “artistic” license with this, because in actuality, despite 300 million people fighting, this was not historical fact, especially on the Western Front.  You had some female partisans, sure, but that’s about it.  Every army used their women in support roles, far from the front.  

Anyway, your dismissiveness of those who want a historically based game to at least try to be a little accurate really doesn’t belong in a review.

When sticking to the performance matters, you gave a fine review as always.



SIGSEGV said:


> For me, AMD Radeon did a good job and is the clear winner here.
> Thanks for the review.


Not exactly clear winner, but it was nice to see the Vega64 right up behind the 1080Ti.


----------



## W1zzard (Nov 10, 2018)

cucker tarlson said:


> Why do you insist on dx12 while dx11 runs better for both amd and nvidia ? Explain.


Had I tested in DX11 only, what would you say?


----------



## Agony (Nov 10, 2018)

So sad it's 2018 and we're still on 16:9. I thought 21:9 was the minimum standard. Even mobile phones have gotten past this old 16:9 ratio.


----------



## rtwjunkie (Nov 10, 2018)

Agony said:


> So sad it's 2018 and we're still on 16:9. I thought 21:9 was the minimum standard. Even mobile phones have gotten past this old 16:9 ratio.


Well between 1920 x 1080 and 2560 x 1440 you have 65 to 70% of the monitor market.  Both resolutions are 16:9.  

You don’t ignore what most people play at.


----------



## cucker tarlson (Nov 10, 2018)

W1zzard said:


> Had I tested in DX11 only, what would you say?


I asked for DX11, so if you tested in DX11, that would please me. PCGH, ComputerBase and Guru3D all tested in DX11 because it runs better. TPU is the only site that always tests in DX12 even if it runs worse. That was the case with BF1 and Deus Ex, though I can kind of understand those, since at least DX12 provided some performance improvement for Radeons there. BF5 runs better in DX11 on all cards.


----------



## Vya Domus (Nov 10, 2018)

W1zzard said:


> Had I tested in DX11 only, what would you say?



I am glad you didn't; we've got to move on at some point. The transition from DX11 is slow and painful as it is, so I am happy you tested DX12 first.



Agony said:


> I thought 21:9 was the minimum standard.



Well, clearly you thought wrong. Monitors/TVs are overwhelmingly still 16:9.


----------



## Liviu Cojocaru (Nov 10, 2018)

I can honestly say as well that dx11 runs better for me...dx12 is stuttering


----------



## rtwjunkie (Nov 10, 2018)

Liviu Cojocaru said:


> I can honestly say as well that dx11 runs better for me...dx12 is stuttering


DX12 has problems in another recent game that shipped with it and defaults to it, Shadow of the Tomb Raider. DX11 ran much better and both looked the same.

I honestly think it will be awhile before developers get it nailed down.  And that is going to be bad for RTX features to be implemented.


----------



## Cataclysm_ZA (Nov 10, 2018)

W1zzard said:


> As mentioned before in the comments and in the review, there are only 5 hardware changes possible within 24 hours. How did I bench more than 5 cards? I bought *three* subscriptions and had to prioritize cards.



I hope you've 2FA'd those accounts. There are people who target Origin accounts like yours and take them over to play new games for free and cheat with impunity.


----------



## matar (Nov 11, 2018)

The GTX 970 at 1440p is doing great, and this is at stock clocks; a good overclock on the 970 will push it to GTX 1060 6GB performance.


----------



## raptori (Nov 11, 2018)

DX11 is better for BF1 and BFV; both run smoother. I can't remember a game where DX12 is better than DX11. Maybe a bit more fps, but stuttering is a killer, especially in multiplayer.


----------



## cucker tarlson (Nov 11, 2018)

Look at this. DX12 is the biggest dud of recent years. It's been like 3 years since it was released and it still absolutely sucks. That's why I'm so surprised to see TPU always push DX12 as this thing of the future, while other tech sites gave up on it a long time ago, testing DX11 where it runs better (almost everywhere).


----------



## efikkan (Nov 11, 2018)

cucker tarlson said:


> Look at this. DX12 is the biggest dud of recent years. It's been like 3 years since it was released and it still absolutely sucks. That's why I'm so surprised to see TPU always push DX12 as this thing of the future, while other tech sites gave up on it a long time ago, testing DX11 where it runs better (almost everywhere).


Direct3D 12 is going to continue to suck until we get native Direct3D 12 games, and that will not happen until Direct3D 11 support is dropped and developers stop using Direct3D 11-like abstraction layers; otherwise it will never be used properly in most games…

This is not Direct3D's fault; it's the developers', and of course the tech sites giving people the wrong expectations.


----------



## EarthDog (Nov 11, 2018)

Did I miss where the testing method (scene? Built in bench?) is listed? I see ultra dx12 on p1 and settings on p2... but how was this tested?


----------



## rvalencia (Nov 11, 2018)

Pewzor said:


> This made me laugh.
> 
> The GTX 1080 struggles against the Vega 64 in well-made games that aren't sponsored by NVIDIA, like Wolfenstein, and in many cutting-edge games that properly utilize Vulkan and DX12.
> 
> ...


Compute TFLOPS doesn't account for rasterization and ROP read/write/graphics fixed-function bottleneck issues. Both the Vega 64 and GTX 1080 have a similar quad-rasterizer, 64-ROP configuration, while the GTX 1080 Ti has the superiority of six rasterizers and 88 ROPs.

Pure compute workloads use the TMU read/write path, and the GTX 1080 Ti and Vega 64 have similar TMU unit counts.

CUDA mode also disables the graphics path's superior NV delta color compression.

For raster graphics, TFLOPS is almost meaningless without rasterization and ROP read/write/graphics fixed functions being factored in. I expect more from AMD. I do NOT support Raja Koduri's TFLOPS-biased arguments; AMD was already increasing their GPUs' ROP power (32 ROPs to 64 ROPs) before Koduri joined AMD in 2013.

AMD should be focused on designing GPUs, not large DSPs with smaller GPU hardware.


----------



## EarthDog (Nov 12, 2018)

EarthDog said:


> Did I miss where the testing method (scene? Built in bench?) is listed? I see ultra dx12 on p1 and settings on p2... but how was this tested?


@W1zzard 

...when you have a chance.


----------



## Domokun (Nov 12, 2018)

I'm curious about the VRAM utilisation. Benchmarks recently performed by Hardware Unboxed indicated that the performance at 1080p with ultra (DX11) settings was identical between the 4GB and 8GB variants of the RX 580. Additionally, even though TechPowerUp has used DX12 (which may potentially increase VRAM utilisation), the performance delta between the 4GB RX 570 and 8GB RX 580 is pretty insignificant (to the point where it could simply be attributed to the difference in core configurations).

So, I'm not saying the test results are incorrect, but surely the performance would be affected more adversely if the VRAM buffer is actually being exceeded, right?

I recently sold my GTX 1070 with the intent to purchase a RX 570/580 for better Linux compatibility. However, I haven't made up my mind. In Australia, the 4GB RX 570 is $199, the 4GB RX 580 is $239, and the 8GB RX 580 is $329. Since my displays are 1080P, the 4GB RX 580 seemed like the best ratio of performance/value, but if Battlefield V is actually using 5GB of VRAM at 1080P, well I should probably cough up the extra $90.


----------



## newtekie1 (Nov 12, 2018)

Domokun said:


> So, I'm not saying the test results are incorrect, but surely the performance would be affected more adversely if the VRAM buffer is actually being exceeded, right?



VRAM usage is a lot more complicated than just "the game needs this much and if you have less you're going to have a bad time".  It used to be this way, when VRAM was just used as a framebuffer.  But those days are long gone.

Now game developers use the VRAM for more, they store all types of things in VRAM.  Some of what they store isn't actually needed to render the current scene, but they put it there anyway just in case the next scene might need it.  That is why you'll see some games that, given the space, will use 11GB of VRAM but still run just fine on a 4GB card.  All that extra crap stored in VRAM isn't actually needed, so when a lower VRAM card is used all that crap is dumped with no real performance impact.

It seems that the actual VRAM necessary for this game is somewhere between 3 GB and 4 GB at Ultra settings.
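That "allocate opportunistically, evict harmlessly" behaviour can be sketched with a toy LRU residency model. This is illustrative only: real drivers manage residency per allocation and migrate evicted data to system memory, and the asset names and sizes here are made up.

```python
from collections import OrderedDict

class VramCache:
    """Toy model of driver-managed texture residency: assets stay
    resident while space allows and are evicted least-recently-used first."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()  # asset name -> size in MB
        self.evictions = 0

    def touch(self, name, size_mb):
        """Mark an asset as needed, loading and evicting as required."""
        if name in self.resident:
            self.resident.move_to_end(name)   # recently used, keep it hot
            return
        while self.used + size_mb > self.capacity:
            _, freed = self.resident.popitem(last=False)  # evict LRU asset
            self.used -= freed
            self.evictions += 1
        self.resident[name] = size_mb
        self.used += size_mb

cache = VramCache(capacity_mb=4096)          # a 4 GB card
for i in range(70):
    cache.touch(f"prefetch_{i}", 100)        # ~7 GB of "just in case" fill
for i in range(35):
    cache.touch(f"frame_asset_{i}", 100)     # the ~3.5 GB actual working set

print(cache.used)  # prints 4000: the card is "full", yet nothing needed was lost
```

The point: the 4 GB card ends up holding the entire working set, and everything evicted was speculative prefetch that was never referenced again, so the evictions cost nothing. That is how a game can appear to "use" 7+ GB on an 8 GB card while running the same settings fine on a 4 GB one.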


----------



## cucker tarlson (Nov 12, 2018)

Hardware Unboxed made a huge GPU comparison, in DX11.

Probably the biggest difference between TPU and this is the GTX 1080 now leading the V56 not by 2% but by 13%.


----------



## moob (Nov 12, 2018)

cucker tarlson said:


> I asked for dx11 so if you tested in dx11, that would please me. PCGH,computerbase and guru3d all tested in dx11 cause it's better. TPU is the only site that always tests in dx12 even if it runs worse. That was the case with BF1 and Deus Ex, though I can kind of understand those since at least it provides some performance improvement for Radeons. BF5 runs better in dx11 on all cards.


guru3D used DX12 (they recommend using DX11 but tested in DX12 since you'll need it for DXR): https://www.guru3d.com/articles_pages/battlefield_v_pc_performance_benchmarks,6.html


rtwjunkie said:


> DX12 has problems in another recent game that it shipped with and that it is automatically set to run with, Shadow of The Tomb Raider.  DX11 ran much better and both looked the same.


It depends on the system. With my Vega 56 I got far better performance with DX12 in SotTR. If you look at Steam's forums, results are definitely mixed. Some people have better performance with DX11, some with DX12.


----------



## L33t (Nov 14, 2018)

@W1zzard Time for some RTX benchmarks! 

"we" can now use RTX. 



__ https://twitter.com/i/web/status/1062635899786276864
Microsoft re-released the "October" update with the pipework (DirectX) to support RTX, DICE has updated the game, and we now have drivers with specific BF:V tweaks from NVIDIA. So... all is in place.

Have fun!


----------



## caleb (Nov 14, 2018)

Hi

Do you have any comparisons on CPUs? I've already bought a motherboard and I'm wondering which will be better for BFV: the 8700 or the 9600K?
I'm leaning towards the 9600K, but there are opinions that threads are important for BF5 in multiplayer.


----------



## L33t (Nov 14, 2018)

BF:V was developed to use up to 12 threads, according to DICE.

My thinking is they targeted the six-core/12-thread CPUs, mostly on AMD's value side.

But Intel decided to once again crap on their customers with HT shenanigans, making the i5, i7 and i9 branding meaningless once again and confusing as hell from a previous-generation standpoint.


----------



## TRIPTEX_CAN (Nov 14, 2018)

L33t said:


> BF:V was developed to use up to 12 threads, according to DICE.
> 
> My thinking is they targeted the six-core/12-thread CPUs, mostly on AMD's value side.
> 
> But Intel decided to once again crap on their customers with HT shenanigans, making the i5, i7 and i9 branding meaningless once again and confusing as hell from a previous-generation standpoint.




BFV is using every core/thread on my 2700x. Even load across the entire CPU with no idle threads or cores.


----------



## plonk420 (Nov 22, 2018)

anyone get that MGS:V vibe from some of those screenshots?


----------



## nashathedog (Jan 12, 2019)

It's a shame you don't include more strategic older cards like the 980 Ti and 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against newer models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.


----------



## bug (Jan 12, 2019)

nashathedog said:


> It's a shame you don't include more strategic older cards like the 980 Ti and 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against newer models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.


I'm pretty sure this is a time issue. Old cards have been benched on different systems and would need benchmarking again.
I _would_ like to see that info, too. And it is very useful for more rational people who hang onto their video cards for 3-4 years before upgrading to see what they'd get. But at the same time, I understand why that's not happening.


----------



## neatfeatguy (Jan 12, 2019)

nashathedog said:


> It's a shame you don't include more strategic older cards like the 980 Ti and 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against newer models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.



True, but it does take more time adding more cards to the lineup for testing. Sometimes they're pressed for time or don't see the value in doing such a thing because... for example, a 980 Ti with a nice OC on it will perform like a 1070 (or even slightly better). Meaning, find a newer card that's equivalent to how the 980 Ti performs to get a rough idea of how it would handle the game. While the 980 Ti still delivers great performance at 1080p and can do well at 1440p, the card is pushing close to 4 years old.


----------



## Artas1984 (Jan 12, 2019)

This might turn into another Crysis 3: a game that no one plays in multiplayer, yet we continue to benchmark it for years to come? I wonder how long BF 5 will last as a benchmark game... This RTX nonsense is so insignificant compared to the *content* and *mechanics* of what truly makes a great game.


----------



## bug (Jan 12, 2019)

Artas1984 said:


> This might turn into another Crysis 3: a game that no one plays in multiplayer, yet we continue to benchmark it for years to come? I wonder how long BF 5 will last as a benchmark game... This RTX nonsense is so insignificant compared to the *content* of what truly makes a great game.


It depends on what other titles bring to the table. You read so much about BFV because so far it's the only title that makes use of DXR (and does a pretty good job at it). If other titles manage to look better, they'll have no trouble joining BFV in benchmarking.


----------



## SniperHF (Jan 14, 2019)

Artas1984 said:


> This might turn into another Crysis 3: a game that no one plays in multiplayer, yet we continue to benchmark it for years to come? I wonder how long BF 5 will last as a benchmark game.



Probably till the next BF release. These sorts of tent-pole franchises usually stick around.
Or even something like Metro 2033, which wasn't actually all that big a hit (at least on PC), but was such a system melter that it was useful for a while.

BTW Crysis 3 MP was a blast for the short 3-4 months it was alive.


----------

