
Alan Wake 2 Performance Benchmark

Are you an Epic Games rep? Let me give you my two cents:

1. The game is NOT coming to Steam because it is published directly by Epic. There is no one-year exclusivity contract here.
2. EGS is pure garbage. I just bought the game on it because I don't have any other choice, and while it was installing, my monitor wouldn't go to sleep. That is some ludicrous level of software-writing bollocks. Not to mention, there is nothing graphical about the storefront; it's a plain dark grey canvas with a web-based UI. There is no reason to keep my screen awake, especially when the app is minimised.
3. There is zero customisability. I can't even set the app to show my games on startup instead of the store! Not to mention, the store is just a basic web page with no connection to my library whatsoever. It is still offering to sell me Alan Wake 2, despite the fact that I already own it.
4. There are ZERO other features in the app. It literally has no reason to exist other than to launch your games, which you could do by clicking the .exe or the desktop icon.

All in all, Epic is a pure steaming pile of horse manure that wouldn't survive a single second against Steam and GOG if not for the shady as F exclusivity deals and occasional free games.

Sorry to shatter your illusions. The next time you register for a forum, maybe start with something other than telling people what to like.
Hey I wouldn't rule out that it could appear on Steam. Diablo IV is on Steam.
 
Hey I wouldn't rule out that it could appear on Steam. Diablo IV is on Steam.
Let's hope you're right, but knowing Epic, I very much doubt it.
 
Let's hope you're right, but knowing Epic, I very much doubt it.

Epic Games Store GM Steve Allison said Alan Wake II will be exclusive to the Epic Games Store for "a long time," though he did not provide a specific time period for the exclusivity. He didn't say forever, though.

I think it will, eventually.
 
I didn't get any light bleed through the table with the path tracing on. Everything is on max settings.
[Attached screenshot: NMJeInn.jpg]
 
Hey I wouldn't rule out that it could appear on Steam. Diablo IV is on Steam.
That seemed to line up with Microsoft finally closing their deal to buy Activision. I guess Microsoft could buy Epic at some point. Stranger things have happened.
 
That seemed to line up with Microsoft finally closing their deal to buy Activision. I guess Microsoft could buy Epic at some point. Stranger things have happened.
You got a point there, just four days after the purchase closed.
 
It would have been very helpful to mention that Alan Wake 2 is sponsored by and optimized for Nvidia hardware. Even so, RT/PT is more of a demo for next-gen cards than a playable experience.
 
Using the XeSS bridge from Starfield seems to work. Balanced and lower presets tend to look better than FSR. I can also add a tinge of sharpening while I'm in the ini file; not sure what the usable range is, but 0.1 seems a nice start.

If I change the scaling in-game it crashes, but it saves the setting correctly, so I don't need to change it in the ini file. Just reopen the game.

A 4K monitor with a 6800 XT is still struggling on medium sometimes.
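
For reference, the bit I touched in the ini looks roughly like this. The key names are from memory and may not match the actual config file, so treat it as a sketch rather than something to copy-paste:

```ini
; Sketch only - the real key names in the game's rendering config may differ.
; The XeSS bridge DLL from the Starfield mod replaces the FSR2 calls, so the
; in-game FSR quality presets effectively map onto XeSS presets.
UpscalingQuality=Balanced   ; Balanced and below look cleaner to me than FSR here
Sharpening=0.1              ; small nudge; usable range untested, 0.1 is a safe start
```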
 
Even so, RT/PT is more of a demo for next-gen cards than a playable experience.
That's it! Max settings have always been a tech demo for the future rather than what you should use at the time of release. Doom 3 didn't run at Ultra graphics on any available graphics card at the time of its release. So stop crying, everyone. Ultra+RT+PT isn't the only option in the game.
 
It would have been very helpful to mention that Alan Wake 2 is sponsored by and optimized for Nvidia hardware.
That's not an accurate observation. Statistically, Nvidia GPUs have been getting higher performance scores for over a decade now, even in non-sponsored, non-optimized games, so IMO it went unmentioned because it would make hardly any difference and would only create a bias that readers would use to discredit the results.

Max settings have always been a tech demo for the future rather than what you should use at the time of release. Doom 3 didn't run at Ultra graphics on any available graphics card at the time of its release. So stop crying, everyone.
Doom 3 was released over 19 years ago. You can't just assume that an observation from 20 years ago would or should still hold today, or that it makes a fair argument about technical development in 2023, and it doesn't take anti-consumer industry practice into account. The hardware to run new games at max settings exists, but these companies don't release their latest and greatest parts, so they can more easily sell minor improvements later, like the SUPER and Ti series.

In addition, I think that's a very low standard to settle for, and it just hands them an excuse for their business model: "yeah, obviously you can't play games at max settings with a $1,500 video card, but here's this new SUPER model for a couple of hundred more than the base model", whose price of course bears no relation to its performance.

The criticism which you label as "crying" is just a rational reaction from people who are frustrated and disappointed because a product does not live up to its promises.
 
That's not an accurate observation. Statistically, Nvidia GPUs have been getting higher performance scores for over a decade now, even in non-sponsored, non-optimized games, so IMO it went unmentioned because it would make hardly any difference and would only create a bias that readers would use to discredit the results.


Doom 3 was released over 19 years ago. You can't just assume that an observation from 20 years ago would or should still hold today, or that it makes a fair argument about technical development in 2023, and it doesn't take anti-consumer industry practice into account. The hardware to run new games at max settings exists, but these companies don't release their latest and greatest parts, so they can more easily sell minor improvements later, like the SUPER and Ti series.

In addition, I think that's a very low standard to settle for, and it just hands them an excuse for their business model: "yeah, obviously you can't play games at max settings with a $1,500 video card, but here's this new SUPER model for a couple of hundred more than the base model", whose price of course bears no relation to its performance.

The criticism which you label as "crying" is just a rational reaction from people who are frustrated and disappointed because a product does not live up to its promises.
1.- What promises? No really, what promises?
2.- Kingdom Come: Deliverance's high settings are exactly that, too: settings for the future. You mean to tell me you thought that Cyberpunk 2077 Overdrive was a setting for now?
 
For gaming GPUs, xx60s are entry level, and even if half of all gaming machines run one, that won't change the fact.
Then you have more money than you know what to do with, and that means you have no clue what entry level is. An entry-level GPU is a card that can get you started with PC gaming at the lowest price. See the xx50 cards. The 4060 Ti has an MSRP of $400. That is the same price as the digital PS5 console. When the 1070 released a few years ago, no one considered it entry level, and it had an MSRP of $375.

Entry level is defined by price, not performance or whatever logic you can come up with. It's the cost to enter PC gaming. I'm sure AMD and Nvidia would love to preach that entry-level GPUs cost $300-400, because they have completely abandoned the bottom half of the segment and only focused on expensive cards with high gross margins.
 
Then you have more money than you know what to do with, and that means you have no clue what entry level is. An entry-level GPU is a card that can get you started with PC gaming at the lowest price. See the xx50 cards. The 4060 Ti has an MSRP of $400. That is the same price as the digital PS5 console. When the 1070 released a few years ago, no one considered it entry level, and it had an MSRP of $375.

Entry level is defined by price, not performance or whatever logic you can come up with. It's the cost to enter PC gaming. I'm sure AMD and Nvidia would love to preach that entry-level GPUs cost $300-400, because they have completely abandoned the bottom half of the segment and only focused on expensive cards with high gross margins.
More money than I know what to do with... Anyway... Entry-level Nvidia cards are RTX xx50/xx60. Mainstream is RTX xx60 Ti to xx70 Ti. High-end/enthusiast is xx80 to xx80 Ti. Extreme is xx90 to xx90 Ti. If you are talking about GTX anything as entry level, I can't do anything for you. We aren't speaking the same language.

Basing the tier on a price segment is a fool's errand. So during the crypto boom, when a 60-class card was $700, did it get promoted to mainstream? It's really based on the last two digits of the model number. I'm glad you mentioned the PS5, because it is a beautiful piece of machinery and yes... it's basically a budget PC, aka entry level, with the sole focus of gaming. I love my PS5 and can't wait for the PS5 Pro model. Playing Spider-Man 2 on a 4K OLED, it's squarely in the entry level for gaming. That doesn't make it bad by any means; it is what it is.

I'm not crapping on anyone who can't afford anything above entry level; we've all been there at one point in life. I know damn sure I have been. However, I currently play @ 1440p, and anything on GTX silicon isn't even a consideration. I may as well have an APU.
 
Doom 3 was relased over 19 years ago, you cant just assume that your observation from 20 years age would or should be the case now or would make a fair argument in 2023 for technical developments and doesnt take the anti-consumer industry parctise into account, the hardware is here to run new games with max settings etc. but these companies dont release their latest and greatest pieces, so they could sell minor improvements easier like the SUPER and Ti series.
So if a game released in 2004 doesn't run on high-end hardware, it's called development, but if a game in 2023 does the same, it's just the devs being scumbags? :kookoo:

In addition to that, I think thats a very low standard to go for, wich gives them just the excuse for their business model, by saying " yeah obviously with a 1500 $ video card you cant play games at max settings" but we have here this new SUPER model for a couple of hundreds more than the base model, wich prices are of course in no relation to its performance.
Then don't buy the next $1500 video card. Buy the $600 one in 2025, it'll give you a similar experience. There's no 4090 SUPER coming anyway, AFAIK.

High-end graphics cards are, and always have been, a waste of money due to their insane value depreciation.

The criticism wich you label as" crying" is just is a rational reaction from people that are frustrated and disappointet because a product does not hold to its promises.
What promises?

More money than I know what to do with... Anyway... Entry-level Nvidia cards are RTX xx50/xx60. Mainstream is RTX xx60 Ti to xx70 Ti. High-end/enthusiast is xx80 to xx80 Ti. Extreme is xx90 to xx90 Ti. If you are talking about GTX anything as entry level, I can't do anything for you. We aren't speaking the same language.

Basing the tier on a price segment is a fool's errand. So during the crypto boom, when a 60-class card was $700, did it get promoted to mainstream? It's really based on the last two digits of the model number. I'm glad you mentioned the PS5, because it is a beautiful piece of machinery and yes... it's basically a budget PC, aka entry level, with the sole focus of gaming. I love my PS5 and can't wait for the PS5 Pro model. Playing Spider-Man 2 on a 4K OLED, it's squarely in the entry level for gaming. That doesn't make it bad by any means; it is what it is.

I'm not crapping on anyone who can't afford anything above entry level; we've all been there at one point in life. I know damn sure I have been. However, I currently play @ 1440p, and anything on GTX silicon isn't even a consideration. I may as well have an APU.
If you look back 10-15 years, you'll see high-end cards running games at high to ultra settings (such as the 8800 GTX), mid-range running them at medium settings, or at high with slightly lower resolutions (such as the 8600 GT), and you'll see low-end cards running low settings (such as the 8400 GS). With this logic, high-end is x80-x90 cards (or x700-x900 AMD), mid-range is x50-x70 (or x600 AMD), and the low-end is the 6500 XT, the 6400 and Nvidia's 16 series.
 
1.- What promises? No really, what promises?
2.- Kingdom Come: Deliverance's high settings are exactly that, too: settings for the future. You mean to tell me you thought that Cyberpunk 2077 Overdrive was a setting for now?
1. Just look at the buzzwords Nvidia is using on their website to advertise the RTX 4090:
BEYOND FAST
The NVIDIA® GeForce RTX™ 4090 is the ultimate GeForce graphics card. This offers a huge leap in performance, efficiency and AI-based graphics. Discover games with extremely high performance, incredibly detailed virtual worlds, the next generation of gaming, ray tracing. Ultra-realistic. Super fast. With the power of RTX 40 series GPUs and 3rd generation RT cores, you can explore incredibly detailed virtual worlds like never before.


This is the future, you get my point? :D


Of course people will be upset when they don't get 60 FPS... what do you expect?!

2. Are you starting to use Nvidia's marketing terms just to make their products look better, or what is the definition of "settings for the future" and "settings for now"? Can you select these in the game :D and they will eventually unlock once some years have passed? :laugh:

Nvidia already states that the 4090 is next generation, so you are telling me we need to wait for the next-next generation of ultimate, extreme, high-performance, super-fast GPUs?


but if a game in 2023 does the same, it's just the devs being scumbags?
You realise that these companies deliberately make the cards not too capable so they can sell the Ti and SUPER series six months later. You also don't take into account that graphical improvement has slowed in the last decade compared to the decade before when you make statements like: 20 years ago Doom 3 didn't run at high settings, so it must be the same with a new game in 2023.

Then don't buy the next $1500 video card. Buy the $600 one in 2025, it'll give you a similar experience. There's no 4090 SUPER coming anyway, AFAIK.
That is really not the point, is it? I don't like to speculate about hypothetical products from 2025. The fact is that GPUs in 2023 should perform better than they currently do for their price and how they are advertised. People should be able to acknowledge that performance is disappointing in most games, especially in Dead Space, Hogwarts Legacy, Forspoken, Cyberpunk and now AW2; the list goes on.
What promises?
You will see; just visit the RTX 4090 product page on Nvidia's website. BEYOND FAST, ultimate, extreme, high-performance, super-fast GPUs.
With this logic
There is not much logic left, if there ever was any, in the naming of GPUs relative to the performance you can expect. It's even worse on notebooks.
 
So if a game released in 2004 doesn't run on high-end hardware, it's called development, but if a game in 2023 does the same, it's just the devs being scumbags? :kookoo:


Then don't buy the next $1500 video card. Buy the $600 one in 2025, it'll give you a similar experience. There's no 4090 SUPER coming anyway, AFAIK.

High-end graphics cards are, and always have been, a waste of money due to their insane value depreciation.


What promises?


If you look back 10-15 years, you'll see high-end cards running games at high to ultra settings (such as the 8800 GTX), mid-range running them at medium settings, or at high with slightly lower resolutions (such as the 8600 GT), and you'll see low-end cards running low settings (such as the 8400 GS). With this logic, high-end is x80-x90 cards (or x700-x900 AMD), mid-range is x50-x70 (or x600 AMD), and the low-end is the 6500 XT, the 6400 and Nvidia's 16 series.
In 1995, my first computer was a Pentium 100 with onboard graphics running DOS. Should I base entry-level GPUs on onboard graphics? As time goes on, so does the threshold for what is entry level. Entry-level cards are 30 FPS minimum @ 1080p in my book. In Alan Wake 2: Nvidia 3060 44.9/39.3 fps, 3050 31.8/28.1 fps, 1660 Ti 27.6/19.7 fps... the 50/60 cards are entry level. Nvidia's 4060 at 50.3/44.3 fps is about as low as anyone should go when buying a new card for entry-level gameplay. These numbers are provided by TPU, raster only. Now if you want to include DLSS and all that other fakery, by all means, but I don't count that toward what makes a card entry level or not. It's nice to have if you can't reach playable framerates, but even then, it doesn't look good.
 
1. Just look at the buzzwords Nvidia is using on their website to advertise the RTX 4090:
BEYOND FAST
The NVIDIA® GeForce RTX™ 4090 is the ultimate GeForce graphics card. This offers a huge leap in performance, efficiency and AI-based graphics. Discover games with extremely high performance, incredibly detailed virtual worlds, the next generation of gaming, ray tracing. Ultra-realistic. Super fast. With the power of RTX 40 series GPUs and 3rd generation RT cores, you can explore incredibly detailed virtual worlds like never before.


This is the future, you get my point? :D


Of course people will be upset when they don't get 60 FPS... what do you expect?!

2. Are you starting to use Nvidia's marketing terms just to make their products look better, or what is the definition of "settings for the future" and "settings for now"? Can you select these in the game :D and they will eventually unlock once some years have passed? :laugh:

Nvidia already states that the 4090 is next generation, so you are telling me we need to wait for the next-next generation of ultimate, extreme, high-performance, super-fast GPUs?



You realise that these companies deliberately make the cards not too capable so they can sell the Ti and SUPER series six months later. You also don't take into account that graphical improvement has slowed in the last decade compared to the decade before when you make statements like: 20 years ago Doom 3 didn't run at high settings, so it must be the same with a new game in 2023.


That is really not the point, is it? I don't like to speculate about hypothetical products from 2025. The fact is that GPUs in 2023 should perform better than they currently do for their price and how they are advertised. People should be able to acknowledge that performance is disappointing in most games, especially in Dead Space, Hogwarts Legacy, Forspoken, Cyberpunk and now AW2; the list goes on.

You will see; just visit the RTX 4090 product page on Nvidia's website. BEYOND FAST, ultimate, extreme, high-performance, super-fast GPUs.

There is not much logic left, if there ever was any, in the naming of GPUs relative to the performance you can expect. It's even worse on notebooks.
Wait, Nvidia made Alan Wake 2? Not Remedy? News to me, mate.

The game that brings a GPU to its knees is not made by the GPU manufacturer, so how the game performs and what the settings are intended for is a choice made by the devs, not Nvidia.
 
Wait, Nvidia made Alan Wake 2? Not Remedy? News to me, mate.

The game that brings a GPU to its knees is not made by the GPU manufacturer, so how the game performs and what the settings are intended for is a choice made by the devs, not Nvidia.
I think these kinds of people believe that ultra settings are something set in stone and that every game should be equally demanding on ultra. They don't actually care what the game looks like; they just want to play it at ultra.

Imagine a game coming out that, at low settings, runs on a 1060 and looks better than any other game currently in existence. It would still get flamed because a 4060 Ti can't play it at ultra.
 
'Will it run Crysis?' Remember those words? Now it is 'will it run Alan Wake II and probably any other major title coming out moving forward'
I think the issue here is not 'will the GPU run the game', it's 'does the GPU have enough memory to run ANY game'. It's not like before, where we only had to worry about 720p or 1080p resolutions; now we're in 1440p and 2160p territory, and the demands at those resolutions are much higher.

I ended up returning this game on PC and got it for my PS5. My 3070 Ti just doesn't have enough memory to run it at the settings I want. I'm sure if it had 16 GB, it would run it fairly well.
 
1. Just look at the buzzwords Nvidia is using on their website to advertise the RTX 4090:
BEYOND FAST
The NVIDIA® GeForce RTX™ 4090 is the ultimate GeForce graphics card. This offers a huge leap in performance, efficiency and AI-based graphics. Discover games with extremely high performance, incredibly detailed virtual worlds, the next generation of gaming, ray tracing. Ultra-realistic. Super fast. With the power of RTX 40 series GPUs and 3rd generation RT cores, you can explore incredibly detailed virtual worlds like never before.


This is the future, you get my point? :D


Of course people will be upset when they don't get 60 FPS... what do you expect?!

You will see; just visit the RTX 4090 product page on Nvidia's website. BEYOND FAST, ultimate, extreme, high-performance, super-fast GPUs.
That's not a promise. "Beyond fast" is a buzzword with no meaning. Nowhere in your quoted marketing material does it say that it will play Alan Wake 2 at 4K with RT+PT.

You need to read marketing material literally, and not read your own expectations into it.

2. Are you starting to use Nvidia's marketing terms just to make their products look better, or what is the definition of "settings for the future" and "settings for now"? Can you select these in the game :D and they will eventually unlock once some years have passed? :laugh:

Nvidia already states that the 4090 is next generation, so you are telling me we need to wait for the next-next generation of ultimate, extreme, high-performance, super-fast GPUs?
What does "next generation" mean? Now, you're the one using Nvidia's buzzword to expect something that Nvidia never said.

You realise that these companies deliberately make the cards not too capable so they can sell the Ti and SUPER series six months later.
Yes, but that's not the point here. A few unlocked shaders, or a few extra MHz won't magically give you double the performance in Alan Wake 2.

Also, you're complaining about the 4090, whereas Nvidia clearly stated that there's no 4090 Super coming. Or do you think the 4080 Super will be better?

That is really not the point, is it? I don't like to speculate about hypothetical products from 2025. The fact is that GPUs in 2023 should perform better than they currently do for their price and how they are advertised. People should be able to acknowledge that performance is disappointing in most games, especially in Dead Space, Hogwarts Legacy, Forspoken, Cyberpunk and now AW2; the list goes on.
I'm not speculating. The mid-range of the future will, at some point, be faster than the high-end of today. This has always been the case. It's called progress.

There is not much logic left, if there ever was any, in the naming of GPUs relative to the performance you can expect. It's even worse on notebooks.
So you buy graphics cards based on their name alone? Congratulations!

In 1995, my first computer was a Pentium 100 with onboard graphics running DOS. Should I base entry-level GPUs on onboard graphics? As time goes on, so does the threshold for what is entry level. Entry-level cards are 30 FPS minimum @ 1080p in my book. In Alan Wake 2: Nvidia 3060 44.9/39.3 fps, 3050 31.8/28.1 fps, 1660 Ti 27.6/19.7 fps... the 50/60 cards are entry level. Nvidia's 4060 at 50.3/44.3 fps is about as low as anyone should go when buying a new card for entry-level gameplay. These numbers are provided by TPU, raster only. Now if you want to include DLSS and all that other fakery, by all means, but I don't count that toward what makes a card entry level or not. It's nice to have if you can't reach playable framerates, but even then, it doesn't look good.
Yep, that's in your book. As for me, I refuse to call x60/x600 series cards that cost $300-350 and can run (some) new games at 1080p Ultra "entry level".

Entry level is a glorified display adapter that can also do some light gaming and costs around the $150 range maximum. That's the GTX 1630, 1650 and the RX 6400 and 6500 XT. That's it.
 
That's not a promise. "Beyond fast" is a buzzword with no meaning. Nowhere in your quoted marketing material does it say that it will play Alan Wake 2 at 4K with RT+PT.

You need to read marketing material literally, and not read your own expectations into it.


What does "next generation" mean? Now, you're the one using Nvidia's buzzword to expect something that Nvidia never said.


Yes, but that's not the point here. A few unlocked shaders, or a few extra MHz won't magically give you double the performance in Alan Wake 2.

Also, you're complaining about the 4090, whereas Nvidia clearly stated that there's no 4090 Super coming. Or do you think the 4080 Super will be better?


I'm not speculating. The mid-range of the future will, at some point, be faster than the high-end of today. This has always been the case. It's called progress.


So you buy graphics cards based on their name alone? Congratulations!


Yep, that's in your book. As for me, I refuse to call x60/x600 series cards that cost $300-350 and can run (some) new games at 1080p Ultra "entry level".

Entry level is a glorified display adapter that can also do some light gaming and costs around the $150 range maximum. That's the GTX 1630, 1650 and the RX 6400 and 6500 XT. That's it.
This is where we differ... You're talking about a graphical display device, I'm talking about an actual GAMING graphics processing unit :rockout: . Imagine a GTX 1630 on Alan Wake 2 - abysmal. I did find someone running this on a 1660 Super.

 
Well, I guess I'm going to sell my brain and buy an RTX 4080. Sell the 6800 for whatever.
 
This is where we differ... You're talking about a graphical display device, I'm talking about an actual GAMING graphics processing unit :rockout: . Imagine a GTX 1630 on Alan Wake 2 - abysmal. I did find someone running this on a 1660 Super.

Some games do run on a 1630. Alan Wake 2, while it might be a good game, isn't the be-all and end-all of gaming; not to mention, it was nowhere to be seen when the 1630 was released. Besides, the 1630 is built on the Turing architecture, which is two generations old by now. So if we really think about it, neither Nvidia nor AMD has a truly current-gen entry-level card.
 
Some games do run on a 1630. Alan Wake 2, while it might be a good game, isn't the be-all and end-all of gaming; not to mention, it was nowhere to be seen when the 1630 was released. Besides, the 1630 is built on the Turing architecture, which is two generations old by now. So if we really think about it, neither Nvidia nor AMD has a truly current-gen entry-level card.

Especially after the Alan Wake II release, even when it comes to used cards. You can get a used 5600 XT for around $100, which is a good entry-level price and performance, but no mesh shader support means AW2 is pretty much unusable (take 15-20% off 5700 XT fps). And it's reasonable to assume more games will require mesh shaders, though the rate may be slow. At least there are vast numbers of great older games like CP2077 (lol, older!) that run fine on a 5600 XT, and if you're coming to PC gaming with a $100 GPU, you likely haven't played (m)any of them yet.
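
If you're shopping used and want to verify whether a card/driver actually exposes mesh shaders before buying, a quick D3D12 feature query does the job. A minimal sketch (Windows only, MSVC, error handling kept to a minimum):

```cpp
// Minimal sketch: query D3D12 for mesh shader support on the default adapter.
// Assumes a Windows 10 SDK new enough to define D3D12_FEATURE_D3D12_OPTIONS7.
// Build: cl /EHsc meshcheck.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable device found.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &opts7, sizeof(opts7)))) {
        // Older runtimes/drivers don't know OPTIONS7 at all -> no mesh shaders.
        std::printf("Mesh shaders: NOT supported (OPTIONS7 query failed).\n");
        return 0;
    }

    std::printf("Mesh shaders: %s\n",
                opts7.MeshShaderTier == D3D12_MESH_SHADER_TIER_NOT_SUPPORTED
                    ? "NOT supported"
                    : "supported (Tier 1)");
    return 0;
}
```

On an RDNA1 card like the 5600 XT this should report not supported, which lines up with the point above.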
 
Well, I guess I'm going to sell my brain and buy an RTX 4080. Sell the 6800 for whatever.
Not worth it... If you max out settings in games like Alan Wake 2 or Cyberpunk, the 4080 gets slow as hell, even @ 1440p. Unless you like to play at 35-45 FPS... Even the 4090 takes a beating when path tracing comes into play.
Better to tweak the games to get acceptable frame rates with your 6800 now and wait for the 50XX series... Hopefully Nvidia gets their shit together and ray tracing/path tracing becomes more usable.
 