
Hogwarts Legacy Benchmark Test & Performance Analysis

Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
3090 or 7900XT if you can find both in the same price?
Bonus: or ~+100£ for 4070Ti?
4070ti > 7900xt > 3090. The 3090's problem isn't so much performance; its power draw was kinda nuts compared to my 4090.
 

enderdragon2236

New Member
Joined
Feb 19, 2023
Messages
1 (0.00/day)
I have a question. I play this game with everything cranked at 1080p with RT on (I have a 6700 XT, R7 1700, 16GB DDR4) and for some reason I can't look at bushes. I'm serious, whenever I go outside and look at a tree or something, my FPS goes down from 40-60 to 4. I think it is doing this because it is trying to ray trace through the leaves and is not having a good time. Does anyone have any idea what might be happening here? Thanks in advance :)
 
Joined
Sep 23, 2022
Messages
1,290 (1.54/day)
I have a question. I play this game with everything cranked at 1080p with RT on (I have a 6700 XT, R7 1700, 16GB DDR4) and for some reason I can't look at bushes. I'm serious, whenever I go outside and look at a tree or something, my FPS goes down from 40-60 to 4. I think it is doing this because it is trying to ray trace through the leaves and is not having a good time. Does anyone have any idea what might be happening here? Thanks in advance :)

You might have Ray Tracing set on High/Ultra. Bring it down in quality, or turn it off. RT on AMD isn't working like it's supposed to right now. See the Benchmark section of this review you're replying to:

[Chart: ray tracing performance, 1920x1080]
 
Joined
Nov 4, 2020
Messages
7 (0.00/day)
People seem to keep waiting for the day when the 10GB 3080 chokes because of VRAM, but that day is yet to come :p
It definitely happens. I had a 3080 when I upgraded to a 48in 4k OLED. There were 2-3 titles it was running between 5-10fps vs 60+ before. Reducing vram intensive settings fixed it. Never had any issues at 3440x1440.

Nah, the RT is pretty much just borked in places right now.
Also the RT effects are pretty low res and noisy, so definitely an optimization issue. CP2077 looks better and runs less poorly in RT.
It is rendering RT at 33% resolution; you can fix it in the INI file. 100% resolution with 1 sample looks better than 33% resolution with 4 samples (the default).
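For reference, the kind of Engine.ini override being described would look roughly like this. It's a sketch using standard Unreal Engine console variables; the exact cvar names, values, and config file location for Hogwarts Legacy should be verified before use.

```ini
; Appended to Engine.ini (typically under the game's Saved\Config folder).
; Raises the RT reflection render resolution from the reported 33% default
; to 100% and drops samples per pixel from 4 to 1, as described above.
[SystemSettings]
r.RayTracing.Reflections.ScreenPercentage=100
r.RayTracing.Reflections.SamplesPerPixel=1
```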

I just played for a few hours, haven't noticed one stutter, perhaps me locking CPU + GPU to a fixed frequency is keeping the frametimes consistent :rolleyes:

Looks like I get 10 FPS more in CPU-limited areas than the YouTuber with a 7700X
Just two days ago, I hit a spot that dropped me into seconds per frame territory. I'm running a 4090/7950x combo with 64GB of ram...
 
Joined
Dec 10, 2022
Messages
486 (0.64/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Strawman bad. This isn't in the spirit of what they said at all.
My whole point was that the amount of VRAM matters. Any argument against that would have to be that it doesn't otherwise there's no argument.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
It definitely happens. I had a 3080 when I upgraded to a 48in 4k OLED. There were 2-3 titles it was running between 5-10fps vs 60+ before. Reducing vram intensive settings fixed it. Never had any issues at 3440x1440.

Out of interest, what games were those? I'm at 4K and haven't had any issues as it stands.

But as you say, when you can drop one or two settings and massively improve performance while playing spot-the-difference with image quality, it's hard to be too upset when it's literally one or two games.
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Every AAA release this year so far (Forspoken, Dead Space and now this) performs dreadfully despite providing no meaningful uplift in graphical quality. Hogwarts and Forspoken especially look more or less like PS4 titles with slightly better textures.
I find it a big shame that DLSS/FSR are now just used as a crutch instead of enabling next-gen graphics.
Is it a surprise though?

The few devs who speak openly about development on big AAA titles always talk about how, by the time they start working on performance, it's right at the end: the engine and features are set in stone, it's usually just before the game is released so there's very little time to do it, and it's a very low priority.

So if a tech is released which improves performance, that would just translate to not having to achieve as much optimisation, as the tech now does it for them.

The video footage of this game is an embarrassment; it should never have been approved for release, but stuttery gameplay sadly seems to be the norm now.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,271 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
My whole point was that the amount of VRAM matters. Any argument against that would have to be that it doesn't otherwise there's no argument.
Reality is a lot more nuanced than "any argument against that would have to be that it doesn't [matter]"; you don't really get to decide that, and using a strawman to try and make your point doesn't help you at all either.
 
Joined
Jul 14, 2018
Messages
473 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 566.36 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
So, can this RTX 3080 10GB still play at 2160p 60 FPS with everything on ultra, without RT, using DLSS Quality, in this RPG?
 
Joined
Dec 10, 2022
Messages
486 (0.64/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Reality is a lot more nuanced than "any argument against that would have to be that it doesn't [matter]"; you don't really get to decide that, and using a strawman to try and make your point doesn't help you at all either.
Actually, you're wrong on a couple of things. Firstly, I DO get to decide if I think that a lack of VRAM on a card is detrimental and I DO get to decide if I want to tell people about it on a forum that exists for that very purpose.

If something is obviously true, there is no such thing as a strawman argument supporting it. An argument can only be strawman if it's supporting a falsehood because that's why it's called strawman in the first place, there's nothing holding it up except for false rhetoric. Saying that a card without enough VRAM is a bad purchase is not false rhetoric, it is an obvious truth. Therefore, it's not an argument made of straw but an argument made of granite.

The reason that it's especially relevant today is that we're in a pretty specific situation here where so many people who bought last-gen cards paid double or more what they would've paid for the previous-gen. The fact that they did have to pay that much makes every shortcoming on these cards all the more egregious, especially shortcomings that limit the cards' longevity.

If you suddenly had to pay twice as much as what was historically normal for something, wouldn't you be all the more angry that the company that made the product had knowingly done something that would severely limit its useful life? I sure would and I don't think that this is a strawman argument. Sure, in the days of the RTX 20 and RX 5000 cards, it wasn't nearly as bad a thing because cards were priced in a much more sane manner than they were in the storm that followed so I'm guessing that this is the nuance that you're talking about.

The nuance that I'm talking about is people paying twice as much (or more) for cards that have had their useful lives artificially shortened by small VRAM buffers. I had the same thing happen to me twice (the insufficient VRAM, not the paying double). Once was with the HD 7970 because the GPU was strong enough to use more than 3GB of VRAM and the other was with the R9 Fury. With the Fury I knowingly bought it because the first mining craze of 2017 was on and I was stuck with that 3GB HD 7970 since Crossfire had been essentially dropped. Since the Fury had a TDP of 275W, it wasn't suitable for mining and so was half the price of the RX 580 despite being faster. I decided that the low price and high level of performance for the money was worth it despite the low VRAM because everything else was literally double the price or more. Greg Salazar did a video about it:
However, my long-term experience with the R9 Fury, as long-lived as it was, taught me to never again choose a card, especially a high-end card, if it had significantly less VRAM than its rival. The R9 Fury has had decent longevity but not at anything higher than 1080p and if I had paid full price for it, I probably would've felt cheated.

My whole point is based on altruism and empathy because I'm not suffering from a lack of VRAM, I have an RX 6800 XT and 16GB is not lacking. I'm saying what I'm saying so that people who have been screwed like this know that it happened and why so that they'll see it coming next time and avoid it. I remember pulling my hair out because my HD 7970 had enough GPU horsepower to play a specific game (I don't remember which game because it's been so long) but the 3GB of VRAM just ruined the whole experience. I just don't want others to have to live with that same frustration. I don't see how that makes me a bad person, oh yeah, because it doesn't. I don't know why anyone would push back against this unless they're people who have made this mistake and are in denial or know that it's true and are just being belligerent because they don't want to admit that they messed up.

There's no shortage of people like that on the internet and you know it, especially among the younger crowd.
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Actually, you're wrong on a couple of things. Firstly, I DO get to decide if I think that a lack of VRAM on a card is detrimental and I DO get to decide if I want to tell people about it on a forum that exists for that very purpose.

If something is obviously true, there is no such thing as a strawman argument supporting it. An argument can only be strawman if it's supporting a falsehood because that's why it's called strawman in the first place, there's nothing holding it up except for false rhetoric. Saying that a card without enough VRAM is a bad purchase is not false rhetoric, it is an obvious truth. Therefore, it's not an argument made of straw but an argument made of granite.

The reason that it's especially relevant today is that we're in a pretty specific situation here where so many people who bought last-gen cards paid double or more what they would've paid for the previous-gen. The fact that they did have to pay that much makes every shortcoming on these cards all the more egregious, especially shortcomings that limit the cards' longevity.

If you suddenly had to pay twice as much as what was historically normal for something, wouldn't you be all the more angry that the company that made the product had knowingly done something that would severely limit its useful life? I sure would and I don't think that this is a strawman argument. Sure, in the days of the RTX 20 and RX 5000 cards, it wasn't nearly as bad a thing because cards were priced in a much more sane manner than they were in the storm that followed so I'm guessing that this is the nuance that you're talking about.

The nuance that I'm talking about is people paying twice as much (or more) for cards that have had their useful lives artificially shortened by small VRAM buffers. I had the same thing happen to me twice (the insufficient VRAM, not the paying double). Once was with the HD 7970 because the GPU was strong enough to use more than 3GB of VRAM and the other was with the R9 Fury. With the Fury I knowingly bought it because the first mining craze of 2017 was on and I was stuck with that 3GB HD 7970 since Crossfire had been essentially dropped. Since the Fury had a TDP of 275W, it wasn't suitable for mining and so was half the price of the RX 580 despite being faster. I decided that the low price and high level of performance for the money was worth it despite the low VRAM because everything else was literally double the price or more. Greg Salazar did a video about it:
However, my long-term experience with the R9 Fury, as long-lived as it was, taught me to never again choose a card, especially a high-end card, if it had significantly less VRAM than its rival. The R9 Fury has had decent longevity but not at anything higher than 1080p and if I had paid full price for it, I probably would've felt cheated.

My whole point is based on altruism and empathy because I'm not suffering from a lack of VRAM, I have an RX 6800 XT and 16GB is not lacking. I'm saying what I'm saying so that people who have been screwed like this know that it happened and why so that they'll see it coming next time and avoid it. I remember pulling my hair out because my HD 7970 had enough GPU horsepower to play a specific game (I don't remember which game because it's been so long) but the 3GB of VRAM just ruined the whole experience. I just don't want others to have to live with that same frustration. I don't see how that makes me a bad person, oh yeah, because it doesn't. I don't know why anyone would push back against this unless they're people who have made this mistake and are in denial or know that it's true and are just being belligerent because they don't want to admit that they messed up.

There's no shortage of people like that on the internet and you know it, especially among the younger crowd.
You make it sound like the only tradeoff between NVIDIA and AMD is the VRAM. It's not: going for the extra VRAM on the AMD card means you sacrifice a ton of RT performance, and probably, at least on the new generation of cards, worse power draw and efficiency, with insane power draw while playing a YouTube video or using dual monitors, etc. On the other hand, what do you have to sacrifice IF there is a game that needs more VRAM than your card has? You drop textures from ultra to very high instead. So, not a whole lot.
 
Joined
Nov 26, 2021
Messages
1,709 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
You make it sound like the only tradeoff between NVIDIA and AMD is the VRAM. It's not: going for the extra VRAM on the AMD card means you sacrifice a ton of RT performance, and probably, at least on the new generation of cards, worse power draw and efficiency, with insane power draw while playing a YouTube video or using dual monitors, etc.
You're correct about the tradeoff in ray tracing performance. However, the power consumption in video playback and multiple monitors has improved significantly since launch. It's also important to note that the higher consumption is observed in the case of monitors with different resolutions and refresh rates.

 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,271 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
If you don't think that VRAM is a big deal, then I guess that GPU manufacturers should just put 8GB on everything. After all, it's not a big deal, eh?
Dude, that's a strawman, he never said put 8gb on everything, you did, Strawman 101.
Firstly, I DO get to decide if I think that a lack of VRAM on a card is detrimental and I DO get to decide if I want to tell people about it on a forum that exists for that very purpose.
What you don't get to decide is the conditions for a successful argument against it, or conclude that none exists. I mean sure, nobody can prevent you from saying that and making that argument, I guess, well done. Nobody is saying VRAM doesn't matter at all; of course it does, cards can't function without it and generally speaking more is better, but it's not the be-all and end-all of a video card.

I don't really care about people who bought at twice the price, they were silly enough to do it at all, they get what they deserve. I'm glad you enjoy your 6800xt, good for you. I like me 3080, good for me.
 
Joined
Dec 10, 2022
Messages
486 (0.64/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Dude, that's a strawman, he never said put 8gb on everything, you did, Strawman 101.
No he said it's not a big deal, I was exaggerating on purpose to show that it actually is. That's not a strawman argument, that's a demonstrative metaphor but you can call it whatever you want, it doesn't make it true.
I don't really care about people who bought at twice the price, they were silly enough to do it at all, they get what they deserve. I'm glad you enjoy your 6800xt, good for you. I like me 3080, good for me.
Some people just have to learn the hard way I guess. The people who liked my post tell me that some understand the lesson to be learnt so it was worth it. I really hope, for your sake, that the 10GB holds up longer than I expect it to. I don't revel or get pleasure from seeing other people in less than optimal situations and I know that a large reason that things played out the way they did was because of ignorance.

If you point out a problem that people didn't notice (like the RTX 3080 having less VRAM than the GTX 1080 Ti), some people will ignore you but some will say "Yeah, that IS pretty low..." and maybe they'll take an RTX 3060 or RX 6800 XT instead, realising that the 10GB is a limiting factor. To call 10GB a limiting factor on a card as powerful as the RTX 3080 isn't an opinion, it's a fact that has been proven more than once over time.

If you didn't spend a pantload on your RTX 3080 then you're all good and my post wasn't intended for you anyway. It's the people who way overspent (and yes, it was stupid of them to do it) on it that my post was aimed at. There's nothing that they can do now, but maybe they'll remember this in the future.
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
To call 10GB a limiting factor on a card as powerful as the RTX 3080 isn't an opinion, it's a fact that has been proven more than once over time.
It absolutely is an opinion. The same way it's an opinion that the RT performance of the 6800xt is a limiting factor. I find that disgustingly bad RT performance way more restrictive than the 10gb vram.
 
Joined
Jan 14, 2019
Messages
13,055 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
No he said it's not a big deal, I was exaggerating on purpose to show that it actually is. That's not a strawman argument, that's a demonstrative metaphor but you can call it whatever you want, it doesn't make it true.

Some people just have to learn the hard way I guess. The people who liked my post tell me that some understand the lesson to be learnt so it was worth it. I really hope, for your sake, that the 10GB holds up longer than I expect it to. I don't revel or get pleasure from seeing other people in less than optimal situations and I know that a large reason that things played out the way they did was because of ignorance.

If you point out a problem that people didn't notice (like the RTX 3080 having less VRAM than the GTX 1080 Ti), some people will ignore you but some will say "Yeah, that IS pretty low..." and maybe they'll take an RTX 3060 or RX 6800 XT instead, realising that the 10GB is a limiting factor. To call 10GB a limiting factor on a card as powerful as the RTX 3080 isn't an opinion, it's a fact that has been proven more than once over time.

If you didn't spend a pantload on your RTX 3080 then you're all good and my post wasn't intended for you anyway. It's the people who way overspent (and yes, it was stupid of them to do it) on it that my post was aimed at. There's nothing that they can do now, but maybe they'll remember this in the future.
Most people who have a 3080 will swap it for something newer long before the 10 GB VRAM becomes a limiting factor.

And to suggest that a 3060 is better because it has more VRAM is weird, to say the least. I would suggest rethinking this point.

People who overspent on a GPU during the mining crisis are an entirely different matter altogether. Based on the past, it was logical to assume that GPU prices would come back down eventually as the availability issue was slowly eliminated. People who couldn't wait and bought a GPU, any GPU, not just a 3080, for way over MSRP, have only themselves to blame (unless they don't care about money, which is also a possibility).
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,271 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
you can call it whatever you want, it doesn't make it true.
Same to you my dude, about your Strawman and your other rants. You're an interesting one, and it's nice to get alternative opinions on things, even when I disagree.
 
Joined
Dec 10, 2022
Messages
486 (0.64/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Most people who have a 3080 will swap it for something newer long before the 10 GB VRAM becomes a limiting factor.
That doesn't mean that they wouldn't have preferred it if their expensive-as-hell card lived longer.
And to suggest that a 3060 is better because it has more VRAM is weird, to say the least. I would suggest rethinking this point.
I said that it's better with regard to how much VRAM it has. Of course it's weird, it's weird that the RTX 3060 has more VRAM than the RTX 3080. We're talking Twilight Zone-level weird here!
People who overspent on a GPU during the mining crisis are an entirely different matter altogether. Based on the past, it was logical to assume that GPU prices would come back down eventually as the availability issue was slowly eliminated.
I completely agree. I was one of those fools myself. Fortunately, I managed to mine enough Ethereum that I only paid about $200CAD more than MSRP so I didn't do too badly. I still curse at the mirror from time to time over it though... :laugh:
People who couldn't wait and bought a GPU, any GPU, not just a 3080, for way over MSRP, have only themselves to blame (unless they don't care about money, which is also a possibility).
It's true, I couldn't agree more. I just hope that this time, they learn their lesson instead of going back time and time again to get fleeced like so many dumb sheep.
 
Joined
Jan 14, 2019
Messages
13,055 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
That doesn't mean that they wouldn't have preferred it if their expensive-as-hell card lived longer.
That's kind of a contradiction. If you sell your card 2 years after you bought it to buy the next shiny new thing, then you won't care whether it performs poorly in a game 3 years later, will you? ;)

I said that it's better with regard to how much VRAM it has. Of course it's weird, it's weird that the RTX 3060 has more VRAM than the RTX 3080. We're talking Twilight Zone-level weird here!
Oh, but there's an explanation. 6 GB wouldn't have been enough for a card of that calibre, but 8 or 10 GB aren't doable due to the GPU's core config and memory controller, so 12 GB it is. Not to mention the marketing value.
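A minimal sketch of that constraint (assuming the usual one-chip-per-32-bit-channel layout and 1 GB / 2 GB GDDR6 modules, and ignoring clamshell configurations):

```python
# Why a 192-bit card like the RTX 3060 ends up with 6 or 12 GB, while a
# 320-bit card like the RTX 3080 ends up with 10 or 20 GB: each memory chip
# sits on a 32-bit channel, and common GDDR6/6X chips hold 1 GB or 2 GB.
def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return [chips * size for size in chip_sizes_gb]

print(vram_options(192))   # [6, 12]  -> 6 GB deemed too small, so the 3060 got 12 GB
print(vram_options(320))   # [10, 20] -> the 3080 shipped with 10 GB
```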

It's true, I couldn't agree more. I just hope that this time, they learn their lesson instead of going back time and time again to get fleeced like so many dumb sheep.
On a 3080-4080 level, maybe, but people who want the best of the best just for the sake of having it will never learn. You could easily sell them a steaming pile of turd with a green 5090 logo on it for $10k. It only has to be marginally better at 4K, or have a new feature that you'll never use.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,271 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Fortunately, I managed to mine enough Ethereum that I only paid about
I had completely forgotten about this too. My full-hash-rate 3080, bought on day 1 before the price craziness, mined in excess of its own value in ETH; the card paid for itself and then put even more money in my pocket. It would be hard to be mad at it even if the 10GB ever lets me down in a meaningful way.
 
Joined
May 6, 2017
Messages
10 (0.00/day)
Just sad that we are on 3rd-gen ray tracing and still need upscaling cheats to play with RT on a $1,200 GPU in 2023 at decent fps. I feel like just when sharp, detailed, artifact-free, high-res, high-refresh gaming was within our grasp in the mainstream... along comes RT and says NOPE!
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.90/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Just sad that we are on 3rd-gen ray tracing and still need upscaling cheats to play with RT on a $1,200 GPU in 2023 at decent fps. I feel like just when sharp, detailed, artifact-free, high-res, high-refresh gaming was within our grasp in the mainstream... along comes RT and says NOPE!
Because RT was made to sell graphics cards, not to advance graphics
Look at PhysX after nvidia bought them

The RT cores were researched for their enterprise-level GPUs for render farms, and then they found a way to make it "useful" in games and pushed to have it implemented, instead of designing a feature for gamers exclusively (like DLSS was, but admittedly as a backlash to how poorly RTX ran)
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Because RT was made to sell graphics cards, not to advance graphics
Look at PhysX after nvidia bought them

The RT cores were researched for their enterprise-level GPUs for render farms, and then they found a way to make it "useful" in games and pushed to have it implemented, instead of designing a feature for gamers exclusively (like DLSS was, but admittedly as a backlash to how poorly RTX ran)
So why are 3D movies made with RT and not traditional raster technology? Are movie studios trying to sell GPUs as well?

...
 
Joined
Dec 12, 2012
Messages
780 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
CGI in movies is not rendered in real time, and it's not hardware accelerated. It's rendered on high core count CPUs at less than 1 frame per second.

RT in games will never ever look as good as movies, it's just not possible. I don't think most people realize just how demanding it is. And RT is not about life-like looking graphics, it's about the accurate behavior of light.
Developers have gotten so good at faking stuff with rasterization, that RT often doesn't look mind-blowing. And for it to look much better, you need a lot more computing power.
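To give a rough sense of the scale involved, here's an illustrative back-of-the-envelope calculation; the sample and bounce counts are assumptions for the sake of arithmetic, not measured figures:

```python
# Rough, illustrative estimate of why real-time path tracing is so demanding.
width, height = 3840, 2160     # 4K frame
samples_per_pixel = 4          # modest; offline film renders use hundreds or thousands
bounces = 2                    # secondary rays traced per sample after the primary hit
target_fps = 60

rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
rays_per_second = rays_per_frame * target_fps

print(f"rays per frame:  {rays_per_frame:,}")    # ~99.5 million
print(f"rays per second: {rays_per_second:,}")   # ~6.0 billion
```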
 
Joined
Jun 14, 2020
Messages
3,667 (2.20/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
CGI in movies is not rendered in real time, and it's not hardware accelerated. It's rendered on high core count CPUs at less than 1 frame per second.

RT in games will never ever look as good as movies, it's just not possible. I don't think most people realize just how demanding it is. And RT is not about life-like looking graphics, it's about the accurate behavior of light.
Developers have gotten so good at faking stuff with rasterization, that RT often doesn't look mind-blowing. And for it to look much better, you need a lot more computing power.
The point is, it's preferred because it looks better.
 