
AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE

Mrgravia

New Member
Joined
Jun 10, 2024
Messages
14 (0.05/day)
As someone with a Radeon RX 6700 10GB, this is very welcome news.

Though I'm waiting to see how the 9060 shakes out.
 
Joined
Jun 14, 2020
Messages
4,519 (2.63/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Did you read my post at all before replying? They have the means to own/rent large server farms, and they're also in a competitive industry, so of course they're going to make the investment to have perfect raster & RT and be praised for the artistry of their movies. But they do that by pushing frames so complex that even on modern hardware the render rate is HOURS PER FRAME, not FRAMES PER SECOND like in gaming. We boot up our PCs and get gaming on the spot; animated movies take MONTHS to complete, so this is impossible to use as a comparison point.
I'm asking why they would invest in RT at all if it doesn't look better. CGI from 20 years ago (even in high-budget movies, e.g. The Matrix) looks a lot worse than today's games. I can see games 20 years from now looking as good as today's CGI.
 

bug

Joined
May 22, 2015
Messages
14,129 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
These aren't rumours anymore; these are AMD's own numbers.
Still to be taken with a grain of salt. E.g., some games are benchmarked both with RT on and off, some aren't. Could be cherry picking, could simply be lack of time...
I try to block any supposed leaks and concentrate on day-1 reviews. Served me well so far, I believe it will continue to do so.
 
Joined
Feb 14, 2025
Messages
30 (2.73/day)
Location
France
System Name 1st build! No longer chained to laptops!
Processor AMD Ryzen 7 9800X3D
Motherboard ASUS TUF GAMING X670E-PLUS
Cooling Thermalright Frozen Infinity 360 ARGB
Memory Patriot Viper Elite 5 RGB TUF-GA 2x24GB 6600CL34 [tuned 6200CL30]
Video Card(s) XFX AMD Radeon RX 9070 XT MERCURY Magnetic Air
Storage 500GB Crucial P5 - 1TB Silicon Power US75 - 2TB Silicon Power US75
Display(s) 2 x 24" 1440p 144Hz IPS
Case Thermaltake The Tower 600
Audio Device(s) HyperX Cloud III wireless
Power Supply MSI MPG A850G
Mouse Logitech G502 Lightspeed
Keyboard Keychron V6 Max ISO
VR HMD PICO Neo 3 Link
Software Windows 11 24H2 (debloated - no Recall, no Edge)
I'm asking why they would invest in RT at all if it doesn't look better. CGI from 20 years ago (even in high-budget movies, e.g. The Matrix) looks a lot worse than today's games. I can see games 20 years from now looking as good as today's CGI.
Go right ahead, if you think you can find a single movie from the mid-2000s that looks better than, idfk-

*throwing darts at random Google results*

...what the fuck?

DreamWorks is working on a live action "How To Train Your Dragon"??

Arcane Season 2, Avatar 3 (did they really need to make a third?), whatever...
 
Joined
Jun 2, 2017
Messages
9,816 (3.47/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
The PS being what?
The hardware system to play 3D-rendered games. Yes, you had sprite-based games like Street Fighter, but the new bandwidth allowed for ray tracing. Think Medieval.

Go right ahead, if you think you can find a single movie from the mid-2000s that looks better than, idfk-

*throwing darts at random Google results*

...what the fuck?

DreamWorks is working on a live action "How To Train Your Dragon"??

Arcane Season 2, Avatar 3 (did they really need to make a third?), whatever...
The first Spider-Man game blew me away the first time I saw it on my nephew's 720p TV.
 
Joined
Dec 6, 2022
Messages
622 (0.77/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
Don't even waste your time with that kiddo.


He admitted on another thread he buys nvidia regardless, so he's here just to troll anyways.
Was willing to give him a chance, but not worth it.

Placed it on the lovely Ignore list.
 
Joined
Apr 30, 2011
Messages
2,746 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Since AMD decided to make only relatively small dies this gen, so it can produce many more and sell for less to gain market share while keeping good margins, $600 for the XT and $500 for the other one are good price points to hit that target. They just need to make plenty of them to flood the market and not allow scalpers and retailers to push prices up. Market share will change very fast if they do so, much more so if FSR4 is close to DLSS.
 
Joined
Oct 28, 2012
Messages
1,322 (0.29/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory Crucial Ballistix 32GB DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB backup
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Before the name change, it was supposed to be the 8800 XT; by definition, the -800 (XT) model is AMD's mid-tier SKU.
They confirmed they wouldn't produce a Big Navi SKU on RDNA4 (for the others: NO, they didn't say they'd stop making high-end GPUs, they just said they'd focus on the mid/low-mid SKUs for RDNA4 while they reorganize their R&D effort for UDNA).

Oh, I see... I'm guessing that's why there's so little difference between RT on/off?

Ah yes, ML/AI. Of course, they have to promote what *datacenters* are after and insist on the one feature that caused them to make Blackwell 2.0 (because 1.0 was *that bad* I'm guessing -see reports of cracked dies from heat in DC racks from big customers-) with little raster improvement but SO MANY AI accelerators jam-packed on those dies...

Of all things I wanted AI to be used on, graphics wasn't one of them... Imagine how neat it would have been to run games' NPCs on AI from the GPUs! Now that would have been epic. Maybe in games like Stellaris, Cities: Skylines or, idk, CoD campaign enemies?
Intel and AMD are working in that direction as well. But seeing how DX12 has only just started to support that tech, it's going to take a while before it sees widespread use in games. AMD's UDNA launch will probably mention it, with that arch being more optimized for ML/AI.

Forums lately are very quick to say that raster is being artificially stalled, but even AMD, who's supposedly still focusing on it, isn't exactly making big strides toward insane generational performance uplifts, letting the AI/RT company take the top spot even in raster perf.

Then there's just how poorly shader core count seems to scale. A 5090 is twice the GPU a 5080 is, but it's not twice as fast, uses a lot of power, and that brings a lot of issues with it. Then there's TSMC asking for a newborn in exchange for a bleeding-edge wafer. I can understand why ML is being seriously considered for improving visuals.
 
Joined
Apr 14, 2018
Messages
863 (0.34/day)
Since AMD decided to make only relatively small dies this gen, so it can produce many more and sell for less to gain market share while keeping good margins, $600 for the XT and $500 for the other one are good price points to hit that target. They just need to make plenty of them to flood the market and not allow scalpers and retailers to push prices up. Market share will change very fast if they do so, much more so if FSR4 is close to DLSS.

Considering there aren’t any reference cards, AMD can do very little to enforce retailer pricing through AIBs and virtually nothing to stop scalping.

I hope to be wrong but don't expect pricing to be a home run. I think we're in for another gen of AMD generally offering better value with slightly cheaper prices than competing cards.
 
Joined
Oct 28, 2012
Messages
1,322 (0.29/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory Crucial Ballistix 32GB DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB backup
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Why did you say RTRT so many times in the beginning? Is this real time ray tracing?

Also, in reference to CG artists. You must mean that they enjoy real time ray tracing versus having lower end scenes that they then render for the full effect. Or are you saying that having the RT cores allows for full scene rendering to be done faster?
Yep, RTRT stands for real-time ray tracing. And yeah, RT/PT really is the grail of rendering methods: more accurate, able to enable more effects, and overall more straightforward to use. Heavier compute requirement, but definitely better at producing an image. Raster is faster compute-wise, but the technical side of raster rendering is heavier; there are a lot of different rasterisation techniques, but each has its own limitations.

With RT you think more about how you're going to set up the parameters of the materials and the lights to get the desired effect without being too inefficient. With raster you really have to think about which technique is going to be suited for reflections, ambient occlusion, GI, separately, for your use case and budget/technical skill level. From what I've seen, it's not uncommon for a custom lighting algorithm to have to be developed if you're unhappy with the engine's default tools.
And that probably explains why UE is becoming more and more widely used: Epic is handling that aspect themselves, at a time when developing and maintaining a custom engine seems to have become a daunting task.
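To illustrate what I mean by RT being more declarative: you describe a light and a material, and the same intersection math handles the rest. A toy single-ray sketch, purely illustrative and not from any real engine:

```python
# Toy "declare lights/materials, let the math work" example: one ray, one
# sphere, one point light, Lambert shading. Real renderers add bounces,
# BRDFs and acceleration structures, but the setup stays declarative.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # a == 1 for a normalized direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Scene description: one sphere (material = plain red albedo), one point light.
sphere_center, sphere_radius, albedo = (0.0, 0.0, -3.0), 1.0, (0.9, 0.1, 0.1)
light_pos = (2.0, 2.0, 0.0)

ray_o, ray_d = (0.0, 0.0, 0.0), normalize((0.0, 0.0, -1.0))
t = ray_sphere(ray_o, ray_d, sphere_center, sphere_radius)
if t is not None:
    hit = tuple(o + t * d for o, d in zip(ray_o, ray_d))
    normal = normalize(tuple(h - c for h, c in zip(hit, sphere_center)))
    to_light = normalize(tuple(l - h for l, h in zip(light_pos, hit)))
    lambert = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    print(tuple(round(a * lambert, 3) for a in albedo))  # shaded color
```

A raster engine would instead pick a separate technique for the reflections, the shadows and the AO in that same scene.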

The one issue I have with the current state of CG in games is that, unlike the movie industry, it seems focused on pushing the technicalities of realism, while the movie industry is also making big strides in stylized effects. Offline 3D renderers like RenderMan or Arnold aren't just good at path-traced realism; they're also insane for stylized rendering. But few studios seem willing to explore that kind of aesthetic.
Stylized Presets Turntable on Vimeo
 
Joined
Dec 3, 2024
Messages
27 (0.32/day)
RT doesn't always mean better graphics; the best recent example is the Half-Life 2 RTX remaster video. Realistic lighting: check. 100% correct material properties on all objects: not so much. It's difficult to be wowed when rusted metal and wood look the same and interact with light the same way. That was the same problem with early movie CGI, when everything looked like rubber. We have PBR rendering now, but unless it's applied carefully, RT will just make material-property mistakes look even more obvious...
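To make the material-property point concrete, here's a toy sketch using Schlick's Fresnel approximation; the F0 values are ballpark illustrations, not authored data:

```python
# Schlick's Fresnel approximation: reflectivity vs viewing angle, driven by
# the material's base reflectivity F0. Ballpark F0 values, not authored data.
def schlick(f0, cos_theta):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

for name, f0 in (("wood (dielectric)", 0.04), ("iron (metal)", 0.56)):
    head_on = schlick(f0, 1.0)   # looking straight at the surface
    grazing = schlick(f0, 0.1)   # near-grazing angle
    print(f"{name}: {head_on:.2f} head-on, {grazing:.2f} grazing")
```

With accurate lighting, wrong F0/roughness values stop being hidden by raster lighting hacks and read as "everything looks the same" immediately.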
 
Joined
Sep 19, 2014
Messages
177 (0.05/day)
The 9070XT being 42% faster than a 7900GRE puts it ahead of the 5070Ti according to TPU relative performance.

42% using RT

37% without RT

7900 GRE avg FPS = 59.3; +37% ≈ 81 FPS avg.
Slower than the 4080 in 4K
[attached chart: average FPS, 3840×2160]
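Quick sanity check on that math (the 4080 Super's ~85.5 FPS 4K average is from the chart above):

```python
# The projection above, spelled out.
gre_avg_fps = 59.3                  # 7900 GRE 4K average from the chart
projected = gre_avg_fps * 1.37      # +37% raster uplift
print(round(projected, 1))          # 81.2 FPS

rtx_4080_super = 85.5               # 4K average from the same chart
print(projected < rtx_4080_super)   # True -> still behind 4080-class at 4K
```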


If the performance is around the 5070 Ti, I'll quite happily pay $650 for the XT, especially after the whole Nvidia fake-pricing farce. Heck, this whole Nvidia generation has been fake everything; by comparison (if the performance proves real), the AMD offerings look like a bargain.
$650 + VAT + AIB model extra.
You know it costs more than $650 to get one?

With 20% VAT = $780, + AIB extra if you want a better model.

So, a 30-35% average uplift in 1440p from a factory-downclocked 7900 GRE in a list of games AMD chose themselves. I bet they were testing it without any overclocking. And a factory-downclocked 7900 GRE in 1440p is a bit above a 7800 XT, maybe a 10% difference. Well, not so bad; everything depends on price. If the 5070 Ti were available at MSRP, AMD would have no chance, but it costs more than $1000, so the red team has a good chance.

$750 + 20% VAT = $900.
So +$100 extra for an AIB model.
Here the 5070 Ti sells for €940, and we have 25.5% VAT.

Why do people think we can buy GPUs at MSRP, without VAT?
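Spelled out, since this keeps coming up (the AIB premium is a rough placeholder, not a quoted price):

```python
# EU street prices include VAT; US MSRPs don't. Figures from the posts above;
# the AIB premium is a placeholder, not a quoted price.
def street_price(msrp_usd, vat_rate, aib_premium=0.0):
    """Rough shelf price: MSRP plus VAT, plus whatever the AIB model adds."""
    return msrp_usd * (1.0 + vat_rate) + aib_premium

print(street_price(650, 0.20))          # 780.0 -> "$650 + 20% VAT = $780"
print(street_price(750, 0.20))          # 900.0 -> "$750 + 20% VAT = $900"
print(street_price(650, 0.255, 100.0))  # ~915.8 -> 25.5% VAT plus an AIB premium
```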
 

SomeUnknownUser

New Member
Joined
Feb 24, 2025
Messages
2 (2.00/day)
42% using RT

37% without RT

7900 GRE avg FPS = 59.3; +37% ≈ 81 FPS avg.
Slower than the 4080 in 4K

$650 + VAT + AIB model extra.
You know it costs more than $650 to get one?

With 20% VAT = $780, + AIB extra if you want a better model.

$750 + 20% VAT = $900.
So +$100 extra for an AIB model.
Here the 5070 Ti sells for €940, and we have 25.5% VAT.

Why do people think we can buy GPUs at MSRP, without VAT?
Based on the performance leaks and estimates, the RX 9070 XT would likely land at around 100 FPS here. That estimate assumes it outperforms the RTX 4080 Super 16 GB (85.5 FPS) and comes in close to the RTX 5080 16 GB (96.7 FPS).
 
Joined
Jul 9, 2015
Messages
3,468 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
we're in for another gen of AMD generally offering better value with slightly cheaper prices than competing cards.
Man, in which country is that the case?

The "slightly" part.

In Germany, a freaking 12GB piece of 4070 scheisse costs more than a 7900 XT 20GB, a card that is 22% faster per TPU benchmarks.
 
Joined
Apr 14, 2018
Messages
863 (0.34/day)
Man, in which country is that the case?

The "slightly" part.

In Germany, a freaking 12GB piece of 4070 scheisse costs more than a 7900 XT 20GB, a card that is 22% faster per TPU benchmarks.

In regards to MSRP, yes. We all know Nvidia doesn't do price reductions unless it's a Super rebrand. AMD prices, on the other hand, will see discounts after the initial release, improving value, which seems to be the case in your country.
 
Joined
Apr 30, 2020
Messages
1,076 (0.61/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200MHz 32GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+ 1600W
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
Then there's just how poorly the shader core count seems to scale. A 5090 is twice the GPU as a 5080, but it's not twice as fast, use a lot of power and that brings a lot of issues with it. Then there's TSMC asking a new born in exchange for a bleeding edge wafer, I can understand why ML is being seriously considered to improving the visuals.
The ROPs in the RTX 5090 are the same as in the RTX 4090. At some point the CPU cannot push enough data to large GPUs. I really think Nvidia didn't add more ROPs because they already knew the RTX 4090 was at the limit of being CPU-bound. Adding shaders while stifling any ROP increase, and tuning the RT cores for lower latency in the pipeline, is about all Nvidia has done for Blackwell. Normally, with a 50% increase in shaders you end up with about a 33% increase in performance. All Nvidia really achieved in Blackwell was a minor increase of the usual 33% to 35%; it's not really impressive at all unless you take the ROP limit into account.

Both AMD and Nvidia have decided to limit themselves in specific areas of the GPU to drive innovation on both sides.
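To put that rule of thumb in numbers, a crude power-law sketch (my own simplification; it ignores ROPs, bandwidth and clocks entirely):

```python
import math

# Crude power-law model: perf ~ shaders^alpha, fitted to the rule of thumb
# "+50% shaders -> +33% perf". Ignores ROPs, bandwidth, clocks, CPU limits.
alpha = math.log(1.33) / math.log(1.50)
print(round(alpha, 2))         # ~0.70

# Apply to a hypothetical part with 2x the shaders (5090 vs 5080 style):
print(round(2.0 ** alpha, 2))  # ~1.63x -> nowhere near 2x faster
```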
 

SomeUnknownUser

New Member
Joined
Feb 24, 2025
Messages
2 (2.00/day)
The ROPs in the RTX 5090 are the same as in the RTX 4090. At some point the CPU cannot push enough data to large GPUs. I really think Nvidia didn't add more ROPs because they already knew the RTX 4090 was at the limit of being CPU-bound. Adding shaders while stifling any ROP increase, and tuning the RT cores for lower latency in the pipeline, is about all Nvidia has done for Blackwell. Normally, with a 50% increase in shaders you end up with about a 33% increase in performance. All Nvidia really achieved in Blackwell was a minor increase of the usual 33% to 35%; it's not really impressive at all unless you take the ROP limit into account.

Both AMD and Nvidia have decided to limit themselves in specific areas of the GPU to drive innovation on both sides.
To clarify: the RTX 5090 uses the same number of ROPs as the RTX 4090. High-end GPUs face more CPU bottlenecking, which likely played a role in Nvidia's decision not to increase the ROP count. They went for more shaders and extensive optimization of the RT cores so as to lessen latency in pipeline operations.
While a 50% increase in shader count would normally yield about a 33% increase in performance, Blackwell managed a small improvement to about 35%. To someone less technically savvy this may at first come off as unimpressive; however, the overall balance and efficiency of the GPU design must also be taken into account.
Both AMD and Nvidia have strategically limited certain aspects of their GPUs in order to spur innovation and give themselves a competitive edge, directing resources to where the GPU needs improvement, such as AI and generative computing, while still delivering high-performance graphics.
 
Joined
Dec 12, 2016
Messages
2,320 (0.77/day)
9070 (not the XT) is faster than the 5070 and more energy efficient.
But no halo card this time so no one will buy it. AMD knows this so I wonder why they didn't bother.

Plus I always have a laugh watching poor people with $600 RTX 4070s act like $620 7900 XTs are slow. In 8/10 games you're the slowpoke.
Intel is also selling GPUs without a halo product. I think they are moving some volume.
 
Joined
Apr 30, 2020
Messages
1,076 (0.61/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200MHz 32GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+ 1600W
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
To clarify: the RTX 5090 uses the same number of ROPs as the RTX 4090. High-end GPUs face more CPU bottlenecking, which likely played a role in Nvidia's decision not to increase the ROP count. They went for more shaders and extensive optimization of the RT cores so as to lessen latency in pipeline operations.
While a 50% increase in shader count would normally yield about a 33% increase in performance, Blackwell managed a small improvement to about 35%. To someone less technically savvy this may at first come off as unimpressive; however, the overall balance and efficiency of the GPU design must also be taken into account.
Both AMD and Nvidia have strategically limited certain aspects of their GPUs in order to spur innovation and give themselves a competitive edge, directing resources to where the GPU needs improvement, such as AI and generative computing, while still delivering high-performance graphics.

The only issue with that is that architecture changes/redesigns have historically produced bigger performance increases with only slightly more changes. This looks like a redesign mainly aimed at increasing AI performance, yet Blackwell is sometimes slower than Ada Lovelace in specific AI workloads.
 
Joined
Jan 14, 2019
Messages
14,566 (6.52/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
AMD is set to release the Radeon RX 9070 Series shortly, but it probably won't match the performance of the RTX 5070 Ti.
I didn't think this snarky comment in the 5070 Ti reviews would age so poorly. I hope real review data will confirm.
 
Joined
May 13, 2008
Messages
937 (0.15/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
What if the 9070 XT overclocks from 3.15GHz to 3.5GHz?

Likely locked down, which is fine because of the bandwidth limitation (and perhaps binning for a higher-end part).
This is partially to keep prices low, partially not to give away the farm. Part of it truly is to compete with nVIDIA's stack, where perhaps they would have done things differently at one point.

That doesn't make them bad, especially for the price. Just, you know, don't expect them to be great forever at what they currently may be able to do 'just barely', for reasons I have explained before.
This is no different than a lot of nVIDIA parts, but cheaper (because of the sensible use of cheap RAM, etc). Some things just don't line up very well right now, but will on 3nm, as I've again said countless times.
Many things I've postulated are pretty much coming true, even looking at this. 3244MHz (~7508sp raster utilization bc 64 ROPs?) would be bandwidth-limited at a little under 22 Gbps GDDR6... iow, it fits a RAM OC.
Now watch as the 9070 is limited to around 3100MHz (power-limited) or so, maybe less, which (with 7168sp) matches the stock 20 Gbps GDDR6... because something like that would make sense.
Hooray, 10% (or more) product segmentation that doesn't need to be there, but w/e! That's just how it is, IG. Marketing FTW. Also, yeah, clearly they want to make the 9070 cheap and compete with the 5070 in perf (at 5060 Ti pricing?).
I dunno; the kicker is the 16GB instead of mangling the bus down to 12GB. It's cool they gave it, but it might hike the price up a little. Maybe they binned them some way I'm not considering; all possible.

Again, with 8192sp, using all the compute would require 24 Gbps at ~3264MHz (by my math; others may vary slightly). Weird where this clocks... weird. Obviously pure raster requires less and could clock higher.
How high? Well... if using the same bandwidth calc as the 9070 XT, about the same as I expect these chips to clock without any power limits (on average). Huh. Again, very weird. Except, you know, not... bc logical.
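Putting my back-of-envelope model in one place so people can check the math (all of it my speculation, not confirmed specs):

```python
# My model: required memory data rate scales with (shaders x core clock),
# calibrated at "8192sp @ ~3264MHz needs 24 Gbps". Speculation, not specs.
BASE_SP, BASE_MHZ, BASE_GBPS = 8192, 3264, 24.0

def needed_gbps(shaders, core_mhz):
    """Data rate (Gbps per pin) that keeps this config fed, per the model."""
    return BASE_GBPS * (shaders * core_mhz) / (BASE_SP * BASE_MHZ)

print(round(needed_gbps(7508, 3244), 1))  # ~21.9 -> "a little under 22 Gbps"
print(round(needed_gbps(7168, 3100), 1))  # ~20.0 -> stock 20 Gbps GDDR6
# For scale: a 256-bit bus at 20 Gbps moves 256 / 8 * 20 = 640 GB/s.
```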

They 'haven't tested the 5070 Ti' because it's not its competition; that's the point. I mean, it is... but it isn't. I chuckled waiting to see if anyone else would catch that (they didn't, afaik).

C'mon, peeps, be logical.

Do you truly believe they don't have a 5070 Ti? Don't you think it's more likely they expected to lose (against certain cards with a certain config)...
...but are trying to match them where they can now (for blowout perf/$, if not higher ASPs than planned)? I mean, I get it... I don't want to explain it, and perhaps should less in the future, but I get it.

I truly do find all this amusing, fwiw. Obviously products are fluid and chips can be sliced up different ways, but this is one of the most fascinating I've seen develop publicly before our very eyes.

Mostly bc nVIDIA is ripping people off, and AMD is finding ways to capitalize on that with their always-planned, logical iterations.
I wouldn't doubt if originally these were 20/24 Gbps 16GB products... that lost to the 5070 Ti/5080... but still had to be priced against the 5070.

But now... we'll have to see what happens with 24 Gbps RAM. 16GB or 32GB? Maybe they expected to have to compete with the Ti with a higher-end 16GB product, but now can with the lower-end one, and hence why so.

Will they then try to match the 5070 Ti in price, >16GB against all current 16GB GB203 products (with a 32GB model)? Hmm... makes sense, doesn't it?

What people *really* want is a 16GB 24 Gbps model at 5070 pricing, without clock/power limits, and they are purposely avoiding that, but with 32GB they might be able to up-sell to 5070 Ti pricing against a 5080.
Ofc, nVIDIA will likely release a 24GB model that more definitely wins the fight where AMD could only *almost* compete with a 16GB model (even if 16GB themselves), but that's not the point.

The point is popcorn.



You really shouldn't ignore people; that's not very polite, even if you disagree. When they say things you don't like, and then are kinda-sorta-pretty-much right, you could've been prepared for that possibility.

But hey, if you are like that, or follow people like that, you do you. I just think it's unhealthy. Sometimes you have to listen to other people, even eventually take the L, and no, it's not fun when it happens.
I just do not support people living in their bubble of affirmation predicated on marketing... Expand your horizons; it promotes greater thought and independence. Sometimes you may be right/have a good idea.
Sometimes, and I am no different, you get it wrong, but at least you gave people something to think about and/or helped them understand that some people think/feel that way. If you know differently, have the conversation.
Sometimes, especially when math/history is involved and you're familiar with it, and you can prove why, you get it right.
Sometimes you still get it wrong (and/or forget things). That's human. Nobody should be ashamed of that; when people feel that way they often contribute less, and then one group controls the narrative.
And that group is not always representative of all people and/or always factually correct. Most of the time they're trying to sell you something, and/or get something from it. I am not.

Sometimes, and maybe this part is just me, one writes and/or explains too damn much. It's just... in this atmosphere... with so many people blindly following and questioning nothing... I have to do it sometimes.

Because I care, not because I need to be right. I just want the ideas out there and the needed sentiment to be represented. That can be extremely difficult, and I'm not perfect; always learning/adapting.

:)
 
Joined
Jan 14, 2019
Messages
14,566 (6.52/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
They haven't tested the 5070 Ti because it's not its competition. I truly do find all this amusing, fwiw. Everything I've explained is pretty much coming true, even looking at this.
They probably didn't test against anything Nvidia to keep it professional. Testing against their own outgoing product is the right thing to do; this is what Nvidia does as well, by the way (even if with MFG-skewed data, but that's beside the point here).
 
Joined
Dec 1, 2022
Messages
497 (0.61/day)
42% using RT

37% without RT

7900 GRE avg FPS = 59.3; +37% ≈ 81 FPS avg.
Slower than the 4080 in 4K
The 9070 XT isn't a high-end card, so I'm not expecting amazing 4K performance out of it; 37% faster is still decent, IMO.
AMD just needs to get the pricing right. If they price the XT at $649 and make sure it has plenty of stock, they have a chance at lots of sales, because anything from Nvidia is overpriced or unobtainable.
 
Joined
Dec 17, 2024
Messages
106 (1.51/day)
Location
CO
System Name Zen 3 Daily Rig
Processor AMD Ryzen 9 5900X with Optimus Foundation block
Motherboard ASUS Crosshair VIII Dark Hero
Cooling Hardware Labs 360GTX and 360GTS custom loop, Aquacomputer HighFlow NEXT, Aquacomputer Octo
Memory G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14)
Video Card(s) Nvidia RTX 3080 Ti Founders Edition with Alphacool Eisblock
Storage x2 Samsung 970 Evo Plus 2TB, Crucial MX500 1TB
Display(s) LG 42" C4 OLED
Case Lian Li O11 Dynamic
Power Supply be Quiet! Straight Power 12 1500W
Mouse Corsair Scimitar RGB Elite Wireless
Keyboard Keychron Q1 Pro
Software Windows 11 Pro
The 9070 XT isn't a high-end card, so I'm not expecting amazing 4K performance out of it; 37% faster is still decent, IMO.
AMD just needs to get the pricing right. If they price the XT at $649 and make sure it has plenty of stock, they have a chance at lots of sales, because anything from Nvidia is overpriced or unobtainable.
$649 for what they're comparing to the 7900 GRE from last gen (a $549 MSRP card) would not be accepted well by the market, imo. They gotta price it better than that. The 7800 XT was a $499 card, and they chose "70" branding for this new one. I think $649 would just be seen as slotting into Nvidia's bracket, just not as high, as opposed to the market disruption they really need if they want to make a statement and maybe start gaining back some market share. I'm afraid all $649 will appeal to are those already inclined to buy AMD in the first place. I really don't think that will be aggressive enough. Sure, there's MSRP and then the real price of the 5070 Ti, so I kind of get it, but if the idea is to disrupt and reset what mainstream is, I don't think $649 is gonna wow anyone.

Just my read, but I could be wrong. I have a 3080 Ti and game at 4K. My biggest gripe right now is VRAM. I realize the 9070 XT isn't "high end", but it should still be an improvement at 4K over my card, and it gets me 16GB. If priced right, I'd get it to tide me over till next gen. I would have considered a 5080 if it were 24GB, and the 5070 Ti at $900 can go fly a kite. If the RT has improved as much as they claim and the rumors suggest, and if FSR4 ends up being good, then I don't see why not.
 
Joined
Jan 14, 2019
Messages
14,566 (6.52/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
$649 for what they're comparing to the 7900 GRE from last gen (a $549 MSRP card) would not be accepted well by the market, imo. They gotta price it better than that. The 7800 XT was a $499 card, and they chose "70" branding for this new one. I think $649 would just be seen as slotting into Nvidia's bracket, just not as high, as opposed to the market disruption they really need if they want to make a statement and maybe start gaining back some market share. I'm afraid all $649 will appeal to are those already inclined to buy AMD in the first place. I really don't think that will be aggressive enough. Sure, there's MSRP and then the real price of the 5070 Ti, so I kind of get it, but if the idea is to disrupt and reset what mainstream is, I don't think $649 is gonna wow anyone.
That's an 18% higher price for 38% higher 1440p and 42% higher 4K performance. Model numbers are irrelevant.
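For the record, using both cards' launch MSRPs and AMD's claimed uplifts (which still need review confirmation):

```python
# 9070 XT at $649 vs the 7900 GRE's $549 launch MSRP, with AMD's claimed uplifts.
price_ratio = 649 / 549
print(round((price_ratio - 1) * 100))      # 18 -> ~18% higher price

for uplift in (1.38, 1.42):                # claimed 1440p / 4K gains
    print(round(uplift / price_ratio, 2))  # 1.17, 1.2 -> perf per dollar still improves
```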
 