
AMD 7nm "Vega" by December, Not a Die-shrink of "Vega 10"

Joined
Jan 15, 2015
Messages
362 (0.10/day)
LOL, what?! You clearly have no understanding of cryptocurrencies.
That's possible but not at all illuminating.
AMD can't just "Make another Ethereum or Monero."
Why is that? Is there an international governing body that creates them, preventing all others from coming into being?
Furthermore, they don't need to because mining is still profitable for the right people.
Is Vega still sold out because of mining?
 
Last edited:
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
That's possible but not at all illuminating.

Why is that? Is there an international governing body that creates them, preventing all others from coming into being?

Is Vega still sold out because of mining?

What question do you actually want answered?

Do you want me to talk to you about the fundamentals of the Future of Money? Do you want to know Vega's true use cases? What?
 
Joined
Jan 15, 2015
Messages
362 (0.10/day)
Nope, I have sworn I won't ever buy Nvidia or Intel again. I sold my dual Xeon 2690v3 and dual 1080 Ti setup and got myself a 32-core Epyc and two watercooled Radeon Frontiers. Screw Nvidia and screw Intel; they won't ever get my money again. I'll go with whatever AMD brings to the table. For me, Intel and Nvidia have ceased to exist.
TechRadar just ran an article that suggested Navi will be for the "consoles" (low-end PCs masquerading as a separate platform) only. Or, "PC" gamers will get leftovers from a design that is targeted toward the "consoles".

https://www.techradar.com/news/amd-navi
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
TechRadar just ran an article that suggested Navi will be for the "consoles" (low-end PCs masquerading as a separate platform) only. Or, "PC" gamers will get leftovers from a design that is targeted toward the "consoles".

https://www.techradar.com/news/amd-navi

LOL your credibility is now officially dead.

The $500 XBX has a card that competes with a 1070 Ti in gaming, and yet you call it "low end".

Hey Genius - the PS4 has more-or-less a 7870, but that is the same arch as the 7970 GHz technically. It took the 780 Ti over a year later to beat that. All that we know is Navi is FINALLY built for gaming first - that's it. That should scare any Nvidiot, but that should make every gamer happy ;).
 
Joined
Jan 15, 2015
Messages
362 (0.10/day)
Delta compression? Both have it but Nvidia's has so far been considered (and measured) to be much more effective.
Bandwidth might be a problem but not severe enough to solve with a solution as expensive as that.

APU is just CPU and GPU on one die/package. Nothing really stops AMD from replacing Jaguar cores with Zen/+/2. AMD can easily do a Raven Ridge with larger GPU and very likely is doing exactly that.
Apparently, AMD is producing a Zen-based APU for the Chinese market that will have HBM. It will reportedly be deployed there in desktops and then in a Chinese-market console.

https://www.extremetech.com/gaming/274919-more-details-on-the-new-amd-powered-chinese-console

The main drawback, in terms of the standard PC gaming platform, is the RAM split:

Hruska said:
The same pool of RAM is being used for both CPU and GPU, with a likely 4GB subdivision.

The Vega tech that helps with lower VRAM will likely help with the VRAM being 4 GB but programs might have an issue with 4 GB of system RAM.
Nobody is going to be interested in the new, "hot" crypto that's easy to mine if nobody is willing to buy or sell it. Also, difficulty is not what's behind the bubble bursting, but the simple fact that a system based purely on gambling and BS claims of value isn't sustainable over time. In other words, inventing a new currency changes nothing. If anything, there's a glut of currencies, and they're not helping anything.
I heard the same things after Bitcoin. Ethereum became the next big thing, though. It doesn't seem at all certain to me that there won't be another "next big thing" in crypto. Having many existing coins also doesn't prevent that possibility. There are plenty of examples in tech where the market was saturated by players — and yet there were next big things. The NES. The PlayStation. The XBox. The greatest example is the IBM PC, which came onto a market with a lot of microcomputers already available. It had a big corporation behind it, which is why it succeeded. It wasn't its technical merits that made it sell. There were also enough well-known search engines, including metasearch engines, that plenty of people didn't predict Google.

Gambling has existed for a long time and there is a lot of money involved in it.

My point about Jaguar is that the reason it was used, and especially kept for a second iteration, is the artificial effect of a duopoly. As with a monopoly, only less severe, consumers get less product for their money. There are benefits to monopolization, but the overall picture is negative for consumers.
I feel so stupid for not knowing all this before; I really believed that each process node was the same for every foundry.
Truthiness abounds in the tech business.
 
Joined
Feb 3, 2017
Messages
3,810 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Apparently, AMD is producing a Zen-based APU for the Chinese market that will have HBM. It will reportedly be deployed there in desktops and then in a Chinese-market console.
https://www.extremetech.com/gaming/274919-more-details-on-the-new-amd-powered-chinese-console
The HBM part was misread in the initial batch of news; it is GDDR5. It is not that crucial though, as long as the memory is fast enough, whatever type it comes in.
The main drawback, in terms of the standard PC gaming platform, is the RAM split:
The Vega tech that helps with lower VRAM will likely help with the VRAM being 4 GB but programs might have an issue with 4 GB of system RAM.
The split itself should not be a problem; software these days can handle it dynamically enough. A minor caveat about the dynamic pool is that a given memory type may be more or less suitable for the CPU or the GPU. GDDR trades latency for bandwidth, for example.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Apparently, AMD is producing a Zen-based APU for the Chinese market that will have HBM. It will reportedly be deployed there in desktops and then in a Chinese-market console.

https://www.extremetech.com/gaming/274919-more-details-on-the-new-amd-powered-chinese-console

The main drawback, in terms of the standard PC gaming platform, is the RAM split:

The Vega tech that helps with lower VRAM will likely help with the VRAM being 4 GB but programs might have an issue with 4 GB of system RAM.
That article explicitly states that the console uses GDDR5, not HBM. As confirmed by the pictures, and every other publication covering it.

As with other APUs in Windows, it likely uses dynamically allocated shared memory (with a fixed base amount), so VRAM allocation is likely adjusted by need. This works perfectly fine.

I heard the same things after Bitcoin. Ethereum became the next big thing, though. It doesn't seem at all certain to me that there won't be another "next big thing" in crypto. Having many existing coins also doesn't prevent that possibility. There are plenty of examples in tech where the market was saturated by players — and yet there were next big things. The NES. The PlayStation. The XBox. The greatest example is the IBM PC, which came onto a market with a lot of microcomputers already available. It had a big corporation behind it, which is why it succeeded. It wasn't its technical merits that made it sell. There were also enough well-known search engines, including metasearch engines, that plenty of people didn't predict Google.

Gambling has existed for a long time and there is a lot of money involved in it.
Was there ever a time before when people were scared of buying (and desperate to sell) Bitcoin? I sure can't remember that. Skepticism, sure, but not the "run away" attitude seen today. The situation is fundamentally different. This of course doesn't mean that a new wave of crypto won't appear - the financial "industry" doesn't like to leave potential ways of generating money alone for long, even if they're currently terrified of it. But it will likely take some time.

My point about Jaguar is that the reason it was used, and especially kept for a second iteration, is the artificial effect of a duopoly. As with a monopoly, only less severe, consumers get less product for their money. There are benefits to monopolization, but the overall picture is negative for consumers.
That's a bit of a stretch. Of course, we could speculate that if the X86 CPU market wasn't a duopoly, there might have been an established low-power CPU arch available in 2011-2012 when this console generation was designed, but that's rather meaningless speculation.

As for the mid-gen refreshes (Pro and X), they both arrived too early to implement Zen - the design wasn't yet ready for the PS4 Pro, let alone tested and known to perform outside of AMD's labs. The One X arrived later, but still too early for an implementation like that (which requires the design to be very well tested and known good). Then there's the issue of dramatically increasing CPU power in games - how do you make games scale for CPU power across such radically different designs? This makes sense for a "new generation" (which is becoming an increasingly meaningless term in the age of X86 consoles, but still makes sense in terms of software development), but not for a mid-gen refresh - you'd end up with games only working on the refresh, pissing off all the people who bought the other console 1-4 years earlier. Consoles are expected to have 5-8-year life cycles, not ~3 like a PC.

Then there's the issue of die area and cost. The Scorpio Engine is a 359mm2 die (on TSMC 16FF), of which the 8 CPU cores make up a tiny fraction. For a console, this is huge. In comparison, an original Zen Zeppelin (8c) on GloFo 14nm is 213mm2. Of course that has components that could be removed in a console (such as the DDR4 controller, USB, SATA, and PCIe PHY), but adding 8 Zen cores would still balloon die size dramatically. The addition of L3 cache alone would grow the die noticeably. Even reducing it to a single CCX (44mm2 when excluding everything else) for 4c8t would still entail a noticeable size increase, not to mention the issue of patching the OS, games and apps to account for 4 fast and 4 slow threads. Then there's the licencing cost of Zen cores vs. Jaguar cores, which would likely be in the 5-10x range given how new the arch was. And $1000 consoles don't really exist - for a good reason, as they wouldn't sell. $500 consoles usually struggle. This has little to do with a duopoly, and much more to do with the realities of chip design, chip production and fab costs. The tech simply wasn't ready in time, and while an argument can be made that a higher power/IPC/clock arch with fewer cores would have been better for gaming in the short term, that's not the direction either of the big console makers went (and thanks to the 8-core designs, consoles have a lot of cool functionality that would have been impossible otherwise). This is also likely due to them seeing single core perf flatlining and wanting to prepare their developers for the multi-core future. IMO, that's sound long-term planning.
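The die-area argument above can be sketched with the figures quoted in this post (359 mm² Scorpio Engine, 44 mm² per 4-core Zen CCX). This is only a rough illustration: it ignores the area that removing the Jaguar clusters would free up, so it slightly overstates the growth.

```python
# Rough die-area sketch for grafting 8 Zen cores (two 4-core CCXes)
# onto the Scorpio Engine, using the figures quoted in the post above.
# It ignores the area freed by dropping the Jaguar clusters, so it
# overstates the growth somewhat -- an illustration, not a floorplan.
scorpio_mm2 = 359.0   # Scorpio Engine die, TSMC 16FF
zen_ccx_mm2 = 44.0    # one 4-core Zen CCX (cores + L3, excluding uncore)

with_zen_mm2 = scorpio_mm2 + 2 * zen_ccx_mm2                    # 447.0 mm^2
growth_pct = (with_zen_mm2 - scorpio_mm2) / scorpio_mm2 * 100   # ~24.5%
```

Even this generous estimate shows roughly a quarter more silicon, before accounting for the larger cooler, power delivery, and yield hit.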
 
Joined
Aug 6, 2017
Messages
7,412 (2.75/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
The $500 XBX has a card that competes with a 1070 Ti in gaming, and yet you call it "low end".
1070Ti is a 1080p/60 GPU now ?

XBX can run 3200x1800 at 30 fps or 1080p at 60, which rx570 can easily do at console quality.

1070Ti is faster than 10.5 TFlop Vega 56, and close to 12.5 TFlop Vega 64, while xbx has 6. Talk about losing credibility :rolleyes:
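For what it's worth, the two XBX modes mentioned above can be compared by raw pixel throughput. Per-pixel shading cost is not constant across resolutions, so treat this as a sanity check rather than a performance model:

```python
# Raw pixel throughput of the two Xbox One X modes mentioned above.
# Per-pixel work differs between resolutions, so this is only a
# coarse sanity check, not a performance model.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

mode_1800p_30 = pixels_per_second(3200, 1800, 30)  # 172,800,000 px/s
mode_1080p_60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 px/s
```

By this crude measure, the 1800p/30 mode actually pushes about 39% more pixels per second than 1080p/60.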
 
Last edited:
Joined
Feb 3, 2017
Messages
3,810 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Xbox One X's GPU is not comparable to GTX1070 or GTX1070Ti.
It is a bit larger than an RX 580 but runs at lower clocks, making it comparable to an RX 580 (or a GTX 1060 6GB from the other camp).
It has a wider memory bus and thus better bandwidth going for it, but at the same time that is a resource it needs to share with the CPU.

Strictly midrange.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
1070Ti is a 1080p/60 GPU now ?

XBX can run 3200x1800 at 30 fps or 1080p at 60, which rx570 can easily do at console quality.

1070Ti is faster than 10.5 TFlop Vega 56, and close to 12.5 TFlop Vega 64, while xbx has 6. Talk about losing credibility :rolleyes:

I am no fan of XBOX, but you are flat-out wrong. The 1070 isn't exactly some masterpiece of 4K gaming lol:

https://tpucdn.com/reviews/Performance_Analysis/Monster_Hunter_World/images/2160.png

^1800p@30 is about what the 1070 is capable of too (at best). Oh, and your TFLOP comparisons are hilarious - most people do not seem to understand that Nvidia reports its "TFLOPS" with the card running at its base clock. Thus an Nvidia card that by default boosts to 1800-2000 MHz is actually comparable in TFLOPS to its AMD counterparts. The 1080 Ti is really a 13-15 TFLOP card.
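Theoretical FP32 throughput is just 2 FLOPs per shader per clock (one fused multiply-add) times shader count times clock speed. A minimal sketch using the 1080 Ti's 3584 shaders; the ~1900 MHz figure is an assumption about typical sustained boost clocks, not an official spec:

```python
# Theoretical FP32 throughput: 2 FLOP/cycle (one fused multiply-add)
# per shader. The shader count and 1480 MHz base clock are the real
# 1080 Ti figures; 1900 MHz is an assumed typical in-game boost clock.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

at_base  = fp32_tflops(3584, 1480)   # ~10.6 TFLOPS at the base clock
at_boost = fp32_tflops(3584, 1900)   # ~13.6 TFLOPS at the assumed boost
```

The gap between the two results is exactly the point being argued: the quoted TFLOPS number depends entirely on which clock you plug in.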

Xbox One X's GPU is not comparable to GTX1070 or GTX1070Ti.

The Polaris-based XBX chip has a 384-bit bus and 10% more SPs. That makes it easily 30% better overall, and thus ahead of a 1070 (at least at 1440p-4K).
 
Joined
Aug 6, 2017
Messages
7,412 (2.75/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Lol, stunning lack of knowledge. A GTX 1080 can run consistently in the 40s in The Witcher 3 at the high preset at 5K; I tested it myself. And a 1070 can't even do 4K at 30?

I took the latest TPU review (of the RX 580 Mech) and calculated what a 1070 can do at 4K. It averages 41 fps across 21 games at native 2160p, PC Ultra quality. You mean that's equivalent to the XBX's 30 fps at 1800p, console quality?

Oh, and your TFLOP comparisons are hilarious - most people do not seem to understand that Nvidia reports their "TFLOPS" as the card running at its base clock. Thus a Nvidia card that by default boosts to 1800-2000MHz is actually comparable to the TFLOP's of their AMD counterparts. The 1080 Ti is really a 13-15 TFLOP card.
You don't seem to understand. I took the Nvidia equivalent of AMD's card for my comparison. V56 is 10.5 TFLOPs, the Xbox One X is 6. The 1070 Ti is slightly faster than V56, so it's not in the same range as the Xbox One X.
 
Last edited:
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
It is always hilarious watching opposing fanboys and their blanket assumptions.
The Xbox One X is not in the same class as a 1070, and it is not about performance. It is a console, and for new HLSL extensions the early-access pass comes via such a console. Notice that with all the bells and whistles featured on a PC-class graphics solution, there is an attached compromise that, at best, blurs the entire Z-fighting domain and, at worst, applies no filter to flickering at all.
We know Moore's Law keeps marching forward and that memory latency will keep growing, so the ideal approach is more complex filtering of the displayed pixels.
There are pointers you can pass to the pixel shader so it doesn't interpolate non-native texels that belong to different polygons - that should raise box-filtering efficiency to 4, instead of ¼ when discontinuities are churned together. Also, very ALU-costly bilateral reconstruction filters can harvest seamless gradients from noisy edges. These are some of the ways Moore's Law scales well with visual quality; otherwise, more pixels just mean more interpolation-ridden edges at 4x the bandwidth cost.
 
Joined
Jan 24, 2008
Messages
888 (0.14/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
Do you understand the concept of time and node size? Nvidia was light-years ahead of AMD with Maxwell.

Do you understand the concept of budget? The fact that AMD is already matching Intel's CPU performance with such a limited budget, after years of illegal price arrangements by Intel, is already amazing. You want them to keep up with Nvidia as well? I don't understand how people can have such unrealistic expectations.

Just wait for Ryzen 3 next year. If they can beat Intel and release a GPU that competes with the GTX 1080 Ti at lower power consumption and a lower price, I will be more than happy to upgrade my Ryzen CPU and move to Navi.
 
Joined
Aug 14, 2017
Messages
74 (0.03/day)
I can't believe AMD's endless BS. They take and take, and cheat, and I'd say lie...

AMD can't release anything before 2019. Or sure, it can, but it would only be the fourth version of Vega, and IF AMD tries to compete against even the GTX 1000 series it has to do HUGE work, and it also needs a lot of cash.

Most importantly, it needs engineering talent.

All of which AMD does NOT have.

A 7nm GPU? LOLOLOLOL.

If anyone releases a 7nm GPU it will surely be Nvidia or Intel. That's one big lie from AMD... or well, they could release it, but in late 2019, when Nvidia releases Ampere or whatever GPU comes AFTER Turing.

Even when AMD moves to a smaller node, its GPUs ALWAYS eat more power and still end up slower.

Hmm, I can't go on... just look at AMD's last three GPU lines: the 200 series, the Fury X and Vega.

All of them eat a LOT of power and are still slower, and the fact that AMD needs to watercool its fastest cards tells you one thing: junk!

I hate AMD because they cheat so much and try to sell junk to people.

AMD can't ever make an RTX 2080-level GPU.
 
Joined
Aug 6, 2017
Messages
7,412 (2.75/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Do you understand the concept of budget? The fact that AMD is already matching Intel's CPU performance with such a limited budget, after years of illegal price arrangements by Intel, is already amazing. You want them to keep up with Nvidia as well? I don't understand how people can have such unrealistic expectations.

Just wait for Ryzen 3 next year. If they can beat Intel and release a GPU that competes with the GTX 1080 Ti at lower power consumption and a lower price, I will be more than happy to upgrade my Ryzen CPU and move to Navi.
At this point I'm gonna say that if AMD is not paying you for your posts, you're getting screwed, because they damn well should.
Yes, I do understand AMD has something like 1/10th of Nvidia's budget - so what? No one can tell me to root for AMD when Nvidia has a better card. I'm not brand loyal; I'd buy a VIA CPU and a Matrox Parhelia GPU next week if they dropped something better than what Nvidia and Intel have. AMD reaped what they sowed, if you ask me: they tried to kill ten birds with one stone, while Nvidia went for heavy segmentation with a very focused approach to each market and new architectures. AMD tried to scale GCN for everything from small APUs through consoles, mid-range and enthusiast gaming, up to huge prosumer chips - and to profit from mining at the same time. They still did remarkably well, but high-end gaming proved too much; you can't have it all with one architecture.
 
Last edited:
Joined
Jan 13, 2018
Messages
6 (0.00/day)
And AMD, once again, leaves an entire segment to nvidia for a third generation in a row.

AMD should just sell Radeon at this point. They can do really well with GPUs or CPUs, but not both. Sell Radeon to somebody who can actually produce decent GPUs. Vega was a year late and many watts too high, and is about to be eclipsed by a new generation of GPUs from Nvidia.

I don't think you understand. Radeon hasn't been "enthusiast level" since the ATI days. Recall when ATI was whooping Nvidia's a$$, up until they sold the Radeon line, back during the Elder Scrolls Oblivion days.

Let's step back and look at what the AMD product line is, CPU- and GPU-wise. They are about the performance-to-dollar ratio, not cramming a bunch of next-gen silicon onto a die and selling it for top dollar. If Nvidia is selling a flagship card for nearly $599, then AMD is selling theirs for $399. It won't be faster, but it will have a sweeter performance-to-dollar ratio. AMD has always been this way, for as long as I've been using them, back to the Turion and Thunderbird days. They were never, ever faster than an Intel chip... not even the legendary Phenom II. They just, every once in a while, produce a legendary chip that rivals the competition at 2/3 the cost. I can't remember a time when one of these "magic silicons" was ever faster than a flagship Intel or Nvidia product. Unless you go back to the ATI GPU days, which always had a habit of trading blows with Nvidia, since before Nvidia was any good... back in the 3DFX days. AMD = budget-minded. They steal the business away by being attractive at that price point.
 
Joined
Aug 6, 2017
Messages
7,412 (2.75/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
You know why they're not selling their flagship as high as Nvidia does? You think they planned to release a bigger die with HBM2 at a lower price than the 1080 Ti? Stop selling that story; we've heard it over and over again. Radeon is so budget-friendly that Vega 64 sells at 3000 PLN here while a 1080 is 2200. I bought my 1080 new in 2016 for less than I'd pay for a V64 now.
 
Last edited:
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
At this point I'm gonna say that if AMD is not paying you for your posts, you're getting screwed, because they damn well should.
Yes, I do understand AMD has something like 1/10th of Nvidia's budget - so what? No one can tell me to root for AMD when Nvidia has a better card. I'm not brand loyal; I'd buy a VIA CPU and a Matrox Parhelia GPU next week if they dropped something better than what Nvidia and Intel have. AMD reaped what they sowed, if you ask me: they tried to kill ten birds with one stone, while Nvidia went for heavy segmentation with a very focused approach to each market and new architectures. AMD tried to scale GCN for everything from small APUs through consoles, mid-range and enthusiast gaming, up to huge prosumer chips - and to profit from mining at the same time. They still did remarkably well, but high-end gaming proved too much; you can't have it all with one architecture.
I don't quite know what you're talking about here - AMD's and Nvidia's GPU arch strategies have been very similar for quite a while. Nvidia uses its architectures for just as wide a range of products as AMD, if not wider - from self-driving car tech to the Switch and the Shield series (though those are based on old tech, for no reason other than cost and availability) to its entire range of GPUs for workstation, server, HPC, AI and gaming; they're all based on the same architecture. They don't segment any more or less than AMD, outside of historically disabling more FP64 features in their consumer parts than AMD does. The main difference is that Nvidia's R&D budget far outstrips AMD's, and always has (including in the ATI days).

As for your ideological choices, those are yours to make, but arguing that consumers have zero responsibility for the large-scale consequences of our aggregated purchases is a bit naive. At the very least, it entirely strips you of the right to complain when monopolies or near-monopolies drive up prices and make hardware impossible to afford for regular users. If you're only motivated by pure performance numbers, you're by default rooting for the company with the largest development resources, and as such promoting monopolistic market development. Again: this is your right, but you need to be aware of the systems your decisions play a part in. Brand loyalty is, in its pure form, a really dumb concept. We don't owe giant corporations anything. But when a market is dominated by a few large players, rooting for the underdog is good for everyone.

Don't think you understand. Radeon hasn't been "enthusiast level" since ATI days. Recall when ATI was whooping Nvidia's a$$ up until they sold the Radeon line, back during the Elder Scrolls Oblivion days.

Let's step back and look at what the AMD product line is, CPU and GPU-wise. They are about the performance to dollar ratio, not cramming a bunch of next gen silicon onto a die and selling it for top dollar. If Nvidia is selling a flagship card for nearly 599, then AMD is selling theirs for 399. It won't be faster, but it will be at a sweeter performance to dollar ratio. AMD has always been this, since I've been using them, back in the Turion and Thunderbird days. They were never, ever faster than an Intel chip...not even the legendary Phenom II. They just, every once in a while, produce a legendary chip that rivals the compeition at 2/3 the cost. I can't remember a time when one of these "magic silicons" were ever faster than a flashhip Intel, or Nviidia product. Unless you go back to the ATI GPU days, which always had a habit of trading blows with Nvidia, since before Nvidia was any good...back in the 3DFX days. AMD = budget minded. They steal the business away by being attractive at that price point level.
This isn't true. While you have a point in taking value into consideration ("who has the fastest CPU?" is a meaningless question if the winner costs $10,000), AMD has not only played the value card. My Fury X cost as much as a 980 Ti and roughly matched its performance (though it did include water cooling, which similarly priced 980 Tis did not - but that's hardly a value play). Vega cards aren't cheap either. The 7000-series GPUs were perhaps a bit cheaper than Nvidia's, but also for a time the fastest on the market, bar none. Polaris was a clear budget-segment play, and a very smart one at that.

Ryzen gave users more cores for less money, but wasn't really a "budget" option - remember, the 1800X was $499 at launch. Even if Intel's cheapest 8-core at the time cost far more, that's not a budget CPU by any stretch of the imagination. Prices have been cut as Intel has responded with more cores and lower prices, but Ryzen is not a budget alternative to Intel Core - it's an alternative. Period.

In short: you're oversimplifying things. AMD has, in the past three-ish years, executed an excellent strategy in turning around their CPU business with limited resources. This has left GPU development at a lower priority, with the focus on compute and the datacenter, where Vega excels and where margins are far higher than in gaming. Now that Zen is here, Zen 2 is close, and AMD is profitable again, it's likely they've been pouring some of that sweet Zen cash into developing Navi, and I'm hopeful Navi will come close to closing Nvidia's current efficiency advantage (which is necessary to reach performance parity - cooling more than ~250 watts in an AIC isn't really feasible), but we'll have to see. If AMD can deliver that with Navi, and thus compete in the high end, there's no doubt in my mind that they're going to try. Hopefully they won't rise to Nvidia's recent idiotic price levels ($1200 for a consumer-level GPU? Hell no.), but I'm definitely expecting a ~$700 follow-up to the Fury X and Vega 64.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
which rx570 can easily do at console quality.

Oh, "console quality" - let me guess, what is that exactly? This kind of stuff (run on a vanilla PS4, not even a Pro; the XBX is 2+ times faster than that):






Or perhaps you have mistaken normal consoles for Nintendo's handheld with Huang's "Shield" chip in it?



well, yeah, that one is not in the same league, but it's a portable, after all.
 
Joined
Aug 6, 2017
Messages
7,412 (2.75/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Yes, the first one. It looks good; 4K medium-high would look the same in a PC game. Also, is this even gameplay footage or a cutscene, and most importantly, what fps is it running at?

This is WD2 running at native 4K, high settings, at 60. The PS4 Pro can't do native 4K at 30; an RX 570 would do that easily, even more than 30, and a 1080 is not 2x faster.





I'm not saying there aren't gorgeous games for PS4; there are many. I'd like a PS4 Pro myself to play GoW or Uncharted. But if you take a well-optimized PC game, it'll run at a higher resolution and framerate than the PS4 one.





 
Last edited: