> Sparkle A750 is $189 right now on Amazon. If someone's going Intel, that's probably the better option in the US anyway.

For the purpose of AV1 encoding, that's still a lot of extra money over the A380, which is going for $100.
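For context, a minimal sketch of what that AV1-encoding use case looks like in practice: driving ffmpeg's Intel QSV AV1 encoder from a small Python script. This assumes an ffmpeg build with `av1_qsv` available (check `ffmpeg -encoders`); the folder name is just a placeholder.

```python
# Sketch: batch-encode recordings to AV1 on an Intel Arc card through ffmpeg's
# QSV path. Assumes an ffmpeg build that exposes the av1_qsv encoder and a
# "recordings" folder of MP4s -- both are assumptions for illustration.
import subprocess
from pathlib import Path

def encode_av1_qsv(src: Path, dst: Path, quality: int = 25) -> None:
    """Transcode one file to AV1 using Intel's hardware encoder."""
    cmd = [
        "ffmpeg", "-y",
        "-i", str(src),
        "-c:v", "av1_qsv",                 # Intel QSV AV1 hardware encoder
        "-global_quality", str(quality),   # lower value = higher quality, larger file
        "-c:a", "copy",                    # pass audio through untouched
        str(dst),
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for clip in Path("recordings").glob("*.mp4"):
        encode_av1_qsv(clip, clip.with_name(clip.stem + "_av1.mkv"))
```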
> Arc A770 $240

Where? Where can I buy an Intel Arc A770 16GB for $240?
| System Specs | |
|---|---|
| Processor | Ryzen 7 5700X |
| Memory | 48 GB |
| Video Card(s) | RTX 4080 |
| Storage | 2x HDD RAID 1, 3x M.2 NVMe |
| Display(s) | 30" 2560x1600 + 19" 1280x1024 |
| Software | Windows 10 64-bit |
| System Specs | |
|---|---|
| System Name | 4K-gaming / console |
| Processor | AMD Ryzen 7 5800X / Intel Core i7-6700K |
| Motherboard | Asus ROG Crosshair VII Hero / Asus Z170-K |
| Cooling | Alphacool Eisbaer 360 / Alphacool Eisbaer 240 |
| Memory | 32GB DDR4-3466 / 16GB DDR4-3000 |
| Video Card(s) | Asus RTX 3080 TUF OC / Powercolor RX 6700 XT |
| Storage | 3.5TB of SSDs / several small SSDs |
| Display(s) | Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS |
| Case | Corsair 4000D AF White / DeepCool CC560 WH |
| Audio Device(s) | Sony WH-CN720N |
| Power Supply | EVGA G2 750W / Fractal ION Gold 550W |
| Mouse | Logitech MX518 / Logitech G400s |
| Keyboard | Roccat Vulcan 121 AIMO / NOS C450 Mini Pro |
| VR HMD | Oculus Rift CV1 |
| Software | Windows 11 Pro / Windows 11 Pro |
| Benchmark Scores | They run Crysis |
| System Specs | |
|---|---|
| Processor | Intel i5-12600K |
| Motherboard | Asus H670 TUF |
| Cooling | Arctic Freezer 34 |
| Memory | 2x16GB DDR4-3600 G.Skill Ripjaws V |
| Video Card(s) | EVGA GTX 1060 SC |
| Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
| Display(s) | Dell U3219Q + HP ZR24w |
| Case | Raijintek Thetis |
| Audio Device(s) | Audioquest Dragonfly Red :D |
| Power Supply | Seasonic 620W M12 |
| Mouse | Logitech G502 Proteus Core |
| Keyboard | G.Skill KM780R |
| Software | Arch Linux + Win10 |
Remember when ATI cards used to have red PCBs?

A pretty "meh" release. Not good but not bad either.
+ Pretty good price/performance
- Too late for the market. I thought Intel was going to release the 300/500/700 series at the same time.

And those blue PCIe connectors actually look pretty damn good.
> Remember when ATI cards used to have red PCBs?

Hell yeah I do. Since the Radeon 9700 days.
> Why is the 6600 (vanilla) missing from the efficiency chart?

RX 6600 non-XT is not part of my standard comparison cards. For the recent RX 7600 release I've retested its performance, so we know where it stands today in comparison to other cards. This data is useful for this review, too, so I've included it. I didn't retest power because the card wasn't that much of a competitor to RX 7600 XT at the time. For A580 it is interesting data, so I've retested RX 6600 and updated the power charts in both reviews.
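For anyone wondering what goes into an efficiency chart like that, it boils down to average FPS divided by measured gaming power draw, normalized to a reference card. A rough sketch with placeholder numbers (not review data):

```python
# Rough sketch of how a performance-per-watt ("efficiency") chart is built:
# average FPS divided by measured gaming power draw, normalized to one card.
# All numbers below are made-up placeholders, NOT data from the review.
cards = {
    "RX 6600":  {"avg_fps": 60.0, "gaming_watts": 130.0},
    "RX 7600":  {"avg_fps": 70.0, "gaming_watts": 160.0},
    "Arc A580": {"avg_fps": 55.0, "gaming_watts": 185.0},
}

baseline = "RX 7600"
base_eff = cards[baseline]["avg_fps"] / cards[baseline]["gaming_watts"]

for name, d in sorted(cards.items(),
                      key=lambda kv: kv[1]["avg_fps"] / kv[1]["gaming_watts"],
                      reverse=True):
    eff = d["avg_fps"] / d["gaming_watts"]
    print(f"{name:9}  {eff:.3f} FPS/W  ({eff / base_eff:.0%} of {baseline})")
```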
> There's no strategy of AMD with "follow Nvidia -10% pricing".

My argument was that AMD's strategy is not aggressive on pricing anymore. Keep observing the market; it averages out surprisingly close to 10%. I 100% agree with you that many performance aspects are considerably better in the matchups you mentioned; I was focusing on pricing strategy only.
> You keep stressing DLSS 3 (Frame Generation).

I enable it in every single game that supports it, and would rather turn off DLSS upscaling because I can't stand the loss of high-contrast detail.
> The card comes with a nifty RGB-lighting temperature monitoring feature that doesn't require any additional software.

That is very useful! I like that. Sparkle makes good cards as a general rule, and I really like the looks of this one. Very good visual styling!
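For anyone wondering how such a feature works in principle, it's just a firmware-side mapping from GPU temperature to LED color, along the lines of this purely illustrative sketch (thresholds and colors are made up, not Sparkle's actual values):

```python
# Illustrative only: the kind of temperature-to-color mapping a
# "no software needed" RGB temperature indicator implements in firmware.
# Thresholds and colors here are invented for the example.
def temp_to_rgb(gpu_temp_c: float) -> tuple[int, int, int]:
    """Map a GPU temperature to an (R, G, B) LED color."""
    if gpu_temp_c < 50:
        return (0, 0, 255)      # cool: blue
    if gpu_temp_c < 70:
        return (0, 255, 0)      # normal load: green
    if gpu_temp_c < 85:
        return (255, 165, 0)    # warm: orange
    return (255, 0, 0)          # hot: red

for t in (35, 62, 78, 92):
    print(t, "C ->", temp_to_rgb(t))
```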
> RX 6600 non-XT is not part of my standard comparison cards.

Nor should it be. The performance differences between the XT and non-XT versions of the 6600 are not worth all the extra effort that would be needed.
> In general DLSS is important, DLSS 3 isn't (frame generation is pretty rarely needed for fps and lowers image quality).

You just contradicted yourself. DLSS 3 is a progression and evolution of DLSS.
> RX 6600 non-XT is not part of my standard comparison cards.

That is too bad; it's AMD's most-sold card and one of the most-sold cards in general.
> My argument was that AMD's strategy is not aggressive on pricing anymore.

I don't agree: the 7900 XTX is faster than the 4080 while costing $200 less; that's pretty aggressive. You obviously can't give it away at a loss; I hope you're not arguing like those people who don't understand what cards cost to produce nowadays. The 7800 XT is nearly as fast as the 4070 Ti with more VRAM, and can be aggressively overclocked to match it, while costing about $300 less. Aggressive enough. Hard disagree.
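To put rough numbers on that, using launch MSRPs from memory (my assumptions, not sourced figures), both matchups come out well past a 10% undercut:

```python
# Quick check of the "Nvidia minus ~10%" claim against the two matchups above.
# MSRPs are launch list prices from memory -- treat them as assumptions.
matchups = [
    # (AMD card, AMD price, Nvidia card, Nvidia price)
    ("RX 7900 XTX", 999, "RTX 4080",    1199),
    ("RX 7800 XT",  499, "RTX 4070 Ti",  799),
]

for amd, amd_price, nv, nv_price in matchups:
    discount = 1 - amd_price / nv_price
    print(f"{amd} ${amd_price} vs {nv} ${nv_price}: {discount:.0%} cheaper")
```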
> I enable it in every single game that supports it, and would rather turn off DLSS upscaling because I can't stand the loss of high-contrast detail.

FG isn't that good quality-wise, so I can't understand this (maybe it will be in the future). Regular DLSS is good quality-wise and often an improvement over native, while with FG you're lucky if you don't lose quality (or at least don't perceive it as a human being), or you straight up get artifacts and picture errors, as I observed myself while playing the prime frame-generation game, Cyberpunk 2077, which some people call an Nvidia tech demo nowadays. Another hard disagreement.
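For the fps side of that argument, the mechanics are simple: frame generation inserts roughly one generated frame between each pair of rendered frames, so the presented frame rate about doubles while the game still only samples input once per rendered frame. A toy illustration:

```python
# Toy illustration of what DLSS 3 frame generation does to the numbers:
# roughly one generated frame is inserted per rendered frame, so the
# presented frame rate about doubles while input is still sampled only
# once per rendered frame. Overheads are ignored for simplicity.
def with_frame_generation(rendered_fps: float) -> dict:
    presented_fps = rendered_fps * 2        # ~1 generated frame per rendered frame
    return {
        "rendered_fps": rendered_fps,
        "presented_fps": presented_fps,
        "presented_frame_time_ms": round(1000.0 / presented_fps, 2),
        "input_samples_per_second": rendered_fps,
    }

for fps in (40, 60, 120):
    print(with_frame_generation(fps))
```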
> You just contradicted yourself. DLSS 3 is a progression and evolution of DLSS.

Not really, and you failed to understand my point.
> Not really, and you failed to understand my point.

Yes, really. But please do explain further.