Ergo, a classic case of "there are no bad products, only bad pricing".
> OMG, I think you're suffering from a mental derangement. I guess what you BELIEVE is more important than reality.

You were right.
System Name | Deneb |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | Asus ProArt X670E-Creator WiFi |
Cooling | CPU: Noctua D15. Case additional : ML140 Pro blue PWM (400-2000) x1, ML140 Pro RGB PWM (400-1200) x2 |
Memory | G.Skill Flare X5 DDR5-6000 EXPO 2x16GB |
Video Card(s) | GeForce RTX 4070 Asus Dual 12GB |
Storage | SSD Solidigm (SK Hynix) P44 Pro 2TB (Gen4). HDD WD80EAZZ (8TB, CMR) x2 in AMD RAID1 |
Display(s) | Asus VG27AQ 27" 1440p 165Hz ELMB Sync, Freesync/Gsync compatible |
Case | Fractal Design R6 Tempered Glass Black |
Audio Device(s) | Motherboard soundcard. Logitech Z623 2.1 THX speakers. |
Power Supply | Corsair RMx 2018 850W |
Mouse | Logitech G502 |
Keyboard | Filco Majestouch Convertible 2 (USB/BT) TKL Cherry blue |
Software | Windows 11 Pro |
Benchmark Scores | Cinebench R23 18440, Geekbench 6 CPU 2671/14823, GPU 181588, 3DMark Speedway 4723, Steel Nomad 4019 |
> Lots of people still have the Vega 64. Don't be so narrow-minded.

OK, OK, I was kinda joking, but I mean, there are even more people that still have a GTX 1080 (at least 10 times more, but whatever...), so why not include it in the comparison?
While true, most owners of a Vega 64 are doing more than just gaming with that card. And while the Steam survey can be useful, it is by no means an end-all-be-all indicator.

PS: The Vega 64 isn't even displayed in the Steam Hardware Survey:
> These utter clowns today put out a video in which they purported to show how terrible the 6500 XT is by pairing it with, uh, a 5600G, which is the only current CPU with PCIe 3.0.
> Because of course you spend more $$$ on less performance for an APU you know will slow down the GPU you just bought, which only works well on PCIe 4.0.
> Lol...
> The stated argument for this disingenuous choice of CPU was that the 5600G has 'hardware encoding'. You know, even though the i5-12400 is $25 cheaper, is faster, also has hardware encoding, and won't slow down your GPU for no reason whatsoever. But, um, yeah, just trust us, we're the expert tech tubers, not just making clickbait BS.
> Absolutely pathetic.
> And then all the clowns in the comments were like 'omigod, it lost to the 5500 XT in every test'. Well, no shit; try running the 5500 XT at PCIe 3.0 x4 for no reason whatsoever and see how it compares.
> So the argument they then give is 'buy a used RX 570 4GB for $188', because somehow a used card that was 17.5% slower, even against the deliberately crippled 6500 XT build, is better than a brand-new one for $230.

I watched it and wondered why they switched the processor. Then, in the first test, the top three cards all had the exact same FPS, and I realised they had deliberately CPU-bound the cards to rig the test. Luckily, YouTube has a bunch of people who bought the card and have posted videos of real-world performance, and it seems quite good; better than the review sites, anyway.
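The bandwidth gap behind the PCIe 3.0 vs 4.0 argument can be sanity-checked with some back-of-the-envelope math. A minimal sketch (the per-generation transfer rates and 128b/130b encoding are from the PCIe spec; the x4 link width is what makes the 6500 XT so sensitive to the slot's generation):

```python
# Rough usable-bandwidth math for a x4 card such as the 6500 XT.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    # Raw rate in GT/s and line-encoding efficiency per generation.
    rates = {3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
    gt_per_s, efficiency = rates[gen]
    # One transfer moves one bit per lane; divide by 8 for bytes.
    return gt_per_s * efficiency * lanes / 8

gen3_x4 = pcie_bandwidth_gbps(3, 4)  # ~3.94 GB/s
gen4_x4 = pcie_bandwidth_gbps(4, 4)  # ~7.88 GB/s
print(f"PCIe 3.0 x4: {gen3_x4:.2f} GB/s, PCIe 4.0 x4: {gen4_x4:.2f} GB/s")
```

In other words, dropping the card into a Gen3 slot halves its already narrow link, which is exactly the handicap being complained about.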
> Lmao @ these circus-worthy comments shafting HW Unboxed, one of the only reliable tech channels on that site alongside GamersNexus. No other channel makes higher-quality reviews. LTT's reviews are a complete joke, and all the others seem to follow in their footsteps of low-effort graphs. You still think they're biased for AMD, or biased for Intel, or whatever other company? Keep your fanboy dreams to yourself. They've appropriately shafted AMD, Nvidia, Intel, and any other company when they deserve it.
> The 6500 XT is a completely garbage product that shouldn't exist, no matter how you look at it. It should have been a mobile product, but AMD decided to slap it on a PCB and sell it as a desktop product, handicapping it to keep production cheap and selling it into the current hungry market for extra margins.
> Did their video mocking TechPowerUp, for thinking that having Nvidia boxes in their videos means Nvidia is sponsoring every video they make, strike a nerve of yours or what?

It's a monster of a card; nothing else is available that comes close to its performance at the price. AMD have produced a fantastic product; they have completely outmanoeuvred Nvidia and Intel.
EITHER WAY, what does all of this have to do with the original 3050 review?
> Lmao @ these circus-worthy comments shafting HW Unboxed, one of the only reliable tech channels on that site alongside GamersNexus. [...]

Lol wot?
> I don't like how triple-slot coolers are now so common on even sub-xx50 cards. I remember when, not too long ago, buying a dual-fan cooling solution for such a card was considered a waste of money, and that was when they were only two slots, lol.

The TDP of modern cards has increased significantly since then. The current xx50 line-up can no longer be fed exclusively off the PCIe bus.
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though.... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
> The TDP of modern cards has increased significantly since then. The current xx50 line-up can no longer be fed exclusively off the PCIe bus.

There are 3050 and 3050 Ti laptop chips (GA106) with TGPs of 35W, and 3060 laptop GPUs (GA104) with TGPs of 65W, so there is absolutely no reason why a 3050 could not be fed off a 75W PCIe slot alone; manufacturers are simply choosing not to make them.
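A quick sketch of the slot-power argument, using only the TGP figures quoted in the post (the 75 W figure is the standard limit a PCIe x16 slot supplies without auxiliary connectors):

```python
# Back-of-the-envelope check: do Ampere parts at laptop-class power
# limits fit within the 75 W a PCIe x16 slot provides on its own?
PCIE_SLOT_LIMIT_W = 75  # max power a card may draw from the slot alone

laptop_tgps_w = {
    "3050 laptop (GA106)": 35,
    "3050 Ti laptop (GA106)": 35,
    "3060 laptop (GA104)": 65,
}

for name, tgp_w in laptop_tgps_w.items():
    headroom_w = PCIE_SLOT_LIMIT_W - tgp_w
    print(f"{name}: {tgp_w} W TGP, {headroom_w} W of slot-power headroom")
```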
> There are 3050 and 3050Ti laptop chips (GA106) with TGPs of 35W, and 3060 laptop GPUs (GA104) with TGPs of 65W. [...]

The 50 series is supposed to be far from a 60 series and sold for around $129. The 3050 is stronger than usual relative to higher models because of the state of the GPU market. It would be crazy to limit it to 75W given that it was launched into a '60-type price point, or higher.
Admittedly, the 35W and 40W variants of GA106 in laptops are heavily downclocked, as low as 1100MHz, but at the more common 60W TGP they typically boost close to 1700MHz, which, although short of the desktop stock 1777MHz, results in performance that's very similar, if not within margin of error.
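As a rough sanity check on that "within margin of error" claim, assuming (optimistically) that performance scales linearly with core clock and ignoring memory bandwidth:

```python
# Hypothetical linear-with-clock estimate; real scaling is usually a bit
# worse, since memory bandwidth doesn't change with the TGP limit.
desktop_boost_mhz = 1777     # desktop 3050 stock boost, per the post
laptop_60w_boost_mhz = 1700  # typical laptop boost at 60 W TGP, per the post

relative_perf = laptop_60w_boost_mhz / desktop_boost_mhz
print(f"Upper-bound estimate: ~{relative_perf:.1%} of desktop performance")
```

That upper bound of roughly 96% of desktop clocks is consistent with the "very similar performance" claim.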
> The 50 series is supposed to be far from a 60 series and sold for around $129.

LOL, this isn't 2012.
The 650 Ti launched at $149 a decade ago, and that was widely reviewed as excellent performance-per-dollar at the time, with factory-overclocked models up to around $180 getting 'editor's choice' accolades or similar.
Even before ETH mining caused the first serious GPU shortage in 2017, the 900-series x50 representative was well above the $130 mark: a GTX 950 typically sold for $175, and base models with that nasty plastic blower had an MSRP of $159...
> The current xx50 line-up can no longer be fed exclusively off the PCIe bus.

Nonsense. If the A2000 can run on slot power, the 3050 can too.
> Yeah?

Sure, the 1050 was cut down too much and it sucked donkey balls, which is why it was less than half the price of the 1060. And the 1050 was $110 in 2016; a '50 Ti is not a '50.
However you look at it, the 3050 was launched into a market where it was going to sell for hundreds of dollars, far beyond the normal entry level where people might reasonably have some sort of potato of a PSU that can't cope with more than a dim light bulb's worth of power.