
Six-Year-Old GTX 1060 Beats Intel Arc A380, GeForce GTX 1630 and Radeon RX 6400, Wins TPU Popularity Contest

Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
A GTX 1080, or even more so a GTX 1080 Ti, is juuuuuuuuust fine for a 2022 build, not shabby at all. I think it will still be fine in 2023. The people who bought one at launch (and still have it) made a really good investment.

I disagree. Pascal owners love their cards for good reason, but I keep seeing people swearing by these ancient graphics cards as if they were special. You simply need to experience Ampere if you think that's something to write home about: an RTX 3050 is more than a match for a GTX 1070, maybe even a 1070 Ti, and that's in the games Pascal can still run decently (DX11).

I know I may sound harsh calling Pascal ancient, but it really is an architecture from 2016, six years ago now, more than half a decade. To put that into perspective, that's the year the iPhone 7 and the Galaxy S7 launched. See how those two mighty flagships compare to even a midrange phone like the Galaxy A33 5G today? Tech has moved forward so much, and I'm happy to acknowledge Pascal's age even if some people aren't willing to. Then again, many would defend using Windows 7 in the current year, and I think that's completely :kookoo: myself.

At the end of the day, you know what's best for your own needs, and I can respect that. But I personally won't recommend a Pascal graphics card for a modern build.
 
Joined
Feb 20, 2019
Messages
8,302 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I disagree. Pascal owners love their cards for good reason, but I keep seeing people swearing by these ancient graphics cards as if they were special. You simply need to experience Ampere if you think that's something to write home about: an RTX 3050 is more than a match for a GTX 1070, maybe even a 1070 Ti, and that's in the games Pascal can still run decently (DX11).
I just sold a GTX 1080 for £200. It's better than anything else you can buy for £200 right now, so as far as I'm concerned the buyer got a good deal. 3050 is almost 50% more than that and 3060 is almost double.

If it were me buying a new card right now I'd be grabbing a new RX 6600 8GB; you can find them in the UK for £250 and they're far and away the best performance/£ of anything new right now. They're not 25% faster than a 1080 on average though, so a £200 GTX 1080 is still a reasonable buy if you either don't have a £250 budget, or if you're only considering Nvidia cards (perhaps you want the drivers, the encoder, CUDA support, whatever...).
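To put rough numbers on that perf-per-£ point, here's a quick sketch. The prices are the ones quoted above; the relative-performance figure is just a placeholder for illustration, not a measured TPU result.

```cpp
// Quick perf/£ sanity check. Prices are from the post above; the relative
// performance figure is a hypothetical placeholder, not a measured result.
#include <cstdio>

int main() {
    const double price_1080 = 200.0;  // used GTX 1080, GBP
    const double price_6600 = 250.0;  // new RX 6600 8GB, GBP

    // Placeholder assumption: the RX 6600 averages ~15% faster than the GTX 1080.
    const double perf_1080 = 1.00;
    const double perf_6600 = 1.15;

    std::printf("GTX 1080: %.4f perf per GBP\n", perf_1080 / price_1080);
    std::printf("RX 6600 : %.4f perf per GBP\n", perf_6600 / price_6600);

    // The 6600 costs 250/200 = 1.25x as much, so it would need to be at least
    // 25% faster on average before it beats the used 1080 on perf/£.
    return 0;
}
```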
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
I just sold a GTX 1080 for £200. It's better than anything else you can buy for £200 right now, so as far as I'm concerned the buyer got a good deal. 3050 is almost 50% more than that and 3060 is almost double.

If it were me buying a new card right now I'd be grabbing a new RX 6600 8GB; you can find them in the UK for £250 and they're far and away the best performance/£ of anything new right now.

That may be so, but at the same time, the buyer probably isn't going after state-of-the-art technology, which is somewhat the point of my argument in favor of the Arc GPU. It's the one with the freshest technology in it, even if the drivers aren't ready yet. These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise behind both NV and AMD GPUs right now.

Other than that, both Ampere and RDNA 2 products are on the same list, and at this price point there are things I personally value more than raw power. But you do you, fellas; that's the beauty of the PC. At the end of the day, what makes it special is that you can tailor it to your own needs.
 
Joined
Feb 20, 2019
Messages
8,302 (3.93/day)
That may be so, but at the same time, the buyer probably isn't going after state-of-the-art technology, which is somewhat the point of my argument in favor of the Arc GPU. It's the one with the freshest technology in it, even if the drivers aren't ready yet. These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise behind both NV and AMD GPUs right now.

Other than that, both Ampere and RDNA 2 products are on the same list, and at this price point there are things I personally value more than raw power. But you do you, fellas; that's the beauty of the PC. At the end of the day, what makes it special is that you can tailor it to your own needs.
I agree with you about the freshest tech and feature set, but I don't think Intel's 1st-gen GPUs are the ones to buy for that. By the time their drivers are in a decent state, and by the time DX9 and DX11 performance is less relevant, this first-gen will already be retired/obsolete.

We know there are uncorrectable errors in the silicon itself (Intel admitted as much) and I believe that their second attempt next generation will be the one to seriously consider. They're still new at dGPUs and this first gen is full of mistakes and things that qualify as "trial and error" lessons.

We just have to hope that Intel persists with this loss-leader and makes it to a second generation. The fear is that it won't turn an immediate profit, and the stupid board of directors will just bow to shortsighted shareholder pressure and can the whole GPU lineup. Intel's expertise and in-house fabs could make it the #1 player in time, but it could easily take a decade for that to happen, if it happens at all. They have the money and they have the supply chain; all they need is commitment and persistence.
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
I agree with you about the freshest tech and feature set, but I don't think Intel's 1st-gen GPUs are the ones to buy for that. By the time their drivers are in a decent state, and by the time DX9 and DX11 performance is less relevant, this first-gen will already be retired/obsolete.

We know there are errors in the silicon itself (Intel admitted as much) and I believe that their second attempt next generation will be the one to seriously consider. They're still new at dGPUs and this first gen is full of mistakes and things that qualify as "trial and error" lessons.

Agreed, it is a bumpy start, but it is a start nonetheless. One must crawl before they can walk. I am sure many people thought the same when the GeForce 256 and the Radeon launched back in 1999-2000, or when unified shader cards launched with the G80 in 2006. New tech often comes with big changes, and change can understandably be scary.

RDNA 2 is by all means a polished and well maintained architecture, so I'd nudge people who want a more carefree experience towards it.

Those who seek raw performance will weigh their options and take whatever path they deem best (such as buying older, less efficient but powerful hardware, even if it's behind on features or approaching end of life), and then there are those like me: I'm excited by new technology, that matters more to me, and that's why I like to keep things fresh :)

I can't wait for RDNA 3, for example. I'm gonna have a field day with it.
 
Joined
Feb 20, 2020
Messages
9,340 (5.34/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
GPUs are only just now seeing price reductions, so repeating the Nvidia PR line of "just buy it" for RTX cards is insane.

I won't be buying a new GPU until I have a real need for one, so the 980 Ti / 1080 Ti / Titan Xp will keep doing their thing until they can't anymore, and I sure wouldn't waste any time or money on a 3050 or 3060, lol, that's just crazy talk, unless of course they came in a laptop; for a desktop that's just funny :laugh:
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
Hi,
GPUs are only just now seeing price reductions, so repeating the Nvidia PR line of "just buy it" for RTX cards is insane.

I won't be buying a new GPU until I have a real need for one, so the 980 Ti / 1080 Ti / Titan Xp will keep doing their thing until they can't anymore, and I sure wouldn't waste any time or money on a 3050 or 3060, lol, that's just crazy talk, unless of course they came in a laptop; for a desktop that's just funny :laugh:

Correct me if I'm wrong, but don't you use Windows 7? If your operating system doesn't support any of these newer graphics APIs, there is really nothing for you to see. It stands to reason that the 10 series would be fully capable of offering everything that the WDDM 1.1 model was designed to do.

My argument is by no means the same as Avram Piltch's infamous just buy it pitch, since we're talking about budget hardware. I fully realize that I'm playing devil's advocate.
 
Joined
Feb 20, 2020
Messages
9,340 (5.34/day)
Correct me if I'm wrong, but don't you use Windows 7? If your operating system doesn't support any of these newer graphics APIs, there is really nothing for you to see. It stands to reason that the 10 series would be fully capable of offering everything that the WDDM 1.1 model was designed to do.

My argument is by no means the same as Avram Piltch's infamous just buy it pitch, since we're talking about budget hardware. I fully realize that I'm playing devil's advocate.
Hi,
There are Win 7 drivers for the 20 & 30 series, so I'm not sure what your point is.
My point is that your statement is just wrong; think maybe the devil made you do it, it stands to reason :laugh:

I use 7, 10 and 11.
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
Hi,
There are Win 7 drivers for the 20 & 30 series, so I'm not sure what your point is.
My point is that your statement is just wrong; think maybe the devil made you do it, it stands to reason :laugh:

The point is that even if drivers exist (disregarding the fact that they're out of date), it doesn't mean you can use every feature available, because your OS isn't aware of what those features are or what they're for.

You being blissfully oblivious to what I'm talking about doesn't make me wrong, but I'll tell you this: just having multiplane overlay (MPO) support makes the upgrade worth it in my eyes. MPOs are unfortunately not supported on Pascal, though AMD does support them on Vega.
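For anyone who wants to check their own setup, here's a rough, untested sketch of asking DXGI whether the display path reports hardware overlay support. SupportsOverlays() on IDXGIOutput2 is the relevant call; whether it reports true depends on the GPU, the driver and the OS, and error handling is trimmed for brevity.

```cpp
// Minimal sketch (Windows 8.1+, MSVC): query the first output of the first
// adapter for hardware overlay (MPO) support via DXGI.
#include <dxgi1_3.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(factory.GetAddressOf()))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))  // first (primary) adapter
        return 1;

    ComPtr<IDXGIOutput> output;
    if (FAILED(adapter->EnumOutputs(0, &output)))     // first attached display
        return 1;

    ComPtr<IDXGIOutput2> output2;
    if (FAILED(output.As(&output2)))                  // needs the DXGI 1.3 runtime
        return 1;

    std::printf("Hardware overlays supported: %s\n",
                output2->SupportsOverlays() ? "yes" : "no");
    return 0;
}
```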
 
Joined
May 17, 2021
Messages
3,005 (2.32/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
1060 6gb and 1080ti will never die, iconic beasts
 
Joined
May 20, 2020
Messages
1,375 (0.83/day)
...These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise behind both NV and AMD GPUs right now...
I feel I have to reply to this: who has been the leading graphics chip producer (regardless of GPU size) for ages, with the biggest market share? Intel.
If they aren't able to produce stable drivers now, it's not likely they will any time soon.
If we look at Nvidia from the GeForce 256 onwards, they already had very stable drivers even with the Riva TNT2, and even more so with the GF1 (256). It's not that the GeForce 256 was anything new, just a continuation and upgrade of their architecture, and the same goes for all the higher chips (quite an achievement really; the key is probably discipline, which Intel lacks).
 
Joined
May 17, 2021
Messages
3,005 (2.32/day)
I feel I have to reply to this: who has been the leading graphics chip producer (regardless of GPU size) for ages, with the biggest market share? Intel.
If they aren't able to produce stable drivers now, it's not likely they will any time soon.
If we look at Nvidia from the GeForce 256 onwards, they already had very stable drivers even with the Riva TNT2, and even more so with the GF1 (256). It's not that the GeForce 256 was anything new, just a continuation and upgrade of their architecture, and the same goes for all the higher chips (quite an achievement really; the key is probably discipline, which Intel lacks).

Developing iGPUs is not the same thing.

"Our software release on our discrete graphics was clearly underperforming," said Gelsinger. "We thought that we would be able to leverage the integrated graphics software stack, and it was wholly inadequate for the performance levels, gaming compatibility, etc. that we needed. So we are not hitting our four million unit goal in the discrete graphics space, even as we are now catching up and getting better software releases."

The Riva was made at a different time, when everything was much simpler. And both Nvidia and especially AMD, even with decades of experience in high-performance drivers, still make the same spectacularly shitty drivers at times. You can easily find AMD releasing an updated driver for a single game that gives 30% more performance (and I remember cases of 40% and more in my time with AMD):

"you have an AMD GPU with a product name starting ‘Radeon RX 6…’ the driver should deliver the following performance improvements in these games:
  • World of Warcraft: Shadowlands – up to 30%
  • Assassin’s Creed Odyssey – up to 28%"
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
Developing iGPUs is not the same thing.

"Our software release on our discrete graphics was clearly underperforming," said Gelsinger. "We thought that we would be able to leverage the integrated graphics software stack, and it was wholly inadequate for the performance levels, gaming compatibility, etc. that we needed. So we are not hitting our four million unit goal in the discrete graphics space, even as we are now catching up and getting better software releases."

The Riva was made at a different time, when everything was much simpler. And both Nvidia and especially AMD, even with decades of experience in high-performance drivers, still make the same spectacularly shitty drivers at times. You can easily find AMD releasing an updated driver for a single game that gives 30% more performance (and I remember cases of 40% and more in my time with AMD):

"you have an AMD GPU with a product name starting ‘Radeon RX 6…’ the driver should deliver the following performance improvements in these games:
  • World of Warcraft: Shadowlands – up to 30%
  • Assassin’s Creed Odyssey – up to 28%"

Yup, and the biggest mistake with Arc is that whoever was in charge of the software development (folks blaming Raja again, he's like the boogeyman or something) thought they could leverage the existing driver code base from their latest-generation integrated graphics and work from there. Let's just say that wasn't such a great idea.
 
Joined
May 17, 2021
Messages
3,005 (2.32/day)
Yup, and the biggest mistake with Arc is that whoever was in charge of the software development (folks blaming Raja again, he's like the boogeyman or something) thought they could leverage the existing driver code base from their latest-generation integrated graphics and work from there. Let's just say that wasn't such a great idea.

"Thought they could" is a weird thing. What were they doing? I'm sure they've had prototypes for a long time now; I can't understand how any of this came as a surprise to Intel. Back in February they were promising us the moon.

The argument is that the iGPUs never received optimized drivers, so why the hell didn't they try that first, before jumping head first into making new GPUs with drivers they never got to optimize on the existing iGPUs? There's an insane amount of bad workmanship and leadership at Intel.
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
"Thought they could" is a weird thing. What were they doing? I'm sure they've had prototypes for a long time now; I can't understand how any of this came as a surprise to Intel. Back in February they were promising us the moon.

The argument is that the iGPUs never received optimized drivers, so why the hell didn't they try that first, before jumping head first into making new GPUs with drivers they never got to optimize on the existing iGPUs? There's an insane amount of bad workmanship and leadership at Intel.

I can totally imagine that this preposterous idea was imposed by the suits Gelsinger delegated to take care of the graphics division while he restructured the company.

I'm also fairly sure, and inclined to believe, that the engineers warned them, but the executives hadn't the faintest clue; they probably saw that their integrated graphics ran CS:GO and Dota and refused to allow development of an all-new stack, which is a multimillion-dollar investment. Except this time they couldn't just call in the Linux wiz kids they hire to maintain their open-source iGPU drivers, so the whole burden fell on the hardware itself.

At least Gelsinger is personally owning up to that mistake, and I strongly feel he has served Intel very well in his tenure as CEO thus far.
 
Joined
May 17, 2021
Messages
3,005 (2.32/day)
Maybe it's a culture thing inside Intel, but at my company we're encouraged to point out things like this that can be the difference between a lot of money won or lost, and we'd probably get a reward for it.
Intel has a massive problem inside.
 
Joined
Dec 25, 2020
Messages
6,833 (4.75/day)
Maybe it's a culture thing inside Intel, but at my company we're encouraged to point out things like this that can be the difference between a lot of money won or lost, and we'd probably get a reward for it.
Intel has a massive problem inside.

Most mega-corps have people in charge who shouldn't be; you know, the usual. But hopefully it's a learning experience.
 