
Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

Joined
Aug 6, 2020
Messages
729 (0.46/day)
Over the course of the years, Nvidia has developed an impressive harmony between their hardware and their software. AMD did the same.
Intel has excellent hardware and proper innovation throughout their architectural history, yet lacks that software harmony. This is where they are making sure their Arc series makes a breakthrough, and why they have been delaying the release for the last 4+ months.
A common example of what I mean is the power consumption vs performance graphs. Intel graphics eat watts but produce lower numbers than Nvidia or AMD cards at the same power draw.
Intel has produced working GPUs by now and even showcased the Limited Edition, but they are still going through the phase of software-to-hardware harmony. They are working exceptionally hard on software/drivers to harness the power of their power-hungry hardware.
I believe the blue team will definitely make a difference in the market, especially given those hardware issues Nvidia and AMD cards have that leave users to throw them away and buy new ones. At least Intel hardware would be more robust, from a general point of view.
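The power-vs-performance point can be made concrete with a quick performance-per-watt calculation. A minimal sketch, with entirely made-up FPS and wattage figures (illustrative only, not measurements for any real card):

```python
# Hypothetical performance-per-watt comparison. The FPS and wattage numbers
# below are made up purely to illustrate the metric; they are not benchmark
# results for any real card.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

cards = {
    "Vendor A card": (120.0, 200.0),  # (average FPS, board power in W)
    "Vendor B card": (100.0, 230.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.3f} FPS/W")
```

FPS/W at matched settings is essentially how the power-draw charts mentioned above normalise efficiency across cards.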


The problem with this methodology: in all previous instances where there have been critical delays, the market has had only a single competitor to deal with.

AMD's delay of TeraScale by nearly a year could be covered by continued refreshes of the beefcake X1900 XTX, and NVIDIA survived the 8-month delay of Fermi with endless re-brands of G92 and the GT200 x2.

What happens to a delayed architecture when there are two other strong releases waiting on their doorstep? Will Intel even get a foothold, without massively discounted parts?
 
Last edited:
Joined
May 17, 2021
Messages
3,005 (2.32/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
No one really wants to downgrade, but something is better than nothing though. Also can only do what budget/time allow. I play Terraria at 4k with my RX 480 :)

That seems fair; better than playing Terraria at 4K on a 3090, and yes, those people exist, I know one (not Terraria, but point proven).
 
Joined
Nov 4, 2005
Messages
11,988 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Joined
Jun 4, 2021
Messages
24 (0.02/day)
Processor Intel Core i7-4790
Memory 32 GB
Video Card(s) NVIDIA GeForce RTX 2060
Software Manjaro Linux GNOME, Windows 11 Pro (with Debian Linux in WSL 2)
Now you see it, now you don't.

Classic vaporware.

From the article:
It's unclear if it was a working sample or just a mockup.
 
Last edited:
Joined
Jan 14, 2019
Messages
12,368 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Ohhh man you haven't tried Freelancer on a big 4K monitor. Generally I agree with you, but that game (and the texture mod) REALLY shines at higher resolution and 32".
That's an old game that doesn't need a 3080 even at 4K. But now that you mentioned it, I've got a light gaming capable HTPC and a 4K TV, so I guess it's time to try. :rolleyes:
 
D

Deleted member 24505

Guest
I have a 4K TV; not once have I tried running my games on it with my 1080 Ti. Happy with my 1440p monitor. If I ever get £2k+ to piss up a wall on a 3090/Ti, would I spend it just to try gaming at 4K? Would I f ck.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,600 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
That's an old game that doesn't need a 3080 even at 4K. But now that you mentioned it, I've got a light gaming capable HTPC and a 4K TV, so I guess it's time to try. :rolleyes:

Seriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.
I have a 4K TV; not once have I tried running my games on it with my 1080 Ti. Happy with my 1440p monitor. If I ever get £2k+ to piss up a wall on a 3090/Ti, would I spend it just to try gaming at 4K? Would I f ck.

It's not so much the resolution as the size. Sure, the programs/games that actually scale look good, but not "worth it" good. At this point in time I actually wouldn't mind a 42" 4K monitor.
 
Joined
Sep 17, 2014
Messages
22,499 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Over the course of the years, Nvidia has developed an impressive harmony between their hardware and their software. AMD did the same.
Intel has excellent hardware and proper innovation throughout their architectural history, yet lacks that software harmony. This is where they are making sure their Arc series makes a breakthrough, and why they have been delaying the release for the last 4+ months.
A common example of what I mean is the power consumption vs performance graphs. Intel graphics eat watts but produce lower numbers than Nvidia or AMD cards at the same power draw.
Intel has produced working GPUs by now and even showcased the Limited Edition, but they are still going through the phase of software-to-hardware harmony. They are working exceptionally hard on software/drivers to harness the power of their power-hungry hardware.
I believe the blue team will definitely make a difference in the market, especially given those hardware issues Nvidia and AMD cards have that leave users to throw them away and buy new ones. At least Intel hardware would be more robust, from a general point of view.
It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you ought to change the color of those glasses to transparent, uncolored.
 
D

Deleted member 24505

Guest
Seriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.


It's not so much the resolution as the size. Sure, the programs/games that actually scale look good, but not "worth it" good. At this point in time I actually wouldn't mind a 42" 4K monitor.

We have a 58" 4k TV
 
Joined
Sep 17, 2014
Messages
22,499 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Seriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works
Yeah, but too bad those on-screen markers are so effin huge, man! Kinda turned me off, and boy did the game age a bit, too.
 
Joined
Apr 12, 2013
Messages
7,546 (1.77/day)
It doesn't matter as long as the price is right. Even if their top card is only around the next-generation RTX 4060 or 7600 XT, it can still sell quite well.
With the impending global recession, don't be so sure! In fact, if oil prices stay this high for longer and the war continues, you can bet your bottom dollar that Intel will be cursing themselves for not launching this 1-2 years back :slap:
 
Joined
Jan 14, 2019
Messages
12,368 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Seriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.
I didn't like it. I LOVED IT! :D I'll definitely try it on my TV. I don't know why I didn't think about this before. :rolleyes: I can imagine screen estate adds to the experience when it gives you more space (literally) to look at.

I just don't feel like a bigger monitor would add enough to my general gaming experience to justify its own cost, plus that of a faster and noisier GPU. That is, I'd much rather game at 1080p on a 24" screen with a dead-silent 6500 XT than pay for a 4K monitor and a not-so-silent 3080.
 

aQi

Joined
Jan 23, 2016
Messages
646 (0.20/day)
The problem with this methodology: in all previous instances where there have been critical delays, the market has had only a single competitor to deal with.

AMD's delay of TeraScale by nearly a year could be covered by continued refreshes of the beefcake X1900 XTX, and NVIDIA survived the 8-month delay of Fermi with endless re-brands of G92 and the GT200 x2.

What happens to a delayed architecture when there are two other strong releases waiting on their doorstep? Will Intel even get a foothold, without massively discounted parts?
Intel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.

It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you ought to change the color of those glasses to transparent, uncolored.
Well, I'm not saying their hardware is immortal; at least the build quality engineering is a bit ahead of the others. Their hardware carries their reputation: whether it's a processor or even some controller on PCIe x1, it will perform as well as it should even after a decade.
 
Joined
Nov 4, 2005
Messages
11,988 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Intel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.


Well, I'm not saying their hardware is immortal; at least the build quality engineering is a bit ahead of the others. Their hardware carries their reputation: whether it's a processor or even some controller on PCIe x1, it will perform as well as it should even after a decade.
Laughs in 1100T on decade old AMD components…..

You should try trolling elsewhere or read a little more before spouting off literal fanboy BS.

Flexing boards, flexing CPUs, the terrible idea of the original 775 coolers that caused noobs and even pros to kill boards, their relabeled and unsupported Killer NIC series, their hardware security issues, the number of overheated chips that died in OEMs.

Intel isn't magical; they are a for-profit company just like AMD and their bullshitdozer that got stuck in the ponds.

Quality is quality; buying a $300 board from either should mean it lasts longer than a $79 board from either. Same for coolers they don't make, memory, PSUs, and hard drives.
 
Joined
Oct 3, 2019
Messages
155 (0.08/day)
Processor Ryzen 3600
Motherboard MSI X470 Gaming Plus Max
Cooling stock crap AMD wraith cooler
Memory Corsair Vengeance RGB Pro 16GB DDR4-3200MHz
Video Card(s) Sapphire Nitro RX580 8GBs
Storage Adata Gammix S11 Pro 1TB nvme
Case Corsair Caribide Air 540
That Bitboys Oy technical know-how purchase sure is paying dividends already.
 
Joined
Feb 17, 2010
Messages
1,650 (0.31/day)
Location
Azalea City
System Name Main
Processor Ryzen 5950x
Motherboard B550 PG Velocita
Cooling Water
Memory Ballistix
Video Card(s) RX 6900XT
Storage T-FORCE CARDEA A440 PRO
Display(s) MAG401QR
Case QUBE 500
Audio Device(s) Logitech Z623
Power Supply LEADEX V 1KW
Mouse Cooler Master MM710
Keyboard Huntsman Elite
Software 11 Pro
Benchmark Scores https://hwbot.org/user/damric/
Seems like the perfect card to play Star Citizen
 
Joined
Jan 15, 2021
Messages
337 (0.24/day)
I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Agreed. RT is also 100% useless. However, I can attest that 4K is nice for non-competitive games. If the 20xx and 30xx cards didn't have those 100% useless tensor/RT cores, they could have 30-40% more performance per die. Also, I saw the UE5 demos; yes, it looks cool, but at the cost of MASSIVE storage requirements and a complete rework of the data pipeline, and even then it would still have a bad framerate and, more importantly, extremely bad mouse input for competitive games.
 
Joined
Feb 1, 2019
Messages
3,614 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
They kind of are; there are two things which, in my opinion, are keeping the market alive.

One is generated demand via hardware-exclusive features, e.g. with Nvidia: G-Sync, DLSS and hardware-based RT. All three features, thanks to AMD, have alternatives that don't require hardware lock-in: VRR, FSR, and RT using rasterisation hardware.

I get the merit of VRR, and Nvidia deserve praise for introducing the concept, but they did attempt vendor lock-in, not only via the GPU but also using expensive chips in monitors. DLSS, I also get the merit of, and out of the three this is for me by far the most useful tech in terms of potential, but it was initially limited to games where devs implement it (so a fraction of new games released); however, we have more recently seen a new variant of DSR that uses DLSS, so it can now be implemented driver-side (not sure if FSR can be done via the driver; if someone can clarify this for me, that would be great). Finally RT, this one I have little love for. I feel lighting can be done very well in games via traditional methods, and I thought it was lame when RT-developed games would have heavily nerfed non-RT lighting, so to have good lighting you needed RT hardware; AMD have at least proven the hardware isn't needed to make this a bit less lame. But ultimately RT has served to increase the level of hardware required to achieve a given quality level, so it's a bad thing for me.

The second source of demand for new GPUs is the high FPS/Hz craze, primarily desired by first-person shooter and esports gamers. If there were no high refresh rate monitors, Nvidia and AMD would be having a much harder time convincing people to buy new-gen GPUs, as we're starting to get to the point where a single GPU can handle anything thrown at it, to a degree, at 60 fps. The "to a degree" comment is because game developers, whether via laziness or bungs thrown at them by AMD/Nvidia, are optimising their games less, so they need more grunt to hit a quality and frame rate target. Nvidia were caught in the past when a developer was rendering tessellation underneath the ground in a game, which helped sell Nvidia cards. There were also problems noticed in the FF15 benchmark tool, where it was rendering things out of view, which lowered the framerate. In that game, Nvidia-exclusive features also absolutely tanked the framerate and spiralled VRAM demand skywards.
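As a rough illustration of why upscalers like DLSS/FSR matter in the discussion above, the shading-work arithmetic can be sketched like this (the 1440p internal resolution assumes the commonly used "quality" factor of 1.5x per axis; actual presets vary by implementation):

```python
# Back-of-the-envelope arithmetic behind upscalers like DLSS/FSR: render
# internally at a lower resolution, then upscale to the output resolution,
# cutting the number of pixels shaded per frame. The 1440p internal figure
# is an assumed "quality mode" (1.5x upscale per axis); presets vary.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)       # pixels shaded per frame at native 4K
internal_1440p = pixels(2560, 1440)  # pixels shaded per frame at 1440p internal

ratio = native_4k / internal_1440p
print(f"Pixels shaded at native 4K:      {native_4k:,}")
print(f"Pixels shaded at 1440p internal: {internal_1440p:,}")
print(f"Reduction in shading work:       {ratio:.2f}x")
```

So a 1.5x-per-axis upscale shades 2.25x fewer pixels per frame, which is where most of the framerate headroom comes from.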
 
Joined
Nov 13, 2007
Messages
10,794 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6000 CL30-36-36-76
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
idk... I'm super stoked playing games at 4K with a 48" OLED. That kind of experience is something else -- to the point that it makes some boring games fun.

I think 4K is worth it... text sharpness, gfx fidelity, it's really nice... I think a 4080 would be a good upgrade to my gaming experience -- esp. using larger screens.
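For what it's worth, the sharpness question comes down to pixel density; a quick sketch using nominal sizes and resolutions (assumed round figures, not measured panel specs):

```python
import math

# Pixel density (PPI) for a few display setups discussed in the thread.
# Nominal resolutions and diagonals; real panels may differ slightly.
def ppi(h_px: int, v_px: int, diagonal_inches: float) -> float:
    """Pixels per inch along the screen diagonal."""
    return math.hypot(h_px, v_px) / diagonal_inches

print(f'48" 4K OLED: {ppi(3840, 2160, 48):.0f} PPI')
print(f'27" 1440p:   {ppi(2560, 1440, 27):.0f} PPI')
print(f'24" 1080p:   {ppi(1920, 1080, 24):.0f} PPI')
```

Note that a 48" 4K panel works out to the same density as a 24" 1080p one (~92 PPI), so the win at 4K on a big screen is more about screen area and viewing distance than raw density.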
 
Joined
Aug 6, 2020
Messages
729 (0.46/day)
Intel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.
And how many quarters of losses in the graphics card sector is Intel going to continue to endure before the board cuts their losses?

At least compute GPUs they can sort of justify (via the supercomputer contracts), but how long do you expect Intel to continue to be #3 in the consumer market while bleeding these losses (until they fire the entire driver team and absorb the best into compute modules)?
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,364 (6.65/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
For all the haters: they do have demo computers running Intel GPUs at the event. Laptops with the smaller version of the silicon are also becoming available worldwide (I'm not sure the drivers are up to par yet, but at least they should work better than the 5700 XT's initial launch).

There are also a lot of reasons to buy an Intel GPU other than games, like Linux support, the advanced media engine, or if you want to dabble with oneAPI stuff.

Anyway, now you've made me sound like an Intel fanboy :D. It's fun to mock the forever-delayed Intel GPU, but can we keep some semblance of reality?


The 8+6 power connector is disappointing. I hate 6-pin connectors; more often than not, if you're not running custom cables, you'll be left with a 2-pin connector dangling off the GPU. I guess this was designed so long ago that the new Gen5 power connector wasn't available :D
Linux support for AMD has been there since the ATi days.
 

aQi

Joined
Jan 23, 2016
Messages
646 (0.20/day)
Laughs in 1100T on decade old AMD components…..

You should try trolling elsewhere or read a little more before spouting off literal fanboy BS.

Flexing boards, flexing CPUs, the terrible idea of the original 775 coolers that caused noobs and even pros to kill boards, their relabeled and unsupported Killer NIC series, their hardware security issues, the number of overheated chips that died in OEMs.

Intel isn't magical; they are a for-profit company just like AMD and their bullshitdozer that got stuck in the ponds.

Quality is quality; buying a $300 board from either should mean it lasts longer than a $79 board from either. Same for coolers they don't make, memory, PSUs, and hard drives.
I agree those are issues faced with most of the companies (partners); I am magnifying the GPU soldering issue and the artifacts issue we see every now and then with both Nvidia and AMD cards. Seen positively, there is a lower ratio of Intel's late northbridges and southbridges desoldering and causing issues, something I used to experience with nForce chips. Looking into this further, Intel ICs on other peripherals also prove to have a longer life and fewer malfunctions.
You are right, and that's what I mentioned earlier: Intel has software issues, and the harmony its software brings to its hardware is less than Nvidia's and AMD's. This is what I am trying to say: it wants to cover this segment for its Arc GPUs, and that is causing the massive delay.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,364 (6.65/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you ought to change the color of those glasses to transparent, uncolored.
Their socket changing every 2 years leaves users high and dry.
 

aQi

Joined
Jan 23, 2016
Messages
646 (0.20/day)
And how many quarters of losses in the graphics card sector is Intel going to continue to endure before the board cuts their losses?

At least compute GPUs they can sort of justify (via the supercomputer contracts), but how long do you expect Intel to continue to be #3 in the consumer market while bleeding these losses (until they fire the entire driver team and absorb the best into compute modules)?
Lol, if Intel is really concerned with giving back to the gamer community and devoting itself to graphics, then they won't care about losing anything at all, but I highly doubt that's exactly where things are going. Nvidia was the only company on Tesla Motors' panel for graphics and AI. AMD took over a lot of interest from time to time, introducing the low-cost and highly effective RDNA structure. Intel introduced hardware AI in their 10th gen processor family, yet there was always something missing; Intel was leaving itself behind and foresaw losing markets in most industries due to poor graphics support. It's the right time to release the so-called Limited Edition graphics, but there is no point in releasing something which has poor driver support. If I were Intel, I would do the same: hold everything back just to make an appearance, rather than worrying about losing money.
 