And having to spend any money at all on something extra to retain the functionality every other card has is a con. There is no getting around that.
I really hate to agree with you on anything, NT, but I do agree here.
This omission is hardly a dealbreaker and can be worked around with that little $25 active adapter, but it's still annoying. It's the same situation as when motherboard ports such as IDE and PCI were removed over time, preventing your old but still useful kit from being used. It's the price of progress, I guess.
I can see how it would turn off some people, but I strongly believe AMD made the right choice. The improvements AMD made to its display controller (4 links allowing 2x DL-DVI as well as three clock generators allowing 3x SL-DVI) will help more people than the lack of VGA will hurt.
I don't deny that VGA users exist or that they are potential customers; I just argue that the tradeoff for a better digital display controller was worth it. I don't see this as any different from other changes in the past. Back with the GTX 400/HD 5000 series, both manufacturers dropped S-Video and RCA outputs entirely to make room for more digital displays, and while there were complaints, they came from a small minority. The market is always moving to newer standards, and no matter which standard you're talking about, retaining legacy compatibility with new products has always meant buying adapters.
I don't think this means AMD will abandon VGA on the low end just yet, but AMD has begun the march toward the death of VGA, and the high end is a good place to start. To VGA I say good riddance. I'll be glad when VGA dies as the standard for connecting to projectors in meeting rooms; they always have long, low-quality VGA cables, which put a ton of noise on the video signal. That's very distracting when you're trying to show a presentation full of static, high-contrast content.
Unless they did just that but were then aiming for the 20nm node, which at the start of 2013 was judged not to be viable in time. I think this was suggested over at TechReport in the comments of the 290 review, without any actual proof being provided. Your post made me think of it though, and judging by the state of affairs over at TSMC, I would regard it as possible at least, since it would mean they had to redo quite a bit to adapt this chip for 28nm.
All very much speculation though (which I personally love to indulge in on matters like these).
Yeah, it's pure speculation, but speculation is fun!
What you describe is exactly what AMD admitted happened with Cayman/69xx: it was designed for 32nm but was moved back to 40nm because 32nm was delayed (and eventually cancelled). So I could believe that AMD wanted to take advantage of 20nm for its next high-end chip, but at some point realized 20nm was too far away and that it needed to design a 28nm chip to suffice until 20nm chips were available. I think this happened to NVIDIA this time too, considering that rumors suggest the initial Maxwell chips are 28nm instead of 20nm.