They're still running a business, and sharing assets between games is good practice from a productivity standpoint. Even Rockstar didn't commit to a PC version of GTA V until it could be built simultaneously with the PS4/XB1 versions, which are very similar in architecture. Games are very expensive to produce; anywhere you can save money is good. They don't owe us any extra spending; it's their investment to make, and their choice whether to compromise one platform on behalf of another. We as consumers can just wait until the game comes out, evaluate it, and then decide if it's worth our money.

Pre-ordering doesn't entitle you to complain or guarantee you a say in whether the game satisfies your wants. It buys you some extra goodies to go along with your game, and that's it, and the opportunity cost is the inability to fully evaluate the product before purchase. Many people who pre-order seem to think the developer "owes" them for fronting the money when/if the game doesn't meet their expectations. You should of course be entitled to a refund if the game is defective or was falsely advertised. However, you can't really consider pre-release footage false advertising, because it's always plastered with disclaimers saying it's not representative of the final product and is subject to change without notice. Pre-ordering is not an investment; it's an incentive, and gamers don't seem to get that. (And I apologize in advance if this sounds like I'm attacking you; I was merely commenting on your observations and the way some people making the same observations have reacted.)
Also, considering the console versions run at 900p and are basically running on gimped Radeon HD 7850s, let's do some math. (1920 × 1080) / (1600 × 900) = 1.44, a 44% increase in pixels rendered. Going from 30fps to 60fps is a 100% increase. Take an efficiency hit from API overhead; we'll say it's a double-digit hit of ??%. Then tack on the fact that the PC version is likely capable of more computationally expensive effects on ultra settings, another double-digit hit of ??%. Then add in HairWorks, which has a measurable impact, also a double-digit hit of ??% (I don't know the exact numbers for AMD and Nvidia).
Let's call the gimped 7850 in the PS4 25% slower than a desktop 7850 just for giggles, and say each of our mystery double-digit hits is a best-case 15%. Then to play The Witcher 3 at 1080p, 60fps, and ultra settings, we need a graphics card that is 0.75 × 1.44 × 2.00 × 1.15 × 1.15 × 1.15 ≈ 3.3x faster than a desktop HD 7850. This is of course just a model, with many things assumed to scale linearly and some numbers blatantly guessed, but guessed as best-case scenarios. A 3.3x 7850, I would believe, sits somewhere between a GTX 970/R9 290 and a Titan X in terms of computational throughput.
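If you want to poke at the numbers yourself, here's the whole model as a few lines of Python. Every constant below is one of my guesses from above (the 25% console handicap and the three 15% hits), not a measured value:

```python
# Back-of-envelope model: required PC GPU throughput in desktop HD 7850 units.
# Every constant here is a guess from the post above, not a measurement.

console_gpu    = 0.75                          # gimped PS4 7850, guessed 25% slower than desktop
resolution     = (1920 * 1080) / (1600 * 900)  # 900p -> 1080p = 1.44x the pixels
framerate      = 60 / 30                       # 30fps -> 60fps = 2.0x
api_overhead   = 1.15                          # guessed best-case double-digit hit
ultra_settings = 1.15                          # guessed best-case double-digit hit
hairworks      = 1.15                          # guessed best-case double-digit hit

required = (console_gpu * resolution * framerate
            * api_overhead * ultra_settings * hairworks)
print(f"Required: {required:.2f}x a desktop HD 7850")  # prints: Required: 3.29x a desktop HD 7850
```

Swap in your own estimates for the three 15% hits and the required multiplier moves around quite a bit, which is exactly why the guesses matter.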
Now, this model isn't perfect, but it's more work than most people complaining are doing to justify their claims that the game is unoptimized. The way the game is performing right now actually agrees with the model pretty well. The only anomaly seems to be Kepler, which for some unknown reason is underperforming relative to Maxwell and its GCN competitors. AMD also takes a bigger hit from HairWorks (as does Kepler), which I don't think can be attributed to tessellation performance (can someone check that? How do Kepler/Maxwell/GCN compare on TessMark?), but rather to the effect being inaccessible to optimization beyond brute-forcing lower computational complexity through the driver.
Comments, scrutiny, skepticism welcome. I'll get off my soapbox.