# Are we at pretty much the max needed for high end graphics cards in 2013/2014?



## vawrvawerawe (Jun 18, 2014)

I have 2x 7950 I got last year, and even just one of them plays any game (ANY game!) on ultra with maximum settings at 1080p.

I can't really see anyone even needing to get a new graphics card, because in addition to overpowered graphics cards (at the high end), graphics themselves have pretty much plateaued. For example, the PS4 was not even a graphical improvement over the PS3. Yes, a **marginal** one, but in fact I have found many comparisons of PS3 vs PS4 to be so ambiguous that I could not tell which side was the PS4. In fact, I GUESSED WRONG several times, thinking the PS3 side looked better and so assuming that side was the PS4, until they showed at the end of the video it was in fact the PS3.

Granted, we are talking about PC graphics here, not console -- but consider in any case the fact that the PS4 ***is just a glorified PC running a Unix OS*** (yes, seriously). And the PC I built last year is better than this new PS4.

Anyhow, my point is that graphics have reached a point where they can't get much better. And since they can't get much better, the demand for increased graphics processing power is decreasing in proportion.

I'm not even sure I'm ever going to need to upgrade to a new graphics card until (and IF) 4K becomes the new standard. The best Radeon graphics card on the market is the R9 290X -- and it is LESS POWERFUL than my two 7950s. And keep in mind it is twice the price I paid for each of my 7950s, and the fact is it's not really a better graphics card but just a "two for one deal" all wrapped up in one shiny package.

Basically, graphics card makers have been releasing "new" graphics cards which are pretty much the same graphics card times two, for double the price, a year later. Yeah, what a bargain... Titan, R9, etc. -- these overpriced cards are no better than the two cards they condensed into one (well, a MARGINAL improvement, depending on your card), but for double the price. So in other words, the SAME PRICE FOR THE SAME PRODUCT, you just have to buy two (..in one package, yeah yeah, whatever).

So I'm just wondering your opinion. There will always be people wanting the latest and greatest, even if it's no real improvement. But besides those people, do you think there will be real demand for any graphics upgrade anytime in the near future? I can't see it happening, at least until (and unless) 4K goes mainstream.

Now for a nice comparison:

Which one is the PS4? Can you guess correctly?

After I get enough responses I will post the photo again which will show at the bottom which is which (I just placed a black box with a question mark over the 3 and 4).


----------



## Steevo (Jun 18, 2014)

We doesn't care for those console peasantry low res stuff. 

Twice the performance for the same base price as the last gen. It's not the same as it used to be, when software and hardware both had a lot of improvements to make; for now it's a 50% increase in the same price bracket, plus other features.


----------



## Wonsu (Jun 18, 2014)

The second one is the PS4.


----------



## Deelron (Jun 18, 2014)

Wonsu said:


> The second one is the PS4.



What he said, although I kind of want to be wrong so I save money on my next graphics card.


----------



## human_error (Jun 18, 2014)

Go play in the Star Citizen Arena Commander module and then tell me there's no need for more graphics power 

That photo comparison is dreadful, by the way - the PS3 was at its limits, whereas the PS4 edition of the game isn't finished and won't be maxing out the PS4's capability. Put up a screenshot from the original Crysis and ask people which looks best.


----------



## AphexDreamer (Jun 18, 2014)

The only thing the PS4 has going for it over the PS3, from what I've seen, is the ability to use higher-res textures and of course churn out higher frame rates. I'm not going to be getting a console. Last gen my friends had consoles, so I had to get one just to play with them. Now that I've shown them the glory of the PC, they have all gotten gaming PCs (that I've built for them) and we play lots of League.

The only way they can get people to even buy consoles is with exclusives and gimmicky Augmented Reality features that, in reality, the PC can do.


----------



## Wrigleyvillain (Jun 18, 2014)

Uh you could also upgrade from now-lowly 1080P.


----------



## Solaris17 (Jun 18, 2014)

Wrigleyvillain said:


> Uh you could also upgrade from now-lowly 1080P.


lel please 1080p is still by far the industry standard in mass market resolutions.


----------



## sneekypeet (Jun 18, 2014)

Solaris17 said:


> lel please 1080p is still by far the industry standard in mass market resolutions.



Since when did you become fine with the industry standards? 

@ OP, for 1080p there isn't a whole ton of need for serious GPU power, but many in the gaming realm are already on higher res IPS panels, and most are just waiting for all the stars to align for a reasonable deal on a 4K screen.


----------



## Ahhzz (Jun 18, 2014)

Someone isn't in the Star Citizen universe, otherwise they wouldn't make such silly statements as "graphics themselves have pretty much plateaued."


----------



## Solaris17 (Jun 18, 2014)

sneekypeet said:


> Since when did you become fine with the industry standards?



You are correct and I was mistaken. Most computers that can access the internet are actually at 1366x768.

http://www.w3counter.com/globalstats.php


----------



## Champ (Jun 18, 2014)

I feel like we've hit the cap for what we need graphically. Anybody who buys top of the line gear now shouldn't have to upgrade again until we get a GPU that can single-handedly run 4K. Like, if someone buys 2 780 Tis or 2 290Xs and the top Z97 or X79 board/processor, they honestly shouldn't have to upgrade anything but the GPU when 4K becomes the norm. That's what I'm debating now: buying top of the line parts.


----------



## erocker (Jun 18, 2014)

I'm still fine with my 7970 on 1440p. I don't think I've held on to a GPU for so long, but it's been a great card.


----------



## Champ (Jun 18, 2014)

erocker said:


> I'm still fine with my 7970 on 1440p. I don't think I've held on to a GPU for so long, but it's been a great card.



I didn't realize those were that powerful. I went from a 4850 to a 780, so I missed everything in between


----------



## Vario (Jun 18, 2014)

I think textures and effects are larger now, but the difference isn't all that noticeable, so we don't view current games as looking much better than two-year-old games; however, current games do require more GPU power.

Also, you are running 1080p, and the $250-350 VGA segment has always been the sweet spot.


----------



## vawrvawerawe (Jun 20, 2014)

Interesting, about half the people think it's the left and half think it's the right side. Goes to show how little better the PS4 is if people can't even be sure which is which. It's basically like, flip a coin, that one is the PS4. Worth the extra 300% cost plus a whole new game library? I think not. There was no need for a new PS4. Sony just wants to make more money off almost nothing.


----------



## vawrvawerawe (Jun 20, 2014)

Champ said:


> I feel like we've hit the cap for what we need graphically. Anybody who buys top of the line gear now shouldn't have to upgrade again until we get a GPU that can single-handedly run 4K. Like, if someone buys 2 780 Tis or 2 290Xs and the top Z97 or X79 board/processor, they honestly shouldn't have to upgrade anything but the GPU when 4K becomes the norm. That's what I'm debating now: buying top of the line parts.



I bought top of the line last year on Black Friday (not this past Black Friday but the 2012 one), for less than the exact same parts cost brand new today, over a year and a half later. In fact, SSDs have gone UP in price for the exact same drives since then. And my PC is still a top of the line PC, and at this rate it will continue to be for another year or more. My PC still blows away 99.999% of PCs, and the remainder it is on par with, and I built it over a year and a half ago.


----------



## Champ (Jun 20, 2014)

I actually saw a 4K display yesterday and it was amazingly sharp and vivid. It looks really good, but I don't understand why so much power is needed to drive it. I am buying one. They said at BB that they probably won't be on sale for Black Friday. But I've seen two 780 Tis and 290Xs drive it on high/ultra settings on YouTube, with no AA of course. I don't see the need for AA at that res.
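A rough back-of-the-envelope sketch of why 4K takes so much power to drive (my own arithmetic, not from any benchmark): a GPU has to shade every pixel of every frame, so raw pixel count is a decent first-order proxy for load.

```python
# Pixel counts per frame at common resolutions. First-order only:
# memory bandwidth, AA, and shader cost don't scale perfectly with pixels.

def pixels(width, height):
    """Total pixels per frame."""
    return width * height

res_1080p = pixels(1920, 1080)   # 2,073,600
res_1440p = pixels(2560, 1440)   # 3,686,400
res_4k    = pixels(3840, 2160)   # 8,294,400

print(res_4k / res_1080p)   # 4.0  -> 4K pushes exactly 4x the pixels of 1080p
print(res_4k / res_1440p)   # 2.25
```

So a card that is comfortable at 1080p is, to a first approximation, being asked to do four times the work at 4K.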


----------



## remixedcat (Jun 21, 2014)

Solaris17 said:


> you are correct and I was mistaken. most computers that can access the internet are actually in 1366x768
> 
> http://www.w3counter.com/globalstats.php




Picked a random time frame on my site stats software and here's what I got visiting my site:


----------



## erocker (Jun 21, 2014)

vawrvawerawe said:


> Interesting, about half the people think it's the left and half think it's the right side. Goes to show how little better the PS4 is if people can't even be sure which is which. It's basically like, flip a coin, that one is the PS4. Worth the extra 300% cost plus a whole new game library? I think not. There was no need for a new PS4. Sony just wants to make more money off almost nothing.



That game in particular is a poor representation. They didn't really change much of anything with the port. The Last of Us is a great looking game, but more in an overall sense, not necessarily good looking close up. Low-res textures for consoles that can't handle higher-res ones. GTA V should show some good differences. Newer "next-gen" games are looking better. Either way, you're fine with your setup; you probably don't even need the second card at all.


----------



## Fourstaff (Jun 21, 2014)

remixedcat said:


> Picked a random time frame on my site stats software and here's what I got visiting my site:
> 
> View attachment 57365



Are visitors to your site more technologically inclined than the general public? That will skew the dataset.

Then again, the Steam hardware survey puts 1920x1080 users at 34%.


----------



## 64K (Jun 21, 2014)

When the install base of the new gen consoles becomes high enough for developers to start putting more resources into games, more GPU performance will be needed for PC gaming. The last I heard they had sold around 10 million units combined, but I read an estimate that there are around 150 million Xbox 360s and PS3s out there. The more resources developers have, the more they use. Here's an example of what I'm seeing.

http://www.ign.com/articles/2014/04/09/why-borderlands-the-pre-sequel-isnt-on-ps4-or-xbox-one


----------



## remixedcat (Jun 21, 2014)

Fourstaff said:


> Are visitors to your site more technologically inclined than the general public? It will skew datasets.
> 
> Then again, Steam data survey posts 34% 1920x1080p users



Mine targets enterprises too, and I've had a fair share of lower-res older systems as well. It's a mixed content blog; I have architecture and music stuff on there too, so it has a wide audience.


----------



## TheoneandonlyMrK (Jun 21, 2014)

Consoles have nothing to do with graphics cards. Is this one of the most baffling threads, or is it me?
We need more GPU grunt. Try a 4K screen, or three, for starters, never mind trying to render the same graphical fidelity we have now in immersive 3D at 100fps per eye.
10x more, just for now, IMHO.


----------



## 64K (Jun 21, 2014)

theoneandonlymrk said:


> Consoles have nothing to do with graphics cards. Is this one of the most baffling threads, or is it me?
> We need more GPU grunt. Try a 4K screen, or three, for starters, never mind trying to render the same graphical fidelity we have now in immersive 3D at 100fps per eye.
> 10x more, just for now, IMHO.



What I was getting at is that most of our PC games are ported from console versions. The ports from the new gen consoles may be better in the future, since those consoles are closer to just an average PC now, but developers will also have far more resources to use than were available on the last gen consoles.

But yeah, if 4K does catch on mainstream, then we will need more performance from GPUs for that reason alone.


----------



## TheoneandonlyMrK (Jun 21, 2014)

Most isn't all, and in all I'd like ultra-ultra settings at 4K (one screen for now), 144fps, vsynced, please.
We have quite a way to go yet.


----------



## Vario (Jun 21, 2014)

The ported games are just the ones from Ubisoft/EA etc., and those should probably be avoided anyway.


----------



## vawrvawerawe (Jun 25, 2014)

Well, actually, consoles use a graphics chip just the same as PCs. In order to render graphics, you need a graphics chip. In fact, the PS4 uses a graphics chip comparable to something between a Radeon HD 7850 and a 7870.

However, my point was not to compare PC graphics to consoles. I was merely making a point about the progress of graphics in general, and comparing a previous console generation to the current one is a better generalization than trying to compare an old and new graphics card, because it's pretty difficult to find an example of the exact same game played on a ten-year-old PC and on a *directly comparable PC* with a newer graphics card. Not to mention the game would also be outdated, and no ten-year-old PC could run a modern high-res PC game for accurate comparison purposes.

Thus, it is reasonable to treat consoles as a constant in the increase in graphics quality; and since consoles are consistent hardware-wise, they are the best way to compare, by screenshot, the increase in graphics over a period of about a decade.


----------



## Toothless (Jun 25, 2014)

Max graphics? What is this? I used a GT 220 as my first ever GPU and still played games at a decent rate (given that the lowest resolution was needed).

I switched to a GTX 660 OC and I still don't care about maxing out games. I think some people will join me in thinking this way, but as long as I can play the game and not lag to death, I'll do it.


----------



## TheHunter (Jun 25, 2014)

Wait for real next-gen games in 2015, then you will see them GPUs doing some crazy workouts.

I personally can't wait for those games. 5 TFLOPS of power should be enough for anything IMO; U4E apparently needs ~3 TFLOPS.


----------



## 15th Warlock (Jun 25, 2014)

vawrvawerawe said:


> Interesting, about half people tihnk it's the left and half people think it's the right side. Goes to show how much not better ps4 is if people can't even be sure which is which. It's basically like, flip a coin, that one is PS4. Worth the extra 300% cost plus a whole new game library? I think not. There was no need for a new PS4. Sony just wants to make more money off almost nothing.



So you're making an argument for what console is better based on a single crappy screenshot for a game that was probably coded to run on PS3 natively and then ported to PS4?

You sir have won all the internets.


----------



## remixedcat (Jun 25, 2014)

15th Warlock said:


> So you're making an argument for what console is better based on a single crappy screenshot for a game that was probably coded to run on PS3 natively and then ported to PS4?
> 
> You sir have won all the internets.




it's what he does


----------



## AsRock (Jun 25, 2014)

They're not going to max out the PS4 so soon; it's not logical, as they make more money making small improvements over time, which extends the life of the console.

And as people pick up 4K for the PC, video cards will have to get more powerful.


----------



## Deleted member 67555 (Jun 25, 2014)

Consoles just now made it to real 1080p, and 4K TVs are just now coming out without a real standard for content delivery...
1080p is here to stay for at least the next 5-6 years... So yes, for now just about any middle-grade GPU is fine...
I'm still waiting for an end to GFX cards, and I'm hoping that happens before 4K is the standard resolution for TVs.


----------



## TRWOV (Jun 25, 2014)

Another "issue" is that, even if we have access to all this graphical power, games are still bound to what developers can do within their budgets. There's always going to be the Skyrim or Battlefield that pushes thing to the edge but those games are far and between.


----------



## AsRock (Jun 25, 2014)

jmcslob said:


> Consoles just now made it to real 1080p, and 4K TVs are just now coming out without a real standard for content delivery...
> 1080p is here to stay for at least the next 5-6 years... So yes, for now just about any middle-grade GPU is fine...
> I'm still waiting for an end to GFX cards, and I'm hoping that happens before 4K is the standard resolution for TVs.



To me 4K will be a no-go for a hell of a long time; I'd be more interested in the companies that make the games caring a lot more about playability. A game can have great graphics and you still wouldn't play it if the rest of the game is mediocre.

Anything over 1080p has no point to me, as I don't want a computer sucking up 600W+. I have a feeling 4K will also require more GPU power, and that power usage is getting crazy; it's just a silly amount of power to play a game.


----------



## Deleted member 67555 (Jun 25, 2014)

Imma explain my whole thought process here....
What we know as a GFX card is basically throwing POWER at something until we can figure out how to do it efficiently.....
Here is the absolute best example I feel can be given, from the mid-90s to the best mobile tech you can fit in your hand:
http://en.wikipedia.org/wiki/List_of_PowerVR_products

Now, if they can do what they did with PowerVR, think what they'll be able to do with more modern instruction sets in 5-10 years...


----------



## vawrvawerawe (Jun 27, 2014)

TRWOV said:


> Another "issue" is that, even if we have access to all this graphical power, games are still bound to what developers can do within their budgets. There's always going to be the Skyrim or Battlefield that pushes thing to the edge but those games are far and between.



That's like saying, "you should buy something now for $300 more than it will cost 5 to 10 years from now, when developers *maybe* might have bigger budgets to spend on better graphics and the PS4 costs only $200... even though you will get practically zero ROI on the additional expenditure, since during those 5-10 years the graphics will stay the same (if you waited a few years to buy the system, the price would be less than half the current one, based on how PS3 prices fell; during those years the PS3 has the same or better graphics, depending on who you ask; and there is no guarantee that PS4 graphics really can get any better, or that developers will ever spend the extra millions in their budgets to improve graphics to any noticeable degree)."

Tell me if you were trying to say something different.

p.s. Battlefield 4 has basically exactly the same graphics as Battlefield 3. Many people are calling it "Battlefield 3.5" because it is not really a new game with better graphics, just an upgrade (or downgrade, depending on who you ask) of Battlefield 3. The only major difference is some new maps, which they could have simply released as maps for Battlefield 3.


----------



## vawrvawerawe (Jul 21, 2014)

I agree


----------



## sneekypeet (Jul 22, 2014)

vawrvawerawe said:


> I agree



With yourself???
A month later no less?????


----------



## yogurt_21 (Jul 22, 2014)

sneekypeet said:


> With yourself???
> A month later no less?????


It apparently takes time for one personality to take over from another. At any rate, there's no denying that the PS4 is more powerful than the PS3. Eventually games will take advantage of that. Seeing as the last cycle was 6 years long, there's plenty of time.


----------



## savas (Jul 23, 2014)

I was baffled at how graphics cards have not advanced as much as they used to. It's really weird, but when you think about it and pinpoint exactly when this trend started, it seems to be when the console market aligned with PC gaming and became comparable/competitive; you notice there may be a connection.

They want games to be similar and playable on all platforms. When a game developer makes a game, they make it so it works on a console - if that's their intention - so PC graphics cards aren't struggling, because console graphics are their standard... If they go beyond what a console can handle, they won't be able to release it there, and that would cut out a big chunk of the market. They could make it so the PC version ups the quality while the consoles stay at a level that works, but that means extra work and clear visual differences they would probably want to avoid, just for the silly sake of not pissing off console makers... AND WE ALL KNOW that when it comes to doing extra work, developers will skip it if it doesn't change the overall game objectives.

So I think the best time to upgrade your card is just after console makers release a new version, which lets game developers finish their games for it; your graphics card purchase would then be more than enough until the next one, as long as your card can easily handle everything on max at the time.


----------



## 64K (Jul 23, 2014)

savas said:


> I was baffled at how graphics cards have not advanced as much as they used to. It's really weird, but when you think about it and pinpoint exactly when this trend started, it seems to be when the console market aligned with PC gaming and became comparable/competitive; you notice there may be a connection.
> 
> They want games to be similar and playable on all platforms. When a game developer makes a game, they make it so it works on a console - if that's their intention - so PC graphics cards aren't struggling, because console graphics are their standard... If they go beyond what a console can handle, they won't be able to release it there, and that would cut out a big chunk of the market. They could make it so the PC version ups the quality while the consoles stay at a level that works, but that means extra work and clear visual differences they would probably want to avoid, just for the silly sake of not pissing off console makers... AND WE ALL KNOW that when it comes to doing extra work, developers will skip it if it doesn't change the overall game objectives.
> 
> So I think the best time to upgrade your card is just after console makers release a new version, which lets game developers finish their games for it; your graphics card purchase would then be more than enough until the next one, as long as your card can easily handle everything on max at the time.



This response should be considered.


----------



## LAN_deRf_HA (Jul 23, 2014)

This is such a random thread. A half-assed port is a meaningless example. Most of the stuff I see around the enthusiast community is about how horribly underpowered today's cards are. With a 1.3 GHz 780 Ti I can get acceptable frame rates in Crysis 3 / modded Skyrim at the relatively common 1440p res, but that's the top single card with a really high 24/7 clock and bare-minimum AA, and we're still talking below 60 FPS a lot of the time. Cards aren't advancing fast enough for the upcoming wave of next-gen ports and 4K monitors. We need like 2-3x the graphical power we currently have, only we need it now, not 2 years from now, which is when we'll get it.
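Where a "2-3x" figure can come from is easy to sketch with simple arithmetic, under the simplifying assumption that GPU load scales linearly with pixels per second (it doesn't exactly; AA and memory bandwidth complicate things). The 45 fps starting point below is a hypothetical number for illustration, not a measurement.

```python
# Estimate the throughput multiplier needed to move from one
# resolution/frame-rate target to another, assuming linear pixel scaling.

def required_speedup(cur_w, cur_h, cur_fps, tgt_w, tgt_h, tgt_fps):
    """Multiplier on GPU throughput needed to go from the current
    resolution and frame rate to the target ones."""
    return (tgt_w * tgt_h * tgt_fps) / (cur_w * cur_h * cur_fps)

# e.g. a hypothetical ~45 fps at 1440p today, wanting a locked 60 fps at 4K:
print(required_speedup(2560, 1440, 45, 3840, 2160, 60))  # 3.0
```

Under those assumptions, a card dipping into the 40s at 1440p needs roughly triple the throughput for a locked 60 FPS at 4K, which lines up with the "2-3x" gut feeling.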


----------



## Champ (Jul 23, 2014)

I feel like we just got to the point where we can run 1080p to its full potential. I still see 4K being a long way off, if it truly catches on. I know some people don't like the YouTube tech guy Linus, but he believes UltraWide is the future, not 4K. I'm really leaning toward buying a UW monitor. Those can be run now if you have a fairly powerful rig, like the LG 1440p UW monitor. 4K is still a long way down the road from running with fluidity.


----------



## XSI (Jul 23, 2014)

+1 for ultrawide. Even though 4K is nice, I would prefer an ultrawide 29-34" monitor.


----------



## sliderider (Jul 24, 2014)

Solaris17 said:


> you are correct and I was mistaken. most computers that can access the internet are actually in 1366x768
> 
> http://www.w3counter.com/globalstats.php



That's all those cheap laptops and tablets doing that. I can't actually remember the last time I saw a 1366x768 monitor for sale anywhere. Heck, I can't even remember the last time I saw a 1440x900 or 1680x1050 monitor for sale anywhere. Most desktop PC monitors I see for sale are 1920x1080 or greater now.


----------



## Solaris17 (Jul 25, 2014)

Solaris17 said:


> lel please 1080p is still by far the industry standard in mass market resolutions.





sliderider said:


> That's all those cheap laptops and tablets doing that. I can't actually remember the last time I saw a 1366x768 monitor for sale anywhere. Heck, I can't even remember the last time I saw a 1440x900 or 1680x1050 monitor for sale anywhere. Most desktop PC monitors I see for sale are 1920x1080 or greater now.



ok


----------



## a_ump (Jul 26, 2014)

sliderider said:


> That's all those cheap laptops and tablets doing that. I can't actually remember the last time I saw a 1366x768 monitor for sale anywhere. Heck, I can't even remember the last time I saw a 1440x900 or 1680x1050 monitor for sale anywhere. Most desktop PC monitors I see for sale are 1920x1080 or greater now.



Not that I shop for hardware at Wal-Mart, but half of their monitors are 1600x900. As an enthusiast you probably don't remember 1680x1050 or lower because that's not what you look for. I can go to my in-laws' house, my buddies', or any number of family friends' PCs that I have worked on, and most are sporting 1280x1024 or 1440x900 LCDs. I feel 1680x1050 was the odd medium for those who didn't want to pay 200-300 bucks for 1080p 3-5 years ago (I was one of them).


As for the comment way up above, I agree the best time to upgrade a GPU is after a new console release, or maybe 2 years after, once devs have gotten a handle on how to fully utilize the console's potential. Me personally, I'm going to wait for the GTX 8XX or R9 3XX(?) gen to come out and then upgrade from my GTX 560.


----------



## RejZoR (Jul 26, 2014)

LAN_deRf_HA said:


> This is such a random thread. A half-assed port is a meaningless example. Most of the stuff I see around the enthusiast community is about how horribly underpowered today's cards are. With a 1.3 GHz 780 Ti I can get acceptable frame rates in Crysis 3 / modded Skyrim at the relatively common 1440p res, but that's the top single card with a really high 24/7 clock and bare-minimum AA, and we're still talking below 60 FPS a lot of the time. Cards aren't advancing fast enough for the upcoming wave of next-gen ports and 4K monitors. We need like 2-3x the graphical power we currently have, only we need it now, not 2 years from now, which is when we'll get it.



Say thanks to all the makers reheating old stuff under new names; the Radeon R9 series below the 290 springs to mind. And because the lower-end modern versions are the same as the old ones, no one has any expectations for the top end. And that's why we have been stagnating for ages now. NVIDIA is no better in this regard.


----------



## rtwjunkie (Jul 28, 2014)

Did I miss it somewhere?  I thought the OP was going to tell us which picture was which Playstation?


----------



## yogurt_21 (Jul 28, 2014)

Funny, I thought the consensus was that any game that's playable on both will likely show very little variance from the older one. Devs being budget-minded and all.

Think about it: PlayStations have always been nice about being backwards compatible. Pop Tony Hawk into your PS2, then pop it in your PS3... see a difference? No? "OMG conspiracy, PS3 suxxors, you should have stuck with PS2!"

That's pretty much what this thread is. A game built on an old engine, so as to be compatible with both the older gen consoles and the newer, isn't very likely to take much advantage of the newer gen's GPU horsepower. The games that do take advantage of it aren't likely to be compatible with the older gen for comparison. So we get this nice little waste of time.


----------



## newconroer (Jul 28, 2014)

At 1600p, two R9 290s cannot fully max Crysis at 60FPS. So.... maybe a better question would have been whether the hardware horsepower in modern GPUs is more than enough, but the APIs and architectures could be more efficient and waste less of it?


----------



## Champ (Jul 29, 2014)

Interesting. We carry on about 4K, and our somewhat normal resolutions (1600p isn't normal, I think) aren't being maxed yet.


----------



## Champ (Jul 30, 2014)

Well, speaking of 4K vs ultrawide: looks like LG is leading the UW movement. http://www.tweaktown.com/news/39357...ution-of-5120x2160-with-21-9-ratio/index.html


----------



## dr0thegreatest (Aug 7, 2014)

Champ said:


> I feel like we've hit the cap for what we need graphically. Anybody who buys top of the line gear now shouldn't have to upgrade again until we get a GPU that can single-handedly run 4K. Like, if someone buys 2 780 Tis or 2 290Xs and the top Z97 or X79 board/processor, they honestly shouldn't have to upgrade anything but the GPU when 4K becomes the norm. That's what I'm debating now: buying top of the line parts.


I don't even think you would need to upgrade the GPU; 2 290Xs are pretty darn good at 4K, and if he's got another slot, throw another 290 into the mix and it should be very good for 4K, unless some seriously demanding 4K game comes out, which is obviously going to happen.


----------



## Steevo (Aug 27, 2014)

vawrvawerawe said:


> Interesting, about half the people think it's the left and half think it's the right side. Goes to show how little better the PS4 is if people can't even be sure which is which. It's basically like, flip a coin, that one is the PS4. Worth the extra 300% cost plus a whole new game library? I think not. There was no need for a new PS4. Sony just wants to make more money off almost nothing.


Who needs consoles again?


----------



## D007 (Aug 28, 2014)

And then 4K came out and everything you have doesn't cut it anymore, lol.
On a big-screen TV 4K looks amazing.
It makes 1080p look like 720p.
Once you look at them, there is no going back, lol.

Comparing performance vs quality is also impossible in still pictures.
60 fps vs 30 fps is a big difference.
PC will outperform, unquestionably, especially at 4K.

Desktop color and control panel settings will also make the picture look different.


----------



## XL-R8R (Aug 28, 2014)

It appears wmwmmwmwmawavewmw (or whatever that crap is anyway) has once again bailed from his own thread, which he apparently started with the sole purpose of trolling, starting in-fighting between members, or just being pointless.


----------



## lilhasselhoffer (Aug 28, 2014)

I'm having a problem here, that doesn't seem to be resolved yet on this thread.

I want 4K.  I want it because everyone wants to max out everything and have the best stuff.  That's why this forum exists.


My conundrum is that despite wanting 4K I know that it's stupid.  Before I get called for flaming, or start some sort of debate, allow me to explain myself.  

Consumers currently have three options for content delivery.  They can stream, buy a DVD, or buy a Blu-ray.  Gaming falls into either the streaming (Steam, Origin, GOG, etc...), or the DVD category.  Neither of these content delivery systems has a resolution limit, because the computer generates images on the fly.  You could theoretically compress a program onto a DVD and have it run a thousand monitors once activated, assuming the hardware existed to do so.

Alternatively, you've got movies and television, which can utilize all content delivery streams.  Streaming allows instant content access, if limited by internet speeds.  DVDs are great, assuming you can compress the crap out of them and get your two hours of video.  Blu-ray is better, because the information storage capacity is much higher.  The problem with the last two standards is that neither of them is dynamic.  Stored pictures, even when compressed very well, are only that.  No piece of hardware can truly improve their fidelity, even if interpolation can allow the images to appear less pixelated once expanded.


So the reason 4K is stupid is simple: it's not adopted by one of the two largest content industries, and it's only just being introduced in the other.  Movies and television don't exist in the realm of 4K.  If the movie industry doesn't support the hardware with content, nobody buys it.  If the hardware doesn't exist, there's no content created for it.  No content and no hardware means that, despite its advantages, 4K isn't worth adopting for the mainstream consumer right now.



So looking at this all, my problem is simple.  Why concern yourself with 4K and the cards you need to run it today?  If you've actually got the money for a 4K monitor, the graphics cards to run it probably aren't a big cost to you.  If you're trying to buy cards that are future-proof enough to run 4K, then you're committing to GPUs for a substantial chunk of time.  No installation base now means 4K won't be at reasonable pricing for at least several more years; a content pool drives consumer buying, which is what drives prices down.  You're asking whether cards running right now can drive a theoretical monitor in the distant future.  If that's really your intent, ask how many people are still running a 4xxx series Radeon GPU or a 2xx series Nvidia GPU.  Those cards came out at about the time that 720p was the standard in media, with 1080p still on the horizon.  The only difference is that the progress of hardware improvement has slowed dramatically since then.


----------



## D007 (Aug 28, 2014)

Not trying to bash ya Hassle but you must not know the current state of 4k gaming, if this is what you think.
4k gaming is alive and well.
Even on single GPU systems like mine. I run 4k playing mass effect 1, 2 and 3 and it runs very nicely.
Even with one gpu. 3840x2160

Also 4k is extremely reasonable in pricing now.
My 50" samsung 4k , LCD TV was only 1,500.00 US.
That's exactly what I paid for my old 50" 1080p samsung.
The price has dropped insanely over the last year.

Not only that, but watching my Blu-rays has never been more vivid due to upscaling.
The difference is immediately noticeable.
Like looking at a 720p TV next to a 1080p TV.

Word of advice though. Get a monitor or TV with display port 1.2.
HDMI support is lacking and only works with the HDMI 1.4 workaround by Nvidia, or Eyefinity's similar workaround.
Currently SLI is not supported, but it should be very very soon.
Word directly from Nvidia seems like it should be within a month.
Quote "Our next driver release"
Which should come out very soon. They release a new driver monthly usually and we are due for a new one any day now.

You're not entirely wrong though, it has been a pain in the ass.
My topic here about it: http://www.techpowerup.com/forums/t...dvertising-and-sli.204582/page-2#post-3155731


----------



## lilhasselhoffer (Aug 28, 2014)

D007 said:


> Not trying to bash ya Hassle but you must not know the current state of 4k gaming, if this is what you think.
> 4k gaming is alive and well.
> Even on single GPU systems like mine. I run 4k playing mass effect 1, 2 and 3 and it runs very nicely.
> Even with one gpu. 3840x2160
> ...



When I began reading I was getting ready to make the comment about your other thread.

Now, the problem I have with this statement is simple.  $1500 isn't a reasonable price for your average consumer.  When you can get a 50" 1080p television for $300, and a "next gen" console for about half the cost of just a decent 4K monitor, you're not aiming at the mainstream.


As far as Mass Effect, I call crap on that.  ME 1 ran on the original Xbox, and I got it running well at 1080p on a 3650 GPU.  ME 2 and 3 might have required more chutzpah, but we're only looking at an Xbox 360.  If a modern card, that is functionally 3 or more generations improved upon the mid-range GPU in the 360, cannot run the game at 4K it would be immensely surprising.


Now, the next problem with your assumptions relates to interpolation.  A 4K monitor isn't any clearer than a 1080p monitor, with a 1080p signal.  You're saying the image is clearer, but what you really mean is that the pixel density is high enough that everything looks less jagged.  Interpolation is effectively just smearing enough vaseline on the screen so that the lack of actual pixel differentiation is not noticeable. My 23" 1080p monitor has a higher pixel density than a 50" 4K monitor.  Conflating the two is at best a disingenuous comparison.
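For what it's worth, the pixel-density claim here checks out. A quick sketch (the 23" and 50" panel sizes are the ones named in this post; PPI is just the diagonal pixel count over the diagonal size):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 23), 1))  # 23" 1080p monitor -> 95.8 PPI
print(round(ppi(3840, 2160, 50), 1))  # 50" 4K TV -> 88.1 PPI
```

So a 23" 1080p panel really is slightly denser than a 50" 4K one.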


Finally, on to everything else wrong with what you are saying.  The average consumer couldn't understand the difference between 1080i and 1080p.  The differentiations among the various HDMI standards, what the heck DP is, and how to even begin getting the correct drivers and workarounds are well beyond their capacity.  For a moment, consider that the average consumer is your grandmother (assuming that old of a relation still exists for you).  They want to buy cable x, plug into device y, and connect to Monitor/TV z.  All of this complexity is a barrier to any regular person adopting the technology.



I guess, put simply, you and I are not the center of the world.  In all consideration, none of us on a tech forum discussing 4K really are the real world.  No matter what you can do today, the relevant question is what can grandma do without a call to tech support.  4K isn't anywhere near being easy enough for grandma, and thus isn't a pressing issue right now.  Heck, if the current generation of consoles lasts 7 years, the first time we'll really be getting into this debate is in another 4 years, when the new console rumors start to swirl and a 46"+ 4K monitor costs less than $500.  PC gaming is experiencing an upswing because of indie development, but indies don't work with the Frostbite engine.


----------



## D007 (Aug 28, 2014)

lilhasselhoffer said:


> When I began reading I was getting ready to make the comment about your other thread.
> 
> Now, the problem I have with this statement is simple.  $1500 isn't a reasonable price for your average consumer.  When you can get a 50" 1080p television for $300, and a "next gen" console for about half of the cost of just a decent 4K monitor you're not aiming to the mainstream.



Good luck waiting for that, if you want big screen gaming.
1080p TV's are still 600ish dollars in the 50" range.
No one is going to wait 10 years for prices to become "reasonable" as you call it.
I consider the prices reasonable currently.
Not everyone can afford it but such is life.
Not everyone buys a Ferrari either.



lilhasselhoffer said:


> As far as Mass Effect, I call crap on that.  ME 1 ran on the original Xbox, and I got it running well at 1080p on a 3650 GPU.  ME 2 and 3 might have required more chutzpah, but we're only looking at an Xbox 360.  If a modern card, that is functionally 3 or more generations improved upon the mid-range GPU in the 360, cannot run the game at 4K it would be immensely surprising.



I should have elaborated, and that's my fault. I am running ME 1, 2, and 3 with 4k mods like this one for ME1:
(MEUITM)http://www.moddb.com/mods/mass-effect-1-new-texture-updatesimprovements-mod
and ini tweaks for 4k shadows and all the extras.
My game is NOTHING like your dumbed-down Xbox graphics version lol..
If an Xbox tried to play what I am playing, it would shit itself and die in short order. 



lilhasselhoffer said:


> Now, the next problem with your assumptions relates to interpolation.  A 4K monitor isn't any clearer than a 1080p monitor, with a 1080p signal.  You're saying the image is clearer, but what you really mean is that the pixel density is high enough that everything looks less jagged.  Interpolation is effectively just smearing enough vaseline on the screen so that the lack of actual pixel differentiation is not noticeable. My 23" 1080p monitor has a higher pixel density than a 50" 4K monitor.  Conflating the two is at best a disingenuous comparison.



Your argument is based on what I consider tiny, 23" monitors. Go big or go home is how I see it.
50" gaming ftw.
Of course you won't see much, if anything, @ 23".
@ 50" however it is immensely better, so your argument is moot in that aspect, if someone wants big screen gaming like myself.
I don't crunch myself into some desk chair and huddle in front of my monitor like Gollum and his precious.
I sit back and lounge on my La-Z-Boy, in comfort, 6 feet away, with full surround sound baby!
Go big or go home. 



lilhasselhoffer said:


> Finally, everything else wrong about what you are saying.  The average consumer couldn't understand the difference between 1080i and 1080p.  The differentiations in the various HDMI standards, what the heck DP is, and how to even begin getting the correct driver and work-arounds is well beyond their capacity.  For a moment, consider that the average consumer is your grand mother (assuming that old of a relation still exists for you).  They want to buy cable x, plug into device y, and connect to Monitor/TV z.  All of this complexity is a barrier to any regular person adopting the technology.



You insult the intelligence of the average consumer and must think yourself immensely intelligent about the i vs p thing..
Everyone and their grandma knows p is better.
Don't flatter yourself lol..
There is no "work around" you need to worry about at all, as you have implied.
The workaround implements itself; you do nothing.
4k works as is, no modifications necessary and it is a huge visual improvement.

You base your arguments on people in retirement homes for high end gaming?
That is beyond ridiculous man. How many people in retirement homes do you know of that even care about 4k?
How many do you know doing high end gaming? lol
How many do you know buying 4k set ups?
NONE.. They aren't even in the picture, not part of the demographic, not part of the sales pitch.
4k and Nvidia high end gpus do not market to 80 year olds.
They market to the "Enthusiast" group which is generally younger.
That is common sense.
Lmfao man.. Come on now at least say things that seem possible. That's beyond nonsensical.



lilhasselhoffer said:


> I guess, put simply, you and I are not the center of the world.  In all consideration, none of us on a tech forum discussing 4K really are the real world.  No matter what you can do today, the relevant question is what can grandma do without a call to tech support.  4K isn't anywhere near being easy enough for grandma, and thus isn't a pressing issue right now.  Heck, if the current generation of consoles lasts 7 years the first time we'll really be getting into this debate is in another 4 years when the new console rumors start to swirl, and a 4K monitor 46"+ costs less than $500.  PC gaming is experiencing an upswing because of indie development, but indies don't work with the Frostbyte engine.



Again basing high end gaming and super gpu sales on people in retirement homes?
Half of the people you are talking about will die with 1080p and be happy with it.
I still know a lot of them using tube TVs with no desire to upgrade whatsoever.
They are not part of the demographic and anyone knows that.

In 4 years, 4k will NOT be 500 dollars @ 50".
Just like 1080p, over 6 years after it was introduced (which is now), STILL is not 500 dollars in the 50" range for high end sets. 
You will wait forever and NEVER play on a big screen 4k system.
If that's what you want then fine. But I'm pretty sure the rest of us will pay up if we want to and rock out.


----------



## Champ (Aug 28, 2014)

I bought my monitor now, planning for tomorrow. If you have the funds, that's what I recommend. By the looks of it, a 60 Hz 4K monitor bought now will last you a good 5 years before we can start maxing it out or the affordable single-card solutions come along. I'm about to run 3 290s. Just ordered 2. I'm just riding along until the next big 4K breakthrough.


----------



## andrewsmc (Aug 28, 2014)

Which one is the PS4?


----------



## lilhasselhoffer (Aug 28, 2014)

D007 said:


> Good luck waiting for that, if you want big screen gaming.
> 1080p TV's are still 600ish dollars in the 50" range.
> No one is going to wait 10 years for prices to become "reasonable" as you call it.
> I consider the prices reasonable currently.
> ...





...not sure if trolling, or completely detached from reality.


$300 50" sets appear quite frequently, as deals and offers.  The argument that "high-end" displays will never be that cheap is fallacious.  By definition, a high-end monitor is not cheap...  Not seeing why you conflate the high-end of televisions with what the average purchase is.  Right now a decent 1080p television runs near $500, but you're still looking at a console+TV at a price of $1000.  That's 67% of the $1500 you've quoted, and significantly less when you've got a $1000 or more computer that you've purchased in order to even run 4K content.  1000/2500 = 2/5 = 40%.  So you could experience gaming on a console for 40% of the cost of a PC.  The PC may be much prettier, but not 250% of the price prettier.


Yes, I equate the average user to my grandma.  I'm not sure why you're so against it, but be realistic.  When 1080 resolutions first came out, people conflated all the values together.  720p must be worse than 1080i, because 1080>720.  People eventually worked that crap out, but eventually wasn't instantly.  Taking offense at this statement is foolish, as there are plenty of people who proudly still own 1080i sets.

You somehow assume that most people are doing a bunch of research.  I call crap, and I call you misinformed.  Without searching it out, what is the difference between HDMI 1.0, 1.2, 1.3, 1.4, 1.4a, 1.4b, and 2.0?  Can't tell me, can you?  That's relatively easy for us to look up, but what about the consumer who sees "HDMI compatible" on a cheaply priced TV?  What about the person who has spent 50+ hours working and just wants to watch a movie or play a game, and not sink twenty hours into figuring out how to get it working correctly?  You seriously think that isn't the bulk of consumers?  I'll concede a little here; the measuring stick shouldn't be grandma, it should be your father/mother.  If it takes more than 10% of the usable time to get something running for entertainment, then it isn't worth it to them.  Life is stressful enough without having to spend hours getting crap set up.  This is why services like Geek Squad exist, despite the fact that a teenager can set this stuff up.  $50 to not deal with BS means more enjoyment.  

The ME argument is face palm stupid.  Increasing shaders and adding higher resolution textures is nice, but it isn't anything new.  I'd conjecture that 3+ generations of GPU developments should do this easily, as the amount of shader units on GPUs has increased by that much in the allotted time.  What I said was that ME ran on the original Xbox.  If all you are doing is pumping up the textures and tweaking shaders 90% of the graphics are the same as before.  It may be prettier, but immensely taxing isn't a valid way to describe it.    

Finally, are you seriously getting into an e-peen measuring contest?  The "my screen is big enough to see from the living room" argument is just face-palm stupid.  You're saying that four monitors glued together is just fine, but one monitor at 1/4 the viewing distance is somehow worse.  You know what, I'll concede.  I can spend $140 a monitor, $20 on glue, and $100 on the stand, for a total of $680.  That'll give me the 46" size, 4K resolution, and still have me at less than half of what you paid.  I cannot put it any more concisely than that.  4K will match 1080p only when our streaming services, or physical media, can deliver content.  Gaming is great, but far from the deciding reason that most people buy a television.  Television sales and monitor sales are inextricably linked, so 4K isn't gaining a lot of ground.



Argue all you'd like, but I ask you to read previous posts.  The vast majority of users were at 1080p or less on resolutions, with 4K being basically a statistical anomaly instead of a real segment.  What you're saying that I don't get is basically the market share of Windows Vista.  It exists, but it isn't anything more than a niche at this moment.  Planning around a niche is stupid, hence why 4K is stupid.  In a few years, it won't be Vista, it'll be Windows 7.  The market share will be up, and the barrier to entry will be low.  When pricing and barrier to entry are low, 4K will make sense.  Of course, by that point the 7xx and Rx 2xx series GPUs will be an anachronism.  Asking if they'll still run the content at that point is foolish, and seems to be what the OP is stuck on.  Just because we can, doesn't mean that doing it matters.


----------



## 64K (Aug 28, 2014)

lilhasselhoffer said:


> I cannot put it any more concisely than that.  4K will match 1080p only when our streaming service, or physical media, can deliver content.  Gaming is great, but far from the deciding reason that most people buy a television..



I don't know about the situation with others, but my Comcast HD service, which I pay extra for, gives me most channels at 720p or 1080i, citing bandwidth issues as the main reason they are not mostly 1080p. If this is true, then I doubt a 4K TV will have many channels, if any, actually at 4K.


----------



## D007 (Aug 29, 2014)

My Lord Hassle. You are a complete moron.. idk what else to say.
You have a severe attitude problem and you must be like 12 years old.
One more for the ignore list.
You have exactly ZERO idea of what you are talking about.
You spew the most nonsensical shit I have ever heard..
Iggy time..

Have fun waiting for that hardware for the next 20 years..lol..


----------



## Toothless (Aug 29, 2014)

D007 said:


> My Lord Hassle. You are a complete moron.. idk what else to say.
> You have a severe attitude problem and you must be like 12 years old.
> One more for the ignore list.
> You have exactly ZERO idea of what you are talking about.
> ...


Insults and no hard proof...

Hmmmmm....


----------



## D007 (Aug 29, 2014)

Lightbulbie said:


> Insults and no hard proof...
> 
> Hmmmmm....


Proof of what?
That people cry about paying for top end hardware?
You want proof of that?
Talk to the guy using a mid grade card, crying about why he can't afford a 4k TV.
As if he could even run it if he had one.

I'm not crying that I can't buy that Ferrari.
Deal with it.

I don't need to prove anything to you.
I can see my proof every single time I fire up my badass 4k TV lol..


----------



## lilhasselhoffer (Aug 29, 2014)

D007 said:


> Proof of what?
> That broke ass people cry about paying for top end hardware?
> You want proof of that?
> Talk to the guy using a mid grade card, crying about why he can't afford a 4k TV.
> ...



You are an arrogant spoiled child, unwilling to consider any view but your own and unwilling to bring facts to the table that support your conclusion.  Obviously, you're going to want the last word, so you're welcome to it, after I put our discourse together so you can point out where I was somehow a "moron" despite the provided evidence.

As to what is being said, let's go back over it all.
1) At no point did I say 4K was impossible.  I said talking about it in relation to graphics cards currently on the market is stupid.  4K exists, but the amount of people actually using it is a statistical anomaly, as proven with facts in the thread.  You responded with unintelligible anger about me having a crappy screen if all I wanted was 1080p on a 23" monitor.
2) You said the pricing of a 4K monitor was reasonable at $1500 for a 50" screen.  I pointed out that the average purchase, by definition, isn't a high-end monitor.  The average price of a monitor somewhere in that size, and at 1080p, is currently between $500 and $300.  You seemed to then imply that all of us "peasants" should get a better job and buy something more expensive.
3) When confronted with a more than 250% difference between console 1080p and PC 4K gaming, you dismissed the difference out of hand.  No response to the facts that this difference exists was ever given, unless you calling us "broke ass" is a response. 
4) After all of this, you seem to believe offering proof for your point of view is somehow beneath you.   You constantly insult us "peasants," and somehow come to the conclusion that you are above having to provide a reasonable response.
5) Despite everything else, your insults have been tolerated.  You should have been flagged as intentionally offensive on this forum.  For the record, I'm not doing so because you seem to just be on a tangent and undeserving of scrutiny by a mod.  Everyone has their off days, and moments where emotions flare.  I've kept my responses civil, and I expect the same from another adult.  If you are incapable of this I will be asking for a review of your conduct.

You have demonstrably acted as a jerk.  This isn't the first time either.  You said you'd never respond to me a few months back on another thread.  What has been proven here is that you believe you are better than other people, and thus other people are idiots.  Fine, I'd suggest you take your own advice and stop responding to me.  If you'd done so this whole discussion never would have happened.

My last comment, before I open the floor to another round of misguided insults, is that you are completely incapable of an adult discussion.  If you wanted to prove that you were right, you could have brought in sales figures for 4K televisions, advertisements listing how many GPUs say they can run a 4K monitor, pricing figures showing that 4K costs are going down, or even figures showing increased uptake of HDMI 2.0 and DP that prove more devices are capable of using 4K signals.  I have seen exactly none of this data from you, only the anecdote that "my TV is bad ass" to support your statements.  Instead you whine like a child scorned, and call me names.  I'm still waiting on a response about the differences between the HDMI standards.  Perhaps, just perhaps, next time you can come to the table with an argument.  As it stands, your argument is name calling and sticking your fingers in your ears.


----------



## ne6togadno (Aug 29, 2014)

Steevo said:


> Who needs consoles again?
> 
> View attachment 58727


 for that skyrim screenshot


----------



## newconroer (Aug 29, 2014)

I think people are forgetting here that games aren't shipping with 4k or 8k native textures, and third party modding isn't always using native textures either.

The only thing I fear about 4k is that it will get dominated by 'tv' resolution formats, giving us pixel counts that are too wide when they should be taller, like those found in PC-format monitors.


----------



## andrewsmc (Aug 29, 2014)




----------



## vawrvawerawe (Sep 21, 2014)

Okay guys, it's been about 3 months since I created this thread.

Time for the answer!

According to your votes,
22 of you thought the left side was the PS4 (40.7%), whereas
32 of you thought the right side was the PS4 (59.3%).






AND NOW FOR THE ANSWER:


Spoiler



Sadly, more than half of you thought the PS3 graphics were actually better than the PS4 graphics.

The answer is that the LEFT side is the PS4! So, should you upgrade to a PS4 for graphics' sake alone? Probably not. Unlike the way the PS3 was a huge improvement over the PS2, the PS4 was hardly even an upgrade - if you can even call it an "upgrade" with the loss of DLNA, media streaming, and other things. Personally I will *not* be getting a PS4 any time soon.


----------



## Toothless (Sep 21, 2014)

vawrvawerawe said:


> Okay guys, it's been about 3 months since I created this thread.
> 
> Time for the answer!
> 
> ...


I fail to see how it is sad. Seems like all Sony did was add some features, make games exclusive to the PS4, and go from there.


----------



## Vario (Sep 21, 2014)

Well I voted for the left side.  The shirt had blurry textures on the right.  Looks like crap on both though.


----------



## TheoneandonlyMrK (Sep 21, 2014)

I fail to see the relevance of the thread title with ps4s.
As I said before, helllllllll no, we're nowhere near having one GPU that can max all features and games at 4k with sustained fps high enough for everything, let alone three such monitors as some (me inc) dream of owning, or one GPU that will push the same game quality at up to 120fps at dual-HD-type resolution for Oculus VR etc, but hey ho, another 20-30% just got dished up by Nvidia.

As for the PS4, I want one asap, not because it's better than my PC (a recent look at FIFA 15 tells me my PC can still beat it on quality and performance) but because of social gaming and exclusive games, plus ease of use........

Better thread next time please one that makes sense if possible.


----------



## Aquinus (Sep 21, 2014)

theoneandonlymrk said:


> I fail to see the relevance of the thread title with ps4s.


I fail to see the point of most of the threads that @vawrvawerawe makes, as he seems to like leaving and coming back on a monthly basis but not sticking around for more than a few days.

The last time he posted was 2 months ago, and it was that long before he posted again.

Let's not keep reviving vawrvawerawe troll threads.


----------



## TheoneandonlyMrK (Sep 21, 2014)

Shit troll thread though. I mean, I would not mind discussing with y'all what we would like and hope to see in PC gfx, but this wasn't it.


----------



## Naito (Sep 21, 2014)

It's way too early to form a solid opinion in regards to PS4 (and Xbox One) graphics this early into their lifespan. Once the SDK matures and the developers squeeze every ounce of performance out of the hardware, then one could form said opinion. The game used in the comparison is a port, not something specifically 'hardcoded' to the PS4 hardware. How do you think they get games like GTA V running on 'ancient' hardware such as the Xbox 360 and PS3? Standardised hardware allows developers to program much, much closer to the hardware. This cannot simply be done on PC, not only due to the vast differences in architectures, but generational and configuration differences too. 

So I believe that over the next 2 to 3 years, as the current consoles' hardware is fully exploited (and thus becomes more capable), games on PC should gradually get more demanding.


----------



## commander calamitous (Sep 22, 2014)

I have long suspected stagnation.  I have the OP beaten by many years.

The laws of physics, a dumbed-down population, and a limp economy may prove insurmountable.

Still, there are frontiers to push:
120 fps.  I must have it, and that is why I multi-GPU.
1440p


----------



## Toothless (Sep 22, 2014)

Aquinus said:


> I fail to see the point of most of the threads that @vawrvawerawe makes, as he seems to like leaving and coming back on a monthly basis but not sticking around for more than a few days.
> 
> The last time he posted was 2 months ago, and it was that long before he posted again.
> 
> Let's not keep reviving vawrvawerawe troll threads.


He seemed to have revived his own this time.


----------



## entropy13 (Sep 22, 2014)

It's a bit of a misleading comparison though. The colors are different too, not necessarily the image quality.

If I took a screenshot of the same scene in a game, with the same video card, except for a slight difference in color settings, I'd get the same perception that they're "different" even though the performance level is the same, the hardware is the same, and the game is the same.


----------



## RejZoR (Sep 22, 2014)

Where is the voting option "The Dark side" ?


----------



## Tonduluboy (Sep 22, 2014)

So basically, the OP is trying to tell us there's no point in getting a PS4 becoz of no improvement at all? Becoz of 1 picture?
When a reviewer does a GPU review, it includes so many games that we get a general idea of whether the GPU is good or not. 

I don't have a PS3 or 4, I only have a PS1 & PS2. I don't like playing FPS games using a PS controller; using a mouse is much easier, and a PC can do much more than gaming only. And if I wanna use a controller, I can just use my Logitech F710.


----------



## rooivalk (Sep 22, 2014)

_If this thread is talking about FPS:_
As long as a single mid-high GPU can play most decent (taxing) games at the industry standard resolution at decent fps (>50 for me), I consider it enough.
Currently that's 1920x1080, but 4K is right at the door, and since a single 970 still can't even pass 30fps in BF4 at 4K with no AA, then yes, we're still far from the max. 
No need to see the Crysis 3 result though.

_If this thread is talking about image quality:_
Image quality will always be improved. 
I thought NFS Hot Pursuit (2010) was gorgeous, and I don't think NFS Most Wanted (2012) and Rivals (2013) are better. But after playing the latter two, going back to Hot Pursuit is a bit of an eyesore.

_If this thread is talking about console:_
That's not why you buy consoles. You want console-exclusive games and family/friends local co-op; besides, the newer consoles (XB1/PS4) are still in their infancy.


----------



## rtwjunkie (Sep 22, 2014)

I got it right! That and $2 will get me a cup of coffee, LOL.


----------



## erocker (Sep 22, 2014)

Don't let one game that was basically a direct port from a PS3 to a PS4 make the decision for you. Either way, good for you that you're not buying a PS4. You seem to have a decent PC.


----------



## RCoon (Sep 22, 2014)

Aquinus said:


> vawrvawerawe



I have been summoned.
I came here to kick ass and beat heretical varwawawhwahdwaewera tail. And I'm all out of ass.

I recall him being the fellow that made a million threads on his first day, suggested a submerged PC inside a submerged tank inside a fish tank, and then made a build thread that turned out to be a fictional build, unless I am recalling things incorrectly. He also said he was a web developer and a general PC Master Race member. I am here to state otherwise. Last I recall, the mods told him to calm his baby lumps, and he left. How this thread got by unnoticed is beyond my comprehension. The guy is like the replying version of Jorge, only he stopped replying.

I vote for thread closure, nothing of value lost.

EDIT: For the love of god, he's the one that thought putting two ethernet cables into one machine, from the same router, would double his internet speeds because speedtest.net told him so.


----------



## TheMailMan78 (Sep 22, 2014)

RCoon said:


> I have been summoned.
> I came here to kick ass and beat heretical varwawawhwahdwaewera tail. And I'm all out of ass.
> 
> I recall him being the fellow that made a million threads on his first day, suggested a submerged PC inside a submerged tank inside a fish tank, and then made a build thread that turned out to be a fictional build, unless I am recalling thing incorrectly. He also said he was a web developer and general PC Masterrace member. I am here to state otherwise. Last I recall the mods told him to calm his baby lumps, and he left. How this thread got by unnoticed by me is beyond my comprehension. The guy is like the replying version of Jorge, only he stopped replying.
> ...


<<< (slowly disconnects his second Ethernet cable). YEAH WHAT A DUMBASS!


----------



## Frick (Sep 22, 2014)

RCoon said:


> EDIT: For the love of god, he's the one that thought putting two ethernet cables into one machine, from the same router, would double his internet speeds because speedtest.net told him so.



A friend claimed he had modded two 56k-modems into one and thus achieved 112kb/s.


----------



## Sasqui (Sep 22, 2014)

Frick said:


> A friend claimed he had modded two 56k-modems into one and thus achieved 112kb/s.



I recently plugged two computers into the same monitor, it looked better than the PS8!


----------



## yogurt_21 (Sep 22, 2014)

erocker said:


> Don't let one game that was basically a direct port from a PS3 to a PS4 make the decision for you. Either way, good for you that you're not buying a PS4. You seem to have a decent PC.


zomg Half Life 1 looks exactly the same on my R9 290 as it did on my Radeon 9000 128MB. I have totally wasted my money upgrading all these years...


----------



## Aquinus (Sep 22, 2014)

Frick said:


> A friend claimed he had modded two 56k-modems into one and thus achieved 112kb/s.


Link aggregation? PPP network devices? No problem. Linux supports multilink PPP. The ISP must support it though, so I doubt he "modded" anything; if the story was true, he more likely actually had an ISDN connection.


			
Wikipedia said:

> *Multilink PPP* (also referred to as *MLPPP*, *MP*, *MPPP*, *MLP*, or Multilink) provides a method for spreading traffic across multiple distinct PPP connections. It is defined in RFC 1990. It can be used, for example, to connect a home computer to an Internet Service Provider using two traditional 56k modems, or to connect a company through two leased lines.
> 
> On a single PPP line frames cannot arrive out of order, but this is possible when the frames are divided among multiple PPP connections. Therefore Multilink PPP must number the fragments so they can be put in the right order again when they arrive.
> 
> Multilink PPP is an example of a link aggregation technology. Cisco IOS Release 11.1 and later supports Multilink PPP.
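The fragment-numbering idea the quote describes can be sketched in a few lines. This is a toy illustration only, not real MLPPP code; the `fragment` and `reassemble` helpers are made up for the example:

```python
# Toy illustration of why MLPPP numbers its fragments (RFC 1990):
# frames split across two links can arrive interleaved, so the
# receiver must reorder by sequence number before reassembly.

def fragment(payload: bytes, size: int):
    """Split payload into (sequence_number, chunk) fragments."""
    count = (len(payload) + size - 1) // size
    return [(i, payload[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(received):
    """Reorder fragments by sequence number and join the chunks."""
    return b"".join(chunk for _, chunk in sorted(received))

message = b"hello, multilink ppp"
frags = fragment(message, 4)

# Send even-numbered fragments over link A, odd-numbered over link B.
# If link B is faster, its fragments arrive first -- out of order overall.
link_a = frags[0::2]
link_b = frags[1::2]
received = link_b + link_a

assert reassemble(received) == message
```

Without the sequence numbers, the receiver would have no way to know that link B's chunks belong *between* link A's, which is the whole point of the numbering in RFC 1990.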


----------



## rtwjunkie (Sep 22, 2014)

TheMailMan78 said:


> <<< (slowly disconnects his second Ethernet cable). YEAH WHAT A DUMBASS!


 
LMAO!!  Quite funny...my coworkers asked what I was laughing at.

@RCoon:  I had no idea....and I participated here. SMH


----------



## Frick (Sep 22, 2014)

Aquinus said:


> Link aggregation? PPP network devices? No problem. Linux supports multilink PPP. The ISP must support it though, so I doubt he "modded" anything; if the story was true, he more likely actually had an ISDN connection.



You'd need two lines for that, and they didn't have that. This was in the 8th grade or thereabouts. I was even more ignorant than I am now, but that guy just made stuff up to impress people. Years later I met him and he told me he was in college to become a "security expert" and that he toyed around with Linux a lot, but he had no idea what a distribution was. Or what computer he had, just that it was fast.


----------



## Aquinus (Sep 22, 2014)

Frick said:


> You'd need two lines for that, and they didn't have that. This was in the 8th grade or thereabouts. I was even more ignorant than I am now, but that guy just made stuff up to impress people. Years later I met him and he told me he was in college to become a "security expert" and that he toyed around with Linux a lot, but he had no idea what a distribution was. Or what computer he had, just that it was fast.



So really what you're saying is that this person you knew acted like a know-it-all, which is basically the kind of thing @vawrvawerawe does. In reality he probably lives in his parents' basement playing WoW all day long.


----------



## Ahhzz (Sep 26, 2014)

http://www.nextpowerup.com/news/13434/the-evil-within-pc-version-requires-4-gb-vram.html

'Nuff Said.


----------



## 64K (Sep 26, 2014)

Ahhzz said:


> http://www.nextpowerup.com/news/13434/the-evil-within-pc-version-requires-4-gb-vram.html
> 
> 'Nuff Said.



Yeah, a lot of us knew that was coming. When developers have more to work with (new-gen consoles) they use more resources.

A little off topic, but I was reading an article a while ago that said Nvidia and AMD are preparing for 8K monitors in the future. That's basically the pixel count of four 4K monitors, or sixteen 1080p monitors. I can't even imagine what amount of GPU power that's going to require.

http://www.pcgamesn.com/both-nvidia...an-eye-that-resolution-is-close-to-perfection
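The pixel math above checks out. A quick sketch, using the standard 8K UHD (7680x4320), 4K UHD (3840x2160), and 1080p (1920x1080) resolutions:

```python
# 8K has exactly 4x the pixels of 4K UHD, and 16x the pixels of 1080p.

def pixels(width: int, height: int) -> int:
    """Total pixel count of a display at the given resolution."""
    return width * height

p1080 = pixels(1920, 1080)  # 1080p:  2,073,600 pixels
p4k   = pixels(3840, 2160)  # 4K UHD: 8,294,400 pixels
p8k   = pixels(7680, 4320)  # 8K UHD: 33,177,600 pixels

assert p4k == 4 * p1080
assert p8k == 4 * p4k == 16 * p1080
```

Each step up doubles both dimensions, so the pixel count (and roughly the fill-rate burden on the GPU) quadruples each time.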


----------



## Frick (Sep 26, 2014)

"share requirements that show the game the way it was meant to be played."

It's also coming to the Xbox 360 and PS3, so I don't buy that argument.


----------



## Ahhzz (Sep 26, 2014)

64K said:


> ....
> A little off topic, but I was reading an article a while ago that said Nvidia and AMD are preparing for 8K monitors in the future. That's basically the pixel count of four 4K monitors, or sixteen 1080p monitors. I can't even imagine what amount of GPU power that's going to require.





and, not to be left out.....
http://www.nextpowerup.com/news/134...-mordor-ultra-textures-require-6-gb-vram.html


----------



## rtwjunkie (Sep 26, 2014)

Ahhzz said:


> and, not to be left out.....
> http://www.nextpowerup.com/news/134...-mordor-ultra-textures-require-6-gb-vram.html


 
 What....the.....????  This is going to start getting ridiculous real quicklike for anyone prior to the 900 series!


----------



## Ahhzz (Sep 26, 2014)

My thoughts too... I was hoping to slide into a 290 series at some point, but now I think I'll just crawl under a rock somewhere and play my old gameboy B&W.....


----------



## rtwjunkie (Sep 26, 2014)

Ahhzz said:


> My thoughts too... I was hoping to slide into a 290 series at some point, but now I think I'll just crawl under a rock somewhere and play my old gameboy B&W.....


 
I had the same thought...just don't buy any new games and replay what I have....as well as stuff I've never touched.  *Sigh*


----------



## Frick (Sep 26, 2014)

Ahhzz said:


> and, not to be left out.....
> http://www.nextpowerup.com/news/134...-mordor-ultra-textures-require-6-gb-vram.html



Had it been a Jedi Knight I would have liked it. I've wanted a Jedi Knight since the Doom 3 engine came out.


----------



## 64K (Sep 26, 2014)

Ahhzz said:


> and, not to be left out.....
> http://www.nextpowerup.com/news/134...-mordor-ultra-textures-require-6-gb-vram.html



I guess that would help explain the rumors about Big Maxwell coming with 8 GB of VRAM. This may turn out to be marketing hype about Shadow of Mordor, but I felt like we turned a corner with Watch Dogs and its 3 GB VRAM minimum requirement for Ultra settings. Even the mid-range Maxwells, the GTX 970/980, are coming with 4 GB of VRAM.


----------



## vawrvawerawe (Jan 16, 2015)

RCoon said:


> I have been summoned.
> I came here to kick ass and beat heretical varwawawhwahdwaewera tail. And I'm all out of ass.
> 
> I recall him being the fellow that made a million threads on his first day, suggested a submerged PC inside a submerged tank inside a fish tank, and then made a build thread that turned out to be a fictional build, unless I am recalling thing incorrectly. He also said he was a web developer and general PC Masterrace member. I am here to state otherwise. Last I recall the mods told him to calm his baby lumps, and he left. How this thread got by unnoticed is beyond my comprehension. The guy is like the replying version of Jorge, only he stopped replying.
> ...



you must be talking about someone else.

1) didn't want a pc in a fish tank
2) I never made a fictional build. The real build I made 2 years ago is the build I am using to type this post. It's still top of the line over 2 years later.
3) What is PC Masterrace? Definitely not one since I don't even know what it is.

Go troll elsewhere. No wonder no one listened to your request to close.

And stop saying God's name in vain.



Ahhzz said:


> http://www.nextpowerup.com/news/13434/the-evil-within-pc-version-requires-4-gb-vram.html
> 
> 'Nuff Said.


Sweet


----------



## Blue-Knight (Jan 16, 2015)

vawrvawerawe said:


> What is PC Masterrace?


My definition:
PCMasterRace is a group of humans who believe themselves to be superior just because they have the ability to build expensive, "high-end" computers to play games, on a yearly or shorter upgrade cycle. They often appear in a post or comment by saying: #PCMasterRace <text>

Their many common behaviors include, but are not limited to: exhibiting their hardware among themselves in a reserved space, area, or topic; running several benchmarks and comparing their results against each other's; and, most obviously, insulting consoles and console gamers.

They can also be identified by the use of similar avatars/profile pictures: http://i2.kym-cdn.com/photos/images/original/000/508/616/01f.jpg

Wikipedia's definition:
http://en.wikipedia.org/wiki/PC_Master_Race


----------



## rtwjunkie (Jan 16, 2015)

Ok, his post and blue-knight's were the biggest pieces of trolling I have seen in here in a long time. He's on my ignore list.

B-K I will give the benefit of the doubt, since I have interacted with him quite a bit on this site, although it boggles my mind how someone can be on a basically PC-centric site and make such hateful comments against PC users.


----------



## quest4glory (Jan 16, 2015)

Blue-Knight said:


> My definition:
> PCMasterRace is a group of humans who believe themselves to be superior just because they have the ability to build expensive, "high-end" computers to play games, on a yearly or shorter upgrade cycle. They often appear in a post or comment by saying: #PCMasterRace <text>
> 
> Their many common behaviors include, but are not limited to: exhibiting their hardware among themselves in a reserved space, area, or topic; running several benchmarks and comparing their results against each other's; and, most obviously, insulting consoles and console gamers.
> ...


 
Really it's just a bunch of dorks on reddit who think they're being satirical, ironic, and condescending all at the same time, yet most of the subscribers truly believe in the BS that gets spewed. It's a self-feeding beast of silliness that's good for nothing more than a brief chortle before moving along with your life.


----------



## Solaris17 (Jan 16, 2015)

Well this got out of control fast.


----------

