
Since when did 60fps gaming become a must

If you hook up a PSX to a HDTV that still has those god awful composite inputs, it'll look terrible compared to an old CRT TV with the same hookups. There are upscalers though... I have a cheap SCART to HDMI converter/upscaler that makes PS1 games look pretty decent. Not a difficult job when starting with SCART though...
Thing is, upscaler or not, newer high-res TVs make the lack of resolution in older games stand out and the whole thing just looks worse.
It's almost like amplifying all the graphical limitations/imperfections of an older game, because newer TVs show these in sharp detail whereas older ones smoothed/blended them instead.

Believe me, I've tried it with an HDTV and with my 4K monitor, and that's always been the result versus just using my old TV instead, and that's across all the older consoles I have.
Even a PS2 can look crappy, though it's not quite as bad; it's only with the PS3 and similar-gen consoles that it looks fine on an HDTV, but that's also a reflection of how far the tech has advanced over time.
Not much we can do about it except deal - it is what it is.
 
No, the human eye can see 60 fps. The reason movies are made at 29fps is that we don't typically have fast movements.
However, at 29fps, if you watch a soccer player kicking a ball, you'll see some ghosting effect. Skiing, racing....

This is actually not why. In the film industry the standard is 24 fps, which was originally the minimum for smooth motion that was budget friendly. It is way too low, so your brain has to 'imagine' the frames in between, giving the movie a more dramatic feel and helping suspend disbelief (an unintended effect) -- but this is still done today to give movies a certain feel. We can definitely shoot in 60, but it doesn't feel the same; 24 FPS is the 'cinematic standard'.

Watch a cringy high school play at 60 fps and then slow it down to 24fps and you will see what I mean.


EDIT: go to TV setting #4

Needless to say, you want your games to feel as responsive as possible, since faster response = more control = more fun.

Old-school Nintendo games ran at 60/50 fps depending on the TV, and they felt AWESOME.
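
To put rough numbers on that responsiveness point, here's a quick back-of-the-envelope sketch (plain frame-time arithmetic, nothing measured from any game) of how long each frame sits on screen at the rates people mention in this thread:

```python
# Frame time per refresh rate: the longer a frame sits on screen,
# the longer the worst-case wait before your input is reflected.
for fps in (30, 50, 60, 120, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
```

At 60 fps a frame lasts ~16.7 ms versus ~33.3 ms at 30, which goes a long way toward explaining why those 60/50 fps Nintendo games felt so immediate.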
 
No, the human eye can see 60 fps. The reason movies are made at 29fps is that we don't typically have fast movements.
However, at 29fps, if you watch a soccer player kicking a ball, you'll see some ghosting effect. Skiing, racing....
Yes we do. Action movies, car chases...

No, the reason 29 fps (25 in Europe) movies are fine is that the frame rate is constant. The eyes have a much easier job filling in the details when you're presented with still frames at a constant pace. It's the same reason why DOS games were enjoyable without running anywhere near 60, or even 30 fps. They were locked to your CPU's clock speed, so their frame pacing was as constant as it could be. That's also why I prefer a constant 30 fps to a variable 40-60 with dips and hikes.
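
To illustrate the frame-pacing point with made-up numbers (both traces below are hypothetical, chosen only to show the contrast between a steady 30 and a jumpy 40-60):

```python
import statistics

# Hypothetical frame-time traces in milliseconds: a locked 30 fps run
# versus a 40-60 fps run with dips and hikes.
locked_30 = [33.3] * 10
variable = [16.7, 25.0, 18.0, 24.0, 17.0, 23.0, 25.0, 18.0, 22.0, 19.0]

for name, trace in (("locked 30 fps", locked_30), ("variable 40-60 fps", variable)):
    avg_fps = 1000.0 / statistics.mean(trace)
    jitter_ms = statistics.pstdev(trace)
    print(f"{name}: ~{avg_fps:.0f} fps average, {jitter_ms:.1f} ms frame-time jitter")
```

The variable run averages a higher fps, but the frame-to-frame jitter is the part the eye picks up on.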
 
Thing is, upscaler or not, newer high-res TVs make the lack of resolution in older games stand out and the whole thing just looks worse.
It's almost like amplifying all the graphical limitations/imperfections of an older game, because newer TVs show these in sharp detail whereas older ones smoothed/blended them instead.

Believe me, I've tried it with an HDTV and with my 4K monitor, and that's always been the result versus just using my old TV instead, and that's across all the older consoles I have.
Even a PS2 can look crappy, though it's not quite as bad; it's only with the PS3 and similar-gen consoles that it looks fine on an HDTV, but that's also a reflection of how far the tech has advanced over time.
Not much we can do about it except deal - it is what it is.
My point was that using an upscaler with a good quality connection turns out a lot better than simply plugging in a crappy composite connection to an HDTV. Today's TVs have a really hard time with low resolution video, and often can't even properly display low resolutions over better connections like component.
 
My point was that using an upscaler with a good quality connection turns out a lot better than simply plugging in a crappy composite connection to an HDTV. Today's TVs have a really hard time with low resolution video, and often can't even properly display low resolutions over better connections like component.
Oh yeah, I agree with all that.
I also know the games themselves are a factor, since some were done better than others graphics-wise; again, it's just something we have to deal with.
The ones with better graphics tend to have a lower FPS due to the hardware limitations of the time.
There were only a few games that had both good graphics and the FPS to go with it, and even then the FPS suffered at times.
 
60 should be the minimum. I can easily tell the difference between 60 and 120, it's just so much more fluid and responsive. I can also tell the difference between 120 and 144, just 24 higher, but it's quite subtle. I can't comment with certainty about higher fps, because I don't have a monitor that goes higher than 144, but I reckon I'd see the difference and feel it for sure.
 
Then I'm not a normal person as I can't tell the difference between 40 and 60 fps. :roll:
I don't believe that is the case. Even just on the Windows desktop there is a huge difference between 60 and 144, just scrolling a webpage or moving your mouse. Have you tried a 144 Hz monitor?

I haven't tried more than 144, but I assume we start hitting diminishing returns at 240 Hz.
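
For what it's worth, the diminishing-returns intuition falls straight out of frame-time arithmetic; here's a small sketch that just prints how many milliseconds each refresh-rate step actually saves:

```python
# Milliseconds of frame time saved by each refresh-rate step.
# The absolute gain shrinks quickly as the rate climbs.
for lo, hi in ((60, 144), (144, 240), (240, 360)):
    saved_ms = 1000.0 / lo - 1000.0 / hi
    print(f"{lo:>3} -> {hi:>3} Hz: {saved_ms:.1f} ms less per frame")
```

Going from 60 to 144 Hz shaves ~9.7 ms off every frame, while 144 to 240 Hz only shaves ~2.8 ms, so each step gets harder to notice.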
 
I don't believe that is the case. Even just on the Windows desktop there is a huge difference between 60 and 144, just scrolling a webpage or moving your mouse. Have you tried a 144 Hz monitor?

I haven't tried more than 144, but I assume we start hitting diminishing returns at 240 Hz.
No, I haven't. My phone can do 90 Hz, but the only time I can sort of see it is when I'm scrolling, but sometimes not even then. The difference is negligible, at most.

I'd much rather have accurate colours than high refresh rates. I'll take a 60 Hz IPS or VA over a 144 Hz TN without a second thought.
 
No, I haven't. My phone can do 90 Hz, but the only time I can sort of see it is when I'm scrolling, but sometimes not even then. The difference is negligible, at most.

I'd much rather have accurate colours than high refresh rates. I'll take a 60 Hz IPS or VA over a 144 Hz TN without a second thought.
I don't think those are your only options; there are 240 Hz VA panels, for example. Anyways...
 
No, I haven't. My phone can do 90 Hz, but the only time I can sort of see it is when I'm scrolling, but sometimes not even then. The difference is negligible, at most.

I'd much rather have accurate colours than high refresh rates. I'll take a 60 Hz IPS or VA over a 144 Hz TN without a second thought.
tbh it's really not that noticeable between 60 and 90. 60 and 120 on the other hand is like night and day. Even just moving the mouse pointer around it's very noticeable and even more so when windows are dragged around, let alone gaming. I know, because I've got a 144Hz monitor and have tried all the different refresh rates on it.

On top of that, my iPad can do 120Hz. Switch that off and 60Hz looks surprisingly juddery on its crisp display.
 
There is also an event where Ubisoft CEO Guillemot stated that 30 FPS is what gamers want for their 'realistic' gaming experiences. It's the 'best way to game' according to the man. We all know it is utter bullshit. 30 FPS is a frame rate that, for many a game, is not just uncanny but can kill your gameplay. When it drops below that bottom line, it gets exponentially worse.

Enjoy this laugh while we're on the subject :D We all know where AC Unity ended up by now: in the books as easily the shittiest AC ever, or among them.

You mean this? :laugh:

ubisoft.png
 
Not entirely. You need a pretty good HDTV to get image quality similar to what you would have on an old TV, quite simply because scanlines aren't pixel perfect. Basically, old TVs had their own way of passing AA over everything, and it did the job well to smooth out the image. LCDs lack that; they have a fixed pattern they map to.
Absolutely true. When I was a kid we had a big Philips CRT TV in the living room, but it had a matrix screen which wasn't too good and you could see the pixel grid easily. When I entered high school I got as a gift a smaller Grundig that had a planar screen (the "Athens line", if I remember correctly) and it was crazy good: tons of free antialiasing and the colors were insane, a day-and-night difference from the Philips.
I got it together with the PS1 and the upgrade was just insane!
 
Absolutely true. When I was a kid we had a big Philips CRT TV in the living room, but it had a matrix screen which wasn't too good and you could see the pixel grid easily. When I entered high school I got as a gift a smaller Grundig that had a planar screen (the "Athens line", if I remember correctly) and it was crazy good: tons of free antialiasing and the colors were insane, a day-and-night difference from the Philips.
I got it together with the PS1 and the upgrade was just insane!

16- and 32-bit (plus N64) generation console titles are def best played on a ~19" CRT (IMO)
 
16- and 32-bit (plus N64) generation console titles are def best played on a ~19" CRT (IMO)
19" CRT monitor? Not for me. I had a Sony and an Eizo (much better), but I never tried any console on them; I didn't have a converter, plus when I tried some games at 640x480 (PC) the pixels were very annoyingly visible (the usual resolution for a 19" was 1280x1024, or 1024x768 at least).
(16-bit was around a quarter of 640x480; I can't imagine that 256x224 (SNES) looked good on a 19" CRT, or are you talking about emulation?)
But it depends, I guess tastes are different, plus you may have had a good converter?
 
19" CRT monitor? Not for me. I had a Sony and an Eizo (much better), but I never tried any console on them; I didn't have a converter, plus when I tried some games at 640x480 (PC) the pixels were very annoyingly visible (the usual resolution for a 19" was 1280x1024, or 1024x768 at least).
(16-bit was around a quarter of 640x480; I can't imagine that 256x224 (SNES) looked good on a 19" CRT, or are you talking about emulation?)
But it depends, I guess tastes are different, plus you may have had a good converter?

Not on a monitor, but a TV. I'm talking running native here, rather than emulated. Should have specified. Anyway, modern CRT TVs were all 480 scanlines (CRTs have no pixels) regardless of size, and console output scaled to fit. The signal was also interlaced, which combined with a little upscaling produced the "natural" antialiasing referred to earlier in the thread. 19" is large enough that one can still see detail from ~2m away, but not so large that everything starts to look blocky and gross.
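
A rough sketch of that scaling, assuming (as the post above describes) the console image gets stretched across the full ~480 visible scanlines; the source line counts are common figures used purely for illustration:

```python
# How many TV scanlines each source line has to cover when a low-res
# console frame is stretched to ~480 visible scanlines. Non-integer
# ratios get smeared by a CRT's beam and phosphors, but show up as
# uneven rows on an LCD's fixed pixel grid.
tv_scanlines = 480
for console, src_lines in (("SNES (256x224)", 224), ("PS1 (320x240)", 240), ("N64 hi-res (640x480)", 480)):
    ratio = tv_scanlines / src_lines
    print(f"{console}: {ratio:.2f} scanlines per source line")
```

Only the 480-line case divides evenly, which is part of why low-res sources look so rough on fixed-grid displays.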
 
Not on a monitor, but a TV. I'm talking running native here, rather than emulated. Should have specified. Anyway, modern CRT TVs were all 480 scanlines (CRTs have no pixels) regardless of size, and console output scaled to fit. The signal was also interlaced, which combined with a little upscaling produced the "natural" antialiasing referred to earlier in the thread. 19" is large enough that one can still see detail from ~2m away, but not so large that everything starts to look blocky and gross.
OK, you meant a TV.
Regarding the sharpness of the image produced, the shadow mask or aperture grille and its quality made a huge difference, I think. And the difference in clarity between a 640x480 image and a 1280x1024 one was huge imo on a 19" CRT monitor.
 
OK, you meant a TV.
Regarding the sharpness of the image produced, the shadow mask or aperture grille and its quality made a huge difference, I think. And the difference in clarity between a 640x480 image and a 1280x1024 one was huge imo on a 19" CRT monitor.

Oh, for sure. I ran 1152x864 on my last CRT monitor (a 19", incidentally), though, cuz screen elements got too small otherwise.
 
60 should be the minimum. I can easily tell the difference between 60 and 120, it's just so much more fluid and responsive. I can also tell the difference between 120 and 144, just 24 higher, but it's quite subtle. I can't comment with certainty about higher fps, because I don't have a monitor that goes higher than 144, but I reckon I'd see the difference and feel it for sure.

60 looks bad. 80-90 is what I like as my minimum, but I sometimes put up with around 60 for SP games. Typically ray tracing gives me frame rates of around 50-60 in games with DLSS. I often decide to turn it off and get 80-90 fps without DLSS, because the difference in frame rates is very noticeable.

60 is my absolute minimum. I try and get 120 or so. It looks better than 90 without a doubt, but 90 isn't too bad. Now if you're playing an online FPS like Battlefield or CoD, you really want more like 120 average frame rates.
 
Isn't a must for me today, and probably never will be. I've been having fun playing Eternal Sonata in RPCS3 with a frame rate from around 18 FPS up to 30 FPS, and for the first time getting my money's worth from VRR. I usually value image quality over frame rate, unless it's a case of a stutter/tearing fest from not being able to hit the frame target. I am typically happy with 30 or 60 frame rate targets.

I also feel that if the frame rate is too high you lose the immersion, so e.g. flames in games 100% look better at 30, and the higher the frame rate the less immersive an RPG will feel. I also hate it when TVs interpolate frames in movies, as it then feels like I am watching a TV soap.
 
It's always been a must for me ever since the Super Nintendo days and the 90s-2000s arcade era. I think what made me sensitive to framerates was when I first visited an arcade as a kid: I could tell the arcade machines had faster-moving images, almost lifelike movement, and big screens, which to me made them look really fun and premium, for example Marvel vs. Capcom, Darkstalkers, Metal Slug etc. Then there were other machines with 3D fighting that ran at slower frame rates, and this is what helped me differentiate between slow/fast frames.

Moving on to a decade later, I'm trying to achieve 60fps in Crysis lol; for games, low FPS just feels less responsive and less fun. I can tolerate 50-60fps though before I start getting bothered by the movement.

Movies however need to stay at 24fps. 60FPS would make every movie look like those old soap operas that were recorded in 60fps lol.
 
All ya have to do is play something like Yakuza in 30 fps and in 60 fps mode. 60 is so much more responsive; 30 feels sluggish. Sure, you can get used to it, but motion fidelity and responsiveness are a must for something like a brawler. For some RTS or something much slower paced, sure, you can get by with 30 fps.
On top of that there's also the issue of frame pacing and stuttering, which plays an even bigger role. Hell, I'd take a locked 60 with proper frame pacing and no stutter over 100+ fps with stutter caused by frametime spikes any day.
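
To put made-up numbers on that last point (a hypothetical frame-time trace, not a measurement from any real game), an average fps counter can look great while the frametime spikes do all the damage:

```python
import statistics

# Hypothetical trace: mostly ~8 ms frames (~125 fps) with two 25 ms spikes.
# The average still lands around 100 fps, but each spike is a visible hitch.
trace_ms = [8.0] * 18 + [25.0, 25.0]

avg_fps = 1000.0 / statistics.mean(trace_ms)
worst_ms = max(trace_ms)
print(f"average: ~{avg_fps:.0f} fps")
print(f"worst frame: {worst_ms:.0f} ms (equivalent to ~{1000.0 / worst_ms:.0f} fps)")
```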
 
Moving on to a decade later, I'm trying to achieve 60fps in Crysis lol; for games, low FPS just feels less responsive and less fun. I can tolerate 50-60fps though before I start getting bothered by the movement.
It's possible, but Crysis was surprisingly heavy on a few CPU cores. My FX 6300 dropped into the 30s, so basically a CPU with 2x better single-core performance is needed. Surprisingly, it's not the graphics card that is actually the biggest bottleneck. My old FirePro V8800 (Radeon 5870) can run it at 1080p medium-high. Something like an RTX 3050 should be fine for 1080p high-ultra. I also suspect that Crysis can't utilize more than 4GB of RAM due to it being a 32-bit game, but that's just speculation.

All ya have to do is play something like Yakuza in 30 fps and in 60 fps mode. 60 is so much more responsive; 30 feels sluggish. Sure, you can get used to it, but motion fidelity and responsiveness are a must for something like a brawler. For some RTS or something much slower paced, sure, you can get by with 30 fps.
On top of that there's also the issue of frame pacing and stuttering, which plays an even bigger role. Hell, I'd take a locked 60 with proper frame pacing and no stutter over 100+ fps with stutter caused by frametime spikes any day.
Ah yes, you know you are a Yakuza player when you have the hardware for it to run at 60+ fps, but it just drops into the 30s for seemingly no reason. But to be fair, it's a weird quirk of Japanese 3D games. I also played Sonic Generations and my fps was unstable for seemingly no reason. Also had the same issue with Sleeping Dogs.
 