I never understood that argument. It seems to be rooted in old studies claiming the human eye can't see more than 30-60 Hz, but I sure as hell notice the difference between 30, 60, 75, 85, 100, and 120 Hz.
There's a difference between Hz and fps. The human eye can notice coarse, overall changes like refresh-rate flicker, or a sudden change in one spot (e.g. something in your peripheral vision flipping from red to green), at a much faster rate than it can actually see with detail, such as a frame in a movie where the brain is trying to take in as much information as possible.
There's also a difference between what we can see (if we try) and what we need, or what is sufficient for a sense of fluid motion. That's where most of the arguments come from. I think it's well proven that for the great majority of people, 24 fps at 72 Hz is enough to be "satisfactory", since that's what films are played at. Of course more is better, but there's probably a point not much higher where going higher stops mattering.
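Just to make the fps vs Hz split concrete, here's a back-of-the-envelope sketch (illustrative numbers only, nothing authoritative): showing a 24 fps film on a 72 Hz projector or display just means each frame gets repeated three times, so the refresh rate goes up but the motion rate stays at 24 fps.

# rough sketch: 24 fps content on a 72 Hz display
film_fps = 24
refresh_hz = 72

flashes_per_frame = refresh_hz // film_fps   # 72 / 24 = 3 repeats of each frame
frame_time_ms = 1000 / film_fps              # ~41.7 ms of screen time per unique frame

print(f"each film frame is flashed {flashes_per_frame}x "
      f"and stays on screen for about {frame_time_ms:.1f} ms")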
But that is for motion pictures; games are vastly different. While you play, your brain isn't concentrating on the entire picture, it's trained to focus on a few select details like the enemy, so the rate at which it can "see" changes in those areas is higher. Also, a low frame rate goes hand in hand with slow response time and poor mouse control, and our brain picks up on that more easily too (rough numbers below).
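As a rough illustration of the response-time point (purely back-of-the-envelope; the actual input lag also depends on the engine, driver, and display, which I'm ignoring here), the frame time alone puts a floor under how stale your mouse input can be by the time it shows up on screen:

# frame time at a few common frame rates
for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps:3d} fps -> a new frame only every {frame_time_ms:.1f} ms")

# 30 fps -> ~33.3 ms, 60 fps -> ~16.7 ms, 120 fps -> ~8.3 ms:
# at 30 fps your mouse movement can already be a full frame (~33 ms) old
# by the time you see its effect, before any other latency is added.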
So high frame rates are necessary and desirable for gaming, but people are usually wrong when they say they can "see" at very high frame rates. Most people who claim that, if shown a movie alternating back and forth between 48 fps and 96 fps, would not notice the difference while simply watching the film.