Thursday, December 13th 2012

HD 7950 May Give Higher Framerates, but GTX 660 Ti Still Smoother: Report

The TechReport, which includes latency-based testing in its graphics card reviews, concluded in a recent retrospective review, which took recent driver advancements into account, that the Radeon HD 7950, despite yielding higher frame rates than the GeForce GTX 660 Ti, has higher frame latencies (the time it takes to put rendered frames onto the display), resulting in micro-stutter. In response to the comment drama that ensued, the reviewer made side-by-side high-speed recordings, at 120 FPS and 240 FPS, of a scene from "TESV: Skyrim" as rendered by the two graphics cards, and played them back in slow motion. In slow motion, micro-stutter is more apparent on the Radeon HD 7950 than on the GeForce GTX 660 Ti.

Find the slow-motion captures after the break.
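
For readers unfamiliar with the methodology, the short sketch below (hypothetical numbers, not The TechReport's actual tooling) illustrates why frame latencies matter: two runs can report nearly identical average FPS while one of them spends far more time on frames beyond 50 ms, one of the metrics the site plots in its reviews.

# A minimal sketch of latency-based testing, assuming we already have
# per-frame presentation timestamps (in milliseconds) from a capture tool.

def frame_time_stats(timestamps_ms):
    """Return average FPS, 99th-percentile frame time, and time spent beyond 50 ms."""
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000.0 * len(frame_times) / (timestamps_ms[-1] - timestamps_ms[0])
    ordered = sorted(frame_times)
    pct99 = ordered[int(0.99 * (len(ordered) - 1))]   # simple percentile pick
    beyond_50 = sum(t - 50.0 for t in frame_times if t > 50.0)
    return avg_fps, pct99, beyond_50

# Two runs with almost the same average FPS: one evenly paced, one with a
# 60 ms hitch every tenth frame.
steady = [i * 20.0 for i in range(101)]   # 50 FPS, even pacing
spiky, t = [0.0], 0.0
for i in range(100):
    t += 60.0 if i % 10 == 9 else 15.6
    spiky.append(t)

for name, run in (("steady", steady), ("spiky", spiky)):
    fps, p99, b50 = frame_time_stats(run)
    print(f"{name}: {fps:.1f} avg FPS, 99th percentile {p99:.1f} ms, {b50:.0f} ms beyond 50 ms")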


Source: The TechReport

122 Comments on HD 7950 May Give Higher Framerates, but GTX 660 Ti Still Smoother: Report

#101
Wile E
Power User
It's not a "so called" issue. It's true in some setups. My 4870x2 stuttered. When I went to replace it, I initially went with a 5870, much less stuttering (was still there in some games though), but not enough performance. Basically a cross-grade for me.

So I tried 2 x 5850. Nice performance boost, but stuttering was back in full effect.

So I decided to try nVidia. My 580 gets roughly the same frame rates as the 5850 CrossFire combo in the things I tested, but is noticeably smoother. Stuttering is a rare occurrence on my 580. It does still happen on occasion, but much less frequently.

Of course, ymmv. I'm sure there's more to it than just AMD and their drivers. But on my setup, I get a better experience with nVidia.
Posted on Reply
#102
Melvis
Wile EIt's not a "so called" issue. It's true in some setups. My 4870x2 stuttered. When I went to replace it, I initially went with a 5870, much less stuttering (was still there in some games though), but not enough performance. Basically a cross-grade for me.

So I tried 2 x 5850. Nice performance boost, but stuttering was back in full effect.

So I decided to try nVidia. My 580 gets roughly the same frame rates as the 5850 CrossFire combo in the things I tested, but is noticeably smoother. Stuttering is a rare occurrence on my 580. It does still happen on occasion, but much less frequently.

Of course, ymmv. I'm sure there's more to it than just AMD and their drivers. But on my setup, I get a better experience with nVidia.
See, this is what I mean: I ran a single 4870X2, then went to 2x 4870X2, and never saw this stuttering issue. I think it depends more on the software and hardware installed on each individual machine than just on what they're claiming.

And let's face it: if it were THAT BAD, then no one would be buying AMD cards, period. But we both know that isn't true.
Posted on Reply
#103
mediasorcerer
I saw this over at Ars yesterday. I'm very happy with mine; as if the human eye can see micro-stuttering anyway, lol. Cinema is 24 frames per second, and nobody's complained about that for the last 100 years, have they?

We are talking milliseconds; can anyone out there honestly tell me they can see in milliseconds, lol?


Give me the bus width and extra video RAM any day; much more future-proof.


It's a great card for the money, plain and simple.
Posted on Reply
#104
Wile E
Power User
MelvisSee, this is what I mean: I ran a single 4870X2, then went to 2x 4870X2, and never saw this stuttering issue. I think it depends more on the software and hardware installed on each individual machine than just on what they're claiming.

And let's face it: if it were THAT BAD, then no one would be buying AMD cards, period. But we both know that isn't true.
I think it's more a combination of setup, software, and an individual's natural ability to see it or not. On my setup, though, AMD was much more guilty of it. Hard to say why that is for sure, but notice I said my setup. Again, ymmv.
mediasorcererI saw this over at Ars yesterday. I'm very happy with mine; as if the human eye can see micro-stuttering anyway, lol. Cinema is 24 frames per second, and nobody's complained about that for the last 100 years, have they?

We are talking milliseconds; can anyone out there honestly tell me they can see in milliseconds, lol?


Give me the bus width and extra video RAM any day; much more future-proof.


It's a great card for the money, plain and simple.
Movies and games are produced differently. Movies have motion blur, which affects perceived smoothness; the blur is introduced by the camera at capture time. Games generate their images rather than capture them, and so are not blurred, but perfectly sharp frame by frame. The human eye DOES perceive the difference. Blurring fools the eye and brain into seeing smooth movement.

The few games that do have some sort of blur option generally look much smoother at much lower frame rates. Just look at Crysis 1 as an example: with blur, it rendered smoothly on most setups all the way down into the 30s, whereas other games require a much higher frame rate to achieve the same level of smoothness.

Besides, you do not have to be able to see each individual frame to recognize when something isn't looking smooth. Most people I show these issues first-hand can't put a finger on what's wrong, but they perceive micro-stuttering as something that's just a little off and doesn't feel quite right.


EDIT: Found what I was looking for to prove my point. Even at 60 fps, some settings show a noticeable difference. It's even more pronounced if you view it on a high-quality CRT.
frames-per-second.appspot.com/
Posted on Reply
#105
ValenOne
entropy13That probably explains why Tech Report had an HD 7770 giveaway three weeks ago, an HD 7870 giveaway two weeks ago, and an HD 7950 giveaway last week. :rolleyes:

And since you're quite insistent that there is an Nvidia bias, why not read this page about Sleeping Dogs, an "AMD Gaming Evolved" title?

techreport.com/r.x/radeon-win8/dogs-fps.gif
AMD IS BETTER!!! NVIDIA SUCKS!!!

techreport.com/r.x/radeon-win8/dogs-99th.gif
Not much difference, it's just a tie...

techreport.com/r.x/radeon-win8/dogs-beyond-50.gif
Oh...THAT'S NVIDIA'S FAULT!!!



Video card reviews should just focus on framerates? Testing for latency is embarrassing?
techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/6



techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/7


Might as well return to the XFX 7950 Black with 900 MHz and no turbo boost.

For an AIB overclock vs. AIB overclock comparison, TechReport should have used the Sapphire 100352VXSR, i.e., a 7950 @ 950 MHz with no turbo boost.
Posted on Reply
#106
mediasorcerer
Wile EI think it's more a combination of setup, software, and an individual's natural ability to see it or not. On my setup, though, AMD was much more guilty of it. Hard to say why that is for sure, but notice I said my setup. Again, ymmv.



Movies and games are produced differently. Movies have motion blur, which affects perceived smoothness; the blur is introduced by the camera at capture time. Games generate their images rather than capture them, and so are not blurred, but perfectly sharp frame by frame. The human eye DOES perceive the difference. Blurring fools the eye and brain into seeing smooth movement.

The few games that do have some sort of blur option generally look much smoother at much lower frame rates. Just look at Crysis 1 as an example: with blur, it rendered smoothly on most setups all the way down into the 30s, whereas other games require a much higher frame rate to achieve the same level of smoothness.

Besides, you do not have to be able to see each individual frame to recognize when something isn't looking smooth. Most people I show these issues first-hand can't put a finger on what's wrong, but they perceive micro-stuttering as something that's just a little off and doesn't feel quite right.


EDIT: Found what I was looking for to prove my point. Even at 60 fps, some settings show a noticeable difference. It's even more pronounced if you view it on a high-quality CRT.
frames-per-second.appspot.com/
If you say so, you may well be right. I don't get that with my card, and it's just a stock 7950, too; I'm not using the boost BIOS, though. Thanks for the reply and info.
Posted on Reply
#107
jihadjoe
mediasorcererI saw this over at Ars yesterday. I'm very happy with mine; as if the human eye can see micro-stuttering anyway, lol. Cinema is 24 frames per second, and nobody's complained about that for the last 100 years, have they?

We are talking milliseconds; can anyone out there honestly tell me they can see in milliseconds, lol?
There's actually an interesting paper from the University of Utah about that:
webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/

And xbitlabs also had a look at how display technology affects perceived response times:
www.xbitlabs.com/articles/monitors/display/lcd-parameters_3.html

Anyway, what they say is something like: the eye (thanks to the brain) is actually able to perceive changes as brief as 5 ms. That's 200 frames per second.

Cinema is smooth at 24fps because those frames are delivered consistently.

i.e., if you plot time vs frames, then
at 0ms you get frame 1,
at 41.6ms you get frame 2,
at 83.3ms you get frame 3 and so on.

The frames always arrive right on time, and your brain combines them into an illusion of fluid motion. Of course, it kinda helps that every frame in a movie is already done and rendered, so you don't have to worry about render delays.

On a computer, the case might look like this:

at 0 ms you get frame 1
at 16.7ms you get frame 2
at 40ms you get frame 3 (now this frame should have arrived at 33.3ms)
at 50ms you get frame 4

Frame 3 was delayed by about 6.7 ms: going by a consistent 60 fps it should have arrived at 33.3 ms, but processing delays meant it rolled off the GPU late. Your in-game fps counter or benchmark tool won't notice it at all, because the frame still arrived before 50 ms (when frame 4 was due), but your eye, sensitive to delays as short as 5 ms, notices this as a slight stutter.
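
Here's that example as a quick script (hypothetical timestamps, just to illustrate the point), showing how a one-second FPS counter completely misses the late frame while the frame-to-frame gap exposes it:

# Hypothetical frame arrival times (ms) for one second of a nominal 60 FPS run.
# Frame 3 slips from its ideal 33.3 ms slot to 40 ms; the following frame still
# lands on schedule, so a per-second counter never sees the hiccup.
IDEAL_MS = 1000.0 / 60.0

arrivals = [i * IDEAL_MS for i in range(61)]   # a perfectly paced second
arrivals[2] = 40.0                             # frame 3 arrives ~6.7 ms late

fps_counter = sum(1 for t in arrivals if t < 1000.0)   # what a 1 s counter reports
worst_gap = max(b - a for a, b in zip(arrivals, arrivals[1:]))

print(f"FPS counter says: {fps_counter}")                  # still 60
print(f"Longest frame-to-frame gap: {worst_gap:.1f} ms")   # ~23.3 ms spike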
Posted on Reply
#108
Pehla
MelvisAnd let's face it: if it were THAT BAD, then no one would be buying AMD cards, period. But we both know that isn't true.
I agree...
I think people who bought nVidia have to make negative comments about AMD because... well, let's face it, they can't do anything else: they have nVidia! And since I'm not a fan of either brand, I can say the same about AMD fans! I just go by price/performance, and right now that is AMD!!!
I would even have gone with an AMD CPU setup, just because they're cheaper, but that doesn't offer PCIe Gen3 support, so I gave it up!!
:nutkick:
Posted on Reply
#109
Wile E
Power User
mediasorcererIf you say so, you may well be right. I don't get that with my card, and it's just a stock 7950, too; I'm not using the boost BIOS, though. Thanks for the reply and info.
Oh, that doesn't mean it's going to affect everyone. That's not the argument I'm trying to make. I'm just saying that it is real, and does affect some.

If you have a great experience with your card, by all means, keep using it. I'm not here to tell you otherwise. After all, what works best for one doesn't always work best for another. I'm not here to tell you AMD is bad for you. If it works great in your setup, there's no reason for you to worry about it at all.

The cards just don't seem to work their best in my particular setup. I can't speak for everyone though.

On the topic of this particular article and related reviews, however, I do like the latency based approach to testing. It seems to fall into line with how my system behaves with these cards.
Posted on Reply
#110
okidna
rvalenciatechreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/6

techreport.com/r.x/radeon-hd-7970-ghz/dirt-beyond-50.gif

techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/7
techreport.com/r.x/radeon-hd-7970-ghz/skyrim-beyond-50.gif

Might as well return to the XFX 7950 Black with 900 MHz and no turbo boost.

For an AIB overclock vs. AIB overclock comparison, TechReport should have used the Sapphire 100352VXSR, i.e., a 7950 @ 950 MHz with no turbo boost.
:D Old driver is OLD.

With newer BETA driver :

techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/5


techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/8
Posted on Reply
#112
Ferrum Master
It is a shame we cannot run such comparison tests on Linux or Mac OS... at least for now... but AMD's driver binary blobs still lag behind nVidia's there anyway...

Anyway, I see this as a necessary evil. It will shake up the AMD driver team, at least.
Posted on Reply
#113
seronx
www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1157&lid=1&pid=1547&leg=0

950 MHz for ALUs/TMUs/ROPs(1792/112/32)
= 3404.8 GFlops/106.4 GTexels/30.4 GPixels
5 GHz 384-bit GDDR5
= 240 GB/s

www.zotac.com/index.php?page=shop.product_details&flypage=flypage_images-SRW.tpl-VGA&product_id=496&category_id=186&option=com_virtuemart&Itemid=100313&lang=en

1111 MHz for ALUs/TMUs/ROPs(1344/112/24)
= 2986.368 GFlops/124.432 GTexels/26.664 GPixels
6.608 GHz 192-bit GDDR5
= 158.592 GB/s

---
In conclusion, it would appear that the 660 Ti has faster timings (renders the scene faster) and does more efficient texel work (can map textures faster).

--> Higher clocks = faster rendering. <--
Games don't use the ALUs, the ROPs, and the RAM efficiently on the PC, so more Hz means more performance even if you have significantly fewer units.

Games (+HPC with CUDA): Nvidia <-- unless AMD is cheaper for the same performance.
High-performance computing (not with CUDA): AMD
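
If anyone wants to verify those numbers, here's the arithmetic as a short script (unit counts and clocks are the ones quoted above; FLOPS assume 2 operations per ALU per clock, i.e. FMA, and bandwidth uses the effective GDDR5 data rate):

# Back-of-the-envelope throughput from the quoted specs.
def throughput(core_mhz, alus, tmus, rops, mem_ghz_effective, bus_bits):
    gflops  = core_mhz * alus * 2 / 1000.0        # GFLOPS (2 ops/ALU/clock)
    gtexels = core_mhz * tmus / 1000.0            # GTexels/s
    gpixels = core_mhz * rops / 1000.0            # GPixels/s
    gbps    = mem_ghz_effective * bus_bits / 8.0  # GB/s
    return gflops, gtexels, gpixels, gbps

cards = {
    "Sapphire HD 7950 (950 MHz)":  (950.0, 1792, 112, 32, 5.0, 384),
    "Zotac GTX 660 Ti (1111 MHz)": (1111.0, 1344, 112, 24, 6.608, 192),
}

for name, spec in cards.items():
    gf, gt, gp, gb = throughput(*spec)
    print(f"{name}: {gf:.1f} GFLOPS, {gt:.3f} GTexels/s, {gp:.3f} GPixels/s, {gb:.3f} GB/s")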
Posted on Reply
#114
Ferrum Master
seronxwww.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1157&lid=1&pid=1547&leg=0

950 MHz for ALUs/TMUs/ROPs(1792/112/32)
= 3404.8 GFlops/106.4 GTexels/30.4 GPixels
5 GHz 384-bit GDDR5
= 240 GB/s

www.zotac.com/index.php?page=shop.product_details&flypage=flypage_images-SRW.tpl-VGA&product_id=496&category_id=186&option=com_virtuemart&Itemid=100313&lang=en

1111 MHz for ALUs/TMUs/ROPs(1344/112/24)
= 2986.368 GFlops/124.432 GTexels/26.664 GPixels
6.608 GHz 192-bit GDDR5
= 158.592 GB/s

---
In conclusion, it would appear that the 660 Ti has faster timings (renders the scene faster) and does more efficient texel work (can map textures faster).

--> Higher clocks = faster rendering. <--
Games don't use the ALUs, the ROPs, and the RAM efficiently on the PC, so more Hz means more performance even if you have significantly fewer units.

Games (+HPC with CUDA): Nvidia <-- unless AMD is cheaper for the same performance.
High-performance computing (not with CUDA): AMD
To prove that right... we'd need to downclock the 660 Ti to even out those throughput numbers and then run the benches... then we'd see whether these are kernel/driver problems or a hardware limitation in itself... although, yes, Skyrim is a mess even so... project stutter.
Posted on Reply
#115
jihadjoe
Edit: Ah, fk it, I just realized I'm DAYS late to the party...

Just a little tweet from Anand Shimpi:
twitter.com/anandshimpi/status/279440323208417282
I've known @scottwasson for a while and I've never known him to be biased in his GPU coverage.
I'm pretty confident btarunr wouldn't have linked the article here either if he felt it was biased.
Posted on Reply
#116
entropy13
LOL yeah, and talking about "bias"... I'm also a regular in the comments section at Tech Report, and it's Cyril Kowalski who's more frequently called an "Nvidia fanboy," even though he recommended the 7850/7870 over their Nvidia counterparts because of the prices at the time of the reviews.
Posted on Reply
#117
the54thvoid
Super Intoxicated Moderator
Having read more into it, there is no bias. Any latency issue is on a game-to-game, driver-to-driver basis. Here are the older latency graphs for Skyrim; the only card to suffer is the GTX 570.



Yes, these are older drivers (12.7 beta), but the entire point is: no bias, and no AMD crap-out. Also, identifying each latency blip as a non-glitch requires continually rerunning the same scene and seeing how the latencies play out.

And yes, to repeat, the graphs above are from older drivers, but the point is still valid: there are no inherent issues with the AMD cards. Nvidia's Vsync may well be doing its intended job here to minimise latency (effectively reducing it by throwing resources, i.e. speed, at those more difficult scenes).
Posted on Reply
#118
crazyeyesreaper
Not a Moderator
Nvidia has a bit of an edge when it comes to stutter-free gameplay. What people don't realize is that NVIDIA uses some of the transistors in its GPUs for that very purpose, while AMD, not so much; essentially, NVIDIA spends GPU die space on improving the smoothness of gameplay, to an extent. How well it works is up to people with the GPUs to decide. In the end, both companies can provide fantastic performance and stutter-free gameplay; apparently, for AMD it just requires driver switching, lol.
Posted on Reply
#119
kristimetal
Sad

I'm sad now; I have a 7950 Windforce.
In Skyrim with Catalyst 12.8, it sometimes had a little stuttering. I updated to 12.10 and the stuttering increased in outdoor areas (not in the cities; in the cities it runs smoothly, but in some caves the stuttering appears, weird).

I bought it in July; there was a special offer, I paid 310 euros, and it was a deal back then.
Now I think I should have gone with a GTX 670, but even now the damn thing is around 360 euros (the cheapest, with the standard cooler), with the Asus DCU or Gigabyte Windforce being at 380-390 euros.
:cry:
Hope AMD improves their drivers fast. :ohwell:
Posted on Reply
#120
sergionography
Whatever the case, turbo for GPUs never made sense to me.
It sounds like it could skew average FPS: easy-to-render scenes leave plenty of thermal headroom for super-high frame rates, while the intensive scenes, the ones where you actually need the extra power, leave no thermal headroom at all. A rough illustration with made-up numbers follows below.
If anyone knows of any good reviews that compare minimum FPS between boost and non-boost, it would be greatly appreciated.
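
Something like this (hypothetical frame rates, not from any review):

# Made-up frame rates illustrating the skew: boost helps the easy scenes
# (where there is thermal headroom) far more than the heavy ones, so the
# average rises while the minimum barely moves.
easy_scenes  = [90.0, 95.0, 100.0]   # light load, boost clock sustained
heavy_scenes = [45.0, 42.0, 40.0]    # heavy load, little thermal headroom

no_boost = [f * 0.9 for f in easy_scenes] + heavy_scenes    # base clock everywhere
boosted  = easy_scenes + [f * 1.02 for f in heavy_scenes]   # boost mostly in easy scenes

for name, run in (("no boost", no_boost), ("boost", boosted)):
    print(f"{name}: avg {sum(run) / len(run):.1f} FPS, min {min(run):.1f} FPS")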
Posted on Reply
#121
okidna
sergionographyIf anyone knows of any good reviews that compare minimum FPS between boost and non-boost, it would be greatly appreciated.
Here are some 7950 Boost reviews, in which you can also find the non-boost 7950's minimum FPS (and average) as a comparison:

www.bit-tech.net/hardware/2012/08/16/amd-radeon-hd-7950-3gb-with-boost/1
www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-HD-7950-3GB-PowerTune-Boost-Review
www.hardwarecanucks.com/forum/hardware-canucks-reviews/56220-powercolor-hd-7950-3gb-boost-state-review.html
Posted on Reply
#122
sergionography
okidnaHere are some 7950 Boost reviews, in which you can also find the non-boost 7950's minimum FPS (and average) as a comparison:

www.bit-tech.net/hardware/2012/08/16/amd-radeon-hd-7950-3gb-with-boost/1
www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-HD-7950-3GB-PowerTune-Boost-Review
www.hardwarecanucks.com/forum/hardware-canucks-reviews/56220-powercolor-hd-7950-3gb-boost-state-review.html
See, just like I thought:
www.pcper.com/files/imagecache/article_max_width/review/2012-08-13/bf3-1680-bar.jpg
Here the minimum FPS is the same.

www.pcper.com/files/imagecache/article_max_width/review/2012-08-13/bac-1920-bar.jpg
And here the boost version has a lower minimum, for some reason.

So yeah, it appears the whole boost thing on graphics cards isn't that reliable; it only lifts average FPS through higher maximum FPS, which is useless if you ask me, because anything higher than 60 FPS isn't noticeable, while dropping below 60 in some competitive games might be a big problem.
Posted on Reply