Friday, February 22nd 2019

AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti

AMD cut pricing of the Radeon RX Vega 56 in select markets to preempt the GeForce GTX 1660 Ti and to help the market digest inventory. The card can be had for as little as €269 (including VAT) in the form of the MSI RX Vega 56 Air Boost, a close-to-reference product. The GTX 1660 Ti reportedly has a starting price of $279.99 (excluding taxes). This development is significant given that the GTX 1660 Ti is rumored to perform on par with the GTX 1070, which the RX Vega 56 outperforms. The RX Vega series is still very much a part of AMD's product stack, and AMD continues to release new game optimizations for the card. NVIDIA is expected to launch the GeForce GTX 1660 Ti later in February. Although based on the "Turing" architecture, it lacks real-time ray tracing and AI acceleration features, yet retains the new generation's increased CUDA core IPC.

120 Comments on AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti

#26
Xaled
xkm1948That is a very weak inference without an actual source to back it up.
No. This is the truth that only Nvidia and its die-hard fans deny: FreeSync is as good as G-Sync.
Why would someone recommend the 8600K over the 2700X just to save $50, then recommend G-Sync over FreeSync when it is the same thing but $200 more expensive?
Only a true bandwagon fan would do that.
Posted on Reply
#27
moproblems99
illliThis is a great deal considering you can undervolt and o/c, and it comes with a 3-game bundle. People like to bitch about anything.
PS: for you people complaining about the noise or whatever, there ARE 3rd-party cooling solutions.
Just remember, the people actually complaining haven't used one and never will. Just ignore what they have to say. Those who have used one know the difference. Those who own one also know that it isn't a 2080 Ti and don't pretend it is.
Posted on Reply
#28
mtcn77
The joke is on Nvidia: they enabled 'G-Sync', not 'FreeSync', on those crappy panels.
Posted on Reply
#29
bug
xkm1948Clearing out leftover Vega stock, plain and simple.
You clean out stock when you have replacements incoming. Just sayin'.
Posted on Reply
#30
jabbadap
mtcn77The joke is on Nvidia: they enabled 'G-Sync', not 'FreeSync', on those crappy panels.
How could they? FreeSync is an AMD trademark; they can't just call it that. G-Sync Compatible uses the VESA Adaptive-Sync standard, which Nvidia has already been using over eDP on laptops for years, calling it simply G-Sync. Which was actually first to market: FreeSync on monitors or G-Sync on laptops?
Posted on Reply
#31
mtcn77
jabbadapHow could they? FreeSync is an AMD trademark; they can't just call it that. G-Sync Compatible uses the VESA Adaptive-Sync standard, which Nvidia has already been using over eDP on laptops for years, calling it simply G-Sync. Which was actually first to market: FreeSync on monitors or G-Sync on laptops?
The patent holder, of course. Nice try.
Posted on Reply
#32
notb
bugYou clean out stock when you have replacements incoming. Just sayin'.
That's just one of several possible reasons.
But at a fundamental level, it's simply a matter of price vs. demand; it doesn't really matter whether the "replacing" product is made by you or by the competition.
IMO this is a move against Turing, like the title suggests.
It seems Navi is still fairly distant if we think about actual availability.

However, if AMD decided to repeat the Ryzen hype, i.e. flood us with very favourable benchmark leaks, a Vega sale would make sense even this early.
I mean: if they announce that a card coming within a year matches Nvidia's lineup (maybe even in features like tensor cores), they would kill the sales of current products.
Posted on Reply
#33
cucker tarlson
ArbitraryAffectionThese RX Vega price cuts hurt the 590 more than NVIDIA lol.
Look at it this way: the RX 590 probably sold like crap anyway.
Now they'll be forced to cut its price too, and it'll sell better.
Posted on Reply
#34
Dimi
XaledNo. This is the truth that only Nvidia and its die-hard fans deny: FreeSync is as good as G-Sync.
Why would someone recommend the 8600K over the 2700X just to save $50, then recommend G-Sync over FreeSync when it is the same thing but $200 more expensive?
Only a true bandwagon fan would do that.
Compare FreeSync Hz ranges with G-Sync Hz ranges.

ALL G-Sync monitors run their adaptive sync range from the lowest supported refresh rate, 30 Hz, all the way up to their maximum. 95% of G-Sync monitors are high-refresh-rate monitors too, so you get a great implementation of adaptive sync.

Good luck finding a FreeSync monitor with a 30-165 Hz adaptive sync range.

www.displayninja.com/freesync-monitor-list/

PS: my 1440p 165 Hz G-Sync monitor cost around $300.
Posted on Reply
#35
Xaled
Have fun gaming at 30 fps, 30 Hz.
Posted on Reply
#36
jabbadap
XaledHave fun gaming at 30 fps, 30 Hz.
You sure you understand what VRR is?
Posted on Reply
#37
Xaled
jabbadapYou sure you understand what VRR is?
You sure you understand what Sync means?
Posted on Reply
#38
bug
XaledYou sure you understand what Sync means?
It doesn't mean gaming at 30fps, that's for sure.
Posted on Reply
#39
Vya Domus
DimiGood luck finding a FreeSync monitor with a 30-165 Hz adaptive sync range.
And why would that really matter?

If you have a game that swings from 30 fps to 165+, you have bigger problems than the range your FreeSync monitor has. The fact of the matter is that these technologies were invented for when the hardware can't quite push enough frames to stay fully synchronized with the display, not for when it can barely push playable framerates several times below the maximum refresh rate, where frame times are very high anyway. This whole range thing has been beaten to death when in reality it matters little in real-world use.

But what can I say, I guess you'll have a better 30 fps cinematic experience on your G-Sync monitor than most FreeSync users do. I can't deny the validity of that claim.
Posted on Reply
#40
Xaled
bugIt doesn't mean gaming at 30 fps, that's for sure.
Mmm. So why make the range go down to 30 Hz? Does your monitor switch VRR from 165 Hz to 30 Hz just for fun?

I mean, the fluidity/game experience is bad in that range no matter what monitor you've got.
And the difference between 40 Hz and 30 Hz is like the difference between 120 and 144 Hz, or between a 144 and a 165 Hz monitor: it makes no difference at all.
Posted on Reply
#41
bug
XaledMmm. So why make the range go down to 30 Hz? Does your monitor switch VRR from 165 Hz to 30 Hz just for fun?
So that if your game dips to 31 fps, the frame can still be presented to you in a timely manner.
Whenever your fps dips below the minimum supported refresh rate, VRR won't work; you get regular vsync instead. (Edit: you get LFC if you're on FreeSync 2.)
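Roughly, the decision logic looks something like this (a minimal Python sketch of how I understand it, not NVIDIA's or AMD's actual driver code; the function name and range values are made up for illustration):

# Illustrative sketch of how a VRR panel might pick its refresh rate.
# Not any vendor's actual algorithm; names and range values are hypothetical.

def effective_refresh(fps: float, vrr_min: float = 30.0, vrr_max: float = 165.0,
                      has_lfc: bool = False) -> float:
    """Refresh rate the panel would run at for a given game fps."""
    if vrr_min <= fps <= vrr_max:
        return fps                    # in range: refresh tracks fps 1:1
    if fps < vrr_min:
        if has_lfc:
            multiplier = 2
            while fps * multiplier < vrr_min:
                multiplier += 1       # repeat each frame 2x, 3x, ...
            return fps * multiplier   # stays inside the VRR window
        return vrr_max                # no LFC: plain fixed-rate vsync
    return vrr_max                    # fps above range: capped at max refresh

print(effective_refresh(31))                  # 31.0 (VRR tracks the dip)
print(effective_refresh(25, has_lfc=True))    # 50.0 (LFC doubles frames)
print(effective_refresh(25))                  # 165.0 (fallback to vsync)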
Posted on Reply
#42
jabbadap
XaledYou sure you understand what Sync means?
Yes. In VRR it means the monitor's refresh rate is synced to the game's fps, or to a doubled rate with LFC. VRR helps most when you stay within the VRR range; usually one doesn't game at "30 fps", one tries to find settings where the minimum fps stays above 30 fps, thus staying within the VRR range the whole time.
Posted on Reply
#43
bug
Vya DomusAnd why would that really matter?

If you have a game that swings from 30 fps to 165+, you have bigger problems than the range your FreeSync monitor has. The fact of the matter is that these technologies were invented for when the hardware can't quite push enough frames to stay fully synchronized with the display, not for when it can barely push playable framerates several times below the maximum refresh rate, where frame times are very high anyway. This whole range thing has been beaten to death when in reality it matters little in real-world use.

But what can I say, I guess you'll have a better 30 fps cinematic experience on your G-Sync monitor than most FreeSync users do. I can't deny the validity of that claim.
Look at how fps actually looks while playing: www.hardocp.com/article/2019/02/14/amd_radeon_vii_video_card_review/5
Posted on Reply
#44
efikkan
FouquinAt least undervolting is there to solve both problems.
It's really sad when people recommend running a product out of spec just to make it acceptable.
Undervolting is not smart: it sacrifices reliability, and it's not something you can even guarantee will work.
Posted on Reply
#45
cucker tarlson
Sure, it's nice that G-Sync is now possible on any FreeSync monitor. If you have some extra cash to burn, you may want to go with a G-Sync monitor anyway, because ULMB + adaptive v-sync is an absolutely amazing experience for fast-paced games; I personally love using it as often as is feasible on my rig. Otherwise, there's no real need to get a G-Sync monitor. Just make sure to search around for info on the experience other people are getting with the monitor you're looking at when paired with Nvidia cards.
efikkanIt's really sad when people recommend running a product out of spec to make it acceptable.
Undervolting is not smart, it sacrifices reliability, and it's not something you can even guarantee will work.
People overclock all the time, though I get how undervolting a hot and loud card just to make it acceptable is a different kind of animal. I'd probably bear it if I were on a very tight budget, but I wouldn't like to be forced to do it if I had better options.
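For what it's worth, the appeal of undervolting is easy to quantify: dynamic power scales roughly with frequency times voltage squared, so even a modest voltage drop tames heat and fan noise noticeably. A back-of-the-envelope sketch (the voltages are illustrative, not measured Vega 56 figures):

# Back-of-the-envelope: dynamic power scales roughly with f * V^2.
# Voltages below are illustrative, not measured Vega 56 figures.

def relative_power(v_new: float, v_stock: float,
                   f_new: float = 1.0, f_stock: float = 1.0) -> float:
    """Dynamic power at the new operating point, relative to stock."""
    return (f_new / f_stock) * (v_new / v_stock) ** 2

# Dropping the core from a hypothetical 1.20 V to 1.05 V at the same clock:
print(f"{relative_power(1.05, 1.20):.0%} of stock dynamic power")  # ~77%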
Posted on Reply
#46
bug
efikkanIt's really sad when people recommend running a product out of spec just to make it acceptable.
Anything for the preciousss ;)
cucker tarlsonSure, it's nice that G-Sync is now possible on any FreeSync monitor. If you have some extra cash to burn, you may want to go with a G-Sync monitor anyway, because ULMB + adaptive v-sync is an absolutely amazing experience for fast-paced games; I personally love using it as often as is feasible on my rig. Otherwise, there's no real need to get a G-Sync monitor. Just make sure to search around for info on the experience other people are getting with the monitor you're looking at when paired with Nvidia cards.


People overclock all the time, though I get how undervolting a hot and loud card just to make it acceptable is a different kind of animal. I'd probably bear it if I were on a very tight budget, but I wouldn't like to be forced to do it if I had better options.
It would be if both could be active at the same time. Last I checked, you still had to choose one or the other.
Posted on Reply
#47
cucker tarlson
bugIt would be if both could be active at the same time. Last I checked, you still had to choose one or the other.
Adaptive v-sync, I said.
It's an absolute must for playing with ULMB, since keeping a constant 120 fps is unrealistic on pretty much any hardware.
Posted on Reply
#48
Xaled
You guys miss the whole point.
Spending that $100-200 on the graphics card and getting a FreeSync monitor with a 40-165 Hz range is smarter than paying $100-200 more to get a G-Sync monitor with a 30-165 Hz range instead.
Posted on Reply
#49
cucker tarlson
XaledYou guys miss the whole point.
Spending that $100-200 on the graphics card and getting a FreeSync monitor with a 40-165 Hz range is smarter than paying $100-200 more to get a G-Sync monitor with a 30-165 Hz range instead.
Depends what you want and how concerned you are with money, really.
If you'd like ULMB and can pay for that feature, FreeSync monitors don't have it.
Posted on Reply
#50
Dimi
Vya DomusAnd why would that really matter?

If you have a game that swings from 30 fps to 165+, you have bigger problems than the range your FreeSync monitor has. The fact of the matter is that these technologies were invented for when the hardware can't quite push enough frames to stay fully synchronized with the display, not for when it can barely push playable framerates several times below the maximum refresh rate, where frame times are very high anyway. This whole range thing has been beaten to death when in reality it matters little in real-world use.

But what can I say, I guess you'll have a better 30 fps cinematic experience on your G-Sync monitor than most FreeSync users do. I can't deny the validity of that claim.
Because some games run between 40-55 fps, some at 80-100 fps, and then there are games that run between 120-165 fps. I'll have adaptive sync in all three cases.

Then there are FreeSync monitors with the most common adaptive sync range of 48-75 Hz. Great if you can keep your fps between 48 and 75; anything outside of that and there goes your adaptive sync, unless you cap it of course.
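This is also where LFC comes in: AMD reportedly only enables low framerate compensation when the monitor's maximum refresh is at least roughly double its minimum, so a 48-75 Hz window doesn't qualify and adaptive sync simply stops below 48 fps. A quick sketch (the 2x factor is the commonly cited threshold, not a guaranteed spec):

# Which adaptive-sync ranges can do low framerate compensation (LFC)?
# AMD reportedly enables LFC when max refresh >= ~2x min refresh;
# the exact threshold is not an official hard spec.

def supports_lfc(min_hz: int, max_hz: int, factor: float = 2.0) -> bool:
    return max_hz >= factor * min_hz

for rng in [(48, 75), (40, 165), (30, 165)]:
    print(rng, "LFC:", supports_lfc(*rng))
# (48, 75)  -> False: below 48 fps adaptive sync just turns off
# (40, 165) -> True:  at 25 fps the panel can refresh at 50 Hz
# (30, 165) -> True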
Posted on Reply