
Best time to sell your used 4090s is now.

The 4090 will be good for 5+ more years, I think, given how meagre the hardware gains in the midrange are now.

On the software side, they're developing for that midrange.

RT offers little image quality benefit for me, and it's not worth the upscaling or the raw fps hit.
True, this slide from Nvidia is like putting a juicy steak in front of the AI customer and a milking bucket in front of the PC gamer wanting more raw performance.
I believe the midrange will take a turn in 2025 with the 5070 at $549 (4070 Ti raw performance).

RT is very subjective, but if you want the best picture quality for the least amount of resources spent, imo ray-traced global illumination is the best RT feature; especially on an HDR screen, it can make the image pop.
Also, the 9070 XT might bring some competition to the midrange.
 
Totally not true. Quite the opposite. Reflex works when your GPU is at 99%. That's when latency is high.
Quite the opposite? You get more frames?

Brother, reflex at 26fps is doing absolutely nothing for anyone.
I think you don't know what reflex is. Just educate yourself before speaking, it's not that hard.
I think you've never turned it off and think it's just magic. Why isn't it on by default then if there's zero downside? Turning it off in CP2077, the game Nvidia still uses to show off everything, gives more frames. It's more pronounced in some games than others, but it happens; Reflex isn't free, you bozos lol
 
Quite the opposite? You get more frames?

Brother, reflex at 26fps is doing absolutely nothing for anyone.

I think you've never turned it off and think it's just magic. Why isn't it on by default then if there's zero downside? Turning it off in CP2077, the game Nvidia still uses to show off everything, gives more frames. It's more pronounced in some games than others, but it happens; Reflex isn't free, you bozos lol
Yes, turning it off gives more frames, at the cost of input latency. Too bored to go in depth into the why, to be honest, but when the GPU is at 99% (spitting out the most frames), latency shoots up.
 
Quite the opposite? You get more frames?

Brother, reflex at 26fps is doing absolutely nothing for anyone.

I think you've never turned it off and think it's just magic. Why isn't it on by default then if there's zero downside? Turning it off in CP2077, the game Nvidia still uses to show off everything, gives more frames. It's more pronounced in some games than others, but it happens; Reflex isn't free, you bozos lol
You are continuing to show your ignorance.
 
You are continuing to show your ignorance.
Ok? You keep contributing nothing; do you do that everywhere you go?
Yes, turning it off gives more frames, at the cost of input latency. Too bored to go in depth into the why, to be honest, but when the GPU is at 99% (spitting out the most frames), latency shoots up.

At 26 fps, latency is high af with or without Reflex. Take the frames. It's not the no-brainer "turn it on and leave it" setting everyone wants to treat it as.
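Back-of-the-envelope, assuming a simple render-queue model where the driver buffers a frame or two when the GPU is pegged (illustrative numbers and a made-up latency_ms helper, not measurements):

Code:
# Rough render-queue latency model (illustrative, not measured data).
# Total input-to-photon latency ~= (queued frames + 1) * frame time.
def latency_ms(fps, queued_frames):
    frame_time = 1000.0 / fps  # ms per rendered frame
    return (queued_frames + 1) * frame_time

for fps in (26, 60, 120):
    print(f"{fps:3d} fps: empty queue ~{latency_ms(fps, 0):.0f} ms, "
          f"2-frame queue ~{latency_ms(fps, 2):.0f} ms")

# 26 fps: ~38 ms even with an empty queue, ~115 ms with 2 queued frames.
# Trimming the queue (what Reflex targets) helps, but the ~38 ms frame time stays either way.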
 
Ok? You keep contributing nothing; do you do that everywhere you go?


At 26 fps, latency is high af with or without Reflex. Take the frames. It's not the no-brainer "turn it on and leave it" setting everyone wants to treat it as.
Explain to me what it is first so it is easier for me to explain it to you. I want to know what your understanding of the thing is.
Btw you said "I think you've never turned it off" when I don't even have an Nvidia GPU lol. Coping at maximum.
 
The optimal GPU usage threshold to minimize GPU-related latencies is widely considered to be 95% of the hardware's capacity. This is regardless of vendor, it's kind of how the thing works. I think the argument here is somewhat of a moot point because if you have 26 fps, you are running above the hardware's capability anyway and you should reduce the workload by using lower settings or upgrade the hardware, if possible.

Now the argument here is probably the DLSS 4 multi frame generation and how it promises to make 30 into 240 fps... that should be where their frame warp technology kicks in. Of course, I don't know how it works, I don't think this was even explained in depth yet. Give it a few weeks.
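Napkin math on the 30-into-240 claim, assuming DLSS performance-mode upscaling roughly doubles the rendered frame rate and 4x MFG multiplies that on top; the split is my assumption, not Nvidia's stated breakdown:

Code:
# Assumed split for the "30 into 240 fps" claim (my guess, not official numbers).
native_fps = 30          # path-traced baseline
upscale_gain = 2.0       # assumed uplift from DLSS performance-mode upscaling
rendered_fps = native_fps * upscale_gain   # frames the GPU actually renders (~60)
mfg_factor = 4           # 4x multi frame generation: 1 rendered + 3 generated
displayed_fps = rendered_fps * mfg_factor  # ~240 on screen
print(rendered_fps, displayed_fps)         # 60.0 240.0
# Input latency still tracks the ~60 rendered fps (plus frame-gen overhead),
# which is presumably where the frame warp tech is supposed to help.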
 
The optimal GPU usage threshold to minimize GPU-related latencies is widely considered to be 95% of the hardware's capacity. This is regardless of vendor, it's kind of how the thing works. I think the argument here is somewhat of a moot point because if you have 26 fps, you are running above the hardware's capability anyway and you should reduce the workload by using lower settings or upgrade the hardware, if possible.

Now the argument here is probably the DLSS 4 multi frame generation and how it promises to make 30 into 240 fps... that should be where their frame warp technology kicks in. Of course, I don't know how it works, I don't think this was even explained in depth yet. Give it a few weeks.
I wait with bated breath to see whether latency (responsiveness) and artifacting are not major problems.
 
I wait with bated breath to see whether latency (responsiveness) and artifacting are not major problems.

Going by Ada's frame generation capability, I'd say YMMV in bold letters. There were games where I had a positively amazing experience, even before official DLSS support was added (such as Starfield), but then again there were games where it was an equally horrible experience (Final Fantasy 16). I wonder how Blackwell's more powerful hardware will manage these games that clearly did not perform well, or whether that will be handled by the new transformer-based DLSS model (currently, DLSS 3.x uses a CNN model). We'll see.
 
I wait with bated breath to see whether latency (responsiveness) and artifacting are not major problems.
Ghosting is reduced but still there, as per Nvidia's own rep at the CES booth in a HotHardware video.
The optimal GPU usage threshold to minimize GPU-related latencies is widely considered to be 95% of the hardware's capacity. This is regardless of vendor, it's kind of how the thing works. I think the argument here is somewhat of a moot point because if you have 26 fps, you are running above the hardware's capability anyway and you should reduce the workload by using lower settings or upgrade the hardware, if possible.

Now the argument here is probably the DLSS 4 multi frame generation and how it promises to make 30 into 240 fps... that should be where their frame warp technology kicks in. Of course, I don't know how it works, I don't think this was even explained in depth yet. Give it a few weeks.
According to this CES presentation, it uses the new DLSS upscaler that will be available to the 2000 series and higher; they are using DLSS set to performance plus multi frame generation to get to those high competitive fps in Alan Wake 2. :laugh:
Although the image is superior and more stable, subjectively speaking.
 
Explain to me what it is first so it is easier for me to explain it to you. I want to know what your understanding of the thing is.
Btw you said "I think you've never turned it off" when I don't even have an Nvidia GPU lol. Coping at maximum.
No, you know everything. Go ahead.
 
Just wondering, and no offense, but what kind of jobs do you guys have? Granted, I live in a different country with much lower wages in general, but a $2k 4090 is two months of income for me, and ~4 months of rent for most people here.

I'm an IT Systems Analyst. I buy things and use them forever, so I currently drive a 10-year-old car. My 13900K/RTX 4090 system replaced an 860/GTX 960 system. I will get at least 10 years out of this system, so it amounts to a dollar a day.
 
I'll let you know when my 13900KS goes belly up :D
It's like the 12VHPWR. You just haven't waited long enough.

Dr. Dro in 2035: "my 13900KS is still fine guys"
"oh yeah, you haven't waited long enough"

Just ignore them. You have one of the fastest CPUs money can buy.
 
Video about a possible power bottleneck for Blackwell and the Titan/5090 Ti, in plain sight.
 

Just ignore them. You have one of the fastest CPUs money can buy.
LOL, I have two of these wonderful CPUs, the fastest CPUs money can buy - 14900K and 13900KS - and I've been trying to sell one of them for quite some time now, and nobody for some reason wants to buy it. People are so dumb, you would not believe it... :laugh:
 
LOL, I have two of these wonderful CPUs, the fastest CPUs money can buy - 14900K and 13900KS - and I've been trying to sell one of them for quite some time now, and nobody for some reason wants to buy it. People are so dumb, you would not believe it... :laugh:
What do you not like about them?
 
New 4090s are going for a minimum of $2,500 to $3,500. Sell your used 4090s while they still have equity, before the market correction next month when Blackwell drops. You can use the GeForce Ultimate $10 trial in the interim to ease the pain of running on integrated graphics or a low-end dedicated GPU in between. If you are planning to upgrade to Blackwell, you can use the funds to offset the out-of-pocket cost of the upgrade. :rockout:
At your own risk; maybe you sell your 4090 now for 2.5k and then you won't find any 5090 under 4k :) Probably won't be this extreme, but who knows? :D
 
I have a better idea: why don't we all just wait, not pay the scalper prices, and force them to lose money, so we don't have to deal with this every release?
 