
Absent an Official Announcement, NVIDIA RTX 3060 Ti Graphics Cards Up for Preorder in China

Rei

Joined
Aug 1, 2020
Messages
656 (0.42/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
The biggest price hikes with the lowest performance gains came with Turing. There was no COVID back then. It was just greed + a quasi-monopolistic position.
Except at the time, the "price hike" came from Nvidia's investment in the new RT cores & Tensor cores to offload processing from the other cores to prevent performance loss. I believe it paid off, as Ampere had similar pricing to Turing yet with better RT & Tensor cores as well as many more CUDA cores.

Guys, it's not a Titan, it's a BF GPU
That's what Nvidia is calling it. Essentially, it's what a Titan is, or is supposed to be.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI B450 Tomahawk MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Except at the time, the "price hike" came from Nvidia's investment in the new RT cores & Tensor cores to offload processing from the other cores to prevent performance loss. I believe it paid off, as Ampere had similar pricing to Turing yet with better RT & Tensor cores as well as many more CUDA cores.
No, it's the other way around: tensor cores were made for compute applications, but since they had them lying around, they came up with a use for them in DLSS.
And in general, R&D doesn't cost extra; this is a misconception. It's the same effort that is put in every generation, and it goes into design, optimization, etc. It's not like they paid 1,000 extra engineers for Turing; the same engineers they pay every generation did the job, just as with Ampere, Pascal, etc.

That's what Nvidia is calling it. Essentially, it's what a Titan is, or is supposed to be.
Well, if that's what you think, read around a bit, then. The Titan is supposed to be a prosumer card, with extra optimizations in the drivers for professional applications, that has always been provided only by Nvidia themselves. The 3090 is just a gaming card with arguably oversized memory; it's just an excuse for charging 500 bucks more than for the 2080 Ti, sold mostly via Nvidia's partners.
 
Joined
Jul 5, 2013
Messages
27,818 (6.68/day)
I love how you bypassed that GamersNexus video screenshot and cited Linus's one game to call the fake Titan 3090 an 8K gaming card.
Especially when that video clearly states it's nVidia sponsored! :laugh:
Bypassed? No. Haven't seen it. GN is not my go-to tech channel, but then again, neither is LTT. However, it really doesn't matter if it was NVidia-sponsored: LTT SHOWED THE ACTUAL GAMEPLAY FOOTAGE! Solid gameplay at 8K. End of story. So do hush and quit your whining.
 
Joined
Sep 3, 2019
Messages
3,510 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
Bypassed? No. Haven't seen it. GN is not my go-to tech channel, but then again, neither is LTT. However, it really doesn't matter if it was NVidia-sponsored: LTT SHOWED THE ACTUAL GAMEPLAY FOOTAGE! Solid gameplay at 8K. End of story. So do hush and quit your whining.
Yes, yes, you did bypass it, with blindfolds with the nVidia logo on them. I'm far from whining, and I'm not hushing.
Do you own this place, to tell others whether they can speak? What and who made you the way you are? What are you pretending to be in here? Should we be scared? Am I allowed to talk, sir? Do I have your permission? :kookoo:
You are a very special fruit indeed.

Find me a review of a full set of games that this fake, so-called Titan, with no Titan name, no Titan drivers, and only the Titan-like +10% performance over the 3080, can play at a solid 60 Hz... :shadedshu:

You can't... ever!
Dare to ask W1zzard if this fake cake fresh out of the oven can do 8K gaming.
Not even at a solid 30...

What you can apparently do is think that people in here are stupid enough to believe, from that shilling sponsored video, that the fake Titan can do 8K gaming... it's hilarious, at the least.

——————————
:laugh::laugh:
Oh man, I so enjoy reading all that. The "Jensen's little clones and mouthpieces parade" is really going wild (or at least trying) to prove its Titan "loyalty". Or... :rolleyes: maybe that's also fake... who's to tell.
 

Rei

Joined
Aug 1, 2020
Messages
656 (0.42/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
No, it's the other way around: tensor cores were made for compute applications, but since they had them lying around, they came up with a use for them in DLSS.
And in general, R&D doesn't cost extra; this is a misconception. It's the same effort that is put in every generation, and it goes into design, optimization, etc. It's not like they paid 1,000 extra engineers for Turing; the same engineers they pay every generation did the job, just as with Ampere, Pascal, etc.
While video games are part of computing applications, Nvidia was fairly new to Tensor at the time, prior to Volta. Not to mention that the Tensor cores needed to be re-engineered to be well suited for gaming purposes.
R&D does cost extra depending on the additional feature set & whatnot. As such, the Turing generation wasn't just R&D; it was also an investment in new tech. Of course, R&D for Ampere might not have been as costly as Turing's, as the tech was already there. All that was left was refinement, a new design & improvement.

Yes, yes, you did bypass it, with blindfolds with the nVidia logo on them. I'm far from whining, and I'm not hushing.
Do you own this place, to tell others whether they can speak? What and who made you the way you are? What are you pretending to be in here? Should we be scared? Am I allowed to talk, sir? Do I have your permission? :kookoo:
You are a very special fruit indeed.

Find me a review of a full set of games that this fake, so-called Titan, with no Titan name, no Titan drivers, and only the Titan-like +10% performance over the 3080, can play at a solid 60 Hz... :shadedshu:

You can't... ever!
Dare to ask W1zzard if this fake cake fresh out of the oven can do 8K gaming.
Not even at a solid 30...

What you can apparently do is think that people in here are stupid enough to believe, from that shilling sponsored video, that the fake Titan can do 8K gaming... it's hilarious, at the least.

——————————
:laugh::laugh:
Oh man, I so enjoy reading all that. The "Jensen's little clones and mouthpieces parade" is really going wild (or at least trying) to prove its Titan "loyalty". Or... :rolleyes: maybe that's also fake... who's to tell.
Calm down... I take it from this comment that you haven't seen the LTT video, then? Also, you should be able to find plenty of YouTube videos on other channels for your request, & I suggest you do that instead of throwing salt at people out of spite.

Well, if that's what you think, read around a bit, then. The Titan is supposed to be a prosumer card, with extra optimizations in the drivers for professional applications, that has always been provided only by Nvidia themselves. The 3090 is just a gaming card with arguably oversized memory; it's just an excuse for charging 500 bucks more than for the 2080 Ti, sold mostly via Nvidia's partners.
I've already answered that in post #74.
 
Joined
Sep 3, 2019
Messages
3,510 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
While video games are part of computing applications, Nvidia was fairly new to Tensor at the time, prior to Volta. Not to mention that the Tensor cores needed to be re-engineered to be well suited for gaming purposes.
R&D does cost extra depending on the additional feature set & whatnot. As such, the Turing generation wasn't just R&D; it was also an investment in new tech. Of course, R&D for Ampere might not have been as costly as Turing's, as the tech was already there. All that was left was refinement, a new design & improvement.


Calm down... I take it from this comment that you haven't seen the LTT video, then? Also, you should be able to find plenty of YouTube videos on other channels for your request, & I suggest you do that instead of throwing salt at people out of spite.
I'm laughing; can I be more calmed down? ...despite people telling me to hush. But then I laugh more at the idea of some thinking they can dictate others' thoughts, opinions and free speech.

Of course I saw the LTT video. And? Does it prove that the fake Titan 3090 is an 8K gaming card? I do not trust sponsored marketing videos. I trust reviewers like Gamers Nexus and others, the ones that call out every corporation (Intel, AMD, nVidia) when there's something to tell.

I wonder why @W1zzard missed the opportunity to run tests at 8K. That could have been a really nice write-up IMHO.
He can't find an 8K monitor?
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI B450 Tomahawk MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
While video games are part of computing applications, Nvidia was fairly new to Tensor at the time, prior to Volta. Not to mention that the Tensor cores needed to be re-engineered to be well suited for gaming purposes.
R&D does cost extra depending on the additional feature set & whatnot. As such, the Turing generation wasn't just R&D; it was also an investment in new tech. Of course, R&D for Ampere might not have been as costly as Turing's, as the tech was already there. All that was left was refinement, a new design & improvement.
They did a huge price hike with Turing because they were pretty new to tensor cores prior to Volta? That's very logical indeed.

There's "new tech" all the time, that's why people buy new graphic cards, otherwise we'd be happy with the old ones, most of us, at least.


I've already answered that in post #74.
Yeah, right. If it were a Titan, they would call it a Titan; it's as obvious as that. There will be a Titan as soon as they move to a less shitty node than Samsung 8 nm.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I wonder why @W1zzard missed the opportunity to run tests at 8K. That could have been a really nice write-up IMHO.
He can't find an 8K monitor?
Opportunity? I don't think so. Maybe in a year or two.
 
Joined
Dec 5, 2013
Messages
639 (0.16/day)
Location
UK
If a current-gen xx60 card cannot run a game at or near max graphical fidelity with reasonable framerates, then the game is the problem. Something like 80% of all PC gamers will have a worse GPU than the current-gen xx60 card.

^ This x1,000,000. Every time there's a new GPU generation, the web seems to be filled with epeen "enthusiasts" who actually want new games to run as slowly as possible, just to give them an excuse to upgrade to a flagship. Optimization has been absolutely sh*t over the past few years. Even comparing games like, say, Dishonored 1 vs Dishonored 2 (nice house textures on the upper right!) or Deus Ex: HR vs Deus Ex: MD (both sequels run 4x slower yet hardly look any better at all despite being only 4-5 years apart) highlights how cr*p a lot of game engines and optimization efforts are.

Remedy's Control runs like garbage on everything, and its visuals don't even remotely justify how low the framerates are: 4K was completely unplayable even on the 2080 Ti flagship at launch, with visuals that are similar in style but, in my opinion, far worse than 2016's Deus Ex: MD, which runs much better despite also being criticised as poorly optimised and demanding for the hardware of its day.

I was actually shocked to see how bad Control looks for the performance. Far worse than just 2016-era games. In fact, all I can do is just quietly leave this here... :roll:
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI B450 Tomahawk MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
^ This x1,000,000. Every time there's a new GPU generation, the web seems to be filled with epeen "enthusiasts" who actually want new games to run as slowly as possible, just to give them an excuse to upgrade to a flagship. Optimization has been absolutely sh*t over the past few years. Even comparing games like, say, Dishonored 1 vs Dishonored 2 (nice house textures on the upper right!) or Deus Ex: HR vs Deus Ex: MD (both sequels run 4x slower yet hardly look any better at all despite being only 4-5 years apart) highlights how cr*p a lot of game engines and optimization efforts are.

I was actually shocked to see how bad Control looks for the performance. Far worse than just 2016-era games. In fact, all I can do is just quietly leave this here... :roll:
Are you guys trying to start one of those tin-foil-hat theories about planned obsolescence again? ;)
 
Joined
Dec 5, 2013
Messages
639 (0.16/day)
Location
UK
Are you guys trying to start one of those tin-foil-hat theories about planned obsolescence again? ;)
When I first heard those "GPU manufacturers are giving game devs a backhander to make their games deliberately run like cr*p so people will buy the next, more expensive GPU tranche up" theories, I laughed and called them something like "crazed nutjobs". Given some of the unoptimized turds we've been treated to recently, though, it does make you wonder... ;)
 

Rei

Joined
Aug 1, 2020
Messages
656 (0.42/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
Yeah, right. If it were a Titan, they would call it a Titan; it's as obvious as that. There will be a Titan as soon as they move to a less shitty node than Samsung 8 nm.
As I said, this is my observation. Just as I observed Nvidia changing the nomenclature of most of their products, such as Quadro & Nvidia Tesla (the GPGPU, not the microarchitecture), they seem to be changing the Titan naming as well. Besides, with the 3090 already having top-tier specs, 24 GB of VRAM & high pricing, there would be no point in releasing a Titan that would be too similar to the 3090, as having a large amount of VRAM yet limited performance is also pointless. So to me, this is Nvidia simply changing the Titan naming scheme to xx90.
Of course I saw the LTT video. And? Does it prove that the fake Titan 3090 is an 8K gaming card? I do not trust sponsored marketing videos.
The answer to that is YES, it is an 8K gaming GPU, as this video proves. Just because it was a sponsored video does not make it any less true, since it was a sponsored video but NOT a sponsored "marketing" video, & there is a difference. To that end, Nvidia sponsored this video so that a third party could showcase that their 3090 "Titan replacement" GPU can do 8K@60fps, which is what your whole argument was about to start with.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI B450 Tomahawk MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
When I first heard those "GPU manufacturers are giving game devs a backhander to make their games deliberately run like cr*p so people will buy the next, more expensive GPU tranche up" theories, I laughed and called them something like "crazed nutjobs". Given some of the unoptimized turds we've been treated to recently, though, it does make you wonder... ;)
Well, joking aside, the previous generations already allowed playing games under what many users would consider "good enough" conditions, so I guess GPU manufacturers have to come up with new features and fashions in order to keep demand high: raytracing, ultra-high refresh, etc. I wonder if they're not shooting themselves in the foot with AI upscaling in the long term, because it might turn against them.

Anyway, isn't Control just an Nvidia showcase? The lack of optimization might be a feature...
 

bug

Joined
May 22, 2015
Messages
13,773 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
When I first heard those "GPU manufacturers are giving game devs a backhander to make their games deliberately run like cr*p so people will buy the next, more expensive GPU tranche up" theories, I laughed and called them something like "crazed nutjobs". Given some of the unoptimized turds we've been treated to recently, though, it does make you wonder... ;)
I don't think that's it. We've been served console ports for many years, and ports incur an inherent translation overhead.
Also, devs in general seem to be getting dumber. During interviews I encounter more and more developers who have only skimmed the tools/languages/APIs/frameworks they use and completely break down when you probe their in-depth knowledge.
 
Joined
Jul 5, 2013
Messages
27,818 (6.68/day)
Yes, yes, you did bypass it, with blindfolds with the nVidia logo on them.
So you're calling me a liar? How pathetic. Just an FYI there, the term "Bypass" implies deliberate action. I haven't seen GN's take on 3090 8K gaming. Couldn't care less either, because he does NOT have an 8K display to do testing with (ATM), so his opinion will carry as much weight as your silly ramblings.
Do you own this place, to tell others whether they can speak? What and who made you the way you are? What are you pretending to be in here? Should we be scared? Am I allowed to talk, sir? Do I have your permission? :kookoo:
You are a very special fruit indeed.
Aww, that was adorable, you trying to be clever like that. "Special fruit" indeed.
What you can apparently do is think that people in here are stupid enough to believe, from that shilling sponsored video, that the fake Titan can do 8K gaming... it's hilarious, at the least.
So what you're saying with that is that doing business is evil? Promoting new products of technological advancement is wrong? It's called progress. Don't like it? Too bad, that's the way the world works. It's neither evil nor nefarious.
 

bug

Joined
May 22, 2015
Messages
13,773 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@lexluthermiester Leave him be, what more do you hope to gain from this exchange?
 
Joined
Jul 5, 2013
Messages
27,818 (6.68/day)
Opportunity? I don't think so. Maybe in a year or two.
To be fair, in spite of his silly rambling otherwise, this was a good point. If you can swing a deal, it would be very interesting for you to get an 8K display and do your usual wizardry to show everyone what can be done ATM. Given what Linus was doing, I'll bet you'd have a bunch of fun, and it would be a solid edge for the site. You could do some videos and put them on the YT channel with some cross-promotion. It would be kickass! :rockout:
@lexluthermiester Leave him be, what more do you hope to gain from this exchange?
Oh come on, you know me, I'm just having fun at this point, letting him make a fool of himself... :laugh:
(not laughing at you with that emote)
 
Joined
Jun 3, 2012
Messages
1,954 (0.43/day)
Location
Denmark
Processor Ryzen 7 7700
Motherboard ASRock B650 PG Lightning
Cooling Arctic Freezer 36
Memory G.Skill Flare X5 DDR5-6000 - 32GB - CL30
Video Card(s) ASUS Dual GeForce RTX 4070 EVO
Storage 1x2tb KC3000 & 2tb samsung 970 evo plus, 2 x 2 tb external usb harddrives
Display(s) LG 32GP850, IIyama G2470HSU-B1
Case Corsair 5000D airflow tg
Audio Device(s) Marantz PM 6007 Triangle Esprit Titus Ez
Power Supply Corsair RM850X White
Mouse Logitech G PRO Superlight
Keyboard Corsair K70 RGB TKL Champion
Software Windows 11 64 bit, Free bitdefender
I hope that the RTX 3060 Ti will be $329-379 and the RTX 3060 will be $249-299 (if the 3060 is 6 GB, it will be $249-279).
RTX 3060 Ti = RTX 2080S +15% in 1440p
RTX 3060 = RTX 2080


WHY? Then there will be no need for an RTX 3070, since an RTX 2080 Ti, RTX 3060 Ti and 3070 will be equally fast, within a few fps; the RTX 2080 Ti is already +15% in 1440p.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I'll bet you'd have a bunch of fun
Too busy with reviews atm. Got 5x RTX 3070 custom designs, Ryzen 5000, and AMD's new cards soon, I hope. Also rebenching all my SSDs on a PCIe 4 platform, plus a 12-SSD review backlog. Also lots happening in real life
 

bug

Joined
May 22, 2015
Messages
13,773 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Too busy with reviews atm. Got 5x RTX 3070 custom designs, Ryzen 5000, and AMD's new cards soon, I hope. Also rebenching all my SSDs on a PCIe 4 platform, plus a 12-SSD review backlog. Also lots happening in real life
Wth do you mean, "real life"? You gotta set your priorities straight, man.
 
Joined
Oct 10, 2018
Messages
147 (0.07/day)
WHY? Then there will be no need for an RTX 3070, since an RTX 2080 Ti, RTX 3060 Ti and 3070 will be equally fast, within a few fps; the RTX 2080 Ti is already +15% in 1440p.
I exaggerated a little bit; its lead will be more like RTX 2070S vs RTX 2080, so the RTX 3060 Ti will be ~9% faster than the RTX 2080S.
 
Joined
Sep 3, 2019
Messages
3,510 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
Opportunity? I don't think so. Maybe in a year or two.
I should've put "opportunity" in quotes in that sentence, because it's not really one. That was my point.
If it were an opportunity, it would be very good news "for the first 8K gaming card", and I can't think that TPU would lose this "opportunity" to run benchmarks at such an impressive resolution.

My whole point was that it's not, because first of all the 3090 cannot run most games at 8K without producing a slideshow.
And I wanted to give you a heads-up that I mentioned you and the lack of 8K benchmarks on TPU to make that point, in case you'd like to make a statement of your own.
If I crossed any lines, I didn't mean to and I apologize. I'm also aware of the possibility that maybe you can't make such comments outside of a formal TPU article.

------------------------------------------------

The answer to that is YES, it is an 8K gaming GPU, as this video proves. Just because it was a sponsored video does not make it any less true, since it was a sponsored video but NOT a sponsored "marketing" video, & there is a difference. To that end, Nvidia sponsored this video so that a third party could showcase that their 3090 "Titan replacement" GPU can do 8K@60fps, which is what your whole argument was about to start with.
Providing a sponsored video (part of the marketing and promotion of a product) isn't proving anything, no matter how hard you try to present it like it does.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
If I crossed any lines, I didn't mean to and I apologize
No worries, sorry for being short and blunt sometimes, just a ton going on

I'm also aware of the possibility that maybe you can't make such comments outside of a formal TPU article.
Nah I'm always happy to comment on every reasonable question, anywhere :D
 
Joined
Jul 5, 2013
Messages
27,818 (6.68/day)
My whole point was that it's not, because first of all the 3090 cannot run most games at 8K without producing a slideshow.
So you didn't watch the video posted earlier from LTT showing Linus playing Doom and getting solid gameplay on an 8K display... Maybe you should give that a view. Just throwing it out there.

Providing a sponsored video (part of the marketing and promotion of a product) isn't proving anything, no matter how hard you try to present it like it does.
There is nothing wrong with sponsored videos. Get off your high horse.
 

Rei

Joined
Aug 1, 2020
Messages
656 (0.42/day)
Location
Guam
System Name 1 Desktop/2 Laptops/1 Netbook
Processor AMD Athlon 64 X2/Intel Pentium 997/Intel Pentium 4/Intel Atom
Motherboard EpoX ATX motherboard/Samsung/Toshiba/Lenovo
Cooling Stock
Memory 4 GB/4 GB/2 GB/2 GB
Video Card(s) Asus GeForce GTX 780 Ti/Intel HD Graphics/GeForce 4MX/Intel GMA
Storage 6+ TB Total
Display(s) HP Pavilion 14 Inch 1024x768@60Hz 4:3 Aspect Ratio CRT Monitor
Case None
Audio Device(s) Various
Power Supply Seasonic 500 Watt & VenomRX 500 Watt
Mouse Wayes Iron Man Wireless Mouse
Keyboard Rexus VR2 Wireless Keyboard
Software Win10 & WinXP SP3
Benchmark Scores It sucks...
Providing a sponsored video (part of the marketing and promotion of a product) isn't proving anything, no matter how hard you try to present it like it does.
Regardless of whether the video is sponsored or not, it is still proof that 8K@60fps is possible on the 3090. You just choose not to believe it.
So you didn't watch the video posted earlier from LTT showing Linus playing Doom and getting solid gameplay on an 8K display... Maybe you should give that a view. Just throwing it out there.


There is nothing wrong with sponsored videos. Get off your high horse.
Yeah, I think at this point he is either in denial, or just doesn't wanna admit that he is wrong, or both.
 