Monday, March 20th 2023

Gigabyte Confirms Upcoming RTX 4070 and 4060 Graphics Cards

Gigabyte has all but confirmed NVIDIA's upcoming GeForce RTX 4070 and RTX 4060 graphics cards by adding support for two of its own SKUs based on them to the Gigabyte Control Center software. The latest version, 23.03.02.01, adds support for the Gigabyte RTX 4070 AERO OC 12 GB and the Gigabyte RTX 4060 GAMING OC 8 GB graphics cards.

Gigabyte's product codes reveal a lot of information, including the first confirmation that the upcoming RTX 4060 will indeed ship with 8 GB of memory, just as previous rumors stated. The GeForce RTX 4070, as expected, comes with 12 GB of memory. The RTX 4070 is rumored to be announced on April 12th, with retail availability expected on April 13th. Neither the GeForce RTX 4060 nor the rumored RTX 4060 Ti has a launch date yet, but earlier rumors pointed to May.
Source: Videocardz

21 Comments on Gigabyte Confirms Upcoming RTX 4070 and 4060 Graphics Cards

#1
GunShot
8GB, whata-whata?!

Jensen is feeling tooOOoo good it seems.
Posted on Reply
#2
Gungar
GunShot: 8GB, whata-whata?!

Jensen is feeling tooOOoo good it seems.
It's probably the size of an XX30-class GPU, so you know it's enough VRAM for it.
Posted on Reply
#3
Arkz
8GB 4060 when the 3060 was 12GB, lol. I know that was a rather unique situation, but c'mon.
Posted on Reply
#5
TheinsanegamerN
And AMD will take advantage of this nonsense by doing absolutely nothing.

Imagine if they had the 7800/7700 ready to go already.
Posted on Reply
#6
hsew
About time they push x60 out of the weird 6/12GB paradigm.
Posted on Reply
#7
TheDeeGee
GunShot: 8GB, whata-whata?!

Jensen is feeling tooOOoo good it seems.
It's fine?

Or do you fancy 4K15?
Posted on Reply
#8
Bwaze
GunShot: 8GB, whata-whata?!

Jensen is feeling tooOOoo good it seems.
It's still a 500 EUR+ card, although at this stage of cutting down it will probably be slower than a PlayStation 5...
Posted on Reply
#9
john_
TheinsanegamerN: And AMD will take advantage of this nonsense by doing absolutely nothing.

Imagine if they had the 7800/7700 ready to go already.
And why should they have had the 7800/7700 ready? I mean, consumers blame AMD for Nvidia's pricing. It's ridiculous. So why offer them alternatives? AMD offered alternatives with the RX 6000 series, and consumers turned their backs on AMD.

AMD should concentrate on servers and laptops and leave consumers to pay for their loyalty to Nvidia with their wallets. AMD shouldn't waste time, resources, and money building products that consumers would find excuses not to buy anyway.
Posted on Reply
#10
BIGMicro
Bwaze: It's still a 500 EUR+ card, although at this stage of cutting down it will probably be slower than a PlayStation 5...
~$500 is a good price, but for the RTX 4070. The GTX 1060 was $299, and now almost double? I'm afraid $500 is a bad deal: half a thousand $/€ for what is essentially a Full HD card in 2023? The 4060 will not deliver the performance for RT; you will have to cut settings and/or use DLSS 3 with Frame Generation. On top of that, it's 2023 and those 8 GB don't seem very future-proof. The GTX 1070 offered roughly GTX Titan performance. The RTX 4070 will almost certainly be more than $500, and the performance won't reach 4K, certainly not with RT. If the RTX 3000 series had gotten the 3rd-generation RT cores, even in a limited form, all cards from the 4050 to the 4070, and maybe the 4080 (probably everything except the 4090), would be pointless. Or they should be 25-40% cheaper, because the performance and pricing of the RTX 4070 - 4080 is completely unacceptable.
Posted on Reply
#11
N/A
What a sloppy thermal pad job.
Posted on Reply
#12
Chaitanya
john_: And why should they have had the 7800/7700 ready? I mean, consumers blame AMD for Nvidia's pricing. It's ridiculous. So why offer them alternatives? AMD offered alternatives with the RX 6000 series, and consumers turned their backs on AMD.

AMD should concentrate on servers and laptops and leave consumers to pay for their loyalty to Nvidia with their wallets. AMD shouldn't waste time, resources, and money building products that consumers would find excuses not to buy anyway.
When AMD initially offered those alternatives, they were just as badly priced (MSRP) as the competition, and then the scourge of scalpers and miners didn't help the matter. When prices finally settled (thanks to low demand, AMD's prices crashed faster), most people had already paid scalpers to get alternatives, so demand had waned to a level where even price cuts didn't help much. AMD shot themselves in the foot with pricing in the last two generations of GPUs, and this time around with CPUs as well.
Posted on Reply
#13
medi01
TheinsanegamerN: Imagine if they had the 7800/7700 ready to go already.
Things are quite crowded from $600 and below.

And 7900 XT is $800.
Posted on Reply
#14
Bomby569
TheinsanegamerN: And AMD will take advantage of this nonsense by doing absolutely nothing.

Imagine if they had the 7800/7700 ready to go already.
Yes, this time they'll get to 20% market share. To the moon!
Posted on Reply
#15
chrcoluk
Arkz: 8GB 4060 when the 3060 was 12GB, lol. I know that was a rather unique situation, but c'mon.
Nvidia being Nvidia.
Posted on Reply
#16
Why_Me
I hope the 4060 Ti is at least a 10GB card.
Posted on Reply
#17
kiakk
BIGMicro: I'm afraid $500 is a bad deal
We could play lower-spec games. Graphics quality doesn't define how good a game is.
It's also not acceptable that they keep raising GPU TDPs. We enjoyed games back in the day, when a whole PC's power consumption was around 60-100 W. Now a mid-range gaming PC easily reaches 200-300 W, and a higher-end gaming PC 400-600 W. I would say this is nonsense, absurd idiocy. Just imagine how huge that power level is: an electric scooter draws 300-600 W, while a cycling human produces only ~150-250 W.
Posted on Reply
#18
Aretak
TheDeeGee: It's fine?

Or do you fancy 4K15?
No, it most certainly isn't fine, even for 1440p if you want to use ray tracing. 8GB cards choke and die in Hogwarts Legacy at 1440p if you turn RT on, as an example. A 3070 is as fast or slightly faster than a 2080 Ti until it runs out of VRAM, and yet...

Notice how the 2080 Ti is doing just fine and would be perfectly playable with DLSS enabled (heck, even a 6700 XT or A770 should be playable with FSR/XeSS). Notice, however, that there's absolutely no saving a 3070, or any other 8GB card. The 4060 should be in the same sort of performance range as those cards, so would be perfectly capable of a great RT experience... were it not for the gimped VRAM. Do you think games are going to use less VRAM in the future? We're just getting started on PS5/XSX exclusives without the PS4/Bone holding them back.

I must admit though, it's always a little impressive to me just what Nvidia fanboys will defend.
Posted on Reply
#19
A&P211
Bomby569: Yes, this time they'll get to 20% market share. To the moon!
Only thing to the moon is the price of gpu's today.
Posted on Reply
#20
john_
Chaitanya: When AMD initially offered those alternatives, they were just as badly priced (MSRP) as the competition, and then the scourge of scalpers and miners didn't help the matter. When prices finally settled (thanks to low demand, AMD's prices crashed faster), most people had already paid scalpers to get alternatives, so demand had waned to a level where even price cuts didn't help much. AMD shot themselves in the foot with pricing in the last two generations of GPUs, and this time around with CPUs as well.
Initial MSRP prices were good for the 6800/6900 cards. Then scalpers and miners started buying everything, and AMD came up with higher MSRPs for its mid-range products, probably knowing that prices would skyrocket anyway. When all that mess went away, AMD's prices went well below MSRP while Nvidia's prices still remained above MSRP. But people keep buying Nvidia. So there is demand, and buyers are willing to pay more for Nvidia products that offer less performance at a higher price. I think this loyalty to Nvidia pushes AMD to somewhat abandon the retail market and just maintain a presence here instead of fighting for market share. They know they would lose a price war with Nvidia, because buyers would give their money to Nvidia anyway. So they are probably focusing on CPUs and APUs while taking their time to optimize their chiplet approach for GPUs, which could eventually give them an advantage over Nvidia (seeing Nvidia these last years, I bet they will nail a chiplet approach in one go and stun AMD again; Huang is a genius, no one should deny that). If AMD manages to make its chiplet approach for GPUs both work and lower its costs at the same time, we might see them fighting for market share again. For now, they leave Nvidia to drive prices up.
Posted on Reply
#21
Bomby569
Aretak: No, it most certainly isn't fine, even for 1440p if you want to use ray tracing. 8GB cards choke and die in Hogwarts Legacy at 1440p if you turn RT on, as an example. A 3070 is as fast or slightly faster than a 2080 Ti until it runs out of VRAM, and yet...

Notice how the 2080 Ti is doing just fine and would be perfectly playable with DLSS enabled (heck, even a 6700 XT or A770 should be playable with FSR/XeSS). Notice, however, that there's absolutely no saving a 3070, or any other 8GB card. The 4060 should be in the same sort of performance range as those cards, so would be perfectly capable of a great RT experience... were it not for the gimped VRAM. Do you think games are going to use less VRAM in the future? We're just getting started on PS5/XSX exclusives without the PS4/Bone holding them back.

I must admit though, it's always a little impressive to me just what Nvidia fanboys will defend.
8 GB is simply not enough in 2023, but all I see there is a pointless benchmark. Those framerates and 1% lows make for a bad experience, and no one should play that game with those cards at 1440p ultra + ray tracing. You could have 16 GB and it would still be an idiotic choice. Lower the settings and everything will be much different, and playable. It's like measuring which economy car does better on an F1 track.
Posted on Reply