Thursday, May 4th 2023

Gigabyte Mid-Range GPU Lineup Leaked, Radeon RX 7600 & RTX 4060 Ti Cards Incoming

Hardware tipster harukaze5719 has once again spotted the registration of a new set of Gigabyte products. The Eurasian Economic Commission (EEC) has updated its registry with a slew of NVIDIA and AMD graphics cards; the listings were created this morning, and harukaze5719 picked up on them almost immediately. Gigabyte's new mid-range lineup comprises nine GeForce RTX 4060 Ti models and just a pair of custom-design Radeon RX 7600 cards, with 8 GB of VRAM the common allocation across the board.

Gigabyte's two Radeon RX 7600 cards (Gaming & Gaming OC) are likely set for a May 25 launch, according to a combination of official AMD material and rumors from the past week or two; the slightly beefier Radeon RX 7600 XT is rumored to arrive the same day. Leaked embargo information from earlier this week suggests that NVIDIA will launch its GeForce RTX 4060 Ti lineup at the end of May, so an almost direct clash with AMD's upcoming Radeon RX 7600 and 7600 XT cards is expected during that period.
AMD Radeon RX Cards:
  • Gigabyte Radeon RX 7600 8 GB Gaming (GV-R76GAMING-8GD)
  • Gigabyte Radeon RX 7600 8 GB Gaming OC (GV-R76GAMING OC-8GD)
NVIDIA GeForce RTX Cards:
  • Gigabyte GeForce RTX 4060 Ti AORUS Extreme 8 GB (GV-N406TAORUS E-8GD)
  • Gigabyte GeForce RTX 4060 Ti AERO OC 8 GB (GV-N406TAERO OC-8GD)
  • Gigabyte GeForce RTX 4060 Ti AERO 8 GB (GV-N406TAERO-8GD)
  • Gigabyte GeForce RTX 4060 Ti Gaming OC 8 GB (GV-N406TGAMING OC-8GD)
  • Gigabyte GeForce RTX 4060 Ti Gaming 8 GB (GV-N406TGAMING-8GD)
  • Gigabyte GeForce RTX 4060 Ti Eagle OC 8 GB (GV-N406TEAGLE OC-8GD)
  • Gigabyte GeForce RTX 4060 Ti Eagle 8 GB (GV-N406TEAGLE-8GD)
  • Gigabyte GeForce RTX 4060 Ti WindForce OC 8 GB (GV-N406TWF2OC-8GD)
  • Gigabyte GeForce RTX 4060 Ti WindForce 8 GB (GV-N406TWF2-8GD)
Thanks go to VideoCardz for fully naming and tabulating the graphics card models above.
Sources: EEC, harukaze5719 Tweet, VideoCardz

27 Comments on Gigabyte Mid-Range GPU Lineup Leaked, Radeon RX 7600 & RTX 4060 Ti Cards Incoming

#1
Chaitanya
Hoping for true 2 slot designs.
Posted on Reply
#2
Denver
Both should have at least 12 GB.
AMD is throwing its marketing into the trash. Meh.
Posted on Reply
#3
ZoneDymo
Denver: Both should have at least 12 GB.
AMD is throwing its marketing into the trash. Meh.
^ This, I really hope the media makes infinite fun of them again for that crap
Posted on Reply
#4
Dirt Chip
8GB for mid-range GPU?!?
Shame! Shame! Shame!
Posted on Reply
#5
MrDweezil
Pricing is everything. 8 GB declares the 7600s as 1080p cards and they'll need to be $250, maybe $300 for the XTs. I suspect Nvidia will charge another $100 on top of that, which seems like a non-starter.
Posted on Reply
#6
wNotyarD
Dirt Chip: 8GB for mid-range GPU?!?
Shame! Shame! Shame!
I don't know if the 6 series from both teams are considered mid-range nowadays. I'd judge them as mid-low, while the 7's are mid-high.
Posted on Reply
#7
ixi
9 models for the 60 Ti. Loooooooooool, that gave me a good laugh.
Posted on Reply
#8
Icon Charlie
Dirt Chip: 8GB for mid-range GPU?!?
Shame! Shame! Shame!
I agree. When a company throws shade at another over specs, and then does the same thing???

HYPOCRITES here in Silicon Valley.
Posted on Reply
#9
Arkz
6 years ago the RX 580 launched with 8GB, 6 damn years. Games have become significantly more VRAM-heavy, and they'll launch this card, still with 8GB, for probably double or more the price.

I'm sure 8GB is fine for most games, especially in 1080p. But we've seen the lack of optimization in many ports recently, eating loads of VRAM. And people are probably buying these to last them a good few years, where 8GB for 1080p may really start to struggle in a lot of stuff. And with all the other effects going on in games, turning textures down to medium may not help as much as you hope.
Posted on Reply
#10
ixi
Good news: people with logical thinking will avoid these products :). Bad thing... people without knowledge will buy shi**** prebuilds with these GPUs...
Posted on Reply
#11
RedelZaVedno
Arkz: 6 years ago the RX 580 launched with 8GB, 6 damn years.
Even worse... the first Polaris RX 480 with 8 gigs was released 7 years ago and cost $229 :(
Posted on Reply
#12
ixi
RedelZaVedno: Even worse... the first Polaris RX 480 with 8 gigs was released 7 years ago and cost $229 :(
Even worse.... the first GPU with 8 GB of VRAM was Sapphire's R9 290X, 9 years ago.
Posted on Reply
#13
Unregistered
RedelZaVedno: Even worse... the first Polaris RX 480 with 8 gigs was released 7 years ago and cost $229 :(
Polaris was being given away at that price and they still didn't have much market presence.

AMD posted a loss last quarter. Maybe GPU prices aren't their fault? TSMC and ASML post insane profits, and who knows about the other component makers.

People praise the GTX 10 series, but those dies were actually much smaller for their equivalent tier: the 80-series die was similar in size to the 4070's, and the 60-series die was 200 mm², quite a bit smaller than an RX 6600 XT's.

And honestly, how is it people are never happy? I've loved the level of graphics an optimized PS4 game had at 1080p. Heck, if I had a native 720p screen, I might find that good enough. Developers release games like Jedi Survivor with massive textures for what? People complain about performance and it doesn't "look better." My favorite game Sekiro is 1/10th the size and it looks as good and runs great! "But if you stop playing the game and take a screenshot, look at the difference with a microscope, it's so much sharper!" What will ever be enough for people? I wish games traded realism for artistic styles anyway. I don't need skin to look real and have pores in my games...
#14
ymdhis
Haserath: Polaris was being given away at that price and they still didn't have much market presence.
Polaris has been the most popular discrete Radeon card on Steam by a wide margin, despite two-plus hardware generations coming and going since then. In fact it is only rivaled by onboard Radeon Graphics (!!!), followed by the RX 5700 and RX 6700 XT.

Saying it does not have much market presence is simply not true.
Posted on Reply
#15
ZoneDymo
Haserath: Polaris was being given away at that price and they still didn't have much market presence.

AMD posted a loss last quarter. Maybe GPU prices aren't their fault? TSMC and ASML post insane profits, and who knows about the other component makers.

People praise the GTX 10 series, but those dies were actually much smaller for their equivalent tier: the 80-series die was similar in size to the 4070's, and the 60-series die was 200 mm², quite a bit smaller than an RX 6600 XT's.

And honestly, how is it people are never happy? I've loved the level of graphics an optimized PS4 game had at 1080p. Heck, if I had a native 720p screen, I might find that good enough. Developers release games like Jedi Survivor with massive textures for what? People complain about performance and it doesn't "look better." My favorite game Sekiro is 1/10th the size and it looks as good and runs great! "But if you stop playing the game and take a screenshot, look at the difference with a microscope, it's so much sharper!" What will ever be enough for people? I wish games traded realism for artistic styles anyway. I don't need skin to look real and have pores in my games...
I totally get your point and a lot of it is simply true. That said, Crysis came out over a decade ago and those character models still look perfectly fine, even impressive, compared to modern games, and it didn't need this much VRAM; but obviously RT wasn't a factor then either.

I personally think games can look simple but pleasant, coherent, like the Valve games of old, like L4D, but that new Redfall for example just looks ugly; nothing is placed in the world, it's all just floating light-emitting objects...
But I can start up an older game like Penumbra and enjoy it completely fine.

What I want is devs to stop being lazy and actually move the bar of game design forward instead of giving us more of the same with just better graphics. If you've been playing games for the last 20 years... well, nothing has changed; all the typical game limitations you had back then, you still have today, and all the games that pushed the boundaries back then are sadly still the games that push the boundaries today. Take the damage/physics model of Red Faction or Soldier of Fortune...

Why is the AI in a brand-new title like A Plague Tale: Requiem reacting the same as the AI in any other stealth game ever released?

WHERE IS GENUINE PROGRESS... gawd damn it, Cyberpunk was supposed to move that needle but it just set the clock back over two decades and still got praised by many... so we are just doomed really.
Posted on Reply
#16
Roph
MrDweezil: Pricing is everything. 8 GB declares the 7600s as 1080p cards and they'll need to be $250, maybe $300 for the XTs. I suspect Nvidia will charge another $100 on top of that, which seems like a non-starter.
Heck, 720p cards a few years from now at this rate.
Posted on Reply
#17
tussinman
MrDweezil: Pricing is everything. 8 GB declares the 7600s as 1080p cards and they'll need to be $250, maybe $300 for the XTs. I suspect Nvidia will charge another $100 on top of that, which seems like a non-starter.
Yeah, pricing is the key. The 4060 Ti, if I'm not mistaken, is going to start at $450, so 8 GB for that is a joke.

If the 7600 is a sub-$300 card then AMD will do some sort of damage control like "this isn't the 4060 competitor" or "we're pricing this as a 1080p card"
Posted on Reply
#18
Unregistered
ymdhis: Polaris has been the most popular discrete Radeon card on Steam by a wide margin, despite two-plus hardware generations coming and going since then. In fact it is only rivaled by onboard Radeon Graphics (!!!), followed by the RX 5700 and RX 6700 XT.

Saying it does not have much market presence is simply not true.
Ok, I just find it unfair when the better, cheaper GPU loses out in sales at least 4:1 and AMD can't make money on GPUs outside of crypto nonsense. It may be the most popular AMD card, but that also is a sad state of affairs, because the people that bought it only bought it for how cheap it was. I hated seeing them try to come up with a way to make themselves look like the good guys for buying Polaris when AMD stopped selling charity GPUs. "Why have they abandoned their fans" BS. These gamers are just as greedy in their hearts as these corporations and it makes me mad.
ZoneDymo: I totally get your point and a lot of it is simply true. That said, Crysis came out over a decade ago and those character models still look perfectly fine, even impressive, compared to modern games, and it didn't need this much VRAM; but obviously RT wasn't a factor then either.

I personally think games can look simple but pleasant, coherent, like the Valve games of old, like L4D, but that new Redfall for example just looks ugly; nothing is placed in the world, it's all just floating light-emitting objects...
But I can start up an older game like Penumbra and enjoy it completely fine.

What I want is devs to stop being lazy and actually move the bar of game design forward instead of giving us more of the same with just better graphics. If you've been playing games for the last 20 years... well, nothing has changed; all the typical game limitations you had back then, you still have today, and all the games that pushed the boundaries back then are sadly still the games that push the boundaries today. Take the damage/physics model of Red Faction or Soldier of Fortune...

Why is the AI in a brand-new title like A Plague Tale: Requiem reacting the same as the AI in any other stealth game ever released?

WHERE IS GENUINE PROGRESS... gawd damn it, Cyberpunk was supposed to move that needle but it just set the clock back over two decades and still got praised by many... so we are just doomed really.
I’m completely with you about design.

The problem is not laziness(I do not consider laziness to be real) but lack of creativity/unwillingness to risk it imo. These big games take too much money and want to make as much as possible so they want to cookie cutter their way to sales. Zelda and Souls, my two favorite series, went open world and sales exploded, but I’ve now played 4 open world games and am sick with how they implement them. Big, open, boring, and actually too big that they allow poor designs in their game. I got so tired of shrines/koroks and Elden ring bosses I thought were genuinely awful, and if I want nature and space, I can go hiking, which is far better than any open world game.

I was also interested in New Pokemon Snap due to the first one being a fun design but that game had such a sterile design to it and people still liked it. I don’t understand how people discern anything.

I’m not really sad if games aren’t interesting. There’s a lot more I would rather do with my time at this point.
#19
Roph
Haserath: It may be the most popular AMD card but that also is a sad state of affairs because the people that bought it also only bought it for how cheap it was. I hated seeing them try to come up with a way to make themselves look like the good guys for buying Polaris when AMD stopped selling charity GPUs. "Why have they abandoned their fans" BS. These gamers are just as greedy in their hearts as these corporations and it makes me mad.
You should be mad about AMD being unable to follow up Polaris, shitting out a $200 card six years later that performs worse (or the same at best) and has fewer features, like a missing video encoder. The 6500 XT was e-waste before it even touched shelves.

It's not on consumers to be a charity for GPU makers. If prices are obscene at the high end or absurd in the midrange with no improvement 6 years later, they're not under some bizarre obligation to buy.
Posted on Reply
#20
Unregistered
Roph: You should be mad about AMD being unable to follow up Polaris, shitting out a $200 card six years later that performs worse (or the same at best) and has fewer features, like a missing video encoder. The 6500 XT was e-waste before it even touched shelves.

It's not on consumers to be a charity for GPU makers. If prices are obscene at the high end or absurd in the midrange with no improvement 6 years later, they're not under some bizarre obligation to buy.
We don’t know the cost to make these cards. TSMC makes so much net income now it’s absurd. The cost to set up the process to make a new die skyrockets each node. The 6500 XT isn’t even e-waste. Plenty of people are satisfied by it. It can’t play some of the latest high end games, so what?

I have to ask, who uses video encoding? How many people are buying a low end card to do video encoding? It was such an absurd argument that I can only imagine Nvidia made it pertinent to other people and they ate it up.

I know consumers don’t have to buy their product. It’s just been annoying when people here or other places have implied AMD owes them something for purchasing their cards as some sort of supporter.

I get that pricing isn't like it used to be, but semiconductor progress is slowing down and costs/payments to the monopolies are skyrocketing. We can't have GPUs improve like they used to, since they are basically tied to Moore's Law and that has rapidly diminished. 3 nm and 2 nm are only getting worse on returns.
#21
N/A
ixi: Even worse.... the first GPU with 8 GB of VRAM was Sapphire's R9 290X, 9 years ago.
Yeah, over a 512-bit bus, and did it really need it? The GTX 780 only had 3 GB and was doing just fine. But I have to admit that if my GTX 760 had 4 GB it would make me happier, as currently I get textures not loading properly even at the lowest resolutions.

With the current densities this is the best they can do, so why should it get more? If we take the 4090 as 100% and the 4070 as 50% performance, 24 GB and 12 GB seem logical; the same goes for the 4080 and 4060 Ti: half the performance, half the VRAM, so why would it need more? I guess once GDDR7 gets a node shrink, 16 GB on a 128-bit bus will be doable, but until then it will be a bit stagnant.
Posted on Reply
#22
Dirt Chip
wNotyarD: I don't know if the 6 series from both teams are considered mid-range nowadays. I'd judge them as mid-low, while the 7's are mid-high.
If they both aim at FHD at max settings, they are. They should, anyway.
Posted on Reply
#24
Dirt Chip
aindriu: 1080p gaming in 2023
...is a very common thing alright.
Posted on Reply
#25
mama
MrDweezil: Pricing is everything. 8 GB declares the 7600s as 1080p cards and they'll need to be $250, maybe $300 for the XTs. I suspect Nvidia will charge another $100 on top of that, which seems like a non-starter.
The way things are going I don't know if anyone's gonna go 8GB on a new GPU for any price.
Posted on Reply