
NVIDIA Fine-Tunes Llama3.1 Model to Beat GPT-4o and Claude 3.5 Sonnet with Only 70 Billion Parameters

Joined: Jan 11, 2022
Messages: 898 (0.84/day)
Would have been nice if Intel hadn't dumped 3D XPoint, so a TB of that could be put on a 5090.
 
Joined: Dec 14, 2011
Messages: 1,067 (0.22/day)
Location: South-Africa
Processor: AMD Ryzen 9 5900X
Motherboard: ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling: Noctua NH-D15 G2
Memory: 32GB G.Skill DDR4 3600 MHz CL18
Video Card(s): ASUS GTX 1650 TUF
Storage: SAMSUNG 990 PRO 2TB
Display(s): Dell S3220DGF
Case: Corsair iCUE 4000X
Audio Device(s): ASUS Xonar D2X
Power Supply: Corsair AX760 Platinum
Mouse: Razer DeathAdder V2 - Wireless
Keyboard: Corsair K70 PRO - OPX Linear Switches
Software: Microsoft Windows 11 - Enterprise (64-bit)
Up - 1 R

Yep, it works fine.

I was just about to... erm, yeah, you got here first. :D

Glad to see all that VRAM going to "good use" while we have to pay top dollar for a mediocre amount so our games don't stutter. nVstuttterrrrrrrr: here are all your "R"s, Mr. A.I. :roll::roll::roll::roll:
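For anyone who wants to re-run that "count the R's" check themselves, here is a minimal sketch. It assumes the model being discussed is the publicly posted nvidia/Llama-3.1-Nemotron-70B-Instruct-HF checkpoint on Hugging Face (the repo name is an assumption here), and that you have enough GPU memory, or offloading via accelerate, to hold a 70B-parameter model:

```python
# Minimal sketch of the "strawberry" test, assuming the Hugging Face repo name below.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/Llama-3.1-Nemotron-70B-Instruct-HF"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spreads the 70B weights across available GPUs (needs accelerate)
)

question = "How many letters 'r' are in the word 'strawberry'?"
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": question}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Greedy decoding so the answer is reproducible
output = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))

# Ground truth, no GPU required:
print("strawberry".count("r"))  # 3
```

Even quantised, a 70B model is far beyond a single consumer card, which is rather the point of the VRAM complaints above.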
 
Joined: Sep 30, 2024
Messages: 99 (1.36/day)
If A.I., especially from nGreedia, is so good, how come it's been two years since they announced that they would use A.I. to create graphics drivers for their cards...? Just marketing BS, as somebody has already shown in this thread. All it's good for is language, fake voices and A.I. image generation.
 

AleksandarK

News Editor
Staff member
Joined: Aug 19, 2017
Messages: 2,626 (0.98/day)
If A.I., especially from nGreedia, is so good, how come it's been two years since they announced that they would use A.I. to create graphics drivers for their cards...? Just marketing BS, as somebody has already shown in this thread. All it's good for is language, fake voices and A.I. image generation.
NVIDIA already uses AI to design chips, so don't underestimate the power of AI just yet. For some things it's good, for some things not so much ;)
 
Joined: Sep 30, 2024
Messages: 99 (1.36/day)
NVIDIA already uses AI to design chips, so don't underestimate the power of AI just yet. For some things it's good, for some things not so much ;)
Well, it was NV themselves who said they would use A.I. to write their display drivers. They of all people would know whether it could do it or not. My point is that A.I., beyond certain low-level functions, is not the be-all and end-all it is hyped to be if even NV can't make use of it in two years.

Just look at the "new" perma-beta NV App... It's been a year, and they still haven't been able to add even half the functionality of their 20-year-old control panel. If A.I. were what NV promises it is, why can't they get it to do anything meaningful for them?

We have learned that NV announcing something doesn't necessarily mean they will deliver it. For the most part, I feel A.I. is fool's gold.
 
Joined: Jul 24, 2024
Messages: 263 (1.87/day)
System Name: AM4_TimeKiller
Processor: AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard: ASUS ROG Strix B550-E Gaming
Cooling: Arctic Freezer II 420 rev.7 (push-pull)
Memory: G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s): ASRock Radeon RX 7800 XT Phantom Gaming
Storage: Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case: Corsair 7000D Airflow
Audio Device(s): Creative Sound Blaster X-Fi Titanium
Power Supply: Seasonic Prime TX-850
Mouse: Logitech wireless mouse
Keyboard: Logitech wireless keyboard
NVIDIA already uses AI to design chips, so don't underestimate the power of AI just yet. For some things it's good, for some things not so much ;)
If so-called "AI" is really used by Nvidia to design their chips, then god help us. I wonder how many transistors in that chip are just a waste of sand.

If you feed lots of electronic circuit schematics through the model, it will still just deliver another schematic based on the provided data. It can't really think, so don't expect an improvement. I'm not saying it can't yield some useful stuff, but the chances are really low.
 