Tuesday, August 1st 2017

NVIDIA Unlocks Certain Professional Features for TITAN Xp Through Driver Update

In a bid to preempt sales of the Radeon Vega Frontier Edition and the Radeon Pro WX 9100, NVIDIA has expanded the feature-set of its consumer-segment TITAN Xp graphics card through a driver update, unlocking certain features previously reserved for its Quadro family of graphics cards. NVIDIA is rolling out its latest GeForce software update, which adds optimizations for professional applications such as Maya, claimed to unlock "3X more performance" in that software.

Priced at USD $1,199, the TITAN Xp packs a full-featured "GP102" graphics processor, with 3,840 CUDA cores, 240 TMUs, 96 ROPs, and 12 GB of GDDR5X memory across the chip's 384-bit wide memory interface. At its rated memory clock of 11.4 Gbps (GDDR5X-effective), the card has a memory bandwidth of 547.6 GB/s, higher than the 484 GB/s of the Radeon Vega Frontier Edition.
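The bandwidth figure follows directly from the effective data rate and the bus width. A minimal sketch of the arithmetic, assuming the quoted 11.4 Gbps is rounded from 8 × 1426 MHz = 11,408 Mbps, and that the Vega card's HBM2 runs at 1.89 Gbps per pin on a 2048-bit bus:

```python
def memory_bandwidth_gbs(rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times bus width in bytes."""
    return rate_gbps_per_pin * bus_width_bits / 8

# TITAN Xp: GDDR5X at 8 x 1426 MHz = 11.408 Gbps per pin, 384-bit bus
print(memory_bandwidth_gbs(11.408, 384))  # ~547.6 GB/s

# Radeon Vega Frontier Edition: HBM2 at 1.89 Gbps per pin, 2048-bit bus
print(memory_bandwidth_gbs(1.89, 2048))   # ~483.8 GB/s, the 484 GB/s cited above
```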

DOWNLOAD: NVIDIA GeForce 385.12 for TITAN Xp
Source: NVIDIA

92 Comments on NVIDIA Unlocks Certain Professional Features for TITAN Xp Through Driver Update

#51
Hood
[XC] Oj101It's highly fashionable and trendy to hate NVIDIA and Intel lately, even if they retain the performance crown in their respective markets.
Nothing new. It's always been one of the annoying facets of human nature, to tear down the guy (or company) on top, putting them under a microscope in search of the slightest flaw, and exaggerating any that are found. It seems worse lately, with the current atmosphere of entitlement - now lots of people feel it's their God-given right to dictate the policies of large, successful corporations, because of course they should make ME chairman of the board, because I'm special, all my teachers told me so when they were giving out trophies to everyone. It's gonna get worse before it gets better, so keep your mind right or you might get pulled into the abyss...
Posted on Reply
#52
Prince Valiant
HoodNothing new. It's always been one of the annoying facets of human nature, to tear down the guy (or company) on top, putting them under a microscope in search of the slightest flaw, and exaggerating any that are found. It seems worse lately, with the current atmosphere of entitlement - now lots of people feel it's their God-given right to dictate the policies of large, successful corporations, because of course they should make ME chairman of the board, because I'm special, all my teachers told me so when they were giving out trophies to everyone. It's gonna get worse before it gets better, so keep your mind right or you might get pulled into the abyss...
Excusing it is how you get shoddy products. No one reasonable expects a company to follow their expectations to the letter but there's nothing wrong with expecting a reasonable level of quality.
Posted on Reply
#53
cdawall
where the hell are my stars
Prince ValiantExcusing it is how you get shoddy products. No one reasonable expects a company to follow their expectations to the letter but there's nothing wrong with expecting a reasonable level of quality.
Would you call the 970 a shoddy product? Sales would say otherwise. I also don't really know anyone upset with them.
Posted on Reply
#54
Prince Valiant
cdawallWould you call the 970 a shoddy product? Sales would say otherwise. I also don't really know anyone upset with them.
Would you say that it's acceptable to deliberately mislead people with the specs and set a precedent for it to be acceptable? My previous post had nothing to do with the 970 specifically, just so that's clear.
Posted on Reply
#55
cdawall
where the hell are my stars
Prince ValiantWould you say that it's acceptable to deliberately mislead people with the specs and set a precedent for it to be acceptable? My previous post had nothing to do with the 970 specifically, just so that's clear.
It had 4gb of vram.
Posted on Reply
#56
Prince Valiant
In the strictest sense. Unless there were lots of asterisks specifying that a portion of that memory was slower, it's still misleading. If they had advertised it as a 3.5GB card, there wouldn't have been a lawsuit.
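For context, the figures reported at the time put the two segments far apart in speed. A rough sketch of that split, assuming 7 Gbps GDDR5 and eight 32-bit memory controllers, seven serving the fast 3.5 GB segment and one serving the slow 0.5 GB segment:

```python
GDDR5_RATE_GBPS = 7.0        # effective per-pin data rate (Gbps)
CONTROLLER_WIDTH_BITS = 32   # each memory controller is 32 bits wide

def segment_bandwidth(controllers: int) -> float:
    """Peak bandwidth in GB/s for a segment served by the given number of controllers."""
    return GDDR5_RATE_GBPS * CONTROLLER_WIDTH_BITS * controllers / 8

print(segment_bandwidth(7))  # fast 3.5 GB segment: ~196 GB/s
print(segment_bandwidth(1))  # slow 0.5 GB segment: ~28 GB/s
```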
Posted on Reply
#57
cdawall
where the hell are my stars
Prince ValiantIn the strictest sense. Unless there were lots of asterisks specifying that a portion of that memory was slower, it's still misleading. If they had advertised it as a 3.5GB card, there wouldn't have been a lawsuit.
It still has 4gb of ram...fast slow or on fire it is still there
Posted on Reply
#58
Prince Valiant
cdawallIt still has 4gb of ram...fast slow or on fire it is still there
Okay? They still deserved everything they got as a result of their decision.
Posted on Reply
#59
vega22
cdawallWould you call the 970 a shoddy product? Sales would say otherwise. I also don't really know anyone upset with them.
Sales would say the 1050 is the best GPU you can buy today, since it's what most people buying a GPU end up with. Yet that doesn't make it true :D

has anybody found out what they have actually enabled with these special sauce drivers?
Posted on Reply
#60
cdawall
where the hell are my stars
Prince ValiantOkay? They still deserved everything they got as a result of their decision.
From what it sounds like they got nothing. Yet to hear of anyone getting a payout.
Posted on Reply
#61
jabbadap
vega22has anybody found out what they have actually enabled with these special sauce drivers?
Yeah, there's no changelog or release notes for these drivers, so that is a good question.
Posted on Reply
#62
Prince Valiant
cdawallFrom what it sounds like they got nothing. Yet to hear of anyone getting a payout.
It's a class action suit. I can't think of any that were especially fast.

According to this the initial payout has happened or started:
www.bursor.com/2017/04/payments-sent-to-nvidia-gtx-970-class-members/

Settlement info (I have no idea if this is the final document):
cdn.arstechnica.net/wp-content/uploads/2016/07/show_temp.pl-2.pdf

If you want to slog through legal documents for more specific info, feel free.
Posted on Reply
#63
evernessince
birdie- Normal people don't have enough money to throw at Titans, and pros perfectly understand all the differences between different Titan versions. Also NVIDIA has never forced anyone to buy Titans.
- Never used one, don't care. And I'm an NVIDIA user. Come again please.
- Factually it has 4 gigs of VRAM, 0.5 of which runs slower, which made a lot of people angry, yet in the vast majority of games those 0.5 GB of slow VRAM don't affect performance/FPS in any way. The specs could have been better presented, true, and for that they lost a class-action lawsuit.
- GameWorks uses hardware features which work faster on NVIDIA GPUs. It's not like GW uses features which prevent AMD cards from running GW titles. In a lot of games you can disable GW features completely. Also, since NVIDIA is a dominant player in the GPU market and GW allows for better graphics, independent game developers use said features because they will work for at least 75% of people out there without a discernible performance loss.
- What?
- NVIDIA has never made anyone buy FE cards. Never.
- This has been proven to be false for at least a dozen times already.

So what do we see here? A load of falsehoods, incompetence, or baloney.

Yeah, come with another dozen please, because you've got nothing so far. Or just don't.
Actually, he's right about the GameWorks one. Nvidia doesn't just use features that work faster on Nvidia hardware; they push features to a nonsensical degree to hurt the competition and everything but their latest product. I most certainly remember Crysis 2's issues, where Nvidia enabled tessellation on geometry that wasn't even visible and tessellated the grass far beyond the point of visual improvement. Not to mention, this also crippled their previous-gen cards at the time, the 700 series. I remember the anger of 780 Ti owners on the Crytek forums. We could even go back to the start of GameWorks and point to Sacred 2. Enabling PhysX in that game on AMD hardware essentially made it unplayable. Mind you, this was a game that worked perfectly fine on AMD hardware in beta and alpha before Nvidia acquired Ageia and PhysX.

"NVIDIA is a dominant player in the GPU market and GW allow for better graphics, independent game developers use the said features because they will work for at least 75% of people out there without a discernible performance loss."

This is just a blatantly false statement. I can't even think of a game where GameWorks has improved the graphics. In fact, where GameWorks is implemented, performance often takes a dive for both Nvidia and AMD users. For example, God Rays in Fallout 4, a simple effect by any standard, performed terribly, entirely thanks to Nvidia. The only time GameWorks doesn't take a toll on performance is when it is only using PhysX lightly. In Borderlands 2, turning PhysX to medium or high introduces lag/stuttering, especially on high.

No, if you wanted to introduce performant effects into video games, you'd use one of AMD's technologies like TressFX, because everyone can optimize for it thanks to it being open. Compare that to Nvidia, where games in the GameWorks program must bar AMD from seeing large portions of the code, preventing them from optimizing.

GameWorks is absolute shit no matter which side you are on.
Posted on Reply
#64
TheGuruStud
cdawallFrom what it sounds like they got nothing. Yet to hear of anyone getting a payout.
Checks have been going out for weeks.

RIP P6000
Posted on Reply
#65
cdawall
where the hell are my stars
TheGuruStudChecks have been going out for weeks.

RIP P6000
Curious, there was someone in the other thread just complaining up and down about not getting it, etc.
Posted on Reply
#66
rtwjunkie
PC Gaming Enthusiast
SOSDD. I logged in just to see if by chance there might be a thread that didn't feature green/red hatred. I guess my expectations were too high.

A few level heads in here, and also some mild sarcasm meant to point out reality. 54thvoid said it best, to paraphrase: these are companies in business to make money, off you. Fanatical loyalty is misplaced, because neither one has any for you. It's money that matters! :cool:

Posted on Reply
#67
renz496
FordGT90ConceptWait, wait, wait! What did this driver supposedly unlock? Is it FP16-performance related? If it is, NVIDIA could be heading for a massive false-advertising lawsuit from the people they gouged (deep learning and AI developers). This would probably be immediately followed by an FTC anti-trust probe, because their position in the market allowed them to benefit hugely from the false advertising. There's plenty of supporting evidence for that charge as well (Founders Edition, TWIMTBP promotion, intentionally gimping performance in GameWorks games when run on competitors' hardware with no means for competitors to rectify it, and so on).
No. Other than GP100, every Pascal chip has its FP16 performance limited in hardware. This is just about providing a more "optimized" driver for existing professional applications, most of which don't even utilize FP16.

GameWorks is a licensing issue. It sucks on the consumer side for sure, but I don't think anything unfair is happening. AMD can go to Havok, for example, for source access, but they're not going to get it unless they pay the access fee. It is the same with GameWorks. Even AMD themselves did not open their tech for others to freely access until NVIDIA came out with GameWorks.
Posted on Reply
#68
Imsochobo
cdawallAnyone who complains about nvidia drivers should be forced to use AMD for a month with a single overclocked card. Nvidia drivers work so much better it isn't even funny.

That doesn't even start with the crossfire issues. Start a game up, guess crossfire isn't working today, reboot the game, still no crossfire, reboot the system, reset up wattman, oh it works kinda now.
Overclocking the GPU? BIOS flash?
I use an overclocking tool for a day and then flash the GPU, because I hate setting profiles in both Windows and Linux, so I never really experienced this.

2.
'Cause SLI works? Recently they've stopped using SLI in most games; it just isn't doing anything.
I'd say avoid multi-GPU regardless of vendor :)
BiggieShadyAh, I suspected as much ... although I haven't used them for gaming since the Evergreens, all I've been hearing is how it's much better now.
I use AMD at work, though, and have issues with multiple monitors ... sometimes my second screen stays locked at a low resolution and I'm unable to change the res unless the machine is restarted.
Same issues on our computers with NVIDIA and Intel graphics ;)
Posted on Reply
#69
DeathtoGnomes
BiggieShadyAh, I suspected as much ... although I haven't used them for gaming since the Evergreens, all I've been hearing is how it's much better now.
I use AMD at work, though, and have issues with multiple monitors ... sometimes my second screen stays locked at a low resolution and I'm unable to change the res unless the machine is restarted.
I use multiple monitors without such issues. Your IT guy is fluffing off if he can't fix that.
Posted on Reply
#70
FordGT90Concept
"I go fast!1!11!1!"
renz496No. Other than GP100, every Pascal chip has its FP16 performance limited in hardware. This is just about providing a more "optimized" driver for existing professional applications, most of which don't even utilize FP16.
Got source?
renz496AMD can go to Havok, for example, for source access, but they're not going to get it unless they pay the access fee.
Microsoft bought Havok from Intel. Microsoft is no doubt planning to implement Havok into DirectX. It won't be open source but Microsoft has a history of working with hardware manufacturers to get hardware acceleration working.
Posted on Reply
#71
Vayra86
FordGT90ConceptGot source?


Microsoft bought Havok from Intel. Microsoft is no doubt planning to implement Havok into DirectX. It won't be open source but Microsoft has a history of working with hardware manufacturers to get hardware acceleration working.
Not a conclusive source, but Google turned up this one, which I find quite interesting; several responses suspect a separate hardware block on GP100:

devtalk.nvidia.com/default/topic/1001191/how-fp32-and-fp16-units-are-implemented-in-gp100-gpus/
Posted on Reply
#72
cdawall
where the hell are my stars
ImsochoboOverclocking the GPU? BIOS flash?
I use an overclocking tool for a day and then flash the GPU, because I hate setting profiles in both Windows and Linux, so I never really experienced this.

2.
'Cause SLI works? Recently they've stopped using SLI in most games; it just isn't doing anything.
I'd say avoid multi-GPU regardless of vendor :)
Ah, so how's this signed BIOS junk treating you? It's my favorite thing, having to bypass checking to get a BIOS mod to work.

Multi-GPU works fine for NVIDIA right now. I am using it. Either it works in a game or it doesn't; it isn't a game of checking whether the profile launches correctly, like CrossFire.
Posted on Reply
#73
renz496
FordGT90ConceptGot source?
www.nvidia.com/object/accelerate-inference.html

These are NVIDIA's Tesla cards made specifically for deep-learning stuff. Look at the Tesla P4 (GP104-based) and the Tesla P40 (GP102-based). For both products NVIDIA did not mention FP16 performance; only the Tesla P100's FP16 performance was listed. For this kind of product there is no reason for NVIDIA to artificially limit FP16 performance unless that performance was poor to begin with. And then a source from AnandTech:

www.anandtech.com/show/10510/nvidia-announces-nvidia-titan-x-video-card-1200-available-august-2nd

Look at the spec table: for both the Titan X (2016) and the GTX 1080, FP16 (native) performance is listed as 1/64 of FP32 performance.
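To put that 1/64 rate into numbers, here's a rough sketch: peak FP32 throughput is 2 FLOPs per core per clock (fused multiply-add), and native FP16 on these chips runs at 1/64 of that. Boost clocks below are the reference figures; the outputs are theoretical peaks, not measurements:

```python
def peak_tflops(cuda_cores: int, boost_mhz: float) -> float:
    """Peak FP32 throughput in TFLOPS: 2 FLOPs (FMA) per core per clock."""
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

for name, cores, mhz in [("Titan X (2016)", 3584, 1531), ("GTX 1080", 2560, 1733)]:
    fp32 = peak_tflops(cores, mhz)
    # native FP16 at the 1/64 rate discussed above
    print(f"{name}: {fp32:.1f} TFLOPS FP32, {fp32 / 64:.2f} TFLOPS FP16")

# Titan X (2016): ~11.0 TFLOPS FP32 -> ~0.17 TFLOPS native FP16
# GTX 1080:       ~8.9 TFLOPS FP32 -> ~0.14 TFLOPS native FP16
```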
FordGT90ConceptMicrosoft bought Havok from Intel. Microsoft is no doubt planning to implement Havok into DirectX. It won't be open source but Microsoft has a history of working with hardware manufacturers to get hardware acceleration working.
That prospect is interesting, but a physics engine is integrated at the game-engine level, not the operating-system level. Imagine the game suddenly not working when ported to Linux or PS4, since neither uses Microsoft DirectX. I don't think developers like the idea of changing the physics engine in their game based on the operating system, because right now the physics engine they use will pretty much work on all platforms.
Posted on Reply
#74
BiggieShady
DeathtoGnomesI use multiple monitors without such issues. Your IT guy is fluffing off if he can't fix that.
I also use multiple monitors without such issues, most of the time, you gnome killer :p ... then I accidentally knock the VGA cable off while in the lock screen with the 2nd display off (damn laptop has one HDMI and one VGA), and when I reconnect, unpredictable bugs ensue ... it may be a Win10 thing, but it's also a first-gen GCN mobile GPU in an old piece-of-crap laptop
Posted on Reply
#75
TheGuruStud
BiggieShadyI also use multiple monitors without such issues, most of the time, you gnome killer :p ... then I accidentally knock the VGA cable off while in the lock screen with the 2nd display off (damn laptop has one HDMI and one VGA), and when I reconnect, unpredictable bugs ensue ... it may be a Win10 thing, but it's also a first-gen GCN mobile GPU in an old piece-of-crap laptop
Oh, yeah, Windows is of no help. I just use the driver panel to set displays. Miraculously (ha), it works just fine after that.
Posted on Reply