Friday, August 30th 2013
NVIDIA Teams Up With Warner Bros. on Batman: Arkham Origins
NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.
Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.
Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City's most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.
Batman has immense power, strength and speed - the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham's dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA antialiasing, soft shadows and various NVIDIA PhysX engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
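To give a flavour of what those GPU-accelerated environmental effects involve under the hood, here is a minimal, purely illustrative CUDA sketch of the kind of data-parallel particle update a GPU physics engine runs each frame. The struct, kernel, and launch parameters are invented for this example and are not the actual PhysX API.

// Illustrative only: one simulation step for a buffer of snow
// particles. Each GPU thread advances a single flake. Not PhysX.
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float x, y, z, vx, vy, vz; };

__global__ void stepSnow(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vy -= 9.8f * dt;        // gravity pulls each flake down
    p[i].x  += p[i].vx * dt;     // integrate position
    p[i].y  += p[i].vy * dt;
    p[i].z  += p[i].vz * dt;
    if (p[i].y < 0.0f) {         // flakes settle on the ground plane
        p[i].y  = 0.0f;
        p[i].vy = 0.0f;
    }
}

int main() {
    const int n = 1 << 20;                     // a million flakes
    Particle* d_p = nullptr;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));
    stepSnow<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    cudaFree(d_p);
    printf("stepped %d particles\n", n);
    return 0;
}

Because every particle is independent, the work maps cleanly onto thousands of GPU threads, which is why effects like snow, steam and cloth are offloaded to the graphics card rather than the CPU.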
"The Batman: Arkham games are visually stunning and it's great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins," said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. "With NVIDIA's continued support, we are able to deliver an incredibly immersive gameplay experience."
NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.
Additionally, any PAX attendees who purchase a qualifying bundle from the special kiosk at the NVIDIA booth on the show floor will receive a free limited-edition Batman lithograph -- one of only 1,000 being produced.
For a full list of participating bundle partners, visit: www.geforce.com/free-batman. This offer is good only until Jan. 31, 2014.
27 Comments on NVIDIA Teams Up With Warner Bros. on Batman: Arkham Origins
Once you disable the Nvidia-only goodies, AMD does okay for itself.
Does that mean this is definitely not a Gaming Evolved or Never Settle title, then? Both vendors seem to be working hard with everyone these days, so it's hard to keep up, especially since I came across GRID 2's Intel-backed nonsense. Can't they all sort their heads out? :ohwell: It's us poor gamers getting short shrift every time. :wtf:
Looking forward to it regardless, hopefully WB Games Montréal have done the franchise justice.
Might as well just release this as DLC for the previous game using UE 2.5 (a ten-year-old game engine) and add the features, if the core engine isn't going to be changed much. Then you get into the proprietary features they're touting.
Doesn't take a genius to figure that out. Last time I checked, there is no CUDA acceleration in any of the consoles this game will be released on.
Sending engineers to a studio is just marketing to sell GPUs :rolleyes:
This vendor proprietary stuff needs to stop. I won't support Nvidia as long as they continue to do it.
Would you be happy if an "AMD Evolved" title you were excited to play had various special features and effects that were blocked for you if it's detected that you have an Nvidia GPU?
As for the whole PhysX farrago, it's just as much a case of ATI's dithering as anything else. ATI originally looked into buying Ageia (I suppose they would buy the IP and then give it away free to everyone else?), then decided they'd hitch their wagon to Havok FX... Havok got swallowed by Intel and development went into a tailspin. Nvidia bought Ageia (for the not inconsiderable sum of around US$150 million) and offered PhysX to ATI (BTW, this is the same Roy Taylor who does the talking-head thing for AMD now)... ATI said no thanks, 'cos the future is HavokFX™... mmm, ok. AMD began a public thrashdown of PhysX, thinking that HavokFX would eventually run riot. A couple of months later the PhysX engine was incorporated into the Nvidia driver, with AMD locked out. Now who didn't see that coming?
If ATI/AMD wanted physics so badly, they'd either stump up for a licence or help develop an engine. They did neither. If they can't be bothered, and by all accounts the majority of the AMD user base has no time for PhysX, what precisely is the issue?
You're right, it's funny :D:D:D
Not doing yourself any favours, are you? :slap:
Roph butthurt. :laugh:
I see you renewed your contract as Nvidia PR puppet again.
Try reading the post and who I was responding to before you make an ass out of yourself like always.
I was responding to Fourstaff's assertion that Nvidia engineers were sent over to make the game better. If that were the case, the game would be improved across all platforms, not just on the PC, where only a certain percentage of users happen to have the corresponding hardware to take advantage of the proprietary API.
Incoming excuses in 3, 2, 1...
Not that I really care, as the only thing I've seen it be good at is killing FPS.
It's not something that will get most people to buy an Nvidia card, though it might encourage people with ATI cards to get an Nvidia card as their second.
Blocking shit through drivers is so lame, and all they have to say is that mixed ATI and Nvidia configs are not supported but may work.
Pricks.
What people don't understand is that it's a compute-based physics feature, meaning you need a GPU that's good at compute, which is exactly what the GCN architecture excels at. The 5870 was junk at compute, and Nvidia's Kepler is the same way unless you have a Titan or a Quadro/Tesla card.
Kepler has had the double-precision part of the GPU stripped back for the GeForce series, which is why performance with TressFX isn't that great on Nvidia cards, or on other cards that are weak in compute (like the 5870).
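For anyone curious where their own card sits on that compute spectrum, here is a minimal CUDA sketch that queries each device's compute capability and streaming-multiprocessor count, a rough proxy for compute strength. Illustrative only: TressFX itself runs on DirectCompute, so this reports general capability rather than TressFX support.

// Illustrative device query: compute capability and SM count are a
// rough proxy for the "compute strength" discussed above.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s (compute capability %d.%d, %d SMs)\n",
               i, prop.name, prop.major, prop.minor,
               prop.multiProcessorCount);
    }
    return 0;
}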
I played and beat Tomb Raider with a GTX 680, and once Nvidia released drivers that worked well with the final game code, it ran like butter. I was getting a constant 50-60 FPS maxed out (it would drop below that, but never long enough for me to notice, and my eyes are pretty tuned to frame-rate drops). I get even more now with the 780. GeForce cards handle TressFX through pretty much brute force, since their compute performance is lackluster.