
Editorial: It's Sony, Not AMD in GeForce Titan's Crosshair

That PC Alliance should have done its job of promoting some standards and quality in the first place.
PC already has standards (OpenGL, Direct3D), but the quality of drivers is not up to par. Making per-game tweaks shouldn't be necessary if the software were properly specified and its internals well built. But, you know, they also want developers to partner with AMD or NVIDIA, not both, so doing nasty things on purpose is also something to take into account.
 
Consoles have the one thing that gamers want that desktops will never have: ease of use.

It takes a lot more time and effort to maintain a desktop, whereas a console is automatic. Fewer parts to fail, less to worry about. Flick the switch on and you are gaming.
 
Hopefully the prices won't be as idiotically high as they have been until now, because I wouldn't mind going with GeForce this time around...
 
Consoles have the one thing that gamers want that desktops will never have: ease of use.

It takes a lot more time and effort to maintain a desktop, whereas a console is automatic. Fewer parts to fail, less to worry about. Flick the switch on and you are gaming.

RROD victims would disagree.
 
Don't the PlayStation 4, Wii U, and next Xbox all have an AMD GPU? NVIDIA is left competing only in the PC market, so they have to hit hard. It makes sense that NVIDIA would launch a monster: their revenue stream is in danger of drying up.

Hardly drying up. They're killing it in the mobile market with Tegra, and a lot of the GPU development work gets fed into that program as well. They're also still performing well in the desktop GPU market.

If anything, NVIDIA is in a much stronger position now than they've ever been.
 
I think the new consoles will have more in-game physics effects with all those CPU cores... So much for PhysX, NVIDIA :laugh:

With Sony and MS both aiming for 60 FPS at 1080p (well, they should be!), and with consoles using hardware more similar to PCs than before, we should in theory see better-looking ports; how they will actually play and perform on PC, in terms of port quality, remains to be seen.

As DX9 was the previous standard for consoles, does anyone know whether next-gen console games will be developed using DX10 or DX11, etc.?

As both new consoles will use Blu-ray, will this mark the end of DVD as a format for PC games too? (I know there are other factors, like digital distribution.)
 
RROD victims would disagree.

Yeah, it kinda sucked for Xbox and PS3 users to hit such hardware failures. I still have my original SNES and N64, and they've had quadruple the game time put on them in the time the Xbox/PS3 have been out. One factor in the Xbox/PS3 failures was the lead-free solder.
 
PC already has standards (OpenGL, Direct3D), but the quality of drivers is not up to par. Making per-game tweaks shouldn't be necessary if the software were properly specified and its internals well built. But, you know, they also want developers to partner with AMD or NVIDIA, not both, so doing nasty things on purpose is also something to take into account.

I'm not talking about those. I'm talking about optimizations for multi-core CPUs and for DX10/10.1/11/11.1 in every game those partner companies launch. I'm talking about working with other devs to implement more features for the PC crowd, about making a game for PC first and then porting it to consoles. I'm talking about GPU-accelerated physics and AI, and so the list goes on. :)
 
RROD victims would disagree.

You missed the point. And even so, PCs have many, many more ways to fail and a lot more software layers to get through.
 
You missed the point. And even so, PCs have many, many more ways to fail and a lot more software layers to get through.

They only fail when people think they know how to take care of an OS better than MS does, or fail to follow the manufacturer's directions.
 
Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
 
Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.

I render BF3 well above 60 FPS; the problem is my monitor. If you could get a 240 Hz monitor/TV, I could run Quake 3 all day at 240 FPS. :laugh: I think it all depends on the game and the monitor, because even current APUs can do that. They are not saying the PS4 will run BF4 at 240 FPS; they are just saying it can run SOMETHING at 240 FPS.

FYI, Carmack already said next-gen consoles are staying at the 30 FPS cap in most circumstances, so that "analyst" is playing with semantics... or is just a retard.
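
To put the numbers in perspective, here's a quick frame-budget calculation (a minimal sketch in C; just the arithmetic behind the frame rates discussed in this thread, nothing platform-specific):

#include <stdio.h>

/* Frame-time budget at common target frame rates.
 * At 240 FPS the engine has ~4.2 ms to simulate, cull, and
 * render an entire frame; at the console-typical 30 FPS cap
 * it has ~33.3 ms, eight times as long. */
int main(void)
{
    const int targets[] = { 30, 60, 120, 240 };
    for (size_t i = 0; i < sizeof targets / sizeof targets[0]; i++)
        printf("%3d FPS -> %6.2f ms per frame\n",
               targets[i], 1000.0 / targets[i]);
    return 0;
}

So a 240 FPS claim means doing everything a 30 FPS console title does in one eighth of the time.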
 
I think NVIDIA will fail soon; their Tegra 4 is garbage, and they lost the Xbox 720 and PS4. I actually do not mind, because they're the ones who destroyed 3dfx in a disgusting way...
 
They're killing it in the mobile market with Tegra.

Tegra is a POS, thank you for playing.

The only breakthrough they made there came because of the Nexus 7 and all the OS/software optimizations and tweaks that shipped with it.


Well, I remember the rumored pricing of AMD's HD 8800 series cards, and the speculation that they would outperform any current-gen (pre-Titan) single GPU. And I'm pretty sure a pair of R8870s would demolish the GK110... for what? $560?
 
...or is just a retard.

Pretty much what I was pointing out. Even high-end PCs can't play games from six years ago at a solid 240 FPS (though they're getting pretty damned close).
 
You're all forgetting that games written for consoles are coded in assembly as much as possible. The one strength of a console is its uniformity: there's only one possible hardware configuration, so you don't go through drivers and unknowns; you code directly for well-known hardware, shedding some pretty significant overhead and inefficiency.

A well-executed APU with a shared GPU and CPU cache could be a rather potent tool in the hands of a skilled coder. An engine that never fetches GPU instructions and mesh data from RAM, using RAM only for the variables immediately needed for final rendering and looking in VRAM only for raw texture data, would execute an order of magnitude faster than a typical Direct3D title. Add a bit of driverless low-level access to all registers and shaders, and you have a machine that could very well render at a constant 120 FPS. 240 I'm not sure about, but I suppose it's doable...
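
To illustrate the memory-locality argument, here's a toy C sketch (my own illustration, not actual console or driver code): it sums the same ~64 MB array once sequentially, where the hardware prefetcher keeps data streaming through cache, and once with a cache-hostile stride, which forces a trip to RAM for nearly every access. The gap between the two timings is the kind of overhead a cache-aware engine avoids.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16M ints (~64 MB), far larger than any cache */
#define STRIDE 16     /* 64-byte jumps defeat spatial locality        */

int main(void)
{
    int *a = malloc((size_t)N * sizeof *a);
    if (!a) return 1;
    for (long i = 0; i < N; i++) a[i] = 1;

    /* Sequential pass: hardware prefetch hides memory latency. */
    clock_t t0 = clock();
    long seq_sum = 0;
    for (long i = 0; i < N; i++) seq_sum += a[i];
    double seq = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Strided pass over the same elements: mostly cache misses. */
    t0 = clock();
    long str_sum = 0;
    for (long s = 0; s < STRIDE; s++)
        for (long i = s; i < N; i += STRIDE) str_sum += a[i];
    double strided = (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("sequential: %.3fs  strided: %.3fs  (sums %ld / %ld)\n",
           seq, strided, seq_sum, str_sum);
    free(a);
    return 0;
}

Whether that adds up to a full order of magnitude in a real renderer is another question, but the direction of the effect is real.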
 
You're all forgetting that games written for consoles are coded in assembly as much as possible. The one strength of a console is its uniformity: there's only one possible hardware configuration, so you don't go through drivers and unknowns; you code directly for well-known hardware, shedding some pretty significant overhead and inefficiency.

A well-executed APU with a shared GPU and CPU cache could be a rather potent tool in the hands of a skilled coder. An engine that never fetches GPU instructions and mesh data from RAM, using RAM only for the variables immediately needed for final rendering and looking in VRAM only for raw texture data, would execute an order of magnitude faster than a typical Direct3D title. Add a bit of driverless low-level access to all registers and shaders, and you have a machine that could very well render at a constant 120 FPS. 240 I'm not sure about, but I suppose it's doable...

It doesn't matter how optimized the code is if the hardware isn't there to push it. APUs are damn nice, but they are still APUs, and they cannot hold a candle to a current mid-range dedicated GPU. You cannot substitute skill for horsepower.
 
An editorial! We need more of these please. :)

What gets me is the sleight of hand NVIDIA used to raise graphics card prices: pitching the mid-range chip of the next-generation Kepler architecture as a top-end product (the GTX 680) instead of something like a GTX 660, where it belongs, simply because it beat the GTX 580. In contrast, the previous-generation Fermi GPU in the GTX 580 was a true top-end chip.

Thus we've paid top dollar for a mid-range card, pushing the price of the true top-end GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards NVIDIA for doing this, you should.
 
An editorial! We need more of these please. :)

What gets me is the sleight of hand NVIDIA used to raise graphics card prices: pitching the mid-range chip of the next-generation Kepler architecture as a top-end product (the GTX 680) instead of something like a GTX 660, where it belongs, simply because it beat the GTX 580. In contrast, the previous-generation Fermi GPU in the GTX 580 was a true top-end chip.

Thus we've paid top dollar for a mid-range card, pushing the price of the true top-end GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards NVIDIA for doing this, you should.

Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price-gouge, given market demand. Well, welcome to the realities of the real world: market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1000 GPUs.
 
It doesn't matter how optimized the code is if the hardware isn't there to push it. APUs are damn nice, but they are still APUs, and they cannot hold a candle to a current mid-range dedicated GPU. You cannot substitute skill for horsepower.

Oh? Go tell that to the 1964 Morris Mini Cooper S :p
 
Oh? Go tell that to the 1964 Morris Mini Cooper S :p

We are not talking about crappy European economy-car racing. You could substitute the car with a pickle.
 
Consoles have the one thing that gamers want that desktops will never have: ease of use.

It takes a lot more time and effort to maintain a desktop, whereas a console is automatic. Fewer parts to fail, less to worry about. Flick the switch on and you are gaming.

Yes and no. If you mostly buy games on Steam, half of the job has already been done, so in the end all you need to do is update your graphics drivers now and then (www.amd.com and www.nvidia.com). Not exactly a complicated thing to do.

You also have to understand that while the PC may be more complicated, it's not a locked-down platform. I can play games that were designed for PCs 20 years ago on a modern system.

You can even patch them yourself. For example, Need for Speed 3, released in 1998: I've patched it myself, and you can play it on a brand-new 2013 PC pretty much without any hassle. You just put in the CD, run my patch (which copies the game files and updates them), and voila. Try doing that with a PS2 game on a PS3, or an Xbox game on a 360.

Developers don't give a toss, even though some of us would buy refreshed games (like we did with the Serious Sam HD series). But on PC you at least have community patches, like my NFS3 patch and hundreds of others. On consoles you can only stick a finger up your bottom, because developers don't care and there's no community.
 
So, was there any news as to whether it's going to be a dual-GPU card or a single-GPU one?
 
I think NVIDIA will fail soon; their Tegra 4 is garbage, and they lost the Xbox 720 and PS4. I actually do not mind, because they're the ones who destroyed 3dfx in a disgusting way...



Tegra is a POS, thank you for playing.

The only breakthrough they made there came because of the Nexus 7 and all the OS/software optimizations and tweaks that shipped with it.


Well, I remember the rumored pricing of AMD's HD 8800 series cards, and the speculation that they would outperform any current-gen (pre-Titan) single GPU. And I'm pretty sure a pair of R8870s would demolish the GK110... for what? $560?

Keep dreaming.
 