
Intel Ships New Tools that Boost Game Performance

I think integrated graphics are going to be the next phase in computers, so... hey, it's a start. And I know for a fact some integrated graphics can play some of the latest games out, because some of the latest games have optimizations that go back years. AKA L4D, HL2, etc.
 
The X1200 cannot handle Blu-ray; the Intel GMA4500 can :).

The GMA4500 cuts out half of the colors and frames in that very Blu-ray you speak of.


Oh, and the X1200 can handle 720p just fine; it just struggles with 1080p.
 
You know, if we took Intel's IGPs back in time about 12 years, people would think they were amazing!!! :roll:
 
You know, if we took Intel's IGPs back in time about 12 years, people would think they were amazing!!! :roll:

No, if you took it back 12 years you would have the same damn IGP.
 
this mere thought = fail
 
a "fail" for everyone!!
 
Bet this tool disables half the colors and cuts out every other frame, and that's how it gets "better" performance.
 
The GMA4500 cuts out half of the colors and frames in that very Blu-ray you speak of.


Oh, and the X1200 can handle 720p just fine; it just struggles with 1080p.

What about the HD 3300? Are we forgetting that it is one of the best, IF NOT THE BEST, IGPs ATM?
 
Just my two cents, but I'm gonna go out on a limb here and say this is a good thing. I mean, optimization is optimization no matter how you look at it. Sure, it's never gonna play Crysis with decent frames, but then again, even most people's desktops with decent graphics cards don't. And how many people who are not tech geeks do you know that play games, at least WoW, on their Intel laptops? Sure, the details are set to low and the frames are terrible... but when you're doing 10 fps anyway, that extra 3 or so frames you might get out of it are very important. I just don't see how this is a bad thing. Laughable, yes, but better is better no matter how small, I say.
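To put some rough numbers on that (just my own back-of-the-envelope Python, nothing to do with Intel's actual tool), the same handful of extra frames is worth a lot more at IGP frame rates than it would be on a real card:

```python
# Back-of-the-envelope: the same absolute fps gain is a much bigger deal
# at IGP-level frame rates than on a fast discrete card.
def relative_gain(base_fps, extra_fps):
    """Percentage improvement from picking up extra_fps on top of base_fps."""
    return 100.0 * extra_fps / base_fps

for base in (10, 30, 60):
    print(f"{base:>2} fps + 3 fps = {relative_gain(base, 3):.0f}% faster")

# 10 fps + 3 fps = 30% faster
# 30 fps + 3 fps = 10% faster
# 60 fps + 3 fps = 5% faster
```

So 3 extra frames at 10 fps is a 30% jump, which is exactly why it feels important even though the raw numbers are tiny.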
 
Whatever Intel is doing on the graphics front, I just hope they bring one hell of a performer in Larrabee, or else they'll never be taken seriously. Honestly, if they don't get it right with Larrabee, they might as well just give up entirely. Or acquire NVIDIA and be done with it.
 
Just my two cents, but I'm gonna go out on a limb here and say this is a good thing. I mean, optimization is optimization no matter how you look at it. Sure, it's never gonna play Crysis with decent frames, but then again, even most people's desktops with decent graphics cards don't. And how many people who are not tech geeks do you know that play games, at least WoW, on their Intel laptops? Sure, the details are set to low and the frames are terrible... but when you're doing 10 fps anyway, that extra 3 or so frames you might get out of it are very important. I just don't see how this is a bad thing. Laughable, yes, but better is better no matter how small, I say.

I have to agree with you. ANYTHING that can allow for possible improvement is a positive step forward.

Technology nowadays is not 12 years old, and I like to look forward in hope.
Don't get me wrong, guys, I'm one cold mofo and I don't take dodgy scams for improvement lightly, but with the influx of improved laptop GPUs, with CPUs becoming more powerful, smaller, and more efficient, and with integrated sound becoming just as good as the average Creative sound card or better, etc., I have to expect IGPs to do more, be better, and be more efficient.

Keep in mind that tools from both ATI & Nvidia for their GPUs came out long before those GPUs were worthy of any gaming.

Note: if the game isn't pumping over 100 frames a second, I don't bother playing it.
The only exception is Crysis. I played Crysis knowing it was poorly coded.
There won't ever be a decent onboard GPU in my books, because to me decent is 100 fps :)
 
@Haytch: doesn't the human eye only discern up to 60 fps? Can you really tell the difference between 60 and 100? (Not saying you can't, just asking if you can; maybe you have super fast eyes, or maybe I'm wrong. Google says all kinds of things about it, lol.)
 
@Haytch: doesn't the human eye only discern up to 60 fps? Can you really tell the difference between 60 and 100? (Not saying you can't, just asking if you can; maybe you have super fast eyes, or maybe I'm wrong. Google says all kinds of things about it, lol.)

I wish I knew, Papahyooie, I really wish I knew!

I've never really sat down and done my research on this, but I have debated it several times with associates, and I guess we always end up agreeing to disagree.

One thing is for certain: when you hide a single frame in a movie running at 50 frames a second, there are those among us who actually see it, and those of us who don't. I won't get into the whole conscious and unconscious activity of the brain and so forth, because I'm sure you understand what I'm talking about already.

Then we have those among us who must adapt to scenarios where so much happens in such a short amount of time that it becomes vital: for example, a fighter pilot, a soldier in the heart of battle, or a first-person-shooter player. I believe that under those and many more circumstances the individual is able to utilize a single frame amongst billions per millisecond and react differently than another individual would.

I personally think that the terran race has no idea what the going rate of frames per millisecond is in reality, but we do know what we can capture.
I can clearly see the difference between 1 fps and 5 fps, I can see the difference between 5 fps and 25 fps, I can see the difference between 25 fps and 50 fps, and I can see the difference between 50 fps and 100 fps.
Beyond that I don't think I can, but it's always nice to have the extra frames available so that I could MAYBE react accordingly.
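For what it's worth, the easiest way I know to put numbers on this is to flip fps into frame times (this is just arithmetic, not a claim about what the eye can actually resolve):

```python
# Convert frame rates into per-frame times to see how much each step up buys.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (25, 50, 60, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

print(f"60 vs 100 fps gap: {frame_time_ms(60) - frame_time_ms(100):.1f} ms per frame")

#  25 fps ->  40.0 ms per frame
#  50 fps ->  20.0 ms per frame
#  60 fps ->  16.7 ms per frame
# 100 fps ->  10.0 ms per frame
# 60 vs 100 fps gap: 6.7 ms per frame
```

The entire difference between 60 and 100 fps is about 6.7 ms per frame, which is exactly the sort of margin people end up agreeing to disagree over.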
 
Too bad the topic of the thread is about improving game performance, not media capabilities or feature set. :p Also keep in mind that the X1200 is much, much older, hence the lack of certain newer features. In the end, the point most of us are trying to make is that regardless of what is done (software tweaks, steroids, 1.21 jigawatts of electricity, or what have you), nothing will bring current Intel IGPs' gaming performance to the level of even previous-generation ATI or Nvidia offerings. Which is pretty sad, as this GPA software looks more like a marketing ploy than anything else. :banghead:


Amen :pimp:
 
Anyone try the Crysis benchmark? :roll:
 
Lmao, the tweaker probably does what we do to our machines already, excluding dropping the graphics level down to the bare minimum along with the resolution. I wouldn't doubt that a card from 2002 is stronger than the Intel Extreme Graphics of today.
 
Haytch, can you tell me how you see and need more than 100 fps when you're using an LCD with a max of 75 Hz? :rolleyes:

Because if you play with Vsync on, 75 is the max you will get, and that's only if you have actually moved the refresh rate up from the default 60 Hz...
And if you don't use Vsync, the tearing must be killing you, given that you are so capable of seeing so much...
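For anyone curious, here is a toy model of what strict double-buffered Vsync does (my own simplification; triple buffering and newer drivers behave differently): a finished frame has to wait for the next refresh, so the rate you actually see snaps down to the refresh rate divided by a whole number.

```python
import math

# Toy model of double-buffered Vsync: each rendered frame is held until the
# next screen refresh, so the displayed rate is refresh_hz / n for integer n.
def displayed_fps(render_fps, refresh_hz):
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

for fps in (45, 70, 90, 120):
    print(f"GPU renders {fps:>3} fps -> "
          f"{displayed_fps(fps, 60):.1f} shown @ 60 Hz, "
          f"{displayed_fps(fps, 75):.1f} shown @ 75 Hz")

# GPU renders  45 fps -> 30.0 shown @ 60 Hz, 37.5 shown @ 75 Hz
# GPU renders  70 fps -> 60.0 shown @ 60 Hz, 37.5 shown @ 75 Hz
# GPU renders  90 fps -> 60.0 shown @ 60 Hz, 75.0 shown @ 75 Hz
# GPU renders 120 fps -> 60.0 shown @ 60 Hz, 75.0 shown @ 75 Hz
```

In other words, with Vsync on at 60 or 75 Hz there is simply no way to be seeing 100 fps on that panel.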
 
Maybe I will be able to play games on my netbook better!

...or not, because there is no way Intel is going to make their shit IGPs any better. If they do, I will brand my nuts on YouTube.
 
Maybe I will be able to play games on my netbook better!

...or not, because there is no way Intel is going to make their shit IGPs any better. If they do, I will brand my nuts on YouTube.
:laugh:

Deal: you brand your nuts, we all get a good laugh. I would say it's a win-win; you don't have to go through the pain of kids (being that you don't have any) and we laugh. Better start preparing :roll:
 
:laugh:

Deal: you brand your nuts, we all get a good laugh. I would say it's a win-win; you don't have to go through the pain of kids (being that you don't have any) and we laugh. Better start preparing :roll:

+1
I could use a good lol, so I hope Intel does what they say.
 
I almost fell off my chair, I was laughing so hard at the notion of Intel trying to claim you can game on their IGPs.
 