Wednesday, February 20th 2019

Intel Invites Gamers for a Graphics Odyssey Spanning Multiple Continents

Intel is working to build up hype and awareness around its discrete graphics efforts, of which we've yet to see much besides Intel's continuous acquisition of AMD-based talent for that particular endeavor. It's relatively hard to build up enthusiasm for anything other than the fact that a third competitor is entering the high-performance graphics card space; the rest is mostly rumors, speculation, and declared intentions.
Intel seems to be starting an Odyssey of its own with multiple events spanning the globe, aimed at gatherings of gamers who can give Intel feedback on its graphics pursuit. The Odyssey is "built around a passionate community, focused on improving graphics and visual computing for everyone, from gamers to content creators." Access to the events will be granted, after a sign-up form, by way of an Intel VIP Pass, which will give gamers "killer deals and freebies, preferred beta access, the latest gaming news and more." If you're interested in being on the inside of some Intel events that might grant you access to information on Intel's upcoming Xe graphics products, as well as to giveaways and freebies, follow the source link for both the press info and the sign-up form.
Sources: Intel gameplay, images via Cristiano Siqueira's Twitter, Unofficial Concept Renders

53 Comments on Intel Invites Gamers for a Graphics Odyssey Spanning Multiple Continents

#26
cucker tarlson
ArbitraryAffectionEnjoy your 6 threads and RTX, oh and DLSS. I'm sure it's worth it.
Well, I have neither, but since you mentioned it, enjoy your 2700X that costs 1.6x as much as an 8400 but can't prove itself faster in gaming. Great value that Intel can't come close to. :roll:
ArbitraryAffectionSure thing hon, enjoy paying more and getting less.
lol, ironyyyy
I get it, more threads per dollar is how you determine that. That was the whole thing that made the FX-8 superior to i3s, after all.
Why don't you snag a Radeon VII while you're at it? It's got 16GB of VRAM, so you're getting more for less.
#27
cucker tarlson
ArbitraryAffection8400 is going to hitch and stutter in newer games because 6 threads isn't going to last long at all. In fact anyone that bought 8400 over 2600 is an idiot.
:laugh:
lol, I honestly expected better of you. I am quite surprised.
#28
Xaled
The main point that many ignore is that there is no need for the new card to beat the 2080 Ti, nor even the 2080. A 2070 equivalent at $250 would shake the whole world :)
#29
notb
XaledThe main point that many ignore is that there is no need for the new card to beat the 2080 Ti, nor even the 2080. A 2070 equivalent at $250 would shake the whole world :)
Of course. Intel is doing all this research and investing billions just to sell the final product for much less than it could.
You should teach business. :-)
#30
Xaled
notbOf course. Intel is doing all this research and investing billions just to sell the final product for much less than it could.
You should teach business. :-)
Offering what cost $500-600 three years ago for $250 today is not a big thing at all.
#31
ghazi
Ah yes. More Raja hype-man marketing BS with flashy posters and logos, yet no product to show.
#32
Berfs1
Object55Well at least they should get props for one thing. Power connector location.
It’s a render made by a third party, not by Intel.
#33
Casecutter
I find it odd that they "poach" AMD's old RTG personnel; do they really get the supposedly cutting-edge engineering folks? At least if we listen to those who believe the other GPU is far superior. Intel has the cash to poach anyone, but the mindshare group can't wrap their heads around why there are no big Nvidia names. I'm sure Intel just wants to pay, or has the cash, for only a "second string" of GPU engineers. Well, at least Raja Koduri was a big-name "second string"; AMD didn't lose much in the defection.

The other question is whether these Intel GPUs will be anywhere near great if Intel's own fabs aren't looking toward the next shrink any time soon; it will be hard to compete with 7nm. While we are hearing Intel's 7nm is slated to be on track for its "introduction in accordance with its original schedule", some are saying that's now late 2020 at minimum.
#34
moproblems99
CasecutterI find it odd that they "poach" AMD's old RTG personnel; do they really get the supposedly cutting-edge engineering folks? At least if we listen to those who believe the other GPU is far superior. Intel has the cash to poach anyone, but the mindshare group can't wrap their heads around why there are no big Nvidia names. I'm sure Intel just wants to pay, or has the cash, for only a "second string" of GPU engineers. Well, at least Raja Koduri was a big-name "second string"; AMD didn't lose much in the defection.

The other question is whether these Intel GPUs will be anywhere near great if Intel's own fabs aren't looking toward the next shrink any time soon; it will be hard to compete with 7nm. While we are hearing Intel's 7nm is slated to be on track for its "introduction in accordance with its original schedule", some are saying that's now late 2020 at minimum.
It could be that no one is reporting NV defectors. I agree, though, that you would think Intel would be poaching engineers from the top company. That is what leads me to believe that Intel is specifically building a compute-oriented card, as those are the engineers they are taking.

Also, look at the performance difference between Vega II and the Turing 2080 Ti. Node doesn't really matter (broadly speaking), architecture does.
#35
Vayra86
lynx29Agreed, if their product can't compete with 2080 ti at $799 price point, I'll pass, I trust Nvidia drivers more for older titles.

Honestly, since release won't even be late 2020, if it does not surpass 2080 ti performance, I am not even going to blink an eye. I don't have time to wait around anymore. Most likely AMD and Intel will be a laughing stock, I wish it were not so... but I demand ultimate performance on my last GPU buy (I firmly believe silicon is dying and plan to make 2080 ti or 3080 ti my last purchase) hopefully I can hold out long enough for 3080 ti, I think I can ^^
Silicon dying? Hell no. They will stretch that as long as possible, and we're just in the early days of 7nm. Unless you want that 2080 Ti equivalent to last over a decade, I think you should adjust some expectations here. These things don't move fast, despite what everyone wants. Even just the trickle-down of 7nm to consumer graphics is going to take at least 1~2 years; they are only just ramping up EUV 7nm production, which is what makes it cost effective. And given the state of development on alternatives, the next step will certainly also be made, and even 7nm itself can be vastly improved. Look at 14nm and how long Intel has dragged that one out, while still gaining performance every gen.

There are more ways to expand silicon performance without node shrinks. Die size can be increased, and even the whole package can go to new standards in size. You can rest assured that EVEN if we see the 'last node', they will be using it for many, many years to compensate for the initial expense. Even if that were supposed to be 7nm, you'd still be looking at a decade. Just imagine the expenses involved in internally converting fabs to a new material: machines worth billions of dollars need to be replaced/phased out/adjusted for ALL fabs.
#36
notb
CasecutterI find it odd that they "poach" AMD's old RTG personnel; do they really get the supposedly cutting-edge engineering folks? At least if we listen to those who believe the other GPU is far superior. Intel has the cash to poach anyone, but the mindshare group can't wrap their heads around why there are no big Nvidia names. I'm sure Intel just wants to pay, or has the cash, for only a "second string" of GPU engineers. Well, at least Raja Koduri was a big-name "second string"; AMD didn't lose much in the defection.
Both companies are large, and both employ dozens of top-tier GPU engineers and designers, but we only know the key people by name.
So maybe Intel hired Koduri from AMD, but his whole team was based on ex-Nvidia people? We don't know that.

It's just that AMD is more of a cult brand, so people actually know top managers/designers by name and follow them => we have a lot of HR news.

I'm not surprised by key AMD people jumping ship. Many of them remember the golden days of ATI. They tasted success. They may have simply become fed up with a decade in Nvidia's shadow and equally bleak prospects ahead.

At Nvidia you're making GPUs that dominate almost every segment. You're getting into exotic tech like tensor cores. You're involved in scientific projects and conferences. And it's a rich company, so salaries and working comfort must be excellent.

At AMD you're polishing a decade-old technology to get more fps in games. And every time you open a Wall Street newspaper, someone analyzes who will buy you or how long you'll survive if Sony switches to Nvidia for the next-gen PS. I don't think people at AMD need a lot of persuading...
#38
Totally
I'll take their efforts seriously when I see results and when they start knocking the ugly off that card, starting with those LEDs.
cucker tarlsonNo point in arguing obvious delusions.

You mean in your alternative reality? :laugh:
Since when is quoting local prices delusional?

8600K $292 USD
2700X $310 USD

Where can I get a CPU cooler and an SSD for $18?
#39
cucker tarlson
TotallyI'll take their efforts seriously when I see results and when they start knocking the ugly off that card, starting with those LEDs.



Since when is quoting local prices delusional?

8600K $292 USD
2700X $310 USD

Where can I get a CPU cooler and an SSD for $18?
lol, you just quoted your local prices.
but I get it, if it isn't happening in America, it isn't happening at all.

plus you're $30 off; the 8600K is $260
pcpartpicker.com/product/Mr2rxr/intel-core-i5-8600k-36ghz-6-core-processor-bx80684i58600k

now you've got a $50 difference; that's enough to step up your GPU from a 1070 to a 2060 for 20% more performance.
#40
Dutch_Goat
They definitely did a good job with the RTX 2070, because it's at a pretty good price. They could just make the 2080 Ti a bit cheaper, though.
#41
Space Lynx
Astronaut
Dutch_GoatThey definitely did a good job with the RTX 2070, because it's at a pretty good price. They could just make the 2080 Ti a bit cheaper, though.
The 2060 overclocked almost matches a 2070, and it's only $350. IMO the 2060 is the best value and bang for buck; the 2070 is a little overrated. A 2080 for $599 would be an OK price, and the 2080 Ti is just overrated for its price point.
#42
Xaled
Exaggeratedly Overpriced, you mean?
#43
goodeedidid
lynx29Agreed, if their product can't compete with 2080 ti at $799 price point, I'll pass, I trust Nvidia drivers more for older titles.

Honestly, since release won't even be late 2020, if it does not surpass 2080 ti performance, I am not even going to blink an eye. I don't have time to wait around anymore. Most likely AMD and Intel will be a laughing stock, I wish it were not so... but I demand ultimate performance on my last GPU buy (I firmly believe silicon is dying and plan to make 2080 ti or 3080 ti my last purchase) hopefully I can hold out long enough for 3080 ti, I think I can ^^
Calm down guy, you're buying a computer part, not a weapon.
#44
Space Lynx
Astronaut
goodeedididCalm down guy, you're buying a computer part, not a weapon.
ok, thanks guy.
#45
Totally
cucker tarlsonlol, you just quoted your local prices.
but I get it, if it isn't happening in America, it isn't happening at all.

plus you're $30 off; the 8600K is $260
pcpartpicker.com/product/Mr2rxr/intel-core-i5-8600k-36ghz-6-core-processor-bx80684i58600k

now you've got a $50 difference; that's enough to step up your GPU from a 1070 to a 2060 for 20% more performance.
Let's go over a few things, because you are clearly off your rocker: the previous commenter said the 8600K costs as much as a 2700X, and then you interjected, implying that he was delusional because, by your narrow perspective, the 2700X is 25% more expensive. That's why I quoted the prices, thinking you'd get the hint, but it clearly went over your head. Sorry, if anyone is being ignorant, it is you.
#46
xenocide
I'm interested to see where this thing lands. Intel has the resources to pull this off, but I'm apprehensive.
#47
Fluffmeister
Intel have an opportunity to grab a decent foothold for sure; AMD are currently stuck shrinking Polaris and Vega, and both still fall well short, frankly.
#48
crispysilicon
I've been waiting for Intel to make a move since the Knights series. I hate how NV has built these things post-Fermi/Kepler, and AMD is lagging too hard in power efficiency.

I'm registered for the event in San Francisco, so we'll see. :)
#49
notb
crispysiliconI've been waiting for Intel to make a move since the Knights series. I hate how NV has built these things post-Fermi/Kepler, and AMD is lagging too hard in power efficiency.

I'm registered for the event in San Francisco, so we'll see. :)
Well, I quite like how Nvidia has been making cards of late (and I'm thrilled by the latest features).
But Intel getting into the mix is certainly interesting.
A computation-oriented card would cement their importance in datacenters and science. And, unlike AMD, they have the potential to challenge the mighty CUDA.
As far as gaming goes, I don't think they can get close to Nvidia at the moment. I expect something similar to Vega, just more focused and better positioned.

I'm looking forward to what this means to their IGP.
The tiny HD IGP is perfect for just providing the video signal (and also for its hardware encoding abilities).
But Intel just can't ignore how much software uses GPGPU acceleration now. Not to mention that even really small, ultrabook-friendly chips like the MX150 are starting to gain some traction in gaming.

And yeah, since there's going to be an event in Poland, I'm certainly going.
#50
Casecutter
notbfed up with a decade in Nvidia's shadow
Most would say the Radeon RV790 TeraScale architecture (April 2009) overshadowed the GeForce 200 series, and then even more so Fermi (April 2010). GCN 1.0 Tahiti (January 2012) and then Kepler (March 2012) were very competitive, up until Maxwell's arrival on Sep 19th, 2014.
notbAt AMD you're polishing a decade-old technology
Tahiti was 2012, and it wasn't until Hawaii (October 2013) went up against Maxwell, which came out a year later (Sept 2014), that Nvidia moved out front. AMD wasn't truly eclipsed (and competition waned in the high-end/enthusiast segment) until the arrival of Pascal in May of 2016. That is about the time AMD/RTG said they wouldn't be pursuing the enthusiast gaming segment, focusing resources on the professional deep learning/HPC markets.

Get your history straight... noob