# AMD Doesn't Trust its Own Processors - Project Quantum Driven by Intel Core i7-4790K



## btarunr (Jun 22, 2015)

Project Quantum, one of the three unexpected "Fiji"-based products AMD announced at its E3 event, is the company's quest to design a 4K-worthy SFF gaming PC running two "Fiji" GPUs in CrossFire. The press assumed the rest of the system would be AMD-based: AMD-branded (albeit Patriot Memory-manufactured) memory, an AMD-branded (albeit OCZ-manufactured) SSD, and, importantly, an AMD-made CPU or APU. Given its liquid cooling, the prospect of a 95W "Godavari," or even the upcoming "Carrizo" APU, didn't seem far-fetched. Even a 95W FX CPU could have been deployed, since AM3+ on mini-ITX is not impossible.

When taken apart, Project Quantum was shown to be running an Intel Core i7-4790K "Devil's Canyon" CPU, on an ASRock-made mini-ITX motherboard with its non-essential components removed. The i7-4790K sits alongside a pair of half-height Crucial Ballistix memory modules, which is excusable, since there are no half-height AMD Radeon memory modules yet. The SSD is AMD-branded. The unit features a unified liquid-cooling solution custom-made for AMD by Asetek. A large (200 mm?) radiator with a single fan cools the CPU, the PCH, and the two "Fiji" GPUs.



 

 



*View at TechPowerUp Main Site*


----------



## Mutagen240 (Jun 22, 2015)

Very misleading title. AMD clearly knows that Intel was the only way to go. As stated, there are no mini-ITX motherboards for FX CPUs, and APUs would be a huge bottleneck, so Intel was the only choice. It's not that they don't trust their CPUs; it's that they're not stupid. Please change the title.

Yours sincerely, AMD fanboy


----------



## btarunr (Jun 22, 2015)

Mutagen240 said:


> Very misleading title. AMD clearly knows that Intel was the only way to go. As stated, there are no mini-ITX motherboards for FX CPUs, and APUs would be a huge bottleneck, so Intel was the only choice. It's not that they don't trust their CPUs; it's that they're not stupid. Please change the title.
> 
> Yours sincerely, AMD fanboy



Changed the title to "Processors," a term AMD uses for both CPUs and APUs; and mentioned that AMD COULD HAVE USED FX, because it turns out that you CAN design mini-ITX motherboards using the 2-chip chipset after all. One example runs 890GX+SB850. AMD 970+SB950 has a lower TDP than 890GX due to its lack of integrated graphics, so you can build ITX boards with AMD 970 and AM3+. You'd just have to use SO-DIMMs instead of standard DIMMs, which shouldn't be a problem; MSI's 4K Gaming notebook uses SO-DIMMs.


----------



## MikeMurphy (Jun 22, 2015)

Performance, heat and power consumption needed to be considered.  It has nothing to do with trust.


----------



## btarunr (Jun 22, 2015)

MikeMurphy said:


> Performance, heat and power consumption needed to be considered.  It has nothing to do with trust.



Then why stop at CPUs? AMD should have used two NVIDIA GPUs.


----------



## RejZoR (Jun 22, 2015)

Is it a prototype? YES
Can you buy Quantum? NO

Where is the problem? You'll likely see Quantum become available next year, when they'll have Zen and when the R9 Fury Maxx is released; the card will most likely come some time in the fall or even winter 2015, while Zen comes in 2016. But for the prototype they were forced to use Intel. You wouldn't believe what companies use to build prototypes versus what actually ends up in the same products by the time customers can buy them...


----------



## ZoneDymo (Jun 22, 2015)

Kind of odd to use a term like "trust" for this.
It's not like their own processors are known to spontaneously combust or something.

AMD just knows the competition has more high-end stuff out, a level at which they themselves don't compete at the moment.
So for a so-called high-end machine, in this case you turn to the competition.

Here's hoping AMD's future CPUs will wow us and perhaps put them back in the high-end category, and perhaps power future Project Quantum setups.


----------



## ZoneDymo (Jun 22, 2015)

btarunr said:


> Then AMD should have used two NVIDIA GTX 980s (good enough for 4K).



Because those are better than two Fijis?
That's some insider info there, man; watch out for that NDA


----------



## Xaled (Jun 22, 2015)

-When AMD uses its own CPU, people say:
"Too sad for such hardware to get bottlenecked by such a CPU, just because AMD wants to use its own hardware."

-When AMD uses an Intel CPU, they say:
"AMD doesn't trust its hardware..."


----------



## GreiverBlade (Jun 22, 2015)

What if it's a joint venture? And btw... 4 days too late... 

"omigodz, AMD is using an Intel CPU in an AMD-branded mITX PC"
Am I the only one who just says: "so what?"

AMD is also a GPU manufacturer, and... for that: nice attempt 


btarunr said:


> Then why stop at CPUs? AMD should have used two NVIDIA GPUs.


Nope, Fury X is better than what they offer.

Intel is not tied to NVIDIA, luckily (nor are AMD GPUs to AMD CPUs).

Also, since the next AMD CPU line is not out yet, and the APUs are not meant for that use: no big deal about using an Intel CPU for that demo, right...

Don't make them say what they didn't say.


Xaled said:


> -When AMD uses its own CPU, people say:
> "Too sad for such hardware to get bottlenecked by such a CPU, just because AMD wants to use its own hardware."
> 
> -When AMD uses an Intel CPU, they say:
> "AMD doesn't trust its hardware..."


(that post is good) ahah


----------



## btarunr (Jun 22, 2015)

Xaled said:


> -When AMD uses its own CPU, people say:
> "Too sad for such hardware to get bottlenecked by such a CPU, just because AMD wants to use its own hardware."
> 
> -When AMD uses an Intel CPU, they say:
> "AMD doesn't trust its hardware..."



When people use AMD FX CPUs in their own builds, they say "this CPU doesn't bottleneck any GPU on the planet"


----------



## ZoneDymo (Jun 22, 2015)

btarunr said:


> When people use AMD FX CPUs in their own builds, they say "this CPU doesn't bottleneck any GPU on the planet"



Yeah... his comment was aimed at the "anti-AMD" crowd, and how they change position to suit whichever scenario makes AMD or its products look bad.

Sorta like how the right-wingers call Obama both a dictator and a "mom-jeans-wearing weakling who should be more like Putin," whatever suits them best at the moment to use against Obama.


----------



## bencrutz (Jun 22, 2015)

btarunr said:


> When people use AMD FX CPUs in their own builds, they say "this CPU doesn't bottleneck any GPU on the planet"



ever occurred to you that thermals may be the consideration instead of the bottleneck?


----------



## GreiverBlade (Jun 22, 2015)

Also, it might be for PCIe 3.0... didn't think about that one??? pfah! Then yes... it's a CFX in an x8/x8 configuration IIRC, so... what are the only PCIe 3.0-compatible CPUs from AMD? YEP, the APUs Kaveri and Godavari.
So then: logical choice (even for a demo).

Even if a dual x8 2.0 is not a bottleneck atm (for current cards... for the kind of cards used in this: who knows)



bencrutz said:


> ever occurred to you that thermals may be the consideration instead of the bottleneck?


Yeah, right... an AMD CPU: 125W, an Intel CPU: 88W... HUUUGE difference, "OH MY GOD" effect inserted... (well, the one in the box seems to be OC'd, so it's closer to 130W than 88W)


Tho that setup seems to be good news for me... I can put a CFX Fiji in my rig and keep only a single CPU/GPU loop on the two 240mm rads... bhahahaah (just need to switch the DC12-220 to a 260/400, maybe...)


----------



## Digital Dreams (Jun 22, 2015)

Shitty thread title IMO. As "editor and senior moderator" I expect better from you. This has nothing to do with trust at all. Also people calling you out on your shit hardly makes them fanboys.


----------



## bobbavet (Jun 22, 2015)

Dual Zen. Why do you think it's called Quantum?


----------



## bencrutz (Jun 22, 2015)

GreiverBlade said:


> Yeah, right... an AMD CPU: 125W, an Intel CPU: 88W... HUUUGE difference, "OH MY GOD" effect inserted... (well, the one in the box seems to be OC'd, so it's closer to 130W than 88W)



OK, so 42.5% more is not huge, eh? 

Anyway, it's more of a proof of concept methinks, so it wouldn't matter what processor they use
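For what it's worth, the percentage quoted above roughly checks out. A quick sketch, using only the stock TDP figures already mentioned in this thread (125 W for the FX, 88 W for the i7-4790K); the 130 W overclocked estimate is the poster's own guess and is left out:

```python
# Relative TDP difference between the stock figures quoted in-thread:
# a 125 W FX-series CPU versus the 88 W i7-4790K.
fx_tdp_w = 125
i7_tdp_w = 88

increase_pct = (fx_tdp_w - i7_tdp_w) / i7_tdp_w * 100
print(f"FX TDP is {increase_pct:.1f}% higher than the i7's")  # ~42.0%
```

Whether a roughly 42% gap matters inside a sealed SFF chassis is, of course, exactly what the thread is arguing about.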


----------



## ensabrenoir (Jun 22, 2015)

...poking at anything AMD before a major launch is to court _*Fanboy Fury!!!   * _But seriously guys... the behind-the-scenes conversation at AMD had to be priceless. Imagine if Ford had to use a Dodge Hemi in a truck prototype... or if all of Intel's office and factory computers were AMD-powered... gotta learn to laugh at it all


----------



## R-T-B (Jun 22, 2015)

RejZoR said:


> Is it a prototype? YES
> Can you buy Quantum? NO
> 
> Where is the problem? You'll likely see Quantum become available next year, when they'll have Zen and when the R9 Fury Maxx is released; the card will most likely come some time in the fall or even winter 2015, while Zen comes in 2016. But for the prototype they were forced to use Intel. You wouldn't believe what companies use to build prototypes versus what actually ends up in the same products by the time customers can buy them...




Good, RejZoR, you've learned not to declare things before a product ships... they may even ship with a Zen next year, for all we know. It's a prototype.

Now if only btarunr would learn the same lesson, but I'll let him slide, because journalism and headlines IS what he does, after all, lol. (Just messing with you a bit, man.)


----------



## mroofie (Jun 22, 2015)

Digital Dreams said:


> Shitty thread title IMO. As "editor and senior moderator" I expect better from you. This has nothing to do with trust at all. Also people calling you out on your shit hardly makes them fanboys.


doesn't change the fact that quantum is using intel


----------



## R-T-B (Jun 22, 2015)

ensabrenoir said:


> ...poking at anything AMD before a major launch is to court _*Fanboy Fury!!!   * _But seriously guys... the behind-the-scenes conversation at AMD had to be priceless. Imagine if Ford had to use a Dodge Hemi in a truck prototype... gotta love it!!!



I'd say in modern gaming, the CPU is less the engine than the GPU.  The CPU is more like a secondary-but-still-important piece.  Maybe transmission?


----------



## Xaled (Jun 22, 2015)

btarunr said:


> When people use AMD FX CPUs in their own builds, they say "this CPU doesn't bottleneck any GPU on the planet"


Yeah, the FX-6300 is the best CPU on the planet 

They may offer an excellent performance/price ratio, but we do know that AMD has lost a big share of the laptop market just because they insist on using their own CPUs; the 7970 mobility was an excellent GPU, but it was killed by AMD..


----------



## ZoneDymo (Jun 22, 2015)

ensabrenoir said:


> ...poking at anything AMD before a major launch is to court _*Fanboy Fury!!!   * _But seriously guys... the behind-the-scenes conversation at AMD had to be priceless. Imagine if Ford had to use a Dodge Hemi in a truck prototype... or if all of Intel's office and factory computers were AMD-powered... gotta learn to laugh at it all



Many a performance Ford has its engine made by zeh Germans.


----------



## btarunr (Jun 22, 2015)

bencrutz said:


> ever occurred to you that thermals may be the consideration instead of the bottleneck?



FX-8370E has just 7W higher TDP than an i7-4790K. And this is a chip which AMD is recommending on its website, for "enthusiasts."


----------



## rooivalk (Jun 22, 2015)

Digital Dreams said:


> Shitty thread title IMO. As "editor and senior moderator" I expect better from you. This has nothing to do with trust at all. Also people calling you out on your shit hardly makes them fanboys.


If people get agitated over a minuscule detail like a thread title, I think it's pretty suitable to call them fanboys.

Besides, CPUs are one of AMD's core businesses. If they're not using their own products in their own AMD-branded project and, worse, give way to their SOLE competitor, it's fair to say <insert any brand here> doesn't trust its own product.

There are plenty of examples of this. For example: the French president uses a Citroën/Peugeot as the official state car, even though it's just a compact or mid-size car, not the full-size (and probably better-armored) sedan commonly used by presidents in the rest of the world. The French Air Force also bought its own Rafale before exporting it (although they didn't really want to buy it), long before the Rafale hit the jackpot with the Indian Air Force.

It's logical for an end user to choose Intel over an AMD CPU, but not for AMD themselves. It simply sets a bad precedent.


----------



## GreiverBlade (Jun 22, 2015)

bencrutz said:


> OK, so 42.5% more is not huge, eh?
> 
> Anyway, it's more of a proof of concept methinks, so it wouldn't matter what processor they use


Well... nope... the CPU should mostly be around 130W, so the difference is not so big... tho it means an AMD CPU would have no OC headroom, which kinda makes the thermal-design point true 

For the second line, yes, pretty much: it doesn't matter.


----------



## john_ (Jun 22, 2015)

Eteknix used that title because they were upset with AMD after AMD informed them that they would not be on the list of those who will receive a Fury X sample.

So, what is the excuse for TechPowerUp?

You don't get a free Fury X sample?
The author has a personal vendetta against the company?
The site tries to make its sponsors happy?
Click bait?
Combination of the above or something else?


----------



## bogami (Jun 22, 2015)

A good combination really isn't about the manufacturer alone. The performance of AMD processors in recent years just stinks. Zen is coming, and it's intended to close that gap, but that's in 2016.
I respect plans like this, as I often work through them in my own head. You could mount a dual-slot PCI-E arrangement and get x8/x8, which would even allow a 4-GPU setup. The size of the radiator is minimal. A 2x Fury card is announced for this configuration, if I read correctly, and that will satisfy every 4K game at 60 FPS+. It won't be cheap, though, if the only drives you get here are SSDs. I hope only copper will be used. The pump is good, with 10 mm tubing openings, so there will be good flow to carry the heat away from the processors.
4 cores, 8 threads, 2 drives bound in RAID, and the dual-GPU Fury X would make a very strong mini system everyone would want at home.


----------



## esrever (Jun 22, 2015)

The only real reason AMD chose Intel is that there isn't any high-end AMD CPU/mobo that they can use in a mini-ITX form factor. It's also the reason they didn't use Intel's extreme CPUs.


----------



## Tetsudo77 (Jun 22, 2015)

Click-bait title from a Senior Editor. Didn't see the same reaction from this guy with the 970 debacle. I see NV has its ways, with an abundance of budget. Might as well blame AMD for that.


----------



## btarunr (Jun 22, 2015)

john_ said:


> Eteknix used that title because they were upset with AMD after AMD informed them that they would not be on the list of those who will receive a Fury X sample.
> 
> So, what is the excuse for TechPowerUp?



The excuse is truth, even if it hurts your feelings.



john_ said:


> You don't get a free Fury X sample?



We already have one.



john_ said:


> The author has a personal vendetta against the company?



Bought K6, Athlon XP 3200+, FX-57, HD 6870, FX-8350, and R9 290 with my own money, used each for an average of 18 months.



john_ said:


> The site tries to make its sponsors happy?



No graphics card vendor has a deal with us.



john_ said:


> Click bait?



TPU is too big/old for that.


----------



## the54thvoid (Jun 22, 2015)

john_ said:


> Eteknix used that title because they were upset with AMD after AMD informed them that they would not be on the list of those who will receive a Fury X sample.
> 
> So, what is the excuse for TechPowerUp?
> 
> ...



Do you not ask yourself why a company that sells CPUs and GPUs doesn't put its own product into a new design concept? It didn't need to be an i7 for PR, so why do it?
The speed to criticise Btarunr evidences a lot of egos being dented, nothing more.  
It IS very odd for AMD (you know, they make, umm, AMD processors) to NOT pop in an AMD CPU. Unless their own CPUs don't have the power to drive two of the world's most advanced gfx cards.

Stop criticising TPU and look at AMD instead.


----------



## Lionheart (Jun 22, 2015)

Meh, doesn't bother me, a mini itx 2011 board with a 5820K would be kick ass with this.


----------



## ZetZet (Jun 22, 2015)

btarunr said:


> FX-8370E has just 7W higher TDP than an i7-4790K. And this is a chip which AMD is recommending on its website, for "enthusiasts."


At stock speeds. But it doesn't even compete with an i3 at stock speeds.


----------



## Tetsudo77 (Jun 22, 2015)

the54thvoid said:


> Do you not ask yourself why a company that sells CPUs and GPUs doesn't put its own product into a new design concept? It didn't need to be an i7 for PR, so why do it?
> The speed to criticise Btarunr evidences a lot of egos being dented, nothing more.
> It IS very odd for AMD (you know, they make, umm, AMD processors) to NOT pop in an AMD CPU. Unless their own CPUs don't have the power to drive two of the world's most advanced gfx cards.
> 
> Stop criticising TPU and look at AMD instead.


The equivalent of "stop criticizing clickbait practices and instead criticize AMD's newborn hardware, which has been around for 5 days."


----------



## btarunr (Jun 22, 2015)

bencrutz said:


> ever occurred to you that thermals may be the consideration instead of the bottleneck?





ZetZet said:


> At stock speeds. But it doesn't even compete with an i3 at stock speeds.



How many times will you people shift goalposts and throw up strawman arguments, before you realise the actual point of this article? Which is:

- AMD is not using an AMD processor in a PC bearing its brand name
- If AMD doesn't use the product-line it's most known for - processors - in its gaming PCs, then it's AMD telling you "don't use our CPUs, we don't use them ourselves."
- AMD is currently selling processors WHICH IT THINKS ARE COMPETITIVE WITH INTEL
- It could have used its own processors, even if it's a lowly Godavari+A88X ITX, if this was just meant to be a proof-of-concept. It would have shown that AMD trusts its own creations
- If Intel-ITX is indeed bound to be final-spec, then AMD should instead have used an FX-8370E + AMD 970 ITX motherboard (which is doable, and AMD designs motherboards itself)


----------



## Assimilator (Jun 22, 2015)

btarunr said:


> How many times will you people shift goalposts and throw up strawman arguments, before you realise the actual point of this article? Which is:
> 
> AMD is not using an AMD processor in a PC bearing its brand name
> If AMD doesn't use the product-line it's most known for - processors - in its gaming PCs, then it's AMD telling you "don't use our CPUs, we don't use them ourselves."
> ...


FTFY.

Also the AMD fanboy butthurt in this thread is hilarious. QQ more children, your tears are so delicious.


----------



## jigar2speed (Jun 22, 2015)

btarunr said:


> How many times will you people shift goalposts and throw up strawman arguments, before you realise the actual point of this article? Which is:
> 
> AMD is not using an AMD processor in a PC bearing its brand name
> If AMD doesn't use the product-line it's most known for - processors - in its gaming PCs, then it's AMD telling you "don't use our CPUs, we don't use them ourselves."
> ...



A quick query - does AMD have a mini-ITX mobo that supports 2 Fiji graphics cards? Or does Quantum have 2x Fiji on a single PCB?


----------



## john_ (Jun 22, 2015)

btarunr said:


> The excuse is truth, even if it hurts your feelings.


Hurt my feelings? You serious? 


> We already have one.


 Great! 



> Bought K6, Athlon XP 3200+, FX-57, HD 6870, FX-8350, and R9 290 with my own money, used each for an average of 18 months.


 So many bad memories 



> No graphics card vendor has a deal with us.


 Yes but the article is about the CPU, right? 



> TPU is too big/old for that.


 Which makes it strange, doesn't it?


----------



## R-T-B (Jun 22, 2015)

FWIW, this is the most unbiased tech site I know of.


----------



## btarunr (Jun 22, 2015)

jigar2speed said:


> A quick query - does AMD have a mini-ITX mobo that supports 2 Fiji graphics cards? Or does Quantum have 2x Fiji on a single PCB?



Quantum uses an ITX board with a single x16 slot, and a dual-GPU video card split between two PCBs, with a sandwiched cooling solution (à la GeForce 9800 GX2).


----------



## TRWOV (Jun 22, 2015)

Why not just leave the title as "AMD's Project Quantum Driven by Intel Core i7-4790K"? 

TPU isn't in charge of creating drama, TPUers are


----------



## john_ (Jun 22, 2015)

the54thvoid said:


> Do you not ask yourself why a company that sells CPUs and GPUs doesn't put its own product into a new design concept? It didn't need to be an i7 for PR, so why do it?
> The speed to criticise Btarunr evidences a lot of egos being dented, nothing more.
> It IS very odd for AMD (you know, they make, umm, AMD processors) to NOT pop in an AMD CPU. Unless their own CPUs don't have the power to drive two of the world's most advanced gfx cards.
> 
> Stop criticising TPU and look at AMD instead.


AMD is a company that a few years ago came out in public and said that it can't compete in the high-end CPU market. Not to mention that they have completely abandoned the server market. Should I point at the price of the top AMD model being lower than even an i5's price?
Did we just reinvent the wheel here? 

Everyone knows that for a card like the dual Fiji, you need a high-end Intel CPU. Project Quantum would not sell with an FX in it, and an FX in it would have been a reason for criticism again. Can we really rule out the possibility of an OEM selling Project Quantum when the dual-Fiji card comes out? Many months before Zen? I think not. And considering AMD is a company with no money in its pockets, it can't create a prototype that it can't sell, or two different prototypes just for a one-hour show. 

I believe the person making the video about Project Quantum had to do much apologizing for not hiding the motherboard, but other than that, I don't find anything wrong with Project Quantum not using an FX.


----------



## dj-electric (Jun 22, 2015)

btarunr said:


> Then why stop at CPUs? AMD should have used two NVIDIA GPUs.



According to AMD's own statements, the R9 Nano is a better option than a 970 anyway, so why choose a 970?


----------



## Mutagen240 (Jun 22, 2015)

btarunr said:


> Fanboy-proofed my article a little. Changed the title to "Processors," a term AMD uses for both CPUs and APUs; and mentioned that AMD COULD HAVE USED FX, because it turns out that you CAN design mini-ITX motherboards using the 2-chip chipset after all. One example runs 890GX+SB850. AMD 970+SB950 has a lower TDP than 890GX due to its lack of integrated graphics, so you can build ITX boards with AMD 970 and AM3+. You'd just have to use SO-DIMMs instead of standard DIMMs, which shouldn't be a problem; MSI's 4K Gaming notebook uses SO-DIMMs.



Holy crap, did not know that. I planned a build with an AMD CPU over a year ago, but I could not find mini-ITX motherboards for my build, so I went Intel Haswell. Thanks for the info, I'll look into this for my next build.


----------



## huguberhart (Jun 22, 2015)

Far-gone conclusions in the title - cheap sensationalism. You can do better.


----------



## the54thvoid (Jun 22, 2015)

huguberhart said:


> Far-gone conclusions in the title - cheap sensationalism. You can do better.



So the title should be:
"AMD CPU not powerful enough for ATI's dual Fiji GPU."?


----------



## Toothless (Jun 22, 2015)

Next week, Intel uses Qualcomm in their laptops!


----------



## john_ (Jun 22, 2015)

Toothless said:


> Next week, Intel uses Qualcomm in their laptops!







Should I try to find the TPU article with the title "Intel doesn't trust its IGPs"?


----------



## btarunr (Jun 22, 2015)

john_ said:


> Should I try to find the TPU article with the title "Intel doesn't trust its IGPs"?



Nice try. Intel doesn't have an iGPU IP of the same class as G64xx; whereas AMD claims to have CPUs of the same class as Intel.

When you claim to have a chip of the same class as Intel, and you want people to buy your product over Intel, and then you turn around and use Intel in your product, you're telling people that your original claims were enough bovine defecation to fertilize Africa.


----------



## Toothless (Jun 22, 2015)

john_ said:


> View attachment 65953
> 
> 
> Should I try to find the TPU article with the title "Intel doesn't trust its IGPs"?


More like Intel doesn't trust AMD's GPUs so it hires Qualcomm to force NVIDIA to use Samsung technology for DDR5 ram. At least that's what I'm getting from the comments in this thread.

Be a fanboy to everyone, so no one is left out.


----------



## mroofie (Jun 22, 2015)

Lol the butthurt is real


----------



## GreiverBlade (Jun 22, 2015)

Toothless said:


> Be a fanboy to everyone, so no one is left out.


Well... from that point of view... thanking you for that part, not for the rest ahah... Intel uses the PowerVR Rogue because Adreno (Radeon...  ) is indeed Qualcomm-only

Intel I5-4690K
AMD R9 290
Nvidia Shield Tablet

For what's in current use.
Spares:
GT730 (because you never know when you need one  )
GTX670 (because... eh? nevermind... )

My favorites are my 3 nForce-chipset AMD boards (2 S939, 1 S940)


----------



## john_ (Jun 22, 2015)

btarunr said:


> Nice try. Intel doesn't have an iGPU IP of the same class as G64xx; whereas AMD claims to have CPUs of the same class as Intel.



LOL, nope! I would have to say the same thing: "Nice try." AMD doesn't compete in the high end. Their prices are proof.



> When you claim to have a chip of the same class as Intel, and you want people to buy your product over Intel, and then you use Intel over your chips in your product, you're telling people that your original claims were imperial tons of bovine defecation.


So much effort to justify Intel. So happy to criticize AMD.
Intel is justified, AMD is not. Why am I not surprised? On the net every company is justified in doing anything; AMD never is.

Intel does not make extreme low-power graphics IP that it can put into mobile SoCs. But AMD does make CPUs of the same kind, with the same application compatibility, with the same marketing claims, as Intel.

The more you use that PowerVR analogy, the more stupid you make yourself look.


----------



## Toothless (Jun 22, 2015)

GreiverBlade said:


> well... from that point of view ...thanking you for that part not for the rest ahah ... intel use the Rogue PowerVR because the Adreno (radeon ...  ) is indeed Qualcomm only
> 
> Intel I5-4690K
> AMD R9 290
> ...


Intel, G.Skill, Gigabyte, EVGA, Rosewill, MSI, PNY, AMD, ASUS, Western Digital, Toshiba, Hitachi, Seagate - I've used them all, and I admit I favor some over others, but the only brand I'm against is ASUS, for their inability to check for bad motherboard batches and horrid customer service.

I consider myself on the brown team, because what does blue, red, and green make?

(Short story my board died the same way a bunch of other customers' boards died, while we all ordered the board around the same time period)

Also, my idles are the GTX650 and GTX660 which will go up for sale here soon.


----------



## Ubersonic (Jun 22, 2015)

Isn't this just an early prototype? They could be using the Intel CPU to show it off because the AMD one isn't ready yet?


----------



## GreiverBlade (Jun 22, 2015)

Toothless said:


> Intel, G.Skill, Gigabyte, EVGA, Rosewill, MSI, PNY, AMD, ASUS, Western Digital, Toshiba, Hatachi, Seagate, I've used them all and I admit, I favor some over others but the only brand I'm against is ASUS for their inability to check for bad motherboard batches and horrid customer service.
> 
> I consider myself on the brown team, because what does blue, red, and green make?
> 
> ...


And here I am, all ASUS on my main rig (and some friends' rigs I did), and never had any problem (no RMAs, no horrid customer service, eh?  ) 
I guess I'm just lucky xD (tho I only listed what I use now... huhuhu)


----------



## btarunr (Jun 22, 2015)

Ubersonic said:


> Isn't this just an early prototype? They could be using the Intel CPU to show it off because the AMD one isn't ready yet?



Then why not show it off with a Godavari+A88X ITX board? It's not like they're putting out benchmarks/frame-rates, and I can't imagine a Godavari would bottleneck Fiji x2 to unplayable frame-rates.

It's more likely that AMD has chosen Haswell. The choice has been made. And that betrays its CPU lineup.


----------



## john_ (Jun 22, 2015)

The same article from Guru3D
AMD Project Quantum powered by Intel


> Okay, so the title sounds a bit ironic, but with Dual-GPU Fiji based GPUs in a small PC, you are going top need quite a bit of processing power. So props to AMD for having the balls to make this choice. BTW I also know there will be an AMD version available, so the choice is yours.




Can you spot the difference?


----------



## Caring1 (Jun 22, 2015)

john_ said:


> The same article from Guru3D
> Can you spot the difference?


Yep, their spelling sucks.


----------



## huguberhart (Jun 22, 2015)

the54thvoid said:


> So the title should be:
> "AMD CPU not powerful enough for ATI's dual Fiji GPU."?


Obviously Intel has more powerful and efficient chips. AMD reps said that the Intel CPU is a placeholder until Zen.
The 'no trust in its own' angle is quite sensationalist for five-day-old news.


----------



## GreiverBlade (Jun 22, 2015)

btarunr said:


> Then why not show it off with a Godavari+A88X ITX board? It's not like they're putting out benchmarks/frame-rates, and I can't imagine a Godavari would bottleneck Fiji x2 to unplayable frame-rates.
> 
> AMD has chosen Haswell. The choice has been made. And that betrays its CPU lineup.


Because the APU is not meant to be used in that configuration... (they position the FX-8XXX as enthusiast because that's what they are... and in some kinds of tasks they line up with their Intel equivalents, e.g. in video or heavily threaded loads an 8350 is not far from a 4770/4790; with Skylake... I don't really know. Another example is my trade of an FX-6300 for an i5-4690K: outside of a heavily populated MMO I notice no real difference. So why did I change? Well, because I was no longer bound by a budget. That Quantum is high-end... and surely not a budget build... so why not use an Intel CPU until Zen is ready.)

It doesn't matter at all what CPU they use, and AMD (for the XXXth time) is not a CPU-only maker: they can choose whatever they want to drive their GPUs; no betraying, no drama, no nothing.

We should just stop worrying about that "betraying" part and let it go, as it doesn't matter at all and brings nothing to the table beyond speculation and useless debate


little joke ...


----------



## btarunr (Jun 22, 2015)

john_ said:


> So much effort to justify Intel. So happy to criticize AMD.
> Intel is justified, AMD is not. Why am I not surprised? On the net every company is justified in doing anything; AMD never is.



Firstly sorry, I hit the "edit" button instead of "reply" when replying to your post.

Intel does not make extreme low-power graphics IP that it can put into mobile SoCs. But AMD does make CPUs of the same kind, with the same application compatibility, with the same marketing claims, as Intel.


----------



## john_ (Jun 22, 2015)

btarunr said:


> Firstly sorry, I hit the "edit" button instead of "reply" when replying to your post.
> 
> Intel does not make extreme low-power graphics IP that it can put into mobile SoCs. But AMD does make CPUs of the same kind, with the same application compatibility, with the same marketing claims, as Intel.


Bravo. And bravo again. So you edit my post, and you apologize for that. No problem. But you don't delete what you wrote there. Nice one.

Then you apologize for the edit, but not for calling me stupid. The problem with the PowerVR or the Guru3D article is that it shows who the biased one is. And the only way NOT to answer that is to INSULT the other person.
Bravo again. So that's TPU's "*Editor & Senior Moderator*" level? Good to know.


----------



## Yorgos (Jun 22, 2015)

btarunr said:


> How many times will you people shift goalposts and throw up strawman arguments, before you realise the actual point of this article? Which is:
> 
> AMD is not using an AMD processor in a PC bearing its brand name
> If AMD doesn't use the product-line it's most known for - processors - in its gaming PCs, then it's AMD telling you "don't use our CPUs, we don't use them ourselves."
> ...


Get your facts right, dude.
AMD's GPU department is not using an AMD processor.
They are marketing GPUs, not CPUs. They might as well ship it with SPARC, ARM or even z/Architecture if that in fact gives them a better end product.

OTOH, if the CPU department were to ship a new low-power SoC (like Geode), they would have to use some other vendor's GPU, since the AMD graphics department doesn't produce a low-power GPU any more.

I wonder why this site has a title like that while the rest of the internet sees this choice from a different perspective.
I smell butt-hurt from the Fury benchmarks.


----------



## Yorgos (Jun 22, 2015)

john_ said:


> The same article from Guru3D
> AMD Project Quantum powered by Intel
> 
> 
> Can you spot the difference?


Guru3D is a well-respected site.
Yesterday I realized why AMD refused to give cards to some reviewers.
Take a look at this


----------



## SonicZap (Jun 22, 2015)

I'm a fan of AMD and I find nothing wrong with the title. I wouldn't trust the highest-end AMD CPU to handle two Fiji GPUs either, and if it was clocked high enough to not be a bottleneck, the TDP would go way over 200 W (see FX-9590). The i7-4790K simply does a better job with far less power, making the system faster and easier to cool.

I was also expecting to see an AMD CPU in Project Quantum, but choosing the i7 makes sense to me. It's not like everyone but the most blind AMD fans wouldn't already know that Intel's desktop Haswell Core i5s and i7s murder all AMD CPUs in gaming. If this Project Quantum is a success, I'd expect their next version to include a Zen-based CPU. Right now they're simply making the system as capable as possible, and with the i7 it's more capable and cooler than with any AMD CPU.


btarunr said:


> Then why stop at CPUs? AMD should have used two NVIDIA GPUs.


I disagree with that. Fiji supposedly improves power efficiency a lot, and it might compete very well with Maxwell-based GPUs. Also, Project Quantum is for showcasing the power of AMD GPUs in the first place, the CPU isn't as important as the GPUs here.

I personally like btarunr's style and find him far from biased. He points out things like these from all companies, and he does it starkly. Remember what he wrote of Nvidia's drivers when they prevented overclocking laptop GPUs (twice)?


----------



## Frick (Jun 22, 2015)

The title reeks of qubitness, which is a bad thing.


----------



## R-T-B (Jun 22, 2015)

btarunr said:


> Nice try. Intel doesn't have an iGPU IP of the same class as G64xx; whereas AMD claims to have CPUs of the same class as Intel.



Do they really though?  I thought they readily admitted they gave up trying to compete on the high end.

I've always been one to defend your willingness to call out both sides for their crap btarunr, but this title is a bit sensationalist if I am being honest.


----------



## Yorgos (Jun 22, 2015)

SonicZap said:


> I'm a fan of AMD and I find nothing wrong with the title. I wouldn't trust the highest-end AMD CPU to handle two Fiji GPUs either, and if it was clocked high enough to not be a bottleneck, the TDP would go way over 200 W (see FX-9590). The i7-4790K simply does a better job with far less power, making the system faster and easier to cool.
> 
> I was also expecting to see an AMD CPU in Project Quantum, but choosing the i7 makes sense to me. It's not like everyone but the most blind AMD fans wouldn't already know that Intel's desktop Haswell Core i5s and i7s murder all AMD CPUs in gaming. If this Project Quantum is a success, I'd expect their next version to include a Zen-based CPU. Right now they're simply making the system as capable as possible, and with the i7 it's more capable and cooler than with any AMD CPU.
> 
> ...


What's the point of Project Quantum? To sell CPUs or to sell GPUs?
Did we read about AMD not using a SeaMicro (in-house) motherboard? No.
Did we read about non-AMD RAM on this platform? No.
Someone with enough power to make his opinion widely available thinks that AMD does not trust a several-year-old microarchitecture, and we are debating a subject based on an opinion.


----------



## ZoneDymo (Jun 22, 2015)

btarunr said:


> Firstly sorry, I hit the "edit" button instead of "reply" when replying to your post.
> 
> Intel does not make extreme low-power graphics IP that it can put into mobile SoCs. But AMD does make CPUs of the same kind, with the same application compatibility, with the same marketing claims, as Intel.




with a massive price difference.
If Honda says its S2000 sports car is a high-performance vehicle and Ferrari says its LaFerrari is a high-performance vehicle, and the S2000 goes for $35,000 while the LaFerrari goes for $1 million, do you think it's fair to compare the two head to head?

Your title is just F'd up and even you know it; the main problem is the word "trust".
Like I said earlier (it's great how you can choose which comments to respond to), AMD does NOT compete in the high end and hasn't for a while, and you (should) know that.
Intel i7-4790K - $340
AMD FX-8370 - $190

Nearly half the price, but to you they are on the same footing?

And hell, even apart from all of that, "trust" does not make any sense at all.
This is 0s and 1s, not "maybe if it tries hard enough".
"We trust this CPU will perform at a higher level than we KNOW it can"?
It makes no sense, man. They know what their CPUs can do, they know what the competition can do, and what they have right now does not do the GPU (which is what this is all about) justice.

If your title were "AMD does not deem its own CPUs capable/fast enough, 4790K powers Quantum", all would be fine.


----------



## Yorgos (Jun 22, 2015)

I have a possibly legitimate chart from a deleted comment on another site.
Should I post it, or might it cause trouble for the site?


----------



## ZoneDymo (Jun 22, 2015)

Caring1 said:


> Yep, their spelling sucks.



What spelling errors were made?


----------



## ZoneDymo (Jun 22, 2015)

Yorgos said:


> I have a possible legitimate chart from a deleted comment from another site.
> Should I post it or I might cause trouble to the site?



Post away, my friend, it can't cause any issues.


----------



## Ferrum Master (Jun 22, 2015)

Crap, is this Fudzilla? What kind of title is that?

The decision is sane and reasonable in every aspect.


----------



## joyman (Jun 22, 2015)

TPU is really going downhill; for the last two years you have been getting really anti-AMD and so full of crap. I will have to find another tech site to follow. A shame, really a shame.


----------



## r.h.p (Jun 22, 2015)

ensabrenoir said:


> ...poking at anything Amd before a major launch is to court _*Fanboy Fury!!!   * _But seriously guys.... the behind the seen conversation at Amd had to be priceless.  Imagine if Ford had to use a Dodge Hemi in a truck prototype.....or if all Intel's office and factory computers were Amd powered....gotta learn to  laugh at it all



Nice one, and a little bit less of the "OMG they're using an Intel CPU lol". If I can say one thing, it's that those glowing *AMD Radeon LEDs* are wicked, man .....


----------



## Assimilator (Jun 22, 2015)

Mutagen240 said:


> Holy crap, did not know that. I planned to build with an AMD CPU over a year ago but I could not find m-atx motherboards for my build so I went Intel haswell. Thanks for the info, I'll look into this for my next build.



Presumably you mean mITX? And I'd be careful of going with an mITX AMD board due to power/heat and overclocking concerns.


----------



## Durvelle27 (Jun 22, 2015)

Very misleading title.  

Let's be logical now 

There are currently no mini-ITX boards for AM3+.

They wouldn't use an APU, as those are not meant for the high-end market.

And Zen isn't ready yet.

So why not use Intel?


----------



## AsRock (Jun 22, 2015)

OMG, people just pick on anything these days. Has anyone considered they might actually offer the option of either an AMD or an Intel chip in it?

And at this time, if I were to buy such an item, I would want an Intel chipset/CPU in it anyway; let's face it, most of you would too. Sometimes I wonder if I should buy a load of extra-large wooden spoons for the people on here for Christmas.

I am sure AMD knew what was best to use at the time, and I could not agree more with what they used.


----------



## rvalencia (Jun 22, 2015)

btarunr said:


> One of the three unexpected products based on the "Fiji" GPU, which AMD announced at its E3 event, Project Quantum, or the quest to design a 4K-worthy SFF gaming PC, which runs two "Fiji" GPUs in CrossFire, had the press assume that the rest of the system could be AMD-based, such as AMD-branded (albeit Patriot Memory manufactured) memory, AMD-branded (albeit OCZ manufactured) SSD; and importantly an AMD-made CPU or APU. Given its liquid-cooling, the prospect of a 95W "Godavari," or even upcoming "Carrizo" APU didn't seem far-fetched. Even a 95W FX CPU could have been deployed, since AM3+ on mini-ITX is not impossible.
> 
> When taken apart, Project Quantum was shown to be running an Intel Core i7-4790K "Devil's Canyon" CPU, on an ASRock-made mini-ITX motherboard, with its non-essential parts soldered out. The i7-4790K is neighbored by a pair of half-height Crucial Ballistix memory modules, which is excusable, since there are no half-height AMD Radeon memory modules, yet. The SSD is AMD-branded. The unit features a unified liquid cooling solution that's custom-made for AMD, by Asetek. A large (200 mm?) radiator, with a single fan, cools the CPU, the PCH, as well as the two "Fiji" GPUs.
> 
> ...


If you read http://www.nvidia.com/page/uli_m6117c.html
I don't see Kitguru's Anton Shilov demanding NVIDIA's 386 CPU be used instead of Intel's x86 CPUs. LOL.

http://www.nvidia.com/docs/IO/145393/NVIDIA-Jetson-Pro-Development-Kit.png
NVIDIA's ARM desktop-type motherboard... it's a POS for desktop usage.

http://www.cnx-software.com/2013/03/21/seco-mitx-gpu-devkit-features-nvidia-tegra-3-supports-cuda-5/
The SECO mITX GPU DEVKIT supports the NVIDIA Tegra 3 and has a PCI-e x16 connector (electrically PCI Express x4), which is another POS for desktop use.

Kitguru's Anton Shilov is a hypocrite.

If you support Anton Shilov's viewpoint, then you are also a hypocrite.


----------



## cowie (Jun 22, 2015)

the AMD family can not get the BS fed to some of you guys fast enough.... apparently it's OK to put a Chevy motor in the new Mustang.

You ran it on your competitor's hardware??????????????? That's never OK, no matter what you guys say.


----------



## cokker (Jun 22, 2015)

joyman said:


> Really going downhill TPU, since the last two years you are really going anti-AMD and so full of crap. I will have to find another tech site to follow, shame, really a shame.



+1

When impartiality has been lost, a site is no longer reputable.


----------



## iO (Jun 22, 2015)

Stupid title; it should be marked as an editorial...


"Hey, let's cripple the performance of our dual-GPU prototype concept with an APU!
Or even better: let's convince a motherboard manufacturer to design an mITX board based on the years-old and dead FX platform!!!"...

Everybody knows Intel has faster CPUs; AMD has benched its cards on Intel platforms for a few generations now, and there's nothing wrong with it.
It's like saying NVIDIA doesn't trust its Denver cores because they didn't use them as the platform for their Titan X performance numbers......


----------



## btarunr (Jun 22, 2015)

iO said:


> Its like saying nVidia doesnt trust its Denver cores because they didnt used it as their platform for their TitanX performance numbers......



Didn't know NVIDIA Denver ran Windows 7 and Crysis. TIL.


----------



## the54thvoid (Jun 22, 2015)

This thread is awesome.
Semantics aplenty. What language btarunr used is irrelevant. If you think it's trolling or flame bait, so be it.
I couldn't give a poo what's inside a PC of awesome, but the fact is clear that, for now, AMD doesn't have a suitable solution for the device. They've designed a box that can't be suitably powered, for now, by their own processor.
It follows that they could have used their own CPUs, but technically it would have been an issue of not having faith it would work adequately; in other words, AMD does not trust its current CPU architecture (and its concurrent hardware requirements) to work in their uber box of awesome.
Why are people up in arms? In two more days, we'll have a lovely review of what is probably the fastest gfx card of all time. This "AMD-hating" site will tell it how it is.
It's another symptom of today's age, where the sense of 'injustice' is taken to such levels. This site isn't biased. Editorials may be worded to cause emotive responses, but hey, deal with it.


----------



## cowie (Jun 22, 2015)

yeah what he said^


----------



## bpgt64 (Jun 22, 2015)

This is actually a very positive sign of things to come with Zen and such, for me. The first step to fixing a problem is admitting you have one.


----------



## GreiverBlade (Jun 22, 2015)

bpgt64 said:


> This, is actually a very positive sign of things to come with Zen and such for me.  1st step to fixing a problem is admitting you have one.


oh, a post that makes sense ... quite refreshing, thanks @bpgt64


----------



## fortiori (Jun 22, 2015)

Is there an editing process here at TPU or can editors post whatever they want to with no oversight?

Because if there is an editing process and this article still made it through then there is a systemic problem and TPU is in trouble.

If there isn't then this article demonstrates a dire need for one.

TPU, all you have is your reputation. This article just took a hammer to it.


----------



## Katanai (Jun 22, 2015)

rooivalk said:


> The French Air Force also bought its own Rafale before exporting it (although they didn't really want to buy it), long before the Rafale hit the jackpot with the Indian Air Force.



Going a bit off-topic here: I don't know where you got your information from, but the Rafale is an excellent fighter and no one was ever forced to buy it. It's one of the best warplanes ever made. The French air force got it a bit later not because it didn't want it, but because the French navy needed it most, for its carrier, to replace the much older planes it had. So the air force started getting them only after 60 planes were delivered to the navy...


----------



## bpgt64 (Jun 22, 2015)

GreiverBlade said:


> oh a post that make sense ... quite refreshing, thanks @bpgt64




tbh, my first custom build was an AMD64 3500+, but I have been using Intel in everything since Thuban was a bust... I have given Intel untold amounts of money and I am not at all ashamed of it. Their product has been better for gaming.

But I'd love to see AMD take them down a peg.


----------



## Katanai (Jun 22, 2015)

Btarunr, dude, you have my full support. Sometimes you have to say it like it is. AMD releasing a "next generation" platform and using Intel CPUs in it?!? I mean, how the hell are you supposed to not say anything about that in the title?!?


----------



## Evildead666 (Jun 22, 2015)

btarunr said:


> Quantum uses an ITX board with a single x16 slot, and a dual-GPU video card split between two PCBs, with a sandwiched cooling solution (à la GeForce 9800 GX2).



I don't think the video card is split between two PCBs.
The Fury X2 (dual-GPU card) is two GPUs on one card, with the water-cooling block sandwiched between the two GPUs and the CPU.


----------



## librin.so.1 (Jun 22, 2015)




----------



## iO (Jun 22, 2015)

btarunr said:


> Didn't know NVIDIA Denver ran Windows 7 and Crysis. TIL.


Fair enough, but it's well supported on Linux, which would give comparable numbers; they don't do it because it would bottleneck the performance.
And for the same reason, AMD didn't choose one of its own CPUs, especially in a prototype, while evaluating max performance, heat, power draw, etc.


----------



## GhostRyder (Jun 22, 2015)

Eh, I knew this already, but it's not that big a deal.

They already say they are not competing in the high-end market. Since this is focused on showcasing the Fiji cards more than anything, it makes sense to pair them with an Intel CPU, since those are better for gaming overall. Not only that, but PCI-E at 3.0 speeds (x8 each) is probably the better option (even if 2.0 @ x8 would be little if any bottleneck), just to make sure you have the performance available. Either way, still a cool-looking product.


----------



## btarunr (Jun 22, 2015)

iO said:


> Fair enough but its well supported in Linux which would give comparable numbers but they dont do it because it would bottleneck the performance.
> And for the same reason AMD didnt choose one of their CPUs, especially in a prototype while evaluating max performance, heat, power draw etc.



Didn't know Crysis, 3DMark, Far Cry 4, <insert relevant benchmarks> ran on PC Linux either.


----------



## raghu78 (Jun 22, 2015)

Digital Dreams said:


> Shitty thread title IMO. As "editor and senior moderator" I expect better from you. This has nothing to do with trust at all. Also people calling you out on your shit hardly makes them fanboys.



Seriously, this guy is on a mission to troll AMD. AMD is building the best small-form-factor gaming PC, and they are putting in the best CPU to drive their GPUs. That's simple for anybody who is not trolling to realize.

I think he is getting more than his fair share of generous gifts this month from the green team to improve his trolling efforts. Anyway, it's pathetic for a so-called editor to stoop to this level.


----------



## ZoneDymo (Jun 22, 2015)

Katanai said:


> Btarunr dude you have my full support. Sometimes you have to say it like it is. AMD releasing a "next generation" platform and using Intel CPU's in it?!? I mean how the hell are you supposed to not say anything about that in the title?!?


 
Perhaps you should read more; nobody is saying it should not be mentioned, or that this is not perhaps somewhat newsworthy. The title, however, should be free of personal remarks etc.
Make it "The new Project Quantum is powered by an Intel i7-4790K" and nobody would have any problem.
Those are just unambiguous facts.


----------



## truth teller (Jun 22, 2015)

TPU has surely been going downhill for the last couple of years; resorting to click-bait FUD wasn't on the agenda in "the early days".

in b4 w1zzard is "forced" to "review" reviews so he can continue to review, rip

edit: the "forced" moderation before posts hit the airwaves is good too; this way opinions get "reviewed" as well before they hurt anyone


----------



## raghu78 (Jun 22, 2015)

btarunr said:


> Quantum uses an ITX board with a single x16 slot, and a dual-GPU video card split between two PCBs, with a sandwiched cooling solution (à la GeForce 9800 GX2).



How much more wrong can you get? The Fury X2 is a single-PCB design, while the 9800 GX2 was a dual-PCB design.

http://www.legitreviews.com/xfx-geforce-9800-gx2-1gb-gddr3-video-card-review_679/3

"As you can see from the diagram above the NVIDIA GeForce 9800 GX2 employs a special design using two GPUs, each mounted on its own dedicated PCB. This design offers several advantages over a single PCB design:"

http://videocardz.com/56588/amd-dual-fiji-radeon-r9-fury-x2-pcb-pictured

Instead of trolling AMD, spend your time doing something productive, or at least do your homework when talking about this stuff.



ZoneDymo said:


> *Perhaps you should read more, nobody is saying it should not be mentioned or that this is not perhaps somewhat newsworthy, the title however should be free of personal remarks etc.*
> Make it : "the new Project Quantum is powered by an Intel i7 4970k" and nobody would have any problem. Those are just unambiguous facts.



Exactly. That guy seriously lacks the professionalism expected of an editor. It's easy to rub it in when a company is going through a tough time due to a failed architecture/design. But does that make the commenter any more intelligent? No. He can't design shit, and here he is proving he is just a troll without much, if any, intelligence at all.


----------



## btarunr (Jun 22, 2015)

raghu78 said:


> how much more wrong can you get. Fury X2 is a single PCB design while 9800 GX2 was a dual PCB design.
> 
> http://www.legitreviews.com/xfx-geforce-9800-gx2-1gb-gddr3-video-card-review_679/3
> 
> ...



AMD said Project Quantum features two Fiji GPUs, but not that it features the Fiji dual-GPU card. Take a good look at those renders, and think about whether that Fiji X2 card (single PCB, two GPUs) can fit into that. Your next personal remark gets you banned.


----------



## REAYTH (Jun 22, 2015)

TheMailman78 asked me to post this in here for you guys.


----------



## Evildead666 (Jun 22, 2015)

btarunr said:


> AMD said Project Quantum features two Fiji GPUs, but not that it features the Fiji dual-GPU card. Take a good look at those renders, and think if that Fiji X2 card (single PCB, 2 GPUs); can fit into that. Your next personal remark gets you banned.



I took a good look at those renders, and can only see one card (and a single GPU at that; then again, it's a rendered image, not the final thing).
Where is the second GPU board?
Also, if the block is sandwiched between two GPUs, where does it cool the CPU?
edit: The dual-GPU card is equivalent to, or smaller than, a standard high-end GPU card of today. Of course it will fit.

All I'm saying is that you are making a broad statement that it is a dual-PCB-based card, with no facts whatsoever and images that don't back it up.
This must be nipped in the bud before it becomes 'fact' all over the Internet.


----------



## wiak (Jun 22, 2015)

Why does every review site use Intel CPUs these days?... (FYI, every Intel CPU has AMD technology inside it, so it's not really a pure Intel CPU anyway; see AMD64.)

And it's their innovation lab; they build cool stuff to show off, or as a blueprint for OEMs etc.
If OEMs are going to sell it, they can make an AMD version; it's not that hard to switch a motherboard/CPU.

If you want to show off your graphics power, why would you pick something that can bottleneck you against your graphics competitor? (If you did, your competitor could pick it...)

It's a touchy line to walk, but you pick what your competitor would pick, and obviously they would have picked Intel.


----------



## raghu78 (Jun 22, 2015)

btarunr said:


> AMD said Project Quantum features two Fiji GPUs, but not that it features the Fiji dual-GPU card. Take a good look at those renders, and think if that Fiji X2 card (single PCB, 2 GPUs); can fit into that. Your next personal remark gets you banned.



So now you are asking me to see if the Fiji dual-GPU card, aka Fury X2, will fit into Quantum. You made a statement without any confirming proof, and now you want me to disprove your statement, or rather prove mine. How nice. That's why I said do your homework. Oh, btw, I don't give a damn if you ban me. That's all you can do. What are you doing to AMD? A company has its own existence and value just as much as you do. What you made was an insinuating remark (and, may I repeat, a pathetic trolling attempt) targeted at a company consisting of thousands of engineers who are much smarter and brighter than you are. So seriously, your troll title is just pathetic and laughable.


----------



## john_ (Jun 22, 2015)

btarunr said:


> Didn't know NVIDIA Denver ran Windows 7 and Crysis. TIL.


You are not going to find Denver cores in the latest NVIDIA ARM-based products. They used them in one product, and then? Then they lost their trust in them and abandoned them completely.


----------



## Katanai (Jun 22, 2015)

ZoneDymo said:


> Perhaps you should read more, nobody is saying it should not be mentioned or that this is not perhaps somewhat newsworthy, the title however should be free of personal remarks etc.
> Make it : "the new Project Quantum is powered by an Intel i7 4970k" and nobody would have any problem.
> Those are just unambiguous facts.



No, it should not be free of any remarks. If AMD is using Intel processors in their own platform, it should be yelled out loud like it is here. This is like NVIDIA releasing their next Shield tablet with an AMD GPU in it. What do you think the titles in car magazines would be if the next Ford Focus used an Audi engine?!?


----------



## raghu78 (Jun 22, 2015)

Evildead666 said:


> I took a good look at those renders, and can only see one card.
> Where is the second GPU board ?
> Also, if the Block is sandwiched between two GPU's, where does it cool the CPU ?
> edit : The dual GPU card is a equivalent/smaller than a standard High end GPU Card of today. Of course it will fit.
> ...




ha ha ha. Well said. Next thing we know, wccftech will be publishing an article saying Quantum houses a dual-PCB Fury X2.


----------



## Prima.Vera (Jun 22, 2015)

See? Even AMD knows their CPUs are _*uber crap*_ for gaming.


----------



## Tatty_One (Jun 22, 2015)

truth teller said:


> tpu has surely been going downhill for the last couple of years, resorting to click-bait-fud wasn't on the agenda in "the early days"
> 
> in b4 w1zzard is "forced" to "review" reviews so he can continue to review, rip
> 
> *edit, the "forced" moderation before posts get into the airwaves is good too, this way opinions get "reviewed" too before they hurt anyone*



I think you will find that is the practice of most sites these days; it's to stop spam, the sheer amount of which at times would literally break the spine of some smaller sites. It's a continual battle we fight.


----------



## ZoneDymo (Jun 22, 2015)

Katanai said:


> No it should be not free of any remarks. If AMD is using Intel processors in their own platform it should be yelled out loud like it is here. This is like NVIDIA releasing their next shield tablet and using an AMD GPU in it.  What do you think the titles would be in car magazines if the next Ford Focus would use an Audi engine?!?



If the car magazine is worth a damn (i.e. not Fox News-style sensationalist BS), it would say:
"The new Ford Focus is powered by an engine from Audi"

and not something like:
"OMG, the Germans took over the US car industry" or
"Ford finally realized they cannot make cars and turned to the Germans" or
"History repeats itself: Ford sends money to Germany yet again"

You catch my drift.


----------



## john_ (Jun 22, 2015)

Nvidia uses stock ARM cores instead of Denver.
Samsung often uses Qualcomm SoCs instead of its Exynos.
Apple used SoCs from Samsung instead of its own.
There are probably so many other examples everywhere in the technology market.
Still, the only guilty one here is AMD: the company that hardware sites use as a punching bag when they want to sell "objectivity" and "independence" from the hardware manufacturers.


----------



## Ruyki (Jun 22, 2015)

Correct me if I'm wrong, but isn't Project Quantum just meant to show what is possible with the small size of AMD's new Fury GPUs? It's probably not going to be sold, at least not in this form.

Anyway, I respect AMD for having the guts to put an Intel CPU in that thing. Intel is just better for gaming right now, and better for small-form-factor machines like Quantum.

But that clickbait title... I'm pretty sure someone at AMD is mad right now.


----------



## Evildead666 (Jun 22, 2015)

Here's a pic from HardwareBBQ : http://www.hardwarebbq.com/amd-laun...ji-silicon-project-quantum-sff-gaming-system/




This shows a render of the topside of the block, without the CPU in the way.


----------



## btarunr (Jun 22, 2015)

Ruyki said:


> But that clickbait title.



The entire article has been in the front-page view. There's nothing you achieve by clicking.


----------



## wiak (Jun 22, 2015)

Mutagen240 said:


> Very misleading title. AMD clearly knows that Intel was the only way to go. As stated there are no m-itx motherboard for FX CPUs and APUs would be a huge bottleneck so they could only have chosen Intel. Its not that they don't trust they're CPUs, its that they're not stupid. Please change the title.
> 
> Yours sincerely, AMD fanboy


I agree; also, they did create more than one "Project Quantum", so I suspect there might be some AMD variants around.

AMD already has Intel setups for compatibility testing anyway...
To really check the performance of your graphics, you do need to test it on every platform.


----------



## Katanai (Jun 22, 2015)

ZoneDymo said:


> If the car magazine is worth a damn (aka not fox news like sensationalism bs) it would say:
> "the new Ford Focus is powered by an engine from Audi"
> 
> and not something like
> ...



And what do you think the titles in 90% of magazines would be? And I have another question for you: the computer you're posting from, what CPU does it have?


----------



## ShurikN (Jun 22, 2015)

By the time this project goes on sale, AMD will probably have Zen on the market.
I see no problem in using Intel for the time being.


----------



## ZoneDymo (Jun 22, 2015)

Katanai said:


> And what do you think the titles in 90% of magazines would be? And I have another question for you: the computer you're posting from, what CPU does it have?



Probably one of the sensationalist headlines, to get stupid people to buy it.
But we are not stupid people, and this has, at least so far, not been a sensationalist website.
The complaint here is entirely about that; if you like your sensationalist material then I guess that's your business, but it's not something the majority of the commenters here want to see.

The PC I'm on now has an Intel CPU, as can be seen in my profile (not that I see how that would matter; in fact, I would like to know why you asked that).


----------



## Katanai (Jun 22, 2015)

ZoneDymo said:


> The PC I'm on now has an Intel CPU, as can be seen in my profile (not that I see how that would matter; in fact, I would like to know why you asked that).



Yeah, I can see you have an Intel CPU and AMD GPUs, just like the system we're talking about here. That just means you don't trust AMD processors either. So relax, there's nothing wrong with that...


----------



## GhostRyder (Jun 22, 2015)

Alright everyone, let's take a deep breath and step away from the keyboards.

Look, Project Quantum is cool, relatively speaking. It's a way to show what they can accomplish with their new GPUs, as that is where most of the focus is currently (even on the APUs, the focus is on the GPU power more than anything). Why would they chance bottlenecking their newest GPUs in a system that is trying to show off what their newest product can accomplish?

Don't hate the article or the messenger; maybe the title could have been phrased differently, but if anything he has done the same to both sides, including calling NVIDIA out on their recent string of GPU BS (GTX 970 3.5 GB, locking out overclocking through drivers twice, etc.). So let's all calm down and stop the war before we end up with broken keyboards and monitors on our front lawns.


----------



## techy1 (Jun 22, 2015)

Nothing sensational about this title - it states facts... if AMD trusted that their own CPU would not bottleneck, they would have put it in their little project... at least AMD has more common sense than their fans (who happen to be very upset about the title, but not so much about the fact that AMD knows an AMD CPU is a bottleneck).


----------



## HisDivineOrder (Jun 22, 2015)

Well, this is one way to get people to click and respond.


----------



## mastrdrver (Jun 22, 2015)

Did AMD design this or did AMD have someone else design this for them?

Sounds like Kitguru could possibly have some salty squirrel tears going on. The fact that they mention the Mac Pro at the bottom of the article is also suggestive of such a premise. Maybe just a coincidence? In either case, to state, as this article does, that AMD does not trust its own processors makes it blatantly obvious that this (TPU) article is either click-bait or disinformation (for whatever reason). Especially since not even Kitguru jumped to such a subjective conclusion in their article.

As an aside, the amount of anti-AMD rhetoric coming out of TPU staff lately is a little disturbing. There were no editorials about the GTX 970 memory deal stating that NVidia was trying to take advantage of the customer. There have been no editorials about Intel being incompetent for falling behind their tick-tock schedule. Yet when there might be a hint of the smallest of inconsequential things (like 6 months since the last official driver, really?), something comes out of the "news" department at TPU about AMD being incompetent, which usually turns out to be either a misleading title, click-bait, or a blatant lie without effort put in to check the facts.

The amount of integrity that seems to have been thrown out the window lately here is more than a bit disturbing.


----------



## r.h.p (Jun 22, 2015)

cowie said:


> the amd family can not get the bs fed to some of you guys fast enough....it's ok to put a chevy motor in the new mustang.
> 
> you ran it on your competitors hardware??????????????? that's never ok no matter what you guys say



Seriously, a CPU is a $400.00 purchase in $AUS, while a reference to a classic Ford vs GM vs Dodge war is about a $65,000.00 purchase, and an entirely different garage in my opinion, come on man ......lol
And yeah, we have our Ford vs GMH war too.


----------



## Tatty_One (Jun 22, 2015)

mastrdrver said:


> Did AMD design this or did AMD have someone else design this for them?
> 
> Sounds like Kitguru could possibly have some salty squirrel tears going on. The fact that they mention the Mac Pro at the bottom of the article is also suggestive of such a premise. Maybe just a coincidence? In either case, to state, as this article does, that AMD does not trust its own processors makes it blatantly obvious that this (TPU) article is either click-bait or disinformation (for whatever reason). Especially since not even Kitguru jumped to such a subjective conclusion in their article.
> 
> ...


I don't know about editorials specifically, but there were/are several news articles posted here about the situation with regard to the 3.5 vs 4.0 GB VRAM; none that I can see came out in support of NVidia's position.


----------



## the54thvoid (Jun 22, 2015)

Newsfelch!

TPU is not Reuters.

The only thing going downhill is forum members' sense of irony, humour and critical analysis of their own preferred brands.
In fact, it's worrying that an editor can't post an emotive title without cry babies screaming blue murder.

It's a tech site, not Panorama.


----------



## yogurt_21 (Jun 22, 2015)

You know, between the new R9 rebadge, this CPU controversy, and the HBM-based Fury cards, I think I've seen more news and mentions of AMD in the past 5 weeks than I've seen in the past 10 years. And I'm not even talking about tech sites whose business revolves around hardware. AMD has been showing up on Yahoo, MSN, Forbes, etc. OK, maybe it's not the BBC and CNN, but hey, it's way better than it has been.

Is it just me, or are all these controversies a huge boon for AMD? The stock is on the rise, whereas earlier this quarter it looked like they could be on their way to being de-listed. I've never seen so much buzz about a new AMD product as with Fury, and it seems like Zen might just get the same attention should this CPU controversy continue.

This has to be the most brilliant marketing scheme in AMD's history.


----------



## ZoneDymo (Jun 22, 2015)

Katanai said:


> Yeah, I can see you have an Intel CPU and AMD GPUs, just like the system we're talking about here. That just means you don't trust AMD processors either.  So relax, there's nothing wrong with that...



You cannot possibly be this stupid. By your logic, everything you're not using in one PC at a given time is something you don't "trust"?
So if you are using Corsair memory now, that means you don't trust Crucial, ADATA, Kingston, Mushkin etc.?
Can I drag that logic even further: the car brand you or your family is driving now is the only one you will ever have, because it's clearly the only one you trust, since that's the brand you are using right now?

I mean, seriously.....



techy1 said:


> Nothing sensational about this title - it states facts... if AMD trusted that their own CPU would not bottleneck, then they would have put it in their little project... at least AMD has more common sense than their fans (who happen to be very upset about the title, but not so much about the fact that AMD knows that an AMD CPU is a bottleneck).



Trust is not a fact, it's an emotion.
Trust makes no sense here; they know it would bottleneck. You cannot just hope for a CPU to do better than it does; it's 0's and 1's, not "oh, if I just cheer it on it will go faster".
They KNOW that their current CPU lineup is not fast enough to get the most out of what they are demoing: their new GPUs.
Again, there is nothing wrong with saying "Project Quantum is powered by an Intel 4790K", but saying it's because AMD does not "trust" its own hardware? That, again, (a) makes no sense and (b) yeah, is very much sensationalist.


----------



## john_ (Jun 22, 2015)

mastrdrver said:


> (like 6 months since the last official driver, really?), something comes out of the "news" department at TPU about AMD being incompetent, which usually turns out to be either a misleading title, click-bait, or a blatant lie without effort put in to check the facts.



It could be something more. Many sites took that article about the driver updates and made news out of it. You know, when a personal opinion gets onto the first page of a big site like TPU, other sites that just grab TPU's articles for their own front pages don't always manage to distinguish between real news, an editorial, or even a plain post, like the millions of posts people make all over the internet every day.

So a personal opinion from one person on the first page of TPU instantly becomes news.

The info about the Intel CPU has been known for 5 days now, if not more. But no one really cared enough to make a fuss out of it. So, what if the title is NOT click-bait for us, who post in here? What if this title is bait for all the other sites? Many sites will also copy the above title, doing someone's marketing work for free.

Just an opinion of course. A few conspiracy theories.


----------



## ZoneDymo (Jun 22, 2015)

the54thvoid said:


> Newsfelch!
> 
> TPU is not Reuters.
> 
> ...



It's a tech site, not a gossip magazine.
On a tech site you post tech news, not tech news + personal opinion.


----------



## btarunr (Jun 22, 2015)

ZoneDymo said:


> It's a tech site, not a gossip magazine.
> On a tech site you post tech news, not tech news + personal opinion.



I've made around 15,000 newsposts on this site. At this point, I can wipe my behind with your opinion, without consequences. Post on the subject, or don't post.


----------



## rvalencia (Jun 22, 2015)

Katanai said:


> No, it should not be free of any remarks. If AMD is using Intel processors in their own platform, it should be yelled out loud like it is here. This is like NVIDIA releasing their next Shield tablet with an AMD GPU in it.  What do you think the titles would be in car magazines if the next Ford Focus used an Audi engine?!?


Stop being a hypocrite. I haven't seen you using NVIDIA's 386 CPU with their GPU. Read http://www.nvidia.com/page/uli_m6117c.html


----------



## ZoneDymo (Jun 22, 2015)

btarunr said:


> I've made around 15,000 newsposts on this site. At this point, I can wipe my behind with your opinion, without consequences. Post on the subject, or don't post.



Oh, I have no doubt of your, shall we say, "great powers".
And it's quite evident you can't take criticism well (as to the wording used), but that is something you share with a lot of people, to the point we can call it normal.
Regardless, I responded to a comment made by a user here, the same as I am responding to you now, since you responded to me (even though this was not directly directed at you, but regardless).
I'm pretty sure that is allowed; I mean, why else facilitate the users with a "reply" and even a "multi quote" feature?


----------



## btarunr (Jun 22, 2015)

ZoneDymo said:


> I'm pretty sure that is allowed; I mean, why else facilitate the users with a "reply" and even a "multi quote" feature?



So they can post on the subject, and avoid getting banned.


----------



## ZoneDymo (Jun 22, 2015)

btarunr said:


> So they can post on the subject, and avoid getting banned.



Wait, there is another way of replying to the subject, and that one does get you banned?
I wasn't aware of that, but that also hardly is the case here. I just clicked that little reply button that shows up on the right side under the comment you posted; there is a "thanks", a "multi quote" and a "reply", the latter of which I am using right now to reply to you.
Again, it seems odd not to be allowed to respond to other users' comments, yet facilitate us with said buttons that allow just that, with intent.


----------



## Evildead666 (Jun 22, 2015)

btarunr said:


> I've made around 15,000 newsposts on this site. At this point, I can wipe my behind with your opinion, without consequences. Post on the subject, or don't post.



I just hope AMD understands your wiping of your behind with their brand name through this "News Article".
You might want to update the original post with the possibility that AMD might offer the final Quantum systems with a choice of either AMD or Intel CPUs?
It's just as correct as your original newspost, and it leaves some wiggle room.


----------



## btarunr (Jun 22, 2015)

Evildead666 said:


> I just hope AMD understands your wiping of your behind with their brand name through this "News Article".
> You might want to update the original post with the possibility that AMD might offer the final Quantum systems with a choice of either AMD or Intel CPUs?
> It's just as correct as your original newspost, and it leaves some wiggle room.



I'm gathering more such details, and will do another article tomorrow. I'm waiting on some communication about its graphics board design. Then we'll nip that in the bud together.


----------



## lilhasselhoffer (Jun 22, 2015)

btarunr said:


> I've made around 15,000 newsposts on this site. At this point, I can wipe my behind with your opinion, without consequences. Post on the subject, or don't post.



In a previous post I defended your right to post facts under the badge of news.  Facts are things that can be backed up with data.

I also suggested that there was some kernel of reason to your posts, as when you posted that "Windows 8 is here to stay and everyone railing against it is wrong."  Opinions can be wrong, as yours was with Windows 8.  Despite this, it was just as valid as any other opinion.



At this point, I'd like to think I'm reasonably impartial.  Unfortunately, that means you are suddenly full of crap.
1) The article title is not news, but click-bait.  While the latter half of the title is accurate, the former is conjecture and opinion.  You've done nothing to prove that AMD lacks "trust," only that their prototype used an Intel CPU.
2) Your responses to this have basically been admitting that your initial title was too severe, and then defending your "fact" with the statement that they used an Intel CPU.
3) Telling people that you can "wipe your behind with their opinion" means nothing.  Volume of output is an interesting metric, but if you get to the point where your only rebuttal is to tell people to shut up and listen, you've lost the argument.


Perhaps, for a moment, you can look at this critically.  It's rather difficult to believe you've titled this article as such unless you either want click-bait, or have a deep desire to point out the perceived failings of AMD.  Neither of these goals is entirely laudable, but the important fact is that neither of them is news.  The reason you're getting more push-back than just fan boys is that Techpowerup generally shoots straight on issues; heck, W1zz is one of the most respectable peddlers of unbiased card reviews anywhere.  When we can no longer trust the news here to be unbiased, there have to be consequences.  If you're incapable of either adequately proving your conjecture (honestly, there is no objective way for an outsider to prove a lack of internal confidence) or retracting it with some grace, then you've lost objectivity.  If I can't trust your news to be factual, I don't care about anything else you write.  We get enough spin from the news; we don't need any more from sites like TPU.


----------



## Evildead666 (Jun 22, 2015)

btarunr said:


> I'm gathering more such details, and will do another article tomorrow. I'm waiting on some communication about its graphics board design. Then we'll nip that in the bud together.



Cool 
Looking forward to it.


----------



## REAYTH (Jun 22, 2015)

So much butt hurt in this thread.


----------



## Katanai (Jun 22, 2015)

ZoneDymo said:


> You cannot possibly be this stupid



And the personal attacks start... I told you: chill, relax, fanboy. I'm having fun here; why do you have to be mad? I'm sorry that no one, not even you, trusts AMD CPUs. But it's not my fault. Tell AMD to build better CPUs; maybe when they launch their next platform they can use one of their own...



rvalencia said:


> Stop being a hypocrite. I haven't seen you using NVIDIA's 386 CPU with their GPU. Read http://www.nvidia.com/page/uli_m6117c.html




No, I use Intel. Always have, always will. 






This is the greatest thread ever, tears so sweet, denial so strong it would make the midday sun disappear.


----------



## Casecutter (Jun 22, 2015)

btarunr said:


> It could have used its own processors, even if it's a lowly Godavari+A88X ITX, if this was just meant to be a proof of concept. It would have shown that AMD trusts its own creations


 


Ubersonic said:


> Isn't this just an early prototype? They could be using the Intel CPU to show it off because the AMD one isn't ready yet?


I've got to say I see no good reason for AMD to show this: it wastes design resources and money, and earns no prestige (heck, a backlash).  It all comes across as "see what we could do/did!"  I just ask: why spend time and money on this... now?

I might have said, *fine, AMD, you're going "all in" on a gaming machine* built for platform longevity and cost-effective production scaling; do it like an Xbox or PlayStation. Sure, you've got one wildly compact cooler layout and an innovative form factor, sweet! After that, though, it amounts to nothing more than a super-fancy home ITX build from standard off-the-shelf parts. Costly, and it leaves itself open to messing with.

Had AMD said "we are working to design a *mobo built on one large interposer*", on which they'd package a special Godavari-based CPU (the APU only there to run low-3D, non-gaming functions), the two Fijis, and 8-12 GB of HBM filled in around them, with all that memory shared between the CPU and GPUs, that would've been something, even if it was just a mock-up for proof of concept!  Keep the chassis closed, say it's a proof of concept for the chassis and cooling side, and show the innards with solid modeling of the mobo/interposer concept.  You're not revealing its performance at this point anyway; who gives a crap what's used to evaluate the cooling?

The strangest/stupidest part was why reveal an i7. Sure, perf/power/heat and PCI-E x16, I get that; and you've got to give it to Intel: for 4K on DX11 they offer the best product. But we've been told... "the times they are a-changin'". Even if Intel was the right part, AMD has been touting asynchronous shaders and the value of optimizations for truly multi-core CPUs, which makes this even more stupid. Using an i7 together with what has been AMD's foray into optimizing execution resources and low-level APIs, after hearing DX12 would work around such roadblocks, seems to invalidate all of that now.

This will almost assuredly run Win10/DX12, and for the last 8 months AMD has been telling us about the CPU overhead and poor multi-core command submission found in DX11, and how AMD's multi-core technology will provide benefits. So, "even if it's not the best", working from a non-AMD CPU while hoping to tout what AMD has supposedly been telling us... is the dumbest move.

The executives and board of AMD have been out to lunch for way too long. This is not like shooting yourself in the foot; it's more akin to blowing one's knee off... it's stupid on so many levels. Sometimes you just need to learn to say NO!


----------



## GorbazTheDragon (Jun 22, 2015)

Replying to first few comments...

Although it has been done, using FX CPUs on ITX boards is a stupid idea. Not because of the TDP, but rather because you have to cram both an NB and SB onto the board. On top of that, the socket is clearly not designed to fit on such a small PCB.

And regardless, even if they made some custom form-factor board specifically for the device, the CPU could still be a bottleneck when running at a reasonable power level. I don't think AMD should trust their FX processors in something like that, or any of their CMT architectures for that matter. What is silly is how the title is phrased in a way that makes it seem surprisingly bad, while the article is written in a way that puts FX forward as a completely viable option. They went with Intel for a reason.


----------



## Lionheart (Jun 22, 2015)

REAYTH said:


> So much butt hurt in this thread.



Your avatar pic & comment go so well together, brilliant!


----------



## Evildead666 (Jun 22, 2015)

Casecutter said:


> I've got to say I see no good reason for AMD to show this: it wastes design resources and money, and earns no prestige (heck, a backlash).  It all comes across as "see what we could do/did!"  I just ask: why spend time and money on this... now?
> 
> I might have said, *fine, AMD, you're going "all in" on a gaming machine* built for platform longevity and cost-effective production scaling; do it like an Xbox or PlayStation. Sure, you've got one wildly compact cooler layout and an innovative form factor, sweet! After that, though, it amounts to nothing more than a super-fancy home ITX build from standard off-the-shelf parts. Costly, and it leaves itself open to messing with.
> 
> ...



Interposer the size of a mobo....
You _do_ realize that the current interposer size on Fiji is the largest it can possibly be atm, due to the lithography process?
This is a prototype. They are not going to be showing off Zen silicon a full year or so in advance.
Just about everyone knows that AMD CPUs are not up to scratch atm, and that the i5/i7 K series are the only option for a high-end gaming rig.

Whatever AMD had chosen, for some it would always have been the wrong choice.... ffs


----------



## ensabrenoir (Jun 22, 2015)

......aaaaawwwww come on guys........ is this the best you can do?






I'm sorry so many people are getting so upset... but man, I love new-release time..... most fun I've had all day!  THANKS GUYS!!!!!!


----------



## Casecutter (Jun 22, 2015)

Evildead666 said:


> Interposer the size of a Mobo....
> You _do_ realize that the current interposer size on Fiji is the Largest it can possibly be atm, due to the lithography process ?


No, it would be nowhere near as big as an ITX mobo.  But that's what I expect of tech: find a way to "get-r-done" and start advancing interposer technology now; that's what HBM and smaller nodes can do for us.  I was just saying, if you want to dream... dream big.


----------



## Evildead666 (Jun 22, 2015)

Casecutter said:


> No, it would be nowhere near as big as an ITX mobo.  But that's what I expect of tech: find a way to "get-r-done" and start advancing interposer technology now; that's what HBM and smaller nodes can do for us.  I was just saying, if you want to dream... dream big.



It might get close with the Zen-based Opterons. They should have HBM onboard, which is close to what you're talking about.
They won't have the GPU power, though.

Who knows? In 2020, we might have an APU that would wipe the floor with Fiji


----------



## jumpman (Jun 22, 2015)

Quite the shitstorm one article title has brought up. The title is too biased and condescending. Why even put something in the title about AMD trusting its own processors? It's not like TPU is new and inexperienced. All of this could easily have been avoided with a more unbiased, less click-bait title: "AMD Project Quantum driven by Intel". Instead we have the author frantically trying to defend his position on why he wrote the title, and taking a poke at AMD every chance he gets.


----------



## rainzor (Jun 22, 2015)

rvalencia said:


> Stop being a hypocrite. I haven't seen you using NVIDIA's 386 CPU with their GPU. Read http://www.nvidia.com/page/uli_m6117c.html



Nvidia does not possess an x86 licence and never did. What you linked there is a ULi product; Nvidia acquired ULi some time ago.


----------



## Relayer (Jun 22, 2015)

*So, AMD should have used their CPUs that don't support PCI-E 3 for dual Fiji GPUs?* btarunr is really upsetting people with the recent negative articles/editorials. What's the point of portraying AMD so badly?


----------



## rvalencia (Jun 22, 2015)

rainzor said:


> Nvidia does not possess an x86 licence and never did. What you linked there is a ULi product; Nvidia acquired ULi some time ago.


So what? That doesn't address the hypocritical view from the article.

https://en.wikipedia.org/wiki/X86
Open:
Partly. For some advanced features, x86 may require a license from Intel; x86-64 may require an additional license from AMD. *The 80486 processor has been on the market for more than 20 years[1] and so cannot be subject to patent claims. The pre-586 subset of the x86 architecture is therefore fully open.*


----------



## xenocide (Jun 23, 2015)

Relayer said:


> *So, AMD should have used their CPUs that don't support PCI-E 3 for dual Fiji GPUs?* btarunr is really upsetting people with the recent negative articles/editorials. What's the point of portraying AMD so badly?


 
That's a failure on AMD's part.  They can't pretend to be a player in the CPU market when they are failing to implement even basic standards like PCI-express 3.0 and SATA 3.0. 



rvalencia said:


> So what? That doesn't address the hypocritical view from the article.
> 
> https://en.wikipedia.org/wiki/X86
> Open:
> Partly. For some advanced features, x86 may require a license from Intel; x86-64 may require an additional license from AMD. *The 80486 processor has been on the market for more than 20 years[1] and so cannot be subject to patent claims. The pre-586 subset of the x86 architecture is therefore fully open.*


 
You omitted the important part: "For some advanced features, x86 may require license from Intel; x86-64 may require an additional license from AMD."  So Nvidia could make a clone of the 486 but couldn't use the majority of modern instruction set extensions (pretty much all of them) and would be confined to 32-bit.  It's not hypocritical; Nvidia doesn't make PC CPUs, you're just being thick.


----------



## rvalencia (Jun 23, 2015)

xenocide said:


> That's a failure on AMD's part.  They can't pretend to be a player in the CPU market when they are failing to implement even basic standards like PCI-express 3.0 and SATA 3.0.
> 
> 
> 
> You omitted the important part; "For some advanced features, x86 may require license from Intel; x86-64 may require an additional license from AMD."  So Nvidia could make a clone of the 486 but couldn't use a majority of the modern Instruction Sets (pretty much all of them) and would be confined to 32-bit.  It's not hypocritical, *Nvidia doesn't make PC CPU's*, you're just being thick.


That still doesn't address the hypocritical view from Anton Shilov's article. Hint: FIND a Mini-ITX motherboard for an AMD FX-based CPU.

Transmeta was able to offer an x86-64 solution. NVIDIA has a non-exclusive license to Transmeta's LongRun and LongRun2 technologies and other intellectual property.


http://www.nvidia.com/docs/IO/145393/NVIDIA-Jetson-Pro-Development-Kit.png
NVIDIA's ARM desktop-type motherboard... it's a POS for desktop usage.

http://www.cnx-software.com/2013/03/21/seco-mitx-gpu-devkit-features-nvidia-tegra-3-supports-cuda-5/
The SECO mITX GPU DEVKIT that supports the Nvidia Tegra 3, with a *PCI-E x16 connector* (PCI Express x4), is another POS for desktop use.







NVIDIA's ARM-based Mini-ITX near-desktop solution with a PCI-E slot, from SECO.


Anton Shilov is a hypocrite.

At least AMD still uses its own x86-64 ISA IP in Intel CPUs (i.e. the Itanium ISA failed), while NVIDIA uses ARM ISA IP.

Your own "Nvidia doesn't make PC CPUs" statement shows you're being a hypocrite.


----------



## xenocide (Jun 23, 2015)

You keep using that word, I don't think you know what it means.

Transmeta did *NOT* offer an x86-64 CPU; they offered a 128-bit VLIW CPU (Crusoe) that could translate x86 instructions (32-bit) at a noticeable performance loss.  Their second offering was a 256-bit VLIW CPU (Efficeon) that could translate x86 instructions (32-bit), but it still ran like crap.  So no, they didn't really offer an x86-64 solution.  Also, are you sure you want to cite a company that absolutely _tanked_?

As for your Nvidia examples: the first is a Jetson devkit, which, if anyone is wondering, is used to develop ARM software for *automobiles* and has never been available to consumers.  The second is pretty similar; the CARMA devkit was intended to allow people to use a desktop setup to develop natively for ARM CPUs for various devices.  All sites that sold the now-discontinued product stressed that it was not for consumers or enthusiasts and was only available to companies for professional use.  So neither of these is an example of Nvidia making a desktop PC platform; they were both ARM-based solutions for professional development.


----------



## rvalencia (Jun 23, 2015)

xenocide said:


> You keep using that word, I don't think you know what it means.
> 
> Transmeta did *NOT* offer an x86-64 CPU; they offered a 128-bit VLIW CPU (Crusoe) that could translate x86 instructions (32-bit) at a noticeable performance loss.  Their second offering was a 256-bit VLIW CPU (Efficeon) that could translate x86 instructions (32-bit), but it still ran like crap.  So no, they didn't really offer an x86-64 solution.  Also, are you sure you want to cite a company that absolutely _tanked_?
> 
> As for your Nvidia examples: the first is a Jetson devkit, which, if anyone is wondering, is used to develop ARM software for *automobiles* and has never been available to consumers.  The second is pretty similar; the CARMA devkit was intended to allow people to use a desktop setup to develop natively for ARM CPUs for various devices.  All sites that sold the now-discontinued product stressed that it was not for consumers or enthusiasts and was only available to companies for professional use.  So neither of these is an example of Nvidia making a desktop PC platform; they were both ARM-based solutions for professional development.


You are still making excuses for NVIDIA. Can't you see the double standard?  *FIND a Mini-ITX motherboard for an AMD FX-based CPU.*


http://www.legitreviews.com/kontron...op-with-ktt30mitx-arm-based-motherboard_13647
"Kontron Brings NVIDIA Tegra 3 To Desktop With KTT30/mITX ARM-Based Motherboard"
It has a PCI-E x1 slot, LOL...

Modern x86 CPUs *translate* variable-length, complex (CISC) x86 instructions into a fixed-length, RISC-like internal ISA.

The AMD K5 recycled investments from the AMD 29K RISC core design. The Intel Pentium Pro (P6) was a CISC-x86-to-RISC CPU product, and the Pentium II (P6) was a follow-on design from the Pentium Pro.

The Intel Itanium (another VLIW, aka EPIC, design) includes x86-32 compatibility.

AMD, Intel and VIA haven't built variable-length CPU cores for a long time. Note why x86 CPUs have kept up with and beaten RISC-ISA CPUs like PowerPC, MIPS and Alpha: x86 CPUs assimilated RISC design concepts.

From http://archive.arstechnica.com/cpu/4q99/risc-cisc/rvc-6.html
_
Both the Athlon and the P6 run the CISC x86 ISA in what amounts to hardware emulation, but they translate the x86 instructions into smaller, RISC-like operations that fed into a fully post-RISC core.  Their cores have a number of RISC features (LOAD/STORE memory access, pipelined execution, reduced instructions, expanded register count via register renaming), to which are added all of the post-RISC features we've discussed.  The Athlon muddies the waters even further in that it uses both direct execution and a microcode engine for instruction decoding.  A crucial difference between the Athlon (and P6) and the G4 is that, as already noted, the Athlon must translate x86 instructions into smaller RISC ops. _


*There is nothing new about Transmeta's translation CPU design*; i.e., Intel smashed Transmeta's low-power x86 solution with Intel Centrino (P6+, Pentium M, Core Duo, Core 2 Duo mobile).
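For anyone curious what that CISC-to-RISC translation looks like in spirit, here is a toy sketch in Python. Every name and op here is invented for illustration (a real decoder works on binary instruction encodings, in hardware): a single memory-operand x86-style ADD becomes the load/modify/store micro-op sequence the Ars excerpt describes, while a register-register ADD maps one-to-one.

```python
# Toy illustration only, NOT a real decoder: splitting a variable-length
# CISC-style instruction into fixed-length, RISC-like micro-ops, in the
# spirit of the Athlon/P6 front-ends discussed above.

def decode(cisc_instr):
    """Translate one x86-style instruction (as a dict) into micro-op tuples."""
    op = cisc_instr["op"]
    if op == "ADD_MEM":  # e.g. ADD [mem], reg -> load / modify / store
        return [
            ("LOAD",  "tmp", cisc_instr["addr"]),        # read memory into a temp
            ("ADD",   "tmp", "tmp", cisc_instr["reg"]),  # register-only ALU op
            ("STORE", cisc_instr["addr"], "tmp"),        # write the result back
        ]
    if op == "ADD_REG":  # register-register ADD maps 1:1 to a single micro-op
        return [("ADD", cisc_instr["dst"], cisc_instr["dst"], cisc_instr["src"])]
    raise ValueError(f"unknown op {op!r}")

# One "CISC" instruction fans out into three micro-ops:
for uop in decode({"op": "ADD_MEM", "addr": "0x1000", "reg": "eax"}):
    print(uop)
```

The point of the split is that each micro-op touches either memory or registers, never both, which is exactly the LOAD/STORE discipline the quoted article attributes to the post-RISC cores.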








----------



## xenocide (Jun 23, 2015)

rvalencia said:


> You are still making excuses for NVIDIA.
> 
> http://www.legitreviews.com/kontron...op-with-ktt30mitx-arm-based-motherboard_13647
> "Kontron Brings NVIDIA Tegra 3 To Desktop With KTT30/mITX ARM-Based Motherboard"
> ...


 
No, I'm pointing out facts.  You're misleading and obfuscating information.  Let's see this Tegra 3 desktop board.  Oh look, it's made by a third party and not marketed directly by Nvidia.  On top of that, it's designed for embedded devices like kiosks and toll booths and crap, not for consumers, and was intended to compete with the Raspberry Pi.  Also, it has an mPCIe slot, not a PCI-E x1 like a normal desktop computer.  The mPCIe is likely for various third-party add-on boards, not something like a GPU, because the Tegra 3 has a GPU in it already.  Still not a consumer desktop PC solution.

But let's throw all the niche products out and just look at the basics.  Nvidia is a graphics company that in the past 5 years has expanded to mobile SoCs.  Intel is a CPU company that in the past 5 years has focused on improving the integrated graphics in its CPUs.  AMD has been a CPU company for 46 years, and for the last 10 (after buying ATi) has had a graphics division.  A majority of AMD's staff is dedicated to CPUs.  Their company was founded on making CPUs.  Their high point was when they were offering CPUs that beat Intel's.  And you're not seeing a bit of sadness in the fact that they, a CPU company, are using their only competitor's products in something they are marketing?


----------



## Relayer (Jun 23, 2015)

xenocide said:


> That's a failure on AMD's part. They can't pretend to be a player in the CPU market when they are failing to implement even basic standards like PCI-Express 3.0 and SATA 3.0.



That's a completely different subject. I'm just explaining why "AMD doesn't _trust_ their own CPUs".

AMD isn't an equal player to Intel. Maybe some people haven't gotten the memo? Hopefully Zen will change that. We could sure use some competition in the desktop CPU space.


----------



## lilhasselhoffer (Jun 23, 2015)

After 7 pages of this crap, I think it's time that we all come to some conclusions.

1) The article title was click-bait.  There is no objective reason it was chosen, and the fact that it has already been changed indicates the careless nature of its construction.
2) Quantum is a stupid product.  It isn't going to see life anywhere in the consumer space, so it's just a useless toy for AMD to show off their GPUs fitting into an SFF PC.
3) Fan boy arguments are winning out here.  Absolutely nothing about Nvidia is discussed, but somehow the discourse has become Nvidia vs AMD vs Intel.  Once we reach that point there's nothing useful left to say.
4) TPU as a whole finds it less than acceptable to devolve into a more traditional media outlet.  Look through this thread and, amid the fan boy rage, you'll see people calling out the author for fronting opinion as news.  That is unacceptable, and hopefully will be rectified.  If we had two articles, one an editorial about how the inclusion of an Intel CPU showed AMD had weak CPU offerings and the other a news article about the specifications, this would not have been as controversial as it was.


So, allow me to indelicately add a stopping point to this discussion.
1) Intel is a company so rigid that they make the SS seem like a humane police force.
2) AMD's directors have a larger PR problem than the 1945 German fascist party.
3) Ummm... Nvidia wishes the world would bow down to its graphical superiority, and allow its chosen people to rule the PC master race... (yeah, stretching it, but I'm out of ideas here)...

I've invoked Godwin's Law.  Hopefully now we can recognize that this discussion has become a toxic cesspool of fanboy rage.


Edit:
My conclusion 2 may be incorrect.  I'm not stripping it from the above, but it needs to be stated that it may be incorrect.  Time will shortly tell.


----------



## Prima.Vera (Jun 23, 2015)

Relayer said:


> *So, AMD should have used their CPUs that don't support PCI-E 3 for dual Fiji GPUs?* btarunr is really upsetting people with the recent negative articles/editorials. What's the point of portraying AMD so badly?



Why don't their CPUs support PCI-E 3?? Whose fault is that??  
BTA is right. Even AMD admitted with this that their CPUs are complete junk for gaming.


----------



## Relayer (Jun 23, 2015)

Prima.Vera said:


> Why don't their CPUs support PCI-E 3?? Whose fault is that??
> BTA is right. Even AMD admitted with this that their CPUs are complete junk for gaming.



Fault? I'm simply stating a fact. Anyone with two live brain cells to rub together would understand that.


----------



## Caring1 (Jun 23, 2015)

AMD can't win; there is always going to be someone critical of what they do, just like there is with the green team.
If they had used all AMD components, people would whinge that the system was optimized to obtain certain results. By using the competitor's CPU they are basically saying our system runs great regardless of what CPU is used.
Let's leave it at that.


----------



## rvalencia (Jun 23, 2015)

xenocide said:


> No, I'm pointing out facts.


You haven't posted any facts.



xenocide said:


> You're misleading and obfuscating information.  Lets see this Tegra 3 Desktop Board.  Oh look, it's made by a Third Party and not marketed directly by Nvidia.


You're misleading and obfuscating information, since:
1. Kontron's Tegra 3 mITX desktop solution is similar to NVIDIA's Jetson TK1 mITX from https://developer.nvidia.com/jetson-tk1

You can buy NVIDIA's Jetson TK1 from the usual desktop PC stores like *Newegg, Tiger Direct and Micro Center*.

2. AMD Quantum's mITX solution is made by ASRock, i.e. a 3rd party.




xenocide said:


> On top of that, it's designed for embedded devices like kiosks and toll booths and crap, not for consumers, and was intended to compete with the Raspberry Pi.  Also, it has a mPCIe, not a PCI-e 1x like a normal desktop computer.  The mPCIe is likely for various 3rd party addon boards not something like a GPU, because the Tegra 3 has a GPU in it already.  *Still not a consumer desktop PC solution.*


Again, you are still making excuses for NVIDIA.

You can buy NVIDIA's Jetson TK1 mITX from the usual PC stores like Newegg, Tiger Direct and Micro Center.

http://www.newegg.com/Product/Produ...m_re=nvidia_jetsen_tk1-_-13-190-005-_-Product

http://www.tigerdirect.com/applications/searchtools/item-details.asp?EdpNo=9085339

http://www.microcenter.com/product/...d_Core_ARM_Cortex-A15_CPU,_2GB_Mem,_16GB_eMMC

Depending on PCI address allocation, a 1x mPCIe slot can support a full-blown GPU card with an adapter. http://www.netstor.com.tw/_03/03_02.php?OTc

As an owner of a Dell Studio XPS 1645 laptop with an ExpressCard slot, I was able to connect and run an external GPU card with a *PE4H* adapter.

View  http://forum.notebookreview.com/threads/diy-egpu-experiences.418851/page-313#post7256409  for my external GPU results.

My Samsung laptop's 1x mPCIe slot is the same as ExpressCard, i.e. I would need a ribbon extender with an mPCIe to PCI-E 16x slot adapter, and hope Samsung hasn't gimped my laptop's PCI address allocation functions. As you can see, I have experience with small PCI-E slots.

PCI-E adapters like the *PE4H* provide 75 W of PCI-E power. I can recycle the PE4H and change the interface card from ExpressCard to mPCIe.


*Your argument that this is "still not a consumer desktop PC solution" shows your double standards. AGAIN: find a mITX motherboard with an AMD FX CPU.*

Since NVIDIA's ARM-based mITX solution doesn't run Windows x86-64 or Linux x86-64, it's unsuitable for protecting x86 PC software investments.

I run the AmiDuo beta (an Android 5.x x86 native build VM with an ARMv7 emulator) on my x86 PCs.

I'm sure NVIDIA would be happy if somebody ported the open-source Windows NT/XP ReactOS ARM build with a JIT x86 emulator to their ARM-based mITX. Hint: Windows NT 4.0 DEC Alpha edition with FX!32 (an x86 CPU JIT emulator).




xenocide said:


> But lets throw all the niche products out and just look at the basics.  Nvidia is a Graphics company, that in the past 5 years has expanded to Mobile SOCs.  Intel is a CPU company that in the past 5 years has focused on improving Integrated Graphics in the CPUs.  AMD has been a CPU company for 46 years, and in the last 10 (after buying ATi) has had a Graphics division.  A majority of AMD's staff is dedicated to CPU's.  Their company was founded on making CPU's.  Their high point was when they were offering CPU's that beat Intel's.  And you're not seeing a bit of sadness in the fact that they, a CPU company, are using their only competitors products in something they are marketting?


Again, you're still making excuses for NVIDIA and you can't see your double standard.  *AGAIN: find a mITX motherboard with an AMD FX CPU.*

PS: "Dev motherboard" is just an excuse for their solution not being desktop-PC ready, i.e. I have seen the same "dev motherboard" excuses from the wannabe desktop PowerPC camp.

NVIDIA's "develop solutions in computer vision, robotics, medicine, security, and automotive" sounds like wannabe desktop PowerPC camp's marketing.





Prima.Vera said:


> Why their CPUs don't support PCI-E 3?? Whos fault is that??
> BTA is right. Even AMD admitted with this that their CPUs are complete junk for gaming.


Find a mITX motherboard with an AMD FX CPU. The AMD FX CPU is fine for DirectX 12.


----------



## john_ (Jun 23, 2015)

Anyone commented on Nvidia abandoning Denver cores? Nope.

Anyone commented on Apple using Samsung hardware? Nope.

Anyone said something about Intel using PowerVR GPU? Yes, insults from the moderator. Probably hit a nerve there.

Anyone said anything for all those Samsung phones using Qualcomm processors? Nope.


All these ignored. The target is AMD anyway. Anything else is irrelevant.


----------



## xenocide (Jun 23, 2015)

AMD CPUs don't have a mITX board because their CPUs use too much power, and AMD realized Bulldozer/Piledriver was a flop.  mITX is a relatively new form factor, and AMD hasn't updated their chipsets since Piledriver launched back in 2012, about the time these ultra-small-form-factor computers started becoming popular.  If AMD wanted a mITX board, they could have one, but they don't, and their hardware is not good enough to work in such a setting.  It just boils down to the fact that AMD, a primarily CPU company, can't even justify using their own products in a device they are selling.




john_ said:


> Anyone commented on Nvidia abandoning Denver cores? Nope.
> 
> Anyone commended on Apple using Samsung hardware? Nope.
> 
> ...


 
Has Nvidia officially abandoned Denver cores?  They released the Tegra K1 last year, which featured a dual-core Denver variant, and it was pretty well received.  Apple uses hardware manufactured by Samsung foundries, but it's not like they buy Exynos CPUs and stick them in iPhones.  Intel isn't a graphics company; they are a CPU company that is investing in iGPUs to help hold down their market share.  I don't doubt what they learn in the PC world will eventually translate to mobile; they've only been serious about iGPUs for about 3 years.  Samsung probably used Qualcomm chips to get their phones out the door with comparable performance while they refined the manufacturing of their own chips.  Every report I've read says they have already dropped them.

Happy?  I addressed every one.


----------



## john_ (Jun 23, 2015)

xenocide said:


> Has Nvidia officially abandoned Denver Cores?  They released the Tegra K1 last year which featured a dual Denver core, and it was pretty well recieved.


Do you see any new products with Denver cores? Do the latest Nvidia products use cores from ARM and not Denver?



> Apple uses hardware manufactured by Samsung foundries, but it's not like they buy Exynos CPU's and stick them in iPhones.


Apart from all that Samsung hardware, what SoC do they use in their smartwatch?



> Intel isn't a Graphics company, they are CPU company that is investing in iGPU to help hold down their marketshare.


Do they have GPUs? Yes. Do they have more than 60% of the graphics market? Yes. So? For Intel we are full of excuses. For AMD we forget that they create GPUs; it suits us here to consider them a CPU company. We even ignore that the event was about a GPU. For AMD we only know how to point the finger at them.



> I don't doubt what they learn in the PC world will eventually translate to the mobile, they've only been serious about iGPU's for about 3 years.


Your dreams for Intel are irrelevant here. I could also say that Zen will conquer the world. You would still ignore it and point at the FX. Double standards.



> Samsung probably used Qualcomm chips to get their phones out the door with comperable performance while they refined the manufacturing of their chips.  Every report I've read said they already dropped them.


Oh come on. Can you even read what you are writing? All these excuses are valid for AMD also, but NOOOOOO! Excuses, excuses, excuses for the others, and for the same, exactly the same things, complaints against AMD.



> Happy?  I addressed every one.


Of course I am happy. You proved ALL my points. I couldn't be happier, to tell you the truth. I am even considering upvoting your post. Thanks!


----------



## joyman (Jun 23, 2015)

Joseph Goebbels would be proud of TPU and some of its readers. They do the unimaginable to defend their bending of the information. It's a worldwide trend to invent news and mold people's minds. Perhaps it's a paid chore, or done because it's fashionable. Whichever it is, it's utterly wrong and not an honorable thing to do. If you want to uphold your reputation as the biggest and best tech website, you had better change back to what you were. And it's not only the news; the reviews show the same trend. Be better than this; don't fall into the traps for the gullible, because I doubt you take money to do this.


----------



## ensabrenoir (Jun 23, 2015)

.......just wait until the NDAs lift...... then the real fun begins.  Oh.... and as a side note to all AMD users: most people don't care one way or another..... some just enjoy seeing you guys get riled up...... (goes off and orders 5 FM2+ systems for the local daycare)


----------



## wiak (Jun 23, 2015)

I was right that AMD had Quantum systems with both Intel and AMD CPUs:
http://www.tomshardware.com/news/amd-project-quantum-intel-cpu,29430.html


----------



## AsRock (Jun 23, 2015)

Relayer said:


> *So, AMD should have used their CPUs that don't support PCI-E 3 for dual Fiji GPUs?* btarunr is really upsetting people with the recent negative articles/editorials. What's the point of portraying AMD so badly?



Which makes me think they would have used their own CPU too, but the one they want to use is not even finished yet.

I keep hearing they are changing socket and bringing new chips to the table, so why would they use tech that they don't plan to have in the finished product in the first place?


----------



## lilhasselhoffer (Jun 23, 2015)

So, seems as though this discourse has been put to rest:
http://www.techpowerup.com/forums/t...oject-quantum-as-options.213762/#post-3303488

I won't state anything else here, but needless to say the debate seems to have been rather conclusively answered.


----------



## rvalencia (Jun 24, 2015)

xenocide said:


> AMD CPU's don't have a mITX board because their CPU's use too much power and AMD realized Bulldozer/Piledriver was a flop.   mITX is a relatively new form factor and AMD hasn't updated their chipsets since Piledriver launched back in 2012--about the time this ultra small form factor computers started becoming popular.  If AMD wanted a mITX board, they could have one, but they don't, and their hardware is not good enough to work in such a setting.  It just boils down the fact that AMD, a primarily CPU company, can't even justify using their own products in a device they are selling.


That still doesn't address your double-standard viewpoints.

AMD doesn't mass-produce desktop motherboards, i.e. that's the 3rd party's job. There are mini-ITX motherboards that support AMD Piledriver-based APUs, but those are limited to two Piledriver modules with 4 CPU threads.

I had a mini-ITX build with an Intel Core i7-2600 "Sandy Bridge" and a Silverstone SG07 case, and I switched back to micro-ATX with an Aerocool DS case for Intel "Devil's Canyon", i.e. *the mini-ITX form factor is not new and it's not recent.* I switched back to micro-ATX since I have an ASUS Xonar D2 sound card.

There are SFF Micro-ATX cases.


----------



## Durvelle27 (Jun 24, 2015)

For all the naysayers:

"We have Quantum designs that feature both AMD and Intel processors, so we can fully address the entire market. I'm sure you've heard AMD leaders speak before about how we're driving growth in the company and our key businesses, and that one of the key strategies we have for doing that is listening to customers.

You may have heard at the recent AMD financial analyst day that Lisa Su described Job #1 as "Build Great Products." In the case of buyers for systems like Project Quantum, there is a clear preference for choice; they're not interested in a narrow range of computing solutions - they want to pick and choose the balance of components that they want, that are hand-tailored in a world of off-the-rack-suits.

With a product as compelling as R9 Fury, we are extremely pleased to enable as much success as we can. There is a range of technology options for CPU in Project Quantum… but the real star is Radeon Fury."

Directly from AMD


----------



## HumanSmoke (Jun 29, 2015)

john_ said:


> Do you see any new products with Denver cores? Does the lattest Nvidia products use cores from ARM and not Denver?


Well this seems all a bit off topic - your bailiwick it seems.
You must be new to technology.
*Project Denver is ARM-based.*


john_ said:


> Do you see any new products with Denver cores?


You mean like the X1-powered Shield? AFAIA, the X1 is being marketed for automotive in-car features (sensors, cameras) and entertainment systems, but it is being validated for consumer products aside from the Shield.
You seem to think that Project Denver is something other than what it was always purported to be (you're not alone; Charlie D. couldn't work it out either). Maybe this from a couple of months back will shed some light.


----------



## john_ (Jun 30, 2015)

HumanSmoke said:


> Well this seems all a bit off topic - your bailiwick it seems.
> You must be new to technology.
> *Project Denver is ARM based*
> 
> ...




*"Sometimes it is better to chew than to talk,"* from a gum advertisement.

Tegra X1: The Heart Of the SHIELD Android TV - The NVIDIA SHIELD Android TV Review: A Premium 4K Set Top Box
*Octa core, with 4 A57 cores and 4 A53 cores, not Denver cores.*

*Nvidia made two versions of K1*. One with *2 Denver* cores and one with *4 ARM cores*. They used Denver cores in


> Dual-core Denver CPU paired with a Kepler-based GPU solution (Tegra K1); the dual-core 2.3 GHz Denver was first used in the HTC Nexus 9 tablet, released November 3, 2014.[5][6]


 *based on your wiki page*.

*The X1 uses 8 ARM cores*. _*NO DENVER CORES*_. That's why they do not advertise the type of cores used in the new Shield device.

You may also want to look at the slides in the KitGuru article *that you also posted*. They say "*ARM cores*". So, *NO DENVER CORES*.








*You are full of smoke.*


----------



## wiak (Jul 3, 2015)

Xaled said:


> -When AMD uses its own cpu peple say:
> "Too sad for such  hardware to get bottlenecked by such cpu just because AMD wants to use its own hardware"
> 
> -when AMD uses Intel cpu they say:
> "AMD doesnt trust its hardware..


So true. If someone were to benchmark the Quantum PC with an AMD CPU, they would say it's slower than a pair of 960s in SLI...


----------



## R-T-B (Jul 4, 2015)

john_ said:


> *"Sometimes it is better to chew than to talk,"* from a gum advertisement.
> 
> Tegra X1: The Heart Of the SHIELD Android TV - The NVIDIA SHIELD Android TV Review: A Premium 4K Set Top Box
> *Octa core, with 4 A57 cores and 4 A53 cores, not Denver cores.*
> ...




Uh... are you seriously arguing Denver is x86 based on the fact two Denver cores were used in the Nexus 9 (which is an ARM based device?).  Or am I misunderstanding this?


----------



## Steevo (Jul 4, 2015)

R-T-B said:


> Uh... are you seriously arguing Denver is x86 based on the fact two Denver cores were used in the Nexus 9 (which is an ARM based device?).  Or am I misunderstanding this?




There is no mention of X86 anywhere in that post.


----------



## R-T-B (Jul 4, 2015)

Steevo said:


> There is no mention of X86 anywhere in that post.



Thanks.  I plead sleep deprivation.  That was pretty stupid, but I could've sworn I saw it earlier.... lol.


----------



## Steevo (Jul 4, 2015)

I do that too sometimes, or a lot of the time if I am honest.


----------



## john_ (Jul 5, 2015)

R-T-B said:


> Uh... are you seriously arguing Denver is x86 based on the fact two Denver cores were used in the Nexus 9 (which is an ARM based device?).  Or am I misunderstanding this?


It was just one of a few examples I used where a company doesn't use its own proprietary tech, and that didn't lead to an editorial and 8 pages of comments. No mention of x86.


----------



## R-T-B (Jul 5, 2015)

john_ said:


> It was just one of a few examples I used where a company doesn't use its own proprietary tech, and that didn't lead to an editorial and 8 pages of comments. No mention of x86.



Yeah, it's all good man.  Just misread your post.


----------



## john_ (Jul 5, 2015)

Yes I understood that. Just wanted to be clear.


----------



## Prima.Vera (Jul 5, 2015)

Either way, this was by far the worst marketing goof AMD pulled this year.


----------



## john_ (Jul 5, 2015)

It was just one more example of the press using double standards, and proof that many hardware fans are happy to point a gun at AMD. On the other hand, for similar cases involving other companies, logic prevails over mockery.


----------



## rvalencia (Jul 10, 2015)

Prima.Vera said:


> Either way, this was by far the worst Marketing goof AMD pulled this year.


Either way, your view is filled with double standards.


----------

