
Intel Arc B580

I want one in eGPU format (like the OneXGPU and similar) for 250 EUR.
I think OCuLink + eGPUs are the future; I'm tired of these cards.
 
In what world is an RX 6800 a "low end card"? Why would anyone put a $1700 USD video card in the presentation?
Exactly. Cards like the RX 6500 XT and RTX 3050 6GB are low-end cards. The RX 6800 is upper mid-range or lower high-end.
 
"No DLSS" as a negative and no word about the excellent video encoder as a positive.
Weird conclusion.
 
"No DLSS" as a negative and no word about the excellent video encoder as a positive.
Weird conclusion.
It is part of the greater Ecosystem.
 
now tell it to the consumers xd
You mean the consumers who have bought out the US launch stock? I think they know.

 
In order for XeSS to compete with DLSS, Intel would have to be paying every dev to add it to a game, and I don't see that happening when Nvidia is likely paying more.
I wouldn't put it past Nvidia to only allow devs to add DLSS first, or to have exclusivity agreements.
Since the fiasco of AMD (likely) blocking DLSS in Starfield, Nvidia very promptly and publicly stated their position on this, so I doubt there would be shenanigans there, as it would open them up to an epic shit storm like the one AMD faced over even just the allegations and evidence.

NVIDIA does not and will not block, restrict, discourage, or hinder developers from implementing competitor technologies in any way. We provide the support and tools for all game developers to easily integrate DLSS if they choose and even created NVIDIA Streamline to make it easier for game developers to add competitive technologies to their games.

And since Streamline, Microsoft has now developed DirectSR, which at a basic level does what Nvidia's own Streamline does: it lets developers add in any (and, more importantly, all) upscalers they want with minimal effort. I concede there is some wiggle room for 'shenanigans', but I don't think any company wants to go through what AMD did over it and Starfield, and I would wager that at the very least sponsored games will not have upscaling exclusivity at a contractual level.
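Conceptually, what an abstraction layer like that buys you is simple, something along the lines of this rough Python sketch (the names are made up for illustration; this is not the actual Streamline or DirectSR API):

```python
# Rough sketch of an upscaler abstraction layer. All names are invented for
# illustration; the point is that the game talks to one interface and any
# vendor's upscaler can be plugged in behind it.

from abc import ABC, abstractmethod


class Upscaler(ABC):
    """Common interface every vendor upscaler would implement."""

    @abstractmethod
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        ...


class DLSSUpscaler(Upscaler):
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        # a real integration would call into the vendor SDK here
        return low_res_frame


class FSRUpscaler(Upscaler):
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        return low_res_frame


class XeSSUpscaler(Upscaler):
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        return low_res_frame


# Adding another vendor is one registry entry; the game code never changes.
AVAILABLE_UPSCALERS = {
    "dlss": DLSSUpscaler(),
    "fsr": FSRUpscaler(),
    "xess": XeSSUpscaler(),
}


def render_frame(frame, motion_vectors, settings):
    upscaler = AVAILABLE_UPSCALERS[settings["upscaler"]]
    return upscaler.upscale(frame, motion_vectors, settings["target_resolution"])
```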
 
Since the fiasco of AMD (likely) blocking DLSS in Starfield, Nvidia very promptly and publicly stated their position on this, so I doubt there would be shenanigans there, as it would open them up to an epic shit storm like the one AMD faced over even just the allegations and evidence.

NVIDIA does not and will not block, restrict, discourage, or hinder developers from implementing competitor technologies in any way. We provide the support and tools for all game developers to easily integrate DLSS if they choose and even created NVIDIA Streamline to make it easier for game developers to add competitive technologies to their games.

And since Streamline, Microsoft has now developed DirectSR, which at a basic level does what Nvidia's own Streamline does: it lets developers add in any (and, more importantly, all) upscalers they want with minimal effort. I concede there is some wiggle room for 'shenanigans', but I don't think any company wants to go through what AMD did over it and Starfield, and I would wager that at the very least sponsored games will not have upscaling exclusivity at a contractual level.
Do you really believe that was AMD's doing when Starfield launched on the consoles at the same time? How do you know it was not Sony that demanded that FSR be implemented first? Do you think Sony cares about DLSS in games on their console? There is no proof other than conjecture that AMD had anything to do with Starfield not supporting DLSS at launch. I guess it showed what raw raster performance is like with Nvidia. Please provide sources from AMD to support this internet-created narrative.
 
Energy Efficiency ??
B580 100%
4060 94%
How?

The B580 uses 44.5% more power in gaming vs. the RTX 4060,
and it's only 5% faster than the RTX 4060 at 1080p.
 
Energy Efficiency ??
B580 100%
4060 94%
How?

The B580 uses 44.5% more power in gaming vs. the RTX 4060,
and it's only 5% faster than the RTX 4060 at 1080p.

Read the text above the table:

"Energy Efficiency calculations are based on measurements using Cyberpunk 2077. We record power draw and FPS rate to calculate the energy efficiency of the graphics card as it operates."

Calculated from one game.
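In other words, the efficiency score is just FPS per watt in that one game (Cyberpunk 2077), normalized to the best card, while the 44.5% power and 5% performance figures come from the averages across the whole test suite, so the two can disagree. Roughly, the math is something like this (the FPS and power numbers below are placeholders I picked only to reproduce the 100%/94% ratio, not TPU's actual measurements):

```python
# Efficiency as FPS per watt in a single game, normalized to the best result.
# The FPS and power values are placeholders chosen only to reproduce the
# 100% / 94% ratio from the chart; they are not TPU's measurements.

measurements = {
    # card: (average FPS in that game, average board power in watts)
    "Arc B580": (75.0, 180.0),
    "RTX 4060": (60.0, 153.0),
}

fps_per_watt = {card: fps / watts for card, (fps, watts) in measurements.items()}
best = max(fps_per_watt.values())

for card, value in sorted(fps_per_watt.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {100 * value / best:.0f}%")
# Arc B580: 100%
# RTX 4060: 94%
```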
 
Do you really believe that was AMD's doing
Yes, absolutely. I don't need to see irrefutable proof to hold this belief (although I'd happily consume said proof if it ever surfaced, no matter what outcome it dictated), and I don't need to justify my belief to you or any of the other volunteer AMD marketing department fans on the internet. So unless you or anyone else has proof to show that they didn't do it, my belief holds firm.

In any event I also believe the outcome of the fiasco (no matter what the whole truth even was), is a net benefit to all PC gamers.
 
ah yeah good point, I didn't think of that.

Honestly, for a 1080p gamer, a 5700X3D (or whatever the cheapest X3D CPU is these days on the AM4 platform) combined with this Intel GPU would be a really solid 1080p rig. Sharing a 1440p benchmark was probably not fair of me given the argument I was making originally.

People talk about how awesome the Intel GPUs are, but look at the market share: 2%, and this year 0%.

Lots of talk, but not so much buying.
 
You mean the consumers who have bought out the US launch stock? I think they know.

First of all, I really want this GPU to make some noise because of the awful behavior from Nvidia and AMD, with their high prices and 8GB/128-bit trash, despite how much I dislike Intel as a company.
But I'm skeptical and will have to wait a few months, or even half a year.
Launch stock is not a big deal right now, since we don't know how large it was, and the card isn't even in 95% of stores yet.
 
Do you really believe that was AMD's doing when Starfield launched on the consoles at the same time? How do you know it was not Sony that demanded that FSR be implemented first? Do you think Sony cares about DLSS in games on their console? There is no proof other than conjecture that AMD had anything to do with Starfield not supporting DLSS at launch. I guess it showed what raw raster performance is like with Nvidia. Please provide sources from AMD to support this internet-created narrative.
With Starfield launching on consoles at the same time, it makes sense to support AMD's FSR first, and I have yet to see proof of AMD actually blocking DLSS. The fact that the Nvidia mindshare keeps repeating it and only has Starfield as an example proves the DLSS marketing works. If a dev implements FSR first, everyone loses their minds, but if DLSS comes first and FSR isn't implemented until later, or the FSR implementation is bad, no one seems to care.
 
Fantastic card and a great mid-range option. I may pick one up.

But I am also excited to see what else they will release for battlemage.
 
Good to see Intel's GPU rise up to the occasion here. It basically puts pressure on AMD and Nvidia, especially the former. Nvidia will likely still be the preferred brand for many, but at least there is a good alternative that is reasonably priced and specced. It is a slap in AMD's face to wake them up as well because they are severely lagging in their GPU development. To me, I don't see significant improvement between RDNA2, 3, and 3.5 (whatever fancy name they want to give it).

One thing TPU is missing is frame time testing. GamersNexus did some, and in 3 of the 5 games tested, the Intel card had significantly better frame times despite not being that much faster than the 4060.

8GB is obsolete.
Hopefully this will go away as drivers mature. The frame time consistency problem has always been a bane for Intel GPUs. The average frame rate and even the 0.1 and 1% low may look fine on the surface, but a game can look quite juddery in reality due to the frequency of spikes. But overall, I think Intel have made a significant improvement with Battlemage as compared to Alchemist.
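For context on why the averages and lows can hide that judder: both are usually derived from the frame time trace, so fairly frequent but short spikes can still produce numbers that look acceptable on paper. A rough sketch of the usual math (reviewers compute the lows slightly differently from one another, and the frame times below are invented purely for illustration):

```python
# One common way average FPS and 1% / 0.1% low FPS are derived from a frame
# time trace, and why frequent small spikes can still look fine on paper.
# The frame times below are invented for illustration.

def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS) for a frame time trace."""
    ordered = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = len(ordered)

    def low_fps(percent):
        worst = ordered[: max(1, int(n * percent / 100))]
        return 1000.0 / (sum(worst) / len(worst))

    avg_fps = 1000.0 / (sum(frame_times_ms) / n)
    return avg_fps, low_fps(1), low_fps(0.1)


# A ~33 ms hitch every 10th frame in an otherwise smooth ~60 FPS run:
# the average and the lows still read as "playable", but a stutter every
# tenth frame is very visible in motion.
trace = [33.3 if i % 10 == 0 else 16.7 for i in range(10_000)]
print(fps_metrics(trace))  # roughly (54 FPS avg, 30 FPS 1% low, 30 FPS 0.1% low)
```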
 
@ir_cow based on the reviews I have seen from him, an extra 200 MHz while maintaining 1:1 literally will not matter at all in games (versus the standard 6000, maybe 0.5 fps gained on avg).
It's about keeping the 1% lows from getting too low.
Correct. It's mostly just to shut up people who keep nagging about slow memory speeds.
About time you came into the present age. Thank you very much. :)
 
Hopefully this will go away as drivers mature. The frame time consistency problem has always been a bane for Intel GPUs. The average frame rate and even the 0.1 and 1% low may look fine on the surface, but a game can look quite juddery in reality due to the frequency of spikes. But overall, I think Intel have made a significant improvement with Battlemage as compared to Alchemist.

I think he was saying that Intel's frame time was better than Nvidia's 60% of the time.

Something I noticed watching Linus' review, the 1% lows were very good on Battlemage.

Also on Linus' review, the B580 overall was better than is reflected on TPU. It's really destroying the 4060 here, and is right on the heels of the 6700 XT.

Also, note the difference from its real previous-gen counterpart, the A580. Edit: And it did that with 20% fewer cores than the A580.

So, B770 should be interesting.

 
Thank you W1zzard for this much needed review.


I just looked at Newegg pricing today for the giggles.

The regular RX 6800 has seen a 20% price increase since Black Friday, when you could have picked up a refurb for $310.00.

Are these companies that effing STUPID? They jacked up the prices before Black Friday, then jacked them up again so they can give the illusion of a discount during the last 2 weeks of the year???

Screw that.

As much as we love to laugh at Intel, it looks like they did a very good job with the price-vs-performance of their video card.

This is something I'm looking at to replace my RX 5700.
 
My biggest concern for this card was the drivers. HWUB stated Intel had to update the drivers to get some games to run stable; glad they were able to quickly fix the issues. It seems to be a far better launch than the previous generation. Congratz, but there is still a lot of work to be done. If they stay in the GPU race for another generation, I might even scoop one up to play with.
 
Reviews are good. I fear the MSRP will only be a dream until we see whatever Nvidia and AMD respond with. Something like an 8700 XT or 5060 Ti could get the same MSRP or $20 more. I want a race to the bottom, though. I am so tired of these ridiculous prices.
 
So where it needs those fps the most, it fails to deliver vs. the competition 2/3 of the time. Hopefully the silver lining is that with driver improvements this can be fixed.
I don't think it's the drivers at this point in time; there seem to be certain workloads that just tank performance. Something about their architecture makes it awfully ineffective at times.

Do you really believe that was AMD's doing when Starfield launched on the consoles at the same time? How do you know it was not Sony that demanded that FSR be implemented first?
People are really dumb and don't understand how this works: if a game is sponsored by AMD or Nvidia, the devs are going to include their sponsor's features, but everything else is extra work. It's not a matter of blocking anything, but rather of including the feature that your sponsor wants, with everything else being optional from your perspective. FSR runs on everything, DLSS and XeSS don't, so it's no surprise FSR is probably the first upscaler that gets implemented.

Not to mention how laughable it is to think that the GPU maker with a tiny market share gets to tell Microsoft and Bethesda what to do.

Besides, it's very easy to tell if their intentions are malicious. Does the sponsor's feature disproportionately affect the performance/quality of other vendors? If the answer is no, the proposition is idiotic from the get-go, as there is nothing to be gained from it; FSR runs and looks the same on everything.
 
With Starfield launching on consoles at the same time, it makes sense to support AMD's FSR first, and I have yet to see proof of AMD actually blocking DLSS. The fact that the Nvidia mindshare keeps repeating it and only has Starfield as an example proves the DLSS marketing works. If a dev implements FSR first, everyone loses their minds, but if DLSS comes first and FSR isn't implemented until later, or the FSR implementation is bad, no one seems to care.
The most amusing part to me of the entire ordeal is that if AMD had just promptly released a media statement denying it, I'd have believed them. You can claim mindshare and marketing all you like, but understand that even if that's the case, then it also goes both ways ;)

.... In any case, back on topic.

B580 all sold out here locally from what I can tell, and understandably at such sharp local pricing where we usually get screwed for tech. Makes me very keen to see what B750 and B770 will be able to achieve.
 
It's prob not even stocked yet.

Pre-orders were up as far back as the preview, but they have likely burned through the projected inventory.
 