Intel Arc B580 LE Graphics Card Review: Solid 1440p Gaming for a Little Less
Pros
- Good value
- Solid performance
- The extra VRAM really does help for floating-point AI
- Simple design
- DP 2.1 support
Cons
- Power requirements a bit high for upgrading some older systems
- Can struggle with some demanding 1440p games
Sliding in just below the Nvidia GeForce RTX 4060's typical price ($249 versus the 4060's roughly $280) and a little above its typical performance, Intel's Arc B580 graphics card makes a nice impression as a step-up-from-entry upgrade for older PCs. But not too old: if you decided to save money on the power supply when you bought or built your system, it may be borderline for this card. Thanks to the extra 4GB of video memory, the Arc B580 Limited Edition (Intel's house-branded card) offers a little more headroom than 8GB cards, enough to help boost performance for AI or bump you up a quality class for gaming. In the alphabet of branding, B stands for Battlemage.
But while it's a fine card, I'm not sure there's enough here to help Intel gain much more traction than the first generation did, and that gen barely registers in market share. Inertia is a huge factor.
This is the second generation of Intel's Arc discrete graphics, and the company is using the 12GB of VRAM to help differentiate it from Nvidia's 8GB cards. Gameplay at 1080p and 1440p generally uses a similar amount of video memory, or at least both fit comfortably into 8GB. Once you start adding some of the shiny (literally, in the case of ray tracing), the memory needs start to creep up, which may push you past the 8GB barrier. (The big jump is between 1440p and 4K.)
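For a rough sense of why the 1440p-to-4K jump is the big one, here's a back-of-the-envelope pixel-count comparison. It ignores textures and other assets, which usually dominate VRAM use, so treat it as illustrative only:

```python
# Rough pixel-count comparison across common gaming resolutions.
# Ignores textures, geometry and caches, which usually dominate VRAM use;
# it's only meant to show how the rendering workload scales.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f}M pixels ({pixels / base:.2f}x 1080p)")

# Approximate output: 1080p 2.1M (1.00x), 1440p 3.7M (1.78x), 4K 8.3M (4.00x)
```

Going from 1080p to 1440p is less than a doubling of pixels, while 1440p to 4K is a 2.25x leap, which is where render targets and higher-resolution assets start pressing on an 8GB card.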
Intel Arc B580 Limited Edition
| Spec | Detail |
| --- | --- |
| Memory | 12GB GDDR6 |
| Memory bandwidth (GBps) | 456 |
| Memory clock (GHz) | 2.375 |
| GPU clock (GHz) | 2.670 |
| Memory data rate/interface | 19Gbps/192 bits |
| Render slices/RT cores | 5/20 |
| Xe cores/texture mapping units | 20/160 |
| Shaders | 2,560 |
| XMX AI engines/peak integer TOPS | 160/233 |
| Process | 5nm |
| TBP/min PSU (watts) | 190/600 |
| Max thermal (degrees) | n/a |
| Bus | PCIe 4.0 x8 |
| Size | 2 slots; 10.7 x 4.5 inches (272 x 115mm) |
| Connections | 3x DisplayPort 2.1, 1x HDMI 2.1 |
| Current list price | $249 |
| Ship date | December 13, 2024 |
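As a quick sanity check, the bandwidth figure in the table follows directly from the memory data rate and the bus width. A minimal sketch of the arithmetic:

```python
# Peak memory bandwidth = per-pin data rate x bus width, converted to bytes.
data_rate_gbps = 19      # GDDR6 data rate per pin, in Gbps (from the spec table)
bus_width_bits = 192     # memory interface width (from the spec table)

bandwidth_gbps = data_rate_gbps * bus_width_bits / 8  # divide by 8 bits per byte
print(f"Peak bandwidth: {bandwidth_gbps:.0f} GB/s")    # -> 456 GB/s
```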
But the other part of the equation is how well a card performs on ray tracing and other DirectX 12 Ultimate features (as opposed to simply fitting them in memory), and there the RTX 4060 and B580 are about the same, which is to say just okay. Intel is counting on XeSS 2, its second-generation upscaling and optimization technology (which incorporates frame-interpolation technology similar to AMD's FidelityFX Fluid Motion Frames and Nvidia's DLSS 3.x), and aggressive upscaling to help sustain frame rates. AMD still lags woefully there, though.
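To see why upscaling props up frame rates, here's an illustrative calculation of how much rendering work an upscaler saves at 1440p output. The per-axis scale factors below are typical quality-preset values I'm assuming for the sketch, not official XeSS 2 figures:

```python
# Illustrative only: how much of the native pixel load an upscaler renders
# internally at 1440p output. Scale factors are assumed, generic preset values,
# not official XeSS 2 numbers.
target_w, target_h = 2560, 1440
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

target_pixels = target_w * target_h
for name, scale in presets.items():
    internal_pixels = (target_w / scale) * (target_h / scale)
    print(f"{name}: renders ~{internal_pixels / target_pixels:.0%} of native pixels")

# Roughly: Quality ~44%, Balanced ~35%, Performance ~25% of native 1440p pixels,
# with the upscaler (and optional frame interpolation) filling in the rest.
```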
For best results, XeSS 2 requires custom support from game developers (frame generation doesn't work without it, though other pieces of XeSS don't require it). And XeSS 2 support doesn't carry over from the first generation, so games that support the original version don't automatically support the latest. Developer integration is also required to take advantage of Intel's input-latency reduction, XeLL, which serves a similar purpose to Nvidia's Reflex.
At last count, Intel lists 43 games that support the first version of XeSS and a few that will support XeSS 2. Even if it's not an exhaustive list (and I don't think it is), there isn't a lot of motivation at the moment to buy a GPU that relies on it, no matter how good it might be. (I didn't test XeSS 2 because only one game supports it at the moment, F1 24, and there's no synthetic benchmark for it yet.)
On the other hand, Intel's Core Ultra 200V series (Lunar Lake) uses the same GPU technology for its integrated graphics (albeit stripped down to save power), and Intel's integrated GPUs technically dominate the market; XeSS 2 will run on Lunar Lake, which could be a big incentive for developers to support it.
Design and Performance
The extra memory, among other things, makes this card a bit more power hungry than some entry-to-midrange systems bought around 2020, the ones now ripe for upgrades, were built to handle. If you're one of the people who thought "I'll save money and get a 450-watt power supply," this card, with its 190-watt power draw, isn't a good fit, especially if you want to overclock it.
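A rough power budget shows why a 450-watt supply gets tight. The non-GPU numbers below are placeholder estimates for a typical midrange build, not measured figures:

```python
# Back-of-the-envelope power budget for pairing the 190W B580 with an older PSU.
# All non-GPU numbers are rough, assumed estimates, not measurements.
gpu_tbp_w = 190          # Arc B580 total board power (from the spec table)
cpu_w = 125              # assumed midrange CPU under gaming load
rest_of_system_w = 75    # assumed motherboard, RAM, storage, fans, peripherals

load_w = gpu_tbp_w + cpu_w + rest_of_system_w
for psu_w in (450, 600):
    headroom = psu_w - load_w
    print(f"{psu_w}W PSU: ~{load_w}W estimated load, ~{headroom}W headroom")

# A 450W unit leaves only about 60W of margin before transient spikes or
# overclocking; Intel's recommended 600W leaves a far more comfortable buffer.
```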
But the B570 may exist to fill that niche; we'll see. The B580 LE is shorter than the A750 was, but it still takes up two slots, so that may affect your decision. And given that it has only two fans, it seems a little long. Despite its length, though, it's pretty light, so it doesn't need a support bracket like so many GPUs do these days, not even an extra hand to hold it during installation (because I struggle with just two hands).
As mentioned earlier, I think the extra memory helped keep it from bottlenecking on generative AI, and it also helps if you feel like trying to play in 4K; but better than competitors doesn't necessarily mean usable. And it lags a lot in the SPECviewperf professional graphics tests at both 1080p and 4K, so technical graphics professionals may want to stick with Nvidia, not least because CUDA has been around for ages.
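To give a sense of why the extra 4GB matters for on-device AI, here's a rough estimate of model-weight footprints at different precisions. It ignores activations and other overhead, and the model sizes are generic examples rather than the workloads used in this review's benchmarks:

```python
# Rough VRAM needed just to hold model weights at various precisions.
# Ignores activations, caches and framework overhead; model sizes are generic
# examples, not the specific benchmarks run for this review.
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1e9  # result in GB

models = {"3B model": 3, "7B model": 7}
precisions = {"FP16 (2 bytes/param)": 2, "INT8 (1 byte/param)": 1}

for model, params in models.items():
    for prec, nbytes in precisions.items():
        gb = weights_gb(params, nbytes)
        fits = "fits in 12GB" if gb <= 12 else "exceeds 12GB"
        print(f"{model} @ {prec}: ~{gb:.0f}GB of weights ({fits})")

# A 7B model at FP16 (~14GB) spills past 12GB, while at 8-bit (~7GB) it fits
# with room to spare; an 8GB card gets squeezed even harder.
```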
For older games, even without XeSS it performs quite well at 1440p, and it holds up on Speed Way, 3DMark's DirectX 12 Ultimate benchmark (run at 1440p), which piles on all the whizzy technologies like ray tracing on Windows. But it lags a little on the DXR feature test, which just measures the ray-tracing component. Just a little, though; it's not running behind the train like AMD.
But for modern, GPU-heavy AAA games without XeSS support, you may end up dialing back quality or dropping the resolution to 1080p. I threw Indiana Jones and the Great Circle at it (1440p, high quality) and it struggled to reach 60fps.
Unfortunately, I didn't have time to test the new overclocking controls before I had to write this, but I hope to get to them in a future update.
I don’t envy Intel the Sisyphean task of rolling out a second generation of the cards, but the company’s put a lot of thought into them and seems determined to keep up the support cadence, with new drivers and so on. The Intel Arc B580 LE is pretty good if you need to adhere to a strict budget and certainly gives you a lot of GPU for the money.
Performance results
Configurations of test systems
| Test system | Configuration |
| --- | --- |
| Acer Swift 14 AI (SF14-51T-75AF) | Microsoft Windows 11 Home; Intel Core Ultra 7 258V; 32GB DDR5 RAM; Intel Arc 140V Graphics; 1TB SSD |
| Alienware Aurora R16 | Microsoft Windows 11 Pro; 2.5GHz Intel Core i9-14400KF; 32GB DDR5-5600; 12GB Nvidia GeForce RTX 4070 |
| Alienware m16 R2 | Microsoft Windows 11 Home 23H2; 1.4GHz Intel Core Ultra 7 155H; 16GB DDR5-5600 RAM; 8GB Nvidia GeForce RTX 4070 GPU; 1TB SSD |
| Alienware m18 R2 | Microsoft Windows 11 Home 23H2; 2.2GHz Intel Core i9-14900HX; 32GB DDR5-5600 RAM; 16GB GDDR6 Nvidia GeForce RTX 4090 GPU @ 175W; 2TB SSD |
| Apple Mac Mini M4 | Apple MacOS Sequoia 15.1; Apple M4 (10-core CPU, 10-core GPU); 16GB LPDDR5; 512GB SSD |
| Lenovo Legion T5 26IRB8 (90UT001AUS) | Microsoft Windows 11 Home; 2.5GHz Intel Core i5-14400F; 16GB DDR5-4400; 8GB Nvidia GeForce RTX 4060 |
| Minisforum AtomMan G7 Ti | Microsoft Windows 11 Home; 2.2GHz Intel Core i9-14900HX; 32GB DDR5-5600; 8GB Nvidia GeForce RTX 4070 mobile (140W) |