Retesting Ghost of Tsushima with a fresh Arc A770 shows that Intel has a mountain to climb to compete against AMD and Nvidia

 A photo of an Intel Arc A770 Limited Edition graphics card, installed in an open-chassis PC.
Credit: Future

Last month, I put Ghost of Tsushima through a benchmarking meat grinder, testing every graphics preset across multiple CPUs and GPUs. Upscaling and frame generation were also explored to see how good a job Nixxes Software had done porting the PS4 hit to the mighty gaming PC. For the most part, everything ran really well, making the game an enjoyable experience on PC. Well, apart from on two of the GPUs used for the analysis: an AMD Radeon RX 7800 XT and an Intel Arc A770.

The former would only run for three or four seconds of gameplay before crashing to the desktop, and while the latter did work, it produced some odd visual glitches and the overall performance could have been a lot better.

Intel kindly reached out and offered to help with this, lending me an Arc A770 Limited Edition, as I'd used an Acer Predator version previously. For some reason, that particular graphics card was a tad temperamental at times, especially during the BIOS phase of booting. I wasn't overly convinced a new GPU would help but I gave it a go. Or tried to, at least.

Intel's drivers refused to install, even after I'd cleared out every trace of the previous versions. Even a reinstallation of Windows didn't help. After much head-scratching, the solution turned out to be rather simple, if somewhat old-school: manually unzip the driver package and then install the drivers via Device Manager. That's not a method one would expect to be using in this day and age, but at least it worked.
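For anyone stuck in the same spot, here's a rough command-line equivalent of that Device Manager dance, wrapped in a small Python sketch. The C:\ArcDriver path is just a placeholder for wherever you unzip the package, and pnputil needs to be run from an elevated prompt.

```python
# Minimal sketch of installing an unzipped driver package from the command line,
# assuming the package has been extracted to C:\ArcDriver (a placeholder path).
# pnputil is the built-in Windows driver utility; run this with admin rights.
import subprocess

extracted = r"C:\ArcDriver"  # wherever the unzipped .inf files actually live

# /add-driver with a wildcard stages every .inf it finds, /subdirs searches
# nested folders, and /install attempts to install them on matching devices.
subprocess.run(
    ["pnputil", "/add-driver", rf"{extracted}\*.inf", "/subdirs", "/install"],
    check=True,
)
```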

Anyway, with everything finally installed, I could get on with the testing. The good news is that the previous rendering glitch was gone, and the new Arc card ran through all of the upscaling and frame generation tests without any major issues, unlike before.

The not-so-good news is that the results for the new Arc weren't all that different from the old card's: a little faster at 1080p, but worse at 1440p and 4K. At least it all ran correctly, and I could properly examine Intel's upscaler in the game.

XeSS produced a handy performance bump, of course, but pairing it with AMD's FSR 3 Frame Generation wasn't as successful as combining DLSS upscaling and FSR 3 FG on the RTX 3060 Ti. Running Ghost of Tsushima at 4K Very High, with XeSS Ultra Performance and frame gen enabled, only resulted in an average frame rate of 64 fps.

That might sound quite decent, but the Ampere-powered RTX card pulled 107 fps on a Ryzen 7 5700X3D machine, using DLSS Ultra Performance and FSR 3 frame gen.

No other graphics card tested in the game showed as large a change in frame rate, going from the High to Very High preset, as the Arc A770. So I went back and tested all of the quality settings, to see if I could pin down exactly what the issue was in this game. The biggest culprit turned out to be the volumetric fog option.

At 1080p, with the graphics options set to the Very High preset, the Arc A770 ran at 40 fps with very high-quality fog and 61 fps with high-quality fog, an uplift of more than 50%! While the other cards also ran better using high-quality fog, compared to the very high-quality setting, the gains were nowhere near as large.
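For the record, those two frame rates work out like this:

```python
# Quick check of the fog-quality uplift, using the frame rates measured above.
fog_very_high = 40  # fps, Very High volumetric fog
fog_high = 61       # fps, High volumetric fog

uplift = (fog_high / fog_very_high - 1) * 100
print(f"High fog is {uplift:.0f}% faster than Very High fog")  # ~53%
```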

So why is the A770 running so poorly compared to all of the other cards used in the analysis? The first thing I did was run some GPU traces using Intel's Graphics Performance Analyzers (GPA) software, comparing the rendering workloads of the two fog modes, but there was nothing to suggest that the GPU itself was bouncing off any particular limit.

And it's not as if it's lagging in hardware stats, as the table below shows:

On paper, the Arc A770 should be as fast as, if not faster than, the Radeon RX 6750 XT and RTX 3060 Ti in Ghost of Tsushima. But as my tests have shown, it's so far behind them that there must be something about the architecture and/or drivers that just doesn't like the rendering workload.

It could be a coding issue in the game itself, of course, but as we noticed in our A770 review, lacklustre performance isn't limited to just one game (though, to be fair, those results were taken with older drivers). Sometimes Intel's Alchemist GPU performs exactly as expected; sometimes it's a complete mystery as to what's going on.

To try and investigate this further, I turned to a Vulkan compute performance tool created by Nemez on X, which assesses a GPU's capabilities by running through multiple instruction and cache tests. While the results can't be used to directly analyse why the A770 struggles so much in Ghost of Tsushima, they do show that Alchemist's performance is a bit of an enigma.

FP32 multiply instructions are extremely common in graphics routines, and not only is the Alchemist chip well off the pace of the RDNA 2 and Ampere GPUs, it's also well down on its own theoretical peak throughput. Hitting the full rate isn't always possible, even in tests like this one, but the A770's result is still far lower than it should be.
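To give a sense of where that theoretical peak comes from, here's a quick back-of-the-envelope calculation. The figures are the commonly quoted A770 specs of 4096 FP32 lanes and a boost clock of around 2.1 GHz; they're my assumptions, not numbers taken from the tool.

```python
# Back-of-the-envelope peak FP32 throughput for the Arc A770.
# Assumed specs: 4096 FP32 lanes (32 Xe-cores x 16 vector engines x 8 lanes)
# and a ~2.1 GHz boost clock. Adjust if your card clocks differently.
fp32_lanes = 4096
clock_hz = 2.1e9

# An FMA counts as two floating-point operations per lane per clock...
peak_fma_tflops = fp32_lanes * 2 * clock_hz / 1e12  # ~17.2 TFLOPS
# ...while a plain multiply retires one operation per lane per clock.
peak_mul_tflops = fp32_lanes * 1 * clock_hz / 1e12  # ~8.6 TFLOPS

print(f"Peak FMA: {peak_fma_tflops:.1f} TFLOPS, peak multiply: {peak_mul_tflops:.1f} TFLOPS")
```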

However, in the other throughput tests, the A770 is really good. It's not lacking for internal bandwidth and there's no sign of high cache latencies, yet it suffers far more than the competition when dealing with high resolutions or heavy rendering in Ghost of Tsushima.

Intel has been fully committed to issuing regular driver updates for its Arc graphics cards, but I think drivers can only go so far now—after all, support for Ghost of Tsushima was added in the 5518 driver set and we're already two releases past that (5522 and 5534).

Ultimately, whatever the issues are, they almost certainly lie in the Alchemist architecture. The Battlemage GPU in Lunar Lake chips looks very promising, and some of its architectural changes should help a lot. The only problem is that the competition is already significantly ahead: AMD's $500 Radeon RX 7800 XT is a perfect example of what Battlemage is going to be facing.

Ghost of Tsushima has been patched a few times since release, and one of those updates was to improve stability for Radeon GPUs. A full run of benchmarks showed that Nixxes had definitely fixed the crashing, as the RDNA 3-powered card had no trouble running the game at 1080p and 1440p.

Only at 4K did it begin to struggle, but even then it wasn't super slow, and a spot of upscaling made it very playable. And it's been like that in all the games I've tested so far with that graphics card.

One can rightly point out that the Navi 32 chip in the RX 7800 XT is far more capable than the ACM-G10 in the Arc A770, but only because it has dual-ALU shaders (thus doubling the FP32 throughput) and more cache and VRAM bandwidth. At a resolution of 1440p, the advantage they offer isn't anywhere near as big as it is at 4K.
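Putting rough numbers on that dual-ALU advantage, again using commonly quoted specs as assumptions (3840 shaders at around a 2.43 GHz boost clock for the RX 7800 XT, and best-case dual-issue FP32 on RDNA 3):

```python
# Rough on-paper FP32 comparison; specs are assumptions, not measured values.
# RDNA 3's dual-issue ALUs can retire two FP32 instructions per shader per
# clock in the best case, which is where the 'doubling' comes from.
a770_tflops = 4096 * 2 * 2.1e9 / 1e12           # ~17.2 TFLOPS (FMA = 2 ops/clock)
rx7800xt_tflops = 3840 * 2 * 2 * 2.43e9 / 1e12  # ~37.3 TFLOPS (dual-issue FMA)

print(f"On paper, the RX 7800 XT is ~{rx7800xt_tflops / a770_tflops:.1f}x the A770")
```

In practice, that best-case doubling rarely shows up in full, which is why the gap on paper is bigger than the gap you see at 1440p.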

Nvidia dominates the discrete GPU market, so Intel needs to look at stealing some of AMD's share, regardless of how small it is. But when an RX 7800 XT is an astonishing 132% faster than an Arc A770 at 1440p High, and an RX 6750 XT is 68% faster, it does make me wonder if Battlemage's leap in performance over Alchemist is going to be big enough.

One game certainly isn't indicative of a GPU's overall performance, but it does suggest that, when it comes to GPU gains, Intel has a veritable mountain to climb.