Opinion: What You Need to Know About Next-Gen Performance

There’s no question that the world is ready for the next generation of game consoles. The built-up pressure of anticipation, rampant rumors, speculation, and discussion happening online is all the evidence you need. In the time since the announcement of the PlayStation 4 and Xbox One earlier this year, details have been confirmed (and denied) and many arguments have been laid to rest. But one huge question still remains: exactly how do the PS4 and Xbox One compare in performance, and what does it mean to the average gamer? As the launch of the new consoles draws near, gamers of all walks of life are trying to decide what to buy, and I felt it was important to explore this question. And although we don’t yet have all the details about these new platforms, enough data is available to paint an interesting picture of the console landscape — at least for the near future.


Even way back in January 2013, when we knew the Xbox One and PlayStation 4 by the names Durango and Orbis, hints existed that Microsoft’s next console might not be as powerful as Sony’s. Leaks from various sources, including the infamous SuperDAE, indicated specification differences, and the numbers didn’t lie — on paper, the next-gen Xbox was bested in nearly every category by the PS3’s successor.

To the layman, the numbers are heavily skewed in favor of Sony’s console. But more informed minds weighed in and concluded that Microsoft may bring some “secret sauce” to the table in the form of Durango’s move engines and ESRAM. You don’t need to understand those complicated concepts to get this simple point — Microsoft’s exotic hardware choices might mean the numbers were not as clear-cut as it initially seemed.

When the next big consoles from Microsoft and Sony were finally unveiled in public, all the leaks seemed confirmed, and there was plenty of rightful speculation that the performance gap of the previous generation would not repeat itself, that a fair fight between evenly-matched platforms would ensue. However, in the months leading up to today, more rumors surfaced that took the wind out of the Xbox One’s “secret sauce”.

Edge Online reported in September that next-gen developers claimed the PlayStation 4 could be as much as 50% more powerful than its competitor, and that the Xbox One was more difficult to develop for due to its complex performance pipeline — a design decision originally intended to give the console a graphical edge. Kotaku recently shared more insight from unnamed developers, some claiming that Edge’s reported performance gap was real but fixable in the long run. One chalked it up to a “bumpy time for launch”, suggesting that the issues could be remedied with optimization and regular improvements to the system’s software development kit.

But others seemed less optimistic, stating that the graphical difference was a larger systemic problem with the Xbox One’s architecture. It seemed that the current generation’s performance situation had completely flipped on its head: Sony had learned from its mistakes with the Cell and Microsoft was perhaps resting on its laurels.

Until recently, these nebulous comments from unnamed developers were the only things fueling rumors of a performance gap. And then “resolutiongate” began.


If you frequent anywhere on the web that has to do with videogames, there’s nearly a 100% chance that you’ve seen some heated debate among gamers about the native resolution of various next-generation multiplatform games. Though discussions about resolution have been going on since the console announcements, they reached a fever pitch in the past few weeks as firm information about native resolutions began to slip out.

Before we go too deep down the resolution rabbit hole, let’s go over the basic technical details. In simple terms, resolution measures the pixel count of a display or video stream. Even if you don’t realize it, you probably have some basic knowledge of resolution if you’ve heard the terms 1080p or 720p. Those are just shorthand measurements for resolution that have become standards over the years.

Resolution is measured by pixel-width times pixel-height. Thus, a display or video with 1920 columns of pixels and 1080 rows would be notated as 1920 x 1080. (With progressive scan, that’s 1080p.) Because of the way resolution is measured, some simple third-grade math leads us to the total number of pixels. In the case of 1080p, it’s 2,073,600.

An average Joe may not think there’s much difference between 720p and 1080p. After all, 1080 is not that much bigger a number than 720, right? While that may be true when we’re talking about integers, resolution is a much different story. See, 720p video has a resolution of 1280 x 720, which equals a total pixel count of just 921,600 — less than half of 1080p. And when you take that smaller image and upscale it to the size of a 1080p display, image imperfections can become more obvious, and additional artifacts can be introduced. If you’ve ever tried to blow up an image to larger than its original size, you’ll be familiar with this concept.

One more resolution that will be helpful to know if you read further is 900p — 1600 x 900, or 1,440,000 pixels. Here’s a diagram comparing the relative sizes of 1080p, 900p, and 720p (click for full size):
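If you want to check the arithmetic yourself, here’s a quick sketch in Python using the resolution figures given above:

```python
# Compare the total pixel counts of the three resolutions discussed above.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "720p": (1280, 720),
}

def pixel_count(width, height):
    """Total pixels: columns times rows."""
    return width * height

full_hd = pixel_count(1920, 1080)
for name, (w, h) in RESOLUTIONS.items():
    total = pixel_count(w, h)
    print(f"{name}: {w} x {h} = {total:,} pixels ({total / full_hd:.0%} of 1080p)")
```

That last column is the real story: 720p pushes barely 44% of the pixels of 1080p, and 900p about 69%.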

As you can imagine with numbers like that, many gamers were concerned when news broke that Battlefield 4 would be running at 720p on Xbox One and 900p on PlayStation 4. 720p games are standard on Xbox 360 and PS3, so many expected next generation games to run at 1080p across the board. And beyond the lack of resolution improvements, the disparity between the two consoles was even more disconcerting. Was this finally the evidence of a real-world performance gap between the two future consoles?

When Digital Foundry released comparison footage of the two next-gen versions of Battlefield 4, debate raged about the visual differences and whether they really mattered. Writers and pundits repeated time and time again that the differences weren't that great, or that a determination couldn't be made until both consoles were in the hands of the public, but a vocal contingent of gamers seemed ill-satisfied by those answers.

Debate intensified when confirmation came from Infinity Ward after weeks of speculation that Call of Duty: Ghosts would be running at 720p on Xbox One and native 1080p on PlayStation 4. To some, this seemed to be the nail in the coffin that confirmed the next Xbox just wasn’t as powerful as Sony’s competitor.

At the time of this writing, Call of Duty: Ghosts and Battlefield 4 are the only solid resolution differences we know of. Of the games launching on both Xbox One and PlayStation 4, FIFA 14, Need for Speed: Rivals, Madden NFL 25, and NBA 2K14 are known to be 1080p on both systems. Native resolutions for the remaining multiplatform launch games are currently unknown. However, multiplatform titles only tell part of the story. Many Xbox One-exclusive launch titles are running below the ideal target of 1080p: Killer Instinct, Dead Rising 3, and Powerstar Golf are all at 720p, with Ryse: Son of Rome at 900p. The only PlayStation 4 launch title known to run at less than 1080p is Battlefield 4, at 900p. (Ubisoft and Warner Bros. Interactive have not responded to requests about the resolutions of their multiplatform launch games for this story.)

The fervor that swept across message boards and social media didn’t go unnoticed by Microsoft, and their stock answer sounds a lot like many pundits’ — resolution just isn’t that big a deal. After all, it’s about the games, isn’t it? It’s clear that answer isn’t good enough for many gamers. So the question remains: which next-generation console is more powerful, and why does it matter to you?

After long nights of research and testing, I finally have an answer that’s good enough for me, and maybe it’s good enough for you, too.

The big sticking point about resolution differences lies in the subjectivity of real-world viewing conditions and human eyes. Plenty of folks are quick to suggest that “most people” or “the average gamer” couldn’t tell the difference between 720p and 1080p, even if the two were side-by-side. Yet others claim that they can tell the difference without a doubt. I don’t know if you can tell the difference, but I sure can. And I bet that mythical “average” gamer could as well. Why? It’s all about viewing distance.

According to the most recent data available, the median HDTV size in the US is around 46 inches. The top four online electronics retailers in the US say that among TVs at or near that size, 1080p sets are in the vast majority. In fact, today there are very few TVs above 40 inches that are not 1080p capable. Estimates about the average television viewing distance are much harder to come by, but we can assume it’s somewhere in the range of 12-16 feet based on typical living room sizes for homes and apartments in the US.

Those who suggest that resolution differences aren’t that noticeable often turn to charts that purport to show the optimal viewing distance for televisions of various sizes and resolutions. Here is one such example, featured in Kyle Orland’s op-ed piece for Ars Technica entitled, “Why I’m not too worked up about the next-gen console resolution wars”:

The composite average household we’ve just created exists within the red-bordered section of the chart, which suggests that with the “average” setup, 720p is “enough” and anything more is wasted. Interestingly enough, my personal setup (a 42-inch TV viewed at 12 feet) falls right within that area, as indicated by the dot on the chart. I often notice the difference between watching 720p and 1080p content, so the entire chart seemed suspect to me. In order to rule out any possible video compression issues, I decided to set up a test that mimicked a more realistic scenario — playing games.
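For the curious, charts like that one are typically derived from a simple geometric model: 20/20 vision is commonly said to resolve detail down to about one arcminute. That one-arcminute figure is my assumption about how such charts are built, not something from Orland’s piece, but the math behind them is easy to sketch:

```python
import math

def arcmin_per_pixel(diagonal_in, horizontal_pixels, distance_in):
    """Angular size of a single pixel, in arcminutes, on a 16:9 display."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    pixel_pitch = width_in / horizontal_pixels       # inches per pixel
    return math.degrees(math.atan(pixel_pitch / distance_in)) * 60

# My setup: a 42-inch TV viewed from 12 feet (144 inches).
for label, cols in [("1080p", 1920), ("720p", 1280)]:
    print(f"{label}: {arcmin_per_pixel(42, cols, 144):.2f} arcmin per pixel")
```

Under that model, both resolutions come out below one arcminute per pixel at my viewing distance, which is exactly why such charts mark my setup as past the point where extra pixels matter. My own testing suggests real eyes aren’t so easily modeled.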


Using my gaming PC, I enlisted my wife as my lovely assistant and set up a blind test that pitted games running at 1080p against the exact same sections of those games running at 720p upscaled to 1080p. The games tested were Batman: Arkham City, Portal 2, and Hard Reset, each of which I played several sections of. In each case, I was able to reliably tell the difference between the two resolutions at a normal viewing distance with near 100% accuracy. I can’t say the games looked twice as good in 1080p as the pixel counts might suggest, but they did look better. In particular, jaggies were much more noticeable at the lower resolution, and textures looked muddy.

This test was enough to persuade me that these viewing distance charts are bunk for my personal gaming habits and for the TV setup of a typical US household. But furthermore, the charts don’t even take into account the very atypical setups that gamers tend to have. Consoles are often set up in dorm rooms, offices, and bedrooms that are much smaller than the average living room, and because games are an interactive “lean-forward” medium, my experience is that gamers tend to sit closer to their screens than a passive TV watcher. In fact, when playing competitive games, I often sit as close as five feet away, encroaching on the above chart’s “Ultra HD” territory.

And yet, in spite of what the last 500 words might suggest, I don’t think resolution is the problem; it’s merely the symptom of a much larger performance difference than the numbers show, and you can thank Mark Rubin for that verdict.

As an executive producer at Infinity Ward, Rubin has more insight than most into these issues. In an interview published by Eurogamer, Rubin talked about the resolution difference between the two next-gen versions of Call of Duty: Ghosts, attempting to explain why the Xbox One version runs at a lower resolution. His answer? “I don't know if I can point to one particular cause.” There are hints at problems with the Xbox One SDK and operating system, but little real confirmation. Indeed, Rubin seemed hesitant to share too many details in a second interview with British tabloid Metro, although he did reveal perhaps the most important piece of the puzzle: that the resolution was “the only difference” between the Xbox One and PS4 versions of the game.

A rough rule of thumb in PC game benchmarks is that if you double the resolution of a game at a given graphics setting, you’ll cut the frame rate roughly in half. It’s been clear for a long time that the Xbox One and PS4 are essentially x86 PCs, so even though it’s not perfect, this same equation should hold relatively true. And that means that we can assume that Call of Duty: Ghosts running at 1080p on Xbox One would run at about 30 frames a second (in contrast to the PS4 version running at 60). Those numbers could be very important in establishing a real-world comparison of the consoles’ performance.
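That back-of-the-envelope estimate can be written out. The linear pixel scaling is the benchmark rule of thumb described above, not a measurement, and the 60 fps starting figure at 720p on Xbox One is my assumption based on the frame rates this article cites:

```python
def estimated_fps(measured_fps, measured_pixels, target_pixels):
    """Rough estimate assuming GPU cost scales linearly with pixel count.

    This is the benchmark rule of thumb, not a precise model; memory
    bandwidth and CPU limits can break the linearity in practice.
    """
    return measured_fps * measured_pixels / target_pixels

pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600

# If Ghosts holds 60 fps at 720p on Xbox One (an assumption, not a
# confirmed figure), pushing it to native 1080p would land around:
print(f"{estimated_fps(60, pixels_720p, pixels_1080p):.0f} fps")  # ~27 fps
```

Roughly 27 fps by this estimate, which squares with the “about 30 frames a second” figure once you allow for some optimization headroom.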

Although we don’t know the exact specs of the GPUs in either next-gen system, some tech-savvy gamers have crunched the numbers and assumed that the Xbox One’s graphics are quite similar to the Radeon HD 7770, while the PS4’s are closer to the Radeon HD 7850. My own estimations based on comparing frame rate differences in benchmarked games line up pretty well with those hypotheses.


The next generation of games means change, improvement, and progress. More realistic textures, lighting, and effects are not going to completely change the way we play games, but the power under the hood of the Xbox One and the PlayStation 4 will inevitably lead to more seamless open worlds with a wider variety of activities to do and more interesting ways to do them. Take Dead Rising 3, for instance. While early footage shows it isn’t the most graphically stunning game, there’s no way the current generation of consoles could maintain that level of fidelity with huge numbers of enemies on screen.

After all the research and testing I’ve done for this story, there’s no question in my mind that the PlayStation 4 is not only more powerful than the Xbox One, but significantly more so than the initial numbers suggest. More horsepower opens the door for more interesting gameplay decisions and more immersive worlds. And yes, the games will look better.

Ultimately, I don’t know what the right console choice is for you. There are positive and negative aspects of both, and plenty of factors to consider. Some may consider it shortsighted to make a decision about which to pick up based solely on performance and graphics, but it’s clearer than ever that many people care about those factors. And beyond that, with this much money on the line, every gamer should be as informed as possible in making the decision for themselves.