Heard the latest scuttlebutt about the Wii U’s system specs? No, I don’t mean the one about Nintendo’s next console being 50% more powerful than a PlayStation 3 or Xbox 360. That one dates back to June—ancient history in gaming years—and was based on claims supposedly made by developers then translated by a market research firm.
No, I’m talking about the story from Wii U Daily (henceforth, WUD), sourced to a Japanese developer porting a PS3 game over to the new system. WUD’s source claims the Wii U’s specs will be:
- Quad-core, 3 GHz PowerPC-based 45nm CPU, very similar to the Xbox 360 chip
- 768 MB of DRAM “embedded” with the CPU, and shared between CPU and GPU
- Unknown 40nm ATI-based GPU
WUD has a dog in this fight by virtue of its existence—it’s a Wii U fan site, after all—so make no assumptions about this information’s veracity, but in theory, it means the system has plenty of oomph under the hood. System architectures are notoriously hard to wrangle into apples-to-apples comparisons, but the latest version of the Xbox 360, by comparison, employs a tri-core, 3.2 GHz PowerPC-based 45nm CPU, has 512MB of RAM and uses a 90nm ATI-based GPU. Factor in a six-years-newer GPU and PPC-based CPU architecture, the benefits of the extra core, and more and faster RAM, and it’s easy to see where claims that the Wii U would be “50% faster” were coming from, even if a gain that size isn’t surprising across a six-year interval. (I won’t bother listing the PS3’s specs here, because they’re dissimilar enough to be confusing.)
Given any of the above, when the Wii U arrives next year, we’ll be looking at three more-than-capable systems, two of which can still hold their own (in most ways) against today’s desktop PCs. And chances are, given development budgets and current-gen design wisdom, Wii U games are going to look a lot like what we’ve grown accustomed to over the last half decade: great, that is, but not remarkably different on a pixel-for-pixel level.
The one detail worth pulling out is the Wii U’s declared support for 1080p. That’s 1920 by 1080 pixels, equal to (or better than) most computer monitors, on desks where anything over 22 or 24 inches seems like overindulging (you can swing up to 2560 by 1600 pixels, but who outside graphic design has space for a 30-plus-inch display?).
The Wii, by comparison, tops out at a paltry 640 by 480 pixels. Most televisions sold today are either 720p or 1080p. Running a Wii game on a 720p or 1080p screen results in interpolation, or “pixel stretching,” making aliasing (the jagged stair-step effect along edges) more pronounced and leaving the video feed looking like it’s trapped behind dirty glass. It’s more a technology “mismatch” issue than a Wii shortcoming, but it’s annoying enough to interfere with my enjoyment of Wii games not piped through an old-school CRT (which no one sells anymore, or wants to cram into an entertainment center now that flat screens and high-def are everywhere).
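For the curious, the mismatch is easy to quantify with back-of-the-envelope math (a quick sketch; the figures come from the resolutions above, and the function is mine):

```python
def pixels(width, height):
    """Total pixel count for a display mode."""
    return width * height

wii_max = pixels(640, 480)      # the Wii's top output mode (480p)
full_hd = pixels(1920, 1080)    # the Wii U's claimed 1080p support

# A 1080p panel pushes 6.75x the pixels of the Wii's native output...
ratio = full_hd / wii_max

# ...and the vertical stretch is a non-integer factor, so each Wii
# scanline gets smeared across fractional screen pixels when upscaled,
# which is the "dirty glass" interpolation effect described above.
vertical_scale = 1080 / 480

print(ratio)           # 6.75
print(vertical_scale)  # 2.25
```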
Now that Nintendo’s joined the HD club, I’m washing my hands of system spec showdowns. I’ve stopped caring about texels and shaders and bump-mapping. Who outside the development biz really cares? Does anyone care whether Modern Warfare 3 looks better (or worse) than Battlefield 3? Is that what you’re really noticing while playing either? It’s time to leave behind the which-one’s-more-powerful debate. Games today look incredible enough, so much so that further visual upticks are irrelevant: scaffolding to aid the production, not upstage it.
Which means that from the Wii U, I’m expecting more than just a bunch of Nintendo games with a high-definition makeover. There’s plenty of reason to expect as much. No one on the hardware side—not Sony, not Microsoft—matches Nintendo’s gumption when it comes to thinking outside the box game-wise. When Zelda for the Wii U arrives, it won’t be Link’s high-def duds or Master Sword or slingshot and boomerang I’ll be scrutinizing; it’ll be what Nintendo’s done with the series after years of riffing on 1998’s Ocarina of Time.