One of the things that's been annoying me recently is the very idea of a console video game having frame rate issues. The game that brought this to mind is Lego City: Undercover. Now, don't get me wrong: the game is supposedly pretty good. I'm not bashing the game itself; most of the reviews are positive, and since I haven't played it I'm not going to judge the gameplay.
My issue is that, according to most of those reviews, it has frame rate issues. What makes this so irritating is that a console, unlike a PC, is a fixed set of specs. You won't find an Xbox 360 with more memory than any other, or a PS3 with a faster processor than another PS3. They're all the same. There are no issues with viruses or drivers. Functionally, as far as designing a game goes, every unit of a console is completely and totally identical. That gives consoles the opportunity to have their games optimized for the hardware in a way you could never manage on PC. To have ANY performance issues when you know exactly what hardware you're designing for is, I think, completely moronic.
Whatever frame rate you're targeting, be it 30 or 60, it should be locked there, never wavering for a moment. This holds for games that release on multiple consoles, but it's especially true for games exclusive to a single console, as Lego City: Undercover is to the Wii U.
This is not an issue with the Wii U itself; it's an issue with any developer unwilling to properly optimize their games. I just feel that if you know precisely what hardware you're designing for, you should make the game run within the boundaries and limitations of that hardware. Every area of the game should be scaled up or down to hold your target frame rate and run smoothly. But maybe I'm just crazy.