Wii Video

Hi there. If you're reading this, somebody wanted to explain something about the video output of the Nintendo Wii but it would have been too long to tell you personally. So here's this gist to give a rundown of some of the reasons why Wii video looks the way that it does. We'll be going roughly in order from the most to least obvious considerations.

NB: All screenshots in this gist were originally captured on real hardware, but some scaling artifacts like aspect ratio are simulated via editing.

Table of contents

  1. Background
  2. Aspect Ratio
  3. The "Deflicker" Filter
  4. Pixel Aspect Ratio and viWidth
    1. Pixel Aspect Ratio
    2. viWidth
  5. Overscan
  6. 480p Bug
  7. PAL Video
    1. Embedded Framebuffer
    2. External Framebuffer and xfbHeight
  8. Conclusion

Background

The old joke during the Wii's lifespan was that it was just two GameCubes duct-taped together. This is false, mainly because the Wii is one-and-a-half GameCubes duct-taped together. Most of the hardware in a Wii is identical to what's in the GameCube, and the video interface is very much a part of that. This means that the Wii video hardware was designed for the world of ~2001, when the GameCube launched. The TVs were mostly analog cathode-ray tubes (CRTs) with a 4:3 aspect ratio hooked up via the decades-old composite standard. This was already changing by the time of the Wii's launch in 2006, and by the end of its lifespan in 2011 or so, the Wii was looking positively archaic, a relic from two generations ago. Still, that's our starting place. Let's go.

Aspect Ratio

This one is pretty obvious, but the Wii and most games released on it support two aspect ratios, the older 4:3 and modern 16:9 (widescreen).

Using the example of Mario Kart Wii, here's what 4:3 looks like:

MKW-43-default

And this is what 16:9 looks like:

MKW-169-default

Wait a minute. That already doesn't look right. It's not even any wider than 4:3. Why is Mario so skinny? What's happening?

At the time of writing, the Wii's video hardware is over 20 years old. While we often think of widescreen as a matter of resolution (more pixels on the sides = widescreen), things were different in the analog era. Widescreen was mostly achieved through a method called anamorphic widescreen (en.wikipedia.org). This is a way of encoding a widescreen image into the same resolution as a 4:3 image. You would basically try to cram more information into the same space, then stretch it back out again later.

If you'd like a bad analogy, imagine you're packing a suitcase for travel. It's already "full"; there's no space left. But if you sit on top of the suitcase, your clothes will squish down a little bit and you can stuff in a few extra shirts. The suitcase didn't get any bigger, but through the power of smushing, you put more stuff in it. The only thing is, your clothes are going to come out the other end a bit crumpled.

The squishy looking screenshot above is the contents of your overstuffed suitcase. It's squished in ready to be pulled back out, so let's go ahead and do that now:

MKW-169-default-stretch

Hooray, that's more like it!

On an original Wii, what you've done here is go into your TV's aspect ratio settings and switch it to "Full" or "16:9" or something similar. The Wii doesn't do this step for you; it happens externally to the console. Because of this, the exact impact of the stretch on the picture quality will depend on the scaling your device uses.

On a Wii U, the backward-compatible follow-up to the Wii, this step is usually handled automatically: the Wii U generally knows when the Wii mode wants to display in 4:3 or 16:9 and adjusts as appropriate.

So, what does this mean for our picture? It's been smushed and stretched back out, so it looks a bit crumpled. But hey, it's widescreen, so you can see more stuff to the side of the track. Ultimately, it's a compromise. The shirts got squished, but there's more of them. Do you want neater shirts, or more shirts? You can only pick one.
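If you're into code, here's a rough sketch of how this typically looks from the software side, using the homebrew libogc toolchain (that's my choice of example; retail games use Nintendo's SDK, and the field-of-view and clip-plane numbers below are placeholders, not taken from any real game). The game reads the console's 4:3/16:9 setting and simply renders with a wider projection into the same framebuffer, which is the "squishing" half of the anamorphic trick:

```c
#include <gccore.h>
#include <ogc/conf.h>   // CONF_GetAspectRatio() reads the Wii's system setting

// Sketch: pick the 3D projection's aspect based on the console's screen setting.
// In 16:9 mode the game renders a wider view into the same 640-wide framebuffer;
// the TV (or Wii U) stretches it back out later, i.e. anamorphic widescreen.
static void setup_projection(void) {
    Mtx44 proj;
    f32 aspect = (CONF_GetAspectRatio() == CONF_ASPECT_16_9) ? (16.0f / 9.0f)
                                                             : (4.0f / 3.0f);
    // 45-degree FOV and clip planes are placeholder values for illustration.
    guPerspective(proj, 45.0f, aspect, 0.1f, 1000.0f);
    GX_LoadProjectionMtx(proj, GX_PERSPECTIVE);
}
```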

tl;dr: 4:3 is crisper but 16:9 is wider.

The "Deflicker" Filter

Let's get this out of the way first: these are really called vFilters or copy filters, but the Wii scene has decided on calling all filters the "deflicker" filter. In reality, different video configurations have different names and purposes and the deflicker filter is just one of them. It isn't even the one people dislike. But I'm already fighting a losing battle here, so let's just move on.

Any Wii software can set up a filter of its choosing on the video interface. These filter the rendered pixels in different ways before sending them out the back of the Wii to be displayed on your TV. With 20+ years of hindsight, these filters are quite primitive compared to modern anti-aliasing techniques, but they did their job at the time and sometimes result in a genuine improvement (ironically, the actual deflicker filter is one of the better ones). That said, many people dislike these "deflicker" filters and would rather get a clearer image that looks more like the raw rendered pixels.
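For the curious, here's a hedged sketch of what "setting up a filter" looks like through the homebrew libogc API (retail games use Nintendo's SDK instead, but it's the same hardware underneath). The two coefficient sets are the ones commonly passed around in the homebrew scene, included purely as illustration:

```c
#include <gccore.h>

// The vertical/copy filter is just seven weights (summing to 64) blended across
// neighbouring lines during the eFB -> xFB copy. Taps 0-1 weight the line above,
// 2-4 the current line, 5-6 the line below.
static u8 vfilter_default[7] = {8, 8, 10, 12, 10, 8, 8};   // typical softening/deflicker set
static u8 vfilter_off[7]     = {0, 0, 21, 22, 21, 0, 0};   // no weight on neighbouring lines: "deflicker off"

static void apply_copy_filter(GXRModeObj *rmode, int deflicker) {
    // rmode->vfilter holds the coefficients the current video mode asked for;
    // homebrew "deflicker off" patches simply swap in a centered set like vfilter_off.
    GX_SetCopyFilter(rmode->aa, rmode->sample_pattern, GX_TRUE,
                     deflicker ? vfilter_default : vfilter_off);
}
```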

Let's get to the examples. Mario Kart Wii does use the "deflicker" filter, but with a light touch compared to many games. The screenshots above show the game with its default filter settings. In the examples below, I've ramped up the filter to a high level via homebrew for demonstration purposes. While Mario Kart doesn't use this level of filtering, other games do.

Filter on (soft):

MKW-43-maxfilter

Filter off (sharp):

MKW-43-nofilter

With the filter off, the image is much clearer, with all that that entails: fine details are more easily visible, but so is the aliasing (the jagged, blocky edges of objects). In case this idea is new to you, that's where anti-aliasing gets its name, it's ... anti ... aliasing.

If you're playing in 16:9 widescreen, the "deflicker" filter is compounded by the stretching of the image that we talked about in the aspect ratio section. Here's Mario Kart with that high strength filter and stretched to 16:9:

MKW-169-maxfilter-stretch

I don't feel so good, Baby Park. And for comparison, filter off and stretched to 16:9:

MKW-169-nofilter-stretch

Not too bad overall. Not as sharp as 4:3, but acceptable to most. Remember, the 16:9 stretch step depends on what device is doing the stretching, so the exact quality will vary based on that.

tl;dr: "Deflicker" filters soften the hard edges; disabling the filter gives a sharper, more aliased image.

Pixel Aspect Ratio and viWidth

These two get to share a heading because of the way they tightly interact.

Pixel Aspect Ratio

In modern times, we think of a pixel as inherently a square object. But analog video was not measured or displayed in pixels. Without getting into literally how CRT TVs work, because I absolutely could not tell you, video game consoles had to convert each row of raw pixels into an analog video scanline (one horizontal line of the picture, out of 480 for this example). The way this works is that each pixel is sampled for a period before moving on to the next pixel, and the sample period is what determines how wide that "pixel" will be relative to the entire scanline. I think. This results in what is called a "pixel aspect ratio" (PAR), i.e. the width-to-height ratio of one individual pixel.

Many analog video formats were standardized in the exceedingly complicated Recommendation ITU-R BT.601-5, generally referred to as Rec. 601 (glenwing.github.io). This basically sets out how video signals should be encoded and how they should be understood–for example, in my case, they are understood "not at all". With this detailed technical information, it's possible to determine using Numbers™ exactly how a given signal should appear on a correctly tuned television.

Because the Wii is essentially a last-gen console, it and most consoles of the immediately preceding generation (Dreamcast, GameCube, PlayStation 2 and maybe Xbox, but I don't know anything about Xbox) implement pretty standard Rec. 601-compliant 720×480 video. (720? But I thought the Wii was 640×480 ... hmm.) They do this using a 13.5 MHz "pixel clock", which determines the sample timing for each pixel. The Pin Eight wiki, created by certified cross-platform homebrew genius Damian Yerrick, gives many examples of known pixel clock speeds (pineight.com) and their resulting pixel aspect ratios. The 13.5 MHz clock of the Wii and other systems results in a PAR of 10:11. That means each pixel is slightly (1/11th) slimmer than it is tall.
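If you'd like to check the maths yourself (this is my working, not something lifted from the Pin Eight wiki): Rec. 601 puts 704 of those 720 samples across the 4:3 picture width, so the pixel aspect ratio works out to (4÷3) ÷ (704÷480) = (4×480) ÷ (3×704) = 1920÷2112 = 10÷11.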

Too many words, where's the pictures?

240p-linearity

This is the Linearity test pattern from Artemio Urbina's invaluable 240p test suite (artemiourbina.itch.io) software, an open-source tool available for *checks notes* all of the platforms. Our friend Damian Yerrick, who you hopefully remember from one paragraph ago, developed the NES, Game Boy and Game Boy Advance versions. In the Linearity test, the 240p test suite displays five "perfect" circles on the screen, for definitions of perfect that include jagged, low-def and pixelated (mine does).

If you're familiar with circles, you might notice that these look a little bit ... not that. That's because this is a raw capture of the test pattern being displayed with square pixels, which we just established the Wii does not have. Artemio knew that the Wii did not have square pixels when writing his tool, and so the Linearity pattern accounts for this by drawing the circles slightly wide. This way, when they're displayed using the Wii's 10:11 PAR, the circles magically appear like so:

240p-linearity-squish

Yay! I remember circles!

tl;dr: The Wii has 10:11 (skinny) pixels.

viWidth

Things are about to get kinda complicated ... again. Until now, we've mostly been talking in terms of the Wii's "embedded framebuffer" (eFB). That's the thing people generally think of as the Wii's native resolution. When people talk about the Wii running at 640×480, they're talking about the embedded framebuffer. If we're factoring in 50 Hz PAL, the embedded framebuffer resolution is 640×528, but things are confusing enough already, so from now on, let's just forget PAL ever existed. I'd sure like to.

In addition to the embedded framebuffer, the Wii has an "external framebuffer" (xFB). I know, embedded and external both start with E. I didn't come up with the names, go tell Nintendo. The xFB has a resolution of 720×480. Does that mean the Wii really runs at 720×480? Literally yes, but practically no; it would be annoying to claim that as the Wii's resolution when it's limited by the embedded framebuffer. Technically it's possible with extreme shenanigans to display more content on the xFB by using multiple transfers off the eFB, but this isn't an intended use of the xFB and no official software does so. Please forget about that too.
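If a code sketch helps make the two buffers concrete, here's roughly what the standard homebrew (libogc) setup looks like; this is my illustration, not anything lifted from a retail game, and the full GX/rendering setup is omitted for brevity:

```c
#include <gccore.h>

static void *xfb;          // external framebuffer, lives in main memory
static GXRModeObj *rmode;  // describes eFB size, xFB size, viWidth, filters...

static void video_setup(void) {
    VIDEO_Init();
    rmode = VIDEO_GetPreferredMode(NULL);   // e.g. 640x480 NTSC, 640x528 PAL 50

    // The xFB is just a buffer in main RAM that the video interface scans out.
    xfb = MEM_K0_TO_K1(SYS_AllocateFramebuffer(rmode));
    VIDEO_Configure(rmode);
    VIDEO_SetNextFramebuffer(xfb);
    VIDEO_SetBlack(FALSE);
    VIDEO_Flush();
    VIDEO_WaitVSync();
}

static void end_of_frame(void) {
    // After drawing a frame into the embedded framebuffer inside the GPU,
    // this copy is the step this whole gist revolves around: eFB -> xFB,
    // applying the copy filter and any scaling on the way.
    GX_DrawDone();
    GX_CopyDisp(xfb, GX_TRUE);
    VIDEO_Flush();
    VIDEO_WaitVSync();
}
```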

Now, the default thing you might do when copying from the eFB to the xFB is just plop the image into the middle, like this:

MKW-43-nofilter-xFBplop

You could stop there. Many games did. But that's a lot of empty black space, mainly on the sides. The picture is slimmed down by those 10:11 pixels we talked about and there's space left over on the sides? This is turning out to be a very skinny picture.

This is where viWidth comes in. You can copy the eFB to display over anywhere between 640 and 720 pixels of the xFB, horizontally. The scaling is performed with 8× over-sampling, so it's basically the nicest scaling you'll ever see on a Wii. For Mario Kart Wii, the viWidth is 670, so it actually looks more like this:

MKW-43-nofilter-xFB670

That looks pretty damn solid: it fills most of the signal with picture, it counteracts the slimming effect of the 10:11 PAR, and the scaling is high quality, so it doesn't noticeably degrade the picture. I like all of these things. Good job, Wii! Technically, matching the viWidth to the eFB width will give you a slightly sharper image, but it sacrifices the aspect ratio and leads to huge black pillars on the sides of the screen.

For reference, the viWidth which completely counteracts the existing pixel aspect ratio and fills the screen is 704 (11÷10×640). The rest of the signal (8 pixels on each side) is not intended to be seen at all and is left empty by commercial software, but there's nothing actually stopping anybody from filling it, and some homebrew does.
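For the tinkerers, here's roughly how homebrew adjusts this via libogc's mode structure (the field names are libogc's; treat this as a sketch of the mechanism rather than gospel). The xFB itself stays 640 wide; the video interface's horizontal scaler, the 8× over-sampled one mentioned above, stretches it out to viWidth at scan-out time:

```c
#include <gccore.h>

// Widen the active picture on the 720-sample scanline.
//   640 = no scaling (skinny picture, big side borders)
//   670 = Mario Kart Wii's choice
//   704 = exactly cancels the 10:11 PAR (11 ÷ 10 × 640)
static void set_vi_width(GXRModeObj *rmode, u16 vi_width) {
    rmode->viWidth   = vi_width;
    rmode->viXOrigin = (720 - vi_width) / 2;   // keep the picture centred in the signal
    VIDEO_Configure(rmode);
    VIDEO_Flush();
}
```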

tl;dr: viWidth adjusts the horizontal scaling, counteracting the pixel aspect ratio and filling the screen without degrading the image much.

Overscan

That last shot in the previous section is a pretty good approximation of the video signal coming out of the Wii. But you've probably owned a Wii and don't remember seeing a black frame surrounding everything all the time. Or maybe you had a nice TV and you do remember the black borders, but what are they doing?

Analog television is analog. There's interference. There's moving parts. There's aging parts. Because of these and a hundred more factors, no two analog TVs are ever tuned exactly the same. Besides variations in color, crawl, curvature and all the other C-words, something that could vary substantially was exactly how much of the signal the TV was calibrated (C!) to fit onto the screen. It was extremely common for some of the picture around the edges to be hanging off the edges of the screen, and this is called "overscan". This was so prevalent that going further back, games consoles would sometimes take advantage of the overscan area by just leaving weird glitchy nonsense on the edge of the screen if it helped them achieve a graphical effect they wanted elsewhere. Some examples are the left column on many scrolling NES games or the CRAM dots at the bottom of the screen on Mega Drive.

Other than filling it with junk, the main consideration game developers, TV broadcasters, etc. had was that they needed to avoid putting important parts of the image in the edges, where some people's TVs would cut them off. Imagine a sports broadcast where the scores aren't visible, or a video game where the health bar is missing, all because the out-of-touch big-city media elites with their correctly calibrated TVs didn't account for overscan. It would suck. This led to the concept of the "safe zone", which was the part in the middle of the screen where it was safe to put your important details.
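To put made-up-but-typical numbers on the idea: the old broadcast rule of thumb was roughly a 10% margin on each edge for "title safe" content (those are my numbers, borrowed from broadcast practice, not anything from Nintendo's guidelines). A quick sketch of what that means for a 640×480 framebuffer:

```c
// Rough "title safe" rectangle for a 640x480 framebuffer, using the common
// broadcast rule of thumb of about a 10% margin on every edge (an assumption,
// not a figure taken from Nintendo's developer guidelines).
typedef struct { int x, y, w, h; } SafeRect;

static SafeRect title_safe(int fb_w, int fb_h) {
    SafeRect r;
    r.x = fb_w / 10;        // 640 -> start 64 pixels in from the left
    r.y = fb_h / 10;        // 480 -> start 48 lines down from the top
    r.w = fb_w - 2 * r.x;   // 512 pixels of width you can trust
    r.h = fb_h - 2 * r.y;   // 384 lines of height you can trust
    return r;
}
```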

While it is entirely possible for the Wii to fill the entire signal with picture, usually at least some of it is left blank because it will not be visible on some or even most TVs. This allows developers to address only the part of the screen that all players will be able to see. Still, even when you carefully leave parts of the screen blank, there's always a bigger ... miscalibration. Here is a simulation of what overscan might do to Mario Kart Wii on your crappy vintage analog TV:

MKW-43-nofilter-xFB670-overscan

At this point, you really start to see just how serious Nintendo was about keeping things in the safe area. The entire HUD is scooched way in from the edges, so that even with overscan cutting off a significant portion of the screen, you could still afford to lose more. Mario Kart Wii was designed to be playable on the worst junkyard TV on the lot. However, not all games are Mario Kart Wii, and different developers made different allowances for overscan, or didn't. Many games put picture way out into the area that Mario Kart left blank, which would only be visible on very well-calibrated sets at the time.

In the digital realm, overscan doesn't exist. You send a picture to the TV and the TV displays the whole picture. The only "overscan" you'll encounter on a modern digital setup has been deliberately programmed in to mimic older TVs. This is sometimes done because older media expected the overscan area to be hidden, whether it filled that area with junk data or left it blank like Mario Kart Wii does. The Wii U is one such example, with its stock behavior being to simulate significant overscan in Wii mode.

tl;dr: Either overscan is cutting off some of your picture, or lack of overscan is filling your screen with black borders.

480p Bug

There was a bug in Nintendo's software development kit for the Wii, which became a bug in all published Wii software (whoops), where it would miscommunicate with the hardware responsible for encoding the analog video signal when outputting 480p. This resulted in some grossness on the horizontal axis. Nintendo never fixed this bug in software; instead, later board revisions had video encoders which were "aware" of the bug, so when they got the bad requests, they'd simply correct them internally. Thus, on later production runs of the Wii (including the original model, as well as the Family Edition and Mini; the issue does not affect the Wii U), the bug has been corrected through that hardware update. However, it was never officially fixed in software, so owners of the affected earlier Wii runs can only patch the bug via homebrew.

tl;dr: Nintendo did an oopsie, 480p sucks on early Wii models.

PAL Video

All right, I admit it. PAL exists. Let's talk about it. PAL, now sometimes called PAL 50, is the television standard historically used in most of Europe and a few other extraneous territories like Australia. Traditional 50 Hz PAL was used almost exclusively in the major PAL territories until the late '90s. At the turn of the millennium, many devices began supporting the output and display of "PAL 60", which is very close to NTSC video for most purposes. In fact, PAL 60 mode on the PlayStation 2 is a lie: it literally just switches to NTSC when you enable so-called "PAL 60". That's unacceptable, demand a refund.

NB: In this section, we'll just be displaying raw screen dumps with square pixels, not making any attempt to simulate the actual PAR. PAL has a shorter/wider PAR (12:11) than NTSC or PAL 60 (10:11), so the PAL images in this section will look "too tall". PAL 60 will conversely look "too wide". This is simply a limitation of comparing multiple images which expect different PARs. Please understand.

Anyway, what's the difference? Referring back to Rec. 601, the important distinctions for PAL video are:

              | PAL     | NTSC/PAL 60
Resolution    | 720×576 | 720×480
Refresh rate  | 50 Hz   | 60 Hz

PAL has a terrible reputation in the video game community, but as you can see above, both standards have their advantages. PAL has a higher resolution, while NTSC has a higher refresh rate. You could make an argument for either one being superior, or even that PAL is better for slow-paced games where the framerate is not as important as the picture quality, while NTSC is better for fast-paced games where reaction time is key ...

... except for one thing: most of the major players in the video game industry in its first 40 or so years were North American or Japanese. This led to NTSC being the de facto standard which all consoles and games were designed around. Instead of catering equally to the advantages of each standard, most games–even those from PAL territory developers like the United Kingdom's Rare–were designed for NTSC first and converted to run on PAL machines later, with results that were inconsistent.

In the worst cases, a poor PAL conversion would mean a game that ran at the reduced NTSC resolution and the reduced PAL speed, plus large black bars letterboxing the image and an incorrect aspect ratio (due to not accounting for the resolution difference). It's not that PAL is worse than NTSC, just that most of the time, the NTSC version was the original, intended product and the PAL version was not. Some studios, like Rare, went out of their way to produce quality PAL conversions, but they're the exception rather than the rule.

Lastly, while the Wii's hardware supports progressive PAL (576p) just fine, Nintendo's development guidelines did not allow for 576p software, for ... reasons? Probably reasons. This means that without homebrew, PAL output is capped at 576i, giving NTSC/PAL 60 a clear advantage, as they support 480p.

tl;dr: PAL's not a bad standard. Just had bad luck.

Embedded Framebuffer

When running in PAL (50 Hz) mode, the Wii has 576 lines to display instead of the 480 lines at NTSC/PAL 60. To help account for this difference, the embedded framebuffer (the Wii's internal resolution) is increased to 640×528. It's time for an example, so along the same lines as Mario Kart Wii, here's another unavoidable megahit, the Wii's most iconic competitive multiplayer party game ...

fritz-480vs528

Just me? Oh well. This is Fritz Chess, running on a PAL Wii's 60 Hz (left) and 50 Hz (right) modes. You can probably tell that the 50 Hz mode is a taller image. If you want exact numbers, the game uses a 640×448 framebuffer for PAL 60, and a full-size 640×528 for traditional PAL (50). The increased resolution of the PAL version allows them some extra space for the crowded on-screen display. The aspect ratio differences should not be alarming: even though the 528-high framebuffer is taller, it's still going to be shown in 4:3. It will just be a slightly higher-definition 4:3.

It's important to note that not all games take advantage of this feature; it's simply available as an option. Some PAL 50 games still use 480 lines or fewer.
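From the homebrew side, you can see this difference directly; here's a quick sketch (assuming libogc, as in the earlier examples):

```c
#include <gccore.h>

// After VIDEO_Init(), ask libogc which mode the console prefers and peek at the
// embedded framebuffer height: 528 for PAL 50, 480 for NTSC/PAL 60. libogc's
// predefined mode objects (e.g. TVPal528IntDf vs TVEurgb60Hz480IntDf) encode
// exactly this difference.
static u16 efb_height_for_current_mode(void) {
    GXRModeObj *rmode = VIDEO_GetPreferredMode(NULL);
    return rmode->efbHeight;
}
```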

tl;dr: Poor reputation aside, PAL 50 Hz mode can display a higher resolution than NTSC/PAL 60.

External Framebuffer and xfbHeight

Just like the embedded framebuffer is larger in PAL 50 Hz mode, so is the external framebuffer. In this case, the external framebuffer has a size of 720×576. Since the embedded framebuffer only goes up to 528, you might assume the rest of the signal is left blank, but that's not necessarily the case. Similar to the viWidth discussed earlier, the video interface also has an xfbHeight or external framebuffer height. Using this, developers can scale their embedded framebuffer to fill more of the signal with picture, while once again taking a hit to image quality from the additional scaling.

To be clear, Fritz Chess does not natively make use of this feature for scaling. Instead, it uses an xfbHeight of 528, identical to the embedded framebuffer's height. Here, I have patched the game to perform a vertical scale on the external framebuffer in order to demonstrate its effect. These images are placed on the external framebuffer (720×576). The black areas are the parts of the signal not being filled with picture.

fritz-528vs576

On the left, the xfbHeight leaves the framebuffer un-scaled. On the right, it's scaled all the way up to fill 576 lines. These are just two examples; the choice of xfbHeight is not just between "scaled" and "not scaled". The exact input and output heights are in the hands of the developer, e.g. a lazy way to perform a PAL conversion is to simply take the original 480 lines from the NTSC/PAL 60 mode and scale them to fill ~542-574 lines. This prevents most letterboxing (black bars) but will result in a more filtered image.
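Here's roughly how that vertical scale is expressed through the GX/libogc API (my sketch of the mechanism, not the actual patch used for the Fritz Chess screenshots above): the eFB-to-xFB copy is given a Y scale factor, and the mode's xfbHeight records how many xFB lines the copy produces.

```c
#include <gccore.h>

// Scale the eFB vertically during the eFB -> xFB copy; e.g. a lazy PAL
// conversion might stretch 480 rendered lines up to ~574 output lines.
static void copy_with_vertical_scale(GXRModeObj *rmode, void *xfb,
                                     u16 efb_lines, u16 xfb_lines) {
    f32 yscale = GX_GetYScaleFactor(efb_lines, xfb_lines);  // e.g. 574 / 480
    u32 lines  = GX_SetDispCopyYScale(yscale);              // lines the copy will produce

    GX_SetDispCopySrc(0, 0, rmode->fbWidth, efb_lines);     // region of the eFB to copy out
    GX_SetDispCopyDst(rmode->fbWidth, lines);               // size of the xFB destination
    rmode->xfbHeight = lines;                                // keep the video interface in sync
    VIDEO_Configure(rmode);

    GX_CopyDisp(xfb, GX_TRUE);                               // the scaled copy itself
}
```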

Conclusion

In case you couldn't tell, there's a bunch of things that can go wrong between your games getting rendered and the picture coming out of your TV. It's amazing we can see Wii games at all. Stay safe out there. Whether you choose to switch to 4:3, disable the "deflicker" filter, adjust the viWidth, disable fake overscan on your TV, enable the 480p fix, switch to 60 Hz, update your glasses prescription or all of the above, there's a lot that can be done to improve the clarity of the Wii's picture. But at the end of the day ... it's still just a Wii. Don't overthink it. I clearly haven't.
