DPI PPI defined

### DPI measurement in monitor resolution

Monitors do not have dots, but they do have pixels; the closely related concept for monitors and images is pixels per inch (PPI). Old CRT video displays were almost universally rated by dot pitch, which refers to the spacing between the sub-pixel red, green and blue dots that make up the pixels themselves. Monitor manufacturers used the term "dot trio pitch": the distance between the centers of adjacent groups of three dots/rectangles/squares on the CRT screen. Monitors commonly used dot pitches of 0.39, 0.33, 0.32, 0.29, 0.27, 0.25 or 0.22 mm (a 0.22 mm pitch is about 0.0087 in). LCD monitors likewise have a trio of subpixels per pixel, which are more easily measured.

### DPI measurement in printing

DPI is used to describe the number of dots per inch in a digital print and the printing resolution of a hard-copy print. A related quantity is dot gain: the increase in the size of the halftone dots during printing, caused by the spreading of ink on the surface of the media. Up to a point, printers with higher DPI produce clearer and more detailed output. A printer does not necessarily have a single DPI measurement; it depends on the print mode, which is usually influenced by driver settings. The range of DPI a printer supports depends mostly on its print-head technology. A dot-matrix printer, for example, applies ink via tiny rods striking an ink ribbon and has a relatively low resolution, typically in the range of 60 to 90 DPI (420 to 280 µm dot spacing). An inkjet printer sprays ink through tiny nozzles and is typically capable of 300 to 720 DPI.[2] A laser printer applies toner through a controlled electrostatic charge and may be in the range of 600 to 2,400 DPI.

The DPI of a printer often needs to be considerably higher than the pixels-per-inch (PPI) of a video display in order to produce similar-quality output, because of the limited range of colors available at each printed dot. At each dot position, the simplest type of color printer can either print no dot or print a dot consisting of a fixed volume of ink in each of four color channels (typically CMYK: cyan, magenta, yellow and black), giving 2⁴ = 16 colors on laser, wax and most inkjet printers. Higher-end inkjet printers can offer 5, 6 or 7 ink colors, giving 32, 64 or 128 possible tones per dot location. Contrast this with a standard sRGB monitor, where each pixel produces 256 intensities of light in each of three channels (RGB). While some color printers can produce variable drop volumes at each dot position and may use additional ink-color channels, the number of colors is still typically less than on a monitor.

Most printers must therefore produce additional colors through a halftone or dithering process. The exception is a dye-sublimation printer, whose printing method is more akin to pixels per inch. The halftoning process can require a region of four to six dots (measured across each side) to faithfully reproduce the color contained in a single pixel. An image that is 100 pixels wide may therefore need to be 400 to 600 dots wide in the printed output; if a 100×100-pixel image is to be printed inside a one-inch square, the printer must be capable of 400 to 600 dots per inch in order to reproduce the image accurately.
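The halftoning arithmetic above can be sketched in a few lines of Python. This is only an illustration of the scaling rule described in the text; the function name is made up for this example, not part of any printing API.

```python
def required_printer_dpi(image_ppi, dots_per_pixel_side):
    """DPI a printer needs so that each image pixel maps onto an
    n-by-n block of halftone dots at the same physical size."""
    return image_ppi * dots_per_pixel_side

# A 100x100-pixel image printed inside a one-inch square is 100 PPI.
# With 4 to 6 halftone dots per pixel side, the printer needs:
low = required_printer_dpi(100, 4)   # 400 DPI
high = required_printer_dpi(100, 6)  # 600 DPI
```

This matches the worked example in the text: a 100-pixel-wide image becomes 400 to 600 dots wide on paper.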

### Computer monitor DPI standards

Since the 1980s, the Microsoft Windows operating system has set the default display "DPI" to 96 PPI, while Apple/Macintosh computers have used a default of 72 PPI.[3] These defaults arose out of the problems rendering standard fonts in the early display systems of the 1980s, including the IBM-based CGA, EGA, VGA and 8514 displays as well as the Macintosh displays featured in the 128K computer and its successors. Apple chose 72 PPI for its displays because the official 72 points per inch of typography mirrored the 72 pixels per inch that actually appeared on its screens. (Points are a physical unit of measure in typography dating to the days of printing presses; 1 point by the modern definition is 1/72 of the international inch (25.4 mm), making 1 point approximately 0.0139 in or 352.8 µm.) Thus, 72 pixels per inch on the display occupied exactly the same physical dimensions as 72 points per inch on a printout, with 1 pt in printed text equal to 1 px on the display screen. The Macintosh 128K featured a screen 512 pixels wide by 342 pixels high, and this width corresponded to the width of standard office paper (512 px ÷ 72 px/in = 7.1 in, leaving a 0.7 in margin down each side of 8.5 in × 11 in North American letter paper).
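The Macintosh paper-width arithmetic can be checked directly. A minimal sketch; the variable names are illustrative only.

```python
POINTS_PER_INCH = 72  # 1 pt = 1/72 in, so at 72 PPI, 1 pt == 1 px

# Macintosh 128K screen: 512 px wide at 72 PPI
width_in = 512 / 72              # physical width the screen represents
margin_in = (8.5 - width_in) / 2  # per-side margin on 8.5 in letter paper
```

`width_in` comes out to about 7.1 inches, leaving roughly a 0.7 inch margin on each side of a letter-size page, as described above.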

A consequence of Apple's decision was that the widely used 10-point fonts of the typewriter era had to be allotted 10 display pixels in em height and 5 display pixels in x-height, described technically as 10 pixels per em (PPEm). This made 10-point fonts render crudely and made them difficult to read on the display screen, particularly the lowercase characters. Furthermore, computer screens are typically viewed at a desk at a distance about one-third greater than printed materials, causing a mismatch between the perceived size of text on the screen and on printouts.

Microsoft tried to solve both problems with a hack that has had long-term consequences for the understanding of what DPI and PPI mean.[4] Microsoft began writing its software to treat the screen as though it provided a PPI that is 1/3 larger than what the screen actually displayed. Because most screens at the time provided around 72 PPI, Microsoft essentially wrote its software to assume that every screen provides 96 PPI (because 72 × (1 + 1/3) = 96). The short-term gain of this trickery was twofold: it would seem to the software that 1/3 more pixels were available for rendering an image, allowing bitmap fonts to be created with greater detail; and on every screen that actually provided 72 PPI, each graphical element (such as a character of text) would be rendered at a size 1/3 larger than it "should" be, allowing a person to sit at a comfortable distance from the screen. However, larger graphical elements meant less screen space was available for programs to draw.

Thus, for example, a 10-point font on a Macintosh (at 72 PPI) was represented with 10 pixels (i.e., 10 PPEm), whereas a 10-point font on a Windows platform (at 96 PPI) using the same screen is represented with 13 pixels (Microsoft rounds 13.33 to 13 pixels, or 13 PPEm). Likewise, a 12-point font was represented with 12 pixels on a Macintosh and 16 pixels on a Windows platform using the same screen, and so on.[5] The negative consequence of this standard is that with 96 PPI displays there is no longer a 1-to-1 relationship between the font size in pixels and the printout size in points. This difference is accentuated on more recent displays that feature higher pixel densities, though it has become less of a problem with the advent of vector graphics and fonts in place of bitmap graphics and fonts. Moreover, many Windows programs written since the 1980s assume that the screen provides 96 PPI; accordingly, these programs do not display properly at common alternative resolutions such as 72 PPI or 120 PPI. The solution has been to introduce two concepts:[4]

  1. logical PPI: The PPI that software claims a screen provides. This can be thought of as the PPI provided by a virtual screen created by the operating system.
  2. physical PPI: The PPI that a physical screen actually provides.

Software programs render images to the virtual screen, and the operating system then scales the virtual screen onto the physical screen.
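The relationship between the two concepts is a simple ratio: a length expressed in logical pixels is multiplied by physical PPI over logical PPI to get device pixels. A sketch under the assumptions in the text (96 PPI logical default); the function name is hypothetical.

```python
def logical_to_physical_px(logical_px, physical_ppi, logical_ppi=96):
    """Scale a length from the OS's virtual (logical) screen
    to pixels on the actual (physical) screen."""
    return logical_px * physical_ppi / logical_ppi

# A 96-logical-pixel element (nominally one inch at the 96 PPI default)
# drawn on a 144 PPI display:
device_px = logical_to_physical_px(96, 144)  # 144.0 device pixels
```

The element occupies 144 device pixels, i.e. still one physical inch on the 144 PPI screen, which is the point of the virtual-screen indirection.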
