As mentioned in the previous post, here are four relevant indicators – PPI, DPI, LPI and SPI.
Pixel
The basic unit of a virtually reproduced image. The term was coined from the expression ‘picture element’, and the pixel is conventionally represented by a small square.
PPI stands for Pixels Per Inch. The term refers to the resolution of the image as a virtual image, on a monitor. When we work on an image on the computer, in any image-editing program, we are working in PPI.
PPI is also used to calculate print size from the number of pixels in the image, since print media is not measured in pixels but in inches or metric units. The calculation simply divides each pixel dimension (width and height) by the resolution in PPI. As explained in the next paragraph, this is one of the places where PPI and DPI get mixed up, and the mixing runs both ways: even renowned brands in the graphics and photography world apply DPI to this calculation, once again confusing dot with pixel, or physical with virtual.
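That division can be sketched in a few lines of Python (the function names are just illustrative):

```python
def print_size_inches(width_px, height_px, ppi):
    """Convert an image's pixel dimensions to a print size in inches."""
    return width_px / ppi, height_px / ppi

def print_size_cm(width_px, height_px, ppi):
    """The same conversion expressed in centimeters (1 inch = 2.54 cm)."""
    w, h = print_size_inches(width_px, height_px, ppi)
    return w * 2.54, h * 2.54

# A 3000 x 2400 px image at 300 PPI prints at 10 x 8 inches.
print(print_size_inches(3000, 2400, 300))   # (10.0, 8.0)
print(print_size_cm(3000, 2400, 300))       # (25.4, 20.32)
```

The same image at 150 PPI would print at 20 × 16 inches, which is why resolution and print size trade off against each other for a fixed number of pixels.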
DPI stands for Dots Per Inch. Here we are already dealing with the unit of measure for converting the virtual image into a physical one, hence the change from pixel to dot: printers use dots to reproduce a single pixel of the image. However, many software manufacturers replace the reference PPI with DPI, and if you search for these acronyms you will find a real battle of theses, definitions and justifications for the confusion. Be aware that the concept of a pixel is much more complex and interesting than the generalized descriptions suggest.
A pixel is not a simple linear unit corresponding to the tiny square you see when the image is magnified to its maximum. The little square on the display is nothing more than a model adopted by the makers of graphics cards to simplify the visual representation of the pixel, which in reality has no defined shape but somehow had to be represented. The true notion of the pixel may be one of the reasons for the confusion between PPI and DPI, since a pixel corresponds to a sample taken at a certain point of the image, and in a color image a pixel may even contain three samples, one for each primary color (read Image And Colors).
What generally happens, though, is that DPI is taken as a measure of output devices, and by that definition almost any device qualifies: not only printers but also video projectors, cameras and so on. This is probably the main reason manufacturers ignore PPI and adopt DPI as a general unit of resolution, without distinguishing the physical image from the virtual one.
This concept of output device, however, gets confused with another notion of output also used to distinguish PPI from DPI, in which PPI is considered an input measure and DPI an output measure. Under that logic, the only output devices are printers and imagesetters, because they are the only devices from which the image cannot return to the monitor in the same form; all other devices are considered input pathways, even if some have dual pathways (input and output), as is the case with photographic cameras, for example.
Dot
The basic unit of a physically reproduced image, i.e. one printed by whatever process. The printing dot can have various sizes and shapes (round, elliptical, hexagonal, square, etc.), depending on the desired result or the equipment used.
Moreover, the process of reproducing a printed image is quite different from that of reproducing a virtual image, so the distinction between PPI and DPI makes sense, not least because the resolution of the virtual image does not always correspond to the resolution of the printer. I can print an image with a resolution of 1200 PPI on a printer whose maximum resolution is 600 DPI, for example: the 1200 PPI refers to the definition of the image, while the 600 DPI refers to the quality of the print, two different things. If everything is called DPI, that distinction is lost. DPI was actually created exclusively for printing, as a hardware reference, so replacing PPI with DPI is a simplistic choice, to say the least, and one that many schools and professionals consider crude and wrong.
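The mismatch between image resolution and printer resolution can be sketched as a simple ratio (the helper name is illustrative, not a printing standard):

```python
def dots_per_pixel(image_ppi, printer_dpi):
    """Printer dots available per image pixel along one axis.

    Values below 1.0 mean the printer has fewer dots than the image
    has pixels, so detail is necessarily lost in the conversion.
    """
    return printer_dpi / image_ppi

# The example above: a 1200 PPI image on a 600 DPI printer.
print(dots_per_pixel(1200, 600))  # 0.5 -> one dot must cover two pixels
# The more common case: a 300 PPI photo on a 1200 DPI printer.
print(dots_per_pixel(300, 1200))  # 4.0 -> a 4 x 4 cell of dots per pixel
```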
Inch
A unit of measurement established as equivalent to 2.54 centimeters in the metric system. In several European languages the unit is named after the thumb (French pouce, Portuguese polegada), whose width it approximates. The inch is commonly used in the United States, Canada, and England.
Nowadays, the wide variety of printer models makes everything even more confusing. Evolution has multiplied options and nomenclatures in pursuit of ever better results and ease of use. Both in home printers and in much professional printing equipment, instead of DPI there are options such as Fine, Coarse, Photo, High, Low, Document, Image, etc.
Most of these options, however, do not refer to the number of dots per inch but to the way color and contrast are read, and to the type of dot applied to improve the final printed result. In fact, most printers offer only one to three real DPI settings (usually 300, 600 and 1200 in laser printers, and higher resolutions in inkjet printers, from 720 to 5000 and beyond, though each printer still offers only two or three options, and often just one), which makes the whole distinction even harder to draw.
Printers also now offer several print-size options for the same image: the user simply chooses the desired size and the machine does all the calculations. At no point, however, does DPI interfere with the number of pixels in the image. DPI only changes the number of dots the printer will apply to represent a single pixel on the substrate. When the image is converted for the printer, what happens is a resizing of the pixel, larger or smaller, according to the output size chosen by the user.
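The effect of that resizing on effective resolution can be sketched like this (an illustrative helper, not any printer's actual logic):

```python
def effective_ppi(width_px, print_width_in):
    """Effective resolution when a fixed-pixel image is printed at a chosen width."""
    return width_px / print_width_in

# The same 3000 px wide image at three user-selected print widths:
for width_in in (5, 10, 20):
    print(f"{width_in:>2} in wide -> {effective_ppi(3000, width_in):.0f} PPI")
```

The pixel count never changes; only how much paper each pixel covers, and therefore the effective PPI, does.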
Laser printing
A printing process created in the second half of the 20th century, which uses light beams and electrical charges to engrave the matrix on a drum that transfers it to the paper. The ink applied in this process is called toner, a fine powder mostly made of polyester, chosen for its electrostatic characteristics.
Inkjet printing
A printing process that emerged in the middle of the 20th century, which projects tiny jets of ink onto the paper to print an image. Unlike other printing techniques, there is no matrix, but rather a central circuit that decodes the image directly from the computer onto the substrate.
Offset printing
An improved and automated lithographic printing technique, also called offset lithography, which appeared in the 19th century. The term offset comes from the fact that it is an indirect printing process. While lithography is a direct process – the substrate directly receives the image engraved on the stone or plate – offset is indirect: the image engraved on the plate is transferred (offset) to a rubber surface, which in turn prints it on the substrate.
Lithography
A printing process created in the late 18th century by the German Aloys Senefelder, which uses the mutual repellency of water and grease to fix the matrix to be printed. Originally, lithography used an image laboriously drawn with oily material (wax or grease) on a smooth, level limestone (from the Greek lithos = stone). After the drawing phase, a solution of gum arabic and nitric acid was applied over the entire surface of the stone, producing a chemical reaction in which the drawn parts bond to the stone while the free parts repel the grease and attract water.
LPI refers to Lines Per Inch. This is another measurement related to printing and is used mostly in commercial and industrial printing processes, in halftone printing – typically in screen printing, laser printing and offset printing. The halftone technique corresponds to a type of reproduction that simulates continuous tone images through the use of dots, varying them in size or spacing, thus generating an effect similar to a gradient or light and shadow effect for the same tone.
The halftone photomechanical process emerged in the 19th century for the purpose of reproducing photographs in the newspapers of the time. Before that, typographic reproduction was limited to solid black and white areas on the paper: somewhat crude, opaque line work in characters and illustrations (in printing, such 100% opaque, dotless patches of print are called solids).
The halftone technique requires equipment capable of reproducing an amplitude-modulated screen. Today, for example, in the specific case of inkjet printers, even in a professional or commercial environment, classic halftoning does not apply, because these printers tend to use frequency-modulated (stochastic) screening, varying the spacing of the dots rather than their size.
In the case of laser printers, the linescreen generally ranges from 50 to 110 LPI, depending on the characteristics of the printer and the type of paper. Keep in mind that the LPI reference is intended for high-production printing techniques and specific substrates, so most laser printers, especially those intended for office or home use and even a considerable professional range, do not allow the linescreen to be chosen, only the resolution. In that case, linescreens can be simulated using image-editing programs.
LPI is the number of lines per inch, the lines in turn being made up of dots: the more lines in the image, the smaller the dots and the greater the definition. LPI is not exactly a measure of resolution, since resolution implies a constant dot or pixel size across the same screen or image (DPI and PPI), whereas in LPI the dot size varies across the screen. When we talk about LPI, therefore, we are talking about screen frequency, not resolution.
The linescreen also depends on the printing process and on the characteristics of the substrate. Currently, printing companies work with linescreens varying from 55 to 85 LPI for newspaper printing, from 100 to 120 LPI for magazines, and from 120 to 200 LPI for higher-quality laser and offset prints.
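Two common prepress rules of thumb tie DPI and LPI together: a halftone cell of (DPI ÷ LPI) printer dots per side can render (DPI ÷ LPI)² + 1 tone levels, and images are usually prepared at 1.5 to 2 times the linescreen. A quick sketch (helper names are illustrative):

```python
def tone_levels(printer_dpi, screen_lpi):
    """Reproducible tone levels per halftone cell.

    A cell is (dpi // lpi) dots on a side, so it can be filled in
    (dpi // lpi) squared steps, plus one for the empty (white) cell.
    """
    return (printer_dpi // screen_lpi) ** 2 + 1

def min_image_ppi(screen_lpi, factor=2.0):
    """Rule of thumb: prepare images at 1.5-2x the linescreen."""
    return screen_lpi * factor

print(tone_levels(1200, 150))  # 65 tone levels
print(tone_levels(600, 150))   # 17 -- gradients visibly posterize
print(min_image_ppi(150))      # 300.0 PPI for a 150 LPI magazine screen
```

This is also why a high linescreen demands a high printer resolution: raising LPI without raising DPI shrinks the cell and collapses the tonal range.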
In screen printing, the advisable linescreen varies between 45 and 65 LPI; screens with a mesh count between 90 and 185 threads per centimeter are intended for halftone printing. This is fitting, because the halftone technique began precisely with a gauze screen, when its inventor – William Fox Talbot – realized that he would need to break the image into a screen in order to hold the ink in the larger areas of the recorded image.
William Fox Talbot
The idea of the halftone is credited to William Fox Talbot (English mathematician, astronomer and archaeologist), although it is not entirely clear who invented the process. Talbot created and patented his photographic screen in 1852 (a gauze screen he called a “photographic veil”), following his 1841 patent for the invention of the negative, through which any number of positives could be reproduced, a process Talbot named the calotype, the forerunner of photography as we know it today. There is no doubt that his remarkable work made him stand out in the early days of photographic reproduction, not least because it triggered a succession of experiments and changes in materials, equipment and processes for reproducing images. The German Georg Meisenbach and the Americans Frederic Ives and Max Levy can therefore also be considered great names in the development of halftone printing, through to the 20th century.
There is also another resolution reference, called Samples Per Inch (SPI). This measurement is exclusively linked to scanners, because a scanner is a specific input device that captures images differently from a still camera. SPI translates the optical resolution of the scanner and is related to the color depth of the scanned object, which is expressed in bits (read Image And Color).
The quality of the scan is defined by the number of color samples the scanner's optics can capture per linear inch. This capacity varies from scanner to scanner, according to type and range, which is why many low-end scanners claim to capture millions of colors per inch when they lack the real optical capacity: they merely have a mode that simulates millions of colors, which obviously does not translate the colors of the scanned object correctly (much as with optical versus digital zoom, anything simulated lacks the quality of the real thing).
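Optical resolution also drives a practical calculation: the SPI needed for a scan is the target print resolution multiplied by the enlargement factor. A small sketch (the helper name is illustrative):

```python
def required_spi(target_ppi, output_size, original_size):
    """Scanner resolution needed to enlarge an original to a target print resolution.

    The enlargement factor is output_size / original_size
    (measured along the same edge, in any consistent unit).
    """
    return target_ppi * (output_size / original_size)

# Enlarging a 4 x 6 in photo to 8 x 12 in for 300 PPI output:
print(required_spi(300, 8, 4))   # 600.0 SPI
# A 35 mm negative (about 1 in wide) printed 8 in wide needs far more:
print(required_spi(300, 8, 1))   # 2400.0 SPI
```

This is why small originals such as film negatives demand scanners with genuinely high optical resolution, not interpolated figures.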
Once again, the SPI reference is often wrongly replaced by DPI in software interfaces, much as happens between DPI and PPI. Here it makes even less sense, since scanners are not output devices or even mixed (input/output) devices; they are typical input devices.
In computing, an input device is a device used to feed data and information into the computer, as opposed to an output device, which sends data and information from the computer to other devices. A mixed, or input/output, device both sends data to the computer and receives data from it.