US20040201865A1 - Method for smooth trap suppression of small graphical objects using run length encoded data - Google Patents

Method for smooth trap suppression of small graphical objects using run length encoded data

Info

Publication number
US20040201865A1
Authority
US
United States
Prior art keywords
trap
trapping
run
width
runs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/411,562
Other versions
US7271934B2 (en)
Inventor
Jon McElvain
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xerox Corp
Original Assignee
Xerox Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xerox Corp
Priority to US10/411,562 (granted as US7271934B2)
Assigned to XEROX CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MCELVAIN, JON S.
Assigned to JPMORGAN CHASE BANK, AS COLLATERAL AGENT. SECURITY AGREEMENT. Assignors: XEROX CORPORATION
Publication of US20040201865A1
Application granted
Publication of US7271934B2
Assigned to XEROX CORPORATION. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: JPMORGAN CHASE BANK, N.A. AS SUCCESSOR-IN-INTEREST ADMINISTRATIVE AGENT AND COLLATERAL AGENT TO JPMORGAN CHASE BANK
Expired - Fee Related
Adjusted expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/58Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction

Abstract

A method for smooth trapping of an object containing run length encoded image pixel data includes collecting a number of scanlines of run length encoded pixel data equal to 2M in a buffer, wherein M is a line width trap threshold, and determining those runs within the collected scanlines that require trapping, wherein a run is a portion of a scanline. If trapping is required in the fast scan direction and the length of the run requiring trapping is less than M, the width of the trap region is reduced by a prorated percent. If trapping is required in the slow scan direction and the number of runs above and below the run to be trapped is less than M, the number of runs above and below the run to be trapped is reduced by a prorated percentage.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This invention is related to co-pending, co-assigned application Ser. No. 10/263,534 for “Method for Trapping Raster Data in a Run-Length Encoded Form” filed Oct. 2, 2002, and to co-pending, co-assigned application Ser. No. 09/538,596 filed Mar. 29, 2000, for “Method for Trapping Suppression for Thin Graphical Objects Using Run Length Encoded Data,” the contents of both are incorporated herein by reference and made a part of this application. This invention is also related to the inventor's application for “Method for Smooth Trap Suppression of Small Graphical Objects” which has been assigned to the assignee of this invention and which has been filed the same date as this application.[0001]
  • FIELD OF THE INVENTION
  • This invention relates generally to methods for correcting for marking engine characteristics, and more particularly, to a method for smooth trap suppression of small graphical objects using run length encoded data. [0002]
  • BACKGROUND OF THE INVENTION
  • Electronic processing of graphic and text images produces multi-color prints using multiple color separations. Typically, four process colors, cyan, magenta, yellow and black, are used to print multiple separations, which tend to have minor misregistration problems. The result of abutting or overlapping shapes is a boundary between adjacent regions of color that, under ideal printing conditions should have zero width. That is, one color should stop exactly where the abutting color begins, with no new colors being introduced along the boundary by the printing process itself. In practice, the realization of a zero width boundary between regions of different color is impossible as a result of small but visible misregistration problems from one printed separation to another. The error is manifested as a “light leak” or as a visible boundary region of an undesired color. [0003]
  • Methods for correcting for this misregistration are known. The general approach is to expand one of the abutting regions' separations to fill the gap or misregistration border region with a color determined to minimize the visual effect when printed. Borders or edges expanded from a region of one color to another in this manner are said to be “spread”. A border which has been expanded is referred to as a “trap”, and the zone within which color is added is called the “trap zone”. [0004]
  • Trapping is generally a two-step process. The first step in the trapping process is to determine where there is an edge on which to apply trapping. Trapping is typically used between pixels that are not of identical color, but it can be used in other locations as well. The second step is to generate the overlay of one or more pixels, in any combination of the color separations, which is done by a trap generator or trap oracle. The two inputs for the trap generator are the colors on both sides of the edge in question. For example, consider magenta and cyan, with a user-specified maximum trap width of two. The generator will compute from these whether trapping is necessary, what color to use, and where it should be applied. In this example, the correction could be zero (no trapping), one, or two pixels in width in any combination of cyan, magenta, yellow and black, and it could be located in either the magenta or cyan area. Edge detection and image manipulation to perform trapping may be done in any of several processes, including for example, the technique described in U.S. Pat. No. 6,345,117 to Victor Klassen, for “Method for Automatic Trap Selection for Correcting for Separation Misregistration in Color Printing”. [0005]
  • For the typical trapping operation, it is assumed that objects to be trapped are very large relative to the trapping region, so that the trap colors will be difficult to distinguish. Thus, the color of only a thin boundary of the object will be changed, while the large internal area will have the original, correct color. However, for objects smaller than a few pixels, or for long, thin objects having a width less than a few pixels, trapping results in visible hue changes in the color of the entire object. For example, if a thin line is only two pixels in width, and the trap generator decides to change the color of those two pixels, the entire color of the thin line has been changed. Small objects, such as small font size text characters, may be printed in an entirely different color. If the thin line, or the small object, happens to be located near a larger object of the same initial color, there will be a visible hue shift relative to the larger object, and the result of the trapping operation will be less desirable than no trapping at all. [0006]
  • Run length encoding is a type of lossless compression which utilizes the fact that many files frequently contain the same character repeated many times in a row. For example, text files use multiple spaces to separate sentences, indent paragraphs, format tables and charts, etc. Digitized signals can also have runs of the same value, indicating that the signal is not changing. For example, in a data sequence having frequent runs of zeros, each time a zero is encountered in the input data, two values are written to the output file. The first of these is a zero, a flag to indicate that run length compression is beginning. The second is the number of zeros in the run. If the average run length is longer than two, compression will take place. Many different run length encoding schemes have been developed. [0007]
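  • As a concrete illustration of the zero-run scheme just described, the following short Python sketch (an illustration only, not code from the patent; the function name is arbitrary) writes a zero flag followed by the run count for each run of zeros and passes other values through unchanged.

```python
def rle_encode_zero_runs(values):
    """Encode runs of zeros as (0, run_length) pairs; copy all other values as-is."""
    encoded = []
    i = 0
    while i < len(values):
        if values[i] == 0:
            run_length = 0
            while i < len(values) and values[i] == 0:
                run_length += 1
                i += 1
            encoded.extend([0, run_length])  # flag value, then the number of zeros in the run
        else:
            encoded.append(values[i])
            i += 1
    return encoded

# Compression is achieved when the average zero run is longer than two values:
print(rle_encode_zero_runs([5, 0, 0, 0, 0, 7, 0, 0, 3]))  # -> [5, 0, 4, 7, 0, 2, 3]
```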
  • Co-assigned application Ser. No. 09/538,596 filed Mar. 29, 2000, for “Method for Trapping Suppression for Thin Graphical Objects Using Run Length Encoded Data”, (D/A0062) describes a technique for suppressing trapping for the boundary between two color areas where one is a thin line by testing the number of pixels which make up the thickness of the areas, the decision being based on run length encoded image data. The method of D/A0062 disables trapping for objects with a dimension smaller than a threshold value. Although this reduces the hue shift for “regular” rectangular objects, irregular trapping occurs at the edges of small and complex graphical objects (where local dimensions may be near or below the threshold), for example, small text or thin triangles. It is therefore desirable to reduce the hue shift for these small objects while preserving the continuity of traps along the edges. [0008]
  • SUMMARY OF THE INVENTION
  • The method of the invention can be used to reduce hue shifts in thin objects and small objects caused by trapping suppression in trap generators. A method for smooth trapping of an object containing run length encoded image pixel data, according to one aspect of the invention, includes collecting a number of scanlines of run length encoded pixel data equal to 2M in a buffer, wherein M is a line width trap threshold; determining those runs within the collected scanlines that require trapping, wherein a run is a portion of a scanline; determining if a run requiring trapping is located in a fast scan direction or a slow scan direction; if trapping is required in the fast scan direction, determining the length of the run requiring trapping; if the length of the run requiring trapping is less than M, reducing the width of the trap region according to a predetermined relationship; and applying a trap correction to the trap pixel according to the reduced trap region width. If trapping is required in the slow scan direction, the method further includes determining the number of runs above and below the run to be trapped that possess the same color as the run to be trapped; and if the number of runs above and below the run to be trapped is less than M, reducing the number of runs above and below the run to be trapped according to a predetermined relationship. [0009]
  • The predetermined relationship may include reducing the number of pixels to trap by a prorated percent (in the fast scan direction) and reducing the number of runs to trap by a prorated percent (in the slow scan direction). The predetermined relationship may include reducing the trap region width monotonically. In accordance with another aspect of the invention, the predetermined relationship may be of the form t′=toƒ(wt, w) for w<wt, and t′=to when w≧wt, where w is the width of the object, wt is the trap threshold width, to is an original trap region width as determined by a trap generator and t′ is the reduced trap region width. In accordance with another aspect of the invention, the predetermined relationship may be of the form ƒ(wt, w)=to(w/wt). A different predetermined relationship may be used for the fast scan direction than is used for the slow scan direction. [0010]
  • The result is a method for efficient suppression of trapping of thin objects, and an overall improvement of image quality, due to the facile nature of image data processing associated with the run length encoded format. [0011]
  • A method for smooth trapping of a thin graphical object, according to another aspect of the invention, includes receiving from a trap generator the location of a trap pixel in a thin object that should be changed in color; wherein a thin graphical object has a width dimension which is substantially less than the object's length dimension, determining the width of the thin object containing the trap pixel; comparing the width of the thin object with a trap threshold width; if the width of the thin object is less than the trap threshold width, reducing the width of the trap region according to a predetermined relationship; and applying a trap correction to the trap pixel according to the reduced trap region width. [0012]
  • In one embodiment, the predetermined relationship may be a relationship which reduces the trap width monotonically. Instead of completely eliminating trapping for dimensions below a threshold, the method of the invention reduces the trap distance in a monotonic fashion (as the object size is reduced). As a result, the hue shift will be reduced, while preserving some degree of trapping at the edges to reduce visible misregistration errors. [0013]
  • In another embodiment, the predetermined relationship may be of the form t′=toƒ(wt, w) for w<wt, and t′=to when w≧wt, where w is the width of the object, wt is the trap threshold width, to is an original trap width as determined by the trap generator, and t′ is the reduced trap width. In another embodiment, the function may be a linear relationship of the type ƒ(wt, w)=to(w/wt). [0014]
  • In accordance with another aspect of the invention, the method for smooth trapping may be applied to uniformly small objects, i.e., objects in which both dimensions (length and width) are less than the trap width threshold of a trap generator. A method for smooth trapping of a small object, includes receiving from a trap generator the location of a trap pixel in the small object that should be changed in color; wherein a small object has a size of the order of a few pixels; determining a dimension of the small object containing the trap pixel; comparing the dimension of the small object with a trap threshold width; if the dimension of the object is less than the trap threshold width, reducing the width of the trap region according to a predetermined relationship; and applying a trap correction uniformly to the small object according to the reduced trap width. [0015]
  • An example of a small object includes small font size text objects. In accordance with another aspect of the invention, the method for smooth trapping may be used to provide smooth trapping for text objects. A text object is generally described by its font size. The method includes receiving from a trap generator the location of a trap pixel in the text object that should be changed in color; determining the font size of the text object containing the trap pixel; comparing the font size of the object with a font size threshold; if the font size of the object is less than the font size threshold, reducing the width of the trap region according to a predetermined relationship; and applying a trap correction uniformly to the text object according to the reduced trap width. The same predetermined relationships used with respect to thin objects may also be used with small objects and text objects.[0016]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates hue shift at the edges of a thin object compared to a thicker object; [0017]
  • FIG. 2 is a region of interest for trapping runs, with intersections and four relevant corner run segments; [0018]
  • FIG. 3 shows the 13 possible corner geometries; [0019]
  • FIG. 4 shows an example scanline buffer configuration with 6 scanlines collected (M=3); and [0020]
  • FIG. 5 is a graph of an exemplary weighting function for determining trap size.[0021]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A thin graphical object is generally one in which the width is substantially less than its length (as can be seen, for example, in the thin line of FIG. 1). Assuming that a particular trap generator has a predetermined trap width threshold (below which trapping is disabled), the method of the invention determines the width of the object for which trapping is selected. If the width of the object is less than the trap width threshold of the trap generator, the method of the invention determines a trap width correction. The trap correction reduces the trap width according to a predetermined relationship. Trap pixels are then applied to the reduced trap area. [0022]
  • A method for smooth trap suppression for thin graphical objects according to one feature of the invention assumes that the raster data is in run-length encoded form. In the following discussion, it is assumed that the runs are of constant color, where all the pixels of a given run have the same color. It is possible, however, to apply the method of the invention to runs that have sampled color, wherein each pixel of the run has a different color, as specified by a referenced pixel map. [0023]
  • McElvain et al. (U.S. patent application Ser. No. 10/263,534 filed Oct. 2, 2002) describes a method for trapping data in a run length encoded form. This method requires a user specification of the maximum trap radius; i.e. the maximum number of pixels over which the trap zone will extend. The method employs a “scanline buffer” that has twice the number of scan lines as the trap radius specified. Each scanline of the image consists of a string of “runs”, which are specified by a minimum position in the fast (horizontal) direction, a length (number of pixels), a color (if it is constant color), and other tags used for printing purposes. The runs can be of constant color, or derived from sampled image data. For example, if the trap radius is specified as two, 2M=4 scanlines at a time will be examined during the trapping procedure. In this method, the runs of all 2M scanlines are inspected, and the point where any run ends on a particular scanline is assumed to be an edge (denoted a “segment boundary”). [0024]
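  • As a minimal sketch of the data structures implied here (assuming constant-color runs; the field names start, length, and color and the helper below are illustrative, not taken from the patent), a run can be held as a small record and the segment boundaries found by scanning the 2M buffered scanlines for run endpoints.

```python
from dataclasses import dataclass

@dataclass
class Run:
    start: int     # minimum position of the run in the fast (horizontal) scan direction
    length: int    # number of pixels in the run
    color: tuple   # constant color of the run, e.g. a CMYK tuple

# A scanline is a list of Runs; the scanline buffer holds 2M scanlines at a time.
def segment_boundaries(buffer):
    """Fast-scan positions where any run on any buffered scanline ends (assumed edges)."""
    positions = set()
    for scanline in buffer:
        for run in scanline:
            positions.add(run.start + run.length)
    return sorted(positions)
```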
  • In order to determine what traps are needed on the fast- and slow-scan edges, and adjacent corners, the four colors adjacent to the center “crosshairs” of the buffer at each segment boundary are inspected. The center intersection is formed by the segment boundary in the fast-scan direction between the two segments, and the scanline boundary between the center two scanlines of the buffer. There are four segments adjacent to these crosshairs, referred to as “upper left”, “upper right”, “lower left” and “lower right” corners. FIG. 2 shows a region of interest for trapping runs, with the central intersection and four relevant corner run segments. More than one of these may have the same color. In order to determine where traps, if any, need to be placed, the four corners are compared, yielding one of 13 possible geometries. FIG. 3 shows all of the possible geometries for the segment boundary inspection procedure. The geometry determines where traps might be needed, directing the software to specific code that handles each geometry case. In each of these thirteen cases, the colors are used as inputs to the trapping generator, and depending on the output of the trapping generator, the colors of all four runs within the crosshairs may be modified. Once all segment boundaries within the scanline buffer are processed, the topmost scanline is output (to the printer, for example), and a new scanline is read into the bottom of the buffer. The trapping procedure is then repeated for this new set of scanlines within the buffer, until the entire image is processed. [0025]
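  • The corner comparison can be pictured with the following sketch, which builds on the Run record above and is likewise only an assumption about one possible implementation: given the two center scanlines of the buffer and a segment boundary position x, it looks up the colors of the four runs that meet at the crosshairs.

```python
def run_at(scanline, x):
    """Return the run on a scanline that covers fast-scan position x, if any."""
    for run in scanline:
        if run.start <= x < run.start + run.length:
            return run
    return None

def corner_colors(upper_scanline, lower_scanline, x):
    """Colors of the upper-left/right and lower-left/right runs at an interior segment boundary x."""
    return {
        "upper_left":  run_at(upper_scanline, x - 1).color,
        "upper_right": run_at(upper_scanline, x).color,
        "lower_left":  run_at(lower_scanline, x - 1).color,
        "lower_right": run_at(lower_scanline, x).color,
    }
```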
  • A thin line discrimination process for use in trapping as described in McElvain (U.S. application Ser. No. 09/538,596 filed Mar. 29, 2000, for “Method for Trapping Suppression for Thin Graphical Objects Using Run Length Encoded Data”) includes: 1) Collecting in a buffer a number of scanlines of run length encoded image data equal to twice the minimum line width threshold (M). For an example, assume a threshold of two pixels (M=2), so the number of scanlines collected in the buffer would be four; 2) Determining the next “center” point that may require trapping, the point being the intersection of a vertical and horizontal line. The horizontal line is always in the center of the scan lines. Thus, in the case of four lines in the buffer, the horizontal line will have two lines above and two below. The vertical line will be through the next point at which any of the four scan lines exhibits a run (color) boundary (denoted a “segment boundary”). The colors of the pixels surrounding the point of intersection (upper left, upper right, lower left, lower right) are sent to the trapping generator and the color changes that should be made are received back; 3) If trapping is required in the fast scan direction, do so only if the length of the run in the fast scan direction is greater than M. That is, in the example, the trapping color change should be made only if the pixel to be changed is in a run of M or more pixels; 4) If trapping of a run is required in the slow scan direction, do so only if there are M runs above or below of the same color; 5) Repeat this sequence to find the next center point, from step 2; and 6) At the end of the scan, output the topmost scan line in the buffer for printing (or further image processing), and roll the remaining scanlines up one line (e.g., scanline 2 becomes 1, 3 becomes 2, and 4 becomes 3), as sketched below. A new scanline of image data is then added at the bottom, and the process is repeated from step 2. [0026]
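  • Step 6 of this sequence (emit the topmost scanline and roll the buffer) might look like the sketch below; read_next_scanline is a hypothetical source of new run length encoded scanline data.

```python
def advance_buffer(buffer, emit, read_next_scanline):
    """Output the topmost scanline, shift the others up one slot, and append a fresh scanline."""
    emit(buffer[0])            # topmost line goes to printing or further image processing
    buffer[:-1] = buffer[1:]   # scanline 2 becomes 1, 3 becomes 2, and so on
    buffer[-1] = read_next_scanline()
```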
  • The method for smooth trapping of an object containing run length encoded data uses the above described thin line discrimination method for trapping run length encoded data such that the trapping zone is reduced, rather than eliminated. The method includes: 1) collecting a number of scanlines equal to twice the line width threshold (denoted M) in a buffer; 2) determining those runs that require trapping; 3) if trapping is required in the fast scan direction, trapping in a normal fashion if the length of the run in the fast scan direction is greater than M—otherwise reducing the number of pixels to trap by a predetermined relationship (such as prorated percent); and 4) if trapping of a run is required in the slow scan direction, trapping in a normal fashion if at least M runs directly above or below possess the same color as the run to be trapped—otherwise reducing the number of runs (above or below) to be trapped by a predetermined relationship (such as a prorated percent). The result is a method for suppression of small object trapping hue shifts, and a preservation of trap continuity. [0027]
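  • The per-boundary decision, with reduction in place of outright suppression, could be organized as in the sketch below; the prorated relationship is one of the predetermined relationships mentioned above, and the function and parameter names are assumptions for illustration.

```python
def prorated_trap_width(requested_width, object_dimension, M):
    """Prorate the trap width requested by the trap generator when the dimension is below M."""
    if object_dimension >= M:
        return requested_width                                  # trap in the normal fashion
    return int(round(requested_width * object_dimension / M))   # reduced, prorated trap

def fast_scan_trap_width(run, requested_width, M):
    # Fast scan: the relevant dimension is the length of the run to be trapped.
    return prorated_trap_width(requested_width, run.length, M)

def slow_scan_trap_width(same_color_runs_above_or_below, requested_width, M):
    # Slow scan: the relevant dimension is how many runs directly above or below share the color.
    return prorated_trap_width(requested_width, same_color_runs_above_or_below, M)
```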
  • An example of a trap suppression problem using run length encoded data is shown in FIG. 4. 2M scanlines worth of run data are collected in a buffer, where M is the minimum line width above which normal trapping is to be performed. In FIG. 4 M=5 (10 scanlines collected). As with the thin line discrimination method described above, segment boundaries (i.e., the fast scan positions where run boundaries occur) are located within the buffer; these are labeled 1-6 in FIG. 4. The method inspects the runs of the middle two scanlines (designated by the horizontal dashed lines in FIG. 4) that intersect at each segment boundary. A total of 4 runs will therefore be considered (upper left, upper right, lower left, and lower right) at each segment boundary. The trap generator is then called if two or more abutting runs have a different color. Based on the run configuration at this intersection (the color “geometry”) the implementation will determine whether to trap in the fast scan direction, the slow scan direction, or both. [0028]
  • If trapping of a run in the fast scan direction is deemed necessary (by the trap generator), the proposed enhancement will first inspect the length of the run. If the length of the run is greater than M, then trapping will be performed as requested. However, if the run length is less than M, the implementation will trap, but with a reduced trap width. The amount of reduction of the trap width is determined by a predetermined relationship. In this example, the amount by which the trap width is reduced is a function of the trap width and the threshold, according to the relationship w′=woƒ(wt, w), where w is the length of the run, w′ and wo are the corrected and the original trap widths, respectively, and wt is the small object threshold. The weight function ƒ is unity above the threshold, and an increasing function below the threshold. An example weighting function ƒ is shown below in FIG. 5. [0029]
  • An example of a situation where such fast-scan discrimination would be applied can be found in FIG. 4. The dark (red) runs located between boundaries 4 and 5 are of length 4 pixels (less than M=5). If the trap generator specifies to trap into the red 2 pixels, the smooth trapping discriminator may only trap 1 pixel, according to the relationship shown in FIG. 5. However, the red runs located beyond boundary 6 are greater in length than M, so they will be modified according to the specification of the trap generator. [0030]
  • If trapping of a run in the slow scan direction is required, the procedure is more difficult. In this case, it is necessary to inspect the runs immediately above or below the run in question for comparison of colors. If it is found that the color (or a color that is close) extends above or below at least M scanlines, then trapping will be performed as specified by the trap generator. Otherwise the trapping distance specified by the trap generator will be reduced by an amount according to the weighting function (shown in FIG. 5). A good example of regions where this discrimination would be applied can be found in FIG. 4. Considering the region between the segment boundaries 1 and 2, assume trapping of the cyan (light) run is requested in the slow scan direction (as a result of the red/cyan interface). Because the cyan run does not extend at least 5 pixels in the slow scan direction, the trapping distance (the number of runs above) will be reduced in a fashion consistent with FIG. 5. The result of this “smooth” thin object trapping discrimination is a significant reduction of small object hue shifts, as well as a preservation of trapping continuity at object edges. [0031]
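  • Counting how far a color extends in the slow-scan direction might be implemented as in the following sketch (names assumed; run_at is the helper sketched earlier): starting from the run's scanline, it walks upward or downward and stops at the first scanline whose run at the same fast-scan position has a different color, or once M matching scanlines have been found.

```python
def same_color_extent(buffer, scanline_index, x, color, M, step):
    """Count up to M scanlines above (step=-1) or below (step=+1) whose run at x matches color."""
    count = 0
    i = scanline_index + step
    while 0 <= i < len(buffer) and count < M:
        run = run_at(buffer[i], x)
        if run is None or run.color != color:
            break
        count += 1
        i += step
    return count
```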
  • Various predetermined relationships may be used. The predetermined relationship may be a monotonically increasing function of the object size, and is dependent on the threshold itself. For example, a function of the following type may be used: [0032]
  • t′=toƒ(wt, w) for w<wt, and t′=to when w≧wt, [0033]
  • where w is the width of the object, t′ is the reduced trap width, to is the original trap width as determined by the trap generator, and wt is the threshold width as determined by the trap generator. Weighting function ƒ is generally unity above the threshold, and an increasing function below the threshold (as w increases). An example weighting function ƒ that is monotonically increasing up to the threshold, and unity thereafter, is shown in FIG. 5. The weighting function in FIG. 5 can be a linear function of the form ƒ(wt, w)=to(w/wt). [0034]
  • Other weighting functions may also be applied. For example, a quadratic function which increases as w approaches wt may be used. Higher order, monotonically increasing functions may also be used. Any function that increases monotonically with object size and satisfies ƒ=1 when the object dimensions are greater than or equal to the threshold width determined by the trap generator may be used. [0035]
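  • Two weighting functions that satisfy these constraints (monotonically increasing below the threshold and unity at or above it) are sketched below; the linear form corresponds to the prorated reduction described above, and the quadratic form is the alternative mentioned in this paragraph. The names are illustrative only.

```python
def weight_linear(w, wt):
    """Linear weighting: w/wt below the threshold, 1 at or above it."""
    return min(w / wt, 1.0)

def weight_quadratic(w, wt):
    """Quadratic weighting: (w/wt)**2 below the threshold, 1 at or above it."""
    return min((w / wt) ** 2, 1.0)

def reduced_trap_width(t_original, w, wt, weight=weight_linear):
    # t' = to * f(wt, w) for w < wt, and t' = to when w >= wt
    return t_original * weight(w, wt)
```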
  • The method for smooth trapping may be applied to small objects, i.e., objects in which both dimensions (length and width) are less than the trap threshold width of a trap generator (or objects of only a few pixels). A dimension of the small object is determined and compared with the trap threshold width. If the object's dimension is less than the trap threshold width, a predetermined relationship (such as one of those described above with respect to thin objects) is used to reduce the trap region. Then trapping is applied uniformly to the entire small object using the reduced trap width. [0036]
  • The method of the invention can be used for font size discrimination, where the reduced trap width would be applied uniformly to the font. Text objects are generally defined by their font size. If a text character is one color and it is positioned on top of a different colored background (or plain paper), misregistration errors may appear as a shadowing effect. For very small font sizes, many trap generators would simply turn off the trap process. The method of the invention can be applied to reduce the trap width based on a predetermined relationship and apply the trap pixels uniformly to the text characters. [0037]
  • The result of this “smooth” thin (and small) object trapping discrimination is a significant reduction of small object hue shifts, as well as a preservation of trapping continuity at object edges. For example, for a trap generator with a trap threshold of 10 pixels (wt) and a trap radius of 3 pixels (to), an object of width 7 pixels (w) might have a corrected trap radius of 2 pixels (t′). [0038]
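  • Checking the numbers in this example with a linear weighting of the kind sketched above: t′ = to(w/wt) = 3 × 7/10 = 2.1, which is consistent with the corrected whole-pixel trap radius of 2 stated here.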
  • The invention has been described with reference to particular embodiments for convenience only. Modifications and alterations will occur to others upon reading and understanding this specification taken together with the drawings. The embodiments are but examples, and various alternatives, modifications, variations or improvements may be made by those skilled in the art from this teaching which are intended to be encompassed by the following claims. [0039]

Claims (7)

What is claimed is:
1. A method for smooth trapping of an object containing run length encoded image pixel data, comprising:
collecting a number of scanlines of run length encoded pixel data equal to 2M in a buffer, wherein M is a line width trap threshold;
determining those runs within the collected scanlines that require trapping, wherein a run is a portion of a scanline;
determining if a run requiring trapping is located in a fast scan direction or a slow scan direction;
if trapping is required in the fast scan direction, determining the length of the run requiring trapping;
if the length of the run requiring trapping is less than M, reducing the width of the trap region according to a predetermined relationship; and
applying a trap correction to the trap pixel according to the reduced trap region width.
2. The method of claim 1, wherein the predetermined relationship reduces the number of pixels to trap by a prorated percent.
3. The method of claim 1, further comprising:
if trapping is required in the slow scan direction, determining the number of runs above and below the run to be trapped that possess the same color as the run to be trapped;
if the number of runs above and below the run to be trapped is less than M, reducing the number of runs above and below the run to be trapped according to a predetermined relationship.
4. The method of claim 3, wherein the predetermined relationship reduces the number of runs to trap by a prorated percent.
5. The method of claim 1, wherein the predetermined relationship comprises a relationship which reduces the trap region width monotonically.
6. The method of claim 1, wherein the predetermined relationship comprises:
t′=toƒ(wt, w)
for w<wt, and t′=to when w≧wt,
where w is the width of the object, wt is the trap threshold width, to is an original trap region width as determined by a trap generator and t′ is the reduced trap region width.
7. The method of claim 3, wherein ƒ(wt, w)=to(w/wt).
US10/411,562 2003-04-10 2003-04-10 Method for smooth trap suppression of small graphical objects using run length encoded data Expired - Fee Related US7271934B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/411,562 US7271934B2 (en) 2003-04-10 2003-04-10 Method for smooth trap suppression of small graphical objects using run length encoded data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/411,562 US7271934B2 (en) 2003-04-10 2003-04-10 Method for smooth trap suppression of small graphical objects using run length encoded data

Publications (2)

Publication Number Publication Date
US20040201865A1 (en) 2004-10-14
US7271934B2 US7271934B2 (en) 2007-09-18

Family

ID=33131014

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/411,562 Expired - Fee Related US7271934B2 (en) 2003-04-10 2003-04-10 Method for smooth trap suppression of small graphical objects using run length encoded data

Country Status (1)

Country Link
US (1) US7271934B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030011796A1 (en) * 2001-06-15 2003-01-16 Michael Kohn Method of producing traps in a print page
EP1727354A2 (en) * 2005-05-24 2006-11-29 Xerox Corporation Page edge correction systems and methods
US9135535B1 (en) * 2014-06-09 2015-09-15 Xerox Corporation Method and system for prorating trapping parameters globally with respect to object size

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011081192A (en) * 2009-10-07 2011-04-21 Fuji Xerox Co Ltd Image forming apparatus and pixel control program
US8995021B2 (en) 2013-06-12 2015-03-31 Xerox Corporation Black trapping methods, apparatus and systems for binary images
US9367895B2 (en) * 2014-03-19 2016-06-14 Digitalglobe, Inc. Automated sliver removal in orthomosaic generation

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5687303A (en) * 1994-05-18 1997-11-11 Xerox Corporation Printer controller for object optimized printing
US5923821A (en) * 1996-09-27 1999-07-13 Xerox Corporation Digital image trapping system
US20010033686A1 (en) * 1998-10-22 2001-10-25 Xerox Corporation. Method for automatic trap selection for correcting for separation misregistration in color printing
US6327043B1 (en) * 1994-05-18 2001-12-04 Xerox Corporation Object optimized printing system and method
US6341020B1 (en) * 1998-12-28 2002-01-22 Xerox Corporation Anamorphic object optimized function application for printer defect pre-compensation
US20020024679A1 (en) * 2000-08-23 2002-02-28 Axel Hauck Method of minimizing trapping, I.E., choking or spreading, in a printing-original production process
US6377711B1 (en) * 1999-06-30 2002-04-23 Xerox Corporation Methods and systems for detecting the edges of objects in raster images using diagonal edge detection
US20030011796A1 (en) * 2001-06-15 2003-01-16 Michael Kohn Method of producing traps in a print page
US20030044065A1 (en) * 1999-09-30 2003-03-06 Steven J. Harrington Method and apparatus for implementing a trapping operation on a digital image
US6549303B1 (en) * 1999-09-20 2003-04-15 Hewlett-Packard Company Trapping methods and arrangements for use in printing color images
US6757072B1 (en) * 2000-03-29 2004-06-29 Xerox Corporation Method for trapping suppression for thin graphical objects using run length encoded data
US6781720B1 (en) * 1999-11-30 2004-08-24 Xerox Corporation Gradient-based trapping using patterned trap zones
US6795214B2 (en) * 1999-03-19 2004-09-21 Heidelberger Druckmaschinen Ag Method for generating trapping contours in a print page
US6844942B2 (en) * 1999-09-29 2005-01-18 Xerox Corporation Method for trapping raster data in a run-length encoded form
US6970271B1 (en) * 2001-08-03 2005-11-29 Adobe Systems Incorporated Device independent trap color specification
US7009735B2 (en) * 2002-01-07 2006-03-07 Xerox Corporation Method for black trapping and under print processing
US7123381B2 (en) * 1998-10-22 2006-10-17 Xerox Corporation System and method of trapping for correcting for separation misregistration in color printing

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6006013A (en) * 1994-05-18 1999-12-21 Xerox Corporation Object optimized printing system and method
US6256104B1 (en) * 1994-05-18 2001-07-03 Xerox Corporation Object optimized printing system and method
US6327043B1 (en) * 1994-05-18 2001-12-04 Xerox Corporation Object optimized printing system and method
US5687303A (en) * 1994-05-18 1997-11-11 Xerox Corporation Printer controller for object optimized printing
US5923821A (en) * 1996-09-27 1999-07-13 Xerox Corporation Digital image trapping system
US7123381B2 (en) * 1998-10-22 2006-10-17 Xerox Corporation System and method of trapping for correcting for separation misregistration in color printing
US20010033686A1 (en) * 1998-10-22 2001-10-25 Xerox Corporation. Method for automatic trap selection for correcting for separation misregistration in color printing
US6345117B2 (en) * 1998-10-22 2002-02-05 Xerox Corporation Method for automatic trap selection for correcting for separation misregistration in color printing
US6341020B1 (en) * 1998-12-28 2002-01-22 Xerox Corporation Anamorphic object optimized function application for printer defect pre-compensation
US6795214B2 (en) * 1999-03-19 2004-09-21 Heidelberger Druckmaschinen Ag Method for generating trapping contours in a print page
US6377711B1 (en) * 1999-06-30 2002-04-23 Xerox Corporation Methods and systems for detecting the edges of objects in raster images using diagonal edge detection
US6549303B1 (en) * 1999-09-20 2003-04-15 Hewlett-Packard Company Trapping methods and arrangements for use in printing color images
US6844942B2 (en) * 1999-09-29 2005-01-18 Xerox Corporation Method for trapping raster data in a run-length encoded form
US20030044065A1 (en) * 1999-09-30 2003-03-06 Steven J. Harrington Method and apparatus for implementing a trapping operation on a digital image
US6738159B2 (en) * 1999-09-30 2004-05-18 Xerox Corporation Method and apparatus for implementing a trapping operation on a digital image
US6781720B1 (en) * 1999-11-30 2004-08-24 Xerox Corporation Gradient-based trapping using patterned trap zones
US6757072B1 (en) * 2000-03-29 2004-06-29 Xerox Corporation Method for trapping suppression for thin graphical objects using run length encoded data
US20020024679A1 (en) * 2000-08-23 2002-02-28 Axel Hauck Method of minimizing trapping, I.E., choking or spreading, in a printing-original production process
US20030011796A1 (en) * 2001-06-15 2003-01-16 Michael Kohn Method of producing traps in a print page
US6970271B1 (en) * 2001-08-03 2005-11-29 Adobe Systems Incorporated Device independent trap color specification
US7009735B2 (en) * 2002-01-07 2006-03-07 Xerox Corporation Method for black trapping and under print processing

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030011796A1 (en) * 2001-06-15 2003-01-16 Michael Kohn Method of producing traps in a print page
US7173738B2 (en) * 2001-06-15 2007-02-06 Heidelberger Druckmaschinen Ag Method of producing traps in a print page
EP1727354A2 (en) * 2005-05-24 2006-11-29 Xerox Corporation Page edge correction systems and methods
EP1727354A3 (en) * 2005-05-24 2007-09-12 Xerox Corporation Page edge correction systems and methods
US9135535B1 (en) * 2014-06-09 2015-09-15 Xerox Corporation Method and system for prorating trapping parameters globally with respect to object size

Also Published As

Publication number Publication date
US7271934B2 (en) 2007-09-18

Similar Documents

Publication Publication Date Title
US5737455A (en) Antialiasing with grey masking techniques
US6236754B1 (en) Image modification to reduce susceptibility to misregistration
US6345117B2 (en) Method for automatic trap selection for correcting for separation misregistration in color printing
US7123381B2 (en) System and method of trapping for correcting for separation misregistration in color printing
KR100607018B1 (en) Image processor, image processing method, and medium on which image processing program is recorded
US8103104B2 (en) Text extraction and its application to compound document image compression
US7746505B2 (en) Image quality improving apparatus and method using detected edges
US6839151B1 (en) System and method for color copy image processing
US7619627B2 (en) Image processing apparatus
US7746503B2 (en) Method of and device for image enhancement
EP0590852A2 (en) Color separation in color graphics printing with limited memory
US6844942B2 (en) Method for trapping raster data in a run-length encoded form
US20120114230A1 (en) Image processing apparatus, image processing method, and storage medium
US7271934B2 (en) Method for smooth trap suppression of small graphical objects using run length encoded data
JP4386216B2 (en) Color printing system and control method thereof
US7146043B2 (en) Method for smooth trap suppression of small graphical objects
US6289122B1 (en) Intelligent detection of text on a page
US20060007496A1 (en) Method for smooth trapping suppression of small graphical objects using color interpolation
JP3073837B2 (en) Image region separation device and image region separation method
EP0680194B1 (en) Image processing device and image output device converting binary image into multi-valued image
JP4963559B2 (en) Image processing apparatus, image forming apparatus, image processing method, and program for causing computer to execute the method
JPH0662230A (en) Image forming device
JP4091174B2 (en) Image processing apparatus and image processing method
JP4616199B2 (en) Image processing apparatus and program
JPH06150059A (en) Image area separator

Legal Events

Date Code Title Description
AS Assignment

Owner name: XEROX CORPORATION, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MCELVAIN, JON S.;REEL/FRAME:013991/0271

Effective date: 20030410

AS Assignment

Owner name: JPMORGAN CHASE BANK, AS COLLATERAL AGENT, TEXAS

Free format text: SECURITY AGREEMENT;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:015134/0476

Effective date: 20030625

Owner name: JPMORGAN CHASE BANK, AS COLLATERAL AGENT,TEXAS

Free format text: SECURITY AGREEMENT;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:015134/0476

Effective date: 20030625

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190918

AS Assignment

Owner name: XEROX CORPORATION, CONNECTICUT

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A. AS SUCCESSOR-IN-INTEREST ADMINISTRATIVE AGENT AND COLLATERAL AGENT TO JPMORGAN CHASE BANK;REEL/FRAME:066728/0193

Effective date: 20220822