US6859228B1 - Least squares method for color misregistration detection and correction in image data - Google Patents

Least squares method for color misregistration detection and correction in image data Download PDF

Info

Publication number
US6859228B1
US6859228B1 (application US09/419,602; US41960299A)
Authority
US
United States
Prior art keywords
pixel
current pixel
color
color misregistration
window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/419,602
Inventor
William Ho Chang
Makoto Otsu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Laboratories of America Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Laboratories of America Inc filed Critical Sharp Laboratories of America Inc
Priority to US09/419,602 priority Critical patent/US6859228B1/en
Assigned to SHARP LABORATORIES OF AMERICA, INCORPORATED reassignment SHARP LABORATORIES OF AMERICA, INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHANG, WILLIAM HO, OTSU, MAKOTO
Application granted granted Critical
Publication of US6859228B1 publication Critical patent/US6859228B1/en
Assigned to SHARP KABUSHIKI KAISHA reassignment SHARP KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHARP LABORATORIES OF AMERICA, INC.
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46: Colour picture communication systems
    • H04N1/56: Processing of colour picture signals
    • H04N1/58: Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • H04N23/15: Image signal generation with circuitry for avoiding or correcting image misregistration

Abstract

A method for color misregistration detection. Input image data is buffered and transferred to vector space. An examination window for a current pixel is established and foreground and background pixels within that window are selected. The current pixel is examined to determine if it is in an edge of a scanned object. If the current pixel is in an edge, it is deemed to have color misregistration. For pixels that have been deemed to have color misregistration, a correction value is determined and then adjusted prior to being applied to the pixel value.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to methods of image capture, more particularly for methods of detecting color misregistration in image capture devices.
2. Background of the Invention
Color image capture devices typically operate by capturing primary color component signals such as red, green and blue (RGB) from a set of charge coupled devices (CCDs). The CCDs are normally arranged in the main scan direction. The sub-scan direction, the direction in which the scanning bar moves, will be referred to as the Y direction and the main scan direction, perpendicular to the sub-scan direction, will be referred to as X.
These CCDs capture the image in one pass or in three passes, one for each primary color component. Regardless of the number of passes, however, there is typically some misalignment in the RGB signals. This misalignment between colors is referred to as color misregistration. It is caused by faulty superposition of the three colors. It normally manifests itself as color fringes on the edges of the objects that were scanned, either text, graphics or drawings.
Color fringes normally appear as either cyan or magenta fringes on the edges of the scanned objects. Cyan fringes result from misregistration of the red signal, and magenta fringes result from misregistration of the green signal. The human eye does not normally detect misregistration of the blue signal, because of its low bandwidth and low contrast sensitivity.
Most often, color misregistration occurs in the Y direction. Vibration, scanning motion and the mechanical or optical design of the scanner can lead to faulty superposition of the three-color components. Several different approaches have been taken to solve this problem.
For example, some efforts have been directed at correcting the mechanical problems in the scanner by tracking registration marks. One example of these types of techniques is found in U.S. Pat. No. 5,737,003, issued on Apr. 7, 1998. In this patent, a laser scanner used to form latent images on the photoconductive belt is used to detect the position of the edge of the belt. The belt is then controlled to reduce the deviation of the belt from its path. It also includes a method for controlling the laser, and therefore the formation of the image, based upon the position of the belt.
Another of these mechanical registration techniques is found in U.S. Pat. No. 5,774,156, issued Jun. 30, 1998. The system uses several stations, one for each color of toner. The latent image formed by the individual scanners at the stations includes a registration area. The registration area is then aligned prior to the application of the toner. The registration area is then recharged to avoid having the registration marks attract any toner. This is repeated at each station to ensure proper positioning of the image before the latent image for the next color is formed.
U.S. Pat. No. 5,760,815, issued Jun. 2, 1998, shows another method. In this patent, a fiber optic detection means is used to detect registration signals produced by a retroreflector. The light from the retroreflector is analyzed and used to adjust the registration of the belt.
Other methods have focused on optical means to correct the misregistration. An example of these types of techniques can be found in U.S. Pat. No. 4,583,116, issued Apr. 15, 1986. In this patent, the color signals are manipulated to convert them into color separation signals for cyan, magenta, yellow and black. The edges of each of the colors are then detected and manipulated to switch lighter areas with darker areas, or vice versa, to avoid streaks and other imperfections.
Several other types of techniques are used to detect color misregistration at the data level. Examples of these are found in U.S. Pat. Nos. 5,500,746, 5,907,414, 5,477,335, and 5,764,388. In U.S. Pat. No. 5,500,746, issued Mar. 19, 1996, the signals are manipulated to ensure that the dots formed are in line both in the X and Y directions for each color. The dots are resampled and repositioned as determined by line correction devices.
In U.S. Pat. No. 5,907,414, issued May 25, 1999, one of the more powerful prior art methods is shown. An image sensor used to scan a manuscript generates signals, and these signals are examined. If the examination of the signals determines that a pixel lies at an edge of a letter image, it is identified as such. These identified pixels are then adjusted in their brightness relative to the green plane to restore a smooth edge that was disrupted by vibration of the image sensor.
A less sophisticated but still useful technique is shown in U.S. Pat. No. 5,764,388, issued Jun. 9, 1998. In this patent, the cyan-magenta-yellow components of a pixel are analyzed. If the chrominance of the signal is less than a threshold, it is set to zero to offset an assumed color misregistration error.
The current state of the art, as demonstrated above, has several limitations in the color misregistration area. The techniques are inaccurate and do not adapt to local conditions in the amount of color misregistration detected. Typically, only one threshold level is used to determine whether a pixel suffers from misregistration, which then causes the correction to be applied. These processes are very susceptible to false detection or overcorrection, degrading image quality and losing image information. Sharp transitional artifacts can appear in the image where corrected pixels lie next to uncorrected pixels.
Therefore, a method is needed that detects and corrects color misregistration more reliably and applies necessary corrections in a more accurate manner.
SUMMARY OF THE INVENTION
One aspect of the invention is a method for detecting color misregistration in image data. Input image data is buffered as color space data. The color space data is then transferred to vector space. An examination window around a current pixel is established. Background and foreground pixels in this window are determined. The current pixel is examined to determine if it is on an edge of a scanned object, either text, image or graphic. If the pixel meets all these requirements it is deemed to have color misregistration. A correction value is then determined and adjusted before being applied to the pixel value. The correction value can be determined using normal least squares. It can be adjusted by the application of fuzzy logic.
Other aspects of the invention include the above method with an optional edge detection step that occurs before any detailed analysis is performed. This initial sorting can speed the process by narrowing the number of pixels that need to undergo the detailed analysis.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present invention and for further advantages thereof, reference is now made to the following Detailed Description taken in conjunction with the accompanying Drawings in which:
FIG. 1 shows a flowchart of one embodiment of a method for detecting color misregistration in accordance with the invention.
FIG. 2 shows a schematic representation of a pixel layout in the sub-scan direction, in accordance with the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Vector-based analysis of color misregistration performs color misregistration detection using vector manipulation in RGB color space. This vector manipulation could also be performed in the cyan-magenta-yellow (CMY) or cyan-magenta-yellow-black (CMYK) color space. However, as the intent of this invention is analysis upon color capture from an image capture device, and these devices typically use RGB color space, the discussion of this invention will focus on RGB color space. That is in no way intended to limit the applicability of this invention.
FIG. 1 shows a flow chart of one embodiment of a process for detection of color misregistration in accordance with the invention. In step 10, the input data is received from the color image capture device, typically RGB data. The data is digitized and buffered. For purposes of this discussion, the data is assumed to be digitized at eight bits per color. This data is then processed in RGB vector space by color misregistration detection circuitry or a detection process. The invention could be implemented in software or in hardware.
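A minimal sketch, assuming the captured data arrives as raw bytes and is held as one scan line of 8-bit RGB pixels, illustrates the kind of buffering meant in step 10; the function name and array layout below are illustrative, not part of the patent:

    import numpy as np

    def buffer_scan_line(raw_bytes: bytes, width: int) -> np.ndarray:
        """Interpret raw capture data as one scan line of `width` RGB pixels, 8 bits per color."""
        line = np.frombuffer(raw_bytes, dtype=np.uint8)[: width * 3]
        return line.reshape(width, 3)  # shape (width, 3); columns are R, G, B

    # Example with a synthetic 8-pixel line
    line = buffer_scan_line(bytes(range(8 * 3)), width=8)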
In step 12, the line selected in step 10 is then transferred to vector space. The vector space uses two color pixels, which will be referred to as pixel A and pixel B. The two color vectors have the following notation:
PA = (Ra, Ga, Ba); and PB = (Rb, Gb, Bb).
The gradient between the two pixels is defined to be
dAB = (dRAB, dGAB, dBAB), and its magnitude is DAB = magnitude(dAB).
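For illustration only, the pixel-vector notation and gradient above might be computed as follows; the function names are ours, and the patent does not prescribe any particular implementation:

    import numpy as np

    def as_vector(pixel) -> np.ndarray:
        """Treat an (R, G, B) pixel as a vector in RGB space, using floats for arithmetic."""
        return np.asarray(pixel, dtype=float)

    def gradient(p_a, p_b) -> np.ndarray:
        """Component-wise gradient dAB = (dRAB, dGAB, dBAB) between pixels A and B."""
        return as_vector(p_b) - as_vector(p_a)

    def gradient_magnitude(p_a, p_b) -> float:
        """Magnitude DAB of the gradient between pixels A and B."""
        return float(np.linalg.norm(gradient(p_a, p_b)))

    # Example: two pixels on either side of a dark-to-light transition
    P_A, P_B = (20, 30, 40), (200, 190, 180)
    D_AB = gradient_magnitude(P_A, P_B)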
Once the initial data for the RGB color space is buffered and transferred to the vector space, several steps can be performed that will narrow the possible pixels with color misregistration. Prior to performing any of these steps, however, an examination area or window of interest must be established as shown in step 14. The size and direction of this window are left to the designer. For purposes of the discussion only, a window of 5 pixels by 1 pixel will be assumed. A schematic representation of this type of window is shown in FIG. 2.
The pixel of interest is pixel 0. Two pixels on either side, before (−) and after (+) the pixel of interest are used in the analysis. In this example, these 5 pixels are in the sub-scan, or Y, direction. Only one pixel width is used in the scan direction. As has been mentioned, the dimensions of the window are left up to the designer.
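As one hedged illustration of the 5-by-1 window of FIG. 2, assuming the buffered image is an array indexed as image[y, x, channel] with y the sub-scan direction (that layout is an assumption of this sketch):

    import numpy as np

    def examination_window(image: np.ndarray, x: int, y: int, half: int = 2) -> np.ndarray:
        """Return the pixels at rows y-2 .. y+2 of column x (clamped at the image border)."""
        height = image.shape[0]
        ys = np.clip(np.arange(y - half, y + half + 1), 0, height - 1)
        return image[ys, x, :].astype(float)  # shape (5, 3); row `half` is the current pixel 0

    # Example on a synthetic 10 x 10 RGB image
    image = np.random.randint(0, 256, size=(10, 10, 3), dtype=np.uint8)
    window = examination_window(image, x=4, y=5)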
Having established the window of pixels to be examined around the pixel of interest, it is now possible to quickly determine whether detailed analysis of that pixel is necessary. In step 16 of the process shown in FIG. 1, the pixel is analyzed to determine whether or not it is on an edge. Edge detection may be performed in many ways. For example, a Sobel filter or a gradient filter may be used.
However, in the instant invention, a special gradient edge detector can also be used. Using the window established in step 16, gradients between the pixel of interest and its neighbors are determined. If the gradients fall below a predetermined threshold, no edge is detected. Since color misregistration typically occurs at the edges of scanned objects, such as text, drawings or images, pixels not on an edge are not considered to be candidates for color misregistration. If the result of edge detection at step 16 is negative, the process continues to step 28 and ends with respect to that pixel.
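A sketch of such a gradient pre-check, assuming the 5-by-1 window above and an arbitrary placeholder threshold (the patent leaves the threshold value to the designer), might look like this:

    import numpy as np

    def is_edge_candidate(window: np.ndarray, threshold: float = 40.0) -> bool:
        """window: (5, 3) float array of RGB pixels; row 2 is the current pixel 0."""
        current = window[2]
        distances = np.linalg.norm(window - current, axis=1)  # gradient magnitude to each neighbor
        return bool(np.any(distances > threshold))            # no large gradient means no edge candidate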
It must be noted that step 16 performs only an initial determination of edge detection. A much more detailed analysis is performed further in the process. Step 16 is an optional step, which can speed the process by further narrowing the pixels upon which more advanced computations must be performed.
If the result of edge detection in step 16 is positive, the process moves on to step 18 to differentiate between foreground pixels and background pixels. Again, there are several options for this determination; for this discussion, one of two approaches will be used. The pixels within the window can be analyzed to find the darkest and lightest pixels. Alternatively, the pixels could be compared against a predetermined pattern. Later in the process, the current pixel will be analyzed in comparison to the foreground and background, so the identification of these components of the image is important.
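The first approach could be sketched as below. Treating the darkest pixel as foreground and the lightest as background, and ranking pixels with the luminance approximation given later in the description, are assumptions of this sketch; the patent leaves the exact selection rule to the designer:

    import numpy as np

    def select_foreground_background(window: np.ndarray):
        """Return (Pa, Pb): darkest and lightest pixels of the window by approximate luminance."""
        luma = 0.5 * window[:, 1] + 0.3 * window[:, 0] + 0.2 * window[:, 2]  # 0.5G + 0.3R + 0.2B
        p_a = window[np.argmin(luma)]  # foreground: darkest pixel in the window (assumed convention)
        p_b = window[np.argmax(luma)]  # background: lightest pixel in the window (assumed convention)
        return p_a, p_b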
Once the determination of foreground or background is made, the process moves to object detection at step 20. As discussed above, scanned objects can include text characters, drawings or images. The edges of any of these objects are candidates for color misregistration. Again, there are several methods for determining if a pixel is part of a scanned object or not.
For purposes of this discussion, a two-step method will be used for step 20 of the process. The first step will be to check the gradient of the pixel of interest. To be in the edge of an object, the gradient between the foreground and background must be higher than the gradients between the current pixel and the background, and the current pixel and the foreground. Pixel 0 is the designation of the current pixel under study. Using D as the magnitude of the gradients, and a and b for foreground and background:
D(a,b)>D(a,0); and D(a,b)>D(b,0).
In addition to the gradient check, a luminance check may also be performed. Some approximation is used to convert the foreground (a), background (b), and current pixel (0) to luminance values. One example of such a conversion is shown below using the foreground values:
L(a)=0.5G(a)+0.3R(a)+0.2B(a).
To be in the edge of an object, the luminance of the current pixel must be between the foreground and background luminance values.
L(b)<L(0)<L(a); or, L(a)<L(0)<L(b).
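The gradient and luminance conditions above can be written out as a small test. This is an illustrative sketch only, with D() the gradient magnitude between two pixel vectors and L() the luminance approximation given above:

    import numpy as np

    def D(p, q) -> float:
        """Gradient magnitude between two RGB pixel vectors."""
        return float(np.linalg.norm(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

    def L(p) -> float:
        """Approximate luminance, L = 0.5G + 0.3R + 0.2B."""
        r, g, b = np.asarray(p, dtype=float)
        return 0.5 * g + 0.3 * r + 0.2 * b

    def in_object_edge(p_a, p_b, p_0) -> bool:
        gradient_ok = D(p_a, p_b) > D(p_a, p_0) and D(p_a, p_b) > D(p_b, p_0)
        luminance_ok = L(p_b) < L(p_0) < L(p_a) or L(p_a) < L(p_0) < L(p_b)
        return gradient_ok and luminance_ok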
If the results of this step are positive, and the pixel is in the edge of a scanned object, then the process continues on to step 22. If the results are negative, this pixel is eliminated as a candidate for color misregistration. The process will continue to step 28 and ends with regard to this pixel.
This narrowing process of eliminating pixels that are not good candidates for color misregistration helps speed the process. Only pixels that are good candidates undergo detailed analysis. This detailed analysis is performed at step 22.
At this point in the process, the pixel is deemed to be a pixel with color misregistration. At step 22, linear interpolation and normal least square projection are used to find the optimum correction point. Linear interpolation is accomplished by linearly connecting the foreground pixel Pa and the background pixel Pb discussed with reference to step 18. For example, a straight line connecting pixels a and b can be represented by:
Line(a,b): (R − Ra)/(Rb − Ra) = (G − Ga)/(Gb − Ga) = (B − Ba)/(Bb − Ba).
It must be noted that many other types of interpolation exist and are equally applicable. Some may result in equal or better performance in different circumstances.
The purpose of using least square projection is to find the point on the interpolation line that has minimum energy (squared distance) to the current pixel P0. The normal least square projection can be represented by:
Minimize(Distance(P0 − Line(a,b))^2).
This embodiment uses the normal least square approach, which is different from the traditional least square. In normal least square, the projection direction is perpendicular to the object and is independent of the coordinate system used. The normal least square projection is independent of the rotation in the coordinate system or the rotation of the object. It is also independent of the pixel values used in the current projection and interpolation for P0, Pa, or Pb.
The above equation can be solved in several different ways, including analytical geometry, vector calculus, linear algebra, or other optimization techniques. For example, the above equation could be solved by parameterizing the equation for Line(a,b) in t, using:
Line(t) = (Pb − Pa)t + Pa.
This equation is then substituted into the normal least squares equation above, which yields the value t corresponding to the projection point in the least squares approach. This value can then be used as the correction value for each color fringing pixel.
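One way to carry this out, offered only as a sketch, is the closed-form perpendicular projection that results from substituting Line(t) into the minimization: t = dot(P0 − Pa, Pb − Pa) / |Pb − Pa|^2. Clamping t to [0, 1] keeps the corrected value between the foreground and background pixels; that clamp is an assumption of the sketch, not a step stated in the patent:

    import numpy as np

    def normal_least_squares_correction(p_a, p_b, p_0) -> np.ndarray:
        """Project the current pixel P0 onto the line through Pa and Pb in RGB space."""
        p_a, p_b, p_0 = (np.asarray(p, dtype=float) for p in (p_a, p_b, p_0))
        direction = p_b - p_a                      # assumes foreground and background differ
        t = float(np.dot(p_0 - p_a, direction) / np.dot(direction, direction))
        t = min(max(t, 0.0), 1.0)                  # assumed clamp to the foreground-background segment
        return p_a + t * direction                 # candidate corrected pixel value

    # Example: a fringing pixel pulled onto the foreground-background line
    corrected = normal_least_squares_correction((20, 30, 40), (200, 190, 180), (60, 150, 70))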
Once the color correction value is determined in step 22, the value is adjusted using fuzzy logic in step 24. Fuzzy logic is a methodology developed by Professor Lotfi A. Zadeh at the University of California at Berkeley in 1965. It is used generally to describe a tool that allows intermediate values to be defined inside the range of conventional evaluations such as ON/OFF, YES/NO. The approach allows these intermediate values to be formulated mathematically and then processed by computers.
As applied in this embodiment, fuzzy logic results in varying the degree of correction applied to a pixel with color misregistration, beyond the current approach in the art of either correcting or not correcting a particular pixel. Merely as an example, the fuzzy logic could be applied in the following manner.
If the contrast of the gradient between the foreground and background is high, then the amount of correction is high. If the contrast between the foreground and background is low, then the amount of correction is low. This determination could be made using the most simple linear fuzzy approximation, or triangular function. The complexity of the fuzzy logic applied is only constrained by the selections of the system design and the system operating conditions.
A second fuzzy objective could then take into account how close a current pixel P0 is to either the foreground or background. If it is close enough to indicate a borderline condition, the amount of correction is reduced. This could be implemented by using a step function to cut the correction factor in half in locations near the borderline. The exact designation of what constitutes a borderline condition can be adjusted to fit a particular system or application during implementation. These objectives are merely for demonstrative purposes and are not intended to limit the applicability of multivalued determinations to any particular set of objectives.
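A loose sketch of these two objectives follows; the contrast breakpoints and the borderline distance are placeholder values chosen for illustration, not figures from the patent:

    import numpy as np

    def fuzzy_correction_weight(p_a, p_b, p_0,
                                low_contrast: float = 30.0,
                                high_contrast: float = 150.0,
                                borderline: float = 15.0) -> float:
        p_a, p_b, p_0 = (np.asarray(p, dtype=float) for p in (p_a, p_b, p_0))
        contrast = float(np.linalg.norm(p_b - p_a))
        # Objective 1: simple linear (triangular) ramp -- higher contrast, stronger correction.
        weight = float(np.clip((contrast - low_contrast) / (high_contrast - low_contrast), 0.0, 1.0))
        # Objective 2: if the pixel is already very close to foreground or background, halve the correction.
        if min(np.linalg.norm(p_0 - p_a), np.linalg.norm(p_0 - p_b)) < borderline:
            weight *= 0.5
        return weight

    # The corrected pixel would then move only part of the way toward the least-squares point:
    # new_pixel = p_0 + weight * (corrected_point - p_0)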
Preferably, the above process is implemented in software in the image capture device. It is possible that it could be implemented in the image output device that receives the image data from the image capture device. It could also be implemented in either part of a device that performs both image capture and image output. This process could be implemented in image or graphic application software, Raster Image Processors (RIP), or printer, copier or output device drivers, among others.
Alternately, the process could be implemented in application specific integrated circuits (ASIC), field programmable gate arrays (FPGA) or in digital signal processors (DSP). However, these hardware implementations are not as flexible, so the software embodiments are preferred.
As mentioned previously, this process could be applied to color spaces other than RGB. It could be implemented in CMY, CMYK and chrominance and luminance based color spaces, such as LAB, LCH, HLS, etc. None of the above specifics or examples are intended to limit applicability of the invention.
Thus, although there has been described to this point a particular embodiment for a method and apparatus for color misregistration correction, it is not intended that such specific references be considered as limitations upon the scope of this invention except in-so-far as set forth in the following claims.

Claims (9)

1. A method for detecting and correcting color misregistration of input images, the method comprising:
buffering image data received from an image capture device in the form of color space data;
transferring said color space data to vector space;
establishing a window around and including a current pixel;
selecting foreground and background pixels in said window;
determining if said current pixel is in an edge of an object by determining if a gradient between the foreground pixel and the background pixel is higher than a gradient between the current pixel and the foreground pixel and a gradient between the current pixel and the background pixel;
designating said current pixel as having color misregistration if said current pixel is in an edge of an object;
determining a correction value by applying normal least square projection to said current pixel;
adjusting the correction value for said current pixel having color misregistration; and
applying said correction value to said current pixel having color misregistration.
2. The method as claimed in claim 1 wherein said method further includes initially determining if a pixel is on an edge, occurring before establishing a window.
3. The method as claimed in claim 1 wherein establishing a window further comprises establishing a window of 5 pixels in the sub-scan direction and 1 pixel in the main scan direction.
4. The method as claimed in claim 1, wherein determining if said current pixel is in an edge further comprises a luminance check on said current pixel.
5. The method as claimed in claim 1, wherein adjusting further comprises applying fuzzy logic to said correction value to adjust said correction value.
6. The method as claimed in claim 1, wherein said color space is RGB.
7. The method as claimed in claim 1, wherein said color space is CMY.
8. The method as claimed in claim 1, wherein said color space is CMYK.
9. The method as claimed in claim 1, wherein said color space is a luminance and chrominance color space.
US09/419,602 1999-10-18 1999-10-18 Least squares method for color misregistration detection and correction in image data Expired - Lifetime US6859228B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/419,602 US6859228B1 (en) 1999-10-18 1999-10-18 Least squares method for color misregistration detection and correction in image data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/419,602 US6859228B1 (en) 1999-10-18 1999-10-18 Least squares method for color misregistration detection and correction in image data

Publications (1)

Publication Number Publication Date
US6859228B1 (en) 2005-02-22

Family

ID=34134994

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/419,602 Expired - Lifetime US6859228B1 (en) 1999-10-18 1999-10-18 Least squares method for color misregistration detection and correction in image data

Country Status (1)

Country Link
US (1) US6859228B1 (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4583116A (en) 1982-06-04 1986-04-15 Dr. -Ing. Rudolf Hell Gmbh Method and apparatus for eliminating defects in images in polychromatic printing due to faulty registration of superimposed printing of color separations
US5475428A (en) * 1993-09-09 1995-12-12 Eastman Kodak Company Method for processing color image records subject to misregistration
US5477335A (en) 1992-12-28 1995-12-19 Eastman Kodak Company Method and apparatus of copying of black text on documents using a color scanner
US5485203A (en) * 1991-08-12 1996-01-16 Olympus Optical Co., Ltd. Color misregistration easing system which corrects on a pixel or block basis only when necessary
US5500746A (en) 1993-05-19 1996-03-19 Ricoh Company, Ltd. Color image input apparatus
US5668931A (en) * 1993-03-31 1997-09-16 Dermer; Richard A. Method for automatic trap selection for correcting for plate misregistration in color printing
US5737003A (en) 1995-11-17 1998-04-07 Imation Corp. System for registration of color separation images on a photoconductor belt
US5760815A (en) 1994-12-09 1998-06-02 Xerox Corporation Fiber optic registration mark detection system for a color reproduction device
US5764388A (en) 1995-08-25 1998-06-09 Brother Kogyo Kabushiki Kaisha Method and device for converting color signal
US5774156A (en) 1996-09-17 1998-06-30 Xerox Corporation Image self-registration for color printers
US5907414A (en) 1993-04-28 1999-05-25 Matsushita Electric Industrial Co., Ltd. Color image processing apparatus
US6088475A (en) * 1993-12-09 2000-07-11 Nagashima; Mieko Method and apparatus for forming and correcting color image
US6272239B1 (en) * 1997-12-30 2001-08-07 Stmicroelectronics S.R.L. Digital image color correction device and method employing fuzzy logic
US6345117B2 (en) * 1998-10-22 2002-02-05 Xerox Corporation Method for automatic trap selection for correcting for separation misregistration in color printing
USRE37940E1 (en) * 1990-12-20 2002-12-24 Kaoru Imao Interpolation method and color correction method using interpolation
US6556313B1 (en) * 1999-09-27 2003-04-29 Sharp Laboratories Of America, Incorporated Vector method for color misregistration detection in image data

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHARP LABORATORIES OF AMERICA, INCORPORATED, WASHI

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHANG, WILLIAM HO;OTSU, MAKOTO;REEL/FRAME:010324/0956

Effective date: 19991015

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: SHARP KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHARP LABORATORIES OF AMERICA, INC.;REEL/FRAME:030973/0643

Effective date: 20130808

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12