CN100468453C - Method for detecting and correcting red-eye in image - Google Patents


Publication number
CN100468453C
CN100468453C CNB2003101160349A CN200310116034A
Authority
CN
China
Prior art keywords
candidate
pixel
red
region
eye region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB2003101160349A
Other languages
Chinese (zh)
Other versions
CN1635547A
Inventor
张钺
夏一哂
龚平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to CNB2003101160349A
Publication of CN1635547A
Application granted
Publication of CN100468453C
Anticipated expiration
Status: Expired - Fee Related


Abstract

This invention provides a method for detecting red-eye in an image, comprising the following steps: identifying an eye region; identifying a first number of candidate red-eye regions in the eye region; shrinking the first number of candidate red-eye regions to obtain a second number of candidate red-eye regions; enlarging the second number of candidate red-eye regions to obtain a third number of candidate red-eye regions; and selecting a candidate red-eye region from the third number of candidate red-eye regions as the detected red-eye region.

Description

Method for detecting and correcting red-eye in an image
Technical field
The present invention relates to image processing, and more particularly to a method for detecting and correcting red-eye in an image.
Background art
Red-eye is the unnatural red hue that appears around a person's pupils. It is usually caused by flash light reflected from the blood vessels. Many red-eye correction methods exist at present. Some require user intervention, which is inconvenient for most users. Others detect and correct red-eye automatically, but they search the entire image for red-eye and therefore need more time. In addition, the accuracy of existing red-eye detection and correction methods is not high.
Summary of the invention
The object of this invention is to provide a method that can quickly detect red-eye in an image and accurately correct the detected red-eye.
To achieve the above object, the invention provides a method for detecting red-eye in an image, characterized by comprising the following steps:
identifying an eye region in the image;
identifying a first number of candidate red-eye regions in the eye region;
shrinking the first number of candidate red-eye regions to obtain a second number of candidate red-eye regions, the second number being less than or equal to the first number;
enlarging the second number of candidate red-eye regions to obtain a third number of candidate red-eye regions, the third number being less than or equal to the second number; and
selecting no more than one candidate red-eye region from the third number of candidate red-eye regions as the red-eye detected in the eye region.
According to the method of the present invention, red-eye is detected automatically in a face image, or more specifically in an eye region identified within the face image. Red-eye can thus be detected quickly and accurately.
Other features and advantages of the present invention will become more apparent from the following description of preferred embodiments together with the accompanying drawings, which illustrate the principles of the invention by way of example.
Description of drawings
Fig. 1 is a schematic flowchart of the method of the present invention;
Fig. 2 is a schematic flowchart of an example process for step 105 of Fig. 1;
Fig. 3 is a schematic flowchart of another example process for step 105 of Fig. 1;
Fig. 4 is a schematic flowchart of an example process for step 106 of Fig. 1;
Fig. 5 is a schematic flowchart of an example process for step 107 of Fig. 1;
Fig. 6 is a schematic flowchart of an example process for step 107 of Fig. 1;
Fig. 7 is a schematic flowchart of an example process for step 108 of Fig. 1;
Fig. 8 is a schematic flowchart of an example process for step 110 of Fig. 1;
Fig. 9A shows an image containing two red eyes;
Fig. 9B shows the two red eyes as detected;
Fig. 9C shows the two red eyes after correction;
Fig. 9D shows a grayscale image of a detected red eye;
Fig. 9E shows the resulting grayscale image after Gaussian smoothing is applied to the grayscale image of Fig. 9D; and
Fig. 10 schematically illustrates an image processing system in which the processes shown in Figs. 1 to 8 can be implemented.
Embodiment
In the following description, for how to identify a candidate face region and how to identify the eye regions in a face, reference may be made to Chinese patent application No. 00127067.2 filed on September 15, 2000 by the same applicant; Chinese patent application No. 01132807.X filed on September 6, 2001 by the same applicant; Chinese patent application No. 02155468.4 filed on December 13, 2002 by the same applicant; Chinese patent application No. 021600163.3 filed on December 30, 2002 by the same applicant; Chinese patent application No. 03137345.3 filed on June 18, 2003 by the same applicant; and so on. These applications are incorporated herein by reference. However, the methods of identifying candidate face regions and eye regions disclosed in these applications do not limit the present invention; the present invention can use any conventional method of identifying candidate face regions or eye regions within an image.
Fig. 1 is a schematic flowchart of the method of the present invention. The process starts at step 101. Then, in step 102, a candidate face region is identified within the image to be processed.
Next, in step 103, two eye regions are identified in the candidate face region identified in step 102: a left eye region and a right eye region.
As mentioned above, to perform steps 102 and 103 the present invention can use any conventional method of identifying a candidate face region and identifying the eye regions in a face.
Next, in step 104, one of the two eye regions is set as the current eye region, which is processed first (steps 105 to 108).
In step 105, a first number of candidate red-eye regions are identified within the current eye region.
To identify candidate red-eye regions within an eye region, feature values of the pixels in the eye region are considered. In the present invention, a pixel contained in a red eye, i.e., a pixel with red-eye characteristics, is called a "red-eye pixel".
In step 105, for example, the color variance or the texture of the pixels in the eye region, or a combination of the two, is considered. Fig. 2 mainly concerns the color variance of the pixels in the eye region; Fig. 3 mainly concerns their texture.
Either of the processes shown in Fig. 2 and Fig. 3 can be used to identify the first number of candidate red-eye regions in the current eye region, and each will be described in detail later in this specification.
Preferably, the two processes of Fig. 2 and Fig. 3 are used in combination, which greatly increases the accuracy of the method of the present invention. More preferably, the process of Fig. 3 is executed first and the process of Fig. 2 is executed on its result, which makes the method of the present invention more time-efficient.
Step 105 of Fig. 1 is followed by step 106.
In step 106, the first number of candidate red-eye regions are shrunk. As a result, a second number of candidate red-eye regions are obtained.
In this shrinking process, at least one feature value is evaluated for each pixel in each of the first number of candidate red-eye regions. If the evaluated feature value does not satisfy a criterion set for red-eye pixels, the evaluated pixel is removed from the relevant candidate red-eye region. The area of most of the first number of candidate red-eye regions is thereby reduced. If all pixels contained in a candidate red-eye region are removed, that candidate red-eye region ceases to exist and is no longer considered.
Thus the second number, i.e., the total number of candidate red-eye regions after step 106, may be less than the first number, i.e., the total number of candidate red-eye regions before step 106. Fig. 4 shows an example of shrinking candidate red-eye regions and is described in detail below.
After step 106, the process proceeds to step 107.
In step 107, the second number of candidate red-eye regions are enlarged. As a result, a third number of candidate red-eye regions are obtained.
In this step, the boundary pixels of each of the second number of candidate red-eye regions are considered. A "boundary pixel" is a pixel located at the edge of a candidate red-eye region. If pixels near a boundary pixel satisfy the criterion set for red-eye pixels, those pixels are included in the relevant candidate red-eye region. The area of most of the second number of candidate red-eye regions is thereby increased, and some candidate red-eye regions merge with one another. This introduces another function of step 107.
The other function of step 107 is to selectively remove merged candidate red-eye regions, selectively combine merged candidate red-eye regions, or selectively keep one of the merged candidate red-eye regions and remove the others.
Removed candidate red-eye regions are no longer considered.
Thus the third number, i.e., the total number of candidate red-eye regions after step 107, may be less than the second number, i.e., the total number of candidate red-eye regions before step 107. Figs. 5 and 6 show an example of enlarging candidate red-eye regions and are described in detail below.
Step 107 is followed by step 108.
In step 108, no more than one candidate red-eye region is selected as the red eye detected in the eye region.
In step 108, a large number of feature values of the pixels in the third number of candidate red-eye regions are evaluated. Based on the evaluated values, most of the third number of candidate red-eye regions are removed. The remaining candidate red-eye regions are then scored, and only the candidate red-eye region with the highest score is considered further. If the candidate red-eye region with the highest score satisfies the criterion, it is selected as the red eye detected in the current eye region; otherwise, no red eye is detected in the current eye region. Fig. 7 shows an example of selecting no more than one candidate red-eye region and is described in detail below.
Then, in step 109, it is judged whether a candidate red-eye region was selected in step 108. If the result is "Yes", the process goes to step 110; otherwise it goes to step 111.
In step 110, the selected candidate red-eye region, i.e., the detected red eye, is corrected. Fig. 8 describes the correction process in detail. After step 110, the process goes to step 111.
In step 111, it is judged whether both eye regions have been considered. If the result is "Yes", the process goes to step 113; otherwise it goes to step 112.
In step 112, the other eye region is set as the current eye region, and the process returns to step 105.
In step 113, the process ends.
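The control flow of steps 101 to 113 can be sketched as follows. This is a minimal outline only; the helper names (`identify_face_region`, `find_candidates`, and so on) are hypothetical stand-ins for the processes of Figs. 2 to 8, injected as parameters so the sketch stays self-contained.

```python
def detect_red_eyes(image, identify_face_region, identify_eye_regions,
                    find_candidates, shrink, enlarge, select_one, correct):
    """Sketch of the Fig. 1 flow. All helpers are injected placeholders
    standing in for the processes of Figs. 2 to 8 (hypothetical names)."""
    detected = []
    face = identify_face_region(image)              # step 102
    for eye in identify_eye_regions(face):          # steps 103/104/112
        candidates = find_candidates(eye)           # step 105 (first number)
        candidates = shrink(candidates)             # step 106 (second number)
        candidates = enlarge(candidates)            # step 107 (third number)
        red_eye = select_one(candidates)            # step 108 (at most one)
        if red_eye is not None:                     # step 109
            correct(image, red_eye)                 # step 110
            detected.append(red_eye)
    return detected                                 # step 113
```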
Fig. 2 is a schematic flowchart of an example process for step 105 of Fig. 1.
The process of Fig. 2 is based on color variance. Usually, the brightness and color of the pixels inside a red eye change sharply. The characteristic of an eye region centered at (x, y) can be represented by the following formula (1):
$$\operatorname{variance}(x,y) = \frac{1}{|R_{x,y}|} \sum_{r \in R_{x,y}} \left\| r - \frac{1}{|P_r|} \sum_{p \in P_r} p \right\|^2 \qquad (1)$$
where:
(x, y) are the coordinates of a pixel;
R_{x,y} is the eye region centered at (x, y);
|R_{x,y}| is the area of the eye region R_{x,y}, i.e., the number of pixels in R_{x,y};
P_r is the pupil region centered at (x, y);
|P_r| is the area of the pupil region P_r, i.e., the number of pixels in P_r;
r is a pixel in the eye region R_{x,y}; and
p is a pixel in the pupil region P_r.
For each pixel in the candidate face region, an eye region and a pupil region are defined, both centered at that pixel. Formula (1) above is then used to calculate a variance value for that pixel. If the calculated variance value is greater than a corresponding threshold, the pixel is considered a red-eye pixel.
In practice, two variance values are calculated for each pixel, called the horizontal color variance and the vertical color variance. Only when both variance values are greater than their corresponding thresholds is the pixel considered a red-eye pixel.
As shown in Fig. 2, the process starts at step 201. In step 202, a variable R is set to 1. In step 203, another variable C is set to 1.
In step 204, a pupil region is defined, centered at pixel (R, C).
To define a pupil region within a face, the width and height of the face, of the eye region, and of the pupil region can satisfy the following relations:
(1) eye region width = 0.2 * face width;
(2) eye region height = 0.2 * face height;
(3) pupil region width = 0.1 * face width;
(4) pupil region height = 0.1 * face height.
In step 205, the r (red), g (green) and b (blue) values of the pupil region defined in step 204 are calculated.
In the present invention, the r value of a region is defined as the average of the r values of all pixels contained in that region; the g value of a region is defined as the average of the g values of all pixels contained in that region; and the b value of a region is defined as the average of the b values of all pixels contained in that region.
Step 205 thus corresponds to the term $\frac{1}{|P_r|} \sum_{p \in P_r} p$ in formula (1) above.
Then, in step 206, a horizontal eye region is defined, still centered at pixel (R, C). Here, the width of the horizontal eye region can be the width of an eye region, and its height can be the height of a pupil region.
Next, in step 207, the horizontal color variance ScatterH of the horizontal eye region defined in step 206 is calculated using formula (1) above, where R_{x,y} is the horizontal eye region.
In step 208, it is judged whether ScatterH is greater than a first threshold. The first threshold is in the range 600 to 2500, and is preferably 1200.
If the result of step 208 is "No", the process goes to step 213; otherwise it goes to step 209.
In step 209, a vertical eye region is defined, still centered at pixel (R, C). Here, the width of the vertical eye region can be the width of a pupil region, and its height can be the height of an eye region.
Next, in step 210, the vertical color variance ScatterV of the vertical eye region defined in step 209 is calculated using formula (1) above, where R_{x,y} is the vertical eye region.
In step 211, it is judged whether ScatterV is greater than a second threshold. The second threshold is in the range 600 to 2500, and is preferably 1200.
If the result of step 211 is "No", the process goes to step 213; otherwise it goes to step 212.
In step 212, the pixel at (R, C) in the candidate face region is marked as a red-eye pixel. The process then goes to step 213.
In step 213, after variable C is incremented, it is judged whether C is greater than the width of the candidate face region.
If the result of step 213 is "No", the process goes to step 204; otherwise it goes to step 214.
In step 214, after variable R is incremented, it is judged whether R is greater than the height of the candidate face region.
If the result of step 214 is "No", the process goes to step 203; otherwise it goes to step 215.
In step 215, the process ends.
Through steps 201 to 215 above, a number of pixels in the eye region can be marked as red-eye pixels. These red-eye pixels can constitute the first number of candidate red-eye regions of step 105 in Fig. 1.
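Under the region-size relations (1) to (4) and the preferred thresholds of 1200, the per-pixel test of steps 204 to 212 can be sketched as follows. This is an illustrative NumPy sketch, not the patented implementation; the clipping of regions at the face border is an assumption the text does not specify.

```python
import numpy as np

def color_variance(region, pupil):
    """Formula (1): mean squared color distance of region pixels from the
    mean pupil color. `region` and `pupil` are (H, W, 3) float arrays."""
    pupil_mean = pupil.reshape(-1, 3).mean(axis=0)          # step 205
    diffs = region.reshape(-1, 3) - pupil_mean
    return float((diffs ** 2).sum(axis=1).mean())

def is_red_eye_pixel(face, r, c, t1=1200.0, t2=1200.0):
    """Steps 204-212 for one pixel, with the region sizes of relations
    (1)-(4): eye = 0.2 * face, pupil = 0.1 * face. The thresholds t1, t2
    default to the preferred value 1200."""
    fh, fw = face.shape[:2]
    ew, eh = max(1, int(0.2 * fw)), max(1, int(0.2 * fh))   # eye size
    pw, ph = max(1, int(0.1 * fw)), max(1, int(0.1 * fh))   # pupil size

    def crop(h, w):  # region of size (h, w) centred at (r, c), clipped
        r0, c0 = max(0, r - h // 2), max(0, c - w // 2)
        return face[r0:r0 + h, c0:c0 + w]

    pupil = crop(ph, pw)                                    # step 204
    scatter_h = color_variance(crop(ph, ew), pupil)         # steps 206-207
    if scatter_h <= t1:                                     # step 208
        return False
    scatter_v = color_variance(crop(eh, pw), pupil)         # steps 209-210
    return scatter_v > t2                                   # step 211
```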
Fig. 3 is a schematic flowchart of another example process for step 105 of Fig. 1.
The process of Fig. 3 is based on texture. Pixels exhibiting obvious texture features are considered red-eye pixels. According to morphology, opening and closing operations yield different results; thus, by using different template sizes, opening and closing operations can be used to extract regions of interest (for example, red-eye regions). For a description of opening and closing operations, see Digital Image Processing, Kenneth R. Castleman, Prentice Hall Inc., Copyright 1996.
As shown in Fig. 3, the process starts at step 301. Then, in step 302, a grayscale image is produced from the candidate face region. The gray level of a pixel (x, y) in the grayscale image is calculated from the r (red), g (green) and b (blue) values of the corresponding pixel (x, y) in the candidate face region (which is a color image). The gray level of a pixel is defined as:
r - max(g, b)   (2)
Then, in step 303, a closed image is produced by applying a closing operation to the grayscale image produced in step 302. The closing operation uses a 3*3 template.
Next, in step 304, an opened image is produced by applying an opening operation to the grayscale image produced in step 302. The opening operation uses a 9*9 template.
Then, in step 305, a target image is produced by subtracting the opened image from the closed image.
In step 306, Gaussian smoothing is applied to the target image. In the target image, each pixel has a gray value. If the gray value of a pixel in the target image is greater than a gray-level threshold, the corresponding pixel in the candidate face region is considered a red-eye pixel.
Specifically, in step 307, variable R is set to 1. In step 308, variable C is set to 1.
In step 309, it is judged whether the gray value of the pixel at (R, C) in the target image is greater than a third threshold. The third threshold is in the range of 50% to 70% of the maximum gray value of the pixels in the target image, and is preferably 53% of that maximum.
If the result of step 309 is "No", the process goes to step 311; otherwise it goes to step 310.
In step 310, the pixel at (R, C) in the candidate face region is marked as a red-eye pixel. Of course, so that the target image can be used later, the pixel at (R, C) in the target image can be set to 255.
The process then goes to step 311.
In step 311, after variable C is incremented, it is judged whether C is greater than the width of the candidate face region.
If the result of step 311 is "No", the process goes to step 309; otherwise it goes to step 312.
In step 312, after variable R is incremented, it is judged whether R is greater than the height of the candidate face region.
If the result of step 312 is "No", the process goes to step 308; otherwise it goes to step 313.
In step 313, the process ends.
As described above with reference to Fig. 1, the processes of Fig. 3 and Fig. 2 can be used together. Usually, the process of Fig. 3 is executed first. If a pixel is detected as a red-eye pixel by the process of Fig. 3, it is then evaluated by the process of Fig. 2. Only when the pixel is also evaluated as a red-eye pixel there is it finally considered a red-eye pixel. Because the process of Fig. 3 filters out a large number of non-red-eye pixels before the process of Fig. 2 is executed, a large number of calculations are saved.
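The Fig. 3 pipeline can be sketched with SciPy's grey-scale morphology standing in for the patent's 3*3 and 9*9 template operations. The Gaussian sigma is an assumption (the text does not give one), and `frac` defaults to the preferred third threshold of 53% of the maximum gray value.

```python
import numpy as np
from scipy.ndimage import grey_closing, grey_opening, gaussian_filter

def texture_red_eye_mask(face, frac=0.53):
    """Sketch of the Fig. 3 process. `face` is an (H, W, 3) float RGB array;
    returns a boolean mask of red-eye pixel candidates. SciPy's grey
    morphology stands in for the patent's 3*3/9*9 template operations."""
    r, g, b = face[..., 0], face[..., 1], face[..., 2]
    gray = r - np.maximum(g, b)                    # formula (2), step 302
    closed = grey_closing(gray, size=(3, 3))       # step 303
    opened = grey_opening(gray, size=(9, 9))       # step 304
    target = closed - opened                       # step 305
    target = gaussian_filter(target, sigma=1.0)    # step 306 (sigma assumed)
    return target > frac * target.max()            # steps 307-313
```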
Fig. 4 is a schematic flowchart of an example process for step 106 of Fig. 1.
The process starts at step 401.
In step 402, a binary (two-gray-level) image is produced from the candidate face region. This binary image can also be the result of the Fig. 3 process, the result of the Fig. 2 process, or the result of combining the two.
In this binary image, pixels with a gray value of 255 correspond to pixels marked as red-eye pixels in the candidate face region, and pixels with a gray value of 0 correspond to pixels not marked as red-eye pixels in the candidate face region.
In step 403, a "label segmentation" process is applied to the image produced in step 402, or to the image produced by step 105 of Fig. 1. The "label segmentation" process identifies red-eye pixel blocks and determines the number of red-eye pixel blocks.
In step 404, variable I is set to 1.
In step 405, the area of the I-th red-eye pixel block (i.e., the number of pixels in the I-th red-eye pixel block) is calculated, and it is judged whether the calculated area is less than a fourth threshold. The fourth threshold is the square of a certain multiple of the minimum of the height and width of the candidate face region. The certain multiple is in the range 0.05 to 0.2, and is preferably 0.1.
For example, if the height of the candidate face region is 189 and its width is 153, the smaller of the height and width is 153. Let the certain multiple be 0.1. 0.1 times 153 gives 15.3, which is truncated to the integer 15. The square of 15 is 225. Thus the fourth threshold for this candidate face region can take the value 225.
Here is another example. Suppose the height of the candidate face region is 135 and its width is 109. The smaller of the height and width is 109. 0.1 times 109 gives 10.9, which is truncated to the integer 10. The square of 10 is 100. Thus the fourth threshold for this candidate face region can take the value 100.
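The two worked examples above can be reproduced with a one-line helper; the truncation to an integer before squaring follows the examples.

```python
def fourth_threshold(height, width, multiple=0.1):
    """Fourth threshold of step 405: the square of a multiple (preferably
    0.1) of the smaller face-region dimension, truncated to an integer."""
    return int(multiple * min(height, width)) ** 2
```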
If the result of step 405 is "Yes", the process goes to step 410 (that is, the I-th red-eye pixel block is kept as it is); otherwise it goes to step 407.
In step 407, a first redness is calculated for each red-eye pixel of the I-th red-eye pixel block within the candidate face region. The first redness of a pixel is defined as:
first redness = r / (r + g + b)   (3)
where r, g and b are the r (red), g (green) and b (blue) values of the pixel.
Alternatively, the first redness of the region corresponding to the 3*3 neighborhood of a pixel can also be used as an index of that pixel.
In the present invention, the first redness of a region is defined as the average of the first rednesses of all pixels contained in that region.
In step 408, a temporary threshold is calculated for the I-th red-eye pixel block based on the first redness values calculated in step 407.
For example, the first redness values can first be sorted in descending order to form an array, and the value at a particular position in the array is taken as the temporary threshold.
In step 409, if the first redness of a pixel is not greater than the temporary threshold, the gray value of that pixel is changed from 255 to 0. That is, a pixel that was originally a red-eye pixel is no longer considered a red-eye pixel if the first redness of its corresponding pixel in the grayscale image is not greater than the temporary threshold.
In step 410, after variable I is incremented, it is judged whether I is greater than the number of red-eye pixel blocks.
If the result of step 410 is "No", the process goes to step 405; otherwise it goes to step 411.
In step 411, the process ends.
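Steps 407 to 409 can be sketched as follows for a single block. The text leaves the "particular position" in the descending array unspecified, so the median position is assumed here; `first_redness` implements formula (3).

```python
import numpy as np

def first_redness(pixels):
    """Formula (3): r / (r + g + b) per pixel. `pixels` is (N, 3)."""
    pixels = np.asarray(pixels, dtype=float)
    return pixels[:, 0] / pixels.sum(axis=1)

def shrink_block(block_pixels, keep_rank=0.5):
    """Steps 407-409 for one red-eye pixel block: keep only pixels whose
    first redness exceeds a temporary threshold taken from the sorted
    redness array. The rank (here the median position) is an assumption;
    the patent only says "a particular position" in the descending array."""
    redness = first_redness(block_pixels)
    order = np.sort(redness)[::-1]                    # descending, step 408
    temp_threshold = order[int(keep_rank * (len(order) - 1))]
    keep = redness > temp_threshold                   # step 409
    return [p for p, k in zip(block_pixels, keep) if k]
```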
Figs. 5 and 6 are schematic flowcharts of an example process for step 107 of Fig. 1. Step 107 can be roughly divided into three sub-steps. In the first sub-step, for pixels near the boundary pixels of a red-eye pixel block, if those nearby pixels satisfy predetermined conditions, the body of the red-eye pixel block is grown by absorbing them. In the second sub-step, for pixels adjacent to the boundary pixels of a red-eye pixel block, if those adjacent pixels satisfy predetermined conditions, the body of the red-eye pixel block is grown by absorbing them. The third sub-step prevents red-eye pixel blocks from merging with one another, and removes red-eye pixel blocks that do not satisfy predetermined conditions. Fig. 5 illustrates the first and second sub-steps, and Fig. 6 illustrates the third sub-step.
As shown in Fig. 5, the process starts at step 501. In step 502, variable I is set to 1. In step 503, the first redness and the hue of the I-th red-eye pixel block are calculated.
As defined earlier in this specification, the first redness of a block (which is in fact a region) is defined as the average of the first rednesses of all pixels the block contains.
The first redness of a pixel is defined by formula (3) above.
The hue of a block (region) is defined as the average of the hue values of all pixels the block (region) contains.
The hue of a pixel is defined as:
hue = min(h, 360 - h)   (4)
where h is the hue value of the pixel in the candidate face region.
Then, in step 504, a boundary pixel of the I-th red-eye pixel block is located.
In step 505, the brightness of the boundary pixel is calculated. The brightness of a pixel is defined as:
brightness = (r + g + b) / 3   (5)
where r, g and b are the r (red), g (green) and b (blue) values of the pixel.
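Formulas (4) and (5) are simple enough to state directly; the helper names are illustrative only.

```python
def hue_feature(h):
    """Formula (4): distance of a hue angle h (in degrees) from red,
    which sits at 0/360 on the hue circle."""
    return min(h, 360 - h)

def brightness(r, g, b):
    """Formula (5): mean of the r, g, b values of a pixel."""
    return (r + g + b) / 3
```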
In step 506, it is judged whether the brightness calculated in step 505 is greater than a fifth threshold. The fifth threshold is in the range 80 to 120, and is preferably 100.
If the result of step 506 is "Yes", the process goes to step 509; otherwise it goes to step 507.
In step 507, it is judged whether the brightness calculated in step 505 is greater than a sixth threshold. The sixth threshold is in the range 40 to 60, and is preferably 50.
If the result of step 507 is "Yes", the process goes to step 508; otherwise it goes to step 510.
In step 509, for each pixel adjacent to the boundary pixel of the I-th red-eye pixel block, if the corresponding pixel one pixel further out (i.e., separated from the boundary pixel by one pixel) satisfies a first condition, the adjacent pixel is added to the I-th red-eye pixel block.
For example, let the coordinates of a boundary pixel of the I-th red-eye pixel block be (i, j). The function of step 509 is to check whether the pixels at (i+2, j), (i-2, j), (i, j-2) and (i, j+2) satisfy the first condition.
For example, if the pixel at (i+2, j) satisfies the first condition, the pixel at (i+1, j) is added to the I-th red-eye pixel block.
Let the boundary pixel be pixel A, let the pixel separated from pixel A by one pixel be pixel B, and let the pixel between pixel A and pixel B (i.e., the pixel adjacent to boundary pixel A) be the pixel added to the I-th red-eye pixel block. The first condition can then be defined so that pixel A and pixel B satisfy the following requirements:
(1) the saturation of pixel B is greater than 0.9 times the saturation of pixel A. Here, the coefficient 0.9 can be in the range 0.7 to 1.
(2) the first redness of pixel B is greater than 0.8 times the first redness of the I-th red-eye pixel block. Here, the coefficient 0.8 can be in the range 0.7 to 0.9.
(3) the hue of pixel B is less than 5 times the hue of the I-th red-eye pixel block (calculated in step 503). Here, the coefficient 5 can be in the range 2 to 10.
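With the preferred coefficients, the first condition can be expressed as a predicate over precomputed per-pixel features (saturation, first redness, hue); the function name and argument layout are assumptions.

```python
def first_condition(sat_a, sat_b, redness_b, block_redness,
                    hue_b, block_hue):
    """First condition of step 509 with the preferred coefficients:
    (1) sat(B) > 0.9 * sat(A),
    (2) first redness(B) > 0.8 * block first redness,
    (3) hue(B) < 5 * block hue.
    The per-pixel feature values are assumed to be precomputed."""
    return (sat_b > 0.9 * sat_a
            and redness_b > 0.8 * block_redness
            and hue_b < 5 * block_hue)
```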
Step 509 is followed by step 510.
Step 508 is similar to step 509; the only difference is that in step 508 a second condition is evaluated.
Again let the boundary pixel be pixel A, let the pixel separated from pixel A by one pixel be pixel B, and let the pixel between pixel A and pixel B (i.e., the pixel adjacent to boundary pixel A) be the pixel added to the I-th red-eye pixel block. The second condition can then be defined so that pixel A and pixel B satisfy the following requirements:
(1) the saturation of pixel B is greater than 1 times the saturation of pixel A, and the absolute difference between the brightness of pixel B and that of pixel A is less than 0.1 times the brightness of pixel A. Here, the coefficient 1 can be in the range 0.8 to 1.2, and the coefficient 0.1 can be in the range 0 to 0.3.
(2) the first redness of pixel B is greater than 0.8 times the first redness of the I-th red-eye pixel block. Here, the coefficient 0.8 can be in the range 0.7 to 0.9.
(3) the hue of pixel B is less than 5 times the hue of the I-th red-eye pixel block (calculated in step 503). Here, the coefficient 5 can be in the range 2 to 10.
Step 507, 508 or 509 is followed by step 510.
In step 510, it is judged whether the addition of pixels satisfies a fifth condition.
Let "count2" be the number of pixels added to the I-th red-eye pixel block this time (i.e., in the current loop from step 505 to step 509), let "count1" be the number of pixels added to the I-th red-eye pixel block last time (i.e., in the previous loop from step 505 to step 509), and let "times" be the number of times the I-th red-eye pixel block has been enlarged.
The fifth condition that the added pixels must satisfy can then be defined as follows:
(1) count1/count2 is greater than 0 and less than 1. Here, the value "0" is a threshold in the range 0 to 0.5, and the value "1" is a threshold in the range 0.8 to 2.
(2) count2 is less than 4 times the width of the eye region, where the width of the eye region is the minimum of the width and height of the face region multiplied by 1/10. Here, the value "4" is a threshold in the range 1 to 8, and the value "1/10" is a threshold in the range 1/15 to 1/5.
(3) times is less than 1/2 times the width of the eye region. Here, the value "1/2" is a threshold in the range 1/10 to 1.
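With the preferred thresholds, the fifth condition of step 510 can be expressed as follows; the guard against division by zero when count2 is 0 is an assumption the text does not address.

```python
def fifth_condition(count1, count2, times, eye_width):
    """Fifth condition of step 510, with the preferred thresholds:
    (1) 0 < count1 / count2 < 1,
    (2) count2 < 4 * eye_width,
    (3) times < eye_width / 2.
    eye_width is 1/10 of the smaller face-region dimension; the count2 > 0
    guard is an assumption."""
    return (count2 > 0
            and 0 < count1 / count2 < 1
            and count2 < 4 * eye_width
            and times < eye_width / 2)
```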
If the result of step 510 is "Yes", the process goes to step 511; otherwise it goes to step 512.
In step 511, it is judged whether another boundary pixel can be located. If the result of step 511 is "Yes", the process goes to step 505; otherwise it goes to step 512.
The processing of step 512 is similar to that of steps 505 to 511. The main purpose of step 512 is, for each pixel adjacent to a boundary pixel of a red-eye pixel block, to add the pixel to the red-eye pixel block if it satisfies a third condition (similar to the first condition) or a fourth condition (similar to the second condition).
For example, let the coordinates of a boundary pixel of a red-eye pixel block be (i, j). One function of step 512 is to check whether the pixels at (i+1, j), (i-1, j), (i, j-1) and (i, j+1) satisfy the third condition (or the fourth condition).
For example, if the pixel at (i+1, j) satisfies the third condition, the pixel at (i+1, j) is added to the I-th red-eye pixel block.
Then, in step 513, after variable I is incremented, it is judged whether I is greater than the number of red-eye pixel blocks. If the result of step 513 is "No", the process goes to step 503; otherwise it goes to step 514.
In step 514, the process ends.
As shown in Fig. 6, processing begins in step 601. In step 602, variable I is set to 1. In step 603, it is judged whether the I-th red-eye pixel block merges with another red-eye pixel block, for example block A. Step 603 can be implemented by checking whether the 3*3 neighborhood of a pixel in one red-eye pixel block contains a pixel in another red-eye pixel block. If it does, the two red-eye pixel blocks are considered to merge with each other.
If the judgment result of step 603 is "No", processing proceeds to step 611; otherwise it proceeds to step 604.
In step 604, a characteristic value C1 is calculated for the I-th red-eye pixel block, and a characteristic value C2 is calculated for block A.
The characteristic value of a red-eye pixel block is defined as the dot product of the block's vector [first redness, hue, saturation, Y channel value, Cb channel value, Cr channel value] and the weight vector [0.920728, 0.000282347, -0.0750816, 0.000145816, 0.00336445, 0.000197533].
In the above vector, the first redness of a block and the hue of a block have been defined earlier in this specification.
The saturation of a block (region) is defined as the average of the saturation values of all pixels contained in the block (region).
The Y channel value of a block (region) is defined as the average of the Y channel values of all pixels contained in the block (region).
The Cb channel value of a block (region) is defined as the average of the Cb channel values of all pixels contained in the block (region).
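The characteristic value is a plain dot product, which can be sketched as follows. This is illustrative only; the feature values passed in the usage example are invented, while the weight vector is the one quoted in the text.

```python
# Sketch of the step-604 characteristic value: dot product of the block's
# [first redness, hue, saturation, Y, Cb, Cr] averages with the trained weights.

WEIGHTS = [0.920728, 0.000282347, -0.0750816,
           0.000145816, 0.00336445, 0.000197533]

def characteristic_value(features, weights=WEIGHTS):
    """features: [redness1, hue, saturation, Y, Cb, Cr] block averages."""
    return sum(f * w for f, w in zip(features, weights))
```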
Next, in step 6051, it is judged whether C1 is less than or equal to an eighth threshold. Here, the eighth threshold is obtained by training with Fisher classification (R. A. Fisher, "The Use of Multiple Measurements in Taxonomic Problems," Ann. Eugenics, 1936). The eighth threshold lies in the range 0.5 to 1.5 and is preferably 0.835.
If the judgment result of step 6051 is "No", processing proceeds to step 6052; otherwise it proceeds to step 6091.
In step 6052, it is judged whether C2 is less than or equal to the eighth threshold.
If the judgment result of step 6052 is "No", processing proceeds to step 606; otherwise it proceeds to step 6092.
In step 606, the absolute difference C3 between C1 and C2 is calculated. Then, in step 607, it is judged whether C3 is greater than a ninth threshold.
If the judgment result of step 607 is "Yes", processing proceeds to step 610; otherwise it proceeds to step 608.
In step 608, the red-eye pixel block with the smaller of C1 and C2 is removed. That is, the removed block is no longer regarded as a red-eye pixel block. Processing then proceeds to step 611.
In step 610, the I-th red-eye pixel block and block A are combined into one block. Processing then proceeds to step 611.
In step 6091, the I-th red-eye pixel block is removed. That is, the I-th block is no longer regarded as a red-eye pixel block. Processing then proceeds to step 6053.
In step 6053, it is judged whether C2 is less than or equal to the eighth threshold.
If the judgment result of step 6053 is "No", processing proceeds to step 611; otherwise it proceeds to step 6092.
In step 6092, block A is removed. That is, block A is no longer regarded as a red-eye pixel block. Processing then proceeds to step 611.
In step 611, after incrementing variable I, it is judged whether I is greater than the number of red-eye pixel blocks. If the judgment result of step 611 is "No", processing proceeds to step 603; otherwise it proceeds to step 612.
In step 612, processing ends.
After the processing of Fig. 6, a plurality of red-eye pixel blocks have been identified in the candidate face region. Each red-eye pixel block corresponds to a candidate red-eye region in the candidate face region.
Fig. 7 is a schematic flow diagram of an example process of step 108 of Fig. 1. The purpose of Fig. 7 is to select no more than one candidate red-eye region according to at least one of the respective redness, saturation, hue, Cr channel value, Cb channel value, Y variance, redness variance, hue variance and r variance of the plurality of candidate red-eye regions.
The saturation of a region, the hue of a region and the Cb channel value of a region have been defined earlier in this specification. The Cr channel value of a region is defined as the average of the Cr channel values of all pixels contained in the region. Here, Cr is a channel in the (Y, Cb, Cr) color space.
The present embodiment uses three types of redness: the first redness, the second redness and the third redness. The first redness of a pixel and the first redness of a region have been defined earlier in this specification. The second redness and the third redness are explained below.
As shown in Fig. 7, processing begins in step 701. In step 702, variable I is set to 1. In step 703, the first redness, second redness, third redness, saturation value and Cr channel value of the I-th red-eye pixel block are calculated.
As defined earlier in this specification, the first redness of a pixel is r/(r+g+b) (i.e., formula (3)), and the first redness of a block (i.e., region) is the average of the first rednesses of all pixels contained in the block (region).
The second redness of a pixel is defined by formula (6):

    second redness = r^2 / (r^2 + g^2 + b^2)    (6)

where r, g and b are the r (red), g (green) and b (blue) values of the pixel.
The second redness of a block (i.e., region) is defined as the average of the second rednesses of all pixels contained in the block (region).
The third redness of a pixel is defined by formula (7):

    third redness = r - max(g, b)    (7)

The third redness of a block (i.e., region) is defined as the average of the third rednesses of all pixels contained in the block (region).
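The three redness measures and their block averages can be sketched directly from formulas (3), (6) and (7). This is an illustrative sketch; the helper names are invented, and the pixel values used in the test are arbitrary.

```python
# Sketch of the three per-pixel redness measures and their block averages.

def redness1(r, g, b):
    return r / (r + g + b)               # formula (3)

def redness2(r, g, b):
    return r**2 / (r**2 + g**2 + b**2)   # formula (6)

def redness3(r, g, b):
    return r - max(g, b)                 # formula (7)

def block_average(pixels, measure):
    """Average of a per-pixel measure over all (r, g, b) pixels in a block."""
    return sum(measure(*p) for p in pixels) / len(pixels)
```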
After step 703, processing proceeds to step 704. In step 704, it is judged whether all of the values calculated in step 703 lie within their respective ranges.
For example, the first redness of a block should be greater than a threshold in the range 0.3 to 0.7, preferably 0.37.
The second redness of a block should be greater than a threshold in the range 0.3 to 0.7, preferably 0.415.
The third redness of a block should be greater than a threshold in the range 1 to 100, preferably 11.
The saturation value of a block should be greater than a threshold in the range 0.1 to 0.5, preferably 0.12.
The Cr channel value of a block should be greater than a threshold in the range 80 to 200, preferably 138.
If the judgment result of step 704 is "Yes", processing proceeds to step 705; otherwise it proceeds to step 706.
In step 706, the I-th red-eye pixel block is removed. That is, the I-th red-eye block is no longer regarded as a red-eye pixel block. Processing then proceeds to step 707.
In step 705, a score is calculated for the I-th red-eye pixel block according to the second redness, Cb channel value, Y variance, third redness variance, second redness variance, hue, hue variance, r variance and first redness of the block.
The score of a red-eye pixel block is calculated as follows:
(1) Obtain the second redness, Cb channel value, Y variance, third redness variance, second redness variance, hue, hue variance, r variance and first redness of the block, and form a nine-dimensional vector from these values.
(2) Obtain a weight vector {0.35352, 0.00177666, -0.000882568, 0.00124642, 0.337243, 0.00010886, -0.000689163, 0.000666636, -0.303867} by applying Fisher classification and a selection method to a training set.
(3) Calculate the inner product of the vector in (1) and the vector in (2) to obtain the score of the red-eye pixel block.
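The three steps above can be sketched as a single inner product. This is illustrative only; the weight vector is the one quoted in the text, while the dimensionality check and the sample feature vector are invented.

```python
# Sketch of the step-705 score: inner product of the nine-dimensional feature
# vector with the trained weight vector.

SCORE_WEIGHTS = [0.35352, 0.00177666, -0.000882568, 0.00124642, 0.337243,
                 0.00010886, -0.000689163, 0.000666636, -0.303867]

def block_score(features, weights=SCORE_WEIGHTS):
    """features: [redness2, Cb, Y variance, redness3 variance,
    redness2 variance, hue, hue variance, r variance, redness1]."""
    if len(features) != len(weights):
        raise ValueError("expected a nine-dimensional feature vector")
    return sum(f * w for f, w in zip(features, weights))
```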
Step 707 follows step 705.
In step 707, after incrementing variable I, it is judged whether I is greater than the number of red-eye pixel blocks. If the judgment result of step 707 is "No", processing proceeds to step 703; otherwise it proceeds to step 708.
In step 708, the red-eye pixel block with the maximum score is located.
In step 709, it is judged whether the score of the located red-eye pixel block is greater than a tenth threshold. The tenth threshold lies in the range 0.2 to 0.5 and is preferably 0.32. The tenth threshold is obtained by training with Fisher classification.
If the judgment result of step 709 is "Yes", processing proceeds to step 710; otherwise it proceeds to step 711.
In step 710, only the red-eye pixel block with the maximum score is kept, and the other red-eye pixel blocks are removed. That is, a candidate red-eye region is selected.
In step 711, all red-eye pixel blocks are removed. That is, no red-eye region is selected.
In step 712, processing ends.
Fig. 8 is a schematic flow diagram of an example process of step 110 of Fig. 1, i.e., the red-eye correction processing.
In the example of Fig. 8, the red-eye region is corrected by changing the R, G, B values of each pixel into new R, G, B values.
As shown in Fig. 8, processing begins in step 801. In step 802, Gaussian smoothing is applied to the gray-level image obtained in the preceding detection steps.
In step 803, variable I is set to 1.
In step 804, it is judged whether the gray level of the I-th pixel in the gray-level image is 0.
If the judgment result of step 804 is "Yes", processing proceeds to step 813; otherwise it proceeds to step 805.
In step 805, the second redness P1 is calculated for the corresponding pixel in the candidate face region.
In practice, the second redness P1 is calculated as:

    P1 = r^2 / (r^2 + g^2 + b^2) + c1    (8)

where c1 is an empirical constant in the range [0, 1]. If P1 > 1, P1 is set to 1. Thus P1 lies in the range [0, 1].
In step 806, an index P2 is calculated for the I-th pixel. The index P2 indicates how far the I-th pixel is from the boundary of the red-eye region: the smaller the P2 value of a pixel, the smaller the distance between the pixel and the boundary of the red-eye region.
In practice, the index P2 of a pixel is calculated as:

    P2 = nGray/10 + c2    (9)

where nGray is the gray level of the pixel in the gray-level image and c2 is an empirical constant in the range [0, 0.9]. If P2 > 1, P2 is set to 1. Thus P2 lies in the range [0, 1].
In step 807, the correction coefficient P is calculated as:

    P = P1 * P2    (10)
In step 808, it is judged whether P equals 0. If the judgment result of step 808 is "Yes", processing proceeds to step 813; otherwise it proceeds to step 809.
In step 809, the R, G, B values of the I-th pixel are converted into a Y channel value, a Cr channel value and a Cb channel value.
In step 810, new Y, Cr, Cb channel values are calculated as follows:

    new Y = Y;
    new Cr = (1-P) * Cr;    (11)
    new Cb = (1-P) * Cb.

In step 811, the new Y value, new Cr value and new Cb value are converted into new R, new G and new B values.
In step 812, the original R, G, B values of the I-th pixel are replaced with the new R, G, B values.
In step 813, after incrementing variable I, it is judged whether I is greater than the number of pixels in the gray-level image.
If the judgment result of step 813 is "No", processing proceeds to step 804; otherwise it proceeds to step 814.
In step 814, processing ends.
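Steps 805 to 812 can be sketched per pixel as follows. This is an illustrative sketch only: the patent does not state which RGB/YCbCr conversion it uses, so the BT.601 formulas with a 128 chroma offset are an assumption, and formula (11) is applied to the offset-free chroma accordingly. c1 and c2 are the empirical constants of formulas (8) and (9).

```python
# Sketch of the per-pixel correction of Fig. 8 (steps 805-812): the
# correction coefficient P = P1 * P2 attenuates the chroma while Y is kept.

def correct_pixel(r, g, b, ngray, c1=0.0, c2=0.0):
    # formula (8): second redness plus an empirical constant, clamped to 1
    p1 = min(1.0, r**2 / (r**2 + g**2 + b**2) + c1)
    # formula (9): distance-from-boundary index, clamped to 1
    p2 = min(1.0, ngray / 10.0 + c2)
    p = p1 * p2                        # formula (10)
    if p == 0:
        return r, g, b                 # step 808: nothing to correct
    # step 809: RGB -> YCbCr (BT.601 with 128 offset, an assumed convention)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 + (b - y) * 0.564
    cr = 128 + (r - y) * 0.713
    # step 810, formula (11): keep Y, attenuate chroma (relative to the
    # 128 offset, since this sketch stores Cb/Cr with that offset)
    cr = 128 + (1 - p) * (cr - 128)
    cb = 128 + (1 - p) * (cb - 128)
    # step 811: YCbCr -> RGB
    nr = y + 1.402 * (cr - 128)
    ng = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    nb = y + 1.772 * (cb - 128)
    return nr, ng, nb
```

A strongly red pixel deep inside the region (large nGray) thus has its chroma nearly removed, pulling it toward a neutral gray of the same luminance.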
Figs. 9A to 9E illustrate an example. Fig. 9A shows a candidate face region identified in step 102 of Fig. 1. Fig. 9B shows a red-eye region 901 detected in the left-eye region and a red-eye region 902 detected in the right-eye region by steps 103 to 108 of Fig. 1.
Take red-eye region 901 as an example. Fig. 9D shows the gray-level image of red-eye region 901 obtained after step 108 of Fig. 1.
Fig. 9E shows the gray-level image obtained after applying Gaussian smoothing (step 802 of Fig. 8) to the gray-level image of Fig. 9D.
Similar processing is performed for red-eye region 902.
Fig. 9C shows the face in which the red eyes have been corrected.
Figure 10 schematically illustrates an image processing system in which the processes shown in Figs. 1 to 8 can be implemented. The image processing system shown in Figure 10 comprises a CPU (central processing unit) 1001, a RAM (random access memory) 1002, a ROM (read-only memory) 1003, a system bus 1004, an HD (hard disk) controller 1005, a keyboard controller 1006, a serial port controller 1007, a parallel port controller 1008, a display controller 1009, a hard disk 1010, a keyboard 1011, a camera 1012, a printer 1013 and a display 1014. Among these components, the CPU 1001, RAM 1002, ROM 1003, HD controller 1005, keyboard controller 1006, serial port controller 1007, parallel port controller 1008 and display controller 1009 are connected to the system bus 1004. The hard disk 1010 is connected to the HD controller 1005, the keyboard 1011 to the keyboard controller 1006, the camera 1012 to the serial port controller 1007, the printer 1013 to the parallel port controller 1008, and the display 1014 to the display controller 1009.
The function of each component in Figure 10 is well known in the art, and the architecture shown in Figure 10 is conventional. Such an architecture applies not only to personal computers but also to handheld devices such as P/PCs, PDAs (personal digital assistants), digital cameras, and so on. In different applications, some of the components shown in Figure 10 may be omitted. For example, if the whole system is a digital camera, the parallel port controller 1008 and the printer 1013 can be omitted, and the system can be implemented as a single-chip microcomputer. If the application software is stored in EPROM or other non-volatile memory, the HD controller 1005 and the hard disk 1010 can be omitted.
The whole system shown in Figure 10 is controlled by computer-readable instructions, which are usually stored as software on the hard disk 1010 (or, as mentioned above, in EPROM or other non-volatile memory). The software can also be downloaded from a network (not shown). The software, saved on the hard disk 1010 or downloaded from a network, can be loaded into the RAM 1002 and executed by the CPU 1001 to realize the functions defined by the software.
For those skilled in the art, developing one or more pieces of software according to one or more of the flow diagrams shown in Figs. 1 to 8 involves no creative work.
Although the above refers to specific embodiments of the present invention, those skilled in the art will understand that these are illustrative and that many changes to these embodiments can be made without departing from the principles of the present invention, the scope of which is defined by the appended claims.

Claims (9)

1. A method for detecting red eyes in an image, characterized by comprising the steps of:
identifying an eye region in said image;
identifying a first number of first candidate red-eye regions in said eye region;
shrinking the first candidate red-eye regions of said first number to obtain a second number of second candidate red-eye regions, said second number being less than or equal to said first number;
enlarging the second candidate red-eye regions of said second number to obtain a third number of candidate red-eye regions, said third number being less than or equal to said second number; and
selecting no more than one candidate red-eye region from the candidate red-eye regions of said third number as the red eye detected in said eye region;
wherein said enlarging step adds, to a second candidate region, pixels that are near the boundary pixels of the second candidate region and satisfy a predefined condition, and merges the enlarged second candidate region with another enlarged second candidate region based on a result of checking whether the enlarged second candidate region includes a pixel near the other enlarged second candidate region.
2. according to the method for claim 1, near the pixel that it is characterized in that working as the pixel of being added satisfies should be predefined during condition, and described expansion step is added near this pixel.
3. according to the method for claim 1, it is characterized in that the step of candidate's red eye region of described second number of described expansion may further comprise the steps for each the candidate's red eye region in candidate's red eye region of described second number:
Locate a boundary pixel of described candidate's red eye region;
Locate one be not included in described candidate's red eye region, but first pixel adjacent with described boundary pixel;
If described boundary pixel and described first pixel satisfy another predefined condition, then described first pixel is added in described candidate's red eye region.
4. according to the method for claim 1, it is characterized in that the described step of no more than one candidate's red eye region of selecting may further comprise the steps from candidate's red eye region of described the 3rd number:
Calculate the score of each the candidate's red eye region in candidate's red eye region of described the 3rd number; And
Select candidate's red eye region with maximum score.
5. according to the method for claim 4, it is characterized in that the step of the score of each the candidate's red eye region in candidate's red eye region of described the 3rd number of described calculating may further comprise the steps:
To described candidate's red eye region, calculate the second red degree, Cb channel value, Y variance, the 3rd red degree variance, the second red degree variance, tone, tone variance, r variance, the first red degree variance, form one nine dimensional vector;
Calculate the inner product of described nine dimensional vectors and a weight vector, obtain the described score of described candidate's red eye region.
6. according to the method for claim 5, it is characterized in that obtaining described weight vector by a training set is carried out Fei Sheer classification and back-and-forth method.
7. according to the method for claim 1, it is characterized in that the step of candidate's red eye region of first number in the described eye of the described identification district may further comprise the steps for each pixel in the described eye district:
Limiting one is the pupil region at center with described pixel;
According to described pupil region, calculate described color of pixel variance; And
If described color variance, then is labeled as described pixel a blood-shot eye illness pixel greater than in the first threshold and second threshold value at least one, described blood-shot eye illness pixel and other blood-shot eye illness pixels form in candidate's red eye region of described first number together.
8. according to the method for claim 1, it is characterized in that the step of candidate's red eye region of first number in the described blood-shot eye illness of the described identification district may further comprise the steps:
Produce a width of cloth gray level image of described image;
By using first template that described gray level image is carried out closed operation, produce a width of cloth and close image;
By using second template that described gray level image is carried out opening operation, produce a width of cloth and open image;
By from the described image of opening of the described figure of closing image subtraction, produce a width of cloth target image;
Described target image is carried out Gauss's smoothing processing; And
To having each pixel in the described target image greater than the gray scale of the 3rd threshold value, a respective pixel in the original image is labeled as a blood-shot eye illness pixel, described blood-shot eye illness pixel and other blood-shot eye illness pixels form in candidate's red eye region of described first number together.
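The closing-minus-opening step of this claim can be sketched on a toy gray-level image. This is an illustrative sketch only, not the claimed implementation: it assumes a square structuring element and plain Python lists, whereas the claim leaves the first and second templates unspecified; a real implementation would use an image-processing library.

```python
# Sketch of the morphological step: closing (dilate then erode) minus
# opening (erode then dilate) highlights small bright details such as the
# specular highlight of a red eye.

def _dilate(img, k):
    h, w, r = len(img), len(img[0]), k // 2
    return [[max(img[y][x]
                 for y in range(max(0, j - r), min(h, j + r + 1))
                 for x in range(max(0, i - r), min(w, i + r + 1)))
             for i in range(w)] for j in range(h)]

def _erode(img, k):
    h, w, r = len(img), len(img[0]), k // 2
    return [[min(img[y][x]
                 for y in range(max(0, j - r), min(h, j + r + 1))
                 for x in range(max(0, i - r), min(w, i + r + 1)))
             for i in range(w)] for j in range(h)]

def close_minus_open(img, k=3):
    closed = _erode(_dilate(img, k), k)   # closing with a k*k template
    opened = _dilate(_erode(img, k), k)   # opening with a k*k template
    return [[c - o for c, o in zip(crow, orow)]
            for crow, orow in zip(closed, opened)]
```

An isolated bright pixel survives the closing but is removed by the opening, so it stands out in the difference image and, after smoothing and thresholding, marks a candidate red-eye pixel.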
9. according to the method for claim 1, it is characterized in that each the candidate's red eye region for candidate's red eye region of described first number, described step of dwindling candidate's red eye region of described first number may further comprise the steps:
Calculate the first red degree of each pixel in described candidate's red eye region;
Calculate the interim threshold value of the described first red degree of described candidate's red eye region interior pixel;
From described candidate's red eye region, remove the pixel of its first red degree less than described interim threshold value.
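The shrinking step of this claim can be sketched as a threshold filter. This is illustrative only: the claim does not specify here how the temporary threshold is computed, so this sketch assumes the mean first redness of the region's pixels, and the function names are invented.

```python
# Sketch of claim 9's shrinking step: drop pixels whose first redness falls
# below a temporary threshold computed from the region itself.

def redness1(r, g, b):
    return r / (r + g + b)    # formula (3)

def shrink_region(pixels):
    """pixels: list of (r, g, b). Keeps pixels whose first redness reaches
    the temporary threshold (assumed here to be the region's mean)."""
    values = [redness1(*p) for p in pixels]
    threshold = sum(values) / len(values)   # assumed temporary threshold
    return [p for p, v in zip(pixels, values) if v >= threshold]
```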
CNB2003101160349A 2003-12-29 2003-12-29 Method for detecting and correcting red-eye in image Expired - Fee Related CN100468453C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2003101160349A CN100468453C (en) 2003-12-29 2003-12-29 Method for detecting and correcting red-eye in image


Publications (2)

Publication Number Publication Date
CN1635547A CN1635547A (en) 2005-07-06
CN100468453C true CN100468453C (en) 2009-03-11

Family

ID=34843534

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2003101160349A Expired - Fee Related CN100468453C (en) 2003-12-29 2003-12-29 Method for detecting and correcting red-eye in image

Country Status (1)

Country Link
CN (1) CN100468453C (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447026B (en) * 2008-12-26 2013-02-13 北京中星微电子有限公司 Pinkeye detecting device and detection method
CN104637031B (en) * 2013-11-12 2017-08-29 华为终端有限公司 Eyes image treating method and apparatus
CN104063850B (en) * 2014-06-24 2017-02-01 广东互维科技有限公司 Red-eye correction method
CN104573715B (en) 2014-12-30 2017-07-25 百度在线网络技术(北京)有限公司 The recognition methods in image subject region and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Automatic detection and repair method for red-eye photos" (红眼照片自动检测和修复方法). 王艺莼, 卜佳俊. Computer Engineering (计算机工程), Vol. 29, No. 11, 2003. *

Also Published As

Publication number Publication date
CN1635547A (en) 2005-07-06

Similar Documents

Publication Publication Date Title
CN101620679B (en) Method for eliminating red eye in image
CN101799923B (en) Image processing apparatus for detecting coordinate position of characteristic portion of face
CN101246593B (en) Color image edge detection method and apparatus
CN107507144A (en) Processing method, device and the image processing apparatus of colour of skin enhancing
CN108961260B (en) Image binarization method and device and computer storage medium
CN108614995A (en) Gesture data collection acquisition method, gesture identification method and device for YOLO networks
US20020118889A1 (en) Image status estimating method, image correcting method, image correction apparatus, and storage medium
CN103440633A (en) Digital image automatic speckle-removing method
CN100468453C (en) Method for detecting and correcting red-eye in image
CN111882555B (en) Deep learning-based netting detection method, device, equipment and storage medium
JP2008234509A (en) Image evaluation device, method and program
CN103020949A (en) Facial image detection method
Dong et al. Detecting soft shadows in a single outdoor image: From local edge-based models to global constraints
CN105590307A (en) Transparency-based matting method and apparatus
CN109389116A (en) A kind of character detection method and device
CN109961015A (en) Image-recognizing method, device, equipment and storage medium
CN106910207B (en) Method and device for identifying local area of image and terminal equipment
CN109889696B (en) Anti-noise shot image recognition method and system for automatic geometric correction
CN110706254B (en) Target tracking template self-adaptive updating method
CN107818552A (en) A kind of binocular image goes reflective method
CN112348808A (en) Screen perspective detection method and device
CN115423724B (en) Underwater image enhancement method, device and medium for reinforcement learning parameter optimization
CN117152787A (en) Character clothing recognition method, device, equipment and readable storage medium
CN113223098B (en) Preprocessing optimization method for image color classification
CN114358131A (en) Digital photo frame intelligent photo optimization processing system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090311

Termination date: 20161229