CN101986346B - Face image processing method and device

Info

Publication number
CN101986346B
Authority
CN
China
Prior art keywords
facial image
image
display plane
plane
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2010105339767A
Other languages
Chinese (zh)
Other versions
CN101986346A (en)
Inventor
安文吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority to CN2010105339767A priority Critical patent/CN101986346B/en
Publication of CN101986346A publication Critical patent/CN101986346A/en
Application granted granted Critical
Publication of CN101986346B publication Critical patent/CN101986346B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

Embodiments of the invention disclose a face image processing method and device in the field of image processing. The method and device can correct a face image so that it is displayed at a predetermined angle. The face image processing method comprises the following steps: obtaining a first face image from a downward-looking angle and a second face image from an upward-looking angle; and projecting the pixels of the first face image and the second face image onto a display plane according to a preset correspondence. The method and device are mainly used for face image processing.

Description

Face image processing method and device
Technical field
The present invention relates to the field of image processing, and in particular to a face image processing method and device.
Background
With the continuous development of communication technology, people are no longer content with exchanging information by text and voice, and video calls have been applied in the communications field.
In the prior art, video calls are implemented by installing a camera on a communication device. The communication device obtains images through the camera and sends them in real time to the communication device of the other party, which displays the received images in real time.
The camera acts as the other party's eyes. During a video call, each person looks at the other party's image or at text on the display screen rather than at the camera, so each party sees the eyes in the other party's image looking somewhere off screen instead of back at their own. Because the gaze direction in the other party's image always differs from one's own line of sight by some angle, the two parties cannot meet each other's eyes, and the feeling of an immersive, face-to-face conversation cannot be achieved.
Summary of the invention
Embodiments of the invention provide a face image processing method and device that can correct a face image so that the face image is displayed at a predetermined angle.
To achieve the above objective, the embodiments of the invention adopt the following technical solutions.
A face image processing method comprises:
obtaining a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
projecting the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
A face image processing device comprises:
a first acquiring unit, configured to obtain a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
a projecting unit, configured to project the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
With the face image processing method and device provided by the embodiments of the invention, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
Brief description of the drawings
To describe the technical solutions in the embodiments of the invention more clearly, the accompanying drawings required for describing the embodiments are introduced briefly below. Apparently, the accompanying drawings described below show only some embodiments of the invention, and persons of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the face image processing method according to an embodiment of the invention;
Fig. 2 is a flowchart of a preferred embodiment of the face image processing method according to an embodiment of the invention;
Fig. 3 is a schematic diagram of the relationships between the planes and angles in the face image processing method according to an embodiment of the invention;
Fig. 4 is a structural diagram of the face image processing device according to an embodiment of the invention;
Fig. 5 is a structural diagram of a preferred embodiment of the face image processing device according to an embodiment of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the invention without creative effort shall fall within the protection scope of the invention.
An embodiment of the invention provides a face image processing method, which, as shown in Fig. 1, comprises the following steps:
101. Obtain a first face image from a downward-looking angle and a second face image from an upward-looking angle through an image acquisition device.
102. Project the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
With the face image processing method provided by this embodiment, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
As an improvement of the foregoing embodiment, another embodiment of the invention provides a face image processing method, which, as shown in Fig. 2, comprises the following steps:
201. Obtain a first face image from a downward-looking angle and a second face image from an upward-looking angle through image acquisition devices, for example cameras.
A first image acquisition device arranged in the middle of the front of the top of the terminal device may be used to obtain the first face image from a downward-looking angle, and a second image acquisition device arranged in the middle of the front of the bottom of the terminal device may be used to obtain the second face image from an upward-looking angle.
In one implementation of this embodiment, the first face image and the second face image may each be either a complete image of the face or a partial image of the face. For example, the first face image may be an image of the upper half of the face, and the second face image may be an image of the lower half of the face.
In another implementation of this embodiment, the first face image may be the face part of the image captured by the first image acquisition device from the downward-looking angle, and the second face image may be the face part of the image captured by the second image acquisition device from the upward-looking angle.
In one implementation of this step, there may be two image acquisition devices: one arranged in the middle of the front of the top of the terminal device, and the other arranged in the middle of the front of the bottom of the terminal device. Taking a mobile phone as an example, one camera may be placed in the middle of the front of the top of the phone and another in the middle of the front of the bottom of the phone. The viewfinder range of each camera can be set according to actual conditions, for example as follows:
First, the camera in the middle of the front of the top of the phone captures the image of the upper half of the face, and the camera in the middle of the front of the bottom of the phone captures the image of the lower half of the face.
Second, both cameras capture a complete image of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquisition device captures its image may be equal to the upward-looking angle at which the second image acquisition device captures its image.
202. To make the face image processing method of this embodiment more targeted, after the first image acquisition device captures an image from the downward-looking angle, an image recognition technique such as face recognition is used to extract the face part from the captured image as the first face image to be processed; after the second image acquisition device captures an image from the upward-looking angle, an image recognition technique such as face recognition is likewise used to extract the face part from the captured image as the second face image to be processed.
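The extraction step above can be illustrated with a minimal sketch in Python, assuming OpenCV is available; the Haar-cascade detector and the camera frames (top_camera_frame, bottom_camera_frame) are assumptions of the sketch, since the embodiment only requires some image recognition technique such as face recognition.

```python
# Minimal sketch of step 202: crop the face region from a captured frame.
# The cascade choice and the frame sources are illustrative assumptions only.
import cv2

def extract_face(frame):
    """Return the face part of `frame`, or None if no face is detected."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    return frame[y:y + h, x:x + w]

# first_face = extract_face(top_camera_frame)      # image taken looking down
# second_face = extract_face(bottom_camera_frame)  # image taken looking up
```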
203. Calculate the cosine of a first angle a between the plane of the first face image and the display plane, and the cosine of a second angle b between the plane of the second face image and the display plane.
It is assumed here that, when the user uses a terminal device adopting the face image processing method of this embodiment, the line from the nose of the face to the centre of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting optical path between the first image acquisition device and the face when the first image acquisition device captures the image. Clearly, the plane of the first face image represented by AC is perpendicular to the shooting optical path between the first image acquisition device and the face represented by GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, is equal to the angle a0 between GE and the shooting optical path GD between the first image acquisition device and the face. Hence cos(a) = cos(a0).
Usually, when the terminal device is manufactured, the angle between the shooting optical path of the first image acquisition device and the display plane is predefined. The angle a0, which is complementary to the angle between that shooting optical path and the display plane, can then be determined; the value of angle a follows, and the value of cos(a) can be calculated.
Similarly, the cosine cos(b) of the second angle b between the plane of the second face image and the display plane can be calculated from the predefined angle between the shooting optical path of the second image acquisition device and the display plane.
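As a worked example of this fixed-angle case: if θ denotes the predefined angle between the shooting optical path and the display plane (θ is a symbol introduced here only for illustration), then a0 = 90° − θ, so cos(a) = cos(a0) = sin(θ). A short sketch, with an illustrative value of θ, is given below.

```python
import math

def cos_angle_fixed_camera(theta_deg):
    """cos(a) when the angle theta between the shooting optical path and the
    display plane is fixed at manufacture: a = a0 = 90 deg - theta."""
    return math.sin(math.radians(theta_deg))

# Illustrative value only: a camera whose optical path makes 75 deg with the display plane.
# cos_a = cos_angle_fixed_camera(75.0)
# cos_b = cos_angle_fixed_camera(75.0)  # symmetric bottom camera in this example
```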
In another implementation of this step, the angle between the shooting optical path of the first image acquisition device and the display plane is not predefined at manufacture; instead, an image recognition technique, such as organ recognition, is used so that the camera tracks a certain organ of the face. While the user is using the device, the angle between the shooting optical path of the first image acquisition device and the display plane changes with the distance between the face and the display plane. In this case, the value of cos(a) is calculated as follows:
First, the distance between the face and the display plane needs to be determined; the distance between the face plane and the image display plane can be obtained through a range sensor.
Second, as shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting optical path between the first image acquisition device and the face when the image is captured. Clearly, no matter how this shooting optical path changes, the plane of the first face image represented by AC remains perpendicular to the shooting optical path between the first image acquisition device and the face represented by GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, is equal to the angle a0 between GE and the shooting optical path GD between the first image acquisition device and the face. Hence cos(a) = cos(a0).
cos(a) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the image acquisition device and the centre of the image display plane.
Similarly, the cosine of the second angle b between the plane of the second face image and the display plane equals the cosine of the angle b0 between GF and GE:
cos(b) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the image acquisition device and the centre of the image display plane.
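A minimal sketch of this distance-based computation follows; the variable names m and n follow the description, while the numeric values and the separate offsets for the top and bottom cameras are illustrative assumptions.

```python
import math

def cos_from_geometry(m, n):
    """Cosine of the angle between a face-image plane and the display plane:
    m is the distance between the face plane and the image display plane,
    n is the distance between the camera centre and the display-plane centre,
    giving cos = m / sqrt(m**2 + n**2)."""
    return m / math.hypot(m, n)

# Illustrative values only: m read from a range sensor, n from the device layout.
# m = 400.0                     # face plane to display plane (e.g. millimetres)
# n_top, n_bottom = 60.0, 60.0  # camera offsets from the display centre
# cos_a = cos_from_geometry(m, n_top)
# cos_b = cos_from_geometry(m, n_bottom)
```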
204. Project the first face image onto the display plane according to the first included-angle cosine cos(a), and project the second face image onto the display plane according to the second included-angle cosine cos(b).
First, calculate the coordinates of the projection of each pixel of the first face image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and project the pixels of the first face image onto the display plane.
Then calculate the coordinates of the projection of each pixel of the second face image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and project the pixels of the second face image onto the display plane.
Here the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) are the coordinates of a pixel in the second face image, and (x0, y0) are the coordinates of a point on the display plane; cos(a) is the cosine of the included angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the included angle between the plane of the second face image and the display plane.
In this embodiment, the pixels of the first face image and the pixels of the second face image may be projected onto the display plane in either order; no time sequence is required.
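A sketch of this projection step is given below, assuming the face images are arrays of pixels indexed from the projection origin and that projected coordinates are rounded to the nearest display pixel; the rounding, the bounds check and the use of None for an empty display position are assumptions of the sketch, not part of the embodiment.

```python
def project_onto_display(image, cos_angle, display):
    """Project every pixel of `image` onto `display` using the preset
    correspondence x0 = x / cos_angle, y0 = y / cos_angle."""
    for y in range(len(image)):
        for x in range(len(image[0])):
            x0 = int(round(x / cos_angle))
            y0 = int(round(y / cos_angle))
            if 0 <= y0 < len(display) and 0 <= x0 < len(display[0]):
                display[y0][x0] = image[y][x]
    return display

# display = [[None] * DISPLAY_W for _ in range(DISPLAY_H)]  # DISPLAY_W/H assumed
# project_onto_display(first_face, cos_a, display)   # the two calls may run
# project_onto_display(second_face, cos_b, display)  # in either order
```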
In one implementation of this embodiment, to avoid projecting repeatedly onto the same coordinate point of the display plane, projecting the pixels of the first face image onto the display plane may mean projecting them only onto coordinate positions of the display plane onto which the pixels of the second face image have not been projected; or
projecting the pixels of the second face image onto the display plane may mean projecting them only onto coordinate positions of the display plane onto which the pixels of the first face image have not been projected.
In this implementation, after the first face image or the second face image has been projected onto the display plane, when the other face image is then projected, not all of its pixels are projected onto the display plane; only the pixels corresponding to coordinate points that have not yet received a projection are projected. This avoids projecting repeatedly onto the same coordinate point of the display plane.
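A sketch of this duplicate-avoidance variant follows, reusing the projection relation above; an empty display position is again modelled as None, which is an assumption of the sketch.

```python
def project_unfilled_only(image, cos_angle, display):
    """Project pixels of `image` only onto display coordinates that have not
    yet received a pixel from the other face image."""
    for y in range(len(image)):
        for x in range(len(image[0])):
            x0 = int(round(x / cos_angle))
            y0 = int(round(y / cos_angle))
            if (0 <= y0 < len(display) and 0 <= x0 < len(display[0])
                    and display[y0][x0] is None):
                display[y0][x0] = image[y][x]
    return display

# project_onto_display(first_face, cos_a, display)    # project one image fully,
# project_unfilled_only(second_face, cos_b, display)  # then only fill the gaps
```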
With the face image processing method provided by this embodiment, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
An embodiment of the invention provides a face image processing device, which, as shown in Fig. 4, comprises a first acquiring unit 41 and a projecting unit 42.
The first acquiring unit 41 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The projecting unit 42 projects the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
With the face image processing device provided by this embodiment, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
An embodiment of the invention provides another face image processing device, which, as shown in Fig. 5, comprises a first acquiring unit 51, a second acquisition unit 52 and a projecting unit 53.
The projecting unit 53 comprises an acquisition module 531 and a projection module 532.
The first acquiring unit 51 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The second acquisition unit 52 obtains the distance between the face plane and the image display plane through a range sensor. The projecting unit 53 projects the pixels of the first face image and the second face image onto the display plane according to a preset correspondence. Specifically, the acquisition module 531 first obtains the first included-angle cosine between the plane of the first face image and the display plane and the second included-angle cosine between the plane of the second face image and the display plane; the projection module 532 then projects the first face image onto the display plane according to the first included-angle cosine and projects the second face image onto the display plane according to the second included-angle cosine.
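The unit structure of Fig. 5 could be mirrored in software roughly as follows, reusing the helper functions sketched earlier; the class name, the camera objects with a capture() method and the range sensor with a read_distance() method are all illustrative assumptions, not elements of the embodiment.

```python
class FaceImageProcessingDevice:
    """Rough software mirror of Fig. 5: a first acquiring unit (two cameras),
    a second acquisition unit (range sensor) and a projecting unit consisting
    of an acquisition module and a projection module."""

    def __init__(self, top_camera, bottom_camera, range_sensor, n_top, n_bottom):
        self.top_camera, self.bottom_camera = top_camera, bottom_camera
        self.range_sensor = range_sensor
        self.n_top, self.n_bottom = n_top, n_bottom  # camera-to-display-centre offsets

    def process(self, display):
        # First acquiring unit: face images from the top-down and bottom-up views.
        first_face = extract_face(self.top_camera.capture())
        second_face = extract_face(self.bottom_camera.capture())
        # Second acquisition unit: face-to-display distance from the range sensor.
        m = self.range_sensor.read_distance()
        # Acquisition module 531: the two included-angle cosine values.
        cos_a = cos_from_geometry(m, self.n_top)
        cos_b = cos_from_geometry(m, self.n_bottom)
        # Projection module 532: project both images onto the display plane.
        project_onto_display(first_face, cos_a, display)
        project_unfilled_only(second_face, cos_b, display)
        return display
```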
With the face image processing device provided by this embodiment, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
In one implementation of this embodiment, the first acquiring unit 51 may use a first image acquisition device arranged in the middle of the front of the top of the terminal device to obtain the first face image from a downward-looking angle, and a second image acquisition device arranged in the middle of the front of the bottom of the terminal device to obtain the second face image from an upward-looking angle.
In one implementation of this embodiment, the first face image and the second face image obtained by the first acquiring unit 51 may each be either a complete image of the face or a partial image of the face.
In another implementation of this embodiment, the first face image obtained by the first acquiring unit 51 may be the face part of the image captured by the first image acquisition device from the downward-looking angle, and the second face image may be the face part of the image captured by the second image acquisition device from the upward-looking angle.
In one implementation, the first acquiring unit 51 comprises two image acquisition devices: one arranged in the middle of the front of the top of the terminal device, and the other arranged in the middle of the front of the bottom of the terminal device. Taking a mobile phone as an example, one camera may be placed in the middle of the front of the top of the phone and another in the middle of the front of the bottom of the phone. The viewfinder range of each camera can be set according to actual conditions, for example as follows:
First, the camera in the middle of the front of the top of the phone captures the image of the upper half of the face, and the camera in the middle of the front of the bottom of the phone captures the image of the lower half of the face.
Second, both cameras capture a complete image of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquisition device captures its image may be equal to the upward-looking angle at which the second image acquisition device captures its image.
To make the image acquisition by the first acquiring unit 51 of the face image processing device of this embodiment more targeted, after the first image acquisition device captures an image from the downward-looking angle, an image recognition technique such as face recognition is used to extract the face part from the captured image as the first face image to be processed; after the second image acquisition device captures an image from the upward-looking angle, an image recognition technique such as face recognition is likewise used to extract the face part from the captured image as the second face image to be processed.
In one implementation of this embodiment, the acquisition module 531 may obtain the first included-angle cosine between the plane of the first face image and the display plane and the second included-angle cosine between the plane of the second face image and the display plane as follows:
It is assumed here that, when the user uses a terminal device adopting the face image processing method of this embodiment, the line from the nose of the face to the centre of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting optical path between the first image acquisition device and the face when the first image acquisition device captures the image. Clearly, the plane of the first face image represented by AC is perpendicular to the shooting optical path between the first image acquisition device and the face represented by GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, is equal to the angle a0 between GE and the shooting optical path GD between the first image acquisition device and the face. Hence cos(a) = cos(a0).
Usually, when the terminal device is manufactured, the angle between the shooting optical path of the first image acquisition device and the display plane is predefined. The angle a0, which is complementary to the angle between that shooting optical path and the display plane, can then be determined; the value of angle a follows, and the value of cos(a) can be calculated.
Similarly, the value of cos(b) can be calculated.
In another implementation of this step, the angle between the shooting optical path of the first image acquisition device and the display plane is not predefined at manufacture; instead, an image recognition technique, such as organ recognition, is used so that the camera tracks a certain organ of the face. While the user is using the device, the angle between the shooting optical path of the first image acquisition device and the display plane changes with the distance between the face and the display plane. In this case, the value of cos(a) is calculated as follows:
First, the distance between the face and the display plane needs to be determined; the distance between the face plane and the image display plane can be obtained through a range sensor.
Second, as shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting optical path between the first image acquisition device and the face when the image is captured. Clearly, no matter how this shooting optical path changes, the plane of the first face image represented by AC remains perpendicular to the shooting optical path between the first image acquisition device and the face represented by GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, is equal to the angle a0 between GE and the shooting optical path GD between the first image acquisition device and the face. Hence cos(a) = cos(a0).
cos(a) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the image acquisition device and the centre of the image display plane.
Similarly,
cos(b) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the image acquisition device and the centre of the image display plane.
In one implementation of this embodiment, the projection module 532 may project the first face image onto the display plane according to the first included-angle cosine and project the second face image onto the display plane according to the second included-angle cosine as follows:
Project the first face image onto the display plane according to the first included-angle cosine cos(a), and project the second face image onto the display plane according to the second included-angle cosine cos(b).
First, calculate the coordinates of the projection of each pixel of the first face image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and project the pixels of the first face image onto the display plane.
Then calculate the coordinates of the projection of each pixel of the second face image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and project the pixels of the second face image onto the display plane.
Here the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) are the coordinates of a pixel in the second face image, and (x0, y0) are the coordinates of a point on the display plane; cos(a) is the cosine of the included angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the included angle between the plane of the second face image and the display plane.
In this embodiment, the pixels of the first face image and the pixels of the second face image may be projected onto the display plane in either order; no time sequence is required.
In one implementation of this embodiment, to avoid projecting repeatedly onto the same coordinate point of the display plane, projecting the pixels of the first face image onto the display plane may mean projecting them only onto coordinate positions of the display plane onto which the pixels of the second face image have not been projected; or
projecting the pixels of the second face image onto the display plane may mean projecting them only onto coordinate positions of the display plane onto which the pixels of the first face image have not been projected.
In this implementation, after the first face image or the second face image has been projected onto the display plane, when the other face image is then projected, not all of its pixels are projected onto the display plane; only the pixels corresponding to coordinate points that have not yet received a projection are projected. This avoids projecting repeatedly onto the same coordinate point of the display plane.
With the face image processing device provided by this embodiment, the pixels of the first face image obtained from a downward-looking angle and of the second face image obtained from an upward-looking angle are projected onto a display plane, and the image projected onto the display plane is a blend of the first face image and the second face image. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
The terminal device described in the embodiments of the invention may be a device such as a mobile phone, a PDA or a computer.
From the foregoing description of the embodiments, persons skilled in the art can clearly understand that the present invention may be implemented by software plus the necessary general-purpose hardware, and certainly also by hardware alone, although in many cases the former is the better implementation. Based on such an understanding, the part of the technical solutions of the present invention that in essence contributes to the prior art may be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, hard disk or optical disc of a computer, and includes a number of instructions that cause a computer device (which may be a personal computer, a server, a network device or the like) to perform the methods described in the embodiments of the present invention.
The foregoing describes only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by persons skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (11)

1. A face image processing method, characterized by comprising:
obtaining a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
projecting pixels of the first face image and the second face image onto a display plane according to a preset correspondence, comprising: obtaining a first included-angle cosine between the plane of the first face image and the display plane and a second included-angle cosine between the plane of the second face image and the display plane; and projecting the first face image onto the display plane according to the first included-angle cosine and projecting the second face image onto the display plane according to the second included-angle cosine;
wherein projecting the first face image onto the display plane according to the first included-angle cosine and projecting the second face image onto the display plane according to the second included-angle cosine is: calculating the coordinates of the projection of each pixel of the first face image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and projecting the pixels of the first face image onto the display plane; and calculating the coordinates of the projection of each pixel of the second face image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and projecting the pixels of the second face image onto the display plane;
wherein the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) are the coordinates of a pixel in the second face image, and (x0, y0) are the coordinates of a point on the display plane; cos(a) is the cosine of the included angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the included angle between the plane of the second face image and the display plane.
2. The face image processing method according to claim 1, characterized in that projecting the pixels of the first face image onto the display plane is: projecting them onto coordinate positions of the display plane onto which the pixels of the second face image have not been projected; or
projecting the pixels of the second face image onto the display plane is: projecting them onto coordinate positions of the display plane onto which the pixels of the first face image have not been projected.
3. The face image processing method according to claim 1 or 2, characterized in that obtaining the first face image from a downward-looking angle and obtaining the second face image from an upward-looking angle is: using a first image acquisition device arranged in the middle of the front of the top of a terminal device to obtain the first face image from the downward-looking angle, and using a second image acquisition device arranged in the middle of the front of the bottom of the terminal device to obtain the second face image from the upward-looking angle.
4. The face image processing method according to claim 3, characterized in that
cos(a) = m/√(m² + n²) and cos(b) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the corresponding image acquisition device and the centre of the image display plane.
5. The face image processing method according to claim 4, characterized in that the distance between the face plane and the image display plane is obtained through a range sensor.
6. The face image processing method according to claim 5, characterized in that the first face image is the face part of the image captured by the first image acquisition device from the downward-looking angle, and the second face image is the face part of the image captured by the second image acquisition device from the upward-looking angle.
7. A face image processing device, characterized by comprising:
a first acquiring unit, configured to obtain a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
a projecting unit, configured to project pixels of the first face image and the second face image onto a display plane according to a preset correspondence;
wherein the projecting unit comprises:
an acquisition module, configured to obtain a first included-angle cosine between the plane of the first face image and the display plane and a second included-angle cosine between the plane of the second face image and the display plane; and
a projection module, configured to project the first face image onto the display plane according to the first included-angle cosine and to project the second face image onto the display plane according to the second included-angle cosine;
wherein the projection module projecting the first face image onto the display plane according to the first included-angle cosine and projecting the second face image onto the display plane according to the second included-angle cosine is: the projection module calculates the coordinates of the projection of each pixel of the first face image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and projects the pixels of the first face image onto the display plane; and calculates the coordinates of the projection of each pixel of the second face image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and projects the pixels of the second face image onto the display plane;
wherein the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) are the coordinates of a pixel in the second face image, and (x0, y0) are the coordinates of a point on the display plane; cos(a) is the cosine of the included angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the included angle between the plane of the second face image and the display plane.
8. The face image processing device according to claim 7, characterized in that the projection module projecting the pixels of the first face image onto the display plane is: the projection module projects them onto coordinate positions of the display plane onto which the pixels of the second face image have not been projected; or
the projection module projecting the pixels of the second face image onto the display plane is: the projection module projects them onto coordinate positions of the display plane onto which the pixels of the first face image have not been projected.
9. The face image processing device according to claim 7 or 8, characterized in that the first acquiring unit obtaining the first face image from a downward-looking angle and the second face image from an upward-looking angle is: the first acquiring unit uses a first image acquisition device arranged in the middle of the front of the top of a terminal device to obtain the first face image from the downward-looking angle, and uses a second image acquisition device arranged in the middle of the front of the bottom of the terminal device to obtain the second face image from the upward-looking angle.
10. The face image processing device according to claim 9, characterized in that
cos(a) = m/√(m² + n²) and cos(b) = m/√(m² + n²),
where m is the distance between the face plane and the image display plane, and n is the distance between the centre of the corresponding image acquisition device and the centre of the image display plane.
11. The face image processing device according to claim 10, characterized in that the face image processing device further comprises:
a second acquisition unit, configured to obtain the distance between the face plane and the image display plane through a range sensor.
CN2010105339767A 2010-11-01 2010-11-01 Face image processing method and device Active CN101986346B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105339767A CN101986346B (en) 2010-11-01 2010-11-01 Face image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010105339767A CN101986346B (en) 2010-11-01 2010-11-01 Face image processing method and device

Publications (2)

Publication Number Publication Date
CN101986346A CN101986346A (en) 2011-03-16
CN101986346B true CN101986346B (en) 2012-08-08

Family

ID=43710692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105339767A Active CN101986346B (en) 2010-11-01 2010-11-01 Face image processing method and device

Country Status (1)

Country Link
CN (1) CN101986346B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103746985A (en) * 2013-12-30 2014-04-23 华为技术有限公司 Method and device for communication
JP2016009873A (en) * 2014-06-20 2016-01-18 ソニー株式会社 Information processing apparatus, information processing system, information processing method, and program
CN105989577B (en) * 2015-02-17 2020-12-29 中兴通讯股份有限公司 Image correction method and device
CN105812709A (en) * 2016-03-18 2016-07-27 合肥联宝信息技术有限公司 Method for realizing virtual camera by using cameras


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4663699B2 (en) * 2007-09-27 2011-04-06 富士フイルム株式会社 Image display device and image display method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5500671A (en) * 1994-10-25 1996-03-19 At&T Corp. Video conference system and method of providing parallax correction and a sense of presence
US6771303B2 (en) * 2002-04-23 2004-08-03 Microsoft Corporation Video-teleconferencing system with eye-gaze correction
CN1672431A (en) * 2002-05-21 2005-09-21 索尼株式会社 Information processing apparatus, information processing system, and dialogist displaying method
CN101547332A (en) * 2008-03-26 2009-09-30 联想(北京)有限公司 Image processing method and device thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JP 2009-86703 A (Japanese laid-open patent publication), 2009-04-23
Kin-Man Lam et al., "An Analytic-to-Holistic Approach for Face Recognition Based on a Single Frontal View," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 7, 1998. *
顾亦然 et al., "三维人脸姿态校正算法研究" (Research on 3D face pose correction algorithms), 《仪器仪表学报》 (Chinese Journal of Scientific Instrument), vol. 31, no. 10, 2010. *

Also Published As

Publication number Publication date
CN101986346A (en) 2011-03-16


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20171101

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, No. 2 New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee after: HUAWEI terminal (Dongguan) Co., Ltd.

Address before: 518129 Building 2, Zone B, Huawei Base, Bantian, Longgang District, Guangdong Province

Patentee before: Huawei Device Co., Ltd.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee after: Huawei Device Co., Ltd.

Address before: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee before: HUAWEI terminal (Dongguan) Co., Ltd.

CP01 Change in the name or title of a patent holder