CN101986346A - Face image processing method and device - Google Patents

Face image processing method and device

Info

Publication number
CN101986346A
CN101986346A (application CN201010533976A)
Authority
CN
China
Prior art keywords
facial image
image
display plane
plane
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010533976
Other languages
Chinese (zh)
Other versions
CN101986346B (en)
Inventor
安文吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority to CN2010105339767A priority Critical patent/CN101986346B/en
Publication of CN101986346A publication Critical patent/CN101986346A/en
Application granted granted Critical
Publication of CN101986346B publication Critical patent/CN101986346B/en
Legal status: Active

Abstract

Embodiments of the invention disclose a face image processing method and device in the field of image processing. The method and device can correct a face image so that it is displayed at a predetermined angle. The face image processing method comprises the following steps: obtaining a first face image from a downward-looking angle and a second face image from an upward-looking angle; and projecting the pixels of the first and second face images onto a display plane according to a preset correspondence. The method and device are mainly used for face image processing.

Description

Face image processing method and device
Technical field
The present invention relates to the field of image processing, and in particular to a face image processing method and device.
Background technology
With the continuous development of communication technology, people are no longer content with text and voice as modes of information interaction, and video calls have been applied in the communications field.
In the prior art, video calls are implemented by installing a camera on a communication device. The device captures images through the camera and transmits them in real time to the peer communication device, which displays the received images in real time.
The camera acts as the counterpart's eyes. During a video call, however, each user is always looking at the counterpart's image or text on the display screen rather than at the camera. As a result, both parties often see the eyes in the other's image looking somewhere off-screen instead of meeting their own gaze. Because the gaze direction in the received image always differs from the viewer's own line of sight by some angle, the two parties cannot make eye contact, and the conversation lacks the immediate, face-to-face feeling.
Summary of the invention
Embodiments of the invention provide a face image processing method and device that can correct a face image so that it is displayed at a predetermined angle.
To achieve the above object, the embodiments of the invention adopt the following technical solutions:
A face image processing method comprises:
obtaining a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
projecting the pixels of the first and second face images onto a display plane according to a preset correspondence.
A face image processing device comprises:
a first acquiring unit, configured to obtain a first face image from a downward-looking angle and a second face image from an upward-looking angle; and
a projecting unit, configured to project the pixels of the first and second face images onto a display plane according to a preset correspondence.
In the face image processing method and device provided by the embodiments of the invention, the pixels of the first face image, obtained from a downward-looking angle, and of the second face image, obtained from an upward-looking angle, are projected onto a display plane; the image projected onto the display plane is a blend of the first and second face images. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
Description of drawings
To describe the technical solutions in the embodiments of the invention more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Apparently, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the face image processing method according to an embodiment of the invention;
Fig. 2 is a flowchart of a preferred embodiment of the face image processing method;
Fig. 3 is a schematic diagram of the relationship between the planes and the angles in the face image processing method;
Fig. 4 is a structural diagram of the face image processing device according to an embodiment of the invention;
Fig. 5 is a structural diagram of a preferred embodiment of the face image processing device.
Embodiment
The technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort shall fall within the protection scope of the invention.
An embodiment of the invention provides a face image processing method which, as shown in Fig. 1, comprises the following steps:
101. Obtain, through an image acquiring device, a first face image from a downward-looking angle and a second face image from an upward-looking angle.
102. Project the pixels of the first and second face images onto a display plane according to a preset correspondence.
In the face image processing method provided by this embodiment, the pixels of the first face image, obtained from a downward-looking angle, and of the second face image, obtained from an upward-looking angle, are projected onto a display plane; the image projected onto the display plane is a blend of the first and second face images. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
As an improvement of the above embodiment, an embodiment of the invention provides another face image processing method which, as shown in Fig. 2, comprises the following steps:
201. Obtain, through an image acquiring device such as a camera, a first face image from a downward-looking angle and a second face image from an upward-looking angle.
The first face image may be obtained from a downward-looking angle by a first image acquiring device arranged at the middle of the top front of the terminal device, and the second face image from an upward-looking angle by a second image acquiring device arranged at the middle of the bottom front of the terminal device.
In one implementation of this embodiment, the first and second face images may be either complete images of the face or partial images of it. For example, the first face image may cover the upper half of the face and the second face image the lower half.
In another implementation, the first face image may be the face region of the image captured by the first image acquiring device from the downward-looking angle, and the second face image the face region of the image captured by the second image acquiring device from the upward-looking angle.
In one implementation of this step, two image acquiring devices may be used: one arranged at the middle of the top front of the terminal device and the other at the middle of the bottom front. Taking a mobile phone as an example, one camera may be set at the middle of the top front of the phone and another at the middle of the bottom front. The viewfinder range of each camera may be set according to actual conditions, for example as follows.
First, the camera at the middle of the top front of the phone captures the upper half of the face, and the camera at the middle of the bottom front captures the lower half.
Second, both cameras may capture the complete image of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquiring device captures images may equal the upward-looking angle at which the second image acquiring device captures images.
202. To make the face image processing method of this embodiment more targeted, after the first image acquiring device captures an image from the downward-looking angle, image recognition technology, such as face recognition, is used to extract the face region from the captured image as the first face image to be processed; after the second image acquiring device captures an image from the upward-looking angle, image recognition technology, such as face recognition, is likewise used to extract the face region from the captured image as the second face image to be processed.
203. Calculate the cosine of the first angle a between the plane of the first face image and the display plane, and the cosine of the second angle b between the plane of the second face image and the display plane.
Here it is assumed that, when the user uses a terminal device adopting the method of this embodiment, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face when the image is captured. Apparently, the plane AC of the first face image is perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device, i.e. GE is perpendicular to the image display plane AB.
Therefore the angle between the plane AC of the first face image and the image display plane AB, that is, the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD, so cos(a) = cos(a0).
Usually, the angle between the shooting light path of the first image acquiring device and the display plane is fixed when the terminal device is manufactured. The angle a0 is the complement of that fixed angle, so the value of the angle a can be obtained and cos(a) calculated directly.
In the same manner, the cosine cos(b) of the second angle b between the plane of the second face image and the display plane can be calculated from the predefined angle between the shooting light path of the second image acquiring device and the display plane.
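As an informal illustration of the fixed-angle case just described, the following sketch derives cos(a), and in the same way cos(b), from a design angle assumed to be fixed at manufacture. The function name and the 60-degree sample values are assumptions for illustration, not taken from the patent:

```python
import math

def angle_cosine_from_design(light_path_to_display_deg: float) -> float:
    """Given the fixed angle (in degrees) between a camera's shooting light
    path and the display plane, return the cosine of the angle between the
    face-image plane and the display plane.

    The face-image plane is perpendicular to the light path, so the angle
    a0 (which equals a) is the complement of the design angle."""
    a0_deg = 90.0 - light_path_to_display_deg
    return math.cos(math.radians(a0_deg))

# Hypothetical design angles for the top and bottom cameras.
cos_a = angle_cosine_from_design(60.0)  # top camera, downward-looking
cos_b = angle_cosine_from_design(60.0)  # bottom camera, upward-looking
```

With a design angle of 90 degrees the camera would shoot perpendicular to the display, the face-image plane would be parallel to it, and the cosine would be 1.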
In another implementation of this step, the angle between the shooting light path of the first image acquiring device and the display plane is not fixed at manufacture; instead, image recognition technology, such as facial-feature recognition, makes the camera track a certain feature of the face. While the user uses the device, the angle between the shooting light path and the display plane then varies with the distance between the face and the display plane. In this case, cos(a) is calculated as follows.
First, the distance between the face and the display plane is determined; the distance between the face plane and the image display plane can be obtained by a range sensor.
Next, as shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face when the image is captured. Apparently, no matter how the shooting light path changes, the plane AC of the first face image remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device, i.e. GE is perpendicular to the image display plane AB.
Therefore the first angle a between the plane of the first face image and the display plane equals the angle a0 between GE and the shooting light path GD, so cos(a) = cos(a0).
cos(a) = cos(a0) = m / √(m² + n²)
where m is the distance between the face plane and the image display plane, and n is the distance between the center of the first image acquiring device and the center of the image display plane.
In the same manner, the cosine of the second angle b between the plane of the second face image and the display plane equals the cosine of the angle b0 between GF and GE:
cos(b) = cos(b0) = m / √(m² + n²)
where m is the distance between the face plane and the image display plane, and n is the distance between the center of the second image acquiring device and the center of the image display plane.
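The relation above can be sketched directly in code. This is a minimal illustration under the stated definitions of m and n (both in the same length unit); the function name and the sample measurements are invented:

```python
import math

def angle_cosine_from_geometry(m: float, n: float) -> float:
    """Cosine of the angle between a face-image plane and the display plane,
    where m is the distance from the face plane to the image display plane
    and n is the distance from the camera center to the display-plane center."""
    return m / math.sqrt(m * m + n * n)

# Hypothetical measurements: face 300 mm from the screen, camera 60 mm
# from the center of the screen.
cos_a = angle_cosine_from_geometry(300.0, 60.0)
```

As the face moves farther away (m grows relative to n), the cosine approaches 1 and the correction shrinks, matching the intuition that a distant face is seen nearly head-on.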
204. Project the first face image onto the display plane according to the first angle cosine cos(a), and project the second face image onto the display plane according to the second angle cosine cos(b).
First, the coordinates to which the pixels of the first face image project on the display plane are calculated according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and the pixels of the first face image are projected onto the display plane accordingly.
Then the coordinates to which the pixels of the second face image project are calculated according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and the pixels of the second face image are projected onto the display plane.
Here the display plane is parallel to the image display plane; (x1, y1) is the coordinate of a pixel in the first face image, (x2, y2) the coordinate of a pixel in the second face image, and (x0, y0) the coordinate of a point on the display plane; cos(a) is the cosine of the angle between the plane of the first face image and the display plane, and cos(b) the cosine of the angle between the plane of the second face image and the display plane.
In this embodiment, the pixels of the first face image and of the second face image may be projected onto the display plane in either order; no temporal sequence is required.
In one implementation of this embodiment, to prevent the same coordinate point on the display plane from being projected to repeatedly, the pixels of the first face image may be projected only to coordinate positions on the display plane that the pixels of the second face image have not been projected to; or
the pixels of the second face image may be projected only to coordinate positions on the display plane that the pixels of the first face image have not been projected to.
That is, after the first or second face image has been projected onto the display plane, the other face image is not projected in full; only those of its pixels whose corresponding coordinate points have not yet been projected to are projected. This prevents any coordinate point on the display plane from receiving repeated projections.
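The projection relations of steps 203 and 204, together with the rule that no display coordinate is projected to twice, can be sketched as follows. This is a minimal grayscale sketch, not the patented implementation: images are NumPy arrays, pixel coordinates are taken relative to the image origin, and projected coordinates are rounded to the nearest integer.

```python
import numpy as np

def project_onto_display(first_img, second_img, cos_a, cos_b, out_shape):
    """Project pixels of two face images onto a display-plane buffer using
    x0 = x/cos(angle), y0 = y/cos(angle).  The first image is projected
    first; pixels of the second image are written only to coordinates not
    already filled, so no display point receives a repeated projection."""
    display = np.zeros(out_shape, dtype=first_img.dtype)
    filled = np.zeros(out_shape, dtype=bool)
    for img, c in ((first_img, cos_a), (second_img, cos_b)):
        h, w = img.shape
        for y in range(h):
            for x in range(w):
                x0 = int(round(x / c))
                y0 = int(round(y / c))
                in_bounds = 0 <= y0 < out_shape[0] and 0 <= x0 < out_shape[1]
                if in_bounds and not filled[y0, x0]:
                    display[y0, x0] = img[y, x]
                    filled[y0, x0] = True
    return display
```

With cos_a = cos_b = 1 the mapping is the identity, so the second image only fills display points the first image left empty, which makes the first-write-wins rule easy to see.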
In the face image processing method provided by this embodiment, the pixels of the first face image, obtained from a downward-looking angle, and of the second face image, obtained from an upward-looking angle, are projected onto a display plane; the image projected onto the display plane is a blend of the first and second face images. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
An embodiment of the invention provides a face image processing device which, as shown in Fig. 4, comprises a first acquiring unit 41 and a projecting unit 42.
The first acquiring unit 41 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The projecting unit 42 projects the pixels of the first and second face images onto a display plane according to a preset correspondence.
In the face image processing device provided by this embodiment, the pixels of the first face image, obtained from a downward-looking angle, and of the second face image, obtained from an upward-looking angle, are projected onto a display plane; the image projected onto the display plane is a blend of the first and second face images. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
An embodiment of the invention provides another face image processing device which, as shown in Fig. 5, comprises a first acquiring unit 51, a second acquiring unit 52, and a projecting unit 53.
The projecting unit 53 comprises an obtaining module 531 and a projection module 532.
The first acquiring unit 51 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The second acquiring unit 52 obtains the distance between the face plane and the image display plane through a range sensor. The projecting unit 53 projects the pixels of the first and second face images onto the display plane according to a preset correspondence. Specifically, the obtaining module 531 first obtains the first angle cosine between the plane of the first face image and the display plane and the second angle cosine between the plane of the second face image and the display plane; the projection module 532 then projects the first face image onto the display plane according to the first angle cosine, and the second face image according to the second angle cosine.
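The unit structure of Fig. 5 might be sketched as plain classes, with the obtaining module computing the angle cosines from the range-sensor distance and the projection module applying the x/cos, y/cos mapping. All class and method names here are assumptions for illustration, not the patent's terminology:

```python
import math

class ObtainingModule:
    """Module 531: derives the two angle cosines from geometry."""
    def angle_cosines(self, m, n_top, n_bottom):
        # m: face-to-display distance (from the range sensor);
        # n_top / n_bottom: offsets of the two cameras from the screen center.
        cos_a = m / math.sqrt(m * m + n_top * n_top)
        cos_b = m / math.sqrt(m * m + n_bottom * n_bottom)
        return cos_a, cos_b

class ProjectionModule:
    """Module 532: maps a pixel coordinate onto the display plane."""
    def project(self, x, y, cos_angle):
        return x / cos_angle, y / cos_angle

class ProjectingUnit:
    """Unit 53: combines the obtaining and projection modules."""
    def __init__(self):
        self.obtaining = ObtainingModule()
        self.projection = ProjectionModule()
```

Splitting the cosine computation from the coordinate mapping mirrors the module boundary the embodiment describes, so either part could be replaced (for example, fixed design angles instead of a range sensor) without touching the other.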
In the face image processing device provided by this embodiment, the pixels of the first face image, obtained from a downward-looking angle, and of the second face image, obtained from an upward-looking angle, are projected onto a display plane; the image projected onto the display plane is a blend of the first and second face images. This changes the display angle of the original images, so that the face image is displayed at the predetermined angle.
In one implementation of this embodiment, the first acquiring unit 51 may use a first image acquiring device arranged at the middle of the top front of the terminal device to obtain the first face image from a downward-looking angle, and a second image acquiring device arranged at the middle of the bottom front to obtain the second face image from an upward-looking angle.
In one implementation, the first and second face images obtained by the first acquiring unit 51 may be either complete images of the face or partial images of it.
In another implementation, the first face image obtained by the first acquiring unit 51 may be the face region of the image captured by the first image acquiring device from the downward-looking angle, and the second face image the face region of the image captured by the second image acquiring device from the upward-looking angle.
In one implementation of this step, the first acquiring unit 51 comprises two image acquiring devices: one arranged at the middle of the top front of the terminal device and the other at the middle of the bottom front. Taking a mobile phone as an example, one camera may be set at the middle of the top front of the phone and another at the middle of the bottom front. The viewfinder range of each camera may be set according to actual conditions, for example as follows.
First, the camera at the middle of the top front of the phone captures the upper half of the face, and the camera at the middle of the bottom front captures the lower half.
Second, both cameras may capture the complete image of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquiring device captures images may equal the upward-looking angle at which the second image acquiring device captures images.
To make the image acquisition by the first acquiring unit 51 more targeted, after the first image acquiring device captures an image from the downward-looking angle, image recognition technology, such as face recognition, is used to extract the face region from the captured image as the first face image to be processed; after the second image acquiring device captures an image from the upward-looking angle, image recognition technology, such as face recognition, is likewise used to extract the face region from the captured image as the second face image to be processed.
In one implementation of this embodiment, the obtaining module 531 may obtain the first angle cosine between the plane of the first face image and the display plane and the second angle cosine between the plane of the second face image and the display plane as follows.
Here it is assumed that, when the user uses a terminal device adopting the method of this embodiment, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face when the image is captured; the plane AC is perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device, i.e. GE is perpendicular to the image display plane AB.
Therefore the first angle a between the plane of the first face image and the display plane equals the angle a0 between GE and the shooting light path GD, so cos(a) = cos(a0).
Usually, the angle between the shooting light path of the first image acquiring device and the display plane is fixed when the terminal device is manufactured. The angle a0 is the complement of that fixed angle, so the value of the angle a can be obtained and cos(a) calculated directly.
In the same manner, the value of cos(b) can be calculated.
In another implementation of this step, the angle between the shooting light path of the first image acquiring device and the display plane is not fixed at manufacture; instead, image recognition technology, such as facial-feature recognition, makes the camera track a certain feature of the face. While the user uses the device, the angle between the shooting light path and the display plane then varies with the distance between the face and the display plane. In this case, cos(a) is calculated as follows.
First, the distance between the face and the display plane is determined; the distance between the face plane and the image display plane can be obtained by a range sensor.
Next, as shown in Fig. 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face when the image is captured. Apparently, no matter how the shooting light path changes, the plane AC of the first face image remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device, i.e. GE is perpendicular to the image display plane AB.
Therefore the first angle a between the plane of the first face image and the display plane equals the angle a0 between GE and the shooting light path GD, so cos(a) = cos(a0).
cos(a) = cos(a0) = m / √(m² + n²)
where m is the distance between the face plane and the image display plane, and n is the distance between the center of the first image acquiring device and the center of the image display plane.
In the same manner:
cos(b) = cos(b0) = m / √(m² + n²)
where m is the distance between the face plane and the image display plane, and n is now the distance between the center of the second image acquiring device and the center of the image display plane.
In one implementation of this embodiment, the projection module 532 may project the first face image onto the display plane according to the first angle cosine cos(a), and the second face image according to the second angle cosine cos(b), as follows.
First, the coordinates to which the pixels of the first face image project on the display plane are calculated according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and the pixels of the first face image are projected onto the display plane accordingly.
Then the coordinates to which the pixels of the second face image project are calculated according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and the pixels of the second face image are projected onto the display plane.
Here the display plane is parallel to the image display plane; (x1, y1) is the coordinate of a pixel in the first face image, (x2, y2) the coordinate of a pixel in the second face image, and (x0, y0) the coordinate of a point on the display plane; cos(a) is the cosine of the angle between the plane of the first face image and the display plane, and cos(b) the cosine of the angle between the plane of the second face image and the display plane.
In the present embodiment pixel in described first facial image is projected on the described display plane, perhaps the pixel in described second facial image is projected on the described display plane, can not have sequencing in time.
As an embodiment of the present embodiment, the projection that same coordinate points on display plane is repeated, described pixel in described first facial image is projected on the described display plane can be coordinate position projection on the described display plane that the pixel in described second facial image is not projected to.Perhaps
Described pixel in described second facial image is projected on the described display plane can be coordinate position projection on the described display plane that the pixel in described first facial image is not projected to.
In the present embodiment, after the first facial image or the second facial image has been projected onto the display plane, when the other facial image is subsequently projected onto the display plane, not all of its pixels are projected; instead, only those pixels whose corresponding coordinate points have not yet been projected onto are projected. This prevents the same coordinate point on the display plane from being projected onto repeatedly.
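A minimal sketch of this fill-only-empty-coordinates rule, assuming the projected pixels have already been quantized to integer display-plane coordinates stored as dictionary keys (the function name and data layout are illustrative, not taken from the patent):

```python
def merge_projections(first_points, second_points, canvas):
    """Fill display-plane coordinates from the first image, then fill
    only the still-empty coordinates from the second image, so that no
    display-plane point is projected onto twice.

    first_points, second_points: dicts mapping (x0, y0) display-plane
        coordinates to pixel values, as produced by the projection step.
    canvas: dict to fill; typically starts empty.
    """
    for coord, value in first_points.items():
        canvas[coord] = value            # first image fills freely
    for coord, value in second_points.items():
        if coord not in canvas:          # skip coordinates already filled
            canvas[coord] = value
    return canvas

canvas = merge_projections({(0, 0): 10, (1, 1): 20},
                           {(1, 1): 99, (2, 2): 30}, {})
# (1, 1) keeps the value projected from the first image.
```

Swapping the two arguments gives the alternative embodiment in which the second facial image is projected first and the first image fills only the remaining coordinates.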
With the facial image processing apparatus provided in the present embodiment, the pixels of the first facial image, obtained from a downward-looking angle, and of the second facial image, obtained from an upward-looking angle, are projected onto the display plane; the projected image on the display plane is a neutralization (blend) of the first facial image and the second facial image. The display angle of the original image can thus be changed, so that the facial image is displayed at a predetermined angle.
The terminal device described in the embodiments of the invention may be a mobile phone, a PDA, a computer, or similar equipment.
From the above description of the embodiments, those skilled in the art will clearly understand that the present invention may be implemented by software plus the necessary common hardware, or of course by hardware alone, but in many cases the former is the better implementation. Based on such understanding, the part of the technical solution of the present invention that in essence contributes over the prior art may be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, hard disk, or optical disc of a computer, and includes instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. A facial image processing method, characterized by comprising:
obtaining a first facial image from a downward-looking angle, and obtaining a second facial image from an upward-looking angle; and
projecting pixels in the first facial image and the second facial image onto a display plane according to a preset correspondence.
2. The facial image processing method according to claim 1, characterized in that projecting the pixels in the first facial image and the second facial image onto the display plane according to the preset correspondence comprises:
obtaining a first included-angle cosine value between the plane of the first facial image and the display plane, and a second included-angle cosine value between the plane of the second facial image and the display plane; and
projecting the first facial image onto the display plane according to the first included-angle cosine value, and projecting the second facial image onto the display plane according to the second included-angle cosine value.
3. The facial image processing method according to claim 2, characterized in that projecting the first facial image onto the display plane according to the first included-angle cosine value and projecting the second facial image onto the display plane according to the second included-angle cosine value is: calculating the projected coordinates of the pixels in the first facial image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and projecting the pixels in the first facial image onto the display plane; and calculating the projected coordinates of the pixels in the second facial image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and projecting the pixels in the second facial image onto the display plane;
wherein the display plane is parallel to the image display plane, (x1, y1) is the coordinate of a pixel in the first facial image, (x2, y2) is the coordinate of a pixel in the second facial image, (x0, y0) is the coordinate of a point on the display plane, cos(a) is the cosine of the angle between the plane of the first facial image and the display plane, and cos(b) is the cosine of the angle between the plane of the second facial image and the display plane.
4. The facial image processing method according to claim 3, characterized in that projecting the pixels in the first facial image onto the display plane is: projecting onto those coordinate positions of the display plane to which the pixels in the second facial image have not been projected; or
projecting the pixels in the second facial image onto the display plane is: projecting onto those coordinate positions of the display plane to which the pixels in the first facial image have not been projected.
5. The facial image processing method according to any one of claims 1 to 4, characterized in that obtaining the first facial image from the downward-looking angle and obtaining the second facial image from the upward-looking angle is: using a first image acquiring device arranged in the middle of the top front end of a terminal device to obtain the first facial image from the downward-looking angle, and using a second image acquiring device arranged in the middle of the bottom front end of the terminal device to obtain the second facial image from the upward-looking angle.
6. The facial image processing method according to claim 3 or 4, characterized in that,
wherein m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
7. The facial image processing method according to claim 6, characterized in that the distance between the face plane and the image display plane is obtained by a distance sensor.
8. The facial image processing method according to claim 5, characterized in that the first facial image is the face portion of the image acquired by the first image acquiring device from the downward-looking angle, and the second facial image is the face portion of the image acquired by the second image acquiring device from the upward-looking angle.
9. A facial image processing apparatus, characterized by comprising:
a first acquiring unit, configured to obtain a first facial image from a downward-looking angle and to obtain a second facial image from an upward-looking angle; and
a projecting unit, configured to project pixels in the first facial image and the second facial image onto a display plane according to a preset correspondence.
10. The facial image processing apparatus according to claim 9, characterized in that the projecting unit comprises:
an acquiring module, configured to obtain a first included-angle cosine value between the plane of the first facial image and the display plane, and a second included-angle cosine value between the plane of the second facial image and the display plane; and
a projecting module, configured to project the first facial image onto the display plane according to the first included-angle cosine value, and to project the second facial image onto the display plane according to the second included-angle cosine value.
11. The facial image processing apparatus according to claim 10, characterized in that the projecting module projecting the first facial image onto the display plane according to the first included-angle cosine value and projecting the second facial image onto the display plane according to the second included-angle cosine value is: the projecting module calculates the projected coordinates of the pixels in the first facial image on the display plane according to the relations x0 = x1/cos(a) and y0 = y1/cos(a), and projects the pixels in the first facial image onto the display plane; and calculates the projected coordinates of the pixels in the second facial image on the display plane according to the relations x0 = x2/cos(b) and y0 = y2/cos(b), and projects the pixels in the second facial image onto the display plane;
wherein the display plane is parallel to the image display plane, (x1, y1) is the coordinate of a pixel in the first facial image, (x2, y2) is the coordinate of a pixel in the second facial image, (x0, y0) is the coordinate of a point on the display plane, cos(a) is the cosine of the angle between the plane of the first facial image and the display plane, and cos(b) is the cosine of the angle between the plane of the second facial image and the display plane.
12. The facial image processing apparatus according to claim 11, characterized in that the projecting unit projecting the pixels in the first facial image onto the display plane is: projecting onto those coordinate positions of the display plane to which the pixels in the second facial image have not been projected; or
the projecting unit projecting the pixels in the second facial image onto the display plane is: projecting onto those coordinate positions of the display plane to which the pixels in the first facial image have not been projected.
13. The facial image processing apparatus according to any one of claims 9 to 12, characterized in that the first acquiring unit obtaining the first facial image from the downward-looking angle and obtaining the second facial image from the upward-looking angle is: the first acquiring unit uses a first image acquiring device arranged in the middle of the top front end of a terminal device to obtain the first facial image from the downward-looking angle, and uses a second image acquiring device arranged in the middle of the bottom front end of the terminal device to obtain the second facial image from the upward-looking angle.
14. The facial image processing apparatus according to claim 11 or 12, characterized in that,
Figure FSA00000335643600031
wherein m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
15. The facial image processing apparatus according to claim 14, characterized in that the facial image processing apparatus further comprises:
a second acquiring unit, configured to obtain the distance between the face plane and the image display plane by means of a distance sensor.
CN2010105339767A 2010-11-01 2010-11-01 Face image processing method and device Active CN101986346B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105339767A CN101986346B (en) 2010-11-01 2010-11-01 Face image processing method and device


Publications (2)

Publication Number Publication Date
CN101986346A true CN101986346A (en) 2011-03-16
CN101986346B CN101986346B (en) 2012-08-08

Family

ID=43710692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105339767A Active CN101986346B (en) 2010-11-01 2010-11-01 Face image processing method and device

Country Status (1)

Country Link
CN (1) CN101986346B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103746985A (en) * 2013-12-30 2014-04-23 华为技术有限公司 Method and device for communication
CN105812709A (en) * 2016-03-18 2016-07-27 合肥联宝信息技术有限公司 Method for realizing virtual camera by using cameras
CN105989577A (en) * 2015-02-17 2016-10-05 中兴通讯股份有限公司 Image correction method and device
CN106664361A (en) * 2014-06-20 2017-05-10 索尼公司 Information processing device, information processing system, and information processing method and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5500671A (en) * 1994-10-25 1996-03-19 At&T Corp. Video conference system and method of providing parallax correction and a sense of presence
US6771303B2 (en) * 2002-04-23 2004-08-03 Microsoft Corporation Video-teleconferencing system with eye-gaze correction
CN1672431A (en) * 2002-05-21 2005-09-21 索尼株式会社 Information processing apparatus, information processing system, and dialogist displaying method
JP2009086703A (en) * 2007-09-27 2009-04-23 Fujifilm Corp Image display device, image display method and image display program
CN101547332A (en) * 2008-03-26 2009-09-30 联想(北京)有限公司 Image processing method and device thereof


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Kin-Man Lam et al., "An Analytic-to-Holistic Approach for Face Recognition Based on a Single Frontal View", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, No. 7, July 1998. *
Gu Yiran et al., "Research on 3D Face Pose Correction Algorithms", Chinese Journal of Scientific Instrument (仪器仪表学报), Vol. 31, No. 10, October 2010. *


Also Published As

Publication number Publication date
CN101986346B (en) 2012-08-08

Similar Documents

Publication Publication Date Title
US10694175B2 (en) Real-time automatic vehicle camera calibration
US8908008B2 (en) Methods and systems for establishing eye contact and accurate gaze in remote collaboration
CN107396075B (en) Method and device for generating projection image correction information
US9626564B2 (en) System for enabling eye contact in electronic images
CN106462937B (en) Image processing apparatus and image display apparatus
US20110148868A1 (en) Apparatus and method for reconstructing three-dimensional face avatar through stereo vision and face detection
CN104881114B (en) A kind of angular turn real-time matching method based on 3D glasses try-in
CN105787884A (en) Image processing method and electronic device
CN102483854A (en) Image processing system
CN105763829A (en) Image processing method and electronic device
CN101986346B (en) Face image processing method and device
CN112017222A (en) Video panorama stitching and three-dimensional fusion method and device
CN104881526A (en) Article wearing method and glasses try wearing method based on 3D (three dimensional) technology
CN103458259B (en) A kind of 3D video causes detection method, the Apparatus and system of people's eye fatigue
US20150206338A1 (en) Display device, display method, and program
CN106774870A (en) A kind of augmented reality exchange method and system
KR101148508B1 (en) A method and device for display of mobile device, and mobile device using the same
CN205545426U (en) Three -dimensional mobile telephone device of panoramic shooting group
CN107592520B (en) Imaging device and imaging method of AR equipment
CN107291225A (en) A kind of wear-type virtual reality/augmented reality device based on mobile phone
CN105812709A (en) Method for realizing virtual camera by using cameras
TW201518994A (en) Viewing angle adjusting method, apparatus and system of liquid crystal display
CN105488764B (en) Fisheye image correcting method and device
CN102222362B (en) Method and device for generating water wave special effect and electronic equipment
CN110288680A (en) Image generating method and mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20171101

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee after: Huawei Device (Dongguan) Co., Ltd.

Address before: 518129 Building 2, Zone B, Huawei Base, Bantian, Longgang District, Guangdong Province

Patentee before: Huawei Device Co., Ltd.

CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee after: Huawei Device Co., Ltd.

Address before: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee before: Huawei Device (Dongguan) Co., Ltd.