Embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
An embodiment of the present invention provides a face image processing method which, as shown in Figure 1, comprises the following steps:
101. Obtain, by an image acquiring device, a first face image from a looking-down angle and a second face image from a looking-up angle.
102. Project the pixels in the first face image and the second face image onto a display plane according to a preset correspondence.
In the face image processing method provided by this embodiment, the pixels of the first face image obtained from the looking-down angle and of the second face image obtained from the looking-up angle are projected onto a display plane, so that the projected image on the display plane is a neutralized combination of the first and second face images. The display angle of the original images can thus be changed, and the face image displayed at a predetermined angle.
As an improvement of the above embodiment, the embodiment of the invention provides another face image processing method which, as shown in Figure 2, comprises the following steps:
201. Obtain, by an image acquiring device such as a camera, a first face image from the looking-down angle and a second face image from the looking-up angle.
A first image acquiring device arranged at the middle of the top front edge of a terminal device may be used to obtain the first face image from the looking-down angle, and a second image acquiring device arranged at the middle of the bottom front edge of the terminal device may be used to obtain the second face image from the looking-up angle.
In one implementation of this embodiment, each of the first face image and the second face image may be either a whole image of the face or a partial image of the face. For example, the first face image may be an image of the upper half of the face and the second face image an image of the lower half of the face.
In another implementation, the first face image may be the face portion of the image acquired by the first image acquiring device from the looking-down angle, and the second face image may be the face portion of the image acquired by the second image acquiring device from the looking-up angle.
In one implementation of this step, there may be two image acquiring devices: one arranged at the middle of the top front edge of the terminal device and the other at the middle of the bottom front edge. Taking a mobile phone as an example, one camera may be set at the middle of the top front edge of the phone and another at the middle of the bottom front edge. The viewfinder range of each camera may be set according to the actual situation, for example according to either of the following schemes:
First, the camera at the middle of the top front edge of the phone obtains the image of the upper half of the face, and the camera at the middle of the bottom front edge obtains the image of the lower half of the face.
Second, both cameras capture the whole image of the face.
In one implementation, the looking-down angle at which the first image acquiring device obtains its image may be the same as the looking-up angle at which the second image acquiring device obtains its image.
202. To make the face image processing method of the embodiment of the invention more targeted, after the first image acquiring device acquires an image from the looking-down angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the first face image to be processed; after the second image acquiring device acquires an image from the looking-up angle, image recognition technology such as face recognition is likewise used to extract the face portion from the acquired image as the second face image to be processed.
203. Calculate the cosine of a first angle a between the plane of the first face image and the display plane, and the cosine of a second angle b between the plane of the second face image and the display plane.
It is assumed here that, when the user uses a terminal device adopting the face image processing method of the embodiment of the invention, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the moment the image is obtained. Obviously, the plane AC of the first face image is perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane AB.
Therefore, the angle between the plane AC of the first face image and the image display plane AB, that is, the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD. Hence cos(a) = cos(a0).
Usually, the angle between the shooting light path of the first image acquiring device and the display plane is predefined when the terminal device is manufactured. The angle a0, which is the complement of that predefined angle, can then be determined, from which the value of the angle a and hence the value of cos(a) can be calculated.
Similarly, the cosine cos(b) of the second angle b between the plane of the second face image and the display plane can be calculated from the predefined angle between the shooting light path of the second image acquiring device and the display plane.
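The complement calculation above can be sketched as follows; this is a minimal illustration only, and the function name and degree-valued parameter are assumptions rather than part of the embodiment:

```python
import math

def included_angle_cosine(path_to_display_deg):
    # a0 is the complement of the predefined angle between the shooting
    # light path and the display plane, and the first angle a equals a0,
    # so cos(a) = cos(90 degrees - predefined angle).
    a0_deg = 90.0 - path_to_display_deg
    return math.cos(math.radians(a0_deg))
```

For a shooting light path perpendicular to the display plane (90 degrees), a0 is 0 and the cosine is 1, which corresponds to the face image plane being parallel to the display plane.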
In another implementation of this step, the angle between the shooting light path of the first image acquiring device and the display plane is not predefined when the terminal device is manufactured; instead, image recognition technology, such as organ recognition, makes the camera track a certain organ of the face. While the user is using the device, the angle between the shooting light path of the first image acquiring device and the display plane therefore changes with the distance between the face and the display plane. In this case, the value of cos(a) is calculated by the following method:
First, the distance between the face and the display plane must be determined; the distance between the face plane and the image display plane can be obtained by a distance sensor.
Second, as shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the moment the image is obtained. Obviously, however the shooting light path between the first image acquiring device and the face changes, the plane AC of the first face image remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane AB.
Therefore, the angle between the plane AC of the first face image and the image display plane AB, that is, the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD. Hence cos(a) = cos(a0).
cos(a) = cos(a0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
Similarly, the cosine of the second angle b between the plane of the second face image and the display plane equals the cosine of the angle b0 between GF and GE:
cos(b) = cos(b0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
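Under the right-triangle geometry above (legs m and n, hypotenuse along the shooting light path), the cosine follows directly from the sensor readings; a sketch, with an assumed function name:

```python
import math

def cosine_from_distances(m, n):
    # The shooting light path is the hypotenuse of a right triangle whose
    # legs are m (face plane to image display plane, from the range
    # sensor) and n (camera center to display center), so
    # cos(a0) = m / sqrt(m^2 + n^2).
    return m / math.hypot(m, n)
```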
204. Project the first face image onto the display plane according to the first included-angle cosine cos(a), and project the second face image onto the display plane according to the second included-angle cosine cos(b).
First, the coordinates on the display plane of the projection of each pixel in the first face image are calculated according to the relations x0 = x1/cos(a), y0 = y1/cos(a), and the pixels of the first face image are projected onto the display plane accordingly.
The coordinates on the display plane of the projection of each pixel in the second face image are calculated according to the relations x0 = x2/cos(b), y0 = y2/cos(b), and the pixels of the second face image are projected onto the display plane accordingly.
Here the display plane is parallel to the image display plane; (x1, y1) is the coordinate of a pixel in the first face image, (x2, y2) is the coordinate of a pixel in the second face image, and (x0, y0) is the coordinate of a point on the display plane; cos(a) is the included-angle cosine between the plane of the first face image and the display plane, and cos(b) is the included-angle cosine between the plane of the second face image and the display plane.
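The coordinate relations x0 = x/cos, y0 = y/cos can be sketched as follows; the pixel representation and the rounding to the display-plane grid are assumptions for illustration:

```python
def project_pixels(pixels, cos_angle):
    # Scale each source pixel (x, y, value) by 1/cos(angle) and round
    # the result to the display-plane pixel grid.
    projected = {}
    for x, y, value in pixels:
        projected[(round(x / cos_angle), round(y / cos_angle))] = value
    return projected
```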
In this embodiment, projecting the pixels of the first face image onto the display plane and projecting the pixels of the second face image onto the display plane need not follow any particular order in time.
In one implementation, to avoid repeated projection onto the same coordinate point on the display plane, the pixels of the first face image may be projected only onto those coordinate positions on the display plane onto which no pixel of the second face image has been projected; or
the pixels of the second face image may be projected only onto those coordinate positions on the display plane onto which no pixel of the first face image has been projected.
In this embodiment, after the first face image or the second face image has been projected onto the display plane, the other face image is not then projected in its entirety; instead, only those of its pixels whose corresponding coordinate points have not yet been projected onto are projected. This avoids repeated projection onto the same coordinate point on the display plane.
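The duplicate-avoiding projection described above can be sketched as follows; the pixel representation and function name are illustrative assumptions:

```python
def merge_projections(first_pixels, second_pixels, cos_a, cos_b):
    # Project the first face image in full, then project only those
    # pixels of the second face image whose target coordinate points
    # on the display plane have not yet been projected onto.
    display = {}
    for x, y, value in first_pixels:
        display[(round(x / cos_a), round(y / cos_a))] = value
    for x, y, value in second_pixels:
        key = (round(x / cos_b), round(y / cos_b))
        if key not in display:
            display[key] = value
    return display
```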
In the face image processing method provided by this embodiment, the pixels of the first face image obtained from the looking-down angle and of the second face image obtained from the looking-up angle are projected onto a display plane, so that the projected image on the display plane is a neutralized combination of the first and second face images. The display angle of the original images can thus be changed, and the face image displayed at a predetermined angle.
An embodiment of the present invention provides a face image processing apparatus which, as shown in Figure 4, comprises a first acquiring unit 41 and a projecting unit 42.
The first acquiring unit 41 obtains a first face image from the looking-down angle and a second face image from the looking-up angle. The projecting unit 42 projects the pixels in the first face image and the second face image onto a display plane according to a preset correspondence.
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the looking-down angle and of the second face image obtained from the looking-up angle are projected onto a display plane, so that the projected image on the display plane is a neutralized combination of the first and second face images. The display angle of the original images can thus be changed, and the face image displayed at a predetermined angle.
An embodiment of the present invention provides another face image processing apparatus which, as shown in Figure 5, comprises a first acquiring unit 51, a second acquiring unit 52 and a projecting unit 53.
The projecting unit 53 comprises an acquisition module 531 and a projection module 532.
The first acquiring unit 51 obtains the first face image from the looking-down angle and the second face image from the looking-up angle. The second acquiring unit 52 obtains the distance between the face plane and the image display plane by a distance sensor. The projecting unit 53 projects the pixels in the first face image and the second face image onto the display plane according to a preset correspondence, specifically as follows: first, the acquisition module 531 obtains the first included-angle cosine between the plane of the first face image and the display plane, and the second included-angle cosine between the plane of the second face image and the display plane; then the projection module 532 projects the first face image onto the display plane according to the first included-angle cosine, and projects the second face image onto the display plane according to the second included-angle cosine.
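The division of the projecting unit 53 into the acquisition module 531 and the projection module 532 can be sketched as follows; the class and method names are illustrative assumptions, not part of the embodiment:

```python
import math

class AcquisitionModule:
    # Module 531: obtains the included-angle cosine from the distances
    # m (face plane to display plane) and n (camera center to display
    # center), per cos(a0) = m / sqrt(m^2 + n^2).
    def included_angle_cosine(self, m, n):
        return m / math.hypot(m, n)

class ProjectionModule:
    # Module 532: projects pixels onto the display plane by dividing
    # their coordinates by the included-angle cosine.
    def project(self, pixels, cos_angle):
        return {(round(x / cos_angle), round(y / cos_angle)): v
                for x, y, v in pixels}

class ProjectingUnit:
    # Unit 53: combines the two modules.
    def __init__(self):
        self.acquisition = AcquisitionModule()
        self.projection = ProjectionModule()
```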
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the looking-down angle and of the second face image obtained from the looking-up angle are projected onto a display plane, so that the projected image on the display plane is a neutralized combination of the first and second face images. The display angle of the original images can thus be changed, and the face image displayed at a predetermined angle.
In one implementation of this embodiment, the first acquiring unit 51 may use a first image acquiring device arranged at the middle of the top front edge of the terminal device to obtain the first face image from the looking-down angle, and a second image acquiring device arranged at the middle of the bottom front edge of the terminal device to obtain the second face image from the looking-up angle.
In one implementation, each of the first face image and the second face image acquired by the first acquiring unit 51 may be either a whole image of the face or a partial image of the face.
In another implementation, the first face image acquired by the first acquiring unit 51 may be the face portion of the image acquired by the first image acquiring device from the looking-down angle, and the second face image may be the face portion of the image acquired by the second image acquiring device from the looking-up angle.
In one implementation, the first acquiring unit 51 comprises two image acquiring devices: one arranged at the middle of the top front edge of the terminal device and the other at the middle of the bottom front edge. Taking a mobile phone as an example, one camera may be set at the middle of the top front edge of the phone and another at the middle of the bottom front edge. The viewfinder range of each camera may be set according to the actual situation, for example according to either of the following schemes:
First, the camera at the middle of the top front edge of the phone obtains the image of the upper half of the face, and the camera at the middle of the bottom front edge obtains the image of the lower half of the face.
Second, both cameras capture the whole image of the face.
In one implementation, the looking-down angle at which the first image acquiring device obtains its image may be the same as the looking-up angle at which the second image acquiring device obtains its image.
To make the first acquiring unit 51 of the face image processing apparatus of the embodiment of the invention more targeted when obtaining images, after the first image acquiring device acquires an image from the looking-down angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the first face image to be processed; after the second image acquiring device acquires an image from the looking-up angle, image recognition technology such as face recognition is likewise used to extract the face portion from the acquired image as the second face image to be processed.
In one implementation, the acquisition module 531 may obtain the first included-angle cosine between the plane of the first face image and the display plane, and the second included-angle cosine between the plane of the second face image and the display plane, by the following method:
It is assumed here that, when the user uses a terminal device adopting the face image processing apparatus of the embodiment of the invention, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the moment the image is obtained. Obviously, the plane AC of the first face image is perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane AB.
Therefore, the angle between the plane AC of the first face image and the image display plane AB, that is, the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD. Hence cos(a) = cos(a0).
Usually, the angle between the shooting light path of the first image acquiring device and the display plane is predefined when the terminal device is manufactured. The angle a0, which is the complement of that predefined angle, can then be determined, from which the value of the angle a and hence the value of cos(a) can be calculated.
Similarly, the value of cos(b) can be calculated.
In another implementation of this step, the angle between the shooting light path of the first image acquiring device and the display plane is not predefined when the terminal device is manufactured; instead, image recognition technology, such as organ recognition, makes the camera track a certain organ of the face. While the user is using the device, the angle between the shooting light path of the first image acquiring device and the display plane therefore changes with the distance between the face and the display plane. In this case, the value of cos(a) is calculated by the following method:
First, the distance between the face and the display plane must be determined; the distance between the face plane and the image display plane can be obtained by a distance sensor.
Second, as shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the moment the image is obtained. Obviously, however the shooting light path between the first image acquiring device and the face changes, the plane AC of the first face image remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane AB.
Therefore, the angle between the plane AC of the first face image and the image display plane AB, that is, the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD. Hence cos(a) = cos(a0).
cos(a) = cos(a0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
Similarly, it can be derived that cos(b) = cos(b0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the image acquiring device and the center of the image display plane.
In one implementation, the projection module 532 may project the first face image onto the display plane according to the first included-angle cosine and the second face image onto the display plane according to the second included-angle cosine by the following method:
The first face image is projected onto the display plane according to the first included-angle cosine cos(a), and the second face image is projected onto the display plane according to the second included-angle cosine cos(b).
First, the coordinates on the display plane of the projection of each pixel in the first face image are calculated according to the relations x0 = x1/cos(a), y0 = y1/cos(a), and the pixels of the first face image are projected onto the display plane accordingly.
The coordinates on the display plane of the projection of each pixel in the second face image are calculated according to the relations x0 = x2/cos(b), y0 = y2/cos(b), and the pixels of the second face image are projected onto the display plane accordingly.
Here the display plane is parallel to the image display plane; (x1, y1) is the coordinate of a pixel in the first face image, (x2, y2) is the coordinate of a pixel in the second face image, and (x0, y0) is the coordinate of a point on the display plane; cos(a) is the included-angle cosine between the plane of the first face image and the display plane, and cos(b) is the included-angle cosine between the plane of the second face image and the display plane.
In this embodiment, projecting the pixels of the first face image onto the display plane and projecting the pixels of the second face image onto the display plane need not follow any particular order in time.
In one implementation, to avoid repeated projection onto the same coordinate point on the display plane, the pixels of the first face image may be projected only onto those coordinate positions on the display plane onto which no pixel of the second face image has been projected; or
the pixels of the second face image may be projected only onto those coordinate positions on the display plane onto which no pixel of the first face image has been projected.
In this embodiment, after the first face image or the second face image has been projected onto the display plane, the other face image is not then projected in its entirety; instead, only those of its pixels whose corresponding coordinate points have not yet been projected onto are projected. This avoids repeated projection onto the same coordinate point on the display plane.
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the looking-down angle and of the second face image obtained from the looking-up angle are projected onto a display plane, so that the projected image on the display plane is a neutralized combination of the first and second face images. The display angle of the original images can thus be changed, and the face image displayed at a predetermined angle.
The terminal device in the embodiments of the present invention may be a mobile phone, a PDA, a computer or similar equipment.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention may be implemented by software plus the necessary general-purpose hardware, or of course purely by hardware, though the former is the better implementation in most cases. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a readable storage medium, such as a floppy disk, hard disk or optical disc of a computer, including several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform the methods described in the embodiments of the present invention.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.