WO2003030086A1 - Head motion estimation from four feature points - Google Patents
Head motion estimation from four feature points
- Publication number
- WO2003030086A1 WO2003030086A1 PCT/IB2002/003713 IB0203713W WO03030086A1 WO 2003030086 A1 WO2003030086 A1 WO 2003030086A1 IB 0203713 W IB0203713 W IB 0203713W WO 03030086 A1 WO03030086 A1 WO 03030086A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- points
- head
- image
- facial image
- linear method
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
Definitions
- the present invention relates to systems and methods for computing head motion estimation from the facial image positions, e.g., eye and mouth corners, and, particularly, to a linear method for performing head motion estimation using four (4) facial feature points.
- a linear method for performing head motion estimation using four (4) facial feature points is provided. As a special case, an algorithm for head pose estimation from the same four feature points is additionally described.
- Head pose recognition is an important research area in human-computer interaction, and many approaches to head pose recognition have been proposed. Most of these approaches model a face with certain facial features. For example, most existing approaches utilize six facial feature points, including pupils, nostrils and lip corners, to model a face, while others, such as the approach reported in the reference to Z. Liu and Z. Zhang entitled "Robust Head Motion Computation by Taking Advantage of Physical Properties", Proc. Workshop on Human Motion, pp. 73-80, Austin, December 2000, use five facial feature points, including eye and mouth corners and the tip of the nose. In Liu and Zhang, the head motion is estimated from the five feature points through non-linear optimization. In fact, existing algorithms for face pose estimation are non-linear.
- the head pose estimation algorithm from four feature points may be utilized for avatar control, video chatting and face recognition applications.
- Fig. 1 depicts the configuration of typical feature points for a typical head
- Fig. 2 depicts the face geometry 10 providing the basis of the head pose estimation algorithm of the present invention.
- a linear method for the computation of head motion estimation from the image positions of eye and mouth corners is provided. More particularly, a method is provided for estimating head motion from four point matches, with head pose estimation being a special case when a frontal view image is used as a reference position.
- the method is superior to other existing methods, which either require more point matches (at least 7) or are non-linear, requiring at least 5 facial feature matches.
- the method for head motion estimation is as follows:
- the first step is to acquire a first image I₁ and detect the head in I₁.
- a second image I₂ is acquired, with the head detected in I₂.
- the next step involves determining the motion of the head, represented by a rotation matrix R and a translation vector T. It is understood that once the motion parameters R and T are computed, the 3-D structure of all point matches may be computed. However, structure and translation may be determined only up to a scale, so if the magnitude of T is fixed, then the structure is uniquely determined.
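The scale ambiguity noted in the step above can be checked numerically: under pinhole projection, scaling the 3-D structure and the translation by the same common factor leaves the image measurements unchanged, which is why the magnitude of T must be fixed. A minimal sketch (the point values and scale factor below are arbitrary illustrations, not taken from the patent):

```python
import numpy as np

def project(X):
    """Pinhole projection with unit focal length: (X/Z, Y/Z)."""
    return X[:2] / X[2]

X = np.array([0.3, -0.1, 2.0])   # a 3-D feature point in the camera frame
T = np.array([0.05, 0.0, 0.4])   # candidate head translation
s = 3.7                          # arbitrary common scale factor

# Scaling both structure and translation by s leaves the projection
# unchanged, so only T's direction (not its magnitude) is observable.
assert np.allclose(project(X + T), project(s * (X + T)))
```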
- the algorithm for head pose estimation is a special case of the head motion estimation algorithm, and there are two ways in which this may be accomplished: 1) interactive, which requires a reference image; and 2) approximate, which uses generic (average biometric) head geometry information, also referred to as a Generic Head Model (GHM).
- GHM Generic Head Model
- the following steps are implemented: 1) Before using the system, a user is asked to face the camera in a predefined reference position. The reference eye and mouth corners P₀ are acquired as described in the steps above. 2) When a new image is acquired, eye and mouth corners are detected and head motion is estimated as in the remaining steps indicated in the algorithm above. 3) The head rotation matrix then corresponds to the head pose matrix.
- the Approximate algorithm requires no interaction with the user, but assumes certain biometric information is available and fixed for all the users.
- the approximate algorithm uses the configuration of typical feature points for a typical head 19 in relation to a camera coordinate system 20, denoted as system C_xyz.
- the points P₁ and P₃ represent the eye and mouth corners, respectively, of the generic head model 19. It is understood that for the frontal view shown in Fig. 1, these points P₁ and P₃ have different depths (Z₁ and Z₃, respectively).
- An assumption is made that the angle r is known, and an average value is used over all possible human heads.
- the pitch (tilt) angle is very difficult to compute precisely, since even the same person, when asked to look straight into the camera, may tilt the head differently in repeated experiments.
- head pose may be uniquely determined from only one image of the head, as will be explained in greater detail hereinbelow.
- points p₁, p₂, p₃ and p₄ denote the image coordinates of the eye corners (points p₁, p₂) and mouth corners (points p₃ and p₄) in a first image, and p′₁, p′₂, p′₃, p′₄ denote the corresponding eye and mouth corner coordinates in a second image. Given these feature coordinates, the task is to determine the head motion (represented by rotation and translation) between the first and second images.
- the algorithm is performed in the following steps: 1) using facial constraints, compute the three-dimensional (3-D) coordinates of the feature points from both images; and 2) given the 3-D positions of the feature points, compute the motion parameters (rotation matrix R and translation vector T).
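Step 2, recovering R and T from matched 3-D points, can be sketched with the standard SVD-based least-squares alignment (the Kabsch method). This is a generic solver under the assumption that step 1 has already produced the 3-D coordinates; it is not necessarily the closed form used in the patent:

```python
import numpy as np

def estimate_motion(P, Q):
    """Given matched 3-D feature points P and Q (each of shape (3, n)),
    find R, T minimizing ||Q - (R @ P + T)|| via the Kabsch method."""
    cp = P.mean(axis=1, keepdims=True)   # centroid of first point set
    cq = Q.mean(axis=1, keepdims=True)   # centroid of second point set
    H = (Q - cq) @ (P - cp).T            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    T = cq - R @ cp
    return R, T
```

With four non-coplanar feature points, the least-squares solution is unique, which matches the four-point-match setting of the method.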
- equation (6) results:
- equation (7) may be set forth as equation (8) as follows:
- Equation (11) may now be written in terms of R and T as:
- a "Head Pose” may be uniquely represented as a set of three angles (yaw, roll and pitch), or, as a rotation matrix R (given that there is a one-to-one correspondence between the rotation matrix and the pose angles).
- ARP Auxiliary Reference Position
- the rotation matrix R may be written as follows:
- r₃ = (P₆ − P₅) × (P₂ − P₁).
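The cross-product construction of a rotation row can be sketched as follows. The roles assigned here to P₁, P₂, P₅, P₆ (an eye-line direction and a second, vertical face direction) are assumptions, since the excerpt does not reproduce the full matrix; the sketch only shows how two face directions determine an orthonormal frame:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def rotation_from_face_points(P1, P2, P5, P6):
    """Assemble a rotation matrix from two in-plane face directions:
    r1 along the eye line, r3 normal to the face plane via the cross
    product (P6 - P5) x (P2 - P1), and r2 completing a right-handed
    orthonormal basis. Point roles are illustrative assumptions."""
    r1 = normalize(P2 - P1)                     # eye-line direction
    r3 = normalize(np.cross(P6 - P5, P2 - P1))  # face-plane normal
    r2 = np.cross(r3, r1)                       # completes the frame
    return np.stack([r1, r2, r3])               # rows of R
```

Because r₃ is built from a cross product with the eye-line direction, r₁ and r₃ are automatically perpendicular, so the result is orthonormal with determinant +1.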
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02765214A EP1433119B1 (en) | 2001-09-28 | 2002-09-10 | Head motion estimation from four feature points |
JP2003533212A JP2005505063A (en) | 2001-09-28 | 2002-09-10 | Head motion estimation from four feature points |
DE60217143T DE60217143T2 (en) | 2001-09-28 | 2002-09-10 | ESTIMATION OF HEAD MOVEMENT FROM FOUR CHARACTER POINTS |
KR10-2004-7004460A KR20040037152A (en) | 2001-09-28 | 2002-09-10 | Head motion estimation from four feature points |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/966,410 | 2001-09-28 | ||
US09/966,410 US7027618B2 (en) | 2001-09-28 | 2001-09-28 | Head motion estimation from four feature points |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003030086A1 true WO2003030086A1 (en) | 2003-04-10 |
Family
ID=25511358
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2002/003713 WO2003030086A1 (en) | 2001-09-28 | 2002-09-10 | Head motion estimation from four feature points |
Country Status (8)
Country | Link |
---|---|
US (1) | US7027618B2 (en) |
EP (1) | EP1433119B1 (en) |
JP (1) | JP2005505063A (en) |
KR (1) | KR20040037152A (en) |
CN (1) | CN1316416C (en) |
AT (1) | ATE349738T1 (en) |
DE (1) | DE60217143T2 (en) |
WO (1) | WO2003030086A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100358340C (en) * | 2005-01-05 | 2007-12-26 | 张健 | Digital-camera capable of selecting optimum taking opportune moment |
CN102968180A (en) * | 2011-12-02 | 2013-03-13 | 微软公司 | User interface control based on head direction |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7218774B2 (en) * | 2003-08-08 | 2007-05-15 | Microsoft Corp. | System and method for modeling three dimensional objects from a single image |
US7508979B2 (en) * | 2003-11-21 | 2009-03-24 | Siemens Corporate Research, Inc. | System and method for detecting an occupant and head pose using stereo detectors |
US20070263923A1 (en) * | 2004-04-27 | 2007-11-15 | Gienko Gennady A | Method for Stereoscopic Measuring Image Points and Device for Carrying Out Said Method |
US9030486B2 (en) | 2008-08-22 | 2015-05-12 | University Of Virginia Patent Foundation | System and method for low bandwidth image transmission |
CN101697199B (en) * | 2009-08-11 | 2012-07-04 | 北京盈科成章科技有限公司 | Detection method of head-face gesture and disabled assisting system using same to manipulate computer |
US8363919B2 (en) | 2009-11-25 | 2013-01-29 | Imaging Sciences International Llc | Marker identification and processing in x-ray images |
US9082036B2 (en) * | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Method for accurate sub-pixel localization of markers on X-ray images |
US9082182B2 (en) * | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Extracting patient motion vectors from marker positions in x-ray images |
US9082177B2 (en) * | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Method for tracking X-ray markers in serial CT projection images |
US9826942B2 (en) * | 2009-11-25 | 2017-11-28 | Dental Imaging Technologies Corporation | Correcting and reconstructing x-ray images using patient motion vectors extracted from marker positions in x-ray images |
KR20120059994A (en) | 2010-12-01 | 2012-06-11 | 삼성전자주식회사 | Apparatus and method for control avatar using expression control point |
US8964041B2 (en) * | 2011-04-07 | 2015-02-24 | Fr Vision Ab | System and method for video stabilization of rolling shutter cameras |
CN102842146B (en) * | 2011-06-25 | 2015-01-07 | 湖南大学 | Motion data conversion method based on structure decomposition method |
CN102289844B (en) * | 2011-07-18 | 2013-11-06 | 中国科学院自动化研究所 | Method for selecting candidate point of scale-invariable characteristic of three-dimensional image |
US9020192B2 (en) | 2012-04-11 | 2015-04-28 | Access Business Group International Llc | Human submental profile measurement |
CN102999164B (en) * | 2012-11-30 | 2016-08-03 | 广东欧珀移动通信有限公司 | A kind of e-book flipping-over control method and intelligent terminal |
KR102085180B1 (en) * | 2013-10-08 | 2020-03-05 | 삼성전자주식회사 | Method of estimating body's orientation, Computer readable storage medium of recording the method and an device |
US9898674B2 (en) | 2015-12-10 | 2018-02-20 | International Business Machines Corporation | Spoof detection for facial recognition |
CN106774936B (en) * | 2017-01-10 | 2020-01-07 | 上海木木机器人技术有限公司 | Man-machine interaction method and system |
WO2019000462A1 (en) | 2017-06-30 | 2019-01-03 | 广东欧珀移动通信有限公司 | Face image processing method and apparatus, storage medium, and electronic device |
WO2019073732A1 (en) * | 2017-10-10 | 2019-04-18 | 株式会社 資生堂 | Stress evaluation method |
US10373332B2 (en) * | 2017-12-08 | 2019-08-06 | Nvidia Corporation | Systems and methods for dynamic facial analysis using a recurrent neural network |
CN114007054B (en) * | 2022-01-04 | 2022-04-12 | 宁波均联智行科技股份有限公司 | Method and device for correcting projection of vehicle-mounted screen picture |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001063560A1 (en) * | 2000-02-22 | 2001-08-30 | Digimask Limited | 3d game avatar using physical characteristics |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5586215A (en) * | 1992-05-26 | 1996-12-17 | Ricoh Corporation | Neural network acoustic and visual speech recognition system |
US6272231B1 (en) * | 1998-11-06 | 2001-08-07 | Eyematic Interfaces, Inc. | Wavelet-based facial motion capture for avatar animation |
TW413795B (en) * | 1999-02-26 | 2000-12-01 | Cyberlink Corp | An image processing method of 3-D head motion with three face feature points |
US7020305B2 (en) * | 2000-12-06 | 2006-03-28 | Microsoft Corporation | System and method providing improved head motion estimations for animation |
-
2001
- 2001-09-28 US US09/966,410 patent/US7027618B2/en not_active Expired - Fee Related
-
2002
- 2002-09-10 AT AT02765214T patent/ATE349738T1/en not_active IP Right Cessation
- 2002-09-10 WO PCT/IB2002/003713 patent/WO2003030086A1/en active IP Right Grant
- 2002-09-10 CN CNB028190475A patent/CN1316416C/en not_active Expired - Fee Related
- 2002-09-10 EP EP02765214A patent/EP1433119B1/en not_active Expired - Lifetime
- 2002-09-10 DE DE60217143T patent/DE60217143T2/en not_active Expired - Fee Related
- 2002-09-10 KR KR10-2004-7004460A patent/KR20040037152A/en not_active Application Discontinuation
- 2002-09-10 JP JP2003533212A patent/JP2005505063A/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001063560A1 (en) * | 2000-02-22 | 2001-08-30 | Digimask Limited | 3d game avatar using physical characteristics |
Non-Patent Citations (3)
Title |
---|
MING XU ET AL: "Detecting head pose from stereo image sequence for active face recognition", AUTOMATIC FACE AND GESTURE RECOGNITION, 1998. PROCEEDINGS. THIRD IEEE INTERNATIONAL CONFERENCE ON NARA, JAPAN 14-16 APRIL 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 14 April 1998 (1998-04-14), pages 82 - 87, XP010277648, ISBN: 0-8186-8344-9 * |
QIAN CHEN ET AL: "Head pose estimation using both color and feature information", PATTERN RECOGNITION, 2000. PROCEEDINGS. 15TH INTERNATIONAL CONFERENCE ON SEPTEMBER 3-7, 2000, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 3 September 2000 (2000-09-03), pages 842 - 845, XP010533945, ISBN: 0-7695-0750-6 * |
ZICHENG LIU ET AL.: "Robust Head Motion Computation by Taking Advantage of Physical Properties", PROC. WORKSHOP ON HUMAN MOTION, December 2000 (2000-12-01) - December 2000 (2000-12-01), Austin, Tx, USA, pages 73 - 77, XP002222476 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100358340C (en) * | 2005-01-05 | 2007-12-26 | 张健 | Digital-camera capable of selecting optimum taking opportune moment |
CN102968180A (en) * | 2011-12-02 | 2013-03-13 | 微软公司 | User interface control based on head direction |
WO2013081918A1 (en) * | 2011-12-02 | 2013-06-06 | Microsoft Corporation | User interface control based on head orientation |
US8803800B2 (en) | 2011-12-02 | 2014-08-12 | Microsoft Corporation | User interface control based on head orientation |
CN102968180B (en) * | 2011-12-02 | 2016-03-30 | 微软技术许可有限责任公司 | Based on the user interface control of cephalad direction |
Also Published As
Publication number | Publication date |
---|---|
US7027618B2 (en) | 2006-04-11 |
EP1433119B1 (en) | 2006-12-27 |
CN1561499A (en) | 2005-01-05 |
ATE349738T1 (en) | 2007-01-15 |
EP1433119A1 (en) | 2004-06-30 |
DE60217143T2 (en) | 2007-10-04 |
JP2005505063A (en) | 2005-02-17 |
US20030063777A1 (en) | 2003-04-03 |
KR20040037152A (en) | 2004-05-04 |
DE60217143D1 (en) | 2007-02-08 |
CN1316416C (en) | 2007-05-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003030086A1 (en) | Head motion estimation from four feature points | |
CN108876879B (en) | Method and device for realizing human face animation, computer equipment and storage medium | |
US7515173B2 (en) | Head pose tracking system | |
US8571258B2 (en) | Method of tracking the position of the head in real time in a video image stream | |
CN110913751B (en) | Wearable eye tracking system with slip detection and correction functions | |
CN109271950B (en) | Face living body detection method based on mobile phone forward-looking camera | |
EP2597597B1 (en) | Apparatus and method for calculating three dimensional (3D) positions of feature points | |
WO2009091029A1 (en) | Face posture estimating device, face posture estimating method, and face posture estimating program | |
CN110520905B (en) | Face posture correcting device and method | |
KR101759188B1 (en) | the automatic 3D modeliing method using 2D facial image | |
Munn et al. | 3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker | |
JP3144400B2 (en) | Gesture recognition device and method | |
Liu et al. | Robust head motion computation by taking advantage of physical properties | |
Caunce et al. | Locating facial features and pose estimation using a 3D shape model | |
Tordoff et al. | Head pose estimation for wearable robot control. | |
CN108694348B (en) | Tracking registration method and device based on natural features | |
Caunce et al. | Improved 3D Model Search for Facial Feature Location and Pose Estimation in 2D images. | |
US20230144111A1 (en) | A method for generating a 3d model | |
CN113343879A (en) | Method and device for manufacturing panoramic facial image, electronic equipment and storage medium | |
Fang et al. | Automatic head and facial feature extraction based on geometry variations | |
Liang et al. | Affine correspondence based head pose estimation for a sequence of images by using a 3D model | |
CN112381952A (en) | Face contour point cloud model reconstruction method and device based on multiple cameras | |
CN112836544A (en) | Novel sitting posture detection method | |
CN109671108B (en) | Single multi-view face image attitude estimation method capable of rotating randomly in plane | |
Monaco et al. | Active, foveated, uncalibrated stereovision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FR GB GR IE IT LU MC NL PT SE SK TR |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003533212 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002765214 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20028190475 Country of ref document: CN Ref document number: 1020047004460 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2002765214 Country of ref document: EP |
|
WWG | Wipo information: grant in national office |
Ref document number: 2002765214 Country of ref document: EP |