
WO2002031772A3 - Method for tracking motion of a face - Google Patents

Method for tracking motion of a face

Info

Publication number
WO2002031772A3
Authority
WO
WIPO (PCT)
Prior art keywords
face
motion
person
represented
global
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2001/002736
Other languages
French (fr)
Other versions
WO2002031772A8 (en)
WO2002031772A2 (en)
Inventor
Tanju A Erdem
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/689,595 external-priority patent/US6294157B1/en
Application filed by Individual filed Critical Individual
Publication of WO2002031772A2 publication Critical patent/WO2002031772A2/en
Publication of WO2002031772A8 publication Critical patent/WO2002031772A8/en
Publication of WO2002031772A3 publication Critical patent/WO2002031772A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 - Facial expression recognition
    • G06V 40/176 - Dynamic expression
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/20 - Image preprocessing
    • G06V 10/24 - Aligning, centring, orientation detection or correction of the image
    • G06V 10/245 - Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A method is disclosed for tracking the motion of a person's face in order to animate a 3-D face model of the same or another person. The 3-D face model carries both the geometry (shape) and the texture (color) characteristics of the person's face. The shape of the face model is represented by a 3-D triangular mesh (geometry mesh), while the texture is represented by a 2-D composite image (texture image). Both the global motion and the local motion of the person's face are tracked. Global motion consists of the rotation and translation of the face in 3-D; local motion consists of the 3-D motion of the lips, eyebrows, etc., caused by speech and facial expressions. The 2-D positions of salient features of the person's face, and/or of markers placed on the face, are automatically tracked through a time-sequence of 2-D images of the face. Global and local motion are then calculated separately from the tracked 2-D positions of the salient features or markers. Global motion is represented by rotation and position vectors, while local motion is represented by an action vector that specifies the amount of each facial action, such as smiling-mouth or raised-eyebrows.
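As a rough illustration of the decomposition the abstract describes, the sketch below first estimates global motion (a 3-D rotation and a 2-D position vector) from the tracked 2-D positions of rigid landmarks under a weak-perspective camera, using a Kabsch-style least-squares solve, and then recovers an action vector by projecting the residual 2-D displacements of expressive features onto a linear basis of facial actions. The function names, the weak-perspective camera model, and the linear action basis are illustrative assumptions, not the patent's actual algorithm.

```python
import numpy as np

def estimate_global_motion(model_pts, image_pts):
    """Estimate the global head pose (3-D rotation R, 2-D position vector t)
    from rigid landmarks (e.g. eye corners, nose tip), assuming a
    weak-perspective camera. Kabsch-style solve; an illustrative stand-in,
    not the pose estimator of the patent."""
    m_mean = model_pts.mean(axis=0)
    i_mean = image_pts.mean(axis=0)
    mc = model_pts - m_mean          # centred 3-D model landmarks, shape (N, 3)
    ic = image_pts - i_mean          # centred 2-D observations,    shape (N, 2)
    # Least-squares fit of the 2x3 projected rotation, then completion to a
    # proper 3x3 rotation via the cross product and SVD orthogonalisation.
    A, *_ = np.linalg.lstsq(mc, ic, rcond=None)   # (3, 2)
    P = A.T                                       # top two rows of R
    r3 = np.cross(P[0], P[1])                     # third row from the cross product
    U, _, Vt = np.linalg.svd(np.vstack([P, r3]))
    R = U @ Vt
    if np.linalg.det(R) < 0:                      # enforce a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    t = i_mean - R[:2] @ m_mean                   # 2-D position vector
    return R, t

def estimate_action_vector(residuals, action_basis):
    """Recover the action vector by projecting the non-rigid residual 2-D
    displacements of expressive features (lips, eyebrows, ...) onto a linear
    basis of facial actions (smiling-mouth, raised-eyebrows, ...)."""
    K = action_basis.shape[0]
    B = action_basis.reshape(K, -1).T             # stacked basis, shape (2N, K)
    a, *_ = np.linalg.lstsq(B, residuals.ravel(), rcond=None)
    return a                                      # one weight per facial action
```

In a full tracker, R and t would be estimated per frame from the rigid landmarks, the global motion would be subtracted from the expressive feature positions, and the resulting action vector would drive the 3-D face model's animation controls.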
PCT/IB2001/002736 2000-10-13 2001-10-09 Method for tracking motion of a face Ceased WO2002031772A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/689,595 US6294157B1 (en) 1999-10-14 2000-10-13 Method for tracking motion of a face
US09/689,595 2000-10-13

Publications (3)

Publication Number Publication Date
WO2002031772A2 WO2002031772A2 (en) 2002-04-18
WO2002031772A8 WO2002031772A8 (en) 2002-07-04
WO2002031772A3 true WO2002031772A3 (en) 2002-10-31

Family

ID=24769119

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2001/002736 Ceased WO2002031772A2 (en) 2000-10-13 2001-10-09 Method for tracking motion of a face

Country Status (1)

Country Link
WO (1) WO2002031772A2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6664956B1 (en) 2000-10-12 2003-12-16 Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret A. S. Method for generating a personalized 3-D face model
DE102004038545A1 (en) * 2004-08-06 2006-03-16 Peters, Heiko, Dr. Positioning system and position measuring system
CN101796545A (en) * 2007-09-04 2010-08-04 索尼公司 Integrated Motion Capture
US20090110245A1 (en) * 2007-10-30 2009-04-30 Karl Ola Thorn System and method for rendering and selecting a discrete portion of a digital image for manipulation
CN105975935B (en) * 2016-05-04 2019-06-25 腾讯科技(深圳)有限公司 A kind of face image processing process and device
KR102540756B1 (en) * 2022-01-25 2023-06-08 주식회사 딥브레인에이아이 Apparatus and method for generating speech synthesis image
KR102540759B1 (en) * 2022-02-09 2023-06-08 주식회사 딥브레인에이아이 Apparatus and method for generating speech synthesis image
KR102584485B1 (en) * 2022-02-14 2023-10-04 주식회사 딥브레인에이아이 Apparatus and method for generating speech synthesis image
KR102584484B1 (en) * 2022-02-14 2023-10-04 주식회사 딥브레인에이아이 Apparatus and method for generating speech synthesis image

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08293026A (en) * 1995-04-21 1996-11-05 Murata Mach Ltd Image recognition device
US5802220A (en) * 1995-12-15 1998-09-01 Xerox Corporation Apparatus and method for tracking facial motion through a sequence of images

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
COSI P ET AL: "Phonetic recognition by recurrent neural networks working on audio and visual information", SPEECH COMMUNICATION, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 19, no. 3, 1 September 1996 (1996-09-01), pages 245 - 252, XP004013654, ISSN: 0167-6393 *
DATABASE WPI Section EI Week 199703, Derwent World Patents Index; Class T01, AN 1997-031076, XP002205957 *
EBIHARA K ET AL: "REAL-TIME 3-D FACIAL IMAGE RECONSTRUCTION FOR VIRTUAL SPACE TELECONFERENCING", ELECTRONICS & COMMUNICATIONS IN JAPAN, PART III - FUNDAMENTAL ELECTRONIC SCIENCE, SCRIPTA TECHNICA. NEW YORK, US, vol. 82, no. 5, May 1999 (1999-05-01), pages 80 - 90, XP000875659, ISSN: 1042-0967 *
GUENTER, BRIAN; GRIMM, CINDY; WOOD, DANIEL; MALVAR, HENRIQUE; PIGHIN, FREDERICK: "Making faces", COMPUTER GRAPHICS. PROCEEDINGS. SIGGRAPH 98 CONFERENCE PROCEEDINGS, PROCEEDINGS OF SIGGRAPH 98: 25TH INTERNATIONAL CONFERENCE ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES, ORLANDO, FL, USA, 19-24 JULY 1998, 1998, New York, NY, USA, ACM, USA, pages 55 - 66, XP002205956, ISBN: 0-89791-999-8 *

Also Published As

Publication number Publication date
WO2002031772A8 (en) 2002-07-04
WO2002031772A2 (en) 2002-04-18

Similar Documents

Publication Publication Date Title
Habermann et al. Deepcap: Monocular human performance capture using weak supervision
Noh et al. A survey of facial modeling and animation techniques
Kalra et al. Real-time animation of realistic virtual humans
Theobalt et al. Combining 2D feature tracking and volume reconstruction for online video-based human motion capture
WO2002030171A3 (en) Facial animation of a personalized 3-d face model using a control mesh
WO2002037415A3 (en) Person tagging in an image processing system utilizing a statistical model based on both appearance and geometric features
Breton et al. FaceEngine a 3D facial animation engine for real time applications
Ohya et al. Real-time reproduction of 3D human images in virtual space teleconferencing
Bregler et al. Video motion capture
Goto et al. MPEG-4 based animation with face feature tracking
WO2002031772A3 (en) Method for tracking motion of a face
JP6818219B1 (en) 3D avatar generator, 3D avatar generation method and 3D avatar generation program
Valente et al. Face tracking and realistic animations for telecommunicant clones
Ulgen A step towards universal facial animation via volume morphing
Pandžić et al. Real-time facial interaction
Moeslund et al. Summaries of 107 computer vision-based human motion capture papers
Richter et al. Real-time reshaping of humans
Valente et al. A visual analysis/synthesis feedback loop for accurate face tracking
Malciu et al. Tracking facial features in video sequences using a deformable-model-based approach
Otsuka et al. Extracting facial motion parameters by tracking feature points
Hilton Towards model-based capture of a person's shape, appearance and motion
Zheng et al. A model based approach in extracting and generating human motion
Kakadiaris et al. Vision-based animation of digital humans
Schreer et al. Real-time vision and speech driven avatars for multimedia applications
Pei et al. Transferring of speech movements from video to 3D face space

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: C1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: C1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

CFP Corrected version of a pamphlet front page
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP