WO2006000063A1 - Method for deriving a treatment plan for orthognatic surgery and devices therefor - Google Patents


Info

Publication number
WO2006000063A1
WO2006000063A1 (PCT/BE2005/000100)
Authority
WO
WIPO (PCT)
Prior art keywords
analysis
scan
cephalometric
surface model
cephalogram
Prior art date
Application number
PCT/BE2005/000100
Other languages
French (fr)
Inventor
Filip Schutyser
Original Assignee
Medicim Nv
Priority date
Filing date
Publication date
Application filed by Medicim Nv filed Critical Medicim Nv
Priority to CN2005800210757A (CN1998022B)
Priority to BRPI0511379-2A (BRPI0511379B1)
Priority to EP05758955A (EP1759353B1)
Priority to JP2007516900A (JP5020816B2)
Priority to US11/629,270 (US7792341B2)
Priority to DE602005010861T (DE602005010861D1)
Publication of WO2006000063A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T7/0012: Biomedical image inspection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B33: ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y: ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y50/00: Data acquisition or data processing for additive manufacturing
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B33: ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y: ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y80/00: Products made by additive manufacturing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30004: Biomedical image processing
    • G06T2207/30008: Bone
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/41: Medical

Definitions

  • Fig. 1 represents the generation of a virtual (lateral) cephalogram.
  • Fig. 2 represents the definition of an anatomically related reference frame.
  • Figs. 3A and 3B represent the definition of anatomical landmarks. Both the lateral cephalogram and the 3D bone surface model are used to exactly define the points.
  • Fig. 4 represents the results of the 3D cephalometric tracing.
  • Fig. 5 represents the moving bone fragments being indicated during set-up of orthognatic surgery.
  • Fig. 6 represents a virtual result of orthognatic surgery.
  • Fig. 7 represents the control window used to move the bone fragments.
  • Fig. 8 represents the tracking of the movements of the landmarks on the virtual cephalograms (Fig. 8A) and on the bone surface representations (Fig. 8B).
  • Fig. 9 represents the 3D-splint alone and in position between the plaster casts of the dentition. This 3D-splint enables fusing by means of features of the extension.
  • Fig. 10 represents the 3D-splint alone and in position between the plaster casts of the dentition. This 3D-splint enables fusing by means of gutta-percha markers.
  • Figs. 11 to 14 represent flowcharts of various embodiments of the method according to the invention.
  • Fig. 15 represents a global summary of the flowcharts shown in Figs.11-14.
  • the present invention provides a method to redefine the framework of a 3D surface representation into an anatomically relevant framework.
  • the anatomically relevant framework allows a clinician to perform an accurate cephalometric and/or anthropometric analysis in an intuitive manner.
  • a 3D surface representation comprising an anatomically relevant framework has the advantage that it allows the virtual repositioning of bone fragments in relation to anatomically relevant landmarks, making it particularly suited for the planning of surgical interventions.
  • the flowchart shown in Fig.11 summarises the main steps of the method according to the invention.
  • a modality is any of the various types of equipment or probes used to acquire images of the body. Radiography, computer tomography, ultrasonography and magnetic resonance imaging are examples of modalities in the present context.
  • a method and device to perform a 3D cephalometric and/or anthropometric analysis is disclosed allowing a preoperative assessment of a patient's anatomy.
  • the device comprises a computerised system, visualising image volumes (e.g. CT-image volumes) and surface models extracted from them, together with 2D projection grey-value images, i.e. virtual X-ray images geometrically linked with the CT-image and computed from it.
  • the combined information provides a means for effectively and accurately assessing the 3D anatomy of the patient's skull and soft tissue surface.
  • the technology fuses classical 2D cephalometric tracings with 3D bone surface visualisation.
  • the surface model can be generated using the CT data, as described in the paper 'Marching Cubes: a High Resolution 3D Surface Construction Algorithm' by W.E. Lorensen and H.E. Cline, ACM Computer Graphics (ACM SIGGRAPH '87 Proceedings), vol. 21, no. 4, pp. 163-169, July 1987.
  • the virtual X-ray images can be obtained as described in 'Display of surfaces from volume data', M. Levoy, IEEE Comput. Graph. Appl., vol. 8, no. 3, May 1988, pp. 29-37.
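As an illustration of the geometric link between the CT volume and a virtual cephalogram, a simple parallel ray-sum projection along one voxel axis can stand in for the ray-casting approach of the Levoy paper. This is a sketch only; the array shapes, the normalisation, and the choice of projection axis are illustrative assumptions, not the patent's actual algorithm:

```python
import numpy as np

def virtual_cephalogram(volume, axis=0):
    """Collapse a CT volume into a 2D grey-value projection image
    by summing attenuation values along the chosen ray direction.
    Projecting along the patient's left-right axis approximates a
    lateral cephalogram."""
    projection = volume.sum(axis=axis, dtype=np.float64)
    # Normalise to [0, 1] for display as a grey-value image.
    lo, hi = projection.min(), projection.max()
    return (projection - lo) / (hi - lo) if hi > lo else projection

# Toy volume: a dense "bone" block inside air.
vol = np.zeros((4, 5, 6))
vol[1:3, 1:4, 2:5] = 1000.0           # high-attenuation voxels
lateral = virtual_cephalogram(vol, axis=0)
print(lateral.shape)                  # one 2D image per view axis
```

Projecting along the anterior-posterior axis of the same volume would, correspondingly, approximate a frontal cephalogram.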
  • a 3D scan of the patient is the input to the system.
  • the image volume is composed of so-called 'voxels', i.e. volume elements that each hold one value (e.g. a grey value).
  • the box-shaped voxels compose a complete image volume when arranged in a three-dimensional array.
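In code, such an image volume maps naturally onto a three-dimensional array in which each element holds one voxel's grey value (a minimal illustration; the toy dimensions and values are arbitrary):

```python
import numpy as np

# A toy image volume: axes are (slice, row, column),
# one grey value per box-shaped voxel.
volume = np.zeros((8, 8, 8), dtype=np.int16)
volume[3, 4, 5] = 1200   # mark one voxel with a bone-like grey value
print(volume[3, 4, 5])
print(volume.size)       # total number of voxels in the array
```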
  • a 3D surface of the bone structure and/or the soft tissue envelope is constructed. If required, it is possible to add the natural complexion (the natural tone and texture of the skin) of the face to the skin surface generated from CT data, by adding the colour information of the face. To achieve this, a textured 3D skin surface, acquired e.g. by 3D photography or laser scanning, can be added and registered to the CT data (see flowchart in Fig. 12). As an alternative, a series of 2D photos is acquired and, by aligning the skin surface model from CT to the view of the 2D photo, the texture is transferred. [0043] In an initial step the clinician defines or selects the type of analysis.
  • a cephalometric analysis performs measurements at the level of the patient's skull.
  • An anthropometric analysis performs measurements at the level of the patient's skin.
  • the present invention allows defining various cephalometric and anthropometric analyses or even a combination of both.
  • the type of analysis determines the anatomical landmarks that should be indicated by the clinician and the measurements that are computed. [0044] Before indicating these landmarks the clinician has virtually positioned the patient to create a lateral cephalogram (see Fig. 1), and preferably an anatomical reference frame (Fig. 2) is installed, replacing the co-ordinate system of the CT data. Optionally a frontal cephalogram is also generated.
  • the anatomical reference frame is a coordinate system attached to anatomical reference points.
  • This reference frame consists of a horizontal, median and vertical plane (Fig.2) .
  • in this reference frame, the directions up/down and left/right are linked with the anatomy of the patient. Consequently the installation of such an anatomical reference frame allows easy navigation within the virtual images.
  • the system constructs such an anatomically relevant reference frame after the clinician has indicated the following anatomical landmarks: 1. two left/right symmetrical landmarks, e.g. the left and right fronto-zygomatic suture; 2. the Nasion; 3. the Sella.
  • the horizontal plane is defined by the direction defined in 1, together with the direction Nasion-Sella, and goes through the Sella.
  • the median plane is perpendicular to the horizontal plane, contains the left/right direction and goes through the Sella.
  • the vertical plane is perpendicular to the median plane and the horizontal plane and goes through the Sella.
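The plane construction above can be sketched numerically. The patent's wording of the plane definitions is ambiguous in places, so the sketch assumes the anatomically usual convention: three mutually perpendicular planes through the Sella, with the median plane's normal along the left/right direction. The function name and landmark coordinates are illustrative:

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def anatomical_frame(left_fz, right_fz, nasion, sella):
    """Return unit normals of the horizontal, median and vertical
    planes, all passing through the Sella. The horizontal plane is
    spanned by the left/right direction and the Nasion-Sella
    direction; the other two normals complete an orthogonal triad."""
    lr = unit(np.asarray(right_fz, float) - np.asarray(left_fz, float))
    ns = unit(np.asarray(sella, float) - np.asarray(nasion, float))
    n_horizontal = unit(np.cross(lr, ns))  # perpendicular to both spanning directions
    n_median = lr                          # assumption: median normal = left/right direction
    n_vertical = np.cross(n_horizontal, n_median)
    return n_horizontal, n_median, n_vertical

# Illustrative landmark positions in scanner coordinates (mm):
n_h, n_m, n_v = anatomical_frame([-40, 0, 0], [40, 0, 0], [0, 80, 10], [0, 70, 0])
print(np.dot(n_h, n_m), np.dot(n_h, n_v))  # both ~0: planes mutually perpendicular
```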
  • another reference frame can be defined based on the skin surface alone: 1. two left/right symmetrical landmarks, e.g. the pupils; 2. with a lateral view of the head, the direction from the pupils tangent to the upper limit of the ear; 3. a soft tissue point on the facial midline, e.g. the soft tissue Nasion point (Nasion-s).
  • the horizontal plane is defined by the directions defined in 1 and 2, and goes through Nasion-s.
  • the median plane is perpendicular to the horizontal plane, contains the direction defined in 2, and goes through Nasion-s.
  • the vertical plane is perpendicular to the horizontal and median planes, and goes through Nasion-s.
  • Landmarks are characteristic anatomical points on hard tissues or soft tissues.
  • the landmarks can be indicated on the surface model or on the 2D cephalogram (see Fig. 3) .
  • Selected anatomical points can determine an anatomical plane, which should be considered as one of the anatomical landmarks.
  • the measurements (distances or angles) of the analysis are computed and preferably a report is generated. The position of the landmarks can be adjusted. Possible measurements comprise: angles between planes (e.g. the inclination of the Frankfurt plane with respect to the horizontal plane of the reference frame), angles between projected points, and linear distances between two landmarks.
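A few of the listed measurements reduce to plain vector arithmetic. The helper names and coordinates below are hypothetical, chosen only to illustrate the computations:

```python
import numpy as np

def distance(p, q):
    """Linear distance between two landmarks."""
    return float(np.linalg.norm(np.asarray(p, float) - np.asarray(q, float)))

def point_plane_distance(p, plane_point, plane_normal):
    """Signed distance of a landmark to a reference plane."""
    n = np.asarray(plane_normal, float) / np.linalg.norm(plane_normal)
    return float(np.dot(np.asarray(p, float) - np.asarray(plane_point, float), n))

def plane_angle(n1, n2):
    """Angle in degrees between two planes, from their normals."""
    c = abs(np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2)))
    return float(np.degrees(np.arccos(np.clip(c, 0.0, 1.0))))

print(distance((0, 0, 0), (3, 4, 0)))                          # 5.0
print(point_plane_distance((0, 0, 2), (0, 0, 0), (0, 0, 1)))   # 2.0
print(plane_angle((0, 0, 1), (0, 1, 1)))                       # ~45 degrees
```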
  • Fig.4 shows an example of analysis results.
  • several types of cephalometric analyses can be defined. In the set-up of a specific type of cephalometric analysis, preferably the following elements are defined: whether the reference frames are used or not, and if so, which ones; and a number of measurements between anatomical landmarks or anatomical planes. If a landmark for a measurement is not already defined in the system, a new landmark has to be defined.
  • the planning system should allow repositioning the bone fragments with respect to an anatomically defined reference frame and with respect to anatomically defined rotation/translation references, and it should visualise the results of any repositioning.
  • the effects of any repositioning are visualised at the level of the skeleton as well as on the level of the soft tissues.
  • since a 2D cephalogram is a projection image, 3D information is lost, while dental casts only give 3D information on a very limited area of the head and provide no information on the soft tissue.
  • useful additional information can be obtained using the above-described 3D cephalometric analysis (Fig.5) .
  • the user is typically a surgeon.
  • Fig. 6 shows the result of a virtual maxillary repositioning.
  • Different types of translation and rotation with respect to the landmarks can be simulated by means of the computerised planning system.
  • a rotation around an axis or a translation along a direction can be defined as the intersection between two planes or as being perpendicular to a plane or it can be defined by two landmarks.
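A rotation about an axis defined by two landmarks can be sketched with Rodrigues' rotation formula. The function name and toy coordinates below are illustrative, not the patent's implementation:

```python
import numpy as np

def rotate_about_landmark_axis(points, p0, p1, angle_deg):
    """Rotate bone-fragment vertices about the axis through
    landmarks p0 and p1, by angle_deg degrees (Rodrigues' formula)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    axis = (p1 - p0) / np.linalg.norm(p1 - p0)
    theta = np.radians(angle_deg)
    K = np.array([[0, -axis[2], axis[1]],      # cross-product matrix of the axis
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    pts = np.asarray(points, float)
    return (pts - p0) @ R.T + p0               # rotate about the axis through p0

# 90-degree rotation about the z-axis through the origin
# carries the point (1, 0, 0) onto the y-axis:
moved = rotate_about_landmark_axis([[1.0, 0.0, 0.0]], [0, 0, 0], [0, 0, 1], 90)
print(np.round(moved, 6))
```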
  • various types of surgery such as maxillary advancement, mandibular advancement, mandibular widening, etc.
  • a user interface pops up, asking the user to perform several tasks.
  • the surgeon can enter specific surgical parameters and the bone fragments are moved accordingly (Fig.7) .
  • Fig.7 shows parameters for the movement of the maxilla with respect to the anatomically defined reference frame.
  • the landmarks are updated accordingly and the movement of the landmarks with respect to their original position is depicted (Fig.8) .
  • the user can define his set of bone movement references, adhering to his way of working and performing surgery.
  • the splint has at least one extension.
  • This extension can be either extra-oral or intra-oral .
  • the splint is produced in a non-toxic material that is almost radiolucent. While wearing this splint, the patient is CT-scanned. Then, plaster casts of the patient's upper and lower jaw with the splint in between (see Fig.9) are CT-scanned. The additional steps are also indicated in the flowchart of Fig.14.
  • using image analysis techniques, the features of said extension are extracted from both the patient CT scan and the cast scan. Based on these features both data sets are fused, and the plaster casts are co-visualised with the patient CT scan. Such a feature can be part of the surface of said extension.
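The fusion step can be illustrated with a least-squares rigid registration of corresponding feature points extracted from the two scans. The patent does not name an algorithm; the Kabsch/Procrustes solution below is one standard possibility, shown on synthetic points:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (rotation R, translation t) that
    maps corresponding feature points src onto dst, via the Kabsch
    algorithm: dst ~ src @ R.T + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Recover a known motion from four non-coplanar feature points:
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)   # 90 degrees about z
dst = src @ true_R.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_fit(src, dst)
print(np.allclose(src @ R.T + t, dst))
```

In practice the corresponding points would come from the splint extension's features in the patient scan and the cast scan; with noisy correspondences the same fit gives the least-squares optimum.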
  • the plaster casts are mounted in an articulator.
  • the planning system exports the virtual planning results to the articulator in order to move the plaster casts in the same way as in the virtual planning (see flowchart in Fig.13) .
  • this can be performed by modifying a number of characteristic parameters in accordance with the planning output, or in case of e.g. a motorised articulator, to drive that articulator.
  • if the model has to be split into several components, the same procedure is repeated for all components.
  • the physical surgical splints are produced.
  • the surgical splints can be digitally designed. A box-shaped or a U-shaped object is introduced in the software and the intersection volume with the plaster cast model is computed, after which the inserted object is removed. This object is then produced.
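One reading of this boolean operation can be sketched on voxelised models: insert a box-shaped object, compute its intersection with the plaster-cast model, and remove that intersection from the box, leaving a splint blank carrying the tooth imprints. The mask names and grid sizes below are illustrative assumptions:

```python
import numpy as np

def splint_blank(cast_mask, box_min, box_max):
    """Voxels of the inserted box that are NOT occupied by the
    plaster-cast model: the box minus its intersection with the cast."""
    box = np.zeros_like(cast_mask, dtype=bool)
    box[box_min[0]:box_max[0], box_min[1]:box_max[1], box_min[2]:box_max[2]] = True
    return box & ~cast_mask

# Toy cast occupying one corner of a 6x6x6 voxel grid:
cast = np.zeros((6, 6, 6), dtype=bool)
cast[0:3, 0:3, 0:3] = True
splint = splint_blank(cast, (1, 1, 1), (5, 5, 5))
print(int(splint.sum()))   # 64 box voxels minus the 8 overlapping cast voxels
```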
  • several available production methods can be applied: e.g. the splints are directly produced, or otherwise a model is produced from which a splint can be derived manually by routinely used techniques.
  • the planning results of the maxillofacial surgery planning can be exported to a surgical navigation system, as indicated in the flowchart of Fig.13.
  • the surgeon can also work the other way around. The surgeon performs a (possibly partial) model surgery on the plaster casts. To check this model surgery with the remainder of the skull, the new positions of the models are CT-scanned. This scan is entered in the planning system by means of registration. Based on one or more unaltered parts of the current plaster casts and the original plaster casts, the models are registered by surface matching and the transformation matrices for the bone surface are known.

Abstract

The present invention relates to a method for performing a cephalometric or anthropometric analysis comprising the steps of: - acquiring a 3D scan of a person's head using a 3D medical image modality, - generating a 3D surface model using data from the 3D scan, - generating from the 3D scan at least one 2D cephalogram geometrically linked to the 3D surface model, - indicating anatomical landmarks on the at least one 2D cephalogram and/or on the 3D surface model, - performing the analysis using the anatomical landmarks.

Description

METHOD FOR DERIVING A TREATMENT PLAN FOR ORTHOGNATHIC SURGERY AND DEVICES THEREFOR
Field of the invention [0001] The present invention relates to methods for assessing the shape of the skull and soft tissues and for determining a treatment plan for maxillofacial surgery and more particularly for orthognatic surgery and devices used in such surgery or in the preparation thereof.
State of the art [0002] In maxillofacial surgery, the skull and dentition are surgically remodelled or restored. This surgical discipline encompasses surgical interventions of repair, in particular of a mis-positioning of the jaws with respect to one another, called orthognatic surgery. Typically, orthognatic surgery involves osteotomies of the maxilla and/or mandible to reposition these bone fragments correctly with respect to the rest of the skull and to create a good occlusion. Osteotomies are surgical operations whereby a bone is cut to shorten, lengthen or change its alignment. By 'occlusion' is meant the manner in which the teeth of the upper and lower arches come together when the mouth is closed. [0003] The preparation of such a surgical intervention requires implementing orthodontic and radiographic techniques. Orthodontic techniques [0004] A casting of the patient's mandibular and maxillary dentition is made. These castings, generally made of plaster, are then mounted in an articulator representing the temporo-mandibular joints and jaw members. The castings are used to simulate the relative displacement that has to be applied to the jaws to create a good occlusion. To enable the surgeon to respect the simulated relative positions, a splint, i.e. a plate comprising on each of its surfaces tooth-prints of the two castings, is made. The splint is used to maintain the castings or the jaws in relative positions where the teeth are in occlusion. [0005] Since the surgical intervention generally includes osteotomies of both jaws, two splints are generally made from the dental castings, in addition to a so-called initial splint linking the two jaws in their occlusion position before the intervention. [0006] A so-called intermediary splint determines the foreseeable displacement of the maxilla with respect to the mandible, when the mandible is in its original (preoperative) position.
This splint enables the surgeon to place the maxilla back on the skull in the desired definitive position before intervening on the mandible. A so-called definitive splint determines the occlusion objective to be surgically achieved and is thus used to correctly position the mandible on the skull by setting the position of the mandible with respect to the previously replaced maxilla.
Radiographic techniques [0007] The preparation of the surgical operation also uses X-ray radiographs of the patient. Typically a lateral radiograph (cephalogram) is taken. Sometimes a frontal X-ray radiograph and other radiographs with different views are taken as well. These radiographs enable, in particular, performing an approximate simulation of the operative action. [0008] The simulation is performed manually with a tracing paper placed on the radiograph. For example, landmarks are indicated and the contours of the mandible are drawn. The tracing paper is then moved to approximately reproduce thereon the desired post-operative occlusion, after which the maxillary contours are drawn. The maxillo-mandibular assembly drawn on the tracing paper is then moved in one block while respecting cephalometric standards, labial ratios, as well as other criteria known for this type of intervention. The direction and amplitude of the jaw displacements are thus radiographically and approximately defined. The results of this simulation are compared and adjusted according to the relative motion of the mandible and of the maxilla envisaged by means of the splints. [0009] The actual simulation of an orthognatic surgical intervention is thus performed essentially manually. Further, this simulation is only done in two dimensions, based on a plane profile view of the skull. [0010] The current generation of CT-scanners provides detailed 3D information on the patient's anatomy. Based on this data, 3D surface reconstructions of the bone and the skin surface are possible. Bone fragments can be isolated and moved with respect to each other. This could provide a suitable basis for a computer assisted orthognatic surgery planning system.
However, the currently available 3D surface representation algorithms do not provide a suitable framework comprising anatomically relevant references, which would allow the clinician to easily and reliably reposition a bone fragment in the virtual 3D environment. A second problem is associated with the use of 3D surface representations derived from CT-scans of patients having amalgam dental fillings. Amalgam tooth fillings create artefacts that appear as streaks on the CT images. Using these CT images as such, it is impossible to plot on a three-dimensional view the exact position of the teeth to obtain the bite. [0011] Patent document WO03/028577-A2 discloses an apparatus and method for fabricating orthognatic surgical splints. It also relates to a method for creating a computerised composite skull model suitable for diagnosis and treatment planning. In said method a 3D CT model of the patient's bone structure and a digital dental computer model of the patient's dentition are generated, both comprising a same set of fiduciary markers.
Aims of the invention [0012] The present invention aims to provide a method for performing a cephalometric and/or anthropometric analysis. In a second object it aims to provide a method for deriving a treatment plan for orthognatic surgery, comprising said analysis method. In a further object it aims to provide devices suitable therefor.
Summary of the invention [0013] The present invention relates to a method for performing a cephalometric and/or anthropometric analysis comprising the steps of: - acquiring a 3D scan of a person's head using a 3D medical image modality, - generating a 3D surface model using data from that 3D scan, - generating from the 3D scan at least one 2D cephalogram geometrically linked to the 3D surface model, - indicating anatomical landmarks on the at least one 2D cephalogram and/or on the 3D surface model, - performing the cephalometric and/or anthropometric analysis using the anatomical landmarks. [0014] Preferably the medical image modality is magnetic resonance imaging or computer tomography. The 3D surface model advantageously represents a bone structure surface and/or a soft tissue envelope. [0015] In a preferred embodiment the method further comprises the step of visualising said generated at least one 2D cephalogram together with the 3D surface model in a virtual scene. [0016] Advantageously the method further comprises the determining of a reference frame from anatomical reference points on the person's head. [0017] Preferably in a further step a report of the cephalometric analysis is generated. [0018] In another embodiment the method comprises the further step of providing 2D or 3D photographs, from which a textured 3D skin surface is derived. [0019] The analysis typically comprises the determination of linear distances between two landmarks, the distance of a landmark to a reference plane, the distance between landmarks projected on a plane, angles between landmarks or planes, proportions computed between these measurements, or the distance between two points along a surface and parallel to a plane. [0020] In yet a further embodiment the method comprises the step of acquiring a 3D scan of the person's head while the person is wearing a 3D splint. Also a 3D scan of casts of said person's upper and lower jaw is then preferably acquired.
Next the 3D scan of said person's head, acquired while wearing the 3D splint, and the 3D scan of the casts of the upper and lower jaw are fused, based on features of the 3D splint. Advantageously, data from the 3D scan of the person wearing said 3D splint is subsequently used for generating the 3D surface model. [0021] In a second object the invention relates to a method for deriving planning information for repositioning a bone fragment, comprising the steps of:
- performing a cephalometric and/or anthropometric analysis as previously described,
- defining a set of virtual positions of the bone fragment to be repositioned, said positions being defined based on the anatomical landmarks,
- visualising for each of the virtual positions the result of repositioning the bone fragment together with the landmarks in the 3D surface model and on the 2D cephalograms,
- taking a decision on an intra-operative repositioning of the bone fragment based on the cephalometric analysis and on the visualisation.
[0022] In an advantageous embodiment the virtual positions result from a translation and/or rotation of the bone fragment. [0023] In another object the invention relates to a device for cephalometric and/or anthropometric analysis, comprising:
- a computing unit arranged for generating from 3D scan data a 3D surface model and a 2D cephalogram geometrically linked to the 3D surface model,
- visualisation means for representing the 2D cephalogram and/or the 3D surface model, and
- computation means for performing the analysis based on anatomical landmarks provided on the at least one 2D cephalogram and/or on the 3D surface model.
The 3D scan data are preferably CT or MRI data. [0024] In a further object the invention relates to a 3D splint for use in a method as previously described. The 3D splint comprises a U-shaped part arranged for fitting the upper and lower dental arches and is provided with an extra-oral or intra-oral extension on the U-shaped part.
[0025] In a last object the invention relates to a program, executable on a programmable device, containing instructions which, when executed, perform the method as previously described.
Short description of the drawings
[0026] Fig. 1 represents the generation of a virtual (lateral) cephalogram.
[0027] Fig. 2 represents the definition of an anatomically related reference frame.
[0028] Figs. 3A and 3B represent the definition of anatomical landmarks. Both the lateral cephalogram and the 3D bone surface model are used to exactly define the points.
[0029] Fig. 4 represents the results of the 3D cephalometric tracing.
[0030] Fig. 5 represents the moving bone fragments being indicated during set-up of orthognatic surgery.
[0031] Fig. 6 represents a virtual result of orthognatic surgery.
[0032] Fig. 7 represents the control window used to move the bone fragments.
[0033] Fig. 8 represents the tracking of the movements of the landmarks on the virtual cephalograms (Fig. 8A) and on the bone surface representations (Fig. 8B).
[0034] Fig. 9 represents the 3D-splint alone and in position between the plaster casts of the dentition. This 3D-splint enables fusion by means of features of the extension.
[0035] Fig. 10 represents the 3D-splint alone and in position between the plaster casts of the dentition. This 3D-splint enables fusion by means of gutta-percha markers.
[0036] Figs. 11 to 14 represent flowcharts of various embodiments of the method according to the invention.
[0037] Fig. 15 represents a global summary of the flowcharts shown in Figs. 11-14.
Detailed description of the invention [0038] In order to perform an adequate 3D cephalometric analysis of bone tissue and/or soft tissues, the ability to indicate the relevant points on the 3D structures alone does not suffice. Some points required for adequate 3D cephalometric tracing that are not well defined on the 3D structure are available on a 2D representation, and vice versa. The present invention describes a computerised system that solves this problem. [0039] The present invention provides a method to redefine the framework of a 3D surface representation into an anatomically relevant framework. The anatomically relevant framework allows a clinician to perform an accurate cephalometric and/or anthropometric analysis in an intuitive manner. Moreover, a 3D surface representation comprising an anatomically relevant framework has the advantage that it allows the virtual repositioning of bone fragments in relation to anatomically relevant landmarks, making it particularly suited for the planning of surgical interventions. The flowchart shown in Fig. 11 summarises the main steps of the method according to the invention. [0040] In medical imaging, a modality is any of the various types of equipment or probes used to acquire images of the body. Radiography, computer tomography, ultrasonography and magnetic resonance imaging are examples of modalities in the present context. [0041] A method and device to perform a 3D cephalometric and/or anthropometric analysis is disclosed, allowing a preoperative assessment of a patient's anatomy. The device comprises a computerised system visualising image volumes (e.g. CT-image volumes) and surface models extracted from them, together with 2D projection grey-value images, i.e. virtual X-ray images geometrically linked with the CT-image and computed from it. The combined information provides a means for effectively and accurately assessing the 3D anatomy of the patient's skull and soft tissue surface.
The technology fuses classical 2D cephalometric tracings with 3D bone surface visualisation. The surface model can be generated using the CT data, as described in the paper 'Marching Cubes: a High Resolution 3D Surface Construction Algorithm' by W.E. Lorensen and H.E. Cline (ACM Computer Graphics (ACM SIGGRAPH '87 Proceedings), vol. 21, no. 4, pp. 163-169, July 1987). The virtual X-ray images (cephalograms) can be obtained as described in 'Display of surfaces from volume data', M. Levoy, IEEE Comput. Graph. Appl., vol. 8, no. 3, May 1988, pp. 29-37. [0042] A 3D scan of the patient is the input to the system. The image volume is composed of so-called 'voxels', i.e. volume elements that each hold one value (e.g. a grey value). The box-shaped voxels compose a complete image volume when arranged in a three-dimensional array. Based on this image volume, a 3D surface of the bone structure and/or the soft tissue envelope is constructed. If required, it is possible to add the natural complexion (the natural tone and texture of the skin) of the face to the skin surface generated from CT-data, by adding the colour information of the face. To achieve this, a textured 3D skin surface, acquired e.g. by 3D photography or laser scanning, can be added and registered to the CT-data (see flowchart in Fig. 12). As an alternative, a series of 2D photos is acquired and, by aligning the skin surface model from CT to the view of the 2D photo, the texture is transferred. [0043] In an initial step the clinician defines or selects the type of analysis. A cephalometric analysis performs measurements at the level of the patient's skull. An anthropometric analysis performs measurements at the level of the patient's skin. The present invention allows defining various cephalometric and anthropometric analyses or even a combination of both. The type of analysis determines the anatomical landmarks that should be indicated by the clinician and the measurements that are computed.
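The projection principle behind such a virtual cephalogram can be sketched in a few lines. The following is a minimal illustration, not the patented implementation: it sums the grey values of a synthetic voxel volume along one axis, approximating the integral of attenuation along parallel rays (a simple digitally reconstructed radiograph). The volume contents and dimensions are made-up assumptions.

```python
import numpy as np

def virtual_cephalogram(volume, axis=0):
    """Project a 3D grey-value volume to a 2D virtual X-ray image.

    Summing voxel values along one axis approximates the attenuation
    integral along parallel rays through the volume.
    """
    projection = volume.sum(axis=axis).astype(np.float64)
    # Normalise to an 8-bit grey-value image for display.
    projection -= projection.min()
    if projection.max() > 0:
        projection /= projection.max()
    return (projection * 255).astype(np.uint8)

# Synthetic 'skull': a dense sphere inside an otherwise empty volume.
vol = np.zeros((64, 64, 64), dtype=np.float32)
z, y, x = np.ogrid[:64, :64, :64]
vol[(z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 20 ** 2] = 1.0

# Projecting along one axis yields a 64x64 lateral view.
lateral = virtual_cephalogram(vol, axis=0)
```

Because the projection geometry is fixed relative to the voxel array, each pixel of the resulting image remains geometrically linked to a known ray through the 3D data, which is the property the text relies on.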
[0044] Before indicating these landmarks the clinician virtually positions the patient to create a lateral cephalogram (see Fig. 1), and preferably an anatomical reference frame (Fig. 2) is installed, replacing the co-ordinate system of the CT-data. Optionally a frontal cephalogram is also generated. [0045] The anatomical reference frame is a coordinate system attached to anatomical reference points. This reference frame consists of a horizontal, a median and a vertical plane (Fig. 2). With this reference frame, the directions up/down and left/right are linked with the anatomy of the patient. Consequently the installation of such an anatomical reference frame allows easy navigation within the virtual images. [0046] In a particular embodiment, the system constructs such an anatomically relevant reference frame after the clinician has indicated the following anatomical landmarks:
1. two left/right symmetrical landmarks, e.g. the left and right fronto-zygomatic suture,
2. the Nasion,
3. the Sella.
The horizontal plane is defined by the direction defined in 1, together with the direction Nasion-Sella, and goes through the Sella. The median plane is perpendicular to the horizontal plane, contains the left/right direction and goes through the Sella. The vertical plane is perpendicular to the median and horizontal planes and goes through the Sella. Another reference frame can be defined based on the skin surface alone:
1. two left/right symmetrical landmarks, e.g. the pupils,
2. with a lateral view of the head, the direction from the pupils tangent to the upper limit of the ear,
3. a soft tissue point on the facial midline, e.g. the soft tissue Nasion point (Nasion-s).
The horizontal plane is defined by the directions defined in 1 and 2, and goes through Nasion-s. The median plane is perpendicular to the horizontal plane, contains the direction defined by 2, and goes through Nasion-s. The vertical plane is perpendicular to the horizontal and median planes, and goes through Nasion-s.
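The skull-based reference frame above can be sketched with simple vector arithmetic. The code below is an illustration with invented landmark coordinates, not the patented implementation; as an assumption it follows the common anatomical convention in which the median (sagittal) plane is perpendicular to the left/right direction, and all three planes pass through the Sella.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def reference_frame(left, right, nasion, sella):
    """Return the frame origin (Sella) and the unit normals of the
    horizontal, median and vertical planes, all through the Sella."""
    lr = unit(right - left)                  # left/right direction
    sn = unit(nasion - sella)                # Sella-Nasion direction
    n_horizontal = unit(np.cross(lr, sn))    # horizontal plane spans lr and sn
    n_median = unit(np.cross(sn, n_horizontal))   # sagittal normal (assumption)
    n_vertical = unit(np.cross(n_horizontal, n_median))
    return sella, n_horizontal, n_median, n_vertical

# Made-up scanner coordinates (mm) for the required landmarks.
left = np.array([-40.0, 10.0, 5.0])    # e.g. left fronto-zygomatic suture
right = np.array([42.0, 11.0, 6.0])    # e.g. right fronto-zygomatic suture
nasion = np.array([1.0, 80.0, 20.0])
sella = np.array([0.0, 30.0, 0.0])

origin, n_h, n_m, n_v = reference_frame(left, right, nasion, sella)
```

The three normals are mutually orthogonal by construction, so the frame can directly replace the scanner coordinate system, giving the up/down and left/right directions mentioned in the text.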
[0047] In a next step the anatomical landmarks of the analysis are indicated. Landmarks are characteristic anatomical points on hard tissues or soft tissues. The landmarks can be indicated on the surface model or on the 2D cephalogram (see Fig. 3). Selected anatomical points can determine an anatomical plane, which should be considered as one of the anatomical landmarks. [0048] Finally, the measurements (distances or angles) of the analysis are computed and preferably a report is generated. The position of the landmarks can be adjusted. Possible measurements comprise:
- angles between planes (e.g. the inclination of the Frankfurter plane with respect to the horizontal plane of the reference frame),
- angles between projected points,
- linear distances between two landmarks; this can be the actual distance between points or the distance of the points projected on the reference planes: the height, the width and the depth distances between two points,
- the distance of a landmark to the reference planes,
- proportional measurements that compute the proportion between two measurements.
Fig. 4 shows an example of analysis results. [0049] Several types of cephalometric analyses can be defined. In the set-up of a specific type of cephalometric analysis, preferably the following elements are defined:
- whether the reference frames are used or not, and if so, which ones,
- a number of measurements between anatomical landmarks or anatomical planes.
If a landmark for a measurement is not already defined in the system, a new landmark has to be defined. Also, freely orientated extra virtual X-ray images can be generated.
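The measurement types listed above reduce to elementary vector arithmetic. The sketch below uses invented landmark coordinates (in mm, in some reference frame); the function names are illustrative and not taken from the patent.

```python
import numpy as np

def distance(a, b):
    """Actual linear distance between two landmarks."""
    return float(np.linalg.norm(a - b))

def distance_to_plane(p, plane_point, plane_normal):
    """Perpendicular distance of a landmark to a reference plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(abs(np.dot(p - plane_point, n)))

def projected_distance(a, b, plane_normal):
    """Distance between two landmarks after projection onto a plane,
    e.g. a width measured in the horizontal reference plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = b - a
    d_proj = d - np.dot(d, n) * n
    return float(np.linalg.norm(d_proj))

def angle_between_planes(n1, n2):
    """Acute angle (degrees) between two planes given their normals."""
    c = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return float(np.degrees(np.arccos(np.clip(abs(c), 0.0, 1.0))))

a = np.array([0.0, 0.0, 0.0])            # hypothetical landmark
b = np.array([3.0, 4.0, 0.0])            # hypothetical landmark
horizontal_normal = np.array([0.0, 0.0, 1.0])

d_ab = distance(a, b)                    # linear distance
d_proj = projected_distance(a, b, horizontal_normal)
ratio = d_ab / 10.0                      # a proportional measurement
```

A proportional measurement is simply the ratio of two such values, as in the last line.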
[0050] In order to prepare the repositioning of bone fragments efficiently, an orthognatic surgery planning system should meet the following requirements:
- the planning system should allow repositioning the bone fragments with respect to an anatomically defined reference frame and with respect to anatomically defined rotation/translation references, and
- it should visualise the results of any repositioning.
Preferably, the effects of any repositioning are visualised at the level of the skeleton as well as at the level of the soft tissues. [0051] In the prior art solutions, most clinicians perform a planning using 2D cephalograms in combination with dental casts. However, as a 2D cephalogram is a projection image, 3D information is lost, while dental casts only give 3D information on a very limited area of the head and provide no information on the soft tissue. [0052] When preparing a bone fragment repositioning, useful additional information can be obtained using the above-described 3D cephalometric analysis (Fig. 5). Using information from the cephalometric analysis, the user (typically a surgeon) can reposition bone fragments in a virtual way. As an example, Fig. 6 shows the result of a virtual maxillary repositioning. Different types of translation and rotation with respect to the landmarks can be simulated by means of the computerised planning system. For example, a rotation axis or a translation direction can be defined as the intersection between two planes, as being perpendicular to a plane, or by two landmarks. [0053] To create an easy way of working, the user can predefine in the computerised orthognatic surgery planning system various types of surgery, such as maxillary advancement, mandibular advancement, mandibular widening, etc. When a type of surgery is chosen, a user interface pops up asking the user to perform several tasks.
At the end, the surgeon can enter specific surgical parameters and the bone fragments are moved accordingly (Fig. 7). Fig. 7 shows parameters for the movement of the maxilla with respect to the anatomically defined reference frame. The landmarks are updated accordingly and the movement of the landmarks with respect to their original position is depicted (Fig. 8). In order to increase the flexibility of the bone repositioning tools in the planning system, the user can define his own set of bone movement references, adhering to his way of working and performing surgery.
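A rotation around a landmark-defined axis combined with a translation is a rigid transform. The sketch below builds such a transform with the Rodrigues rotation formula and applies it to a fragment point; all coordinates are invented, and this is an illustration of the geometry, not the patented planning system.

```python
import numpy as np

def rotation_about_axis(point_on_axis, axis_dir, angle_deg):
    """4x4 homogeneous matrix: rotation by angle_deg around the line
    through point_on_axis with direction axis_dir (Rodrigues formula)."""
    k = axis_dir / np.linalg.norm(axis_dir)
    a = np.radians(angle_deg)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    R = np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    # Shift so the rotation axis passes through the given landmark.
    T[:3, 3] = point_on_axis - R @ point_on_axis
    return T

def apply(T, points):
    """Apply a 4x4 homogeneous transform to an (n, 3) array of points."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    return (pts @ T.T)[:, :3]

# Rotate a fragment point 90 degrees around a vertical axis through a landmark.
landmark = np.array([10.0, 0.0, 0.0])
axis = np.array([0.0, 0.0, 1.0])     # e.g. normal of a reference plane
fragment = np.array([[11.0, 0.0, 0.0]])

T = rotation_about_axis(landmark, axis, 90.0)
moved = apply(T, fragment)
```

A translation along a landmark-defined direction is simply an extra 4x4 matrix with the direction scaled by the entered surgical parameter, composed with `T` by matrix multiplication.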
[0054] Amalgam dental fillings can corrupt CT-images at the level of the teeth. This renders accurate visualisation of the occlusion very difficult. Moreover, to clearly inspect the occlusion, the details of the teeth are very important. To image the details of teeth, a very high resolution CT-scan is required, and in consequence a high X-ray exposure of the patient. However, exposing a patient to high X-ray doses should be avoided. [0055] In order to increase the level of detail at the level of the crowns of the teeth, without increasing the CT radiation dose, a 3D-splint (Fig. 9) is used with a planar U-shaped geometry, fitting both the actual upper and lower dental arches at the same time. Attached to this part, the splint has at least one extension, which can be either extra-oral or intra-oral. The splint is produced in a non-toxic material that is almost radiolucent. While wearing this splint, the patient is CT-scanned. Then, plaster casts of the patient's upper and lower jaw with the splint in between (see Fig. 9) are CT-scanned. These additional steps are also indicated in the flowchart of Fig. 14. Using image analysis techniques, the features of said extension are extracted from both the patient CT-scan and the cast scan. Based on these features both data sets are fused, and the plaster casts are co-visualised with the patient CT-scan. Such a feature can be part of the surface of said extension. This allows accurate software planning at the level of the crowns of the teeth. Instead of employing features of the extension, one could also envisage the use of gutta-percha markers (see Fig. 10). The splint then contains at least 4 spherical gutta-percha markers with a diameter of about 1 mm. At least one marker should be positioned on the extension and not in the same plane as the U-shaped part.
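Fusing the two CT scans from at least four corresponding marker positions amounts to least-squares rigid registration. The sketch below uses the Kabsch algorithm on synthetic marker coordinates; it illustrates the principle only and is not the patented image-analysis pipeline.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch algorithm): returns R (3x3)
    and t (3,) such that dst ≈ src @ R.T + t."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Four non-coplanar marker positions extracted from the cast scan...
cast = np.array([[0.0, 0.0, 0.0],
                 [10.0, 0.0, 0.0],
                 [0.0, 10.0, 0.0],
                 [0.0, 0.0, 10.0]])

# ...and the same markers in the patient scan; here the true pose is
# known because the 'patient' data is generated synthetically.
angle = np.radians(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
patient = cast @ R_true.T + t_true

R, t = rigid_register(cast, patient)
fused = cast @ R.T + t   # cast data mapped into the patient scan frame
```

The requirement in the text that at least one marker lie outside the plane of the U-shaped part corresponds to the non-coplanarity needed here for the rotation to be uniquely determined.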
[0056] After finishing the virtual planning using the 3D cephalometric reference frame with the enhanced imaging of the teeth, the plaster casts are mounted in an articulator. The planning system exports the virtual planning results to the articulator in order to move the plaster casts in the same way as in the virtual planning (see flowchart in Fig. 13). Depending on the type of articulator, this can be performed by modifying a number of characteristic parameters in accordance with the planning output or, in the case of e.g. a motorised articulator, by driving that articulator. In case the model has to be split into several components, the same procedure is repeated for all components. Based on the new positions of the plaster casts in the articulator, the physical surgical splints are produced. [0057] Alternatively, the surgical splints can be digitally designed. A box-shaped or a U-shaped object is introduced in the software and the intersection volume with the plaster cast model is computed, after which the inserted object is removed. This object is then produced. Several available production methods can be applied, e.g. milling, 3D printing, stereolithography, sintering, etc. Using these production methods, the splints are directly produced, or otherwise a model is produced from which a splint can be derived manually by routinely used techniques. [0058] The planning results of the maxillofacial surgery planning can also be exported to a surgical navigation system, as indicated in the flowchart of Fig. 13. [0059] Optionally, the surgeon can also work the other way around. The surgeon performs a (possibly partial) model surgery on the plaster casts. To check this model surgery against the remainder of the skull, the new positions of the models are CT-scanned. This scan is entered into the planning system by means of registration.
Based on one or more unaltered parts of the current plaster casts and the original plaster casts, the models are registered by surface matching, after which the transformation matrices for the bone surfaces are known.
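Surface matching of such unaltered regions can be illustrated with a basic iterative-closest-point (ICP) scheme: repeatedly match each point of one surface to its nearest point on the other and solve for the rigid transform. The sketch below runs on a synthetic surface patch and is a simplified assumption of how such a registration might proceed, not the patented method.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src onto dst."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dc - R @ sc

def icp(src, dst, iterations=20):
    """Iterative closest point with brute-force nearest neighbours
    (adequate for small point sets)."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iterations):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Synthetic 'unaltered surface patch': a curved grid of points.
xs = np.arange(-8.0, 9.0, 2.0)
gx, gy = np.meshgrid(xs, xs)
surface = np.column_stack([gx.ravel(), gy.ravel(),
                           0.05 * (gx.ravel() ** 2 - gy.ravel() ** 2)])

# The same patch after a small unknown rigid displacement.
angle = np.radians(2.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
displaced = surface @ R_true.T + np.array([0.3, -0.2, 0.1])

R, t = icp(surface, displaced)
aligned = surface @ R.T + t
```

The recovered matrix pair (R, t) plays the role of the transformation matrices mentioned above, carrying the model-surgery result back into the planning system's frame.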

Claims

1. Method for performing a cephalometric and/or anthropometric analysis comprising the steps of:
- acquiring a 3D scan of a person's head using a 3D medical image modality,
- generating a 3D surface model using data from said 3D scan,
- generating from said 3D scan at least one 2D cephalogram geometrically linked to said 3D surface model,
- indicating anatomical landmarks on said at least one 2D cephalogram and/or on said 3D surface model,
- performing said cephalometric and/or anthropometric analysis using said anatomical landmarks.
2. Method for performing a cephalometric and/or anthropometric analysis as in claim 1, wherein said medical image modality is magnetic resonance imaging or computer tomography.
3. Method for performing an analysis as in claim 1 or 2, wherein said 3D surface model represents a bone structure surface and/or a soft tissue envelope.
4. Method for performing an analysis as in any of claims 1 to 3, further comprising the step of visualising said generated at least one 2D cephalogram together with said 3D surface model in a virtual scene.
5. Method for performing an analysis as in any of claims 1 to 4, further comprising the step of determining a reference frame from anatomical reference points on said person's head.
6. Method for performing an analysis as in any of claims 1 to 5, further comprising the step of generating a report of said cephalometric analysis.
7. Method for performing an analysis as in any of claims 1 to 6, further comprising the step of providing 2D or 3D photographs, from which a textured 3D skin surface is derived.
8. Method for performing an analysis as in any of claims 1 to 7, wherein said analysis comprises the determination of linear distances between two landmarks or the distance of a landmark to a reference plane.
9. Method for performing an analysis as in any of claims 1 to 8, further comprising the steps of acquiring a 3D scan of said person's head, said person wearing a 3D splint as in claim 15, and a 3D scan of casts of said person's upper and lower jaw, and further comprising the step of fusing, based on features of said 3D splint, said 3D scan of said person's head, said person wearing said 3D splint, and said 3D scan of casts of said person's upper and lower jaw.
10. Method for performing an analysis as in claim 9, whereby data from said 3D scan of said person wearing said 3D splint is used for generating said 3D surface model.
11. Method for deriving planning information for repositioning a bone fragment, comprising the steps of:
- performing a cephalometric and/or anthropometric analysis as in any of the previous claims,
- defining a set of virtual positions of said bone fragment to be repositioned, said positions being defined based on said anatomical landmarks,
- visualising the result for each of said virtual positions,
- taking a decision on an intra-operative repositioning of said bone fragment based on said cephalometric analysis and on said visualisation.
12. Method for deriving planning information as in claim 11, wherein said virtual positions result from a translation and/or rotation of said bone fragment.
13. A device for cephalometric and/or anthropometric analysis, comprising:
- a computing unit arranged for generating from 3D scan data a 3D surface model and a 2D cephalogram geometrically linked to said 3D surface model,
- visualisation means for representing said 2D cephalogram and/or said 3D surface model, and
- computation means for performing said analysis based on anatomical landmarks provided on said at least one 2D cephalogram and/or on said 3D surface model.
14. A device for cephalometric and/or anthropometric analysis as in claim 13, wherein said 3D scan data are CT or MRI data.
15. A 3D splint for use in a method as in claim 9, comprising a U-shaped part arranged for fitting the upper and lower dental arches and provided with an extension on said U-shaped part.
16. A program, executable on a programmable device, containing instructions which, when executed, perform the method as in any of the claims 1 to 12.
PCT/BE2005/000100 2004-06-25 2005-06-27 Method for deriving a treatment plan for orthognatic surgery and devices therefor WO2006000063A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN2005800210757A CN1998022B (en) 2004-06-25 2005-06-27 Method for deriving a treatment plan for orthognatic surgery and devices therefor
BRPI0511379-2A BRPI0511379B1 (en) 2004-06-25 2005-06-27 "Method for performing cephalometric analysis and / or anthropometric analysis and device for cephalometric analysis and / or anthropometric analysis"
EP05758955A EP1759353B1 (en) 2004-06-25 2005-06-27 Method for deriving a treatment plan for orthognatic surgery and devices therefor
JP2007516900A JP5020816B2 (en) 2004-06-25 2005-06-27 Method and apparatus for deriving a treatment plan for orthognathic surgery
US11/629,270 US7792341B2 (en) 2004-06-25 2005-06-27 Method for deriving a treatment plan for orthognatic surgery and devices therefor
DE602005010861T DE602005010861D1 (en) 2004-06-25 2005-06-27 METHOD FOR DERIVING A TREATMENT PLAN FOR SQUARED SURGERY AND EQUIPMENT THEREFOR

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0414277.4A GB0414277D0 (en) 2004-06-25 2004-06-25 Orthognatic surgery
GB0414277.4 2004-06-25

Publications (1)

Publication Number Publication Date
WO2006000063A1 true WO2006000063A1 (en) 2006-01-05

Family

ID=32800197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/BE2005/000100 WO2006000063A1 (en) 2004-06-25 2005-06-27 Method for deriving a treatment plan for orthognatic surgery and devices therefor

Country Status (12)

Country Link
US (1) US7792341B2 (en)
EP (1) EP1759353B1 (en)
JP (1) JP5020816B2 (en)
CN (1) CN1998022B (en)
AT (1) ATE413668T1 (en)
BR (1) BRPI0511379B1 (en)
DE (1) DE602005010861D1 (en)
ES (1) ES2317265T3 (en)
GB (1) GB0414277D0 (en)
RU (1) RU2384295C2 (en)
WO (1) WO2006000063A1 (en)
ZA (1) ZA200610143B (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008031562A1 (en) 2006-09-11 2008-03-20 Jochen Max Zinser Method for production of at least one surgical splint, in particular for computer-assisted maxillofacial operations and transposition osteotomies
WO2008080235A1 (en) * 2007-01-03 2008-07-10 Ao Technology Ag Device for planning orthodontics and/or orthognathic surgery
EP1982652A1 (en) 2007-04-20 2008-10-22 Medicim NV Method for deriving shape information
WO2008128700A1 (en) * 2007-04-18 2008-10-30 Materialise Dental N.V. Computer-assisted creation of a custom tooth set-up using facial analysis
WO2012016635A2 (en) 2010-08-04 2012-02-09 Charité - Universitätsmedizin Berlin Method for generating a digital tooth topology for a tooth structure, and measurement method
WO2012139999A1 (en) * 2011-04-12 2012-10-18 Universite Paul Sabatier Toulouse Iii Method for designing and manufacturing a positioning groove intended for use in repositioning the maxilla of a patient during orthognathic surgery
EP2789308A1 (en) * 2013-04-12 2014-10-15 Stryker Leibinger GmbH & Co. KG Computer-implemented technique for generating a data set that geometrically defines a bone cut configuration
WO2015142291A1 (en) * 2014-03-20 2015-09-24 National University Of Singapore Computer-aided planning of craniomaxillofacial and orthopedic surgery
IT201600118033A1 (en) * 2016-11-22 2017-02-22 Univ Degli Studi Di Messina Cephalometric model management procedure for compiling orthognathic treatment plans
EP2680233A4 (en) * 2011-02-22 2017-07-19 Morpheus Co., Ltd. Method and system for providing a face adjustment image
CN109671505A (en) * 2018-10-25 2019-04-23 杭州体光医学科技有限公司 A kind of head three-dimensional data processing method for medical consultations auxiliary
EP2564375B1 (en) * 2010-04-30 2019-06-12 Align Technology, Inc. Virtual cephalometric imaging
EP3566651A1 (en) * 2018-05-08 2019-11-13 Siemens Healthcare GmbH Method and device for determining result values based on a skeletal medical image capture
US10869705B2 (en) 2012-12-12 2020-12-22 Obl S.A. Implant and guide

Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070274440A1 (en) * 2006-05-11 2007-11-29 David Phillipe Sarment Automatic determination of cephalometric points in a three-dimensional image
TWI323171B (en) * 2007-06-27 2010-04-11 Univ Nat Cheng Kung Cephalogram image analysis method
US8795204B2 (en) * 2008-01-09 2014-08-05 Allergan, Inc. Anatomical recognition and dimensional analysis of breast volume to assist breast surgery
GB0803514D0 (en) * 2008-02-27 2008-04-02 Depuy Int Ltd Customised surgical apparatus
GB0807754D0 (en) * 2008-04-29 2008-06-04 Materialise Dental Nv Method to determine the impact of a prposed dental modification on the temporomandobular joint
EP2338142A2 (en) * 2008-09-17 2011-06-29 Koninklijke Philips Electronics N.V. Mr segmentation using transmission data in hybrid nuclear/mr imaging
JP5701857B2 (en) * 2009-05-08 2015-04-15 コーニンクレッカ フィリップス エヌ ヴェ Ultrasound planning and guide for implantable medical devices
EP2254068B1 (en) 2009-05-18 2020-08-19 Nobel Biocare Services AG Method and system providing improved data matching for virtual planning
JP5580572B2 (en) * 2009-11-05 2014-08-27 メディア株式会社 How to display the progress of periodontal disease
US8805048B2 (en) * 2010-04-01 2014-08-12 Mark Batesole Method and system for orthodontic diagnosis
US9066733B2 (en) 2010-04-29 2015-06-30 DePuy Synthes Products, Inc. Orthognathic implant and methods of use
US8435270B2 (en) 2010-04-29 2013-05-07 Synthes Usa, Llc Orthognathic implant and methods of use
JP2013530028A (en) * 2010-05-04 2013-07-25 パスファインダー セラピューティクス,インコーポレイテッド System and method for abdominal surface matching using pseudo features
WO2012087043A2 (en) * 2010-12-23 2012-06-28 주식회사 오라픽스 Virtual surgical apparatus for dental treatment and method for manufacturing a wafer using same
RU2461367C1 (en) * 2011-02-18 2012-09-20 Евгений Михайлович Рощин Diagnostic technique for dentition taking into consideration axis of head of mandible and device for implementation thereof
RU2454180C1 (en) * 2011-02-18 2012-06-27 Евгений Михайлович Рощин Method of finding axis of patient's lower jaw head
CN102389335B (en) * 2011-07-21 2015-02-11 中国医学科学院整形外科医院 Digital jaw surgical guide plate and manufacturing method thereof
FR2979226B1 (en) * 2011-08-31 2014-11-21 Maxime Jaisson METHOD FOR DESIGNING A DENTAL APPARATUS
CN104540466B (en) * 2012-05-17 2017-11-07 德普伊新特斯产品有限责任公司 The method of surgery planning
RU2543543C2 (en) * 2012-06-22 2015-03-10 Наталья Васильевна Удалова Method for making sequence of models for kit of correction trays using computer-generated simulation
RU2498785C1 (en) * 2012-08-20 2013-11-20 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Алтайский государственный технический университет им. И.И. Ползунова" (АлтГТУ) Method for estimating therapeutic dental displacement
GB201216214D0 (en) 2012-09-12 2012-10-24 Nobel Biocare Services Ag A digital splint
GB201216230D0 (en) 2012-09-12 2012-10-24 Nobel Biocare Services Ag An improved surgical template
GB201216224D0 (en) 2012-09-12 2012-10-24 Nobel Biocare Services Ag An improved virtual splint
RU2508068C1 (en) * 2012-10-25 2014-02-27 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Новгородский государственный университет имени Ярослава Мудрого" Method of creating three-dimensional design project of marginal parodontium
JP2014117426A (en) * 2012-12-14 2014-06-30 Tetsuya Hirota Information processing device, information processing method, program, and member for occludator
RU2511472C1 (en) * 2012-12-27 2014-04-10 Евгений Михайлович Рощин Diagnostic technique for mandibular dearticulation including directing palatal mapping (versions)
IL225445A0 (en) * 2013-03-24 2013-07-31 Truphatek Int Ltd Survey tool, system and method for pre-intubation patient survey
US9855114B2 (en) 2013-05-21 2018-01-02 Carestream Health, Inc. Method and system for user interaction in 3-D cephalometric analysis
WO2015031728A1 (en) * 2013-08-29 2015-03-05 University Of Washington Through Its Center For Commercialization Methods and systems for simulating an x-ray dental image
RU2548317C2 (en) * 2013-09-03 2015-04-20 Константин Александрович Куракин Method for planning orthognatic surgical operation
CN103598916B (en) * 2013-10-29 2015-10-28 谢叻 A kind of auxiliary device for craniofacial plastic surgery
EP2870941A1 (en) * 2013-11-08 2015-05-13 Orthotaxy Method for planning a surgical intervention
US9545302B2 (en) 2013-11-20 2017-01-17 Dermagenesis Llc Skin printing and auto-grafting
WO2015081025A1 (en) 2013-11-29 2015-06-04 The Johns Hopkins University Cranial reference mount
CN103978789B (en) * 2014-05-22 2016-05-11 中国科学院苏州生物医学工程技术研究所 The head medicine model quick molding method of printing based on 3D
JP6531115B2 (en) * 2014-05-22 2019-06-12 ケアストリーム ヘルス インク Method of 3D cephalometric analysis
US9710880B2 (en) * 2014-07-03 2017-07-18 Siemens Product Lifecycle Management Software Inc. User-guided shape morphing in bone segmentation for medical imaging
US9808322B2 (en) * 2014-08-27 2017-11-07 Vito Del Deo Method and device for positioning and stabilization of bony structures during maxillofacial surgery
WO2016086049A1 (en) 2014-11-24 2016-06-02 The Johns Hopkins University A cutting machine for resizing raw implants during surgery
US10376319B2 (en) * 2015-06-09 2019-08-13 Cheng Xin She Image correction design system and method for oral and maxillofacial surgery
RU2607651C1 (en) * 2015-08-31 2017-01-10 федеральное государственное бюджетное образовательное учреждение высшего образования "Северо-Западный государственный медицинский университет имени И.И. Мечникова" Министерства здравоохранения Российской Федерации (ФГБОУ ВО СЗГМУ им. И.И. Мечникова Минздрава России) Method for simulating bone-reconstructive surgeries in treating new growths of jaw bone in childhood
US11058541B2 (en) 2015-09-04 2021-07-13 The Johns Hopkins University Low-profile intercranial device
KR101893752B1 (en) * 2016-01-04 2018-08-31 주식회사 바텍 Method and Apparatus for Analyzing an X-ray Image for Identifying Patient Positioning Errors and/or Exposure Condition Errors
MX2019002037A (en) * 2016-08-19 2019-07-18 The Methodist Hospital System Systems and methods for computer-aided orthognathic surgical planning.
US10467815B2 (en) 2016-12-16 2019-11-05 Align Technology, Inc. Augmented reality planning and viewing of dental treatment outcomes
US10888399B2 (en) 2016-12-16 2021-01-12 Align Technology, Inc. Augmented reality enhancements for dental practitioners
KR101898887B1 (en) 2017-04-21 2018-11-02 오스템임플란트 주식회사 3D Landmark Suggestion System And Method For Analyzing 3d Cephalometric
GB201708520D0 (en) 2017-05-27 2017-07-12 Dawood Andrew A method for reducing artefact in intra oral scans
RU2693689C2 (en) * 2017-09-27 2019-07-03 Федеральное Государственное Бюджетное Образовательное Учреждение Высшего Образования "Московский государственный медико-стоматологический университет имени А.И. Евдокимова" Министерства здравоохранения Российской Федерации (ФГБОУ ВО МГМСУ имени А.И. Евдокимова Минздрава России) Method for cephalometric analysis of symmetry of contralateral sides in patients with asymmetric deformations of jaws
JP7120610B2 (en) 2018-07-06 2022-08-17 東京体育用品株式会社 gym mat
KR101952887B1 (en) * 2018-07-27 2019-06-11 김예현 Method for predicting anatomical landmarks and device for predicting anatomical landmarks using the same
KR102099390B1 (en) * 2018-08-21 2020-04-09 디디에이치 주식회사 Dental image analyzing method for orthodontic daignosis and apparatus using the same
KR20210108429A (en) 2018-12-20 2021-09-02 메디심 엔브이 Automatic trimming of surface meshes
FI20195182A1 (en) * 2019-03-12 2020-09-13 Planmeca Oy Generation of transformation matrices associated with upper and lower dental arches
CN112590220A (en) * 2020-11-27 2021-04-02 四川大学 Electrode cap design method, manufacturing method and system based on 3D skull model
JP7226879B2 (en) * 2021-07-15 2023-02-21 関西企画株式会社 Information processing device, information processing program, MRI examination device, and information processing method
CN116052850A (en) * 2023-02-01 2023-05-02 南方医科大学珠江医院 CTMR imaging anatomical annotation and 3D modeling mapping teaching system based on artificial intelligence

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6213769B1 (en) * 1996-12-20 2001-04-10 Universite Joseph Fourier Device for determining a movement between two dental cast profiles using an x-ray scanner
WO2003028577A2 (en) * 2001-10-03 2003-04-10 Board Of Regents, The University Of Texas System Method and apparatus for fabricating orthognathic surgical splints
US20040015327A1 (en) * 1999-11-30 2004-01-22 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis, treatment planning and therapeutics

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2637165B2 (en) * 1988-05-18 1997-08-06 株式会社東芝 3D image processing device
JP2839671B2 (en) * 1990-08-24 1998-12-16 株式会社東芝 3D image processing device
JPH05120451A (en) * 1991-08-08 1993-05-18 Hitachi Medical Corp Medical diagnostic picture processor
JPH07311834A (en) * 1994-05-19 1995-11-28 Toshiba Medical Eng Co Ltd Image processor and its aid
JPH07334702A (en) * 1994-06-10 1995-12-22 Toshiba Corp Display device
US6081739A (en) * 1998-05-21 2000-06-27 Lemchen; Marc S. Scanning device or methodology to produce an image incorporating correlated superficial, three dimensional surface and x-ray images and measurements of an object
IL126838A (en) * 1998-11-01 2003-04-10 Cadent Ltd Dental image processing method and system
JP2001017422A (en) * 1999-07-08 2001-01-23 Toshiba Iyo System Engineering Kk Image processing device and marker member for the same
JP2001238895A (en) * 2000-02-28 2001-09-04 Tokyo Inst Of Technol Patient's position identification method for navigation system for surgery and its equipment
US7156655B2 (en) * 2001-04-13 2007-01-02 Orametrix, Inc. Method and system for comprehensive evaluation of orthodontic treatment using unified workstation
JP2003079637A (en) * 2001-09-13 2003-03-18 Hitachi Medical Corp Operation navigating system
JP4328621B2 (en) * 2001-10-31 2009-09-09 イマグノーシス株式会社 Medical simulation equipment
JP3757160B2 (en) * 2001-12-07 2006-03-22 茂樹 上村 3D facial diagram display method for orthodontics

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SCHUTYSER F ET AL: "Image-based 3D planning of maxillofacial distraction procedures including soft tissue implications", MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION - MICCAI 2000. THIRD INTERNATIONAL CONFERENCE. PROCEEDINGS (LECTURE NOTES IN COMPUTER SCIENCE VOL.1935) SPRINGER-VERLAG BERLIN, GERMANY, 2000, pages 999 - 1007, XP008053601, ISBN: 3-540-41189-5 *
TROULIS M J ET AL: "Development of a three-dimensional treatment planning system based on computed tomographic data.", INTERNATIONAL JOURNAL OF ORAL AND MAXILLOFACIAL SURGERY. AUG 2002, vol. 31, no. 4, August 2002 (2002-08-01), pages 349 - 357, XP008053607, ISSN: 0901-5027 *
VERSTREKEN K ET AL: "An image-guided planning system for endosseous oral implants", IEEE TRANSACTIONS ON MEDICAL IMAGING IEEE USA, vol. 17, no. 5, October 1998 (1998-10-01), pages 842 - 852, XP008053785, ISSN: 0278-0062 *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008031562A1 (en) 2006-09-11 2008-03-20 Jochen Max Zinser Method for production of at least one surgical splint, in particular for computer-assisted maxillofacial operations and transposition osteotomies
DE102006043204A1 (en) * 2006-09-11 2008-03-27 Zinser, Jochen Max, Dr. Method for producing surgical splints, in particular for computer-assisted maxillofacial operations and remodeling osteotomies
WO2008080235A1 (en) * 2007-01-03 2008-07-10 Ao Technology Ag Device for planning orthodontics and/or orthognathic surgery
US8706672B2 (en) 2007-04-18 2014-04-22 Dentsply Implants Nv Computer-assisted creation of a custom tooth set-up using facial analysis
WO2008128700A1 (en) * 2007-04-18 2008-10-30 Materialise Dental N.V. Computer-assisted creation of a custom tooth set-up using facial analysis
JP2010524529A (en) * 2007-04-18 2010-07-22 マテリアライズ・デンタル・ナムローゼ・フエンノートシャップ Computer-aided creation of custom tooth setup using facial analysis
JP2010524530A (en) * 2007-04-20 2010-07-22 メディシム・ナムローゼ・フエンノートシャップ Method for extracting shape information
KR101590330B1 (en) * 2007-04-20 2016-02-01 메디심 엔브이 Method for deriving shape information
EP2142094A2 (en) * 2007-04-20 2010-01-13 Medicim NV Method for deriving shape information
KR20100016180A (en) * 2007-04-20 2010-02-12 메디심 엔브이 Method for deriving shape information
EP1982652A1 (en) 2007-04-20 2008-10-22 Medicim NV Method for deriving shape information
JP2014237005A (en) * 2007-04-20 2014-12-18 メディシム・ナムローゼ・フエンノートシャップ Method for deriving shape information
US9439608B2 (en) 2007-04-20 2016-09-13 Medicim Nv Method for deriving shape information
EP2564375B1 (en) * 2010-04-30 2019-06-12 Align Technology, Inc. Virtual cephalometric imaging
WO2012016635A2 (en) 2010-08-04 2012-02-09 Charité - Universitätsmedizin Berlin Method for generating a digital tooth topology for a tooth structure, and measurement method
DE102010036841A1 (en) 2010-08-04 2012-02-09 Charité - Universitätsmedizin Berlin Method for generating a digital tooth topology for a tooth structure and measuring method
WO2012016635A3 (en) * 2010-08-04 2012-04-19 Charité - Universitätsmedizin Berlin Method for generating a digital tooth topology for a tooth structure, and measurement method
EP2680233A4 (en) * 2011-02-22 2017-07-19 Morpheus Co., Ltd. Method and system for providing a face adjustment image
WO2012139999A1 (en) * 2011-04-12 2012-10-18 Universite Paul Sabatier Toulouse Iii Method for designing and manufacturing a positioning groove intended for use in repositioning the maxilla of a patient during orthognathic surgery
FR2974001A1 (en) * 2011-04-12 2012-10-19 Univ Toulouse 3 Paul Sabatier METHOD FOR DESIGNING AND MANUFACTURING A POSITIONING GUTTER FOR USE IN REPOSITIONING THE MAXILLARY OF A PATIENT DURING OPERATION OF ORTHOGNATHIC SURGERY
US10869705B2 (en) 2012-12-12 2020-12-22 Obl S.A. Implant and guide
US11759244B2 (en) 2012-12-12 2023-09-19 Materialise Nv Implant and guide
US9659152B2 (en) 2013-04-12 2017-05-23 Stryker European Holdings I, Llc Computer-implemented technique for defining a bone cut
EP2789308A1 (en) * 2013-04-12 2014-10-15 Stryker Leibinger GmbH & Co. KG Computer-implemented technique for generating a data set that geometrically defines a bone cut configuration
WO2015142291A1 (en) * 2014-03-20 2015-09-24 National University Of Singapore Computer-aided planning of craniomaxillofacial and orthopedic surgery
IT201600118033A1 (en) * 2016-11-22 2017-02-22 Univ Degli Studi Di Messina Cephalometric model management procedure for compiling orthognathic treatment plans
EP3566651A1 (en) * 2018-05-08 2019-11-13 Siemens Healthcare GmbH Method and device for determining result values based on a skeletal medical image capture
US10977790B2 (en) 2018-05-08 2021-04-13 Siemens Healthcare Gmbh Method and device for determining result values on the basis of a skeletal medical imaging recording
CN109671505A (en) * 2018-10-25 2019-04-23 杭州体光医学科技有限公司 A kind of head three-dimensional data processing method for medical consultations auxiliary
CN109671505B (en) * 2018-10-25 2021-05-04 杭州体光医学科技有限公司 Head three-dimensional data processing method for medical diagnosis and treatment assistance

Also Published As

Publication number Publication date
BRPI0511379A (en) 2007-12-04
JP5020816B2 (en) 2012-09-05
DE602005010861D1 (en) 2008-12-18
RU2384295C2 (en) 2010-03-20
BRPI0511379B1 (en) 2018-06-26
ATE413668T1 (en) 2008-11-15
EP1759353B1 (en) 2008-11-05
EP1759353A1 (en) 2007-03-07
ZA200610143B (en) 2008-02-27
RU2007101297A (en) 2008-07-27
CN1998022B (en) 2010-10-06
US20070197902A1 (en) 2007-08-23
JP2008503280A (en) 2008-02-07
US7792341B2 (en) 2010-09-07
GB0414277D0 (en) 2004-07-28
CN1998022A (en) 2007-07-11
ES2317265T3 (en) 2009-04-16

Similar Documents

Publication Publication Date Title
US7792341B2 (en) Method for deriving a treatment plan for orthognatic surgery and devices therefor
US20230111070A1 (en) Systems and methods for computer-aided orthognathic surgical planning
Plooij et al. Digital three-dimensional image fusion processes for planning and evaluating orthodontics and orthognathic surgery. A systematic review
US8199988B2 (en) Method and apparatus for combining 3D dental scans with other 3D data sets
US6671539B2 (en) Method and apparatus for fabricating orthognathic surgical splints
Popat et al. New developments in: three‐dimensional planning for orthognathic surgery
KR101590330B1 (en) Method for deriving shape information
Kumar et al. Comparison of conventional and cone beam CT synthesized cephalograms
Nakasima et al. Three-dimensional computer-generated head model reconstructed from cephalograms, facial photographs, and dental cast models
Alves et al. Three-dimensional computerized orthognathic surgical treatment planning
Palomo et al. Clinical application of three-dimensional craniofacial imaging in orthodontics
Wang et al. The application of digital model surgery in the treatment of dento-maxillofacial deformities
Baird Evaluation of a custom made anatomical guide for orthognathic surgery
Mukhia A Comparison of Geometric Accuracy of Three Dimensional Bone Surface Modelling on Cone Beam Computed Tomography and White Light Scanner
Barone et al. 3D reconstruction of individual tooth shapes by integrating dental cad templates and patient-specific anatomy
Halazonetis Software Support for Advanced Cephalometric Analysis in Orthodontics
Scherzberg 3D virtual planning in Orthognathic Surgery and CAD/CAM Surgical Splints generation
Nichelini Virtual Orthognathic Surgery: CAD/CAM Splint Generation and Analysis

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 200610143

Country of ref document: ZA

WWE Wipo information: entry into national phase

Ref document number: 2007197902

Country of ref document: US

Ref document number: 11629270

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2005758955

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007516900

Country of ref document: JP

Ref document number: 200580021075.7

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWE Wipo information: entry into national phase

Ref document number: 2007101297

Country of ref document: RU

WWP Wipo information: published in national office

Ref document number: 2005758955

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 11629270

Country of ref document: US

ENP Entry into the national phase

Ref document number: PI0511379

Country of ref document: BR

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)