WO1999059106A1 - Method and apparatus for generating 3d models from medical images - Google Patents


Info

Publication number
WO1999059106A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
images
tooth
model
points
Prior art date
Application number
PCT/US1999/010566
Other languages
French (fr)
Inventor
David C. Hatcher
William E. Harrell, Jr.
Terry J. Sorensen
Hassan Mostafavi
Charles Palm
Original Assignee
Acuscape International, Inc.
Priority date
Filing date
Publication date
Application filed by Acuscape International, Inc. filed Critical Acuscape International, Inc.
Priority to CA002296274A (CA2296274A1)
Priority to EP99924217A (EP1027681A4)
Priority to AU40769/99A (AU4076999A)
Publication of WO1999059106A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C - DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C9/00 - Impression cups, i.e. impression trays; Impression methods
    • A61C9/004 - Means or methods for taking digitized impressions
    • A61C9/0046 - Data acquisition means or methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/12 - Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 - Details of sensors, e.g. sensor lenses
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30004 - Biomedical image processing

Definitions

  • The invention relates to the field of medical imaging, and in particular to the generation and use of three-dimensional medical images.
  • Model builders will frequently manually remove unneeded vertices to simplify the processing required when displaying and manipulating a three-dimensional model.
  • Such three-dimensional models are typically rendered as wireframes. That is, a plurality of points are identified, corresponding to the points at which image information is captured, and the points are displayed together with lines connecting them. These are called wireframes because the lines between the points appear to constitute a wire mesh.
  • The individual points in such a wireframe are referred to as vertices.
  • Wireframe models can be described in formats such as VRML (Virtual Reality Mark-up Language) and are commercially available from a variety of sources.
  • Magnetic resonance imaging (MRI) and other imaging technologies can accurately display two-dimensional slices of a patient.
  • An irregular shape in a 3-D model of a skull may be a tumor, but the system does not relate this additional information to the model; the 3-D model does not even have information indicating that the shape is a tumor.
  • Imaging technologies (e.g., x-rays and MRIs) and commercially available modeling tools are commonly used to generate and manipulate three-dimensional models.
  • The user interfaces of available commercial software for dealing with three-dimensional models are highly technical and generally unsuited for use by a medical practitioner.
  • The 3-D models are not related to medical information about a patient (e.g., a shape in a 3-D model is only a shape; there is no information that the shape is a tumor or a body part). Also, some technologies do not allow doctors to build models from the patient's own images.
  • One aspect of the invention is directed to providing a generic software tool for generating and using 3-D models of a patient's anatomy.
  • A number of modules are used to achieve this result.
  • These modules include a Sculptor module, a Clinician module and an Executor module.
  • The Sculptor module maps all acquired imaging, including images from different imaging sources, into a common reference.
  • The Sculptor allows a user to identify the location of different anatomical points in each of the patient images.
  • The Sculptor allows a user to relate different anatomical points to each other in a 3-D space and also relate the points to the images.
  • The Clinician/Consultant module uses the related points to modify, or morph, a stock model (e.g., a standard anatomical 3-D model).
  • The customized model that is created corresponds to a 3-D model of the patient's anatomy.
  • The model is "smart" in that when certain changes are made to one part of the model, the remainder of the model can be adjusted accordingly. For example, an object representing the patient's tooth is associated with data indicating that the object is a tooth.
  • the Clinician/Consultant is a database query tool that allows for display or visualization of the anatomy and function, manipulation of objects for treatment planning and model analyses.
  • A third module is a database that provides overall system file and image management and coordinates the Sculptor module and the Clinician/Consultant module.
  • In one embodiment, the stock model is a model of a human skull; this model has approximately 300 objects which can be manipulated individually.
  • Some embodiments of the invention include the functionality of some or all of the above modules. For example, in some embodiments, only a subset of the functions performed by the Sculptor are included (e.g., the ability to define related points in multiple images).
  • Figure 1 illustrates a computer system including one embodiment of the invention.
  • Figure 2 illustrates an architecture of the software used in one embodiment of the invention.
  • Figure 3 illustrates capturing images for use in the system.
  • Figure 4 and Figure 5 illustrate the calibration frame.
  • Figure 6 illustrates an example method of calibrating images, generating a patient specific model, and performing analyses from the calibrated images and the patient specific model.
  • Figure 7 through Figure 24 illustrate user interfaces for a sculptor module.
  • Figure 25 through Figure 40 illustrate user interfaces for a clinician module.
  • Computer - any computing device (e.g., a PC compatible computer, a Unix workstation). A computer includes a processor and memory. A computer can also include a network of computers.
  • Handheld Device or Palmtop Computer - examples of a handheld device include the Palm III™ handheld computer and Microsoft's palm sized computers.
  • Internet - a collection of information stored in computers physically located throughout the world. Much of the information on the Internet is organized onto electronic pages. Users typically bring one page to their computer screen at a time.
  • Client - a computer used by the user to make a query.
  • Server - a computer that supplies information in response to a query, or performs intermediary tasks between a client and another server.
  • World Wide Web (or Web or web) - one aspect of the Internet that organizes information onto such electronic pages.
  • Program - a sequence of instructions that can be executed by a computer. A program can include other programs, or can include only one instruction.
  • Application - a program.
  • The operations described herein are machine operations. Useful machines for performing the operations of the present invention include general purpose digital computers or similar devices. The present invention also relates to apparatus for performing these operations. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively configured by a computer program.
  • Figure 1 illustrates a computer 110 that can be used to carry out the invention.
  • Figure 1 includes a computer 110, a sculptor 115, a clinician/consultant 125, and an executor 135.
  • The sculptor 115 includes a display of a user interface having a number of patient images 150 that also show a calibration frame 140.
  • The clinician/consultant 125 includes a similar user interface that includes a view of a patient specific model 160 and an analysis window 127.
  • The analysis window 127 includes an example analysis 170.
  • The executor 135 includes image data 137 and patient model data 139.
  • the sculptor 115 and the clinician/consultant 125 can extract and manipulate information from the image data 137 and the patient model data 139 through the executor 135. The following paragraphs describe the elements of Figure 1 in greater detail.
  • The computer 110 represents a computer system upon which the sculptor 115, the clinician/consultant 125, and the executor 135 can execute.
  • The computer 110 is representative of a standard personal computer, although other computers could be used as the computer 110. What is important is that the computer 110 has some sort of processor and some memory.
  • The sculptor 115 and the executor 135 may run on one computer at one time, while at another time the clinician/consultant 125 and the executor 135 can run on another computer. Alternatively, all three programs can run on different computers. The computers can be linked together by a network such as the Internet.
  • The sculptor 115 represents a computer program in which a number of different types of patient images 150 can be calibrated using the images of the calibration frame 140. The sculptor 115 allows a technician to calibrate the images and identify anatomic locations in the images.
  • The patient images 150 can be extracted from the image data 137. The image data 137 can be imported from an external source, either by transmission over a network or by other means; the image data 137 need not be retrieved from the executor 135. The image data may be directly imported into the sculptor 115.
  • The calibration frame 140 is an apparatus that includes a number of calibration targets 440. The calibration frame 140 is worn by the patient during the capturing of the patient images.
  • The patient model data 139 represents the data generated by the sculptor 115. This output of the sculptor 115 can be in the form of two transport files (the .sci file and the .cln file).
  • The stock anatomy model is a 3D model of a standard piece of anatomy. The clinician/consultant 125 morphs the stock model into the patient specific model and allows users to view and manipulate it.
  • The clinician/consultant 125 can be used to perform various types of analyses on the patient specific model 160. The results of these analyses can then be displayed on the patient images as well as in the example analysis window 127.
  • In this example, the particular model used will be a stock model of a human skull, which can be used by an orthodontist.
  • The model will have a number of objects including objects corresponding to each of the patient's teeth, the jaw, and other elements of the skull. Each of these objects can be manipulated individually in the clinician/consultant 125.
  • An example of a stock model that may be used is one from Viewpoint Data Labs which is specifically created for orthodontic applications. A fully custom stock model can also be used.
  • The stock model represents the average structure of a piece of anatomy.
  • The Executor will compile normative stock models to match patient demographics of age, race, sex and body type.
  • The stock model has a coordinate system where each point is referenced to another point in the model.
  • Figure 2 illustrates the various responsibilities of each of the three programs of Figure 1.
  • The sculptor 115 is responsible for input/output control of the patient images 150. The sculptor 115 allows for a calibration between the various patient images 150.
  • The sculptor 115 includes a graphical user interface for performing the various features of the sculptor 115. The viewer supports the viewing of 3D models (useful where a piece of anatomy needs a more detailed view).
  • Model matching allows a user to match portions of the stock model to points on one or more patient images 150. Model matching includes the ability to define new point locations within the sculptor 115; these new locations can then be used for more detailed matching. That is, the sculptor 115 allows the user to identify the location of previously undefined points of the stock model in the patient images 150.
  • The executor 135 takes responsibility for the database storage and management. The executor 135 includes Internet access for communicating with one or more other computers. The executor 135 also has encryption capabilities to protect the security of the information stored by the executor 135.
  • The clinician/consultant 125 includes the following functions. Diagnosis, treatment planning, predictions, analyses, and metrics are all examples of the types of functions that can be performed on the patient specific model 160 and the patient model data 139. Examples of these areas are described in greater detail below.
  • The clinician/consultant 125 also keeps track of the stock objects, or stock models.
  • The clinician/consultant 125 includes a graphical user interface and viewer.
  • The morph editor is used to modify any morphing that is done to generate the patient specific model 160.
  • The simulator simulates the motion of objects within the patient specific model 160. The simulator could also be used to simulate the predicted motion of objects in the patient specific model 160.
  • Figure 3 illustrates example relationships between a patient, a camera and other imaging technologies, for the purpose of capturing images.
  • a patient 300, a camera 310 and an x-ray device 320 are shown in Figure 3.
  • the camera 310 and the x-ray device 320 can be used to capture the image data
  • These captured images can all be from one camera, one x-ray machine, or multiple devices, so long as they capture image data about the patient 300 from multiple vantage points. Other example modes of image capture include MRIs, ultrasound imaging, and infrared imaging.
  • The camera 310 and the x-ray device 320 are merely symbolic of the fact that images of the patient are captured using various types of imaging technology.
  • The preferred x-ray images would include a frontal image and a lateral image, and the preferred photographic images may include a frontal image and two lateral images.
  • Figure 4 illustrates a front view and a side view of a calibration frame 140 that may be used in some embodiments of the invention. Images of this calibration frame 140 appear in the patient images 150. The images of the calibration frame 140 can then be used in the sculptor 115 to calibrate the patient images 150.
  • The calibration process includes recording the anatomy together with the calibration frame. Using knowledge of the calibration frame geometry, the sculptor computes the location of each imaging source as a point source with seven degrees of freedom (DOF). The seven DOF include the x, y, z, yaw, pitch, and roll of the source, plus an additional imaging parameter. The calibration process maps the associated images into the 3D matrix associated with the calibration frame.
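  • As an illustration of the kind of computation involved, the sketch below recovers an imaging source's pose from the calibration targets, assuming the targets' 3D positions within the frame are known and their 2D image locations have been marked; it uses OpenCV's solvePnP, and every name and numeric value is illustrative rather than taken from the patent.

```python
# Sketch: recovering an imaging source's pose from calibration targets.
# Assumes the 3D positions of the calibration targets 440 in the frame's
# coordinate system are known, and their 2D locations have been marked in
# one patient image. Names and values here are illustrative only.
import numpy as np
import cv2

# Known target positions in the calibration frame's coordinate system (mm).
frame_targets_3d = np.array([
    [0.0, 0.0, 0.0], [40.0, 0.0, 10.0], [80.0, 0.0, 0.0],
    [0.0, 60.0, 5.0], [40.0, 60.0, 15.0], [80.0, 60.0, 5.0],
], dtype=np.float64)

# Corresponding pixel locations marked in the patient image.
image_targets_2d = np.array([
    [512.0, 620.0], [700.0, 615.0], [890.0, 622.0],
    [515.0, 340.0], [702.0, 335.0], [888.0, 341.0],
], dtype=np.float64)

# A simple pinhole camera model; the focal length stands in for the extra
# parameter beyond the six position/orientation DOF.
focal = 2200.0  # pixels (assumed)
cx, cy = 760.0, 500.0
camera_matrix = np.array([[focal, 0, cx], [0, focal, cy], [0, 0, 1]])

ok, rvec, tvec = cv2.solvePnP(frame_targets_3d, image_targets_2d,
                              camera_matrix, distCoeffs=None)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 orientation of the source
    print("source position (frame coords):", (-rotation.T @ tvec).ravel())
```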
  • The calibration frame 140 can include a top strap 405, a strap 410, and an adjustment knob 420.
  • The top strap 405, the strap 410, and the adjustment knob 420 work together to keep the calibration frame 140 in place on the patient's 300 head.
  • The top strap 405 is designed to encircle the patient's 300 head.
  • The top strap 405 and the strap 410 are part of headgear normally associated with a welding visor from which the face shield has been removed.
  • A rigid plexi-glass frame 430 is used to mount a number of calibration targets 440.
  • The circumference of the strap 410 is adjusted using a ratchet and the knob 420, which can be used to snugly fit the calibration frame 140 to the patient's head.
  • the calibration targets 440 provide measurement references during the calibration of the various patient images 150.
  • The calibration targets 440 include, in one embodiment, small metallic spheres (BBs). Of BBs such as shotgun pellets and ball bearings, it is preferred to use bearings.
  • Alternative embodiments can include different materials, such as cod liver oil capsules, as calibration targets 440.
  • Alternative embodiments can also include different shapes of calibration targets 440, such as crosses.
  • An advantage of such calibration targets 440 is that they are visible in both optical and x-ray images. However, what is important with respect to the calibration targets 440 is that they provide a fixed reference frame by which patient images 150 can be calibrated. Thus, they should be viewable in each of the patient images 150.
  • The calibration targets 440 can be of different types of materials such that some of the calibration targets 440 appear in some of the images while others of the calibration targets appear in others of the images. As long as enough of the calibration targets 440 are visible in enough of the images, the calibration can be performed.
  • A single calibration target could also be made of different materials. For example, a cod liver oil calibration target could be positioned very close to a crosshair; the crosshair would indicate the position of the calibration target in images where the capsule itself is not visible.
  • The calibration targets 440 are positioned in the calibration frame 140 such that it is unlikely that, in any one image, the calibration targets will overlap one another.
  • Preferably, calibration targets are visible from each image perspective.
  • The shape of the calibration frame for holding the calibration targets 440 may vary from embodiment to embodiment. An identification of the calibration frame 140 used can also be stored with the image information.
  • The attachment 450 (also referred to as an appliance) represents another way in which calibration targets can be included in images of the patient 300.
  • The calibration attachment 450 can still be used to calibrate where, for example, an x-ray image is collimated to only focus on a smaller portion of the patient 300.
  • Figure 5 illustrates another embodiment of the calibration frame 140. In this case, the plexi-glass frame 430 has a number of bends instead of the continuous curve shown in Figure 4. This facilitates the attachment of calibration attachments 450 to the calibration frame 140.
  • Figure 5 illustrates a top view 502, a front view 504 and a cross section view 506 of the calibration frame 140.
  • Attachment sites 530 can be included on the calibration frame 140. (Note, the top view 502 does not illustrate the attachment sites 530.)
  • The cross sectional view 506 illustrates how an attachment can be attached to the calibration frame 140. The attachment 450 can be stabilized using dowel pins 560.
  • This example illustrates an acrylic appliance (attachment 450). Figure 5 thus illustrates an acrylic appliance support that can be used with the calibration frame 140.
  • Figure 6 illustrates one embodiment of the invention where the programs of Figure 1 are executed on one or more computers 110.
  • Patient images 150 are calibrated to generate the patient specific model 160. This information is also used to perform a number of analyses on the patient.
  • Figure 6 can be broken down into three general processes: capturing the patient specific data 602, generating the patient model 604, and performing analyses and related work on the patient model and model data.
  • The calibration frame 140 is mounted on the patient's 300 head. This can be done by a technician at a medical facility. The adjustment knob 420 can be used to snugly fit the calibration frame 140 to the patient's head.
  • a number of different images of the patient are captured.
  • The calibration frame 140 and/or the attachments 450 are included in these images.
  • the sculptor 115 is used to import all the image data 137.
  • This image data 137 is now calibrated using the image information of the calibration frame 140.
  • Each patient image 150 is associated with a calibration frame template (a computer representation of the calibration frame 140). A user will match a calibration frame template up with the image of the calibration frame 140 in each of the patient images 150. This tells the computer 110 how the calibration frame 140 is oriented and positioned in each of the patient images 150.
  • The calibration process involves calibrating locations relative to a set of reference planes. A first plane may be defined that is approximately parallel to the patient's pupils. A y-plane can then be defined through the patient's midline. The last center plane can be determined from the cross product of the other two planes.
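  • For illustration only, the following sketch derives such a reference coordinate system from three calibrated landmark points using cross products; the choice of landmarks (two pupils and one midline point) and all coordinates are assumptions for the example.

```python
# Sketch: building a patient reference coordinate system from landmarks.
# Assumes three calibrated 3D points: the two pupils and a midline point
# (e.g., on the mid-sagittal plane). Names and values are illustrative.
import numpy as np

def reference_frame(left_pupil, right_pupil, midline_point):
    """Return an origin and three orthonormal axes (rows) from landmarks."""
    origin = (left_pupil + right_pupil) / 2.0
    # x axis: through the pupils (normal of the first reference plane).
    x_axis = right_pupil - left_pupil
    x_axis /= np.linalg.norm(x_axis)
    # Vector toward the midline point, then a y axis orthogonal to x.
    v = midline_point - origin
    y_axis = v - np.dot(v, x_axis) * x_axis
    y_axis /= np.linalg.norm(y_axis)
    # The last axis is the cross product of the other two, as in the text.
    z_axis = np.cross(x_axis, y_axis)
    return origin, np.vstack([x_axis, y_axis, z_axis])

origin, axes = reference_frame(np.array([-30.0, 0.0, 0.0]),
                               np.array([30.0, 0.0, 0.0]),
                               np.array([0.0, -40.0, 80.0]))
print(origin, axes)
```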
  • Appendix A includes a list of those locations.
  • An example of identifying an anatomic location 640 would include such things as identifying the location of a specific tooth landmark or a skeletal landmark.
  • The set of anatomic locations that need to be defined is dependent upon what analyses are to be performed.
  • Different specialties may have different specific features of interest that are identified during the identification of anatomic locations. Orthodontists, for example, tend to concentrate on landmarks in the skull. These landmarks tend to be points concentrated on the mid-sagittal plane, teeth, and jaw. The ability to identify these landmarks and cross-correlate them in the various patient images 150 is an important feature for specific medical applications. However, the specific features and anatomic locations that are important to a particular application may vary.
  • The following describes an example way of identifying a landmark in more than one image. It is important to identify the landmark location in multiple images so that its location in 3D space can be determined.
  • the user can perform the following steps using the computer 110.
  • the user selects a point to be identified.
  • the user places that point in one of the images.
  • The sculptor 115 then generates a line through that point in the other images. The line (an epi-polar line) originates from the imaging source and is projected through the landmark point of image A and onto the other calibrated images. The length of the line can be constrained by a priori knowledge of the geographic region in which the landmark must lie.
  • Once the landmark has been located on an epi-polar line in a second image, the point may be automatically defined in all of the other images (a sketch of this epi-polar construction appears after the next item). Rather than identifying individual points, however, it is sometimes desirable to trace a feature. The tracing can be done by specifying one or more connected points in a number of the patient images 150.
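  • The sketch below illustrates one way the epi-polar construction described above could be implemented: back-project the pixel picked in image A along the ray from its imaging source, clamp the ray to a plausible depth range, and project that segment into image B. The pinhole model, poses, and numeric values are assumptions for illustration, not parameters from the patent.

```python
# Sketch: constructing an epi-polar line for a landmark picked in image A.
# Assumes each calibrated image has a known source pose and a simple
# pinhole projection. All values are illustrative only.
import numpy as np

def back_project(pixel, K, R, t, depths):
    """3D points along the ray from the imaging source through `pixel`."""
    d_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    d_world = R.T @ d_cam                   # ray direction in frame coords
    source = -R.T @ t                       # imaging source position
    return [source + s * d_world for s in depths]

def project(point, K, R, t):
    """Project a 3D point into an image with pose (R, t) and intrinsics K."""
    p = K @ (R @ point + t)
    return p[:2] / p[2]

# Pose/intrinsics of image A and image B (assumed known from calibration).
K = np.array([[2200.0, 0, 760], [0, 2200.0, 500], [0, 0, 1]])
R_a, t_a = np.eye(3), np.array([0.0, 0.0, 400.0])
R_b = np.array([[0.0, 0, -1], [0, 1, 0], [1, 0, 0.0]])   # 90 degree view
t_b = np.array([0.0, 0.0, 400.0])

landmark_pixel_a = (812.0, 540.0)
# Constrain the ray to a plausible depth range (a priori anatomical knowledge).
ray_points = back_project(landmark_pixel_a, K, R_a, t_a, depths=(350.0, 450.0))
epipolar_segment_in_b = [project(p, K, R_b, t_b) for p in ray_points]
print(epipolar_segment_in_b)   # endpoints of the epi-polar line in image B
```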
  • the patient specific model 160 is generated in the generate patient model 604 process.
  • The data for that patient can be exported to the executor 135. This information can then be used by the clinician/consultant 125.
  • the clinician/consultant 125 includes a stock model which is to be morphed against the information drawn from the sculptor 115 for a particular patient.
  • the anatomic locations identified in the sculptor 115 from the patient images 150 are all associated with the calibration frame 140.
  • The patient specific model 160 includes all the object information in the original stock model.
  • the model is shown as a number of dots in space.
  • The user can select an analysis type from the available analyses.
  • The analysis can be derived from the landmarks that have been previously identified in the sculptor 115, from the morphed three-dimensional model, or from both.
  • The results are shown in the example analysis 170. This conforms to block 670 of Figure 6.
  • the example analysis 170 illustrates example output from an analysis procedure.
  • measurements may be taken to perform any number of standard orthodontic analyses such as a Ricketts analysis or a McGrann analysis.
  • Other types of analysis, treatments, etc. are described in the following sections.
  • A user can use the clinician/consultant 125 to rotate the patient specific model 160.
  • The patient's face can be mapped onto the model as a texture.
  • The user is able to hide objects in the patient specific model 160. This allows, for example, particular structures to be viewed in isolation.
  • The patient specific model 160 can be rendered by placing a skin over the model to show the external appearance after a proposed treatment.
  • CAT (Computer Assisted Tomography)
  • The full patient specific information can be transmitted through a relatively low bandwidth network in a relatively small amount of time. Such information could be transported over the Internet to a remote location.
  • measurement data could be taken in the sculptor 115 and sent to a dental lab across the network very quickly and efficiently.
  • The patient specific model 160 can be transmitted across the network with the associated patient model data 139.
  • The following describes identification of the calibration targets 440. In this example, BBs will be used as calibration targets; however, the general process can be used for most any type of calibration target.
  • The image of each BB is circular in the patient images 150, so the centroid of the BB can be located in each image. The centroid location in 3D space is then determined as the center of the BB relative to the calibration frame 140. This information can then be stored and used for calibration.
  • the user can drag and drop calibration target identifiers near a calibration target, and the computer 110 (the sculptor 115) can look for a calibration target near where the user dropped the calibration target identifier.
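  • A minimal sketch of that drag-and-drop search follows, assuming a grayscale image in which the BB shows up darker than its surroundings; the window size, threshold, and synthetic image are illustrative assumptions.

```python
# Sketch: locating a circular BB near where the user dropped a target
# identifier. Assumes a grayscale image in which the BB appears dark
# (e.g., an x-ray); threshold and window size are illustrative.
import numpy as np

def find_bb_centroid(image, drop_xy, window=25, dark_threshold=60):
    """Search a small window around drop_xy and return the BB centroid."""
    x0, y0 = int(drop_xy[0]), int(drop_xy[1])
    ys = slice(max(y0 - window, 0), y0 + window)
    xs = slice(max(x0 - window, 0), x0 + window)
    patch = image[ys, xs]
    mask = patch < dark_threshold          # BB pixels are darker than background
    if not mask.any():
        return None                        # nothing that looks like a target
    rows, cols = np.nonzero(mask)
    # Centroid of the dark blob, mapped back to full-image coordinates.
    return (xs.start + cols.mean(), ys.start + rows.mean())

image = np.full((1000, 1500), 200, dtype=np.uint8)
image[618:628, 815:825] = 20               # synthetic BB
print(find_bb_centroid(image, drop_xy=(812, 620)))
```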
  • Figure 7 illustrates the sculptor 115 application interface, shown as a sculptor interface 715.
  • The sculptor interface 715 includes a sculptor toolbar 720 for accessing the sculptor's functions.
  • A patient image 750 is being displayed. The patient image includes a view of the calibration frame 140.
  • Figure 8 illustrates the placement of an unaligned calibration template 810 in the patient image area.
  • The unaligned calibration template 810 will be aligned with the image of the calibration frame 140.
  • FIG. 9 shows that a number of calibration target selectors 910 have been placed over the calibration targets 440 in the patient image 750.
  • the calibration target selector 910 is dragged and dropped onto a calibration target in the image
  • Figure 10 illustrates a partially aligned calibration template 1010.
  • Figure 11 illustrates the aligned calibration template 1110.
  • The aligned calibration template 1110 now provides the sculptor 115 with a reference frame for the patient image 750.
  • Figure 12 illustrates the sculptor interface 715 having a second patient image 1250 loaded.
  • the calibration frame 140 can be seen in both of the images. Additionally, the aligned calibration template 1110 can be seen. A similar alignment process was performed to align the calibration template in the patient image 1250.
  • Figure 13 illustrates the placement of a coordinate reference origin 1310 to define the reference planes.
  • the reference planes help in the identification of anatomic locations.
  • Figure 14 illustrates creating a trace using the sculptor 115.
  • A trace name 1410 is displayed in the sculptor toolbar 720. In this example, the trace name 1410 is the mid-line soft tissue trace, which a user has traced.
  • The midline soft tissue trace 1410 is also shown in the patient image 1450. This may be the preferred image in which to trace the mid-line soft tissue trace 1410, because the profile of the patient's soft tissue is most easily seen in this image.
  • The midline soft tissue trace 1410 can also be defined in the other patient images.
  • Figure 15 illustrates a user interface enhancement in the sculptor 115.
  • The sculptor 115 allows the user to perform a localized histogram enhancement.
  • A localized enhancement 1510 is performed in the patient image 1250. This allows the user to enhance portions of an image, for example to make anatomic locations easier to identify.
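  • A minimal sketch of one such localized enhancement appears below, implemented here as histogram equalization restricted to a rectangular region; the region, the synthetic image, and the choice of plain equalization are assumptions for illustration.

```python
# Sketch: a localized histogram enhancement of a rectangular region of an
# image. The region and image here are illustrative; real images would be
# loaded from the image data 137.
import numpy as np

def equalize_region(image, x, y, width, height):
    """Histogram-equalize only the given region of a grayscale image."""
    out = image.copy()
    region = out[y:y + height, x:x + width]
    hist, _ = np.histogram(region, bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255.0
    out[y:y + height, x:x + width] = cdf[region].astype(np.uint8)
    return out

rng = np.random.default_rng(0)
xray = rng.integers(90, 130, size=(600, 800), dtype=np.uint8)  # low contrast
enhanced = equalize_region(xray, x=300, y=200, width=150, height=150)
```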
  • Figure 16 illustrates a trace and landmark view 1620 in which the patient image data is hidden and an example trace 1610 is shown.
  • Figure 17 illustrates a rotated view 1720 of the example trace 1610.
  • This viewer is the same viewer tool used in the clinician/consultant 125.
  • Figure 18 illustrates a number of landmarks and traces being displayed in the patient images.
  • Figure 19 illustrates a measure between a bone and a soft tissue point. In this example, the distance measured is the distance from the post nasal spine landmark to the midline soft tissue trace.
  • The measurement line 1910 illustrates this measurement.
  • The measurement information 1920 shows how many millimeters long the measurement line 1910 is.
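  • A small sketch of how such a measurement could be computed once both features live in the calibrated 3D space follows: the shortest distance from a landmark to a trace treated as a polyline. The coordinates are invented for the example.

```python
# Sketch: measuring the distance (in mm) from a 3D landmark to a trace,
# where the trace is a polyline of calibrated 3D points. Values are
# illustrative; real values would come from the calibrated model.
import numpy as np

def point_to_polyline_mm(point, polyline):
    """Shortest distance from `point` to any segment of `polyline`."""
    best = np.inf
    for a, b in zip(polyline[:-1], polyline[1:]):
        ab = b - a
        t = np.clip(np.dot(point - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        best = min(best, np.linalg.norm(point - (a + t * ab)))
    return best

pns_landmark = np.array([0.0, 12.5, 48.0])            # post nasal spine (example)
soft_tissue_trace = np.array([[18.0, 40.0, 60.0],
                              [17.0, 20.0, 55.0],
                              [16.0, 0.0, 50.0]])
print(round(point_to_polyline_mm(pns_landmark, soft_tissue_trace), 1), "mm")
```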
  • Figure 20 illustrates how epi-polar lines 2020 can be used in the placement and location of landmarks and traces.
  • The post nasal spine landmark 2010 was first placed in the patient image 1250. This caused epi-polar lines to be shown in the other two images. Next, the post nasal spine landmark 2010 was identified on one of the epi-polar lines in the patient image 1850. This allowed the landmark to be located automatically in the remaining image.
  • Figures 21-24 illustrate the creation of a trace. In the right hand image, a right mandible trace 2110 is shown. This trace was performed by a sequence of mouse clicks.
  • The right mandible trace 2110 includes a number of trace points, such as a trace point 2120. These trace points are connected by lines.
  • The trace line 2200 corresponds to the right mandible trace 2110.
  • the trace line 2200 has, or will have, the same number of points as the right mandible trace 2110.
  • the sculptor 115 ensures this.
  • The user can then use a mouse click to position the next trace point permanently. Then another trace point is displayed, until all of the trace points in the right mandible trace 2110 have been placed. This provides a quick way of defining the trace in the second image.
  • Figure 23 illustrates the placement of the trace point 2220 after the user has clicked the mouse. Once the right mandible trace 2110 has been completely defined, the right mandible trace can be projected onto any other calibrated patient image. That is, the right mandible trace 2110 can automatically be propagated to another calibrated patient image.
  • Figure 25 through Figure 40 illustrate various user interface features of the clinician/consultant 125. These figures illustrate how a user can access the patient specific model data, create and manipulate the patient specific model 160, and perform analysis, treatment, and the like on the model and the model data.
  • Figure 25 illustrates an .SCL file load window 2510 that can be used to load a .SCL file.
  • The .SCL file is the patient specific file 2520 that was generated by the sculptor 115.
  • Figure 26 illustrates the morphing interface 2600 that can be part of the clinician/consultant 125.
  • The patient image 750 is shown with a partially morphed patient specific model.
  • the morphing interface 2600 need not be used by the medical practitioner, but it does help illustrate the morphing process.
  • Figure 27 illustrates a morphed and texture mapped patient specific model view 2710. Here the patient specific model view has been rotated. Note that the photo image texture mapped onto the model is the patient image 750.
  • Figure 28 illustrates a wireframe view 2810 of the patient specific model 160.
  • The morphing interface 2600 allows the user to rotate the view of the patient specific model 160.
  • Figure 29 illustrates a dot contour view 2910 of the patient specific model 160. The dot contour view 2910 shows the points that are used to define the model. The points of the stock model are repositioned, according to the patient model data, to create the patient specific model 160.
  • The clinician interface 3010 is the interface that would normally be used by the medical practitioner when performing analysis, developing a treatment, or presenting information to the patient.
  • the clinician interface 3010 includes a patient specific model flesh view 3010.
  • Figure 31 illustrates a skull view of the patient specific model 160.
  • Figure 32 illustrates an example analysis that has been performed. Here an analysis has been selected and run, and the analysis window 127 shows the results of the analysis.
  • the patient image 1250 and the dot view of the patient specific model 3210 show the analysis lines 3230. Normally, the medical practitioner would have had to draw these lines on the x-ray image, and then measure those lines.
  • Figure 33 illustrates a partially planned treatment where an arch form template 3320 has been put into the jaw object dot display 3310. Importantly, the jaw object can be selected and manipulated separately from the rest of the patient specific model 160.
  • The user can place an arch form template 3320 and perform simulations of how the teeth may be moved. The user interface now includes a view of the arch form template 3320.
  • Figure 34 illustrates a jaw object solid display 3410 where a particular tooth has been selected (shown as tooth selected display 3420).
  • Figure 35 illustrates a
  • Figure 36 illustrates a view where the user has partially extracted and tilted the tooth. This could be used to show a patient what an extraction would look like.
  • Figure 37 illustrates the top view of this configuration.
  • Figure 38 illustrates the jaw object solid display 3410 where the tooth has been extracted.
  • Figure 39 illustrates another feature of the clinician/consultant 125 user interface.
  • the object display 3920 is used for positioning the slice planes (e.g., slice plane 3910).
  • the jaw object display 3930 shows the results of the slice plane 3910.
  • The clinician/consultant 125 user interface allows the user to position and orient the slice planes.
  • Figure 40 illustrates a partially transparent slice plane 4010 and a partially transparent view of the jaw object.
  • Appendix A: The following table shows the landmarks and traces used in the creation of the patient specific model 160. Other embodiments of the invention can use other landmarks and/or traces. The following list has been chosen because these landmarks and traces are well suited to the applications described here.
  • Tooth #43 root tip; LT_43d (43d) Tooth #43 distal interproximal contact; LT_43c (43c) Tooth #43 cervical; LT_44m (44m) Tooth #44 mesial surface interproximal contact; LT_44d (44d) Tooth #44 distal surface interproximal contact; LT_44r (44r) Tooth #44 root tip; LT_44b (44b) Tooth #44 buccal cusp tip; LT_44l (44l) Tooth #44 lingual cusp tip; LT_44c (44c) Tooth #44 cervical; LT_45m (45m) Tooth #45 mesial surface interproximal contact; LT_45d (45d) Tooth #45 distal surface interproximal contact
  • This line represents the patient's midline
  • Notations can be used after the tooth number for missing teeth, supernumerary (extra) teeth, pontics, etc.
  • ALL 1's are central incisors
  • ALL 4's are 1st bicuspids
  • ALL 8's are 3rd molars (wisdom teeth), etc.
  • ALL four deciduous cuspids (canines) (c); ALL four permanent lateral incisors (2); ALL four permanent central incisors (1)
  • The clinician/consultant 125 can use the following analyses.
  • Appendix E: The following describes a Roth analysis that can be performed using the clinician/consultant 125 and can be particularly helpful in tooth alignment.
  • This module has been designed to assist with patient analysis, treatment planning and patient education for orthodontists, dentists that perform orthodontics and oral surgeons that perform orthognathic surgery.
  • This module will provide the full range of analysis, modelling and treatment features currently expected with all existing 2D software packages.
  • this module will allow the soft and hard tissues to be accurately captured and photorealistically displayed and analyzed in three dimensions. Unique algorithms will then be accessed to perform a number of functions with the captured data.
  • The stock objects will be used for model matching and used as a template for rapid conversion of the patient input data to a 3-D model. These stock objects will include the jaws, teeth, and soft tissues of the face. The stock objects can be constructed to represent NORMAL for the modeled anatomy. In addition, other stock objects could be designed to provide a closer starting point for common types of anatomical variation or pathology. These variations may include size, sex and facial form (Angle's Class I, II and III). Stock objects will be constructed from a wire frame with a relatively small polygon count. The vertices of the wire frame can be strategically located to allow for the subsequent modifications that will allow for rapid customization to adapt to the patient's input data.
  • the stock objects can have a minimum # of "tie down points" that corresponds to "landmark locations".
  • the minimum # of tie down points on a tooth may include those that allow for rapid modification in height, mesiodistal and buccolingual width, and angulation.
  • the wire frame can be mutable.
  • the wire frame can possess a logical behavior among the neighboring wire frame intersects or vertices. That is, when the wire frame is mutated by the end user all of the intersects that are changed and their neighbors can respond in a logical fashion.
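  • The deformation rule itself is not specified, but the sketch below shows one plausible "logical" response: when a tie-down vertex is dragged to a measured patient landmark, neighboring vertices follow with a weight that falls off with distance. The linear falloff, radius, and coordinates are assumptions, not the patent's algorithm.

```python
# Sketch: a "logical" response of neighboring wire-frame vertices when a
# tie-down point is dragged to a patient landmark. Neighbors within a
# falloff radius follow the moved vertex with diminishing weight.
import numpy as np

def morph_vertices(vertices, tie_down_index, target, falloff_radius=15.0):
    """Move one vertex to `target` and let nearby vertices follow smoothly."""
    moved = vertices.copy()
    displacement = target - vertices[tie_down_index]
    distances = np.linalg.norm(vertices - vertices[tie_down_index], axis=1)
    # Weight 1 at the tie-down point, decaying to 0 at the falloff radius.
    weights = np.clip(1.0 - distances / falloff_radius, 0.0, 1.0)
    moved += weights[:, None] * displacement
    return moved

stock_tooth = np.array([[0.0, 0, 0], [2, 0, 0], [4, 0, 0], [0, 8, 0], [4, 8, 0]])
patient_landmark = np.array([0.5, -1.0, 0.0])     # a measured cusp tip, say
custom_tooth = morph_vertices(stock_tooth, tie_down_index=0, target=patient_landmark)
```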
  • Landmark groupings can be segmented and moved to a new location to simulate treatment or growth. The movement of these segmented landmarks can occur through data input or manual "drag and drop."
  • The input data can include photographs, x-rays, or a previously rendered patient 3-D data set.
  • the stock objects can have a spatial association with a data base.
  • the data base will record the 3-D spatial attitude of the stock object and will record modifications of the stock object.
  • the data base will track the landmark locations and any other changes that relate to the original polygon size and form.
  • Object-Oriented Data This is a feature that the average user of the software may not fully appreciate.
  • the MedScape product line deals with physical entities such as patients and anatomical structures of the face. It produces images from these objects, extracts measurements and produces models of them. It also produces such end- user data products as growth prediction and treatment plans.
  • the underlying data structure that can define and relate all these entities in a unified fashion is an object oriented database.
  • Typical examples of objects are a patient, a digital image, a specific mandible, the 3-D model of a specific mandible, a "normal" mandible, a treatment plan, etc.
  • the specific instances of these objects are stored in the database as rows of various tables, where each table represents the object class.
  • Each class is identified by its properties and methods (procedures that are applied to them).
  • Each software development team will concentrate on specific object classes assigned to it with the goal of producing class libraries that expose the properties and methods of each object to all development teams for final integration.
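  • As a loose illustration of such object classes and their properties, the sketch below defines a few classes whose instances could be stored as rows of class tables; the class names, fields, and the example method are invented for the sketch and are not the actual MedScape schema.

```python
# Sketch: illustrative object classes, each exposing properties and methods,
# with instances stored as rows of a table named after the class.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Patient:
    patient_id: int
    name: str
    age: int
    sex: str

@dataclass
class DigitalImage:
    image_id: int
    patient_id: int            # relates the image to its Patient row
    modality: str              # "photo", "x-ray", ...
    calibration_pose: Tuple[float, ...] = ()   # seven-DOF source parameters

@dataclass
class MandibleModel:
    model_id: int
    patient_id: int
    landmarks: List[Tuple[float, float, float]] = field(default_factory=list)

    def landmark_count(self) -> int:
        """Example method exposed by the class to other modules."""
        return len(self.landmarks)
```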
  • 3-D ACCURACY Although accuracy numbers for the so-called "nominal" conditions can be provided, the accuracy of position and orientation measurements made from one or more images of an object can vary significantly depending on a number of parameters.
  • The proposed software will include the necessary models and algorithms to compute these theoretical error bounds and provide them as part of the measurement results. For example, in the case of landmark position measurements, for each measured landmark the software outputs the ellipsoid that represents the error uncertainty in three dimensions. In this way the user is given a yardstick by which the accuracy of each measurement result can be judged.
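  • A minimal sketch of how such an error ellipsoid can be obtained once a 3x3 covariance of the landmark position is available: the covariance's eigenvectors give the axis directions and the square roots of its eigenvalues give the semi-axis lengths. The covariance values and the two-sigma scaling are illustrative assumptions.

```python
# Sketch: turning a 3x3 landmark-position covariance into an error ellipsoid
# (axis directions and semi-axis lengths for a chosen confidence level).
import numpy as np

def error_ellipsoid(covariance, n_sigma=2.0):
    """Return semi-axis lengths (mm) and axis directions of the ellipsoid."""
    eigvals, eigvecs = np.linalg.eigh(covariance)   # covariance is symmetric
    semi_axes = n_sigma * np.sqrt(np.clip(eigvals, 0.0, None))
    return semi_axes, eigvecs                        # columns are directions

cov = np.array([[0.20, 0.02, 0.00],
                [0.02, 0.10, 0.01],
                [0.00, 0.01, 0.60]])   # mm^2; e.g., larger uncertainty in depth
axes_mm, directions = error_ellipsoid(cov)
print(axes_mm)
```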
  • MODEL MATCHING The starting point for modeling an object from multiple images is to retrieve a "stock" or normal version of that object, which establishes correspondence of two or more points used for triangulation. This minimizes the user effort of designating the corresponding points in different views based on visual cues. This automation is achieved by taking into account the geometric constraints imposed by both the imaging system and the object being modeled.
  • 3D display refers to the mode of 3D visualization on a computer screen.
  • the reason MedScape was formed, is to give doctors a convenient, fast and user friendly way to gain accurate 3D information for diagnosis and treatment planning.
  • Today's "state-of-the-art", in orthodontics, orthognathic surgery and plastic and reconstructive surgery diagnosis, is two-dimensional. True three-dimensional visualization and manipulation of the 3D data set is essential for accurate diagnosis and treatment planning.
  • the 3D display allows for the visualization of the 3D data base (created from photos, models, X-rays, etc). This 3D visualization allows for 1) Perspective 3D viewing with shading, shadowing and monocular depth cues. 2) Straight on 3D Stereoscopic viewing and 3) Ability to view the 3D data set in a 45 degree 3D Stereoscopic viewing mode (allows for 50% more visual information).
  • The 3D display of the 3D data set can include the following information:
  • the user should be able to define a rotational pattern around one, two, or three axes, together or independently.
  • Animation in perspective 3D and in Stereoscopic 3D (Example: open/closed animation to evaluate deviation on opening, asymmetry, etc. animate mandibular movements associated with jaw tracking).
  • the 3D display should allow for user controlled transparency of facial soft tissue to show underlying teeth and skeletal structure relationship. Transparency should be controlled by a slide bar from 0% - 100% and have predefined 20%, 40%, 60%, 80%, for quick acquisition.
  • Lighting of the 3D data set should be predefined to give the best brightness, contrast, etc. Real time lighting changes should be possible to gain a better 3D view. This is especially important with 3D Stereoscopic viewing, where high contrast areas give poor results for the stereoscopic effect. In Stereoscopic mode the lighting should allow the stereo pairs to be lighted the same; differences in lighting of the two separate views create ghosting.
  • A reference plane should be available to show the drawing plane, etc.
  • Zoom, magnify, scaling, set axis, split objects, move segments, trim, and grasp objects should be available and user controlled.
  • The 3D software program should show the wireframe, quick render, and full render of the 3D data set. Also, a render window should be available to render only an area of interest.
  • The 3D display should use the photographs from which the wireframes are generated to create the photorealistic textures.
  • The camera settings should be predefined. Other settings can be included, such as scene camera, director camera, pan, tilt, roll, dolly, etc.
  • The 3D display should allow for import/export of model files (MDL, DXF, IGS, 3DS, other).
  • The 3D display should allow for facial topography visualization and measurement. Facial topography contours have certain patterns that differ among people considered "beautiful" vs. "ugly" vs. "normal": subtle differences in the nasal area, zygomatic area (cheek bone), lip contour, submental fold and chin area. Facial topography will be more evident in stereoscopic 3-D visualization. These are features that are used to describe beauty.
  • Stereoscopic 3D imaging allows all three planes of space to be viewed simultaneously. This is a clear and important difference between 3D Stereoscopic viewing and 3D perspective viewing.
  • When stereoscopic 3D visualization is added to motion parallax (such as rotation of the object), there is an enhancement of visual depth.
  • Any 3D visual information can be created in a 3D Stereoscopic mode to further enhance the visual ability to understand 3D relationships of anatomy. When motion parallax is also added, even greater visual depth information is present.
  • the software program can create the appropriate "stereo pairs" for 3D Stereoscopic viewing. Lighting (brightness, contrast, shadows, etc), can be controlled.
  • the software can create the appropriate parallax on the screen to create the stereoscopic image on the screen when viewed with the appropriate viewing lenses (anaglyph, polarized, field sequential).
  • Anaglyph viewing uses red and blue lenses so that each eye only sees the image it is supposed to see (see the anaglyph sketch after this group of items). There is some limitation on using colored images with the anaglyph mode.
  • Stereo viewing of angular, linear, planes, angles, points, and volume is important.
  • Full color can be done with anaglyph (synthonics) but problems do arise with red, blue and green colors that are part of the image. True full color is best seen with polarized or field sequential (LCD shutters).
  • Field sequential viewing can be at 60 or 120 Hz. The image flicker can only be eliminated at 120 Hz.
  • Another advantage of field sequential is that tracking devices can be incorporated to allow the viewer to visualize the 3D scene from multiple viewing angles. The multiple viewing angle is an advantage over the fixed viewing angle required by anaglyph or polarized viewing techniques.
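  • The following is a minimal sketch of composing a red/blue anaglyph from a left-eye and a right-eye render, as a stand-in for the stereo pair generation described above; the synthetic renders and the channel assignment are assumptions for the example.

```python
# Sketch: composing an anaglyph stereo pair, where the red channel carries
# the left-eye render and the green/blue channels carry the right-eye render.
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """Combine two grayscale renders into a red/blue anaglyph (H x W x 3)."""
    anaglyph = np.zeros(left_gray.shape + (3,), dtype=np.uint8)
    anaglyph[..., 0] = left_gray          # red channel: left-eye view
    anaglyph[..., 1] = right_gray         # green channel: right-eye view
    anaglyph[..., 2] = right_gray         # blue channel: right-eye view
    return anaglyph

left = np.zeros((480, 640), dtype=np.uint8)
right = np.zeros((480, 640), dtype=np.uint8)
left[200:280, 300:380] = 255              # object as seen by the left eye
right[200:280, 310:390] = 255             # shifted by the screen parallax
stereo_image = make_anaglyph(left, right)
```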
  • Landmark ID Landmarks can correspond with standard orthodontic landmarks, i.e., gonion, nasion, sella, pogonion, etc. These landmarks can be located in their normal position on the morphologically normal skeletal stock object and can be visually evident. These landmarks can be spatially associated to each other through the IMS data base functions.
  • the spatially calibrated cephalometric views (H & M Associate software) can be overlaid on the stock object.
  • the stock object will be customized to fit the cephalometric data by either drag and drop of a stock object landmark to the corresponding ceph. landmark (edited in all available projection views) or by digitizing the landmarks on the ceph and then making the association and customization from the stock object through the IMS data base.
  • non-landmark cephalometric data may also be associated with the stock objects.
  • The non-landmark data of most interest are the ridge crests (i.e., orbital rims, rim of the nasal fossa, external borders of the mandible, mandibular canal, sinus margins, etc.).
  • The stock objects provide a visual reference of the patient's anatomy and the landmarks are the data base portion of the patient's file that describe the features that are unique for that individual. Therefore, only the landmark locations need to be stored in the IMS data base.
  • the landmark locations can serve as a set of instructions for altering the stock objects.
  • Transmitting the patient landmark locations and customizing the stock object at the receiver is a more efficient method than transmitting a customized stock object.
  • Use the IMS data base to compile landmark location data to be used to establish normative data for 3D cephalometric analysis and for upgrading the stock model.
  • 2D Analysis and 2 D Normative Data A 2D orthodontic cephalometric analysis is based on comparison of the patients' data with 2D normative data bases that have existed for decades.
  • 2D normative data bases include: Burlington growth study, Bolton/Broadbent, Rocky Mountain Data Systems, Michigan Growth Study, to name a few.
  • 2D analysis include: Steiner Analysis, Downs Analysis, Ricketts, Tweed, Alabama, RMDS, Wits, Owens, etc.
  • 2D template analyses are normative 2D visualizations that are overlaid on the patient's data.
  • the 2D normative data can be adjusted for sex, age, race, size, etc. and created into a graphical representation (template) of normative data for visual comparison.
  • 3D Analysis & 3D Normative Data MedScape was founded on the premise to create, develop and offer 3D & 3D Stereoscopic software products to the medical and dental professions. MedScape products will give the doctor the ability to diagnose and treatment plan their patients with three-dimensionally accurate models of anatomy (face, teeth & bones).
  • 3D Normative Data This data will have to be developed through university research, as this information is limited at this time. Grayson's article in the AJO describes some 3-D growth patterns. Also, Rick Jacobson gives some 3-D data in Jacobson's book "Radiographic Cephalometrics". At this time, 3D analyses will have to be "projected" to a 2D format to compare to normative 2D data, since this is what exists at this time. There is some work being done in Australia and Canada on 3D MRI & ceph data.
  • 3D Analysis of Patient Data The traditional 2D landmarks, angles, planes, etc. can be viewed on the 3D model for comparison.
  • the 3D model will add the advantage of being able to view asymmetries of the right & left side of the face, teeth, and skeletal structure. This is a critical area that is not assessed in "traditional" 2D analyses.
  • The lingual concavity of the upper and lower incisors is related to the disclusion angle and the angle of the eminence. These should be congruent with each other.
  • These functional components of TMJ function and dysfunction are important concepts that are critical for proper diagnosis.
  • 3D analysis includes modeling of the critical anatomical areas and allowing the generic wireframes to adjust to overlay the patient's anatomy. A visual representation of "normal" can be overlaid over the patient's "abnormal" for direct comparison.
  • Custom Analysis The doctor will want to customize their analyses to include parts of various 2D & 3D analyses.
  • the doctor can define which components of each to include.
  • MedScape will allow the enduser to define, name, save and employ a custom analysis. This analysis can be implemented as a macro function.
  • Growth forecasting has always been a goal of cephalometric diagnosis and treatment planning since the early days. It became popular with Ricketts' introduction of RMDS growth forecasts.
  • Lyle Johnston has developed a template that estimates "normal” growth "averages" in children.
  • Burlington Growth Study is also available along with the Broadbent/Bolton study, Michigan study, & others. All of these are 2D.
  • 3D growth forecasting is yet to be developed and will be a critical area of study and development.
  • Time line tracking would allow the evaluation of progress over time.
  • Patients ALWAYS ask "When am I getting my braces off?"
  • Accurate 3D evaluation of cooperation and growth or surgical plans with photos would be a GREAT stride forward.
  • CEPHALOMETRIC LANDMARKS The software simplifies the operator task of designating landmarks and traces. For example, when tracing an intensity edge in an image, as long as the user maintains the pointer in the general vicinity of the edge, the software automatically finds the edge and traces it without relying on precise pointer movement by the user.
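  • One plausible way to implement that snapping behavior is sketched below: search a small neighborhood around the pointer for the pixel with the largest intensity-gradient magnitude and place the trace point there. The neighborhood size and synthetic test image are assumptions, not the patent's algorithm.

```python
# Sketch: snapping the pointer to the strongest nearby intensity edge while
# tracing. Searches a small neighborhood around the pointer for the maximum
# gradient magnitude.
import numpy as np

def snap_to_edge(image, pointer_xy, search=8):
    """Return the pixel with the largest gradient magnitude near the pointer."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    x0, y0 = int(pointer_xy[0]), int(pointer_xy[1])
    ys = slice(max(y0 - search, 0), y0 + search + 1)
    xs = slice(max(x0 - search, 0), x0 + search + 1)
    window = magnitude[ys, xs]
    dy, dx = np.unravel_index(np.argmax(window), window.shape)
    return (xs.start + dx, ys.start + dy)

image = np.zeros((200, 200), dtype=np.uint8)
image[:, 100:] = 180                      # a vertical intensity edge near x = 100
print(snap_to_edge(image, pointer_xy=(96, 50)))   # snaps onto the edge
```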
  • Patient Presentation Generic Presentation To demonstrate possible treatment options, patient education about orthodontics using a generic patient.
  • Custom Presentation Demonstrate possible treatment options and outcomes using the patient's 3D anatomy.
  • Arch Length Analysis and Tooth Size Discrepancy Analysis Arch length analysis is a critical diagnostic measurement, as it can determine diagnostic decisions of extraction of permanent teeth to correct certain orthodontic problems vs. non-extraction decisions.
  • The teeth must fit within the supporting bone (alveolar bone) of the upper and lower jaw structure.
  • the alveolar bone is the supporting bone that surrounds the roots of the teeth.
  • the basal bone is the main supporting structure for the jaws.
  • the basal bone of the lower jaw (mandible) is limited in size by its genetic potential and has limited ability for growth modification. There are possible growth modifications procedures, such as Functional Jaw Orthopedics that have some limited growth modification potential.
  • the basal bone supports the alveolar bone which supports the teeth.
  • the alveolar bone has the potential for adaptation to the positions of the teeth and can be modified as long as the teeth are kept within the limits of the basal bone.
  • the upper jaw has the capability of increasing its transverse dimensions via "rapid Palatal Expansion” appliances. These types of orthopedic appliances not only change the alveolar bone shape and the size but can also change the dimension of the maxillary basal bone dimension due to "sutures" that exist in the upper jaw.
  • the lower jaw does not have sutures associated with the mandibular skeletal structure.
  • The maxilla is therefore capable of being increased in size to allow for more room for crowded or crooked teeth to be aligned into a "normal" occlusal fit.
  • Extraction vs. non-extraction of decisions have traditionally been based on the space requirements of the mandible due to its inability to be changed significantly.
  • Significant arch discrepancy in the lower arch may require extraction of selected permanent teeth to resolve the crowding problem. The orthodontist can then decide which teeth can be removed in the upper jaw, if any, to create a "normal" occlusal fit of the teeth.
  • the teeth can fit into this ideal occlusion when the mandible is in a CR or CO position.
  • The Curve of Spee and the Curve of Wilson are three-dimensional relationships of the plane of occlusion when viewed from the lateral and frontal planes respectively.
  • the analyses of these relationships of the teeth also are included in the decision making process of the orthodontist as far as the extraction vs. non-extraction treatment decisions.
  • As the curves "level out," the teeth could be positioned where there is no bone support, leading to periodontal (gum) problems. Recession and/or alveolar bone loss could occur if not properly controlled mechanically.
  • the doctor can evaluate ALL 3D requirements of each arch, TMJ, bone configuration, etc. These include: 1. the sagittal dimensions (length), 2. the transverse dimension (width), and 3. The vertical dimension (height).
  • Tooth Size Discrepancy The size of the individual teeth, as they are positioned around the "catenary" type curve of the arch, takes up space. The relative sizes of each tooth type (molars, bicuspids, cuspids, incisors) must be interrelated appropriately or the occlusion of the teeth will not fit properly at the end of treatment. If a discrepancy exists in the relative sizes of certain teeth in the arch, then a so-called "Bolton Tooth Size Discrepancy" exists. This tooth size discrepancy can also affect the fit of the occlusion between the opposing arches.
  • Bolton tooth size discrepancies are created when there is a mismatch in the size of teeth within the respective arch. This creates a problem of alignment and proper fit of the occlusion. Knowing these discrepancies prior to treatment is critical for orthodontic diagnosis. Limitations in treatment need to be related to the patient as a part of their informed consent. Small lateral incisors, abnormal shape and form, and congenital absence are a few problems that create a compromised end result. Restorative dental procedures to correct some of these discrepancies need to be planned prior to treatment so the patient will be informed and expect follow-up care. Relapse of teeth after orthodontic correction is a major consideration in orthodontic therapy. Many elaborate treatment alternatives have been devised to control relapse. The ability to three-dimensionally diagnose and treatment plan a patient may lead to improved retention of orthodontically treated cases.
  • the Curve of Spee is a curve of the occlusal plane as seen from the lateral view.
  • The Curve of Wilson is the curve or construction of the occlusal plane as viewed from the frontal view.
  • The treatment of these two "curves" is important to the eventual final result of the occlusion.
  • Orthodontists usually "flatten" these curves during treatment for occlusal correction. Uprighting the Curve of Wilson can lead to increased arch length and help to gain space for crowded teeth, up to the limit of the alveolar bone, cortical bone, and basal bone.
  • Leveling the Curve of Spee is a routine orthodontic biomechanical effect of treatment.
  • Each tooth coordinate measurement represents a point in space.
  • The total arch circumference is the magnitude of the summation of all vectors connecting their points and is given by:

    $C_t = \sum_{i=1}^{N-1} \sqrt{(X_i - X_j)^2 + (Y_i - Y_j)^2 + (Z_i - Z_j)^2}, \quad j = i + 1$

    where $C_t$ represents the total arch circumference in 3D space and $N$ is the number of teeth measured.
  • The planar projection of the total arch circumference is calculated using a similar method, except that the depth coordinate ($Z_i$), i.e., the depth of Spee, is excluded. $C_p$ represents the planar projection of the total arch circumference onto a lateral 2D projected view.
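  • A small sketch of computing the total 3D arch circumference and its planar projection from per-tooth coordinates, following the formula above; the tooth coordinates are invented for the example.

```python
# Sketch: computing the total 3D arch circumference C_t and its planar
# projection C_p from per-tooth coordinate measurements.
import numpy as np

def arch_circumference(tooth_points, use_depth=True):
    """Sum of distances between consecutive tooth points (mm)."""
    pts = np.asarray(tooth_points, dtype=float)
    if not use_depth:
        pts = pts[:, :2]                   # drop the depth (curve of Spee) axis
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

teeth = [(-25.0, 0.0, 2.0), (-18.0, 14.0, 1.0), (-9.0, 22.0, 0.0),
         (0.0, 25.0, 0.5), (9.0, 22.0, 0.0), (18.0, 14.0, 1.0), (25.0, 0.0, 2.0)]
c_t = arch_circumference(teeth)                   # total circumference in 3D
c_p = arch_circumference(teeth, use_depth=False)  # planar projection
print(round(c_t, 1), round(c_p, 1))
```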
  • Asymmetry analysis defines the morphology differences between the right and left halves of the mandible, maxilla and other regions of the skeleton.
  • the symmetry of these structures should be determined through the use of landmark groupings.
  • the procedure may include determination of the sagittal plane midline of the patient by utilizing identifying midline landmarks.
  • the sagittal plane midline can be used to define the right and left halves of the patient.
  • the simplest symmetry analysis would be begin with the superimposition of the right and left halves of the mandible utilizing the sagittal plane midline reference as the registration plane.
  • the quantification of the asymmetry would be to compare the x, y, z differences in location of the corresponding right and left landmarks and to compare the location of these landmarks to the normal landmark location (by standard deviation).
  • This type of analysis would allow the end user to quantify the amount of asymmetry and direct the end user to the etiology of the asymmetry. For example, when the mandible is asymmetric then it is safe to assume that one side is too small or the contralateral side is too large. Comparison of the patient data with normal data would allow the clinician to determine which side was abnormal. Knowledge of the etiology of asymmetry may be critical in controlling or predicting the outcome of treatment.
  • Additional analysis may include a normalization of the 3-D data to the standard ABO 2-D views and performing an analysis using existing analysis models. Tools may be created to allow the end user to create a symmetry analysis.
  • the spatially calibrated cephalometric views can be overlaid on the stock object.
  • the stock object will be customized to fit the cephalometric data by either drag and drop of a stock object landmark to the corresponding ceph. landmark (edited in all available projection views) or by digitizing the landmarks on the ceph and then making the association and customization of the stock object through the IMS data base. A combination of the two customization methods can be used.
  • non- landmark cephalometric data may also be associated with the stock objects.
  • the non-landmark data of most interest are the ridge crests (i.e., orbital rims, rim of the nasal fossa, external borders of the mandible, etc.). These same methods may be employed for other stock objects, such as the teeth, TMJs, etc.
  • the stock objects are a graphical representation of normal. These normal values for landmark location have been determined through an analysis of the landmark locations on many patients (Burlington study) and have been sorted by age and sex of the patient. Deviations from normal can be analyzed and statistically grouped as a standard deviation from normal. Through the use of the IMS data base we can define normal and the standard deviations from normal for individual landmarks and landmark groupings.
  • the IMS data base will perform an assessment of landmark locations and groupings of landmarks by comparing the patient data to normal data through look-up tables (LUT) contained in the IMS data base. After this analysis the computer can highlight in color on
  • Gnathological Normal refers to the cusp fossa spatial relationships, the tooth to tooth relationships among and between the maxillary and mandibular teeth, and tooth position relative to the supporting alveolar and basal bone.
  • the tooth and its 3-D location and spatial orientation relative to the tooth long axis can be defined through tracking of landmarks located on the root apices or apex, cusp tip(s) or incisal edge and the mesial and distal greatest heights of contour. This specialized segmentation of teeth allows them to function as objects.
  • a database that represents gnathological normal teeth can be used when rendering the stock object teeth in combination with the skeleton. Deviations from the gnathological normal can be described in a similar fashion to the method used for cephalometric analysis.
  • a pseudo-colored visual display of the anatomy that falls outside the statistical normal will facilitate a quick identification of abnormal tooth position, etc.
  • the airway can be divided into the nasal airway, the nasal pharynx and oropharynx.
  • the nasal airway begins at the anterior opening of the nasal passage and ends at the posterior margin of the nasal fossa.
  • the nasopharynx begins at the posterior margin of the nasal fossa and ends at the most inferior area of the soft palate.
  • the oropharynx begins at the inferior margin of the soft palate and ends at the superior margin of the valecula.
  • An airway analysis includes a mathematical description of nasal septum symmetry about the mid-sagittal plane.
  • Three basic imaging software modules (Sculptor, Clinician and Executor) comprise the Acuscape suite of software designed for medical use.
  • the Sculptor is used at an image processing center (server) and passes the acquired images and measurement files (.sci files) to the Clinician user (client) for the generation of the .pro file and subsequent use.
  • Sculptor Module Images are acquired directly into a patient session file from input devices that include digital cameras, flat bed scanners, x-ray sensors, etc., or from image storage files. An Acuscape image calibration frame is worn during image acquisition and shadows of the calibration markers are captured in the acquired images.
  • the images are first spatially calibrated and a patient centric co-ordinate system is constructed.
  • This co-ordinate system is transferred to the images.
  • This co-ordinate system is adjusted or optimized to best fit the patient's anatomy. Part of this adjustment aligns the y-z plane of the co-ordinate system with the patient's mid-sagittal plane.
  • the subsequent measurements store data utilizing this constructed co-ordinate system.
  • the calibrated images can be stored by the executor or displayed and measured. Multiple images or image sets can be
  • measurements can be performed as point, closed loop trace and linear trace measurements.
  • the measurement routine occurs simultaneously on all images
  • the selected image is measured and a corresponding
  • epipolar line is constructed on the adjacent images to assist with locating the corresponding point.
  • the x, y and z locations of all of the measurement points and lines (series of points) are stored in a measurement file.
  • measurement files are converted to an export file that contains all .jpg images and the associated .sci file
  • the .sci files contain the calibration information, camera parameters and the x,y and z locations of all traces and landmarks.
  • Cross calibration refers to calibrating multiple images and image types to the same 3D co-ordinate system. These images can include but are not limited to photographs, x-rays, MRIs and the like.
  • the Sculptor will be used to calibrate and measure the images (spatially, color or gray scale value).
  • Executor Module This module works in the background to manage images for the Sculptor and Clinician modules. This is a patient centric relational data base.
  • Patient file transport files contain the .jpg images and an .sci file.
  • the .sci file is
  • the module is intended to exist primarily in the doctor's office.
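The arch circumference quantities C_t and C_p referred to in the list above can be computed directly from the stored per-tooth coordinate measurements. The following is a minimal sketch, assuming the x, y, z measurements are available as an ordered array; the coordinate values shown are illustrative placeholders, not patient data.

    import numpy as np

    def arch_circumference(points):
        # Sum of straight-line distances between successive tooth points;
        # points is an (N, d) array ordered around the arch (d = 3 or 2).
        diffs = np.diff(points, axis=0)
        return float(np.sum(np.sqrt(np.sum(diffs ** 2, axis=1))))

    # Illustrative tooth coordinate measurements in mm (placeholders only).
    tooth_points = np.array([
        [-22.0,  0.0, 2.0], [-15.0, 12.0, 1.0], [-6.0, 18.0, 0.5],
        [  6.0, 18.0, 0.5], [ 15.0, 12.0, 1.0], [22.0,  0.0, 2.0]])

    c_t = arch_circumference(tooth_points)          # total arch circumference in 3D
    c_p = arch_circumference(tooth_points[:, :2])   # planar projection, depth excluded
    print(c_t, c_p, c_t - c_p)                      # difference reflects the depth of Spee

The same routine serves both measures because the planar projection simply drops the depth coordinate before the distances are summed.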

Abstract

A system for relating images and generating a 3D model of anatomy is described. This can be accomplished using three modules. A sculptor module (115) is used to spatially relate images of a patient. The images are generated using a number of different techniques, such as optical and x-ray. The sculptor allows a user to identify the location of different anatomical points in each of the images. Thus, a sculptor allows a user to relate different anatomical points to each other in a 3D space and also relate the points to the images. The clinician module (125) uses the related points to modify or customize a stock model (e.g., a standard anatomical 3D model). The customized model that is created corresponds to a 3D model of the patient's anatomy.

Description

Method and Apparatus for Generating 3D Models from Medical Images
Copyright Notice A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the
facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all
copyright rights whatsoever.
The Field of the Invention This invention relates to the field of medical imaging. In particular, the invention relates to the generation and use of three-dimensional medical images
and models.
Background of the Invention Efforts to represent images in three-dimensional form go back to the invention of the stereoscope in the 1800's. These attempts continued through
the 1950's when 3-D movies (characterized by the red and blue 3-D glasses which served as color filters to separate left and right images) were briefly
popular. With the advent of modern computer technology, some companies have engaged in considerable efforts to capture and reproduce three-dimensional
information. Typically, three-dimensional information about a scene has been represented using a number of selected points and storing information about each point such
as its color, intensity and distance from the camera. For example, a Cyberware
scanner generates such information by rotating a camera around an object to be modeled and capturing that information at particular points. For a high
resolution model, many rotations about the object are required to capture the model information. The vertical displacement of the plane of rotation is decremented by a resolution amount after each rotation. Such models, while
accurate, result in a huge number of points at which information must be
captured and represented.
Some portions of objects, however, do not require such high resolution. As
a result, model builders will frequently manually remove unneeded vertices to simplify the processing required when displaying and manipulating a three-
dimensional model.
Such three-dimensional models are typically rendered as wire frames. That is, a plurality of points are identified, corresponding to the points at which image information is captured and the points are displayed together with lines
connecting each point with adjacent points. When models are displayed in this
manner, they are typically called wireframes because the lines between the points appear to constitute a wire mesh. The individual points in such a wireframe are
frequently called vertices because they frequently appear at the vertex of the angles formed by lines going to adjacent points.
Software is known for constructing and manipulating three-dimensional
models. An example of such software is 3-D Studio Max™ by Autodesk, Inc.
Typically, such software packages have the capability to render, or provide a surface texture over the surface of the wireframe model. A number of attempts have been made to standardize the representation of
three-dimensional information. Of current popularity is the Virtual Reality Mark-up Language (VRML) found with some frequency in an Internet context.
Wireframe models are commercially available from a variety of sources. In the medical area, magnetic resonance imaging (MRI), and other imaging technologies, can accurately display two-dimensional slices of a patient. The
slices include extremely large amounts of data. Some programs allow doctors to construct 3-D models from these 2-D images. But this process requires a large
amount of data processing. The resulting 3-D models are accurate in that they
describe exactly what is in the images, but the models do not have any tie to human anatomy. For example, an irregular shape in a 3-D model of a skull may be a tumor, but the system does not relate this additional information to the
shape. Even worse, the 3-D model does not even have information indicating
that the shape is within a skull. The doctor is responsible for making these determinations. Thus, the 3-D models have limited use. Another problem with
many of these systems is that they do not allow the doctor to build the model
using combined or a range of imaging technologies (e.g., x-rays, MRIs and
photographs). Thus, the models are typically defined using only one imaging technology.
A number of problems exist with the existing technology. First, a high degree of technical expertise is required to create and manipulate three-
dimensional models. Further, computer processing time is significant and, as a result, special purpose machines, such as those produced by Silicon Graphics,
Inc. are commonly used to generate and manipulate three-dimensional models. The user interfaces of available commercial software for dealing with three- dimensional models are highly technical and generally unsuited for use by a
person whose specialty is not in the computer sciences. Also, the 3-D models are not related to medical information about a patient (e.g., a shape in a 3-D
model is only a shape, there is no information that the shape is a tumor or body part). Also, some technologies do not allow doctors to build models from
different types of images (e.g., x-rays, MRI's, and photographs).
Therefore what is desired is an improved 3-D modeling and generation system.
Summary of the Invention The following summarizes various embodiments of the invention. One aspect of the invention is directed to providing a generic software tool
for creating and manipulating three-dimensional models for medical applications. In one embodiment, a number of modules are used to achieve this result.
These modules include a Sculptor module, a Clinician module and an Executor
module.
The Sculptor module maps all acquired images, including those from
disparate sources, into a single 3D matrix or database. The images are generated
using a number of different techniques, such as optical and x-ray. The Sculptor allows a user to identify the location of different anatomical points in each of the
images. Thus, the Sculptor allows a user to relate different anatomical points to each other in a 3-D space and also relate the points to the images.
The Clinician/Consultant module uses the related points to modify or
customize a stock model (e.g., a standard anatomical 3-D model). The
customized model that is created corresponds to a 3-D model of the patient's anatomy. The model is "smart" in that when certain changes are made to the dot
or vertex location of the model, the remainder of the model can be adjusted or
morphed to make corresponding changes. Additionally, objects in the model
know what part of the anatomy they represent. For example, an object representing the patient's tooth is associated with data indicating that the object
is a tooth. This allows for analysis of a patient's anatomy to be performed automatically. The Clinician/Consultant is a database query tool that allows for display or visualization of the anatomy and function, manipulation of objects for treatment planning and model analyses.
A third module, called the Executor, is a database that provides overall system file and image management and coordinates the Sculptor module and the
Clinician/Consultant modules.
The various features of the invention are illustrated in the context of an
application to Orthodontics. In this application, the stock model is a model of
the skull and associated facial soft tissues, including the upper and lower jaws. In the examples shown, this model has approximately 300 objects which can be
manipulated in the Clinician module to facilitate the kinds of tasks routinely undertaken by an orthodontist.
Some embodiments of the invention include the functionality of some or all of the above modules. For example, in some embodiments, only a subset of the
functions performed by the Sculptor are included (e.g., the ability to define related points in multiple images).
Other embodiments of the invention include a method and apparatus for
performing medical analysis of the patient's 3-D model.
Although many details have been included in the description and the figures,
the invention is defined by the scope of the claims. Only limitations found in
those claims apply to the invention. Brief Description of the Drawings The figures illustrate the invention by way of example, and not limitation. Like references indicate similar elements.
Figure 1 illustrates a computer system including one embodiment of the
invention.
Figure 2 illustrates an architecture of the software used in one embodiment
of the invention.
Figure 3 illustrates capturing images for use in the system.
Figure 4 and Figure 5 illustrate the calibration frame.
Figure 6 illustrates an example method of calibrating images, generating a patient specific model, and performing analysis from the calibrated images and the patient specific model.
Figure 7 through Figure 24 illustrate user interfaces for the sculptor
application. Figure 25 through Figure 40 illustrate user interfaces for the clinician
application.
The Description
Definitions
The following definitions will be helpful in understanding the description. Computer - is any computing device (e.g., PC compatible computer, Unix
workstation, handheld device etc.). Generally, a computer includes a processor
and a memory. A computer can include a network of computers. Handheld Device (or Palmtop Computer)- a computer with a smaller form factor than a desktop computer or a laptop computer. Examples of a handheld device include the Palm III™ handheld computer and Microsoft's palm sized
computers. User - any end user who would normally wish to retrieve information from
the World Wide Web.
Internet - is a collection of information stored in computers physically located throughout the world. Much of the information on the Internet is organized onto electronic pages. Users typically bring one page to their
computer screen, discover its contents, and have the option of bringing more pages of information.
Client - a computer used by the user to make a query.
Server - a computer that supplies information in response to a query, or performs intermediary tasks between a client and another server.
World Wide Web (or Web or web) - is one aspect of the Internet that
supports client and server computers handling multimedia pages. Clients use
software, such as the Netscape Communicator® browser, to view pages. Server
computers use server software to maintain pages for clients to access.
Program - a sequence of instructions that can be executed by a computer. A
program can include other programs. A program can include only one
instruction.
Application - is a program.
The detailed descriptions which follow may be presented in terms of program
procedures executed on a computer or network of computers. These procedural descriptions and representations are the means used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
A procedure, program or application, is here, and generally, conceived to be
a self-consistent sequence of steps leading to a desired result. These steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals
capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common
usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and
similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities.
Further, the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations
performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein
which form part of the present invention; the operations are machine operations.
Useful machines for performing the operation of the present invention include
general purpose digital computers or similar devices. The present invention also relates to apparatus for performing these
operations. This apparatus may be specially constructed for the required
purpose or it may comprise a general purpose computer as selectively activated
or reconfigured by a computer program stored in the computer. The procedures
presented herein are not inherently related to a particular computer or other
apparatus. Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove more convenient
to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.
System
Figure 1 illustrates a computer 110 that can be used to carry out the
invention. The following paragraphs first list the elements of Figure 1, then describe how they are connected, and then define those elements.
Figure 1 includes a computer 110, a sculptor 115, a clinician/consultant 125, and an executor 135. The sculptor 115 includes a display of a user interface having a number of patient images 150 that also show a calibration frame 140.
The clinician/consultant 125 includes a similar user interface that includes a view of a patient specific model 160 and an analysis window 127. The analysis
window 127 includes an example analysis 170. The executor 135 includes image
data 137 and patient model data 139.
This paragraph describes how the elements of Figure 1 are connected. The sculptor 115 and clinician/consultant 125 communicate with the executor 135.
The sculptor 115 and the clinician/consultant 125 can extract and manipulate information from the image data 137 and the patient model data 139 through the executor 135. The following paragraphs describe the elements of Figure 1 in greater detail.
How these elements are used to generate a 3D model of a patient's anatomy is described in relation to Figure 2.
The computer 110 represents a computer system upon which the sculptor 115, the clinician/consultant 125, and the executor 135 can execute. The computer 110 is representative of a standard personal computer such as is
available from Dell Computers, Inc. Of course any number of different types of
computers could be used as the computer 110. What is important is that the computer 110 has some sort of processor and some memory.
As an alternative to the system of Figure 1, the sculptor 115 and the executor 135 may run on one computer at one time, while at another time the
clinician/consultant 125 and the executor 135 can run on another computer. Alternatively, all three programs can run on different computers.
The computers can be linked together by a network such as the Internet. The sculptor 115 represents a computer program in which a number of different types of patient images 150 can be calibrated using the images of the
calibration frame 140. Note that the patient images 150 are from multiple
sources. In particular, in this example, an x-ray image and two photographs are
shown. The sculptor 115 allows a technician to calibrate the images and identify
a number of anatomical locations in the images.
The patient images 150 can be extracted from the image data 137. The
image data 137 can be imported from an external source either by transmission
over a network or by scanning of x-ray or optical images, for example. Other embodiments can include direct capture of x-ray images or other types of media
images.
Alternatively, the image data 137 need not be retrieved from the executor 135. The image data may be directly imported into the sculptor 115
and then, later on, possibly be stored in the image data 137.
The calibration frame 140 is an apparatus that includes a number of
calibration targets that can be seen in the patient images 150. The calibration frame 140 is worn by the patient during the capturing of the patient images.
The patient model data 139 represents the data generated by the sculptor 115
that can be used to morph the patient specific model 160 and any other
information that would be important to patient records. This output of the sculptor 115 can be included in the form of two transport files (the .sci file and the .cln
file). The executor passes these files to the clinician/consultant.
Turning to the clinician/consultant 125, the data from the sculptor 115 is
used by the clinician/consultant 125 to morph a stock anatomy model into a
patient specific model 160. The stock anatomy model is a 3D model of a standard
person's anatomy (e.g., a skull possibly having flesh). The clinician/consultant 125 morphs the stock model into the patient specific model, allowing users to
visualize what a patient's anatomy looks like. Simulations of treatment plans can be shown in the clinician/consultant 125. Also, because the patient specific data
137 defines the relative location of a number of parts of the patient's anatomy,
the clinician/consultant 125 can be used to perform various types of analyses on
the patient's anatomy. The results of these analyses can then be displayed on the patient images as well as in the example analysis window 127.
In the example, which is described throughout, the particular model used will be a stock model of a human skull. The human skull can be used by an
orthodontist in planning for, and carrying out, a treatment plan for a particular
patient. In this example, the model will have a number of objects including objects corresponding to each of the patient's teeth, the jaw, and other elements
of the skull. Importantly, each of these objects can be manipulated individually in the clinician/consultant 125.
An example of a stock model that may be used is one from Viewpoint Data Labs which is specifically created for orthodontic applications. A full custom
stock model can also be used. The stock model represents the average structure of a piece of anatomy. The Executor (database) will compile normative stock models to match patient demographics of age, race, sex and body type. The
stock model has a coordinate system where each point is referenced to another
point within itself. The information retrieved in the sculptor 115 allows that
stock model to be morphed according to the dimensions and measurements from the sculptor 115. Examples of the individual objects available in the generic
stock model, and the resulting patient specific model 160, include individual teeth, the jaw, and the skull. Each of these objects has a separate coordinate
system which is referenced to the coordinate system in the patient specific model 160. In this way, when a particular object is selected, one may manipulate
that object and change its relationship to the global reference system of the patient specific model 160. Other aspects of the invention are some of the new user interface features presented in the sculptor 115 and/or the clinician/consultant 125. These new
user interface features will be described in greater detail below.
Figure 2 illustrates the various responsibilities of each of the three programs of Figure 1.
The sculptor 115 is responsible for input/output control of the patient images 150. The sculptor 115 allows for a calibration between the various
images. 3D measurements of various locations defined in the calibration process
can then be determined. The sculptor 115 includes a graphical user interface for performing the various features of the sculptor 115. The viewer supports the viewing of 3D models (useful where a piece of anatomy needs a more detailed
identification).
The model matching allows a user to match portions of the stock model to points on one or more patient images 150. In addition, model matching includes
the ability to spatially match or register two models of the same patient at
different points in time. Thus, areas of a model that are not already predefined in
the sculptor 115 can be defined. These new locations can then be used for a more
accurate morphing of the particular part of the anatomy of interest, will
facilitate a morphological comparison of two patient specific models and will facilitate the comparison of patient specific models to normative data. For
example, if one part of a patient's anatomy requires specialized treatment, a more
detailed patient specific model 160 may be desired. In such a case, the sculptor 115 allows the user to identify the location of previously undefined points of the stock model in the patient images 150. The executor 135 takes responsibility for the database storage and
transferring of the image data 137 and the patient model data 139. The executor 135 includes Internet access for communicating with one or more
sculptors 115 and one or more clinician/consultants 125. The executor 135 also has encryption capabilities to protect the security of the information stored by
the executor 135.
The clinician/consultant 125 includes the following functions. Diagnosis,
treatment planning, predictions, analyses, and metrics, are all examples of the type of functions that can be performed on the patient specific model 160 and
patient model data 139. Examples of these areas are described in greater detail below. The clinician/consultant 125 also keeps track of the stock objects, or
stock models, that may be used in the morphing processes.
The clinician/consultant 125 includes a graphical user interface and viewer
for viewing patient information, the 3D model, analysis, etc.
The morph editor is used to modify any morphing that is done to generate
the patient specific model 160.
The simulator simulates the motion of objects within the patient specific
model 160. The simulator could also be used to simulate the predicted motion of
objects in the patient specific model 160. Thus, the simulator can be used to
simulate the movement of a jaw, or treatments such as the straightening of teeth. Image Capture Example
Figure 3 illustrates example relationships between a patient, a camera and other imaging technologies, for the purpose of capturing images. The captured
images can then be imported into the sculptor 115. A patient 300, a camera 310 and an x-ray device 320 are shown in Figure 3.
The camera 310 and the x-ray device 320 can be used to capture the image data
137. These captured images can be all from one camera, x-ray machine, or the
like. However, some of the more important features of the invention can be realized when mixed modes of image capturing are combined. In this example, two devices, the camera 310 and the x-ray machine 320, are used to capture
image data about the patient 300 from multiple vantage points. Other example modes of image capture include MRIs, ultrasound imaging, infrared imaging, and
the like. The camera 310 and x-ray 320 are merely symbolic of the fact that
images of the patient are captured using various types of imaging technology. In the orthodontic application, it is desirable to have both x-ray (both skeletal
and soft tissue) and optical modes for capturing images of the patient 300.
Although the camera 310 and x-ray device 320 are shown in the plane of Figure 3, this is not necessary, and in fact, may not be desirable. In the example
of an orthodontic application, the preferred x-ray images would include a frontal
image, a lateral image, and a frontal image with the head tipped back. The
preferred photographic images may include a frontal image and two lateral images. Example Calibration Frame
Figure 4 illustrates a front view and a side view of a calibration frame 140 that may be used in some embodiments of the invention. Images of this calibration frame 140 appear in the patient images 150. The images of the
calibration frame 140 can then be used in the sculptor 115 to calibrate the
various images.
The calibration process includes recording the anatomy with the calibration
frame in place using any number of imaging modalities. The 3D locations of the calibration markers and an associated co-ordinate system are included as a priori
knowledge within the sculptor. Through calibration of each imported image, the sculptor computes the location of the imaging sources as a point source with seven degrees of freedom (DOF). Seven DOF includes the x, y, z, yaw, pitch,
roll and focal length of the imaging source. The calibration process maps the associated images into the 3D matrix associated with the calibration frame. Two
or more calibrated images, through a process of triangulation, can be used to
determine the 3D location of any associated points on the image sets.
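The triangulation step mentioned above can be sketched as a least-squares intersection of the back-projected viewing rays. The ray origins and directions are assumed to come from the calibration of each image; the numbers below are illustrative placeholders, not output of the actual calibration.

    import numpy as np

    def triangulate(ray_origins, ray_dirs):
        # 3D point closest, in the least-squares sense, to all calibrated viewing rays.
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for o, d in zip(ray_origins, ray_dirs):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)   # projects perpendicular to the ray
            A += P
            b += P @ o
        return np.linalg.solve(A, b)

    # Two illustrative rays, one per calibrated image (placeholder values).
    origins = np.array([[0.0, 0.0, -900.0], [600.0, 0.0, -700.0]])
    dirs = np.array([[0.01, 0.02, 1.0], [-0.55, 0.02, 1.0]])
    print(triangulate(origins, dirs))

Two rays are the minimum; additional calibrated images simply add more terms to the same system, which is why a point identified in two or more images can be located in 3D.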
The calibration frame 140 can include a top strap 405, a strap 410, an
adjustment knob 420, and a plexi-glass frame 430. The top strap 405, the strap 410, and the adjustment knob 420, work together to keep the calibration frame
in a substantially fixed position on the patient 300. Thus, when the images of the patient are captured from the various modes, a common reference frame is
established.
The following describes the calibration frame 140 in greater detail. The strap
410 is designed to encircle the patient's 300 head. The top strap 405 is designed
to prevent the strap assembly from dropping too far down on the patient's head.
In some embodiments, the top strap 405 and the strap 410 are part of headgear normally associated with a welding visor from which the face shield has
been removed.
One of the problems with the welding headgear, by itself, is that it flexes in
ways that are undesirable for image capture. This prevents a good common
reference frame from being established. Accordingly, a rigid plexi-glass frame 430 is used to mount a number of calibration targets 440. The circumference of the strap 410 is adjusted using a ratchet and a knob 420. They can be used to
adjust the amount of overlap between the ends of the strap 410.
The calibration targets 440 provide measurement references during the calibration of the various patient images 150. The calibration targets 440 include
a number of spherical shapes, possibly having substantial x-ray attenuation
properties. However this is not a requirement, and may not be desirable depending on the imaging technology that is being used to capture the patient
images 150. Alternative embodiments can include different materials such as cod liver oil capsules as calibration targets 440. Alternative embodiments can also
include different shapes of calibration targets 440, such as crosses. Although
BBs, such as shot gun pellets, could be used, it is preferred to use bearings
because their spherical shape is held to a closer tolerance. A characteristic of the
calibration targets 440 is that they are visible in both optical and x-ray images. However, what is important with respect to the calibration targets 440 is that they provide a fixed reference frame by which patient images 150 can be calibrated. Thus, they should be viewable in each of the patient images 150.
Importantly some of the calibration targets 440 can be of different types of materials such that some of the calibration targets 440 appear in some of the images while others of the calibration targets appear in others of the images. As long as enough of the calibration targets 440 are visible in enough of the images,
calibration can be performed.
Additionally, a single calibration target could be made of different materials.
For example, a cod liver oil capsule calibration target could be positioned very close to a crosshair. The crosshair would indicate the position of the calibration target in
the photographs, while the cod liver oil capsule would indicate the position of
the calibration target in MRI images.
Generally the calibration targets 440 are positioned in the calibration frame 140 such that it is unlikely that in any one image the calibration targets will
overlap to any great extent. It is also preferable that at least four of the
calibration targets are visible from each image perspective. Thus the shape of the calibration frame for holding the calibration targets 440 may vary from
medical application to medical application. Thus multiple different calibration
frames can be supported in the sculptor 115. Importantly, when the images have been captured, a session folder is created in which to store the images from the
session, and the patient data are stored in a patient folder of the file management system operated under control of the executor 135. The type of calibration
frame 140 used can also be stored with that information.
The attachment 450 (also referred to as an appliance) represents another way in which calibration targets can be included in images of the patient 300. For
example, where a particular image is restricted to a small area of the patient's head, the calibration attachment 450 can still be used to calibrate. For example, where an x-ray image is collimated to only focus on a smaller portion of the
patient's face, the calibration targets 440 in the attachment 450 would still appear
in that x-ray image.
Figure 5 illustrates another embodiment of the calibration frame 140. In this
example, the plexi-glass frame 430 has a number of bends instead of the continuous curve shown in Figure 4. This facilitates the attachment of calibration attachments 450 to the calibration frame 140.
Figure 5 illustrates a top view 502, a front view 504 and a cross section view 506 of the calibration frame 140. The front view 504 and the cross section view
506 illustrate how attachment sites 530 can be included on the calibration frame 140. (Note, the top view 502 does not illustrate the attachment sites 530,
but the sites may be viewable from the top view 502.)
The cross sectional view 506 illustrates how an attachment can be attached
to an attachment site 530 using a captive knurled thumb screw 550. The
attachment 450 can be stabilized using dowel pins 560. This example illustrates an acrylic appliance (attachment 450).
Importantly, Figure 5 illustrates an acrylic appliance support that can be used
as part of the calibration frame 140. This is merely illustrative of how appliances
or attachments 450 could be attached to the calibration frame 140. What is important is that there is some way to include calibration targets 440 in different images that may not include the calibration frame 140.
Example Method of Creating and Using Patient Specific Data
Figure 6 illustrates one embodiment of the invention where the programs of
Figure 1 are executed on one or more computers 110. In this example, the
patient images 150 are calibrated to generate the patient specific model 160. This information is also used to perform a number of analyses on the patient
model and related data. Figure 6 can be broken down into three general processes: a capturing of the patient specific data 602, generating the patient model 604, and performing analyses and related work on the patient model and
related data 606.
Starting with the capturing of the patient specific data 602, at block 610, the
calibration frame 140 is mounted on the patient's 300 head. This can be done by a technician at a medical facility. The top strap 405, the strap 410 and the
adjustment knob 420 can be used to snugly fit the calibration frame 140 to the
patient's 300 head.
Next, at block 620, a number of different images of the patient are captured. Importantly, the calibration frame 140 and/or the attachments 450 are included in
these images. Next, at block 630, the sculptor 115 is used to import all the image data 137.
This image data 137 is now calibrated using the image information of the calibration frame 140. In particular, each patient image 150 is associated with a
calibration frame template (a computer representation of the calibration frame 140). A user will match a calibration frame template up with the image of the calibration frame 140 in each of the patient images 150. This tells the computer 110 how the calibration frame 140 is oriented and positioned in the
image. Thus, all the points within the image can now be associated with, or referenced to, the calibration frame 140.
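One way to compute how an imaging source sits relative to the calibration frame is a standard pose-estimation (resection) step over the known marker geometry. The sketch below uses OpenCV's solvePnP with an assumed, fixed focal length; the calibration described in the text additionally recovers the focal length as a seventh degree of freedom, and the marker coordinates here are illustrative placeholders rather than the real frame geometry.

    import numpy as np
    import cv2

    # Known 3D coordinates (mm) of six calibration targets in the frame's own
    # coordinate system (illustrative values only).
    targets_3d = np.array([[-60.0,  40.0,  0.0], [60.0,  40.0,  0.0],
                           [-60.0, -40.0,  0.0], [60.0, -40.0,  0.0],
                           [-30.0,   0.0, 30.0], [30.0,   0.0, 30.0]])

    # Assumed pinhole intrinsics for the imaging source.
    K = np.array([[1800.0, 0.0, 320.0], [0.0, 1800.0, 240.0], [0.0, 0.0, 1.0]])

    # Simulate where the targets appear for one pose, standing in for the
    # user's matching of the template to the imaged targets.
    rvec_true = np.array([0.1, -0.4, 0.05])
    tvec_true = np.array([5.0, -10.0, 900.0])
    targets_2d, _ = cv2.projectPoints(targets_3d, rvec_true, tvec_true, K, None)

    # Recover the pose of the imaging source relative to the calibration frame.
    ok, rvec, tvec = cv2.solvePnP(targets_3d, targets_2d, K, None)
    R, _ = cv2.Rodrigues(rvec)
    camera_position = (-R.T @ tvec).ravel()   # imaging source in frame coordinates
    print(camera_position)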
As noted above, the calibration process involves calibrating locations relative
to the position of the calibration frame 140. As part of this process, it is
convenient to define a coordinate system for a particular patient. This coordinate system can then be mapped into the various views of the patient
images 150. For example, a first plane may be defined that is approximately parallel to the plane of the patient's pupils. A y-plane can then be defined through the
mid-sagittal plane of the patient. The third plane can be determined from the cross product of the other two planes.
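A minimal sketch of constructing such a patient-centric coordinate system from measured landmarks is shown below. The choice of landmarks (the two pupils and two midline points) is illustrative; any consistent pupil-plane and mid-sagittal references could be substituted.

    import numpy as np

    def patient_axes(right_pupil, left_pupil, midline_upper, midline_lower):
        # The x axis runs between the pupils, the y axis lies in the mid-sagittal
        # plane, and the z axis is the cross product of the other two.
        x = right_pupil - left_pupil
        x = x / np.linalg.norm(x)
        y = midline_upper - midline_lower
        y = y - np.dot(y, x) * x          # remove any component along x
        y = y / np.linalg.norm(y)
        z = np.cross(x, y)
        return np.stack([x, y, z])

    # Illustrative landmark coordinates in mm (placeholders only).
    axes = patient_axes(np.array([30.0, 0.0, 0.0]), np.array([-30.0, 0.0, 0.0]),
                        np.array([0.0, 40.0, 5.0]), np.array([0.0, -40.0, 8.0]))
    print(axes)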
Next, at block 640, a number of anatomic locations in each of the images are
identified. Examples of this process are described below. What is important, however, is that a set of all the anatomic locations in the patient images 150 is
defined. Appendix A includes a list of those locations.
An example of identifying anatomic locations (block 640) would include such things
as identifying the locations of the ears, a trace of the jaw, and the various points on specific teeth.
The set of anatomic locations that need to be defined is dependent upon what
stock model is to be used and how well the resulting morphed patient specific model 160 should match with the patient's exact anatomy. For example, if only a portion of a skull is to be modeled, only those anatomical locations associated
with that portion of the skull need be identified.
Different medical applications may have different specific features of interest that are identified in the identification of anatomic locations. Orthodontists, for
example, tend to concentrate on landmarks in the skull. These landmarks tend to be points concentrated on the mid-sagittal plane, teeth, and jaw. The ability to identify these landmarks and cross correlate them in the various patient
images 150 is an important feature for specific medical applications. However, the specific features and anatomic locations that are important to a particular
discipline will vary from application to application. Once the calibration frame has been used to calibrate the various patient images 150, however, all images and anatomic locations can then be referenced to that calibration frame. This
information can then be stored in the transport file (.sci file).
The following describes an example way of identifying a landmark in more than one image. It is important to identify the landmark location in multiple
views to completely determine the 3D co-ordinates (x, y, z) of that landmark.
To do this, the user can perform the following steps using the computer 110.
First, the user selects a point to be identified. Next, the user then places that point in one of the images. The sculptor 115 then generates a line through that
point in each of the other images. The line (epi-polar line) originates from the
imaging source and is projected through a landmark point of image A and onto
all other images mapped into the 3D matrix. The display length of the line
projected onto the other images through the point of interest is arbitrary but the
length of the line can be constrained by a priori knowledge of the geographic region
of the point of interest associated with each image. The projection of that line is
will appear different because the projection of that line, in 3D space, will be viewed from different perspectives in each of the images. The user now can use the projected line to identify the corresponding landmark location in the other
images. By looking along the projected line in each of the images, one can
quickly identify where the landmark should be located in that image. Once the point has been defined in two of the images, the point may be automatically defined in all of the other images. Rather than identifying individual points however, it is sometimes desirable
to outline, or trace, certain anatomic features. The tracing can be done by specifying one or more connected points in a number of the patient images 150.
An example of this would be tracing the outline of an eye socket in an x-ray
image. This would be important for certain medical applications relating to the eye. This traced information could then be stored with the landmark location
information.
By using the calibration frame 140 and the relationship between the various images, an accuracy of 0.1 mm can be achieved with respect to the location of
landmarks and the tracing of anatomic parts.
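The epi-polar line construction described above amounts to back-projecting the picked landmark into a 3D ray and drawing that ray's projection in each of the other calibrated images. A minimal sketch, assuming simple pinhole camera models and placeholder calibration numbers, follows.

    import numpy as np

    def project(K, R, t, points_3d):
        # Project 3D points (N, 3) into pixel coordinates with a pinhole model.
        cam = R @ points_3d.T + t.reshape(3, 1)
        uv = (K @ cam).T
        return uv[:, :2] / uv[:, 2:3]

    def epipolar_samples(origin_a, direction_a, K_b, R_b, t_b, depths):
        # Sample the ray through the landmark picked in image A and map the
        # samples into image B; connected in order, they trace the epi-polar line.
        unit = direction_a / np.linalg.norm(direction_a)
        return project(K_b, R_b, t_b, origin_a + np.outer(depths, unit))

    # Illustrative camera-B model and ray for image A (placeholder values only).
    K_b = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
    R_b = np.eye(3)
    t_b = np.array([-200.0, 0.0, 900.0])
    line = epipolar_samples(np.array([0.0, 0.0, -900.0]), np.array([0.02, 0.01, 1.0]),
                            K_b, R_b, t_b, np.linspace(400.0, 1200.0, 5))
    print(line)

Restricting the sampled depths to a range that brackets the expected anatomy is one way to apply the a priori length limit mentioned above.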
Thus the capture of the patient specific data process 602 has been completed.
Next, the patient specific model 160 is generated in the generate patient model 604 process.
Once all the anatomic locations are identified in the sculptor 115, the data for that patient can be exported to the executor 135. This information can then be
loaded into the clinician/consultant 125. The clinician/consultant 125 includes a stock model which is to be morphed against the information drawn from the sculptor 115 for a particular patient. In this example, the anatomic locations identified in the sculptor 115 from the patient images 150 are all associated with the calibration frame 140. Thus, exact measurements of the
anatomic locations relative to each other have been identified. This relative location information is then used to morph the stock model into the patient specific model 160. This
can be done using standard techniques for morphing data. The resulting patient
specific model 160 includes all the object information in the original stock model
but has been customized with the measurements of a specific patient.
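The text leaves the morphing step to standard techniques; one common choice for a landmark-driven warp is a radial-basis-function (thin-plate-spline) interpolation of the landmark displacements, sketched below with SciPy. The landmark and vertex values are illustrative placeholders, not real stock-model data.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def morph_stock_model(stock_vertices, stock_landmarks, patient_landmarks):
        # Warp every stock-model vertex so the stock landmarks move onto the
        # measured patient landmarks; nearby vertices follow smoothly.
        warp = RBFInterpolator(stock_landmarks,
                               patient_landmarks - stock_landmarks,
                               kernel="thin_plate_spline")
        return stock_vertices + warp(stock_vertices)

    # Tiny illustrative example: four landmarks and three vertices (not real data).
    stock_lm = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0],
                         [0.0, 60.0, 0.0], [0.0, 0.0, 40.0]])
    patient_lm = stock_lm * np.array([1.05, 0.95, 1.10])   # pretend measurements
    verts = np.array([[10.0, 10.0, 5.0], [25.0, 30.0, 10.0], [5.0, 5.0, 35.0]])
    print(morph_stock_model(verts, stock_lm, patient_lm))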
Importantly, this allows significant types of manipulations to be performed that have not previously been performed in medical imaging systems. Examples of
these processes are now described.
The following describes an example analysis that is performed using the
patient specific model 160 and the other patient data. In this example, a patient
model is displayed. An example of this is shown in the picture of the
clinician/consultant 125 on the left hand side of the display area in Figure 1. In this example, the model is shown as a number of dots in space.
Within the clinician/consultant 125, the user can select an analysis type from
the analysis window 127. The analysis can be derived from either the landmarks that have been previously identified in the sculptor 115, from the morphed three
dimensional model data in the patient specific model 160, or from measurements
taken by the user from within the clinician/consultant 125. An example of the
analysis performed is shown in the example analysis 170. This conforms to block 670 of Figure 6. The example analysis 170 illustrates example output from an analysis procedure. In the case of orthodontics, measurements may be taken to perform any number of standard orthodontic analyses such as a Ricketts analysis or a McGrann analysis. Other types of analysis, treatments, etc. are described in the following sections.
As part of the analysis, a user can use the clinician/consultant 125 to rotate
and manipulate the patient specific model 160. The patient's face can be mapped
onto the generic model from the photographs in the patient images 150.
Other types of analysis, or evaluations, performed in the clinician/consultant 125 are now described.
For a particular tooth, which is a separate object, one may wish to select that tooth and move it somewhat out of its socket. This could be used for example in showing a patient what their teeth would look like if they were straightened. Alternatively, the axis of the tooth for a particular patient may be rotated from its
present location to an ideal location. Using the model, by selecting the tooth,
one could not only translate the tooth, but rotate it to show it in a different orientation with respect to the other elements in the patient specific model 160.
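Because each tooth object carries its own coordinate system referenced to the global model frame, repositioning it reduces to a rigid transform of that object's vertices. The sketch below rotates a tooth about its long axis through the root apex and then translates it; the tooth geometry, axis and angles are illustrative placeholders.

    import numpy as np

    def rotation_about_axis(axis, angle_rad):
        # Rodrigues formula for a rotation matrix about a unit axis.
        axis = axis / np.linalg.norm(axis)
        kx, ky, kz = axis
        K = np.array([[0.0, -kz, ky], [kz, 0.0, -kx], [-ky, kx, 0.0]])
        return np.eye(3) + np.sin(angle_rad) * K + (1.0 - np.cos(angle_rad)) * (K @ K)

    def reposition_tooth(vertices, apex, long_axis, angle_rad, translation):
        # Rotate the tooth about its own long axis (through the root apex),
        # then translate, all expressed in the patient model's global frame.
        R = rotation_about_axis(long_axis, angle_rad)
        return (vertices - apex) @ R.T + apex + translation

    # Illustrative tooth data in mm (placeholders only).
    tooth_verts = np.array([[1.0, 2.0, 0.0], [2.0, 2.5, 8.0], [0.5, 1.5, 8.0]])
    moved = reposition_tooth(tooth_verts, np.array([1.0, 2.0, 0.0]),
                             np.array([0.1, 0.0, 1.0]), np.deg2rad(5.0),
                             np.array([0.0, -0.5, 0.0]))
    print(moved)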
In some embodiments of the invention, the viewing and manipulation of
individual objects of the patient specific model 160 is done using the VRX viewer, commercially available from Synthonics Technologies, Inc.
The user is able to hide objects in the patient specific model 160. This
would correspond, in the case of a tooth, to extracting the tooth, leaving more room for reorienting other teeth in the jaw. In the case of an orthodontics application, there are certain standard viewpoints used by orthodontists. These include an inferior view, a superior view, a lateral view, a frontal view, an intra-oral view, and/or an extra-oral view. Once adjustments have been made to the generic stock model, to reflect the patient specific features, i.e., the generation of the patient specific model 160, a
treatment plan can be applied. The patient specific model 160 can be rendered by placing a skin over the model to show the external appearance after the
modifications have occurred. An example of such a system that would allow this is the QuickLook™ rendering product from Synthonics Technologies, Inc. This permits photorealistic texturing of the model, so the patient can see their
actual face after the orthodontic work has been completed.
When using this invention for various medical applications, one can use the capabilities of the software and the models to display and communicate to patients what will be occurring with respect to their treatment. Patients can
visualize the changes that will occur in their mouth as a result of some
orthodontic treatment.
Further, by using the techniques described herein to generate the three dimensional model, one achieves such a model with a much lower dosage of
radiation than would be required, for example, if such a model were constructed
by a CAT (Computer Assisted Tomography) scan.
Further, by constructing a transport file that contains only limited patient
specific information, a user need only identify, for example, two hundred
landmarks in an orthodontic application, versus hundreds of thousands or so vertices that would be required for completely defining the patient specific model 160 by itself. As a result, the full patient specific information can be transmitted through a relatively low bandwidth network in a relatively small amount of time. Such information could be transported over the Internet to
insurance companies, patients, and specialists. This is significantly different than
what is required to transfer full three dimensional models over a network. Thus, measurement data could be taken in the sculptor 115 and sent to a dental lab across the network very quickly and efficiently. Alternatively, information about
the patient specific model 160 can be transmitted across the network with the
same minimal bandwidth requirements to a dental lab, where a finished bridge could be produced in accordance with the specification and measurements
contained in the patient specific file.
Automated Calibration Target Identification
The following describes an example system for automatically identifying the
calibration targets 440. In this example, BBs will be used as calibration targets; however, the general process can be used for almost any type of calibration target.
The image of each BB is circular in the patient images 150. This is why the
spherical shape was chosen, so its image would be circular, regardless of the viewing angle. For a given camera geometry, one would expect the blob formed by the image of a BB in the film to have a certain size. Specifically, one would
expect a BB to appear with a diameter of a certain number of pixels for a
particular camera geometry. One can select the correlation patch corresponding
substantially to that expected size and shape and then search over the patient
image space looking for correlations between the portion of an image underlying the correlation patch with the corresponding pixels for the expected blob pixel set. The points with the highest correlation are likely the locations of the BB. When this has been done for each BB in a particular set of patient images 150,
one can identify the centroid of the region in three dimensional space formed by the intersection of the projections of a BB image from different patient images.
The centroid location in 3D space is then determined as the center of the BB relative to the calibration frame 140. This information can then be stored and
associated with that particular image.
The above process describes a fully automated calibration target identification technique. However, in other embodiments of the invention, the user can drag and drop calibration target identifiers near a calibration target, and the computer 110 (the sculptor 115) can look for a calibration target near where the user dropped the calibration target identifier.
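A minimal sketch of the correlation search is shown below, using OpenCV's template matching with a disc-shaped template of the expected BB diameter. The synthetic image and diameter are placeholders; in practice the expected diameter would follow from the camera geometry as described above.

    import numpy as np
    import cv2

    def bb_template(diameter_px):
        # Bright disc of the expected BB image diameter on a dark background.
        r = diameter_px / 2.0
        yy, xx = np.mgrid[0:diameter_px, 0:diameter_px]
        return (((xx - r + 0.5) ** 2 + (yy - r + 0.5) ** 2) <= r * r).astype(np.float32)

    def find_bb_candidates(image, diameter_px, count):
        # Return the strongest correlation peaks as (x, y, score) in image pixels.
        score = cv2.matchTemplate(image.astype(np.float32),
                                  bb_template(diameter_px), cv2.TM_CCOEFF_NORMED)
        peaks = []
        for _ in range(count):
            _, max_val, _, (x, y) = cv2.minMaxLoc(score)
            peaks.append((x + diameter_px // 2, y + diameter_px // 2, max_val))
            score[max(0, y - diameter_px):y + diameter_px,
                  max(0, x - diameter_px):x + diameter_px] = -1.0  # suppress this peak
        return peaks

    # Synthetic stand-in for a patient image with two BB-like blobs plus noise.
    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 0.01, (120, 160)).astype(np.float32)
    img[30:38, 40:48] += 1.0
    img[80:88, 110:118] += 1.0
    print(find_bb_candidates(img, diameter_px=8, count=2))

Triangulating the detected centers from two or more calibrated images then gives the 3D centroid of each BB relative to the calibration frame, as described above.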
Sculptor Interface Examples Figure 7 illustrates the sculptor 115 application interface. This is shown as
sculptor interface 715. Sculptor interface 715 includes a sculptor toolbar 720 for
performing various functions in the sculptor 115. In this example figure, a patient image 750 is being displayed. Patient image includes a view of the
patient 300 and view of the calibration frame 140.
Figure 8 illustrates the placement of an unaligned calibration template 810 in the patient image area. The unaligned calibration template 810 will be aligned
over the next few figures to show how the sculptor 115 can be used to determine
a calibration reference frame for a patient image. Figure 9 shows that a number of calibration target selectors 910 have been placed over the calibration targets 440 in the patient image 750. The calibration target selector 910 is dragged and dropped onto a calibration target in the image
of the calibration frame 140. Figure 10 illustrates the partially aligned calibration template 1010. The
partially aligned calibration template 1010 has been aligned using the calibration
target selectors 910.
Figure 11 illustrates the aligned calibration template 1110. The aligned
calibration template 1110 now provides the sculptor 115 with a reference frame for the patient image 750.
Figure 12 illustrates the sculptor interface 715 having a second patient image
1250 being displayed. The calibration frame 140 can be seen in both of the images. Additionally, the aligned calibration template 1110 can be seen. A similar alignment process was performed to align the calibration template in the patient image 1250.
Figure 13 illustrates the placement of a coordinate reference origin 1310 to
define a reference plane for use by the user. The reference planes help in the identification of anatomic locations.
Figure 14 illustrates creating a trace using the sculptor 115. In this
example, a trace name 1410 is displayed in the sculptor toolbar 720. In this
example, the trace name 1410 is the mid-line soft tissue trace. A user has traced
the midline soft tissue trace 1410 in at least two of the patient images 150. The
midline soft tissue trace 1410 is also shown in the patient image 1450. This may be the preferred image to trace the mid-line soft tissue trace 1410. The reason for this is that the profile of the patient's soft tissue is most easily seen in this image. The midline soft tissue trace 1410 can also be defined in the patient
image 750. The sculptor 115 then propagates this trace to the patient image 1250. Figure 15 illustrates a user interface enhancement in the sculptor 115. In particular the sculptor 115 allows the user to perform a localized histogram
normalization in a patient image. For example, localized enhancement 1510 is performed in the patient image 1250. This allows the user to enhance portions
of the various images. Figure 16 illustrates a trace and landmark view 1620 where the patient image
1250 is removed from the display. This further allows the user to determine
where traces and landmarks are being positioned. In particular the example trace 1610 is shown. By removing the patient image 1250, the example trace
1610 can be more easily seen. Figure 17 illustrates a rotated view 1720 of the example trace 1610. The
user can rotate the traces, or otherwise manipulate the traces. In one
embodiment of the invention, the viewer tool used in the clinician/consultant 125
is also used for rotating and displaying the traces.
Figure 18 illustrates a number of landmarks and traces being displayed in
multiple images. In particular landmarks and traces can be seen in the patient
image 1450, the patient image 750, the patient image 1850, and the patient
image 1250. This allows for some basic measurement and analysis features of the sculptor 115 to be used. Figure 19 illustrates a measurement between a bone point and a soft tissue point. In this example, the distance measured is the distance from the post nasal spine landmark to the midline soft tissue trace. The measurement line 1910 illustrates
where the measurement is taking place. The measurement information 1920 shows how many millimeters long the measurement line 1910 is. This
calculation can be made because the patient image 1250 has been calibrated with
the calibration frame 140.
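Once both the landmark and the trace points are expressed in the calibrated 3D coordinate system, a measurement like the one above reduces to the minimum distance from a point to a polyline. A minimal sketch with placeholder coordinates follows.

    import numpy as np

    def point_to_trace_mm(point, trace_points):
        # Minimum distance from a landmark to a trace stored as an ordered
        # series of points; units follow the calibrated coordinates (mm).
        best = np.inf
        for a, b in zip(trace_points[:-1], trace_points[1:]):
            seg = b - a
            t = np.clip(np.dot(point - a, seg) / np.dot(seg, seg), 0.0, 1.0)
            best = min(best, np.linalg.norm(point - (a + t * seg)))
        return best

    # Illustrative landmark and soft tissue trace coordinates in mm (placeholders).
    landmark = np.array([0.0, 45.0, 38.0])
    trace = np.array([[0.0, 80.0, 10.0], [0.0, 78.0, 30.0], [0.0, 70.0, 50.0]])
    print(round(point_to_trace_mm(landmark, trace), 1))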
Figure 20 illustrates how epi-polar lines 2020 can be used in the placement and location of landmarks and traces. In this particular example, a post nasal
spine landmark is shown using epi-polar lines 2020. The post nasal spine landmark 2010 was first placed in the patient image 1250. This caused epi-polar lines to be shown in the other two images. Next, the post nasal spine landmark 2010 was identified on one of the epi-polar lines in the patient image 1850. This
automatically placed the post nasal spine landmark 2010 into the last
patient image 2050.
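The patent does not spell out the underlying computation, but the epi-polar behavior described above follows standard two-view geometry: a landmark picked in one calibrated image defines a line of possible positions in every other calibrated image. The sketch below is illustrative only; the fundamental matrix F and the pixel coordinates are placeholder values, not data from the system.

```python
import numpy as np

def epipolar_line(F, point_xy):
    """Return epipolar line coefficients (a, b, c) in the second image for a
    landmark picked in the first image; the line satisfies a*x + b*y + c = 0."""
    x = np.array([point_xy[0], point_xy[1], 1.0])
    a, b, c = F @ x
    return a, b, c

def sample_line(a, b, c, width):
    """Two end points of the line across an image of the given width (for drawing)."""
    x0, x1 = 0.0, float(width)
    # y = -(a*x + c) / b, assuming the line is not vertical
    return (x0, -(a * x0 + c) / b), (x1, -(a * x1 + c) / b)

# Illustrative fundamental matrix relating two calibrated patient images
# (in the described system it would be derived from the calibration frame).
F = np.array([[0.0, -1e-6, 1e-3],
              [1e-6, 0.0, -2e-3],
              [-1e-3, 2e-3, 1.0]])
line = epipolar_line(F, (412.0, 305.0))   # landmark picked in the first image
print(sample_line(*line, width=1024))     # end points of the search line in the second image
```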
Figures 21-24 illustrate the creation of a trace. In Figure 21, a right
mandible trace 2110 is shown. This trace was performed by a sequence of click
and drag actions. The right mandible trace 2110 includes a number of trace points, such as a trace point 2120. These trace points are connected by a line
to form the trace. Now the user will make the same trace in a different patient image.
In Figure 22, a portion of the right mandible trace 2110 is being created in
the patient image 1850. The trace line 2200 corresponds to the right mandible
trace 2110. The trace line 2200 has, or will have, the same number of points as the right mandible trace 2110. The sculptor 115 ensures this. When the user places a trace point down such as the trace point 2210, the trace line 2200
extends from that point and shows where the next trace point 2220 would be placed along the trace line 2200. The user can then manipulate the location of this next trace point 2220 by dragging the trace line 2200 into the appropriate
location. When the user has placed the next trace point 2220 in the appropriate
location, the user can click the mouse to position the next trace point 2220 permanently. Then another trace point is displayed, until all of the trace points in the right mandible trace 2110 have been placed. This provides a
particularly simple method of propagating a trace from one patient image to the
next patient image.
Figure 23 illustrates the placement of the trace point 2220 after the user has clicked on the mouse. Once the right mandible trace 2110 has been completely
laid down (2D trace) in the patient image 1850, the epi-polar lines of points on
the right mandible trace can be projected onto any other calibrated patient image.
These epi-polar lines constrain the identification of the corresponding landmark
to those lines. Once the correspondence has been completed and the 3D coordinates of the points on the trace have been computed (triangulated) then the
right mandible trace 2110 can automatically be propagated to another calibrated
patient image. Thus by defining the right mandible trace 2110 in two images, the
right mandible trace 2110 has been completely defined in the space corresponding to the calibration frame 140. Therefore wherever the calibration
frame 140 is identified, the corresponding location of the right mandible trace
2110 can be determined by the sculptor 115.
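The 3D computation implied here (identifying corresponding trace points on epi-polar lines in two calibrated images, triangulating them, and re-projecting the result into any other calibrated image) can be sketched with standard linear triangulation. This is a minimal illustration, not the sculptor's actual implementation; the projection matrices below stand in for geometry that would be recovered from the calibration frame 140.

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one trace point seen in two calibrated images.
    P1, P2 are 3x4 camera projection matrices; pt1, pt2 are (x, y) picks."""
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                     # 3D point in the calibration-frame space

def project(P, X):
    """Re-project a 3D trace point into another calibrated patient image."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Illustrative projection matrices standing in for two calibrated views.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])   # 90 degree view
P2 = np.hstack([R, np.array([[0.0], [0.0], [2.0]])])

# Triangulating the picks from two views recovers the 3D point (0.1, 0.2, 3.0).
X = triangulate(P1, P2, project(P1, [0.1, 0.2, 3.0]), project(P2, [0.1, 0.2, 3.0]))
print(X)
```

Clinician/Consultant Interface Examples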
Figure 25 through Figure 40 illustrate various user interface features of the clinician/consultant 125. These figures illustrate how a user can access the
patient specific model data, create and manipulate the patient specific model 160, and perform analysis, treatment and the like on the model and the model data.
Figure 25 illustrates an .SCL file load window 2510 that can be used to load a .SCL file. In this example, the .SCL file is the patient specific file 2520 that was
generated in Figures 7 through 24.
Figure 26 illustrates the morphing interface 2600 that can be part of the clinician/consultant 125. Here the patient image 750 is shown with a partially
morphed stock model 2610. After the morphing is complete, the patient specific model 160 is created. The morphing interface 2600 need not be used by the medical practitioner, but it does help illustrate the morphing process.
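The patent does not describe the morphing algorithm itself; the following is a minimal sketch of one way a stock model could be pulled toward patient-specific landmark positions, assuming an inverse-distance weighting of landmark displacements (the actual weighting and tie-down behavior of the system may differ).

```python
import numpy as np

def morph_stock_model(vertices, stock_landmarks, patient_landmarks, power=2.0):
    """Displace stock-model vertices toward the patient's measured landmark positions.
    Each vertex moves by an inverse-distance-weighted blend of the landmark displacements."""
    displacements = patient_landmarks - stock_landmarks          # (L, 3)
    morphed = np.empty_like(vertices)
    for i, v in enumerate(vertices):
        d = np.linalg.norm(stock_landmarks - v, axis=1)
        w = 1.0 / np.maximum(d, 1e-6) ** power
        w /= w.sum()
        morphed[i] = v + w @ displacements
    return morphed

# Tiny example: three landmarks pull a few vertices of a stock outline (all values invented).
stock_lm = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [5.0, 8.0, 0.0]])
patient_lm = stock_lm + np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
verts = np.array([[2.0, 1.0, 0.0], [8.0, 1.0, 0.0], [5.0, 5.0, 0.0]])
print(morph_stock_model(verts, stock_lm, patient_lm))
```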
Figure 27 illustrates a morphed and texture mapped patient specific model view 2710. Here the patient specific model view has been rotated. Note that the
photo image texture mapped onto the model is the patient image 750.
Figure 28 illustrates a wireframe view 2810 of the patient specific model 160.
The morphing interface 2600 allows the user to rotate the view of the patient
specific model 160 and to change the way in which the model is displayed.
Figure 29 illustrates a dot contour view 2910 of the patient specific model
160. The dot contour view 2910 shows the points that are used to define the
patient specific model 160. During the morphing process, the points in the stock
model are repositioned, according to the patient model data, to create the patient specific model 160.
Figure 30 illustrates the clinician interface 3010. The clinician interface 3010 is the interface that would normally be used by the medical practitioner when performing analysis, developing a treatment, or presenting information to the
patient. In this example, the clinician interface 3010 includes a patient specific model flesh view 3010.
Figure 31 illustrates a skull view of the patient specific model 160 having a
number of landmarks and control points showing.
Figure 32 illustrates an example analysis that has been performed. Here a
Steiner analysis has been performed on some of the anatomical locations identified in the sculptor 115. The analysis window 127 shows the results of the
analysis. The patient image 1250 and the dot view of the patient specific model 3210 show the analysis lines 3230. Normally, the medical practitioner would have had to draw these lines on the x-ray image, and then measure those lines.
With the clinician/consultant 125, these processes are automated.
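The values reported in the analysis window can be produced by simple vector geometry on the landmark coordinates once they are known in three dimensions. The sketch below is illustrative only; the landmark coordinates are invented and the published Steiner norms are not reproduced here.

```python
import numpy as np

def angle_deg(a, b, c):
    """Angle at vertex b (degrees) formed by landmarks a-b-c, e.g. an SNA-style angle
    with Sella, Nasion and A Point and the vertex at Nasion."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def distance_mm(a, b):
    """Millimetre distance between two calibrated 3D landmarks."""
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))

# Illustrative landmark coordinates in the calibration-frame space (mm).
S, N, A = [0.0, 70.0, 60.0], [0.0, 75.0, 0.0], [0.0, 40.0, -5.0]
print(round(angle_deg(S, N, A), 1), "degrees")
print(round(distance_mm(S, N), 1), "mm")
```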
Figure 33 illustrates a partially planned treatment where an arch form
template 3320 has been put into the jaw object dot display 3310. Importantly, the jaw object can be selected and manipulated separately from the rest of the
patient specific model 160. Additionally, the medical practitioner can place an
arch form template 3320 and perform simulations of how the teeth may be
affected by a particular treatment. Note that the user interface now includes a
tool for defining the wire, pivot points, and alignment.
Figure 34 illustrates a jaw object solid display 3410 where a particular tooth has been selected (shown as tooth selected display 3420). Figure 35 illustrates a
similar view, except the jaw object dot display 3510 is shown instead. Figure 36 illustrates a view where the user has partially extracted and tilted the tooth. This could be used to show a patient what an extraction would look like. Figure 37 illustrates the top view of this configuration.
Figure 38 illustrates the jaw object solid display 3410 where the tooth has
been extracted.
Figure 39 illustrates another feature of the clinician/consultant 125 user
interface where slice planes have been placed through the jaw object. The jaw
object display 3920 is used for positioning the slice planes (e.g., slice plane 3910). The jaw object display 3930 shows the results of the slice plane 3910. The clinician/consultant 125 user interface allows the user to position and
control multiple slice planes through the object.
Figure 40 illustrates a partially transparent slice plane 4010 and a partially
transparent slice plane 4020 positioned though the jaw object display 3920. The jaw object display with transparent slices 4030 shows the result of the slice
planes.
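A slice plane display of this kind can be produced by intersecting each triangle of the jaw object with the plane; the sketch below assumes a triangle-mesh representation, which the patent does not specify.

```python
import numpy as np

def slice_mesh(vertices, triangles, plane_point, plane_normal):
    """Intersect a triangle mesh with a slice plane; returns the line segments
    of the cross-section (two intersection points per cut triangle)."""
    verts = np.asarray(vertices, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    d = (verts - np.asarray(plane_point, dtype=float)) @ n     # signed distances to the plane
    segments = []
    for tri in triangles:
        pts = []
        for i, j in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            if d[i] * d[j] < 0:                                # this edge crosses the plane
                t = d[i] / (d[i] - d[j])
                pts.append(verts[i] + t * (verts[j] - verts[i]))
        if len(pts) == 2:
            segments.append((pts[0], pts[1]))
    return segments

# Single illustrative triangle cut by the plane z = 0.5.
verts = [[0.0, 0.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
tris = [(0, 1, 2)]
print(slice_mesh(verts, tris, plane_point=[0, 0, 0.5], plane_normal=[0, 0, 1]))
```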
Appendix A
The following table shows the landmarks and traces used in the creation of the patient specific model 160. Other embodiments of the invention can use other landmarks and/or traces. The following list has been chosen because it
represents commonly referenced anatomic landmarks.
// ===== Landmarks =====
// abbr  Description
//
LB ME Menton
LB GN Gnathion
LB PG Pogonion
LB B B Point
LB ID Infradentale
LB LIE Lower Incisor Incisal Edge
LB ADP Anterior Downs Point
LB UIE Upper Incisor Incisal Edge
LB UIL Labial of the Upper Incisor
LB SD Supradentale
LB A A Point
LB ANS Anterior Nasal Spine
LB UIA Upper Incisor Apex
LB UIB Upper incisor Lingual Bony Contact Point
LB LIB Lower incisor Lingual Bony Contact Point
LB LIA Lower incisor Apex
LB SYM Lingual Symphyseal Point
LB PMC Premolar Mesial Contact point
LB PDC Premolar Distal Contact point
LB LMR Lower Molar Root Apex
LB LMJ Lower Molar Mesial CEJ
LB LMC Lower Mesial Contact
LB UMT Upper Mesial Cusp Tip
LB PDP Posterior Downs point
LB LMIT Lower molar Mesial Cusp Tip
LB UMJ Upper molar Mesial CEJ
LB UMR Upper molar Root Apex
LB UDT Upper molar Distal Cusp Tip
LB FPP Functional Occlusal Plane Point
LB LAB L Ant Border Ramus
LB RAB R Ant Border Ramus
LB LGO L Gonion
LB RGO R Gonion
LB GOI Gonial Intersection
LB LPB L Post Border Ramus
LB RPB R Post Border Ramus
LB PSE Posterior Skull External
LB PSI Posterior skull internal
LB OCP Occipital Protuberance
LB I Inion
LB OP Opisthion
LB BP Bolton point
LB BA Basion
LB AR Articulare Posterior
LB AA Articulare Anterior
LB CO Condylion
LB LCO L Condylion
LB RCO R Condylion
LB PO Porion
LB LAPO L Anatomic Porion
LB RAPO R Anatomic Porion
LB S Sella Turcica
LB SE Ethmoid Registration Point
LB GBI Glabella internal
LB GB Glabella
LB FSS Frontal Sinus Superior
LB LSUP L Supraorbitale
LB RSUP R Supraorbitale
LB FSI Frontal Sinus Inferior
LB FMN Frontomaxillary nasal suture
LB N Nasion
LB NB Nasal Bone
LB LLO L Lateral Orbit
LB RLO R Lateral Orbit
LB LOR L Orbitale
LB ROR R Orbitale
LB IZ Inferior Zygoma
LB PNS Post Nasal Spine
LB PTMI Pterygomaxillary Fissure Inferior
LB PTMS Pterygomaxillary Fissure Superior
LB LCP L Coronoid
LB RCP R Coronoid
LB 3VS 3rd Vertebra Superior
LB 3VA 3rd Vertebra Anterior
LB 3VI 3rd Vertebra Inferior
LB 3VP 3rd Vertebra Posterior
LB 3VC 3rd Vertebra Canal
LB 3VSP 3rd Vertebra Spine
LB 4VS 4th Vertebra Superior
LB 4VA 4th Vertebra Anterior
LB 4VI 4th Vertebra Inferior
LB 4VP 4th Vertebra Posterior
LB 4VC 4th Vertebra Canal
LB 4VSP 4th Vertebra Spine
LB COL Columella
LB LA L Articulare
LB RA R Articulare
LB LM L Mastoid
LB RM R Mastoid
LB LAGO L Antegonion
LB RAGO R Antegonion
LB LR1 L R1
LB RR1 R R1
LB LR2 L R2
LB RR2 R R2
LB LR3 L R3
LB RR3 R R3
LS GL' Soft Tissue Glabella
LS Na' Soft Tissue Nasion
LS No' Nose Tip
LS Sn' Subnasale
LS A' Soft Tissue A Point
LS UL' Upper Lip
LS LL' Lower Lip
LS B' Soft Tissue B Point
LS Pog' Soft Tissue Pogonion
LS Gn' Soft Tissue Gnathion
LS Me' Soft Tissue Menton
LS URh' Upper Rhinion
LS LRh' Lower Rhinion
LS UEm' Upper Embrasure
LS LEm' Lower Embrasure
LS LEyM' L Eye Medial
LS LEyL' L Eye Lateral
LS REyM' R Eye Medial
LS REyL' R Eye Lateral
LS LLc' L Corner Lip
LS RLc' R Corner Lip
LS LNp' L Posterior Nares
LS RNp' R Posterior Nares
LS_LN1C L Nasolabial Crease
LS_RN1C R Nasolabial Crease
LS_LTr ' L Tragus
LS_RTr ' R Tragus
LS_TH ' Top of Head
// ============== Teeth ==============
NOTE: the numbers refer to the tooth quadrant and tooth (teeth #s
11-18, 21-28, 31-38, 41-48). The letters represent different points on the teeth, as follows. For the most part the tooth points represent morph and control points. Control points are points in the patient specific model 160 that can be used to manipulate specific objects.
d = distal   m = mesial
l = lingual   b = buccal   r = root tip   i = incisal edge   c = cervical
// abbr  landmark  Description
// =======
LT_11m21m 11m21m Interproximal contact point between teeth #s 11 and 21
LT_11r 11r Tooth #11 root tip
LT_11mi 11mi Tooth #11 mesial incisal edge
LT_11di 11di Tooth #11 distal incisal edge
LT_11c 11c Tooth #11 cervical
LT_11d12m 11d12m Interproximal contact point between teeth #s 11 and 12
LT_12r 12r Tooth #12 root tip
LT_12di 12di Tooth #12 distal incisal edge
LT_12mi 12mi Tooth #12 mesial incisal edge
LT_12c 12c Tooth #12 cervical
LT_12d13m 12d13m Interproximal contact point between teeth #s 12 and 13
LT_13r 13r Tooth #13 root tip
LT_13i 13i Tooth #13 incisal tip
LT_13c 13c Tooth #13 cervical
LT_13d14m 13d14m Interproximal contact point between teeth #s 13 and 14
LT_14r 14r Tooth #14 root tip
LT_14l 14l Tooth #14 lingual cusp tip
LT_14b 14b Tooth #14 buccal cusp tip
LT_14c 14c Tooth #14 cervical
LT_14d15m 14d15m Interproximal contact point between teeth #s 14 and 15
LT_15r 15r Tooth #15 root tip
LT_15l 15l Tooth #15 lingual cusp tip
LT_15b 15b Tooth #15 buccal cusp tip
LT_15c 15c Tooth #15 cervical
LT_15d16m 15d16m Interproximal contact point between teeth #s 15 and 16
LT_16r 16r Tooth #16 root tip
LT_16ml 16ml Tooth #16 mesial lingual cusp tip
LT_16dl 16dl Tooth #16 distal lingual cusp tip
LT_16mb 16mb Tooth #16 mesial buccal cusp tip
LT_16db 16db Tooth #16 distal buccal cusp tip
LT_16c 16c Tooth #16 cervical
LT_16d17m 16d17m Interproximal contact point between teeth #s 16 and 17
LT_17r 17r Tooth #17 root tip
LT_17ml 17ml Tooth #17 mesial lingual cusp tip
LT_17dl 17dl Tooth #17 distal lingual cusp tip
LT_17mb 17mb Tooth #17 mesial buccal cusp tip
LT_17c 17c Tooth #17 cervical
LT_17db 17db Tooth #17 distal buccal cusp tip
LT_17d18m 17d18m Interproximal contact point between teeth #s 17 and 18
LT_18r 18r Tooth #18 root tip
LT_18dl 18dl Tooth #18 distal lingual cusp tip
LT_18ml 18ml Tooth #18 mesial lingual cusp tip
LT_18mb 18mb Tooth #18 mesial buccal cusp tip
LT_18db 18db Tooth #18 distal buccal cusp tip
LT_18c 18c Tooth #18 cervical
LT_21r 21r Tooth #21 root tip
LT_21di 21di Tooth #21 distal incisal edge
LT_21mi 21mi Tooth #21 mesial incisal edge
LT_21c 21c Tooth #21 cervical
LT_21d22m 21d22m Interproximal contact point between teeth #s 21 and 22
LT_22r 22r Tooth #22 root tip
LT_22di 22di Tooth #22 distal incisal edge
LT_22mi 22mi Tooth #22 mesial incisal edge
LT_22c 22c Tooth #22 cervical
LT_22d23m 22d23m Interproximal contact point between teeth #s 22 and 23
LT_23r 23r Tooth #23 root tip
LT_23i 23i Tooth #23 incisal tip
LT_23c 23c Tooth #23 cervical
LT_23d24m 23d24m Interproximal contact point between teeth #s 23 and 24
LT_24r 24r Tooth #24 root tip
LT_24l 24l Tooth #24 lingual cusp tip
LT_24b 24b Tooth #24 buccal cusp tip
LT_24c 24c Tooth #24 cervical
LT_24d25m 24d25m Interproximal contact point between teeth #s 24 and 25
LT_25r 25r Tooth #25 root tip
LT_25l 25l Tooth #25 lingual cusp tip
LT_25b 25b Tooth #25 buccal cusp tip
LT_25c 25c Tooth #25 cervical
LT_25d26m 25d26m Interproximal contact point between teeth #s 25 and 26
LT_26r 26r Tooth #26 root tip
LT_26ml 26ml Tooth #26 mesial lingual cusp tip
LT_26dl 26dl Tooth #26 distal lingual cusp tip
LT_26db 26db Tooth #26 distal buccal cusp tip
LT_26mb 26mb Tooth #26 mesial buccal cusp tip
LT_26c 26c Tooth #26 cervical
LT_26d27m 26d27m Interproximal contact point between teeth #s 26 and 27
LT_27r 27r Tooth #27 root tip
LT_27db 27db Tooth #27 distal buccal cusp tip
LT_27mb 27mb Tooth #27 mesial buccal cusp tip
LT_27ml 27ml Tooth #27 mesial lingual cusp tip
LT_27dl 27dl Tooth #27 distal lingual cusp tip
LT_27c 27c Tooth #27 cervical
LT_27d28m 27d28m Interproximal contact point between teeth #s 27 and 28
LT_28r 28r Tooth #28 root tip
LT_28ml 28ml Tooth #28 mesial lingual cusp tip
LT_28dl 28dl Tooth #28 distal lingual cusp tip
LT_28db 28db Tooth #28 distal buccal cusp tip
LT_28mb 28mb Tooth #28 mesial buccal cusp tip
LT_28c 28c Tooth #28 cervical
LT_31m41m 31m41m Interproximal contact point between teeth #s 31 and 41
LT_31r 31r Tooth #31 root tip
LT_31di 31di Tooth #31 distal incisal edge
LT_31mi 31mi Tooth #31 mesial incisal edge
LT_31c 31c Tooth #31 cervical
LT_31d32m 31d32m Interproximal contact point between teeth #s 31 and 32
LT_32r 32r Tooth #32 root tip
LT_32di 32di Tooth #32 distal incisal edge
LT_32mi 32mi Tooth #32 mesial incisal edge
LT_32c 32c Tooth #32 cervical
LT_32d33m 32d33m Interproximal contact point between teeth #s 32 and 33
LT_33r 33r Tooth #33 root tip
LT_33i 33i Tooth #33 incisal tip
LT_33c 33c Tooth #33 cervical
LT_33d34m 33d34m Interproximal contact point between teeth #s 33 and 34
LT_34r 34r Tooth #34 root tip
LT_34b 34b Tooth #34 buccal cusp tip
LT_34l 34l Tooth #34 lingual cusp tip
LT_34c 34c Tooth #34 cervical
LT_34d35m 34d35m Interproximal contact point between teeth #s 34 and 35
LT_35r 35r Tooth #35 root tip
LT_35l 35l Tooth #35 lingual cusp tip
LT_35b 35b Tooth #35 buccal cusp tip
LT_35c 35c Tooth #35 cervical
LT_35d36m 35d36m Interproximal contact point between teeth #s 35 and 36
LT_36r 36r Tooth #36 root tip
LT_36ml 36ml Tooth #36 mesial lingual cusp tip
LT_36mb 36mb Tooth #36 mesial buccal cusp tip
LT_36db 36db Tooth #36 distal buccal cusp tip
LT_36c 36c Tooth #36 cervical
LT_36dl 36dl Tooth #36 distal lingual cusp tip
LT_36d37m 36d37m Interproximal contact point between teeth #s 36 and 37
LT_37r 37r Tooth #37 root tip
LT_37dl 37dl Tooth #37 distal lingual cusp tip
LT_37db 37db Tooth #37 distal buccal cusp tip
LT_37mb 37mb Tooth #37 mesial buccal cusp tip
LT_37ml 37ml Tooth #37 mesial lingual cusp tip
LT_37c 37c Tooth #37 cervical
LT_37d38m 37d38m Interproximal contact point between teeth #s 37 and 38
LT_38r 38r Tooth #38 root tip
LT_38db 38db Tooth #38 distal buccal cusp tip
LT_38mb 38mb Tooth #38 mesial buccal cusp tip
LT_38dl 38dl Tooth #38 distal lingual cusp tip
LT_38ml 38ml Tooth #38 mesial lingual cusp tip
LT_38c 38c Tooth #38 cervical
LT_41r 41r Tooth #41 root tip
LT_41di 41di Tooth #41 distal incisal edge
LT_41mi 41mi Tooth #41 mesial incisal edge
LT_41c 41c Tooth #41 cervical
LT_41d42m 41d42m Interproximal contact point between teeth #s 41 and 42
LT_42r 42r Tooth #42 root tip
LT_42di 42di Tooth #42 distal incisal edge
LT_42mi 42mi Tooth #42 mesial incisal edge
LT_42c 42c Tooth #42 cervical
LT_42d43m 42d43m Interproximal contact point between teeth #s 42 and 43
LT_43r 43r Tooth #43 root tip
LT_43d 43d Tooth #43 distal interproximal contact
LT_43c 43c Tooth #43 cervical
LT_44m 44m Tooth #44 mesial surface interproximal contact
LT_44d 44d Tooth #44 distal surface interproximal contact
LT_44r 44r Tooth #44 root tip
LT_44b 44b Tooth #44 buccal cusp tip
LT_44l 44l Tooth #44 lingual cusp tip
LT_44c 44c Tooth #44 cervical
LT_45m 45m Tooth #45 mesial surface interproximal contact
LT_45d 45d Tooth #45 distal surface interproximal contact
LT_45r 45r Tooth #45 root tip
LT_45c 45c Tooth #45 cervical
LT_45b 45b Tooth #45 buccal cusp tip
LT_45l 45l Tooth #45 lingual cusp tip
LT_46m 46m Tooth #46 mesial surface interproximal contact
LT_46r 46r Tooth #46 root tip
LT_46mb 46mb Tooth #46 mesial buccal cusp tip
LT_46db 46db Tooth #46 distal buccal cusp tip
LT_46dl 46dl Tooth #46 distal lingual cusp tip
LT_46c 46c Tooth #46 cervical
LT_46ml 46ml Tooth #46 mesial lingual cusp tip
LT_46d47m 46d47m Interproximal contact point between teeth #s 46 and 47
LT_47r 47r Tooth #47 root tip
LT_47mb 47mb Tooth #47 mesial buccal cusp tip
LT_47db 47db Tooth #47 distal buccal cusp tip
LT_47ml 47ml Tooth #47 mesial lingual cusp tip
LT_47dl 47dl Tooth #47 distal lingual cusp tip
LT_47c 47c Tooth #47 cervical
LT_47d48m 47d48m Interproximal contact point between teeth #s 47 and 48
LT_48r 48r Tooth #48 root tip
LT_48mb 48mb Tooth #48 mesial buccal cusp tip
LT_48db 48db Tooth #48 distal buccal cusp tip
LT_48ml 48ml Tooth #48 mesial lingual cusp tip
LT_48dl 48dl Tooth #48 distal lingual cusp tip
LT_48c 48c Tooth #48 cervical

tracedata
{ 0, SOFT, "TS_MIDLINE", "Midline Soft Tissue Trace"},
{ 1, SOFT, "TS_REBROW", "R Eyebrow Trace"},
{ 2, SOFT, "TS_LEBROW", "L Eyebrow Trace"},
{ 3, SOFT, "TS_REAR", "R Ear Trace"},
{ 4, SOFT, "TS_LEAR", "L Ear Trace"},
{ 5, SOFT, "TS_REYE", "R Eye Trace"},
{ 6, SOFT, "TS_LEYE", "L Eye Trace"},
{ 7, SOFT, "TS_RLIP", "R Lip Trace"},
{ 8, SOFT, "TS_LLIP", "L Lip Trace"},
{ 9, SOFT, "TS_RN", "R Nares Trace"},
{ 10, SOFT, "TS_LN", "L Nares Trace"},
{ 11, SOFT, "TS_RTH", "R Top of Head Trace"},
{ 12, SOFT, "TS_LTH", "L Top of Head Trace"},
{ 13, SOFT, "TS_FH", "Front of Head Trace"},
{ 14, BONE, "TB_RLO", "R Orbit Trace"},
{ 15, BONE, "TB_LLO", "L Orbit Trace"},
{ 16, BONE, "TB_RMAND", "R Mandible Trace"},
{ 17, BONE, "TB_LMAND", "L Mandible Trace"},
{ 18, BONE, "TB_RATop", "R Arch Top Trace"},
{ 19, BONE, "TB_RABot", "R Arch Bottom Trace"},
{ 20, BONE, "TB_LATop", "L Arch Top Trace"},
{ 21, BONE, "TB_LABot", "L Arch Bottom Trace"},
{ 22, BONE, "TB_MIDLINE", "Midline Bone Trace"},
There is a single trace for the occlusal table of each of the 32 teeth.
In addition, there is one trace for both dental arches that extends along the central groove and incisal edges of teeth.
{ -1, MAX, "NO", "\0" },
Appendix B
The following presents cephalometric analyses that can be performed using
the clinician/consultant 125.
Jarabak Dental Analysis.
Landmarks                          Mean Angle (degrees)    Mean Measurement
Occlusal Plane to GoMe Angle
Interincisal Angle 1-1             133
L1 to GoGn Angle                   90 (+/-3)
1-* BA-CC-Po
2-* FH-N-Po
3-* MP-FH
4-*
5-* ANS-Xi to Xi to PM (PM = point on the anterior border of the symphysis between Point B and
Pogonion where the curvature changes from concave to convex; Xi = constructed point)
6-* Xi-PM plane to Xi-DC (DC = a point selected in the center of the neck of the condyle where the Basion-Nasion plane crosses it; Basion (BA) = most inferior posterior point of the occipital bone)
7-* A-Po plane to N-Po plane
8-* Tip of lower incisor to A-Po
9-* A-Po to lower incisal angle (incisal angle formed along the vertical long axis of the tooth)
10-*
11-* E Plane - soft tissue tip of nose to soft tissue Pogonion; lip is lower lip
Sassouni Analysis.
Landmarks                          Mean Angle (degrees)    Mean Measurement (mm)
Palatal Plane Length Not Established
Pogonion to ANS Arc Not Established
Appendix C
The following describes the tooth labeling conventions used in one
embodiment of the invention.
Most General Dentists in the US use this # convention. Orthodontists use this convention when communicating w/ the General Dentist (GP) as far as extractions, etc. This view is as if you are looking directly at the patient's face (your left is the patient's right, your right is the patient's left).
Patient's Maxillary Right Quadrant Patient's Maxillary Left Quadrant
Patient's Mandibular Right Quadrant Patient's Mandibular Left Quadrant
This line represents the patient's midline
1 - 32 Adult Teeth USA (Permanent)
1  2  3  4  5  6  7  8  9  10 11 12 13 14 15 16
32 31 30 29 28 27 26 25 24 23 22 21 20 19 18 17
Upper Right Quadrant: 1 - 8
Upper Left Quadrant: 9 - 16
Lower Left Quadrant: 17 - 24
Lower Right Quadrant: 25 - 32
a - t Deciduous Teeth USA (Primary)
Other notations can be used for missing, supernumerary (extra), pontics, etc., placed after the number.
S or s for supernumerary    P or p for pontic
X or x for missing    I or i for implant    d for deciduous
Examples: 7s = supernumerary (extra) Maxillary Right Lateral; 4X = #4 missing (missing Max. Right 2nd Bicuspid); 3 4p 5p 6 = bridge from #3 to #6 (#4 & #5 are pontics)
Most Orthodontists in the USA use the following # convention:
Patient's Maxillary Right Quadrant Patient's Maxillary Left Quadrant
Patient's Mandibular Right Quadrant Patient's Mandibular Left Quadrant
This line represents the patient's midline
1 - 8 in each quadrant - Permanent Teeth USA, starting w/ 1 in the middle; therefore ALL 1's are central incisors, ALL 4's are 1st bicuspids, ALL 8's are 3rd molars (wisdom teeth), etc.
a - e in each quadrant Deciduous Teeth USA
We use a short hand to diagram individual teeth or groups of teeth, example:
= Upper Right Quadrant = Upper Left Quadrant
Lower Right Quadrant Lower Left Quadrant
This shorthand is for ALL four 1st bicuspids.
This shorthand is for the Maxillary Right Central
Incisor
This shorthand is for both Maxillary 3rd molars.
This shorthand is for the Maxillary Right & Left Deciduous 2nd molars,
the Mandibular Right Deciduous 2nd molar and the Mandibular
Left Permanent 2nd bicuspid.
6edc21 | 12cde6
6edc21 | 12cde6
This is shorthand for ALL four Permanent 1st Molars (6),
ALL four Deciduous 2nd & 1st Molars (e & d),
ALL four Deciduous cuspids (canines) (c),
ALL four Permanent Lateral Incisors (2), and
ALL four Permanent Central Incisors (1).
Other shorthand used: UR = Upper Right, UL = Upper Left, LR = Lower Right, LL = Lower Left,
followed by the tooth # (Ex: UR1 = Upper Right
Central Incisor).
UR8 UR7 UR6 UR5 UR4 UR3 UR2 URl UL1 UL2 UL3 UL4 UL5 UL6 UL7 UL8
LR8 LR7 LR6 LR5 LR4 LR3 LR2 LR1 LL1 LL2 LL3 LL4 LL5 LL6 LL7 LL8
Could also be used for Deciduous teeth using URa, LLb, etc.
International Teeth # System: 11-18, 21-28, 31-38, 41-48 (International Adult Teeth)
Each quadrant is numbered by the 1st digit:
1 = Maxillary Right        Quadrant 1 | Quadrant 2
2 = Maxillary Left
3 = Mandibular Left        Quadrant 4 | Quadrant 3
4 = Mandibular Right
Then tooth #1-8:
18 17 16 15 14 13 12 11 | 21 22 23 24 25 26 27 28
48 47 46 45 44 43 42 41 | 31 32 33 34 35 36 37 38
International Deciduous Teeth: Quadrants 5 - 8
Quadrant 5 | Quadrant 6
Quadrant 8 | Quadrant 7
Then tooth #1-8:
58 57 56 55 54 53 52 51 | 61 62 63 64 65 66 67 68
88 87 86 85 84 83 82 81 | 71 72 73 74 75 76 77 78
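The three conventions above (Universal 1-32, quadrant/Palmer 1-8, and the international FDI two-digit system) map onto one another mechanically. The helper below is an illustrative conversion for permanent teeth only; it is not part of the described software.

```python
def universal_to_fdi(universal):
    """Convert a permanent-tooth Universal number (1-32, starting at the upper right
    third molar) to the international FDI two-digit number (11-18, 21-28, 31-38, 41-48)."""
    if not 1 <= universal <= 32:
        raise ValueError("permanent teeth only (1-32)")
    if universal <= 8:            # upper right: Universal 1 = FDI 18 ... Universal 8 = FDI 11
        return 10 + (9 - universal)
    if universal <= 16:           # upper left: Universal 9 = FDI 21 ... 16 = FDI 28
        return 20 + (universal - 8)
    if universal <= 24:           # lower left: Universal 17 = FDI 38 ... 24 = FDI 31
        return 30 + (25 - universal)
    return 40 + (universal - 24)  # lower right: Universal 25 = FDI 41 ... 32 = FDI 48

print(universal_to_fdi(1), universal_to_fdi(8), universal_to_fdi(9),
      universal_to_fdi(17), universal_to_fdi(32))   # 18 11 21 38 48
```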
Appendix D
The following describes a tooth auto-alignment strategy used in the
clinician/consultant 125. The clinician/consultant 125 can use the following
procedure in determining an appropriate treatment for a patient:
1 Generate tooth alignment template (arch wire)
Top View Lateral View
Posterior
Anterior
Tooth Wire Alignment
Fit tooth alignment wire to the dental arch
Establish midline and first molar locations on the wire.
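The patent shows the template and fitting steps only as figures. One simple way to fit a tooth alignment wire to the dental arch is a least-squares parabolic arch form through the occlusal landmarks, sketched below; the parabolic form is an assumption (catenary and beta-function arch forms are also used clinically), not necessarily what the module uses.

```python
import numpy as np

def fit_arch_form(tooth_points_xy):
    """Least-squares parabolic arch form y = a*x**2 + b*x + c fitted to occlusal
    landmark positions projected into the occlusal plane (x transverse, y sagittal)."""
    pts = np.asarray(tooth_points_xy, dtype=float)
    a, b, c = np.polyfit(pts[:, 0], pts[:, 1], 2)
    return a, b, c

def arch_midline_x(a, b, c):
    """Transverse position of the arch midline (vertex of the fitted parabola)."""
    return -b / (2.0 * a)

# Illustrative buccal cusp / incisal edge positions for one arch (mm).
pts = [(-25, 0), (-18, 14), (-10, 24), (-4, 29), (4, 29), (10, 24), (18, 14), (25, 0)]
coeffs = fit_arch_form(pts)
print(coeffs, arch_midline_x(*coeffs))
```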
Appendix E
The following describes a Roth analysis that can be performed using the clinician/consultant 125 and can be particularly helpful in the tooth alignment
treatment.
Appendix F
The following describes the specifications of various embodiments of the
invention.
Description of Product Features
This module has been designed to assist with patient analysis, treatment planning and patient education for orthodontists, dentists that perform orthodontics and oral surgeons that perform orthognathic surgery. This module will provide the full range of analysis, modelling and treatment features currently expected with all existing 2D software packages. In addition, this module will allow the soft and hard tissues to be accurately captured and photorealistically displayed and analyzed in three dimensions. Unique algorithms will then be accessed to perform a number of functions with the captured data.
Normal Stock Object
The stock objects will be used for model matching and used as a template for rapid conversion of the patient input data to a 3-D model. These stock objects will include the jaws, teeth, and soft tissues of the face. The stock objects can be constructed to represent NORMAL for the modeled anatomy. In addition, other stock objects could be designed to provide a closer starting point for common types of anatomical variation or pathology. These variations may include size, sex and facial form (Angle's Class I, II and III). Stock objects will be constructed from a wire frame with a relatively small polygon count. The vertices of the wire frame can be strategically located to allow for the subsequent modifications that will allow for rapid customization to adapt to the patient's input data. In the jaws, the stock objects can have a minimum # of "tie down points" that corresponds to "landmark locations". The minimum # of tie down points on a tooth may include those that allow for rapid modification in height, mesiodistal and buccolingual width, and angulation.
The wire frame can be mutable. The wire frame can possess a logical behavior among the neighboring wire frame intersects or vertices. That is, when the wire frame is mutated by the end user, all of the intersects that are changed and their neighbors can respond in a logical fashion. Landmark groupings can be segmented and moved to a new location to simulate treatment or growth. The movement of these segmented landmarks can occur through data input or manual "Drag and Drop."
There can be a method for rapid or automatic registration of the stock object with the input data. The input data can include photographs, x-rays, a previously rendered patient 3-D data set.
The stock objects can have a spatial association with a data base. The data base will record the 3-D spatial attitude of the stock object and will record modifications of the stock object. The data base will track the landmark locations and any other changes that relate to the original polygon size and form.
Object-Oriented Data Base
This is a feature that the average user of the software may not fully appreciate. However, as the framework of software design, it has many advantages. From MedScape's point of view, the MedScape product line deals with physical entities such as patients and anatomical structures of the face. It produces images from these objects, extracts measurements and produces models of them. It also produces such end-user data products as growth prediction and treatment plans. The underlying data structure that can define and relate all these entities in a unified fashion is an object oriented database. The time spent initially on a framework for careful definition of the object classes and their relations in such a database will save a tremendous amount of effort and cost in the following developments. In addition to producing reusable computer programs, this approach will facilitate definition and integration of the work by multiple teams, namely MedScape R&D contractors as currently envisioned.
Typical examples of objects are a patient, a digital image, a specific mandible, the 3-D model of a specific mandible, a "normal" mandible, a treatment plan, etc. The specific instances of these objects are stored in the database as rows of various tables, where each table represents the object class. Each class is identified by its properties and methods (procedures that are applied to them). Each software development team will concentrate on specific object classes assigned to it with the goal of producing class libraries that expose the properties and methods of each object to all development teams for final integration.
3-D ACCURACY
Although accuracy numbers for the so called "nominal" conditions can be provided, the accuracy of position and orientation measurements made from one or more images of an object can vary significantly depending on a number of parameters. Some of these parameters are inherent to the image sensor resolution and noise, which can be considered fixed and are determined off-line. However, an even larger number of these parameters depend on the geometry and size of the very object being measured, and the geometric setup of the imaging sensors. These "variable" parameters of course mean that the accuracy cannot be quoted as a unique specification for a measurement system. However, theoretical error bounds can be internally computed from calibration data, camera resolution, and other system parameters for a specific measurement scenario.
The proposed software will include the necessary models and algorithms to compute these theoretical error bounds and provide them as part of the measurement results. For example, in the case of landmark position measurements, for each measured landmark the software outputs the ellipsoid that represents the error uncertainty in three dimensions. In this way the user is given a yardstick by which the accuracy of each measurement result can be judged.
In the case of cephalometric landmark measurements, submillimeter accuracy was predicted and to some extent experimentally validated by exercising these analytic error models. A specific R&D task is to validate the error models using more complete experimentation.
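A sketch of how the error ellipsoid mentioned above could be reported for one triangulated landmark, assuming its 3x3 position covariance is available from the calibration and camera-resolution model; the covariance values here are placeholders, not outputs of the described system.

```python
import numpy as np

def error_ellipsoid(covariance, confidence_scale=1.96):
    """Principal semi-axis lengths and directions of the uncertainty ellipsoid for one
    landmark, derived from its 3x3 position covariance (mm^2)."""
    eigvals, eigvecs = np.linalg.eigh(np.asarray(covariance, dtype=float))
    semi_axes = confidence_scale * np.sqrt(np.maximum(eigvals, 0.0))
    return semi_axes, eigvecs          # columns of eigvecs are the axis directions

# Placeholder covariance: depth (z) is the least-constrained direction.
cov = np.diag([0.04, 0.06, 0.25])
axes, dirs = error_ellipsoid(cov)
print(axes)    # e.g. sub-millimetre in-plane, roughly 1 mm along the depth axis
```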
MODEL MATCHING
The starting point for modeling an object from multiple images is to retrieve a "stock" or normal version of that
establishes correspondence of two or more points used for triangulation. This minimizes the user effort of designating the corresponding points in different views based on visual cues. This automation is achieved by taking into account the geometric constraints imposed by both the imaging system and the object being modeled.
3-D Display
3D display refers to the mode of 3D visualization on a computer screen. The reason MedScape was formed, is to give doctors a convenient, fast and user friendly way to gain accurate 3D information for diagnosis and treatment planning. Today's "state-of-the-art", in orthodontics, orthognathic surgery and plastic and reconstructive surgery diagnosis, is two-dimensional. True three-dimensional visualization and manipulation of the 3D data set is essential for accurate diagnosis and treatment planning.
The 3D display allows for the visualization of the 3D data base (created from photos, models, X-rays, etc). This 3D visualization allows for 1) Perspective 3D viewing with shading, shadowing and monocular depth cues. 2) Straight on 3D Stereoscopic viewing and 3) Ability to view the 3D data set in a 45 degree 3D Stereoscopic viewing mode (allows for 50% more visual information). The 3D display of the 3D data set can include the following information
1) Display a dimensionally accurate representation of the patient's anatomy (face, facial contours, teeth, gingival tissue, bony anatomy via ceph X-ray or CT, MRI (wire frame and rendered), etc.)
2) A consistent Cartesian coordinate system. A right-handed Cartesian coordinate system is defined, looking at the front of an object (e.g., the front of the face), as:
X axis - Horizontal with Positive X to the Right,
Y axis - Vertical with positive Y up,
Z axis - In/Out with positive toward the viewer
All data sources are set to this right-handed Cartesian coordinate system.
3) An X, Y, Z analog in the computer display to aid the viewer in orientation of the 3D data set, for 3D perspective & 3D stereoscopic viewing.
4) Rotation of the 3D Data Set in all 3 planes of space and the ability to control the roll, pitch & yaw movements around these axes (6 degrees of freedom). The ability to see the 3D data set from any angle or view.
5) Be able to lock the mouse so the object can be rotated in only one axis at a time and in real time.
6) Be able to easily go back to the original orientation of the data set (example - Frontal View).
7) The user can define and control the rotation of the data set precisely. (1 degree Rotation or smaller)
8) The user should be able to define a rotational pattern around one, two, or three axes, together or independently.
9) The user should be able to move the 3D model in real time:
1. up/down,
2. right /left, &/or
3. in/out. (Stereoscopic vs larger/smaller or Z buffer)
10) See any virtual view at any angle Predefined views 1) Frontal, 2) Right Lateral, 3) Left Lateral, 4) SMV, 5) 45 degree right, 6) 45 degree left, 7) smile, 8) lips in repose vs lips closed. (ABO Requirements)
11) Animation in perspective 3D and in Stereoscopic 3D. (Example: open/closed animation to evaluate deviation on opening, asymmetry, etc. animate mandibular movements associated with jaw tracking).
12) The 3D display should allow for user controlled transparency of facial soft tissue to show underlying teeth and skeletal structure relationship. Transparency should be controlled by a slide bar from 0% - 100% and have predefined 20%, 40%, 60%, 80%, for quick acquisition.
13) Lighting of the 3D data set should be predefined to give the best brightness, contrast, etc. Real-time lighting changes should be possible to gain a better 3D view. This is especially important with 3D stereoscopic viewing of high contrast areas, which give poor results for the stereoscopic effect. In stereoscopic mode the lighting should allow the stereo pairs to be lighted the same; a difference in lighting of the two separate views creates ghosting.
14) A reference plane should be available to show the drawing plane, etc.
15) The use of zoom, magnify, scaling, set axis, split objects, move segments, trim, and grasp objects should be available and user controlled.
16) The 3D software program should show the wireframe, quick render and full render of the 3D data set. Also, a render window should be available to render only an area of interest.
17) The 3D display should use the photographs from which the wireframes are generated to create the photorealistic textures.
18) The camera settings should be predefined. Other settings can be included, such as scene camera, director camera, pan, tilt, roll, dolly, etc.
19) The 3D display should allow for import/export of model files (MDL, DXF, IGS, 3DS, other).
20) Import/export of picture formats (BMP, TGA, GIF, TIF, PCX, other).
21) The 3D display should allow for facial topography visualization and measurement. Facial topography contours have certain patterns that differ between people considered "Beautiful" vs. "Ugly" vs. "Normal": subtle differences in the nasal area, zygomatic area (cheek bone), lip contour, submental fold and chin area. Facial topography will be more evident in stereoscopic 3-D visualization. Features that are used to describe beauty:
1. Cheek: high vs. flat
2. Chin: asymmetry, prominence, deficiency, cleft
3. Lips: full, thin, protrusive, retrusive, commissure, vermilion border, ethnic considerations
4. Nose: size, width, flaring, alar base, nares, dorsal hump, nasal tip, symmetry, nasal cartilage deviation
5. Smile: gummy, deficient, long face, short face, symmetry
6. Facial: proportional thirds, symmetry
7. Eyes: symmetry, high, low, prominent vs. recessed
8. Glabella: prominent vs. deficient
9. Ears: symmetry, size, vertical position, morphology
Negative parallax would be the best choice for 3D stereoscopic viewing of cephalometric images.
Stereoscopic 3D imaging allows all 3 planes of space to be viewed simultaneously. This is a clear and important difference between 3D stereoscopic viewing and 3D perspective viewing. When stereoscopic 3D visualization is added to motion parallax (such as rotation of the object) there is an enhancement of visual depth.
The 45 degree angular stereoscopic viewing allows the operator to view 50% more of the image. This is an even greater reason to use stereoscopic viewing. Again, motion parallax (rotation) adds even more visual reference. It allows for improved visualization of the Z axis information.
Any 3D visual information can be created in a 3D stereoscopic mode to further enhance the visual ability to understand 3D relationships of anatomy. When motion parallax is also added, even greater visual depth information is present.
Advantages of 3D Stereo vs. Perspective 3D vs. 2D
Once the 3D model is created (patient face, teeth, skeletal structure), the software program can create the appropriate "stereo pairs" for 3D stereoscopic viewing. Lighting (brightness, contrast, shadows, etc.) can be controlled. The software can create the appropriate parallax on the screen to create the stereoscopic image when viewed with the appropriate viewing lenses (anaglyph, polarized, field sequential).
In order to view stereoscopically on a computer monitor, one can present the two separate images to the corresponding retina in each eye. Anaglyph uses red & blue lenses so that each eye only sees the image it is supposed to see. There are some limitations on using colored images with the anaglyph mode.
Other mechanisms such as polarized or field sequential are available. Precise control of the vertical and horizontal parallax is critical.
Stereo viewing of angular, linear, planes, angles, points, and volume is important.
Full color can be done with anaglyph (synthonics) but problems do arise with red, blue and green colors that are part of the image. True full color is best seen with polarized or field sequential (LCD shutters). Field sequential can be 60 or 120 Hz. The image flicker can only be eliminated with the 120 Hz. Another advantage of field sequential is that tracking devices can be incorporated to allow the viewer to visualize the 3D scene from multiple viewing angles. The multiple viewing angle is an advantage over the fixed viewing angle required by anaglyph or polarized viewing techniques.
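A minimal sketch of generating stereo pairs with controlled horizontal parallax; the eye separation, convergence distance and focal length are illustrative parameters, not values prescribed by the specification.

```python
def stereo_pair_projections(point_3d, eye_separation=65.0, convergence=600.0, focal=600.0):
    """Project one 3D point (viewer-centred mm; Z is distance from the eyes) into left-
    and right-eye views; the horizontal offset between the two projections is the
    screen parallax. Points at the convergence distance land at zero parallax."""
    x, y, z = point_3d
    half = eye_separation / 2.0
    # Parallel-axis projection with an image shift toward the convergence plane.
    left = ((x + half) * focal / z - half * focal / convergence, y * focal / z)
    right = ((x - half) * focal / z + half * focal / convergence, y * focal / z)
    return left, right

l, r = stereo_pair_projections((0.0, 0.0, 500.0))
# The left image lands to the right of the right image (crossed, i.e. negative parallax),
# so this point would appear in front of the screen plane.
print(l[0] - r[0])
```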
Models as a Data Source
The molds of the teeth give a physical model upon which arch length and treatment decisions are made. These models can be mounted on an articulator or not mounted. The articulator-mounted models give a truer 3D relationship of the teeth to
Feature Description For Cephalometric Module
Landmark ID
Landmarks can correspond with standard orthodontic landmarks, i.e., gonion, nasion, sella, pogonion, etc. These landmarks can be located in their normal position on the morphologically normal skeletal stock object and can be visually evident. These landmarks can be spatially associated to each other through the IMS data base functions. The spatially calibrated cephalometric views (H & M Associate software) can be overlaid on the stock object. The stock object will be customized to fit the cephalometric data by either drag and drop of a stock object landmark to the corresponding ceph. landmark (edited in all available projection views) or by digitizing the landmarks on the ceph and then making the association and customization from the stock object through the IMS data base. A combination of the two customization methods can be used. In addition, non-landmark cephalometric data may also be associated with the stock objects. The non-landmark data of most interest are the ridge crests (i.e., orbital rims, rim of the nasal fossa, external borders of the mandible, mandibular canal, sinus margins, etc.).
The stock objects provide a visual reference of the patient's anatomy, and the landmarks are the data base portion of the patient's file that describe the features that are unique for that individual. Therefore, only the landmark locations need to be stored in the IMS data base. The landmark locations can serve as a set of instructions for altering the stock objects. Similarly, transmitting the patient landmark locations and customizing the stock object at the receiver is a more efficient method than transmitting a customized stock object. The IMS data base can be used to compile landmark location data to establish normative data for 3D cephalometric analysis and for upgrading the stock model.
2D Analysis and 2D Normative Data
A 2D orthodontic cephalometric analysis is based on comparison of the patient's data with 2D normative data bases that have existed for decades. 2D normative data bases include: Burlington growth study, Bolton/Broadbent, Rocky Mountain Data Systems, Michigan Growth Study, to name a few. 2D analyses include: Steiner Analysis, Downs Analysis, Ricketts, Tweed, Alabama, RMDS, Wits, Owens, etc. 2D template analyses are normative 2D visualizations that are overlaid
LATERAL CEPHALOMETRIC ANALYSIS POINTS
head) to the X-ray film.
These "errors" can be reproduced when the 3D data is converted to the "traditional" 2-D data set for comparison to "traditional" 2D normative data. The 2D normative data can be adjusted for sex, age, race, size, etc. and created into a graphical representation (template) of normative data for visual comparison.
3D Analysis & 3D Normative Data
MedScape was founded on the premise to create, develop and offer 3D & 3D stereoscopic software products to the medical and dental professions. MedScape products will give the doctor the ability to diagnose and treatment plan their patients with three-dimensionally accurate models of anatomy (face, teeth & bones).
Three dimensionally accurate analysis and normative data critically depends on the accuracy of anatomical landmark location & identification.
** (See landmark ID)
A 3D visual comparison to 3D normative data adjusted for size, sex, race, and age.
3D Normative Data - This data will have to be developed through university research, as this information is limited at this time. Grayson's article in the AJO describes some 3-D growth patterns. Also, Rick Jacobson gives some 3-D data in Jacobson's book "Radiographic Cephalometrics". At this time, 3D analysis will have to be "projected" to a 2D format to compare to normative 2D data, since this is what exists at this time. There is some work being done in Australia and Canada on 3D MRI & Ceph data.
3D Analysis of Patient Data - The traditional 2D landmarks, angles, planes, etc. can be viewed on the 3D model for comparison. The 3D model will add the advantage of being able to view asymmetries of the right & left side of the face, teeth, and skeletal structure. This is a critical area that is not assessed in "traditional" 2D analyses.
** ( See Highlight Abnormal)
Centric Relation & centric occlusion will also be viewed in 3D.
** (See CR)
** see segment landmarks
** see convert to CR
** see meas. of soft tissue
** see fuse w/ ceph
** see output from photos
** see landmark tracking over time
** see compute angles
** see compute distances
The lingual concavity of the upper and lower incisors is related to the disclusion angle and the angle of the eminence. These should be congruent w/ each other. These functional components of TMJ function and dysfunction are important concepts that are critical for proper diagnosis. 3D analysis includes modeling of the critical anatomical areas and adjusting the generic wireframes to overlay the patient's anatomy. A visual representation of "normal" can be overlaid over the patient's "abnormal" for direct comparison.
Custom Analysis
The doctor will want to customize their analyses to include parts of various 2D & 3D analyses.
The doctor can define which components of each to include.
MedScape will allow the enduser to define, name, save and employ a custom analysis. This analysis can be implemented as a macro function.
Growth Forecasting
Growth forecasting has always been a goal of cephalometric diagnosis and treatment planning since the early days. It became popular with Ricketts' introduction of RMDS growth forecasts.
2D growth forecasting has had limited value. Short term forecasting has been acceptable at times, but long term forecasting has been inaccurate.
Rocky Mountain Data Systems (RMDS) in association w/ Dr. Bob Ricketts have the most extensive data base on 2D growth forecasting.
Lyle Johnston has developed a template that estimates "normal" growth "averages" in children.
The Burlington Growth Study is also available, along with the Broadbent/Bolton study, Michigan study, & others. All of these are 2D.
3D growth forecasting is yet to be developed and will be a critical area of study and development.
** see highlight abnormal
** see seg. Landmarks
** see landmark tracking over time
Visual Treatment Objective & Surgical Treatment Objective
** see segment landmark groups
A visual representation of a treatment plan that a Dr. decides on from using study models of the teeth, X-rays (Ceph, Frontal, Pan, etc.), and photographs of the face & teeth.
The integration of the 2D ceph with 2D video imaging is now "state of the art". Some attempts have been made to have soft tissue change in relation to changes made to the bones & tooth movements, but are only in 2D (video-ceph integration). A more important treatment planning tool would be to evaluate the soft tissue changes the Dr. & patient desires (in 3D) and see what changes NEED to occur in the teeth and skeletal structure to accomplish this soft tissue change. ** see also VTO (implants)
STO is a Surgical Treatment Objective.
This would be a surgical simulation of the movements of the bones and soft tissue to accomplish what orthodontics alone cannot achieve.
** see also Surgical planning (implants)
Orthodontic Cooperative Evaluation & Timeline Tracking of Progress
A sequential set of photographs of the face, teeth, and gingival tissues with tracking markers could allow the Dr. to track COOPERATION in areas of:
1. Mechanotherapy
2. Oral hygiene
3. abnormal growth
4. abnormal reaction to forces
5. other
Timeline tracking would allow the evaluation of progress over time. Patients ALWAYS ask "When am I getting my braces off?". Accurate 3D evaluation of cooperation and growth or surgical plans with photos would be a GREAT stride forward.
Goals of Software: Assess Treatment Progress:
a. Exceptional
b. Good
1. On Schedule
2. Ahead of Schedule
c. Fair / behind schedule
d. Poor / Delayed
Reasons for Progress Assessment:
a. Poor Co-operation
1. Head Gear
2. Elastic
3. Removable Appliance
4. Patient disinterest
5. Other
b. Missed appointments
c. Broken Appliances
d. Lost Appliance
e. Adverse Biological response
f. Unexpected complexity of case
g. Other
Modification of treatment based on Progress assessment and reasons:
a. In need of Jaw Surgery U/L/both
b. TMJ Surgery R/L/B
c. Extraction considerations
d. Parent Consult
1. Modify Treatment approach
2. Alter Fee
3. Stop Treatment
4. Other
AUTO DETECTION OF CEPHALOMETRIC LANDMARKS
Through automated local image analysis, the software simplifies the operator task of designating landmarks and traces. For example, when tracing an intensity edge in an image, as long as the user maintains the pointer in the general vicinity of the edge, the software automatically finds the edge and traces it without relying on precise pointer movement by the user.
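The local image analysis described here can be sketched as snapping the pointer to the strongest intensity edge in a small neighborhood; the gradient operator and window size below are illustrative assumptions, not the software's actual method.

```python
import numpy as np

def snap_to_edge(image, x, y, radius=6):
    """Snap a roughly placed pointer position to the strongest intensity edge
    (largest gradient magnitude) inside a small window around (x, y)."""
    img = np.asarray(image, dtype=float)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    y0, y1 = max(y - radius, 0), min(y + radius + 1, img.shape[0])
    x0, x1 = max(x - radius, 0), min(x + radius + 1, img.shape[1])
    win = mag[y0:y1, x0:x1]
    # Among the strongest-gradient pixels, keep the one closest to the pointer.
    candidates = np.argwhere(win == win.max())
    dy, dx = min(candidates, key=lambda p: (y0 + p[0] - y) ** 2 + (x0 + p[1] - x) ** 2)
    return x0 + dx, y0 + dy

# Synthetic patch with a vertical intensity edge near column 20.
img = np.zeros((40, 40)); img[:, 20:] = 255.0
print(snap_to_edge(img, x=17, y=15))   # the x coordinate snaps to the edge
```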
Patient Presentation
Generic Presentation: To demonstrate possible treatment options and provide patient education about orthodontics using a generic patient.
Custom Presentation: Demonstrate possible treatment options and outcomes using the patient's 3D anatomy.
Take Home Presentation: Create a limited software program that will display a 3D model of the patient with the ability to do some minor enhancements (smile databank). Output: floppy disc, video tape.
Arch Length Analysis and Tooth Size Discrepancy Analysis
Arch length analysis is a critical diagnostic measurement, as it can determine diagnostic decisions of extraction of permanent teeth to correct certain orthodontic problems vs. non-extraction decisions. The teeth can fit within the supporting bone (alveolar bone) of the upper and lower jaw structure. The alveolar bone is the supporting bone that surrounds the roots of the teeth. The basal bone is the main supporting structure for the jaws.
The basal bone of the lower jaw (mandible) is limited in size by its genetic potential and has limited ability for growth modification. There are possible growth modifications procedures, such as Functional Jaw Orthopedics that have some limited growth modification potential. The basal bone supports the alveolar bone which supports the teeth. The alveolar bone has the potential for adaptation to the positions of the teeth and can be modified as long as the teeth are kept within the limits of the basal bone.
The upper jaw (maxilla) has the capability of increasing its transverse dimensions via "rapid Palatal Expansion" appliances. These types of orthopedic appliances not only change the alveolar bone shape and the size but can also change the dimension of the maxillary basal bone dimension due to "sutures" that exist in the upper jaw. The lower jaw does not have sutures associated with the mandibular skeletal structure.
The maxilla is therefore capable of being increased in size to allow more room for crowded or crooked teeth to be aligned into a "normal" occlusal fit.
Extraction vs. non-extraction of permanent teeth, the decision for a surgical solution (adult) vs. growth modification (child) to resolve "Arch Length" problems is a major diagnostic decision that the orthodontist and/or Oral Surgeon can make.
Extraction vs. non-extraction decisions have traditionally been based on the space requirements of the mandible due to its inability to be changed significantly. Significant arch discrepancy in the lower arch may require extraction of selected permanent teeth to resolve the crowding problem. The orthodontist can then decide which teeth can be removed in the upper jaw, if any, to create a "normal" occlusal fit of the teeth. The teeth can fit into this ideal occlusion when the mandible is in a CR or CO position.
The Curve of Spee and the Curve of Wilson are three- dimensional relationships of the plane of occlusion when viewed from the lateral and frontal planes respectively. The analyses of these relationships of the teeth also are included in the decision making process of the orthodontist as far as the extraction vs. non-extraction treatment decisions. As the Curves "level" out the teeth could be positioned where there is no bone support leading to periodontal (gum) problems. Recession and/or alveolar bone loss could occur if not properly controlled mechanically.
In order for the teeth to "fit" normally at the end of treatment, the doctor can evaluate ALL 3D requirements of each arch, TMJ, bone configuration, etc. These include: 1. the sagittal dimensions (length), 2. the transverse dimension (width), and 3. The vertical dimension (height).
Dental extraction compensations can be accomplished in order to treat a case without surgery of the jaw structure. This compromised treatment may at times be acceptable for patients who will not accept the surgical treatment alternative or who, for medical or other reasons, are not candidates for orthognathic surgical procedures. Tooth Size Discrepancy: The sizes of the individual teeth, as they are positioned around the "catenary" type curve of the arch, take up space. The relative sizes of each tooth type (molars, bicuspids, cuspids, incisors) can be interrelated appropriately or the occlusion of the teeth will not fit properly at the end of treatment. If a discrepancy exists in the relative sizes of certain teeth in the arch, then a so-called "Bolton Tooth Size Discrepancy" exists. This tooth size discrepancy can also affect the fit of the occlusion between the opposing arches.
It is necessary to know the mesial-distal, buccal/lingual, and height measurements, along with the root lengths of the individual teeth. The root length relates to biomechanical tooth movement considerations.
Bolton tooth size discrepancies are created when there is a mismatch in the size of teeth within the respective arch. This creates a problem of alignment and proper fit of the occlusion. Knowing these discrepancies prior to treatment is critical for orthodontic diagnosis. Limitations in treatment need to be related to the patient as a part of their informed consent. Small lateral incisors, abnormal shape & form, and congenital absence are a few problems that create a compromised end result. Restorative dental procedures to correct some of these discrepancies need to be planned prior to treatment so the patient will be informed and expect follow up care. Relapse of teeth after orthodontic correction is a major consideration in orthodontic therapy. Many elaborate treatment alternatives have been devised to control relapse. The ability to three-dimensionally diagnose and treatment plan a patient may lead to improved retention of orthodontically treated cases.
Level of the Curve of Spee: The Curve of Spee is a curve of the occlusal plane as seen from the lateral view. The Curve of Wilson is the curve or construction of the occlusal plane as viewed from the frontal. The treatment of these two "curves" is important to the eventual final result of the occlusion. Orthodontists usually "flatten" these curves during treatment for occlusal correction. Uprighting the Curve of Wilson can lead to increased arch length and help to gain space for crowded teeth, up to the limit of the alveolar bone, cortical bone, and basal bone. Leveling the Curve of Spee is a routine orthodontic biomechanical effect of treatment. This leads to a better fit of the occlusion when the Curve of Spee is leveled. This curve tends to deepen slightly with age, so orthodontists routinely "over correct" the leveling of this curve to a level occlusal plane three-dimensionally. A mathematical relationship exists for a flattened Curve of Spee. This flattening is determined by the incisal edges of the anterior teeth at one end of the arch and the disto-buccal cusp tips of the lower second molars at the other end of the arch. By using the X, Y and Z coordinates of the occlusal surfaces of the teeth, a calculation of the arch circumference can be determined. The distance between any 2 points, A(X1, Y1, Z1) and B(X2, Y2, Z2), in space is the magnitude of the vector connecting them and is given by:
AB = sqrt( (X2 - X1)^2 + (Y2 - Y1)^2 + (Z2 - Z1)^2 )
Each tooth coordinate measurement represents a point in space. The total arch circumference is the magnitude of the summation of all vectors connecting these points and is given by:
Ct = sum over consecutive tooth points i, j of sqrt( (Xi - Xj)^2 + (Yi - Yj)^2 + (Zi - Zj)^2 )
where Ct represents the total arch circumference in 3D space and N is the number of teeth measured.
To compare with 2D relationships, the planar projection of the total arch circumference is calculated using a similar method, except that the depth coordinate (z), i.e., the depth of the Curve of Spee, is excluded:

C_p = \sum_{i=1}^{N-1} \sqrt{(x_{i+1} - x_i)^2 + (y_{i+1} - y_i)^2}

where C_p represents the planar projection of the total arch circumference to a lateral 2D projected view.
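For illustration only, the following minimal Python sketch computes the total arch circumference C_t and its planar projection C_p from an ordered list of occlusal measurement points, assuming the points are already expressed in the patient-centric coordinate system; the point values and the function name arch_circumference are hypothetical.

```python
import math

def arch_circumference(points, planar=False):
    """Sum of distances between consecutive occlusal measurement points.

    points -- list of (x, y, z) tooth coordinates ordered around the arch,
              e.g. from second molar to second molar.
    planar -- if True, drop the depth (Curve of Spee) coordinate to obtain
              the 2D lateral-projection circumference Cp instead of Ct.
    """
    total = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(points, points[1:]):
        dz = 0.0 if planar else (z2 - z1)
        total += math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2 + dz ** 2)
    return total

# Hypothetical occlusal points (mm) for a short arch segment:
pts = [(30.0, 5.0, 2.0), (22.0, 12.0, 1.0), (12.0, 17.0, 0.0),
       (0.0, 19.0, 0.5), (-12.0, 17.0, 0.0)]
ct = arch_circumference(pts)               # 3D circumference Ct
cp = arch_circumference(pts, planar=True)  # planar projection Cp
print(f"Ct = {ct:.2f} mm, Cp = {cp:.2f} mm, depth effect = {ct - cp:.2f} mm")
```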
Asymmetry Analysis An asymmetry analysis defines the morphology differences between the right and left halves of the mandible, maxilla and other regions of the skeleton. The symmetry of these structures should be determined through the use of landmark groupings. The procedure may include determination of the sagittal plane midline of the patient by identifying midline landmarks. The sagittal plane midline can be used to define the right and left halves of the patient.
The simplest symmetry analysis would begin with the superimposition of the right and left halves of the mandible, utilizing the sagittal plane midline reference as the registration plane. The asymmetry would be quantified by comparing the x, y, z differences in location of the corresponding right and left landmarks, and by comparing the location of these landmarks to the normal landmark location (by standard deviation). This type of analysis would allow the end user to quantify the amount of asymmetry and direct the end user to the etiology of the asymmetry. For example, when the mandible is asymmetric it is safe to assume that one side is too small or the contralateral side is too large. Comparison of the patient data with normal data would allow the clinician to determine which side is abnormal. Knowledge of the etiology of asymmetry may be critical in controlling or predicting the outcome of treatment.
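As an illustration of such a superimposition, the sketch below mirrors right-side landmarks across the mid-sagittal plane and reports the right/left difference for each landmark, assuming the y-z plane of the patient-centric coordinate system already lies on the mid-sagittal plane so that mirroring reduces to a sign change of x; the landmark names and coordinates are hypothetical.

```python
def mirror_and_compare(right, left):
    """Superimpose right-side landmarks onto the left side and quantify asymmetry.

    right, left -- dicts mapping landmark name -> (x, y, z), with the y-z plane of
                   the patient-centric coordinate system already placed on the
                   mid-sagittal plane (so mirroring is a sign change of x).
    Returns, for each landmark, the (dx, dy, dz) difference and its magnitude.
    """
    result = {}
    for name, (rx, ry, rz) in right.items():
        lx, ly, lz = left[name]
        # Reflect the right landmark across the x = 0 plane, then compare.
        dx, dy, dz = -rx - lx, ry - ly, rz - lz
        result[name] = ((dx, dy, dz), (dx * dx + dy * dy + dz * dz) ** 0.5)
    return result

# Hypothetical condylar landmarks (mm):
right = {"condylion": (55.0, -20.0, 10.0)}
left = {"condylion": (-52.0, -21.0, 10.5)}
print(mirror_and_compare(right, left))  # asymmetry of about 3.2 mm at condylion
```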
Additional analysis may include normalizing the 3-D data to the standard ABO 2-D views and performing an analysis using existing analysis models. Tools may be created to allow the end user to create a symmetry analysis.
Fit to Stock Object The spatially calibrated cephalometric views can be overlaid on the stock object. The stock object will be customized to fit the cephalometric data either by drag-and-drop of a stock object landmark to the corresponding cephalometric landmark (edited in all available projection views) or by digitizing the landmarks on the cephalogram and then making the association and customization of the stock object through the IMS database. A combination of the two customization methods can be used. In addition, non-landmark cephalometric data may also be associated with the stock objects. The non-landmark data of most interest are the ridge crests (i.e., orbital rims, the rim of the nasal fossa, the external borders of the mandible, etc.). These same methods may be employed for other stock objects, such as the teeth, TMJs, etc.
Highlight Abnormal The stock objects are a graphical representation of normal. The normal values for landmark location have been determined through an analysis of the landmark locations on many patients (Burlington study) and have been sorted by age and sex of the patient. Deviations from normal can be analyzed and statistically grouped as standard deviations from normal. Through the use of the IMS database we can define normal, and the standard deviations from normal, for individual landmarks and landmark groupings.
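One way such a standard-deviation grouping might be expressed is sketched below, assuming each look-up-table entry stores a mean location and per-axis standard deviations for a landmark; the landmark name, values and color bands are illustrative assumptions, not values from the Burlington data.

```python
def classify_deviation(patient_xyz, normal_mean, normal_sd):
    """Return a color band for a landmark based on its worst-axis deviation,
    expressed in standard deviations from the normal (look-up-table) value."""
    worst = max(abs(p - m) / s
                for p, m, s in zip(patient_xyz, normal_mean, normal_sd))
    if worst <= 1.0:
        return "green"    # within 1 SD of normal
    if worst <= 2.0:
        return "yellow"   # 1-2 SD: borderline
    return "red"          # > 2 SD: highlight as abnormal

# Hypothetical look-up-table entry for one landmark (values in mm):
lut = {"menton": ((0.0, -105.0, 5.0), (1.5, 3.0, 2.0))}
print(classify_deviation((2.0, -110.0, 5.5), *lut["menton"]))  # -> "yellow"
```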
Following the completion of the customization of the calibrated cephalometric projections to the stock object, the IMS database will perform an assessment of landmark locations and groupings of landmarks by comparing the patient data to normal data through look-up tables (LUT) contained in the IMS database. After this analysis the computer can highlight in color, on the displayed model, those landmarks and landmark groupings that fall outside the statistical normal range.
Gnathological Normal Gnathological normal refers to the cusp-fossa spatial relationships, the tooth-to-tooth relationships among and between the maxillary and mandibular teeth, and tooth position relative to the supporting alveolar and basal bone. The tooth and its 3-D location and spatial orientation relative to the tooth long axis can be defined through tracking of landmarks located on the root apices or apex, cusp tip(s) or incisal edge, and the mesial and distal greatest heights of contour. This specialized segmentation of teeth allows them to function as objects. A database that represents gnathologically normal teeth can be used when rendering the stock object teeth in combination with the skeleton. Deviations from the gnathological normal can be described in a similar fashion to the method used for cephalometric analysis. A pseudo-colored visual display of the anatomy that falls outside the statistical normal will facilitate quick identification of abnormal tooth position, etc.
Airway Analysis The airway can be divided into the nasal airway, the nasopharynx and the oropharynx. The nasal airway begins at the anterior opening of the nasal passage and ends at the posterior margin of the nasal fossa. The nasopharynx begins at the posterior margin of the nasal fossa and ends at the most inferior area of the soft palate. The oropharynx begins at the inferior margin of the soft palate and ends at the superior margin of the vallecula. An airway analysis includes a mathematical description of nasal septum symmetry about the mid-sagittal plane.
Appendix F The following describes alternative embodiments of the invention.
The problems:
Accurate 3-dimensional models are not available to all segments of medicine and dentistry because preferred image acquisition tools may not provide data in a format that is easily converted to 3D. When considering 3D techniques, the data density is so great that it requires special platforms for data handling, complex algorithms for data reduction, an appropriate user interface, etc.
Summary: Three basic imaging software modules (Sculptor, Clinician and Executor) comprise the Acuscape suite of software designed for medical use. These software packages are further customized with application-specific software to provide benefit to specific medical and dental disciplines. In combination, these three software packages and the associated application software produce spatially accurate 3-dimensional replicas (.pro files) of patient anatomy that allow for the extraction of clinically relevant data, and for .pro file manipulation, storage, measurement, modification and display, for purposes that include diagnosis, treatment planning, treatment simulation and outcomes measurements.
The Sculptor is used at an image processing center (server) and passes the acquired images and measurement files (.sci files) to the Clinician user (client) for the generation of the .pro file and subsequent use. Sculptor Module: Images are acquired directly into a patient session file from input devices that include digital cameras, flatbed scanners, x-ray sensors, etc., or from image storage files. An Acuscape image calibration frame is worn during image acquisition, and shadows of the calibration markers are embedded on the resultant images.
The images are first spatially calibrated and a patient-centric co-ordinate system is transferred to the images. This co-ordinate system is adjusted or optimized to best fit the patient's anatomy. Part of this adjustment superimposes the y-z plane of the co-ordinate system on the patient's mid-sagittal plane. The subsequent measurements store data utilizing this constructed co-ordinate system. The calibrated images can be stored by the Executor or displayed and measured. Multiple images or image sets can be combined in a common 3-D database, displayed as a combined set and selectively enhanced for improved measurements. These enhancements include magnification and equalization of selected image regions. Three-space measurements can be performed as point, closed-loop trace and linear trace measurements. The measurement routine occurs simultaneously on all images displayed in the Sculptor. The selected image is measured, and a corresponding epipolar line is constructed on the adjacent images to assist with locating the same point on those images. The x, y and z locations of all of the measurement points and lines (series of points) are stored in a measurement file. The measurement files are converted to an export file that contains all .jpg images and a .sci file. The .sci files contain the calibration information, camera parameters and the x, y and z locations of all traces and landmarks.
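The epipolar construction can be illustrated with the following sketch, which assumes a fundamental matrix relating two cross-calibrated views is available from the spatial calibration step; the matrix values, pixel coordinates and function names are hypothetical.

```python
import numpy as np

def epipolar_line(F, point_xy):
    """Given the fundamental matrix F relating two cross-calibrated images,
    return the epipolar line a*x + b*y + c = 0 in the second image that
    corresponds to a point picked in the first image."""
    x = np.array([point_xy[0], point_xy[1], 1.0])  # homogeneous pixel coordinates
    a, b, c = F @ x
    return a, b, c

def line_endpoints(line, width):
    """Two endpoints of the line clipped to the image width, for drawing the
    guide line on the adjacent image."""
    a, b, c = line
    y_at = lambda x: -(a * x + c) / b
    return (0.0, y_at(0.0)), (float(width), y_at(float(width)))

# Hypothetical fundamental matrix between a lateral and a frontal cephalogram:
F = np.array([[0.0, -1e-6, 1e-3],
              [1e-6, 0.0, -2e-3],
              [-1e-3, 2e-3, 1.0]])
print(line_endpoints(epipolar_line(F, (512.0, 300.0)), width=1024))
```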
The Sculptor facilitates measurements on all calibrated or cross-calibrated images. Cross calibration refers to calibrating multiple images and image types to the same 3D co-ordinate system. These images can include, but are not limited to, x-rays, tomographs, CT scans, visual-band images, MRI, ultrasound, infrared and radar. The image type and projection angulation will be dictated by the intended purpose of the imaging study. This is an application-specific program that has been optimized to facilitate the original imaging goals. The Sculptor will be used to calibrate and measure the images (spatially, or by color or gray-scale value).
Executor Module: This module works in the background to manage images for the Sculptor and Clinician modules. It is a patient-centric relational database with multiple tables that stores, retrieves and transports patient image files. Patient transport files contain the .jpg images and a .sci file. The .sci file is created by the Sculptor and transported to the Clinician by any number of means, including modem, internet, floppy disk, etc.
Clinician Module: The module is intended to exist primarily in the doctor's
office (end user). The Clinician will receive the transport file from the Sculptor
via the Executor. The patient-specific measurements contained within the .sci file are used by the morph editor, a sub-section of the Clinician, to morph a "stock model" to spatially match the patient's measurements. The "stock model" is a generic wireframe representation of the anatomy to be modeled. The measurements include specific linear traces, closed traces, landmarks and control points. The measurement locations are pre-programmed into the Sculptor, and the corresponding locations are mapped to the corresponding points on the stock wireframe of the Clinician. For example, the orthodontic application
includes traces of ridge crests (orbits, mandibular borders, etc.), landmarks (nasion, Sella, etc.) and control points (tooth cusp tips, etc.). These measurement locations and names associated with their precise locations on the
"stock model" are contained within the Clinician's database. The Clinician's
knowledge of the precise wireframe vertices associated with the locations of landmarks, traces and control points facilitates the automated use of the .sci file to morph a generic stock model into a patient-specific model (.pro file).
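One possible form of such a landmark-driven morph is sketched below: it moves the vertices associated with measured landmarks onto the .sci measurements and displaces every other vertex by an inverse-distance-weighted blend of the landmark displacements. This interpolation scheme and the function and parameter names are illustrative assumptions rather than the Clinician's actual algorithm.

```python
import numpy as np

def morph_stock_model(vertices, landmark_vertex_ids, patient_landmarks, power=2.0):
    """Warp a generic stock wireframe toward patient-specific measurements.

    vertices            -- (N, 3) array of stock-model vertex coordinates.
    landmark_vertex_ids -- dict: landmark name -> vertex index on the stock model.
    patient_landmarks   -- dict: landmark name -> measured (x, y, z) from the .sci data.
    Landmark vertices are moved exactly onto the measurements; every other vertex
    is displaced by an inverse-distance-weighted blend of the landmark displacements.
    """
    vertices = np.asarray(vertices, dtype=float)
    anchors = np.array([vertices[i] for i in landmark_vertex_ids.values()])
    targets = np.array([patient_landmarks[name] for name in landmark_vertex_ids])
    displacements = targets - anchors

    morphed = vertices.copy()
    for vi in range(len(vertices)):
        d = np.linalg.norm(anchors - vertices[vi], axis=1)
        if np.any(d < 1e-9):                 # this vertex is itself a landmark vertex
            morphed[vi] = targets[np.argmin(d)]
            continue
        w = 1.0 / d ** power                 # nearer landmarks pull harder
        morphed[vi] = vertices[vi] + (w[:, None] * displacements).sum(0) / w.sum()
    return morphed
```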
A specific .pro file can be retrieved via the Executor and displayed, analyzed and manipulated as a solid model in the Clinician module. The .pro file will exist
as a collection of anatomic "objects". These objects will include anatomic structures, such as each tooth, landmarks and reference planes. The spatial
locations of all objects are known and tracked by the Clinician's database. The .pro file possesses an x, y, z co-ordinate system referred to as the global co-ordinate system, while each object possesses its own co-ordinate system referred to as a local co-ordinate system. The Clinician's database monitors the spatial location of the .pro file and its sub-objects via their co-ordinate locations. The .pro file and/or any of its objects can be translated or rotated along their co-ordinate axes. The movements occur along the default global co-ordinate system unless an object or group of objects has been selected, in which case the movement occurs along the selected local co-ordinate system.
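As an illustration of moving an object along its local co-ordinate system, the sketch below rotates an object's vertices about one of its local axes using Rodrigues' rotation formula; the axis, origin and angle are supplied by the caller, and the function name is hypothetical.

```python
import numpy as np

def rotate_about_local_axis(points, origin, axis, angle_deg):
    """Rotate an object's points about one axis of its local coordinate system.

    points -- (N, 3) object vertices expressed in the global coordinate system.
    origin -- local coordinate system origin (e.g. a tooth's centroid), global coords.
    axis   -- local axis to rotate about (e.g. the tooth long axis).
    """
    origin = np.asarray(origin, dtype=float)
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    theta = np.radians(angle_deg)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' rotation formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    p = np.asarray(points, dtype=float) - origin   # express points in local frame
    return p @ R.T + origin                        # rotate, then restore global frame
```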
The orthodontic application uses a stock model of the head and portions of the neck. This stock model includes, but is not limited to, the associated
skeleton, facial soft tissues, temporomandibular joints and teeth. In the examples shown, this model currently contains more than 300 objects that can be manipulated in the Clinician module to facilitate the kinds of tasks routinely undertaken by an orthodontist.

Claims

What is claimed is:
1. An apparatus for calibrating medical images of a patient comprising: a headgear for mounting to a patient's head, and a plurality of calibration targets mounted on the headgear.
2. The apparatus of claim 1 in which the headgear is size adjustable.
3. The apparatus of claim 1 in which the headgear comprises a rigid portion to which the plurality of calibration targets are mounted.
4. The apparatus of claim 3 in which the plurality of calibration targets are mounted so as to reduce the amount of overlapping when an image of the
apparatus is captured.
5. The apparatus of claim 1 in which the plurality of calibration targets are spherical.
6. The apparatus of claim 1 in which the plurality of calibration targets comprise at least one of BBs and bearings.
7. A computer apparatus comprising: a computer having a processor and memory; and
software for execution on the processor, comprising a module for receiving
image data for a patient, for establishing a reference frame to relate anatomic locations of the patient, and for generating a 3D patient specific model from a stock model using the related anatomic locations.
8. The computer apparatus of claim 7 in which the module creates a data file
containing patient specific information for transfer to another module.
9. A computer apparatus comprising: a computer having a processor and memory; and software for execution on the processor, comprising a module for receiving
patient specific images and patient specific information, the software having access to a generic three-dimensional model, the software for
customizing the generic three-dimensional model using the patient specific information to form a customized three-dimensional model of at least a portion of the patient's anatomy.
10. The apparatus of claim 9 in which location of one or more particular three- dimensional model vertices is changed in accordance with the patient specific
information.
11. The apparatus of claim 10 in which the locations of other three-dimensional model vertices are changed to conform with the change in location of the
particular three-dimensional model vertices.
12. The apparatus of claim 9 in which the generic three-dimensional model
comprises a plurality of objects each having an individual object coordinate system referenced to a model coordinate system.
13. The apparatus of claim 12 in which the software further comprises a three- dimensional model viewer capable of manipulating the objects individually or collectively.
14. A method of capturing and handling medical images, the method comprising
the steps of: mounting a calibration frame on a patient; capturing images of the patient and the calibration frame from different perspectives; and
storing images from a patient resulting from a plurality of sessions in
respectively separate portions of a file management system so that all patient-related data is available in a single entity.
15. A method of processing medical images comprising using corresponding
points on different images of a patient to establish a patient centric coordinate system with respect to which a three-dimensional model is referenced.
16. The method of claim 15 in which the corresponding points are determined
using images of a calibration target.
17. The method of claim 16 in which images of a calibration target are
determined using blob analysis.
18. The method of claim 15 in which the corresponding points are used to
identify the location of a calibration target in three-dimensional space.
19. The method of claim 15 further comprising the step of referencing points on the medical images to the patient centric coordinate system.
20. The method of claim 15 further comprising the step of modifying the
location of particular points on the three-dimensional model to correspond to patient specific information and morphing the three-dimensional model to adjust
the locations of other points correspondingly.
21. A method of analyzing patient images comprising the steps of:
identifying one or more anatomical locations in a plurality of patient images from a particular patient imaging session;
determining anatomical location information relative to a reference frame established from the plurality of patient images; and
using the anatomical location information for patient treatment planning and execution.
22. The method of claim 21 in which an anatomical location is identified by a plurality of points constituting an outline trace of a portion of the patient's
anatomy.
23. A method of identifying corresponding points in a plurality of images which
constitute different views of a three-dimensional space, the method comprising the steps of:
determining a common reference frame for the plurality of images; selecting one point on one image;
generating a line through that point;
displaying a projection of the line in at least one of the other of the plurality of images; and using the projection of the line in at least one of the other of the plurality of images to identify a corresponding point in that image.
24. A method of identifying the location of a point in a three-dimensional space using two dimensional images which are substantially projections of a three-
dimensional model, comprising the steps of: selecting a particular point on one of the two dimensional images; selecting the same point on at least one other of the two dimensional images;
identifying the associated vertex on the model of the particular point on the
images; determining the selected point location by interpolating between known locations of model vertices, using pixel counts between the particular point and one or more sets of adjacent vertices.
25. The method of claim 24 in which distance between two points is determined
by identifying the locations of the two points and determining line length using
the coordinates determined for those points.
26. A system for using patient image information, comprising:
a computer configured to customize a generic three-dimensional model using patient specific information; a network linking the computer to a second computer; and the second computer configured to receive the patient specific information and to customize a locally stored generic three-dimensional model using the patient information.
27. A computer program product, comprising: a memory medium; and a computer program stored on the memory medium, the computer program
comprising instructions for using corresponding points on different
images of a patient to establish a common reference frame, the computer
program further for determining the relative location of anatomical locations of the anatomy of the patient using the different images and the common reference frame.
28. A computer program product, comprising:
a memory medium; and a computer program stored on the memory medium, the computer program
comprising instructions for identifying one or more anatomic locations in a
plurality of patient images from a particular patient imaging session,
calculating anatomic location information in a common reference frame
established from the plurality of images, and using the location information for patient treatment planning.
PCT/US1999/010566 1998-05-13 1999-05-13 Method and apparatus for generating 3d models from medical images WO1999059106A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA002296274A CA2296274A1 (en) 1998-05-13 1999-05-13 Method and apparatus for generating 3d models from medical images
EP99924217A EP1027681A4 (en) 1998-05-13 1999-05-13 Method and apparatus for generating 3d models from medical images
AU40769/99A AU4076999A (en) 1998-05-13 1999-05-13 Method and apparatus for generating 3d models from medical images

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US8537298P 1998-05-13 1998-05-13
US60/085,372 1998-05-13

Publications (1)

Publication Number Publication Date
WO1999059106A1 true WO1999059106A1 (en) 1999-11-18

Family

ID=22191192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/010566 WO1999059106A1 (en) 1998-05-13 1999-05-13 Method and apparatus for generating 3d models from medical images

Country Status (4)

Country Link
EP (1) EP1027681A4 (en)
AU (1) AU4076999A (en)
CA (1) CA2296274A1 (en)
WO (1) WO1999059106A1 (en)

Cited By (139)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1348393A1 (en) * 2002-03-27 2003-10-01 BrainLAB AG Medical navigation or pre-operative treatment planning supported by generic patient data
EP1348394A1 (en) 2002-03-27 2003-10-01 BrainLAB AG Planning or navigation assistance by generic obtained patient data with two-dimensional adaptation
WO2003061501A3 (en) * 2002-01-16 2003-10-16 Orthosoft Inc Method and apparatus for reconstructing bone surfaces during surgery
EP1127545A3 (en) * 2000-02-26 2004-05-06 Philips Intellectual Property & Standards GmbH Procedure for locating objects in radiotherapy
EP1417931A1 (en) * 2002-11-05 2004-05-12 EASTMAN KODAK COMPANY (a New Jersey corporation) Method for automatically producing true size radiographic image
WO2004095372A1 (en) * 2003-04-22 2004-11-04 Provincia Italiana Della Congregazione Dei Figli Dell'immacolata Concezione - Instituto Dermopatico Dell'immacolata Automatic detection of skin lesions
WO2004098378A2 (en) 2003-05-02 2004-11-18 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis treatment planning and therapeutics
EP1570800A1 (en) * 2004-03-01 2005-09-07 BrainLAB AG Method and device for determining the symmetrical plane of a three dimensional object
WO2006092600A1 (en) 2005-03-01 2006-09-08 Kings College London Surgical planning
WO2006116488A2 (en) * 2005-04-25 2006-11-02 Xoran Technologies, Inc. Ct system with synthetic view generation
WO2007017642A1 (en) * 2005-08-05 2007-02-15 Depuy Orthopädie Gmbh Computer assisted surgery system
CN1308897C (en) * 2002-09-15 2007-04-04 深圳市泛友科技有限公司 Method for forming new three-dimensional model using a group of two-dimensional photos and three-dimensional library
EP1803413A2 (en) 2005-12-30 2007-07-04 DePuy Products, Inc. Magnetic sensor array for bone registration in computer-assisted orthopaedic surgery
WO2004044787A3 (en) * 2002-11-11 2007-11-15 Albert Mehl Method for producing denture parts or for tooth restoration using electronic dental representations
EP1959391A1 (en) * 2007-02-13 2008-08-20 BrainLAB AG Determination of the three dimensional contour path of an anatomical structure
EP1982652A1 (en) * 2007-04-20 2008-10-22 Medicim NV Method for deriving shape information
WO2008129360A1 (en) * 2007-04-19 2008-10-30 Damvig Develop Future Aps A method for the manufacturing of a reproduction of an encapsulated three-dimensional physical object and objects obtained by the method
WO2009006303A2 (en) * 2007-06-29 2009-01-08 3M Innovative Properties Company Video-assisted margin marking for dental models
US7477776B2 (en) 2004-03-01 2009-01-13 Brainlab Ag Method and apparatus for determining a plane of symmetry of a three-dimensional object
US7715602B2 (en) 2002-01-18 2010-05-11 Orthosoft Inc. Method and apparatus for reconstructing bone surfaces during surgery
US7720519B2 (en) 2002-10-17 2010-05-18 Elekta Neuromag Oy Method for three-dimensional modeling of the skull and internal structures thereof
ITBO20090111A1 (en) * 2009-02-26 2010-08-27 Paolo Fiorini METHOD AND SURGICAL TRAINING APPARATUS
US7787932B2 (en) 2002-04-26 2010-08-31 Brainlab Ag Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
US7873403B2 (en) 2003-07-15 2011-01-18 Brainlab Ag Method and device for determining a three-dimensional form of a body from two-dimensional projection images
EP2039321A3 (en) * 2000-11-08 2011-04-27 Institut Straumann AG Surface recording and generation
CN102159155A (en) * 2008-09-18 2011-08-17 3形状股份有限公司 Tools for customized design of dental restorations
US8068648B2 (en) 2006-12-21 2011-11-29 Depuy Products, Inc. Method and system for registering a bone of a patient with a computer assisted orthopaedic surgery system
WO2012117122A1 (en) * 2011-03-01 2012-09-07 Dolphin Imaging Systems, Llc System and method for generating profile change using cephalometric monitoring data
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US20120259592A1 (en) * 2011-04-07 2012-10-11 Dolphin Imaging Systems, Llc System and Method for Three-Dimensional Maxillofacial Surgical Simulation and Planning
US8343159B2 (en) 2007-09-30 2013-01-01 Depuy Products, Inc. Orthopaedic bone saw and method of use thereof
US8357111B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Method and system for designing patient-specific orthopaedic surgical instruments
TWI387315B (en) * 2010-06-29 2013-02-21 Acer Inc Three dimensional liquid crystal shutter glasses
US8417004B2 (en) 2011-04-07 2013-04-09 Dolphin Imaging Systems, Llc System and method for simulated linearization of curved surface
US8858561B2 (en) 2006-06-09 2014-10-14 Blomet Manufacturing, LLC Patient-specific alignment guide
US8862200B2 (en) 2005-12-30 2014-10-14 DePuy Synthes Products, LLC Method for determining a position of a magnetic source
US8868199B2 (en) 2012-08-31 2014-10-21 Greatbatch Ltd. System and method of compressing medical maps for pulse generator or database storage
US8864769B2 (en) 2006-02-27 2014-10-21 Biomet Manufacturing, Llc Alignment guides with patient-specific anchoring elements
US8903496B2 (en) 2012-08-31 2014-12-02 Greatbatch Ltd. Clinician programming system and method
US8900244B2 (en) 2006-02-27 2014-12-02 Biomet Manufacturing, Llc Patient-specific acetabular guide and method
US8903530B2 (en) 2011-06-06 2014-12-02 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US8956364B2 (en) 2011-04-29 2015-02-17 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
WO2015027196A1 (en) * 2013-08-22 2015-02-26 Bespoke, Inc. Method and system to create custom products
US8979936B2 (en) 2006-06-09 2015-03-17 Biomet Manufacturing, Llc Patient-modified implant
US8983616B2 (en) 2012-09-05 2015-03-17 Greatbatch Ltd. Method and system for associating patient records with pulse generators
US9005297B2 (en) 2006-02-27 2015-04-14 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9053563B2 (en) 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9066727B2 (en) 2010-03-04 2015-06-30 Materialise Nv Patient-specific computed tomography guides
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9180302B2 (en) 2012-08-31 2015-11-10 Greatbatch Ltd. Touch screen finger position indicator for a spinal cord stimulation programming device
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9259577B2 (en) 2012-08-31 2016-02-16 Greatbatch Ltd. Method and system of quick neurostimulation electrode configuration and positioning
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9375582B2 (en) 2012-08-31 2016-06-28 Nuvectra Corporation Touch screen safety controls for clinician programmer
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US9471753B2 (en) 2012-08-31 2016-10-18 Nuvectra Corporation Programming and virtual reality representation of stimulation parameter Groups
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
US9507912B2 (en) 2012-08-31 2016-11-29 Nuvectra Corporation Method and system of simulating a pulse generator on a clinician programmer
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
WO2017011337A1 (en) * 2015-07-10 2017-01-19 Quantant Technology Inc. Remote cloud based medical image sharing and rendering
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9594877B2 (en) 2012-08-31 2017-03-14 Nuvectra Corporation Virtual reality representation of medical devices
US9615788B2 (en) 2012-08-31 2017-04-11 Nuvectra Corporation Method and system of producing 2D representations of 3D pain and stimulation maps and implant models on a clinician programmer
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
FR3048541A1 (en) * 2016-03-01 2017-09-08 Lyra Holding VIRTUAL CHANGE OF A PERSON'S TOOTH
US9767255B2 (en) 2012-09-05 2017-09-19 Nuvectra Corporation Predefined input for clinician programmer data entry
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
WO2019068085A1 (en) * 2017-09-29 2019-04-04 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
EP2861180B1 (en) 2012-06-15 2019-08-07 Vita Zahnfabrik H. Rauter GmbH & Co. KG Method for preparing a partial or full dental prosthesis
US10438351B2 (en) 2017-12-20 2019-10-08 International Business Machines Corporation Generating simulated photographic anatomical slices
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
IT201800006261A1 (en) * 2018-06-13 2019-12-13 PROCEDURE FOR MAKING AN ANATOMICAL PROTOTYPE
US10521908B2 (en) 2017-12-20 2019-12-31 International Business Machines Corporation User interface for displaying simulated anatomical photographs
DE10148412B4 (en) * 2000-10-11 2020-01-16 Palodex Group Oy Method and device for imaging the head area of a patient
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
US10614570B2 (en) 2017-12-20 2020-04-07 International Business Machines Corporation Medical image exam navigation using simulated anatomical photographs
CN111052186A (en) * 2017-08-08 2020-04-21 维申Rt 有限公司 Method and apparatus for measuring accuracy of a model generated by a patient monitoring system
US10668276B2 (en) 2012-08-31 2020-06-02 Cirtec Medical Corp. Method and system of bracketing stimulation parameters on clinician programmers
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US10734116B2 (en) 2011-10-04 2020-08-04 Quantant Technology, Inc. Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US10758321B2 (en) 2008-05-23 2020-09-01 Align Technology, Inc. Smile designer
WO2020222169A1 (en) 2019-05-02 2020-11-05 DePuy Synthes Products, Inc. Orthopaedic implant placement system and method
US10874487B2 (en) 2003-02-26 2020-12-29 Align Technology, Inc. Systems and methods for fabricating a dental template
US10892058B2 (en) 2017-09-29 2021-01-12 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
US11000334B1 (en) 2017-07-12 2021-05-11 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
US11051829B2 (en) 2018-06-26 2021-07-06 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
US11147652B2 (en) 2014-11-13 2021-10-19 Align Technology, Inc. Method for tracking, predicting, and proactively correcting malocclusion and related issues
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
US11207135B2 (en) 2017-07-12 2021-12-28 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
CN114596289A (en) * 2022-03-11 2022-06-07 北京朗视仪器股份有限公司 Mouth point detection method based on soft tissue contour line sampling points
US11419726B2 (en) 2012-01-20 2022-08-23 Conformis, Inc. Systems and methods for manufacturing, preparation and use of blanks in orthopedic implants
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
US11636650B2 (en) 2018-09-24 2023-04-25 K2M, Inc. System and method for isolating anatomical features in computerized tomography data
US11707327B2 (en) 2017-07-12 2023-07-25 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
WO2023156447A1 (en) * 2022-02-18 2023-08-24 3Shape A/S Method of generating a training data set for determining periodontal structures of a patient
US11950786B2 (en) 2021-07-02 2024-04-09 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11589949B1 (en) * 2018-04-05 2023-02-28 MirrorMe3D, LLC System and methods of creating a 3D medical representation for use in performing reconstructive surgeries
CN110060287B (en) * 2019-04-26 2021-06-15 北京迈格威科技有限公司 Face image nose shaping method and device

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4945914A (en) * 1987-11-10 1990-08-07 Allen George S Method and apparatus for providing related images over time of a portion of the anatomy using at least four fiducial implants
US5005578A (en) * 1986-12-16 1991-04-09 Sam Technology, Inc. Three-dimensional magnetic resonance image distortion correction method and system
US5099846A (en) * 1988-12-23 1992-03-31 Hardy Tyrone L Method and apparatus for video presentation from a variety of scanner imaging sources
WO1992006645A1 (en) * 1990-10-19 1992-04-30 St. Louis University Surgical probe locating system for head use
US5257203A (en) * 1989-06-09 1993-10-26 Regents Of The University Of Minnesota Method and apparatus for manipulating computer-based representations of objects of complex and unique geometry
US5274551A (en) * 1991-11-29 1993-12-28 General Electric Company Method and apparatus for real-time navigation assist in interventional radiological procedures
US5273429A (en) * 1992-04-03 1993-12-28 Foster-Miller, Inc. Method and apparatus for modeling a dental prosthesis
US5278756A (en) * 1989-01-24 1994-01-11 Dolphin Imaging Systems Method and apparatus for generating cephalometric images
US5356294A (en) * 1993-07-30 1994-10-18 Wataru Odomo Dental diagnostic and instructional apparatus
US5490221A (en) * 1990-10-02 1996-02-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Digital data registration and differencing compression system
WO1996010949A1 (en) * 1994-10-07 1996-04-18 Medical Media Systems Video-based surgical targeting system
JPH08335278A (en) * 1995-04-05 1996-12-17 Hitachi Medical Corp Method and device for tomographic image interpolation
WO1997003601A1 (en) * 1995-07-24 1997-02-06 Interact Medical Technologies Corporation Anatomical visualization system
US5608774A (en) * 1995-06-23 1997-03-04 Science Applications International Corporation Portable, digital X-ray apparatus for producing, storing, and displaying electronic radioscopic images
WO1997023164A1 (en) * 1995-12-21 1997-07-03 Siemens Corporate Research, Inc. Calibration system and method for x-ray geometry
US5737506A (en) * 1995-06-01 1998-04-07 Medical Media Systems Anatomical visualization system
US5740802A (en) * 1993-04-20 1998-04-21 General Electric Company Computer graphic and live video system for enhancing visualization of body structures during surgery
US5742291A (en) * 1995-05-09 1998-04-21 Synthonics Incorporated Method and apparatus for creation of three-dimensional wire frames
US5769861A (en) * 1995-09-28 1998-06-23 Brainlab Med. Computersysteme Gmbh Method and devices for localizing an instrument
US5798924A (en) * 1993-12-04 1998-08-25 Eufinger; Harald Process for producing endoprostheses
US5889524A (en) * 1995-09-11 1999-03-30 University Of Washington Reconstruction of three-dimensional objects using labeled piecewise smooth subdivision surfaces
US5910107A (en) * 1993-12-29 1999-06-08 First Opinion Corporation Computerized medical diagnostic and treatment advice method
US5920660A (en) * 1995-04-05 1999-07-06 Hitachi Medical Corporation Tomogram interpolation method for executing interpolation calculation by using pixel values on projection line
US5926568A (en) * 1997-06-30 1999-07-20 The University Of North Carolina At Chapel Hill Image object matching using core analysis and deformable shape loci

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5005578A (en) * 1986-12-16 1991-04-09 Sam Technology, Inc. Three-dimensional magnetic resonance image distortion correction method and system
US4945914A (en) * 1987-11-10 1990-08-07 Allen George S Method and apparatus for providing related images over time of a portion of the anatomy using at least four fiducial implants
US5099846A (en) * 1988-12-23 1992-03-31 Hardy Tyrone L Method and apparatus for video presentation from a variety of scanner imaging sources
US5278756A (en) * 1989-01-24 1994-01-11 Dolphin Imaging Systems Method and apparatus for generating cephalometric images
US5257203A (en) * 1989-06-09 1993-10-26 Regents Of The University Of Minnesota Method and apparatus for manipulating computer-based representations of objects of complex and unique geometry
US5490221A (en) * 1990-10-02 1996-02-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Digital data registration and differencing compression system
WO1992006645A1 (en) * 1990-10-19 1992-04-30 St. Louis University Surgical probe locating system for head use
US5274551A (en) * 1991-11-29 1993-12-28 General Electric Company Method and apparatus for real-time navigation assist in interventional radiological procedures
US5273429A (en) * 1992-04-03 1993-12-28 Foster-Miller, Inc. Method and apparatus for modeling a dental prosthesis
US5740802A (en) * 1993-04-20 1998-04-21 General Electric Company Computer graphic and live video system for enhancing visualization of body structures during surgery
US5356294A (en) * 1993-07-30 1994-10-18 Wataru Odomo Dental diagnostic and instructional apparatus
US5798924A (en) * 1993-12-04 1998-08-25 Eufinger; Harald Process for producing endoprostheses
US5910107A (en) * 1993-12-29 1999-06-08 First Opinion Corporation Computerized medical diagnostic and treatment advice method
WO1996010949A1 (en) * 1994-10-07 1996-04-18 Medical Media Systems Video-based surgical targeting system
JPH08335278A (en) * 1995-04-05 1996-12-17 Hitachi Medical Corp Method and device for tomographic image interpolation
US5920660A (en) * 1995-04-05 1999-07-06 Hitachi Medical Corporation Tomogram interpolation method for executing interpolation calculation by using pixel values on projection line
US5742291A (en) * 1995-05-09 1998-04-21 Synthonics Incorporated Method and apparatus for creation of three-dimensional wire frames
US5737506A (en) * 1995-06-01 1998-04-07 Medical Media Systems Anatomical visualization system
US5608774A (en) * 1995-06-23 1997-03-04 Science Applications International Corporation Portable, digital X-ray apparatus for producing, storing, and displaying electronic radioscopic images
WO1997003601A1 (en) * 1995-07-24 1997-02-06 Interact Medical Technologies Corporation Anatomical visualization system
US5776050A (en) * 1995-07-24 1998-07-07 Medical Media Systems Anatomical visualization system
US5889524A (en) * 1995-09-11 1999-03-30 University Of Washington Reconstruction of three-dimensional objects using labeled piecewise smooth subdivision surfaces
US5769861A (en) * 1995-09-28 1998-06-23 Brainlab Med. Computersysteme Gmbh Method and devices for localizing an instrument
WO1997023164A1 (en) * 1995-12-21 1997-07-03 Siemens Corporate Research, Inc. Calibration system and method for x-ray geometry
US5926568A (en) * 1997-06-30 1999-07-20 The University Of North Carolina At Chapel Hill Image object matching using core analysis and deformable shape loci

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
DECLERCK J. ET AL: "Automatic Registration and Alignment on a Template of Cardiac Stress and Rest SPECT Images", PROC. OF THE MATHEMATICAL METHODS IN BIOMEDICAL IMAGE ANALYSIS WORKSHOP, 22 June 1996 (1996-06-22), pages 212 - 221 *
MACIUNAS ROBERT J.: "The Application Accuracy of Stereotactic Frames", NEUROSURGERY, vol. 35, no. 4, October 1994 (1994-10-01), pages 682 - 695 *
ROBB R.A. ET AL: "Patient-specific Anatomic Models from Three Dimensional Medical Image Data for Clinical Applications in Surgery and Endoscopy", JOURNAL OF DIGITAL IMAGING, vol. 10, no. 3, August 1997 (1997-08-01), pages 31 - 35 *
SCHELLHAS K.P. ET AL: "Three-dimensional Computed Tomography in Maxillofacial Surgical Planning", ARCHIVES OF FAMILY MEDICINE: OTOLARYNGOL HEAD NECK SURGURY, vol. 114, March 1993 (1993-03-01) - April 1988 (1988-04-01), pages 438 - 442 *
See also references of EP1027681A4 *
SEERAM EDWARD: "3-D Imaging: Basic Concepts for Radiological Technologists Radiologic Technology", vol. 69, no. 2, December 1997 (1997-12-01), pages 127 - 144 *
SELLBERG M.S. ET AL: "Virtual Human: a Computer Graphics Model for Biomechanical Simulations and Computer-aided Instruction", PROC. OF THE 16TH CONF. OF THE IEEE ON ENGINEERING ADVANCES: NEW OPPORTUNITIES FOR BIOMEDICAL ENGINEERS, 6 November 1994 (1994-11-06), pages 329 - 330 *

Cited By (253)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1127545A3 (en) * 2000-02-26 2004-05-06 Philips Intellectual Property & Standards GmbH Procedure for locating objects in radiotherapy
DE10148412B4 (en) * 2000-10-11 2020-01-16 Palodex Group Oy Method and device for imaging the head area of a patient
US8982201B2 (en) 2000-11-08 2015-03-17 Institut Straumann Ag Surface mapping and generating devices and methods for surface mapping and surface generation
US8922635B2 (en) 2000-11-08 2014-12-30 Institut Straumann Ag Surface mapping and generating devices and methods for surface mapping and surface generation
US8026943B2 (en) 2000-11-08 2011-09-27 Institut Straumann Ag Surface mapping and generating devices and methods for surface mapping and surface generation
EP2039321A3 (en) * 2000-11-08 2011-04-27 Institut Straumann AG Surface recording and generation
WO2003061501A3 (en) * 2002-01-16 2003-10-16 Orthosoft Inc Method and apparatus for reconstructing bone surfaces during surgery
US7715602B2 (en) 2002-01-18 2010-05-11 Orthosoft Inc. Method and apparatus for reconstructing bone surfaces during surgery
US7194295B2 (en) 2002-03-27 2007-03-20 Brainlab Ag Medical navigation and/or pre-operative treatment planning with the assistance of generic patient data
EP1348394A1 (en) 2002-03-27 2003-10-01 BrainLAB AG Planning or navigation assistance by generic obtained patient data with two-dimensional adaptation
EP1348393A1 (en) * 2002-03-27 2003-10-01 BrainLAB AG Medical navigation or pre-operative treatment planning supported by generic patient data
US7787932B2 (en) 2002-04-26 2010-08-31 Brainlab Ag Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
CN1308897C (en) * 2002-09-15 2007-04-04 深圳市泛友科技有限公司 Method for forming new three-dimensional model using a group of two-dimensional photos and three-dimensional library
US7720519B2 (en) 2002-10-17 2010-05-18 Elekta Neuromag Oy Method for three-dimensional modeling of the skull and internal structures thereof
EP1417931A1 (en) * 2002-11-05 2004-05-12 EASTMAN KODAK COMPANY (a New Jersey corporation) Method for automatically producing true size radiographic image
US8727776B2 (en) 2002-11-11 2014-05-20 Sirona Dental Systems Gmbh Method for producing denture parts or for tooth restoration using electronic dental representations
WO2004044787A3 (en) * 2002-11-11 2007-11-15 Albert Mehl Method for producing denture parts or for tooth restoration using electronic dental representations
US9672444B2 (en) 2002-11-11 2017-06-06 Dentsply International Inc. Method for producing denture parts or for tooth restoration using electronic dental representations
US10874487B2 (en) 2003-02-26 2020-12-29 Align Technology, Inc. Systems and methods for fabricating a dental template
WO2004095372A1 (en) * 2003-04-22 2004-11-04 Provincia Italiana Della Congregazione Dei Figli Dell'immacolata Concezione - Instituto Dermopatico Dell'immacolata Automatic detection of skin lesions
EP1624823A2 (en) * 2003-05-02 2006-02-15 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis treatment planning and therapeutics
WO2004098378A2 (en) 2003-05-02 2004-11-18 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis treatment planning and therapeutics
EP1624823A4 (en) * 2003-05-02 2009-09-02 Orametrix Inc Unified workstation for virtual craniofacial diagnosis treatment planning and therapeutics
US7873403B2 (en) 2003-07-15 2011-01-18 Brainlab Ag Method and device for determining a three-dimensional form of a body from two-dimensional projection images
EP1570800A1 (en) * 2004-03-01 2005-09-07 BrainLAB AG Method and device for determining the symmetrical plane of a three dimensional object
US7477776B2 (en) 2004-03-01 2009-01-13 Brainlab Ag Method and apparatus for determining a plane of symmetry of a three-dimensional object
JP2008531163A (en) * 2005-03-01 2008-08-14 キングズ カレッジ ロンドン Surgery planning
WO2006092600A1 (en) 2005-03-01 2006-09-08 Kings College London Surgical planning
EP2471483A1 (en) * 2005-03-01 2012-07-04 Kings College London Surgical planning
WO2006116488A2 (en) * 2005-04-25 2006-11-02 Xoran Technologies, Inc. Ct system with synthetic view generation
WO2006116488A3 (en) * 2005-04-25 2006-12-21 Xoran Technologies Inc Ct system with synthetic view generation
WO2007017642A1 (en) * 2005-08-05 2007-02-15 Depuy Orthopädie Gmbh Computer assisted surgery system
US8862200B2 (en) 2005-12-30 2014-10-14 DePuy Synthes Products, LLC Method for determining a position of a magnetic source
US8148978B2 (en) 2005-12-30 2012-04-03 Depuy Products, Inc. Magnetic sensor array
EP1803413A2 (en) 2005-12-30 2007-07-04 DePuy Products, Inc. Magnetic sensor array for bone registration in computer-assisted orthopaedic surgery
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9539013B2 (en) 2006-02-27 2017-01-10 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US11534313B2 (en) 2006-02-27 2022-12-27 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9005297B2 (en) 2006-02-27 2015-04-14 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US10206695B2 (en) 2006-02-27 2019-02-19 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9700329B2 (en) 2006-02-27 2017-07-11 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US10390845B2 (en) 2006-02-27 2019-08-27 Biomet Manufacturing, Llc Patient-specific shoulder guide
US10426492B2 (en) 2006-02-27 2019-10-01 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US10507029B2 (en) 2006-02-27 2019-12-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9913734B2 (en) 2006-02-27 2018-03-13 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US8900244B2 (en) 2006-02-27 2014-12-02 Biomet Manufacturing, Llc Patient-specific acetabular guide and method
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US8864769B2 (en) 2006-02-27 2014-10-21 Biomet Manufacturing, Llc Alignment guides with patient-specific anchoring elements
US10743937B2 (en) 2006-02-27 2020-08-18 Biomet Manufacturing, Llc Backup surgical instrument system and method
US8979936B2 (en) 2006-06-09 2015-03-17 Biomet Manufacturing, Llc Patient-modified implant
US10206697B2 (en) 2006-06-09 2019-02-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US8858561B2 (en) 2006-06-09 2014-10-14 Blomet Manufacturing, LLC Patient-specific alignment guide
US10893879B2 (en) 2006-06-09 2021-01-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9993344B2 (en) 2006-06-09 2018-06-12 Biomet Manufacturing, Llc Patient-modified implant
US11576689B2 (en) 2006-06-09 2023-02-14 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US8068648B2 (en) 2006-12-21 2011-11-29 Depuy Products, Inc. Method and system for registering a bone of a patient with a computer assisted orthopaedic surgery system
EP1959391A1 (en) * 2007-02-13 2008-08-20 BrainLAB AG Determination of the three dimensional contour path of an anatomical structure
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US11554019B2 (en) 2007-04-17 2023-01-17 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8352059B2 (en) 2007-04-19 2013-01-08 Damvig Develop Future Aps Method for the manufacturing of a reproduction of an encapsulated head of a foetus and objects obtained by the method
WO2008129360A1 (en) * 2007-04-19 2008-10-30 Damvig Develop Future Aps A method for the manufacturing of a reproduction of an encapsulated three-dimensional physical object and objects obtained by the method
US9439608B2 (en) 2007-04-20 2016-09-13 Medicim Nv Method for deriving shape information
EP1982652A1 (en) * 2007-04-20 2008-10-22 Medicim NV Method for deriving shape information
WO2008128720A2 (en) * 2007-04-20 2008-10-30 Medicim Nv Method for deriving shape information
WO2008128720A3 (en) * 2007-04-20 2009-03-19 Medicim Nv Method for deriving shape information
WO2009006303A3 (en) * 2007-06-29 2009-10-29 3M Innovative Properties Company Video-assisted margin marking for dental models
US10667887B2 (en) 2007-06-29 2020-06-02 Midmark Corporation Video-assisted margin marking for dental models
WO2009006303A2 (en) * 2007-06-29 2009-01-08 3M Innovative Properties Company Video-assisted margin marking for dental models
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US11696768B2 (en) 2007-09-30 2023-07-11 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8357166B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Customized patient-specific instrumentation and method for performing a bone re-cut
US8361076B2 (en) 2007-09-30 2013-01-29 Depuy Products, Inc. Patient-customizable device and system for performing an orthopaedic surgical procedure
US8377068B2 (en) 2007-09-30 2013-02-19 DePuy Synthes Products, LLC Customized patient-specific instrumentation for use in orthopaedic surgical procedures
US8398645B2 (en) 2007-09-30 2013-03-19 DePuy Synthes Products, LLC Femoral tibial customized patient-specific orthopaedic surgical instrumentation
US11931049B2 (en) 2007-09-30 2024-03-19 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8357111B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Method and system for designing patient-specific orthopaedic surgical instruments
US10028750B2 (en) 2007-09-30 2018-07-24 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US10828046B2 (en) 2007-09-30 2020-11-10 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8343159B2 (en) 2007-09-30 2013-01-01 Depuy Products, Inc. Orthopaedic bone saw and method of use thereof
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US11024431B2 (en) 2008-05-23 2021-06-01 Align Technology, Inc. Smile designer
US10896761B2 (en) 2008-05-23 2021-01-19 Align Technology, Inc. Smile designer
US10758321B2 (en) 2008-05-23 2020-09-01 Align Technology, Inc. Smile designer
CN102159155B (en) * 2008-09-18 2014-08-13 3形状股份有限公司 Tools for customized design of dental restorations
CN102159155A (en) * 2008-09-18 2011-08-17 3形状股份有限公司 Tools for customized design of dental restorations
ITBO20090111A1 (en) * 2009-02-26 2010-08-27 Paolo Fiorini METHOD AND SURGICAL TRAINING APPARATUS
US10052110B2 (en) 2009-08-13 2018-08-21 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9839433B2 (en) 2009-08-13 2017-12-12 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US11324522B2 (en) 2009-10-01 2022-05-10 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US9066727B2 (en) 2010-03-04 2015-06-30 Materialise Nv Patient-specific computed tomography guides
US9579112B2 (en) 2010-03-04 2017-02-28 Materialise N.V. Patient-specific computed tomography guides
US10893876B2 (en) 2010-03-05 2021-01-19 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8421933B2 (en) 2010-06-29 2013-04-16 Acer Incorporated Shutter glasses capable of viewing a plurality of types of monitors whose image light outputs have different polarization directions
TWI387315B (en) * 2010-06-29 2013-02-21 Acer Inc Three dimensional liquid crystal shutter glasses
US10098648B2 (en) 2010-09-29 2018-10-16 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US11234719B2 (en) 2010-11-03 2022-02-01 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9672655B2 (en) 2011-02-11 2017-06-06 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US9053563B2 (en) 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US10223825B2 (en) 2011-02-11 2019-03-05 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US8711178B2 (en) 2011-03-01 2014-04-29 Dolphin Imaging Systems, Llc System and method for generating profile morphing using cephalometric tracing data
WO2012117122A1 (en) * 2011-03-01 2012-09-07 Dolphin Imaging Systems, Llc System and method for generating profile change using cephalometric monitoring data
US9743935B2 (en) 2011-03-07 2017-08-29 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US20120259592A1 (en) * 2011-04-07 2012-10-11 Dolphin Imaging Systems, Llc System and Method for Three-Dimensional Maxillofacial Surgical Simulation and Planning
US8417004B2 (en) 2011-04-07 2013-04-09 Dolphin Imaging Systems, Llc System and method for simulated linearization of curved surface
US8650005B2 (en) 2011-04-07 2014-02-11 Dolphin Imaging Systems, Llc System and method for three-dimensional maxillofacial surgical simulation and planning
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
US10251690B2 (en) 2011-04-19 2019-04-09 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US8956364B2 (en) 2011-04-29 2015-02-17 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9743940B2 (en) 2011-04-29 2017-08-29 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US9757238B2 (en) 2011-06-06 2017-09-12 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US8903530B2 (en) 2011-06-06 2014-12-02 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9687261B2 (en) 2011-06-13 2017-06-27 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9668747B2 (en) 2011-07-01 2017-06-06 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US11253269B2 (en) 2011-07-01 2022-02-22 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9603613B2 (en) 2011-08-31 2017-03-28 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9439659B2 (en) 2011-08-31 2016-09-13 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US10456205B2 (en) 2011-09-29 2019-10-29 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US11406398B2 (en) 2011-09-29 2022-08-09 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US10734116B2 (en) 2011-10-04 2020-08-04 Quantant Technology, Inc. Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US10842510B2 (en) 2011-10-27 2020-11-24 Biomet Manufacturing, Llc Patient specific glenoid guide
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US11602360B2 (en) 2011-10-27 2023-03-14 Biomet Manufacturing, Llc Patient specific glenoid guide
US10426493B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9936962B2 (en) 2011-10-27 2018-04-10 Biomet Manufacturing, Llc Patient specific glenoid guide
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US11298188B2 (en) 2011-10-27 2022-04-12 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US10426549B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
US11419726B2 (en) 2012-01-20 2022-08-23 Conformis, Inc. Systems and methods for manufacturing, preparation and use of blanks in orthopedic implants
US9827106B2 (en) 2012-02-02 2017-11-28 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
EP2861180B1 (en) 2012-06-15 2019-08-07 Vita Zahnfabrik H. Rauter GmbH & Co. KG Method for preparing a partial or full dental prosthesis
US9901740B2 (en) 2012-08-31 2018-02-27 Nuvectra Corporation Clinician programming system and method
US9776007B2 (en) 2012-08-31 2017-10-03 Nuvectra Corporation Method and system of quick neurostimulation electrode configuration and positioning
US10141076B2 (en) 2012-08-31 2018-11-27 Nuvectra Corporation Programming and virtual reality representation of stimulation parameter groups
US9471753B2 (en) 2012-08-31 2016-10-18 Nuvectra Corporation Programming and virtual reality representation of stimulation parameter Groups
US9507912B2 (en) 2012-08-31 2016-11-29 Nuvectra Corporation Method and system of simulating a pulse generator on a clinician programmer
US10668276B2 (en) 2012-08-31 2020-06-02 Cirtec Medical Corp. Method and system of bracketing stimulation parameters on clinician programmers
US9615788B2 (en) 2012-08-31 2017-04-11 Nuvectra Corporation Method and system of producing 2D representations of 3D pain and stimulation maps and implant models on a clinician programmer
US9375582B2 (en) 2012-08-31 2016-06-28 Nuvectra Corporation Touch screen safety controls for clinician programmer
US10083261B2 (en) 2012-08-31 2018-09-25 Nuvectra Corporation Method and system of simulating a pulse generator on a clinician programmer
US8868199B2 (en) 2012-08-31 2014-10-21 Greatbatch Ltd. System and method of compressing medical maps for pulse generator or database storage
US9594877B2 (en) 2012-08-31 2017-03-14 Nuvectra Corporation Virtual reality representation of medical devices
US9259577B2 (en) 2012-08-31 2016-02-16 Greatbatch Ltd. Method and system of quick neurostimulation electrode configuration and positioning
US10347381B2 (en) 2012-08-31 2019-07-09 Nuvectra Corporation Programming and virtual reality representation of stimulation parameter groups
US9314640B2 (en) 2012-08-31 2016-04-19 Greatbatch Ltd. Touch screen finger position indicator for a spinal cord stimulation programming device
US9180302B2 (en) 2012-08-31 2015-11-10 Greatbatch Ltd. Touch screen finger position indicator for a spinal cord stimulation programming device
US10376701B2 (en) 2012-08-31 2019-08-13 Nuvectra Corporation Touch screen safety controls for clinician programmer
US9555255B2 (en) 2012-08-31 2017-01-31 Nuvectra Corporation Touch screen finger position indicator for a spinal cord stimulation programming device
US8903496B2 (en) 2012-08-31 2014-12-02 Greatbatch Ltd. Clinician programming system and method
US9767255B2 (en) 2012-09-05 2017-09-19 Nuvectra Corporation Predefined input for clinician programmer data entry
US8983616B2 (en) 2012-09-05 2015-03-17 Greatbatch Ltd. Method and system for associating patient records with pulse generators
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9597201B2 (en) 2012-12-11 2017-03-21 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US11617591B2 (en) 2013-03-11 2023-04-04 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US10441298B2 (en) 2013-03-11 2019-10-15 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9700325B2 (en) 2013-03-12 2017-07-11 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc Universal acetabular guide and associated hardware
US10376270B2 (en) 2013-03-13 2019-08-13 Biomet Manufacturing, Llc Universal acetabular guide and associated hardware
US11191549B2 (en) 2013-03-13 2021-12-07 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US10426491B2 (en) 2013-03-13 2019-10-01 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
CN108537628B (en) * 2013-08-22 2022-02-01 贝斯普客公司 Method and system for creating customized products
US11867979B2 (en) 2013-08-22 2024-01-09 Bespoke, Inc. Method and system to create custom, user-specific eyewear
CN105637512A (en) * 2013-08-22 2016-06-01 贝斯普客公司 Method and system to create custom products
US10459256B2 (en) 2013-08-22 2019-10-29 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US10698236B2 (en) 2013-08-22 2020-06-30 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US10451900B2 (en) 2013-08-22 2019-10-22 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US10031351B2 (en) 2013-08-22 2018-07-24 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US11914226B2 (en) 2013-08-22 2024-02-27 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US9529213B2 (en) 2013-08-22 2016-12-27 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US10031350B2 (en) 2013-08-22 2018-07-24 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US11428960B2 (en) 2013-08-22 2022-08-30 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US11428958B2 (en) 2013-08-22 2022-08-30 Bespoke, Inc. Method and system to create custom, user-specific eyewear
CN108537628A (en) * 2013-08-22 2018-09-14 贝斯普客公司 Method and system for creating customized products
WO2015027196A1 (en) * 2013-08-22 2015-02-26 Bespoke, Inc. Method and system to create custom products
US9703123B2 (en) 2013-08-22 2017-07-11 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US10222635B2 (en) 2013-08-22 2019-03-05 Bespoke, Inc. Method and system to create custom, user-specific eyewear
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US11026699B2 (en) 2014-09-29 2021-06-08 Biomet Manufacturing, Llc Tibial tubercule osteotomy
US10335162B2 (en) 2014-09-29 2019-07-02 Biomet Sports Medicine, Llc Tibial tubercle osteotomy
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US11147652B2 (en) 2014-11-13 2021-10-19 Align Technology, Inc. Method for tracking, predicting, and proactively correcting malocclusion and related issues
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10925622B2 (en) 2015-06-25 2021-02-23 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US11801064B2 (en) 2015-06-25 2023-10-31 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
WO2017011337A1 (en) * 2015-07-10 2017-01-19 Quantant Technology Inc. Remote cloud based medical image sharing and rendering
FR3048541A1 (en) * 2016-03-01 2017-09-08 Lyra Holding VIRTUAL CHANGE OF A PERSON'S TOOTH
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US11207135B2 (en) 2017-07-12 2021-12-28 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
US11707327B2 (en) 2017-07-12 2023-07-25 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
US11000334B1 (en) 2017-07-12 2021-05-11 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
CN111052186A (en) * 2017-08-08 2020-04-21 维申Rt 有限公司 Method and apparatus for measuring accuracy of a model generated by a patient monitoring system
CN111052186B (en) * 2017-08-08 2023-06-02 维申Rt 有限公司 Method and apparatus for measuring accuracy of model generated by patient monitoring system
EP3673389A4 (en) * 2017-09-29 2021-07-07 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
US10892058B2 (en) 2017-09-29 2021-01-12 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
WO2019068085A1 (en) * 2017-09-29 2019-04-04 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
US11798688B2 (en) 2017-09-29 2023-10-24 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
AU2018342606B2 (en) * 2017-09-29 2023-11-09 K2M, Inc. Systems and methods for simulating spine and skeletal system pathologies
US10874460B2 (en) 2017-09-29 2020-12-29 K2M, Inc. Systems and methods for modeling spines and treating spines based on spine models
US10438351B2 (en) 2017-12-20 2019-10-08 International Business Machines Corporation Generating simulated photographic anatomical slices
US10521908B2 (en) 2017-12-20 2019-12-31 International Business Machines Corporation User interface for displaying simulated anatomical photographs
US10614570B2 (en) 2017-12-20 2020-04-07 International Business Machines Corporation Medical image exam navigation using simulated anatomical photographs
IT201800006261A1 (en) * 2018-06-13 2019-12-13 PROCEDURE FOR MAKING AN ANATOMICAL PROTOTYPE
EP3581142A1 (en) * 2018-06-13 2019-12-18 Just Digital S.r.l. Method for providing a dental prosthesis prototype
US11051829B2 (en) 2018-06-26 2021-07-06 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
US11636650B2 (en) 2018-09-24 2023-04-25 K2M, Inc. System and method for isolating anatomical features in computerized tomography data
WO2020222169A1 (en) 2019-05-02 2020-11-05 DePuy Synthes Products, Inc. Orthopaedic implant placement system and method
US11950786B2 (en) 2021-07-02 2024-04-09 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
WO2023156447A1 (en) * 2022-02-18 2023-08-24 3Shape A/S Method of generating a training data set for determining periodontal structures of a patient
CN114596289A (en) * 2022-03-11 2022-06-07 北京朗视仪器股份有限公司 Mouth point detection method based on soft tissue contour line sampling points
CN114596289B (en) * 2022-03-11 2022-11-22 北京朗视仪器股份有限公司 Mouth point detection method based on soft tissue contour line sampling points

Also Published As

Publication number Publication date
AU4076999A (en) 1999-11-29
EP1027681A1 (en) 2000-08-16
CA2296274A1 (en) 1999-11-18
EP1027681A4 (en) 2001-09-19

Similar Documents

Publication Publication Date Title
EP1027681A1 (en) Method and apparatus for generating 3d models from medical images
EP3952782B1 (en) Visual presentation of gingival line generated based on 3d tooth model
US10945813B2 (en) Providing a simulated outcome of dental treatment on a patient
US7717708B2 (en) Method and system for integrated orthodontic treatment planning using unified workstation
EP2134290B1 (en) Computer-assisted creation of a custom tooth set-up using facial analysis
Maal et al. The accuracy of matching three-dimensional photographs with skin surfaces derived from cone-beam computed tomography
US7156655B2 (en) Method and system for comprehensive evaluation of orthodontic treatment using unified workstation
EP2564375B1 (en) Virtual cephalometric imaging
US7585172B2 (en) Orthodontic treatment planning with user-specified simulation of tooth movement
US20140379356A1 (en) Method and system for integrated orthodontic treatment planning using unified workstation
US20090068617A1 (en) Method Of Designing Dental Devices Using Four-Dimensional Data
Barone et al. Geometrical modeling of complete dental shapes by using panoramic X-ray, digital mouth data and anatomical templates
Patel et al. Surgical planning: 2D to 3D
Barone et al. 3D reconstruction of individual tooth shapes by integrating dental cad templates and patient-specific anatomy

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref document number: 2296274

Country of ref document: CA

Ref country code: CA

Ref document number: 2296274

Kind code of ref document: A

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: KR

WWE Wipo information: entry into national phase

Ref document number: 1999924217

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 502555

Country of ref document: NZ

WWE Wipo information: entry into national phase

Ref document number: 40769/99

Country of ref document: AU

WWP Wipo information: published in national office

Ref document number: 1999924217

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWE Wipo information: entry into national phase

Ref document number: 09462752

Country of ref document: US

WWW Wipo information: withdrawn in national office

Ref document number: 1999924217

Country of ref document: EP