WO2001084479A1 - Method and system for scanning a surface and generating a three-dimensional object - Google Patents

Method and system for scanning a surface and generating a three-dimensional object

Info

Publication number
WO2001084479A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
feature
data
model
point
Prior art date
Application number
PCT/US2001/012107
Other languages
French (fr)
Inventor
Rüdger Rubbert
Peer Sporbert
Thomas Weise
Rohit Sachdeva
Original Assignee
Orametirix, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/560,132 external-priority patent/US6771809B1/en
Priority claimed from US09/560,583 external-priority patent/US6738508B1/en
Priority claimed from US09/560,584 external-priority patent/US7068836B1/en
Priority claimed from US09/560,644 external-priority patent/US6413084B1/en
Priority claimed from US09/560,133 external-priority patent/US6744932B1/en
Priority claimed from US09/560,645 external-priority patent/US6728423B1/en
Priority claimed from US09/560,131 external-priority patent/US6744914B1/en
Priority claimed from US09/616,093 external-priority patent/US6532299B1/en
Priority to JP2001581218A priority Critical patent/JP4206213B2/en
Priority to EP01925005A priority patent/EP1287482A4/en
Priority to AU2001251606A priority patent/AU2001251606A1/en
Application filed by Orametirix, Inc. filed Critical Orametirix, Inc.
Publication of WO2001084479A1 publication Critical patent/WO2001084479A1/en


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2509Color coding
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C7/00Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C7/00Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
    • A61C7/12Brackets; Arch wires; Combinations thereof; Accessories therefor
    • A61C7/14Brackets; Fixing brackets to teeth
    • A61C7/146Positioning or placement of brackets; Tools therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C9/00Impression cups, i.e. impression trays; Impression methods
    • A61C9/004Means or methods for taking digitized impressions
    • A61C9/0046Data acquisition means or methods
    • A61C9/0053Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • A61C9/006Optical means or methods, e.g. scanning the teeth by a laser or light beam projecting one or more stripes or patterns on the teeth
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C7/00Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
    • A61C7/002Orthodontic computer assisted systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02Prostheses implantable into the body
    • A61F2/30Joints
    • A61F2/3094Designing or manufacturing processes
    • A61F2/30942Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, CT or NMR scans, finite-element analysis or CAD-CAM techniques
    • A61F2002/30953Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, CT or NMR scans, finite-element analysis or CAD-CAM techniques using a remote computer network, e.g. Internet
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30036Dental; Teeth

Definitions

  • the present invention relates generally to the mapping of objects, and more specifically to creating three-dimensional models of objects, to registering object portions to generate a model of the object, to registering scanned portions of the object to provide a three-dimensional object, to scanning anatomical structures for treatment and diagnosis as well as for developing and manufacturing medical and dental devices and appliances, and to providing specific images to aid the mapping of objects.
  • anatomical devices such as prosthetics, orthotics, and appliances such as orthodontics
  • Current methods of generating anatomical devices are subjective, whereby a practitioner specifies, or designs, the anatomical device based upon subjective criteria such as the practitioner's view of the anatomical structure, the location where a device is to be used, and the practitioner's experience and recall of similar situations.
  • the use of subjective criteria results in the development of an anatomical device that can vary significantly from practitioner to practitioner, and prevents the acquisition of a knowledge database that can be used by others.
  • impressions are subject to distortion, wear, and damage, have a limited shelf life, are imprecise, require additional cost to generate multiple copies, and have an accuracy that is not readily verifiable. Therefore, whether an impression or a model of a structure is a true representation of the anatomical structure is not readily verifiable.
  • impression processes are generally uncomfortable and inconvenient for patients, require a visit to a practitioner's office, and are time consuming.
  • Another attempt to make development less subjective includes using two-dimensional images.
  • the use of 2-dimensional images as known cannot provide precise structural information, and such images must still be subjectively interpreted by the practitioner.
  • the manufacturing of the device is still based upon a subjective interpretation.
  • Prior art Figure 1 illustrates an object 100 having visible surfaces 101- 104.
  • the visible surfaces 101-103 form a rectangular shape residing on top of a generally planar surface 104.
  • Projected onto the object 100 is an image which includes the line 110.
  • the image of line 110 is received by a viewing device, such as a camera, (not shown) and processed in order to determine the shape of that portion of object 100 where the line 110 resides.
  • By moving the line 110 across the object 100 it is possible to map the entire object 100.
  • Limitations associated with using an image comprising a single line 110 are that a significant amount of time is needed to scan the object 100 to provide an accurate map, and that a fixed reference point is needed at either the scanner or the object.
  • Figure 2 illustrates a prior art solution to reduce the amount of time taken to scan an object. Specifically, Figure 2 illustrates an image including lines 121 through 125. By providing multiple lines, it is possible to scan a greater surface area at once, thus allowing for more efficient processing of data associated with the object 100. Limitations of using patterns such as are illustrated in Figure 2 include the need for a fixed reference point, and that the surface resolution capable of being mapped can be reduced because of the potential for improper processing of data due to overlapping of the discrete portions of the image.
  • Prior art Figure 3 illustrates the shapes of Figures 1 and 2 from a side view such that only surface 102 is visible.
  • the projection device projects a pattern in a direction perpendicular to the surface 101 which forms the top edge of surface 102 in Figure 3.
  • an imaginary line from the center of the projection lens to the surface is referred to as the projection axis, the rotational axis of the projection lens, or the centerline of the projection lens.
  • an imaginary line from a center point of the viewing device (not shown), referred to as the view axis, the rotational axis of the viewing device, or the centerline of the viewing device, extends in the direction in which the viewing device is oriented.
  • the physical relationship of the projection axis and the view axis with respect to each other is generally known.
  • the projection axis and the view axis reside in a common plane.
  • the relationship between the projection system and the view system is physically calibrated, such that the relationship between the projector and the view device is known.
  • the term point of reference describes the location from which a third person, such as the reader, views an image. For example, for Figure 2, the point of reference is above and to the side of the corner formed by surfaces 101, 102, and 103.
  • Figure 4 illustrates the object 100 with the image of Figure 2 projected upon it where the point of reference is equal to the projection angle.
  • when the point of reference is equal to the projection angle, no discontinuities appear in the projected image.
  • the lines 121-125 appear to be straight lines upon the object 100.
  • when the point of reference is equal to the projection axis, no useful data for mapping objects is obtained, because the lines appear to be undistorted.
  • Figure 5 illustrates the object 100 from a point of reference equal to the view axis of Figure 3.
  • the surfaces 104, 103, and 101 are visible because the view axis is substantially perpendicular to the line formed by surfaces 101 and 103, and is to the right of the plane formed by surface 102 (see Figure 2), which is therefore not illustrated in Figure 5. Because of the angle at which the image is being viewed, or received by the viewing device, the line pairs 121 and 122, 122 and 123, and 123 and 124 coincide to give the impression that they are single continuous lines. Because line 125 is projected upon a single-level surface elevation, surface 104, line 125 is a continuous single line.
  • the line pairs 121 and 122, 122 and 123, and 123 and 124 will be improperly interpreted as single lines.
  • the two-tiered object illustrated in Figure 2 may actually be mapped as a single-level surface, or otherwise inaccurately displayed, because the processing steps cannot distinguish between the line pairs.
  • Figure 6 illustrates a prior art solution for overcoming the problem described in Figure 5. Specifically, Figure 6 illustrates the shape 100 having an image projected upon it whereby a plurality of lines having different line widths, or thickness, are used. Figure 7 illustrates the pattern of Figure 6 from the same point of reference as that of Figure 5.
  • Figure 8 illustrates from a side point of reference a structure having a surface 710 with sharply varying features.
  • the surface 710 is illustrated to be substantially perpendicular to the point of reference of Figure 8.
  • the object 700 has side surfaces 713 and 715, and top surfaces 711 and 712. From the point of reference of Figure 8, the actual surfaces 711, 712, 713 and 715 are not viewed, only their edges are represented.
  • the surface 711 is a relatively steep sloped surface, while the surface 712 is a relatively gentle sloped surface.
  • a first line 721 has a width of four.
  • a second projected line 722 has a width of one.
  • a third projected line 723 has a width of eight.
  • the line 721, having a width of four, is projected onto a relatively flat surface 714. Because of the angle between the projection axis and the view axis, the actual line 721 width viewed at the flat surface 714 is approximately two. If the lines 722 and 723 were also projected upon the relatively flat surface 714, their respective widths would vary by approximately the same proportion as that of line 721, such that the thickness can be detected during the analysis steps of mapping the surface. However, because line 722 is projected onto the angled surface 711, the perspective from the viewing device along the viewing axis is such that the line 722 has a viewed width of two.
  • Line 722 appears to have a width of two because the steep angle of the surface 710 allows for a greater portion of the projected line 722 to be projected onto a greater area of the surface 711. It is this greater area of the surface 711 that is viewed to give the perception that the projected line 722 has a thickness of two.
  • line 723 is affected by surface 712 to give the perception that the projected line 723, having an actual width of eight, has a width of two. This occurs because the angle of the surface 712, relative to the viewing device, allows the surface area with the projected line 723 to appear to have a width of two. The results of this phenomenon are further illustrated in Figure 9.
  • Figure 9 illustrates the shape 700 of Figure 8 from the point of reference of the view axis.
  • the lines 721-723 are projected onto the surface 714 in such a manner that the difference between the line thickness can be readily determined. Therefore, when an analysis of the surface area 714 occurs, the lines are readily discernable based upon the viewed image.
  • the line 722 can be erroneously identified as being line 721 because not only are the widths the same, but line 722 on surface 711 lines up with line 721 on surface 714.
  • the line 723, having a projected width of eight has a viewed width of two. Therefore, during the analysis of the received images, it may not be possible to distinguish between lines 721, 722, and 723 on surfaces 711 and 712. The inability to distinguish between such lines can result in an erroneous analysis of the surfaces.
  • One proposed method of scanning, disclosed in German patent DE 198 21 611.4, used a pattern that had rows of black and white triangles and squares running parallel to a plane of triangulation.
  • the rows used measuring features that include a digital encrypted pattern.
  • a break in the sequence can result due to a portion of the pattern being hidden.
  • the disclosed encrypted pattern is such that breaks in the sequence can result in the inability to decode the pattern, since it may not be possible to know which portion of the pattern is missing.
  • a further limitation of the type of encoding described is that distortion can cause one encoding feature to look like another. For example, a triangle can be made to look like a square.
  • Figure 1 illustrates an object being scanned by a single line in accordance with the prior art
  • Figure 2 illustrates an object being scanned by a plurality of lines in accordance with the prior art
  • Figure 3 illustrates a projection axis and a view axis associated with the lines of Figure 2 in accordance with the prior art
  • Figure 4 illustrates the object of Figure 1 from a point of reference equal to the projection axis of Figure 3;
  • Figure 5 illustrates the object of Figure 3 from the view axis of Figure 3;
  • Figure 6 illustrates an object having a plurality of lines of varying thickness projected upon it in accordance with the prior art
  • Figure 7 illustrates the object of Figure 6 from a point of reference equal to the view axis as shown in Figure 3;
  • Figure 8 illustrates an object from a side view having varying projected line thickness in accordance with the prior art
  • Figure 9 illustrates the object of Figure 8 from a point of reference equal to the view axis of Figure 8;
  • Figure 10 illustrates a system in accordance with the present invention
  • Figure 11 illustrates a portion of the system of Figure 10 in accordance with the present invention
  • Figure 12 illustrates, in flow diagram form, a method in accordance with the present invention
  • Figure 13 illustrates the object of Figure 3 from a point of reference equal to the view axis of Figure 3 in accordance with the present invention
  • Figure 14 illustrates the object of Figure 3 from a point of reference equal to the view axis of Figure 3 in accordance with the present invention
  • Figure 15 illustrates an object having a pattern projected upon it in accordance with the present invention
  • Figure 16 illustrates a table identifying various types of pattern components in accordance with the present invention
  • Figure 17 illustrates a set of unique identifiers in accordance with the present invention
  • Figure 18 illustrates a set of repeating identifiers in accordance with the present invention
  • Figures 19-22 illustrate, in flow diagram form, a method in accordance with the present invention
  • Figure 23 illustrates a sequence of images to be projected upon an object in accordance with an embodiment of the present invention
  • Figure 24 illustrates an image having varying features in accordance with an embodiment of the present invention
  • Figure 25 illustrates a projected image feature being reflected off surfaces at different depths in accordance with a preferred embodiment of the present invention
  • Figure 26 illustrates the projected image of Figure 25 as viewed at the different depths
  • FIGS. 27-30 illustrate a dentition object from various perspectives in accordance with preferred embodiments of the present invention.
  • Figure 31 illustrates a method in accordance with a specific embodiment of the present invention
  • Figures 32 and 33 illustrate a dentition object being scanned from various perspectives in accordance with preferred embodiments of the present invention
  • Figure 34 illustrates primitive shapes for modeling a dentition object
  • Figures 35 and 36 illustrate methods in accordance with a specific embodiment of the present invention
  • Figure 37 illustrates a graphical representation of a method for selecting various entry points for registration in accordance with a preferred embodiment of the present invention
  • Figures 38-43 illustrate methods in accordance with a specific embodiment of the present invention.
  • FIGS 44-52 illustrate specific flows in accordance with specific embodiments of the present invention.
  • an image is projected upon a surface.
  • the image can include a pattern having a plurality of individual shapes used to measure and map the surface.
  • the plurality of individual shapes include features that are detectable in a direction parallel to the plane formed by a projection axis of the projected shapes and a point associated with a view axis.
  • the image further comprises a feature containing encoding information for identifying the plurality of shapes individually.
  • the encoding feature varies in a direction substantially orthogonal to a plane formed by the projection axis and a point of a view axis, and can be a separate feature from each of the plurality of individual shapes, can be a feature integral to the plurality of individual shapes, and/or be displayed at different time intervals from the plurality of individual shapes.
  • the feature containing encoding information is oriented such that the encoding information is retrieved along a line substantially perpendicular to a plane formed by the projection axis and the point along the view axis.
  • the feature is used to perform multiframe reference independent scanning. In a specific embodiment, scanned frames are registered to one another.
  • Figures 10 and 11 represent a system for implementing a specific embodiment of the present invention
  • Figures 12, and 19-22 illustrate specific methods in accordance with the present invention
  • Figures 13-18, 23, and 24 illustrate specific implementations of the method in combination with the system.
  • Figures 44-52 illustrate a specific method and apparatus of another inventive embodiment that uses three-dimensional scan data of an anatomical structure, which may be obtained in the specific manner described herein.
  • the three-dimensional scan data is transmitted to a remote facility for further use.
  • the three-dimensional scan data can represent the anatomy of an anatomical structure which is used to design an anatomical device, manufacture an anatomical device, monitor structural changes of the anatomy, archive data pertaining to the anatomical structure, perform a closed-loop iterative analysis of the anatomical structure, perform an interactive consultation of the structure, perform simulations based upon the structure, make a diagnosis related to the anatomical structure, or determine a treatment plan based on the anatomical structure.
  • Figure 10 illustrates a system controller 951 that provides control signals to the scanning device 980.
  • the scanning device 980 projects an image bound by lines 962 and 963, and retrieves, or views, the images within the reflected lines 972 and 973.
  • the system controller 951 provides specific information to the scanner 980 specifying a specific image to be projected upon the surface 991 of the object 990.
  • the reflected image is captured by the scanning device 980, which in turn provides the captured information back to the system controller 951.
  • the captured information can be provided back to system controller 951 automatically, or can be stored within the scanning device 980 and retrieved by the system 951.
  • the image data, once received by the system controller 951, is analyzed in order to determine the shape of the surface 991. Note that the analysis of the received data can be performed either by the system controller 951, or by an external processing device that is not shown.
  • the scanning device 980 includes a projecting device (projector) 960 and a viewing device (viewer) 970.
  • the projector 960 is oriented such that the image is projected on the object 990.
  • the projector 960 has a projection axis 961.
  • the projection axis 961 begins at the center of the lens projecting the image and is representative of the direction of projection.
  • the viewer 970 has a view axis 971 that extends from the center of the lens associated with the viewer 970 and represents the direction from which images are being received.
  • Figure 11 illustrates in greater detail the system controller 951 of Figure 10.
  • the system controller 951 further includes data processor 952, a projection image representation 953, the projector controller 954, and a viewer controller 955.
  • the viewer controller 955 provides the interface needed to receive data from the viewer 970 representing the reflected image data.
  • the reflected image data is received from the viewer 970 at the viewer controller 955, and subsequently provided to the data processor 952.
  • the projector controller 954 provides the interface necessary to control the projector 960.
  • the projector controller 954 provides the projector 960 with the image to be projected in a format supported by the projector.
  • the projector 960 projects the image onto the surface of the object.
  • the projector controller 954 receives or accesses the projection image representation 953 in order to provide the projector with the image.
  • the projection image representation 953 is an electronic representation of the image stored in a memory location.
  • the stored image can represent a bit mapped image, or other standard or custom protocol used to define the image to be projected by the projector 960.
  • the projection image is a digital image (electrically generated)
  • because the representation can be stored in memory by the data processor 952, thereby allowing the data processor 952 to modify the projection image representation, it is possible to vary the image as necessary in accordance with the present invention.
  • the projection image representation 953 need not be present. Instead, the projection controller 954 may select one or more transparencies (not illustrated) associated with the projector 960. Such transparencies can include any combination of films, plates, or other types of reticle devices that project images.
  • the data processor 952 controls the projection and reception of data through the controllers 954 and 955, respectively.
  • Figure 12 illustrates a method in accordance with the present invention that will be discussed with reference to the system of Figure 10 and the accompanying Figures.
  • projection/view plane refers to a plane formed by the projection axis and at least one point of the view axis.
  • the term projection/view plane is best understood with reference to Figure 3, assuming that Figure 3 represents a cross section of the object 100.
  • the projection axis illustrated is directed such that it lies entirely within the plane formed by the sheet of paper including Figure 3.
  • the view axis of Figure 3 is also lying entirely within the plane represented by the sheet of paper of Figure 3.
  • the projection/view plane formed by the projection axis of Figure 3 and at least one point of the view axis of Figure 3 includes the sheet of paper on which the Figure is drawn.
  • the projection/view plane can be described to contain substantially all of the projection axis and at least one point of the view axis, or all of the view axis and at least one point of the projection axis. For purposes of discussion herein, it will be assumed that the point of the view axis nearest the view device is the point to be included within that projection/view plane.
  • the projection/view plane described with reference to Figure 3 would be substantially orthogonal to the surface 104, and orthogonal to each of the lines 121-125.
  • the projection/view plane is represented by line 99, which represents the plane from an edge view intersecting the lines 121-125.
  • an image having an encoding feature is projected
  • each of the shapes or patterns 931-935 represent an encoding feature.
  • Each of the individual features 931-935 has a component(s) that varies in a direction orthogonal to the projection view plane.
  • feature 933 varies orthogonal to the projection plane such that three individual lines can be identified.
  • By varying the thicknesses of the three individual lines, a unique pattern is associated with each of the features 931-935.
  • the bar code feature 933 varies orthogonally between no line, thin line, no line, thick line, no line, thin line, and no line.
  • the individual lines of the feature 933 are projected parallel to the projection/view plane. Projecting lines parallel to the projection/view plane reduces, or eliminates, the viewed distortion effects of surface topology on the width of the lines.
  • the thickness, or relative thickness, of each individual line of the feature 933 can be readily identified independent of surface topology. As a result, the feature 933 can be identified substantially independent of surface topology.
  • Figure 13 displays a specific embodiment of an image having five separate lines (measuring features) 431-435.
  • the lines 431-435 illustrated have lengths that run substantially orthogonal to the projection view plane, and are uniformly spaced from each other in a direction parallel to the projection/view plane. By providing a plurality of lines which are detectable in the direction parallel to the projection/view plane, multiple measuring lines can be viewed and analyzed simultaneously.
  • In addition to the lines 431-435, five unique bar codes 931-935 are also illustrated. Each of the unique bar codes (variable features) 931-935 is associated with, and repeated along, a respective measuring feature 431-435. In other implementations, each bar code can be repeated along a measuring feature more than the two times illustrated. Note that the bar codes are illustrated as repeating sets. In other implementations, the bar codes would not need to be grouped in sets.
  • the lines 431-435 and bar codes 931-935 are generated using visible light that is low-intensity, such that the pattern is eye-tolerant and skin-tolerant.
  • the lines 431-435 can be viewed as white lines, and the bar codes 931-935 can be viewed as specific colors or combinations of colors.
  • high-intensity or laser light can also be used depending upon the application.
  • the lines 432 and 433 appear to be a continuous line at the edge of object 101.
  • the lines 432 and 433 can be distinguished from each other by analyzing the (encoding feature) barcodes associated with each line. In other words, where line 432 and line 433 appear to the viewer to be a common line, it can now be readily determined that they are two different lines because the bar code associated with line 432 on the left would not be the same as the bar code associated with line 433 on the right.
  • the analysis of the retrieved images would determine that there is a discontinuity somewhere between the left most bar code 932 and the right most bar code 933 causing the line segments 432 and 433 to appear as a common line.
  • the location of such an edge can be determined with greater precision by providing repeating bar code patterns in relatively close proximity to one another. For example, the edge where surface 102 meets surface 101 can be determined only to an accuracy equal to the spacing between adjacent bar codes. This is because when the analysis encounters what appears to be a single line having two different bar codes it is unknown where between the two bar codes the discontinuity has occurred. Therefore, by repeating the bar code more frequently along the measuring lines of Figure 13 the location of discontinuities can be more accurately identified.
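  • As an illustrative sketch only (not taken from the patent; the data layout and helper names are assumptions), the discontinuity localization described above can be expressed as a comparison of the bar codes decoded at successive positions along an apparently continuous viewed line:

```python
# Illustrative sketch (not from the patent): locating a hidden edge on an
# apparently continuous viewed line by comparing the bar codes decoded at
# successive positions along it. A change of code between neighbouring
# positions means two different projected lines are being seen, and the
# discontinuity lies somewhere between those positions.
from typing import List, Tuple

def find_discontinuities(codes_along_line: List[Tuple[float, int]]) -> List[Tuple[float, float]]:
    """codes_along_line: (position, decoded_bar_code) pairs in viewing order.
    Returns intervals (pos_a, pos_b) that must contain a surface discontinuity."""
    gaps = []
    for (pa, ca), (pb, cb) in zip(codes_along_line, codes_along_line[1:]):
        if ca != cb:
            gaps.append((pa, pb))   # the edge lies between these two bar codes
    return gaps

# Example: a viewed "line" carrying bar code 932 on its left half and 933 on
# its right half, like lines 432/433 in Figure 13.
print(find_discontinuities([(0.0, 932), (5.0, 932), (10.0, 933), (15.0, 933)]))
# -> [(5.0, 10.0)]; repeating the bar codes more densely narrows this interval.
```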
  • the encoding features 931-935 of Figure 13 are non-repeating in that no two bar codes are the same. However, an encoding value, or sequence, can be repeated within a projected image as long as ambiguity is avoided. For example, if the image includes 60 lines (measuring features) using a binary encoding, 6 bits of data are needed to identify each line uniquely. However, due to the fact that the range of focus of the scanner is limited by the depth of field, each individual line of the 60 lines can show up as a recognizable image only within a certain range.
  • Figures 25 and 26 better illustrate how the depth of field affects the repeating of features.
  • Figure 25 illustrates a projector projecting a SHAPE along a path 2540.
  • When the SHAPE is projected onto a surface, its image is reflected along a reflection path to a viewing device 2506.
  • reflection path 2544 results when the SHAPE is reflected off a surface at the location 2531
  • a reflection path 2541 results when the SHAPE is reflected off a surface at the location 2532
  • a reflection path 2542 results when the SHAPE is reflected off a surface at the location 2533
  • a reflection path 2543 results when the SHAPE is reflected off a surface at the location 2534.
  • Figure 26 represents the SHAPE as the viewer 2506 would view it.
  • the image reflected off of the surface 2531, which is the surface closest to the projector, is viewed as the rightmost image in Figure 26, while the image reflected off of the surface 2534, which is the surface furthest from the projector, is viewed as the leftmost image in Figure 26.
  • the leftmost and rightmost images, which are furthest from and closest to the projector 2505 respectively, are out of focus. Because they are out of focus, they cannot be accurately detected based upon the image received by the viewing device 2506.
  • any surface closer to the projection device 2505 than plane 2525, or further from the projection device 2505 than the plane 2526, is not capable of reflecting a usable SHAPE because it is outside the viewable range 2610, or field of view. Therefore, the SHAPE can be repeated and still be uniquely identified, so long as the repeated SHAPE cannot be viewed within the range 2610 of Figure 26.
  • a projector projects approximately 80 lines.
  • Each of the 80 lines has a color-coded encoding sequence. For example, if three colors are used (red, blue, green), an encoding feature having three color locations could uniquely identify 27 different lines.
  • This coding sequence of 27 lines can be repeated three times to cover all 80 lines, provided the field of view is such that lines having the same encoding can not be viewed at the same location.
  • five color locations can be added with or without increasing the number of lines in a sequence to provide recognition capability where a specific color location may be lost.
  • coding features may be repeated, as long as the fields of view in which each of the repeating features may be viewed do not overlap.
  • a sequence of 12 unique encoding features requiring only four bits of binary data, can be repeated five times to encode all 60 lines, provided there is no chance for features to be viewed at the same location.
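  • The following is a minimal sketch, not from the patent, of how a repeated encoding sequence can still yield a unique line identity when the depth of field guarantees that only one repetition is visible at any image location; the 12-code cycle matches the example above, while the band-per-cycle rule and pixel values are hypothetical:

```python
# Illustrative sketch (assumptions, not the patent's implementation): a cycle
# of 12 unique encoding features repeated five times covers 60 measuring
# lines. The depth of field is assumed to guarantee that, at any image
# location, only one repetition of the cycle can be visible, so the image
# column identifies which repetition was seen.
NUM_LINES = 60
CODES_PER_CYCLE = 12     # four bits of binary data would suffice

def cycle_from_image_column(column_px: int, band_width_px: int = 128) -> int:
    """Hypothetical rule: each repetition can only be imaged within its own
    non-overlapping band of the sensor."""
    return column_px // band_width_px

def line_index(decoded_code: int, column_px: int) -> int:
    """Unique measuring-line number from the decoded code and image position."""
    assert 0 <= decoded_code < CODES_PER_CYCLE
    return cycle_from_image_column(column_px) * CODES_PER_CYCLE + decoded_code

# Example: code 7 seen near column 300 falls in band 2, so it is line 31.
print(line_index(7, 300))
```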
  • By providing a pattern having a large number of measuring features with associated coding features, reference independent scanning is achieved. Specifically, neither the object nor the scanner needs to be fixed in space, nor with reference to each other. Instead, on a frame-by-frame basis, the reference independent scanner retrieves enough measuring information (a 3D cloud), which is accurate due to the encoding feature, to permit registration to its adjacent frame. Registration is the process which determines the overlapping features on adjacent frames to form an integrated map of the object.
  • Figure 14 illustrates the object of Figure 13 whereby the measuring lines
  • Figure 15 represents the object 700 of Figures 8 and 9 having a pattern in accordance with the present invention projected upon its surface.
  • Figure 15 illustrates the projection of lines 721-723 having varying widths.
  • the lines 722 and 723, when projected onto the surfaces 711 and 712 respectively appear to have the same line thickness as line 721. Therefore, merely having measuring lines of varying thickness will not allow an analysis of the images to determine which line is which.
  • with the encoding features, identification of the lines 721-723, and the subsequent mapping analysis, is improved over the prior art.
  • Figure 16 illustrates a table identifying a specific set of shapes used in a direction orthogonal to the projection/view plane.
  • Column 1 of table 16 represents unique feature identifiers.
  • the columns 2-4 of table 16 illustrate specific manners in which each feature identifier can be represented.
  • Column 2 indicates bar codes.
  • Column 3 indicates colors capable of being used either alone or with other encoding features. Note that some types of encoding features, including color features, can be implemented as an integral part of a measuring feature as well as an encoding feature separate from the measuring feature. Likewise, other types of encoding can be based upon the intensity at which a measuring and/or feature and its encoding feature is projected.
  • Column 4 represents patterns that can be utilized either independently from the shape to identify the shape, or in combination as part of a shape.
  • a line comprising a repeating pattern sequence of the type illustrated in Column 4 can be provided.
  • the change of pattern in a direction orthogonal to the projection/view plane can be relative to the actual shape itself.
  • one of ordinary skill in the art will recognize that many variations as to variable components would be anticipated by the present invention.
  • Figure 17 illustrates in tabular form, the use of unique non-repeating identifiers for each line.
  • the sequence 0-F is presented sequentially.
  • each of the values from 0 through F will represent a unique code associated with a specific line.
  • a spacer may need to exist between each individual code. For example, a long space, or a unique code, can be used.
  • Figure 18 illustrates four unique repeating code sequences.
  • the letter S in table 18 is utilized to represent a spacer used between repeating sequences.
  • a spacer can be some unique identifier specifying where each of the repeating codes of the encoding sequence begins and/or ends.
  • a representation of the surface image is received at a viewer. This is analogous to the discussion of Figure 10 whereby the viewer 970 receives the reflected image.
  • the location of a point associated with an object is determined based upon the orthogonally varying feature.
  • the point is based upon the variable component because each one of the shapes, e.g. lines, is qualified to a unique code pattern prior to being used for object analysis.
  • Figure 19 illustrates sub steps to be associated with step 611 of Figure 12.
  • a first image is projected, while at step 622 a second feature is projected.
  • the first image can be analogous to the combination of the measuring line 431 and its associated encoding features 931.
  • the second feature could be represented by the combination of the measuring line 432 and its encoding features 932.
  • a specific line in a group of lines such as illustrated in Figure 14, can be identified based on more than one of the various encoding features.
  • steps 621 and 622 can occur at different times as discussed with reference to Figure 23.
  • Figure 21 illustrates another method in accordance with the present invention.
  • a plurality of first features, and a plurality of second features are projected. These features may be projected simultaneously, or at separate locations.
  • one of the plurality of first features is determined, or identified, based upon the second features.
  • the plurality of first features would include the measuring lines 431-435.
  • By utilizing the second features, the bar codes 931-935, a specific one of the lines 431-435 can be identified.
  • the location of a point at the surface is determined based upon the specific one of the plurality of parallel first features.
  • This specific embodiment is an advantage over the prior art, in that a line identified by the analysis of the received shape is not utilized until its identity is verified based upon the encoding information.
  • FIG. 22 illustrates another method in accordance with the present invention.
  • At step 641, parallel first and second discrete shapes are projected. Examples of such discrete shapes would include the lines 431 and 432 of Figure 13.
  • an encoding feature relative to the first discrete shape is projected.
  • the encoding feature relative to the line 432 could include the encoding feature 932 or even an encoding feature 933.
  • an encoding feature relative to the second discrete shape is projected.
  • the first discrete shape is identified based upon the first encoding feature. This is accomplished in a manner similar to as discussed previously.
  • a location of a specific point of an object is determined based upon the first discrete shape.
  • Figure 23 illustrates another embodiment of the present invention. Specifically, Figure 23 illustrates a series of images projected at times T1, T2, T3 and T4. At time T1, the image projected includes measuring features 1011 through 1013. During time T1, no encoding feature is projected. During time T2, an image containing encoding features 1021-1023 is projected. The patterns of times T1 and T2 are repeated during times T3 and T4 respectively. The result of alternating the projection of encoding and measuring features is that denser patterns can be used, allowing for more information to be obtained. Note that the image of time T4 shows the encoding features 1021-1023 overlying the measuring features 1011-1013. However, in one embodiment, the measuring features have been included for illustration purposes only, and would not generally be present at the same time as the encoding features.
  • Figure 24 illustrates an image having features with different characteristics. Specifically, Figure 24 illustrates an image 1100 having lines 1131 through 1134 with a distance X between the individual lines, while lines 1134, 1135, and 1136 have a substantially greater distance Y separating them.
  • the line 1135 can be used to map surface features that otherwise may not be mappable. Note that the pattern 1100 could be used with or without the coding techniques described herein.
  • each 2D point of the 2D image frame can be converted into a 3D point using conventional 3D imaging techniques, provided each 2D point of the 2D image frame can be correlated to a projected point.
  • the use of a projected frame pattern that has encoding features enables correlation of the points of the 2D image to a respective projected point.
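  • As a hedged illustration of the 2D-to-3D conversion described above (the patent calls this conventional 3D imaging and does not prescribe an algorithm), one common approach is ray/light-plane triangulation once the encoding feature identifies which projected line an imaged point belongs to; the calibration numbers and pixel coordinates below are placeholders:

```python
# Illustrative sketch (not the patent's implementation): once an imaged 2D
# point is correlated to a specific projected line via its encoding feature,
# the 3D position follows from ray/plane triangulation. Camera intrinsics,
# the projector light-plane parameters, and the pixel coordinates below are
# hypothetical placeholders.
import numpy as np

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Back-project a pixel into a viewing-ray direction in camera coordinates."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def triangulate(ray_dir, plane_normal, plane_d):
    """Intersect the camera ray (through the origin) with the light plane
    n . X = d swept through space by the identified projector line."""
    t = plane_d / np.dot(plane_normal, ray_dir)
    return t * ray_dir  # 3D point in camera coordinates

# Example: pixel (412, 300) identified (via its bar code) as projector line 17,
# whose calibrated light plane is n = (0.71, 0, -0.70), d = -35.0 (mm).
ray = pixel_to_ray(412, 300, fx=900.0, fy=900.0, cx=320.0, cy=240.0)
point_3d = triangulate(ray, np.array([0.71, 0.0, -0.70]), -35.0)
print(point_3d)
```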
  • Multi-frame reference independent scanning is described herein in accordance with another aspect of the present disclosure.
  • multiple 3D image frames are received by using a hand-held scanner to scan an object one frame at a time to obtain a plurality of frames, where each frame captures only a portion of the object.
  • in reference independent scanning, the scanner has a spatial position that is frame-by-frame variable relative to the object being scanned, and whose spatial position is not fixed, or tracked, relative to a reference point. For example, there is no fixed reference point relative to the object being scanned.
  • One type of reference independent scanner disclosed herein includes a hand-held scanner that projects a pattern in successive frames having measuring features and encoding features. This allows each viewed point of a frame to have a known corresponding projected point, thereby enabling the 2D frame data to be converted into 3D frame data.
  • Figures 27-28 are used to discuss multiple frame reference independent scanning.
  • Figures 27, 28, and 30 illustrate an object 2700 from different points of view.
  • the object 2700 includes three teeth 2710, 2720, and 2730, and a gum portion 2740 that is adjacent to the three teeth.
  • the Figure 27 point-of-view is such that a plurality of noncontiguous surface portions are viewed.
  • three noncontiguous surface portions 2711-2713 are viewed.
  • the surface portion 2713 represents a side portion of the tooth 2710.
  • the surface portion 2711 represents a portion of the tooth 2710 biting surface that is not continuous with surface portion 2713.
  • the surface portion 2712 represents another portion of the tooth 2710 biting surface that is not continuous with either portion 2711 or 2713.
  • tooth 2720 has four surface portions 2721-2724
  • tooth 2730 has four surface portions 2731-2734.
  • Figure 28 illustrates the object 2700 from a slightly different point-of- view (Figure 28 point-of-view).
  • the point-of-view change from Figure 27 to Figure 28 is the result of the viewer, i.e. scanner, moving in a direction that allows a greater portion of the upper teeth surfaces to be viewed.
  • the change in point-of-view has resulted in variations to a plurality of viewed surface portions.
  • For tooth 2710, tooth portion 2813 now represents a smaller 2D surface than did its corresponding tooth portion 2713, while tooth portions 2811 and 2812 are now viewed as larger 2D surfaces than their corresponding portions 2711 and 2712 of Figure 27.
  • For tooth 2720, surface 2824 is now viewed as a smaller 2D surface than its corresponding tooth surface 2724 of Figure 27.
  • tooth surface 2821 represents a continuously viewed tooth surface that includes both of the surfaces 2721 and 2723 from the Figure 27 point-of-view.
  • the viewed 2D surfaces 2832 and 2835 each include portions of surface 2732 and previously unviewed surface area. This is the result of a topographical feature of the tooth 2730, which resulted in the inability of the surface 2732 to be viewed continuously from the second frame point-of-view.
  • Figure 29 is from the same point-of-view as Figure 28 with the viewed surface portions of Figure 27 indicated as shaded areas.
  • surface portion 2711 of Figure 27 is represented as a shaded portion within the surface portion 2811.
  • the change in the point-of-view between Figure 27 and Figure 28 results in a viewed surface portion 2811 that encompasses the smaller viewed surface portion 2711.
  • the change in perspective has resulted in different surface portions being viewed.
  • Figure 30 illustrates the object 2700 from another point-of-view.
  • Figure 30 point-of-view is from directly over the teeth 2710- 2730.
  • Superimposed onto Figure 30 are the viewed surface portions of Figure
  • Figure 31 illustrates a method 3100 in accordance with a specific embodiment of reference independent scanning.
  • the object is scanned to obtain a 2D cloud of data.
  • the 2D cloud of data includes a plurality of frames. Each of the frames has a plurality of 2D points, which, if viewed, would represent a 2D image.
  • a first frame of the 2D cloud of data is converted to a 3D frame model.
  • a 3D frame model is a 3D point model, which includes a plurality of points in three-dimensional space.
  • the actual conversion to a 3D frame point model is performed on some or all of the frame's 2D cloud of data using conventional techniques for converting a scanned 2D cloud of data into a 3D point model.
  • objects with noncontiguous viewed surfaces, such as the teeth 2710-2730 of Figure 27, can be successfully scanned frame-by-frame.
  • Figures 32 and 33 further illustrate the object 2700 being scanned from the Figure 27 and Figure 28 points-of-view respectively.
  • the scan pattern includes scan lines 3221-3223. Any scan line portion outside the frame boundary 3210 is not capable of being properly scanned. Within the boundary 3210 each scan line, when sensed at the CCD (charge-coupled device) chip of the scanner, is converted to a plurality of 2D points (cloud of data). Some or all points of a scan line can be used in accordance with the present invention. For example, every other, or every third, point of a scan line can be used depending upon the desired resolution of a final 3D model.
  • Figure 32 illustrates four points (A-D) of each line being identified. A 2D coordinate value, such as an X-Y coordinate, is determined for each of these points.
  • a scan rate of 1 to 20 frames per second is used. Greater scan rates can be used. In a specific embodiment, the scan rate is chosen to allow for real-time viewing of a three-dimensional image.
  • the pulse time during which each frame is captured is a function of the speed at which the scanner is expected to be moving. For dentition structures, a maximum pulse width has been determined to be approximately 140 microseconds, although much shorter pulse widths, e.g. 3 microseconds, are likely to be used.
  • the teeth 2710-2730 are coated with a substance that results in a surface that is more opaque than the teeth themselves.
  • each point of the cloud of data is analyzed during the various steps and functions described herein.
  • only a portion of the cloud of data may be analyzed. For example, it may be determined that only every 3rd or 4th point needs to be analyzed for a desired resolution to be met.
  • a portion of the frame data can be a bounding box that is smaller than the entire frame of data, such that only a specific spatial portion of the cloud of data is used; for example, only a center portion of the cloud of data is included within the bounding box.
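  • A minimal sketch, under assumed point and box formats, of the thinning and bounding-box options just described (analyzing only every Nth point and only a central spatial portion of the cloud of data):

```python
# Minimal sketch (point and box formats are assumptions): thinning a frame's
# cloud of data by keeping every Nth point and restricting analysis to a
# central bounding box, as described above.
from typing import List, Tuple

Point2D = Tuple[float, float]

def thin_cloud(points: List[Point2D], every_nth: int = 3) -> List[Point2D]:
    """Keep only every Nth point when full resolution is not required."""
    return points[::every_nth]

def crop_to_box(points: List[Point2D],
                box: Tuple[float, float, float, float]) -> List[Point2D]:
    """Keep only points inside the bounding box (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = box
    return [(x, y) for (x, y) in points if x0 <= x <= x1 and y0 <= y <= y1]

# Example: analyze only the center of a 640x480 frame, using every 3rd point.
frame_points = [(float(i % 640), float(i // 640)) for i in range(0, 640 * 480, 97)]
subset = crop_to_box(thin_cloud(frame_points, 3), (160.0, 120.0, 480.0, 360.0))
```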
  • Figure 33 illustrates the object 2700 being scanned from the Figure 28 point of view.
  • the viewed pattern, including lines 3321-3323, is positioned differently on the teeth 2710-2730.
  • the frame boundary 3310 has moved to include most of the tooth 2720.
  • Figure 34 illustrates another embodiment of a 3D frame model referred to herein as a 3D primitive model.
  • a 3D primitive model includes a plurality of primitive shapes based upon the frame's 3D points.
  • adjacent points from the 3D point model are selected to form triangles, including triangle PS1-PS3 as primitive shapes.
  • Other implementations can use different or varied primitive shapes.
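  • As an illustrative sketch of a 3D primitive model of the kind shown in Figure 34 (assuming, for illustration only, that a frame's 3D points can be ordered as a row-major grid), adjacent points can be connected into triangles as follows:

```python
# Illustrative sketch (the grid assumption is for illustration only): forming
# a primitive (triangle) model from a frame's 3D points. Each grid cell
# between adjacent points is split into two triangles, yielding index triples
# into the frame's row-major array of 3D points.
def grid_to_triangles(rows: int, cols: int):
    """Return (a, b, c) index triples into a (rows*cols, 3) 3D point array."""
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            a = r * cols + c        # upper-left point of the cell
            b = a + 1               # upper-right
            d = a + cols            # lower-left
            e = d + 1               # lower-right
            tris.append((a, b, d))  # upper-left triangle
            tris.append((b, e, d))  # lower-right triangle
    return tris

# Example: a 3x4 grid of 3D points yields (3-1)*(4-1)*2 = 12 triangles.
print(len(grid_to_triangles(3, 4)))
```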
  • a second 3D frame model is generated from the second frame of the cloud data.
  • the second 3D frame model may be a point model or a primitive model.
  • a registration is performed between the first frame model and the second frame model to generate a cumulative model.
  • "Registration" refers to the process of aligning the first model to the second model to determine a best fit by using those portions of the second model which overlap the first model. Those portions of the second model that do not overlap the first model are portions of the scanned object not yet mapped, and are added to the first model to create a cumulative model. Registration is better understood with reference to the method of Figure 35.
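  • The patent does not name the algorithm used for the best-fit alignment; an iterative-closest-point (ICP) style loop is one common way to realize it, sketched below purely as an assumption (the brute-force nearest-neighbour search is for clarity, not efficiency):

```python
# Hedged sketch: the patent does not name an algorithm for the best-fit step;
# an iterative-closest-point (ICP) style loop is a common choice and is shown
# here only as an assumption. The brute-force nearest-neighbour search keeps
# the sketch short; it is not suitable for large clouds.
import numpy as np

def best_fit_transform(src: np.ndarray, dst: np.ndarray):
    """Rigid (R, t) minimizing ||R*src_i + t - dst_i|| for paired points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def register_frame(frame: np.ndarray, model: np.ndarray, iters: int = 20) -> np.ndarray:
    """Align the new frame's points onto the existing model, starting from the
    current entry-point guess already applied to 'frame'."""
    cur = frame.copy()
    for _ in range(iters):
        # closest model point for every frame point
        d = np.linalg.norm(cur[:, None, :] - model[None, :, :], axis=2)
        R, t = best_fit_transform(cur, model[d.argmin(axis=1)])
        cur = cur @ R.T + t
    return cur   # registered points; non-overlapping ones extend the model
```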
  • Figure 35 includes a registration method 3500 that, in a specific embodiment, would be called by one of the registration steps of Figure 31.
  • an entry point into registration is determined.
  • the entry point into registration defines an initial guess of the alignment of the overlapping portions of the two models. The specific embodiment of choosing an entry point will be discussed in greater detail with reference to Figure 36.
  • a registration of the two shapes is attempted. If an overlap is detected meeting a defined closeness of fit, or quality, the registration is successful. When the registration is successful, the flow returns to the calling step of Figure 31. When a registration is not successful, the flow proceeds to step 3598 where a decision whether to continue is made.
  • a decision to continue can be made based on a number of factors. In one embodiment, the decision to continue is made based upon the number of registration entry points that have been tried. If the decision at step 3598 is to quit registration attempts, the flow proceeds to step 3503 where registration error handling occurs. Otherwise the flow continues at step 3501.
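  • A minimal sketch of the retry loop of Figure 35, with hypothetical helper names: registration is attempted from each candidate entry point in turn, a fit meeting the quality threshold returns to the caller, and exhausting the candidates falls through to error handling (step 3503):

```python
# Minimal sketch of the Figure 35 retry loop; the callable and threshold are
# hypothetical stand-ins for the actual registration attempt and fit quality.
from typing import Callable, Iterable, Optional, Tuple

Entry = Tuple[float, float, float]

def register_with_entry_points(
    attempt_registration: Callable[[Entry], Tuple[object, float]],
    entry_points: Iterable[Entry],
    fit_threshold: float = 0.05,
) -> Optional[object]:
    for entry in entry_points:                           # step 3501: choose entry point
        aligned, residual = attempt_registration(entry)  # step 3502: try the fit
        if residual < fit_threshold:                     # success: return to Figure 31
            return aligned
    return None                                          # step 3503: error handling

# attempt_registration would wrap a fit such as the ICP-style loop sketched
# earlier, seeded at the given (x, y, z) entry point.
```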
  • Figure 36 illustrates a specific method for choosing a registration entry point.
  • a determination is made whether this is the first entry point for a specific registration attempt of a new frame. If so, the flow proceeds to step 3601; otherwise the flow proceeds to step 3698.
  • the X and Y components of the entry point are determined based upon two-dimensional analysis of the 2D cloud of data for each of the two frames.
  • the two-dimensional analysis performs a cross-correlation of the 2D images. These 2D images do not have to be from the 2D cloud of data, instead, data associated with a plain video image of the object, with no pattern, can be used for cross correlation. In this way, a probable movement of the scanner can be determined.
  • the cross- correlation is used to determine how the pixels have moved to determine how the scanner has probably been moved.
  • a rotational analysis is possible; however, for a specific embodiment this is not done because it tends to be time consuming, and having the correct entry point in the X- and Y-coordinate directions allows the registration algorithm described herein to handle rotations.
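  • As a hedged sketch (the patent specifies cross-correlation of the 2D images but not a particular formulation), an FFT-based cross-correlation can supply the probable X and Y scanner movement used for the entry point:

```python
# Hedged sketch: the patent calls for cross-correlating plain 2D video images
# of the two frames; an FFT-based correlation is one way to do that. The
# result seeds the X and Y components of the registration entry point.
import numpy as np

def estimate_xy_shift(prev_img: np.ndarray, cur_img: np.ndarray):
    """Return the (dx, dy) offset at the peak of the circular cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(prev_img) * np.conj(np.fft.fft2(cur_img))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev_img.shape
    if dy > h // 2:
        dy -= h                    # wrap to signed offsets
    if dx > w // 2:
        dx -= w
    return dx, dy

# The Z component of the entry point is taken from the previous frame
# (step 3602); no rotation is estimated, as noted above.
```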
  • a probable movement in the Z direction is determined.
  • the previous frame's Z-coordinate is used, and any change in the Z-direction is calculated as part of the registration.
  • a probable Z coordinate is calculated as part of the entry point.
  • the optical parameters of the system can "zoom" the second frame in relation to the first one until the best fit is achieved. The zoom factor that is used could indicate how far the two surfaces are away from each other in Z.
  • the X, Y and Z coordinates can be aligned so that the Z-coordinate is roughly parallel to the view axis.
  • step 3606 the entry point value is returned.
  • step 3698 a determination is made whether all entry point variations have been tried for the registration steps 3601 and 3602. If not the flow proceeds to step 3603, otherwise the flow proceeds to step 3697.
  • Figure 37 illustrates a specific method for selecting the registration entry point variations. Specifically, Figure 37 illustrates the initial entry point E1 and subsequent entry points E2-E9. The entry points E2-E9 are selected sequentially in any predetermined order.
  • the specific embodiment of Figure 37 illustrates the registration entry points E2-E9 as various points of a circle 3720 having a radius 3710.
  • the dimensions of the entry point variations are two-dimensional, for example the X and Y dimensions. In other embodiments, the entry points can vary in three dimensions. Note that varying numbers of entry points, i.e. subsets of entry points, can be used to speed up the registration process. For example, single frame registration as used herein could use fewer than the nine entry points indicated. Likewise, cumulative registration, described herein, could benefit by using more than the nine points illustrated.
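  • A small sketch generating the entry point variations E2-E9 of Figure 37 as points on a circle around the initial entry point E1; the radius value and ordering are arbitrary illustrations:

```python
# Small sketch: generating the entry point variations E2-E9 of Figure 37 as
# points on a circle around the initial entry point E1. The use of eight
# variations follows the figure; the radius and ordering are arbitrary here.
import math
from typing import List, Tuple

def entry_point_variations(e1: Tuple[float, float], radius: float,
                           count: int = 8) -> List[Tuple[float, float]]:
    x0, y0 = e1
    return [(x0 + radius * math.cos(2.0 * math.pi * k / count),
             y0 + radius * math.sin(2.0 * math.pi * k / count))
            for k in range(count)]

# Example: eight candidate entry points around E1 = (12.0, -3.5) at radius 2.0.
candidates = entry_point_variations((12.0, -3.5), 2.0)
```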
  • the flow proceeds to step 3697 once all variations of the first identified entry point have been tried.
  • At step 3697, all entry points associated with the first identified entry point have been tried, and it is determined whether a second identified entry point has been identified by step 3604. If not, flow proceeds to step 3604 where the second entry point is defined. Specifically, at step 3604 the scanner movement between two previous frame models is determined. Next, an assumption is made that the scanner movement is constant for at least one additional frame. Using these assumptions, the entry point at step 3604 is defined to be the location of the previous frame plus the calculated scanner movement. The flow proceeds to step 3606, which returns the entry point to the calling step of Figure 31.
  • At step 3604, an assumption can also be made that the direction of the scanner movement remained the same but that it accelerated at a different rate. If the second identified entry point of step 3604 has been previously determined, the flow from step 3697 will proceed to step 3696. At step 3696, a determination is made whether an additional registration entry point variation for the second identified entry point exists. If so, the flow proceeds to step 3605; otherwise the flow returns to the calling step of Figure 31 at step 3607 and indicates that selection of a new entry point was unsuccessful. At step 3605 the next entry point variation of the second identified entry point is identified and the flow returns to the calling step of Figure 31.
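  • The second identified entry point of step 3604 amounts to a constant-velocity prediction; a minimal sketch under that assumption (pose vectors and the acceleration variant are illustrative only):

```python
# Hedged sketch of the second identified entry point (step 3604): assume the
# scanner kept the motion observed between the two previous frame models and
# predict the new frame's position from that constant velocity. Pose vectors
# here are simple 3D positions; the scaled-step variant models acceleration.
import numpy as np

def predict_entry_point(prev_pose: np.ndarray, prev_prev_pose: np.ndarray,
                        acceleration: float = 1.0) -> np.ndarray:
    velocity = prev_pose - prev_prev_pose       # movement between prior frames
    return prev_pose + acceleration * velocity  # assume the movement continues

# Example: scanner moved (1, 0, 0.2) between the last two frames, so the next
# entry point is predicted one such step further along.
guess = predict_entry_point(np.array([5.0, 2.0, 30.0]), np.array([4.0, 2.0, 29.8]))
```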
  • Different entry point routines can be used depending upon the type of registration being performed. For example, for a registration process that is not tolerant of breaks in frame data, it will be necessary to try more entry points before discarding a specific frame. For a registration process that is tolerant of breaks in frame data, simpler or fewer entry points can be attempted, thereby speeding up the registration process.
  • the next 3D model portion is generated from the next frame's cloud of data.
  • at step 3106 registration is performed between the next 3D model portion and the cumulative model to update the cumulative model.
  • the cumulative model is updated by adding all the new points from the frame to the existing cumulative model to arrive at a new cumulative model.
  • a new surface can be stored that is based on the 3D points acquired so far, thereby reducing the amount of data stored. If all frames have been registered, the method 3100 is completed; otherwise the flow proceeds through steps 3105 to 3199 until each frame's cloud of points has been registered. As a result of the registration process described in method 3100, it is possible to develop a model for the object 2700 from a plurality of smaller frames, such as frames 3210 and 3310.
  • a model of a patient's entire dentition structure including gums, teeth, and orthodontic and prosthetic structures can be obtained.
  • a model of the patient's face can be obtained.
  • Figure 38 illustrates a method 3800, which is an alternate method of registering an object using a plurality of frames from a reference independent scanner. Specifically, at step 3801 the object is scanned to obtain a cloud of data for the object. As previously described, the cloud of data includes data from a plurality of frames, with each frame including a plurality of points.
  • a single frame registration is performed.
  • a single frame registration performs a registration between adjacent frames of the scanned image without generating a cumulative model.
  • a cumulative image of the single frame registration process is displayed.
  • the image formed by the single frame registration process can be used to assist in the scanning process.
  • the image displayed as a result of the single frame registration, while not as accurate as a cumulative model, can be used by the scanner's operator to determine areas where additional scanning is needed.
  • the single frame registration process is such that any error introduced between any two frames is "extended" to all subsequent frames of a 3D model generated using single frame registration.
  • the level of accuracy is adequate to assist an operator during the scanning process.
  • the registration result, which describes the movement from one frame to another, can be used as an entry point for the cumulative registration process.
  • Single frame registration is discussed in greater detail with reference to Figure 39.
  • a cumulative registration is performed.
  • the cumulative registration creates a cumulative 3D model by registering each new frame into the cumulative model. For example, if 1000 individual frames were captured at step 3801 representing 1000 reference independent 3D model portions (frames), the cumulative registration step 3803 would combine the 1000 reference independent 3D model portions into a single cumulative 3D model representing the object. For example, where each of the 1000 reference independent 3D model portions represents a portion of one or more teeth, including frames 3210 and 3310 of Figures 32 and 33, the single cumulative 3D model will represent an entire set of teeth including teeth 2710-2730.
  • at step 3804 the results of the registration are reported. This will be discussed in further detail below.
  • Figure 39 describes a method 3900 that is specific to a single frame registration implementation for step 3802 of Figure 38.
  • a variable x is set equal to 2.
  • a registration between the current frame (3DFx) and the immediately, or first, previous adjacent frame (3DFx-l) is performed.
  • at step 3999 it is determined whether or not the single frame registration of step 3904 was successful.
  • a registration method, such as the method of Figure 40, provides a success indicator which is evaluated at step 3999. The flow proceeds to step 3905 when registration is successful; otherwise the flow proceeds to step 3907.
  • at step 3905 the current 3D frame (3DFx) is added to the current frame set of 3D frames.
  • this set will generally be a set of transformation matrices.
  • the current frame set of 3D frames is a sequential set of frames, where each frame in the sequence has a high degree of likelihood of being successfully registered with both of its two adjacent frames.
  • the newly registered frame can be displayed relative to the previous frame that is already being displayed.
  • at step 3998 a determination is made whether the variable x has a value equal to n, where n is the total number of frames to be evaluated. If x is equal to n, single frame registration is complete and the flow can return to Figure 38 at step 3910. If x is less than n, single frame registration continues at step 3906, where x is incremented before proceeding to step 3904. Returning to step 3999, the flow proceeds to step 3907 if the registration of step 3904 was not successful. At step 3907 a registration is attempted between the current frame (3DFx) and the second previously adjacent frame (3DFx-2). Step 3997 directs the flow to step 3905 if the registration of step 3907 was successful. Otherwise, step 3997 directs the flow to step 3908, thereby indicating an unsuccessful registration of the current frame (3DFx).
  • step 3908 saves the current frame set, i.e. the set of matrices, and a new current frame set is begun. Flow from step 3908 proceeds to step 3905, where the current frame is added to the current frame set, which was newly created at step 3908. Therefore, it is possible for the single frame registration step 3802 to identify multiple frame sets.
  • breaks in single frame registration are generally acceptable because the purpose of single frame registration is to assist the operator and define entry points for cumulative registration.
  • One method of dealing with breaks during single frame registration is to merely display the first frame after the break at the same location as the last frame before the break, thereby allowing the operator to continue to view an image. A sketch of this frame-by-frame flow, including the fallback and break handling, is given below.
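  • A minimal sketch of the single frame registration flow of Figure 39 is given below, assuming a register(a, b) helper that returns a transformation on success and None on failure; in the method as described, each frame set would typically hold the resulting transformation matrices rather than the frames themselves.

```python
def single_frame_registration(frames, register):
    """Register each frame against its predecessor; on failure, fall back to the
    second previous frame; on a second failure, start a new frame set (a break).
    Returns the list of frame sets."""
    frame_sets = [[frames[0]]]                  # the first frame starts the first set
    for x in range(1, len(frames)):
        result = register(frames[x], frames[x - 1])
        if result is None and x >= 2:
            result = register(frames[x], frames[x - 2])   # second previous frame
        if result is None:
            frame_sets.append([])               # break: save current set, begin a new one
        frame_sets[-1].append(frames[x])
    return frame_sets
```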
  • a first model is a 3D primitive shape model
  • the second model is a 3D point model.
  • the primitive shapes in the first 3D model are referenced as S1..Sn, where n is the total number of shapes in the first model; and the points in the second 3D model are referenced as P1..Pz, where z is the total number of points in the second model.
  • each individual point of the second model P1..Pz is analyzed to determine the shape closest to its location.
  • the shape among S1-Sn that is closest to P1 is the shape having a surface location that is closer to P1 than any surface location of any other shape.
  • the shape closest to point P1 is referred to as Sc1, while the shape closest to point Pz is referred to as Scz.
  • points that are located directly above or below a triangle are associated with that triangle, and points that are not located directly above or below a triangle surface are associated with a line formed between two triangles, or with a point formed by multiple triangles. Note that, in a broad sense, the lines that form the triangles and the points forming the corners of the triangles can be regarded as shapes.
  • each vector, for example D1, extends from its point P1 to the closest point of its closest shape Sc1.
  • the non-overlapping points, which are not needed for registration, have an associated vector with a comparatively larger magnitude than that of an overlapping point, or may not reside directly above or below a specific triangle. Therefore, in a specific embodiment, only those vectors having a magnitude less than a predefined value (an epsilon value) are used for further registration, as sketched below.
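  • The sketch below illustrates the point-to-closest-shape vectors and the epsilon filtering described above; it uses a brute-force search over all triangles and assumes a closest_point_on_triangle() helper, whereas a practical implementation would normally use a spatial index.

```python
import numpy as np

def registration_vectors(points, triangles, closest_point_on_triangle, epsilon):
    """For each point P of the frame, find the closest surface point on any
    triangle of the other model, form the vector D from P to that point, and
    keep only vectors whose magnitude is below epsilon (overlapping points)."""
    kept = []
    for p in points:
        candidates = [closest_point_on_triangle(p, tri) for tri in triangles]
        nearest = min(candidates, key=lambda c: np.linalg.norm(c - p))
        d = nearest - p                      # vector from the point to its closest shape
        if np.linalg.norm(d) < epsilon:      # discard presumed non-overlapping points
            kept.append((p, d))
    return kept
```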
  • epsilon values can also be used to further reduce risks of decoding errors. For example, if one of the measuring lines of the pattern is misinterpreted to be a different line, the misinterpretation can result in a large error in the Z-direction. For a typical distance between adjacent pattern lines of approximately 0.3 mm and an angle of triangulation of approximately 13°, an error in the X-direction of 0.3 mm results in a three-dimensional transformation error of approximately 1.3 mm (0.3 mm / tan 13°) in the Z-direction.
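  • Restating the worked example above as a formula, with Δx the decoding error along the X-direction and α the triangulation angle:

```latex
\Delta z \approx \frac{\Delta x}{\tan \alpha}
        = \frac{0.3\ \text{mm}}{\tan 13^{\circ}}
        \approx 1.3\ \text{mm}
```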
  • the epsilon value is first selected to be a value greater than 0.5mm, such as 2.0mm, and after reaching a certain quality the value is reduced.
  • the vectors D1..Dz are treated as spring forces to determine movement of the second 3D model frame.
  • the second 3D model is moved in a linear direction defined by the sum of all force vectors D1..Dz divided by the number of vectors.
  • the vectors D1..Dz are recalculated for each point of the second 3D model.
  • the vectors D1..Dz are treated as spring forces to determine movement of the second 3D model.
  • the second 3D model frame is rotated about its center of mass based upon the vectors D1..Dz.
  • the second 3D model is rotated about its center of mass until the spring forces are minimized; a sketch of one such translation/rotation update follows.
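  • A rough sketch of one translation/rotation update under the spring-force interpretation is shown below; the small-angle rotation about the net torque axis and the gain constant are assumptions made here for illustration, not values taken from the description.

```python
import numpy as np

def spring_force_step(points, vectors, gain=1e-3):
    """One update treating the vectors D1..Dz as spring forces: translate by the
    mean force, then rotate about the centre of mass by a small angle driven by
    the net torque of the forces. Returns (translation, rotation_matrix)."""
    p = np.asarray(points, dtype=float)
    d = np.asarray(vectors, dtype=float)
    translation = d.mean(axis=0)                   # mean of all force vectors
    center = p.mean(axis=0)                        # centre of mass of the frame
    torque = np.cross(p - center, d).sum(axis=0)   # net torque about the centre
    angle = gain * np.linalg.norm(torque)          # small-angle step (assumed gain)
    axis = torque / (np.linalg.norm(torque) + 1e-12)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' formula for the rotation about `axis` by `angle`.
    rotation = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return translation, rotation
```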
  • the quality of the registration is determined with respect to the current orientation of the second 3D model.
  • various methods can be used to define the quality of the registration. For example, a standard deviation of the vectors D1..Dz having a magnitude less than epsilon can be used.
  • quality is calculated using the following steps: square the magnitude of each vector, sum the squared magnitudes of all vectors within the epsilon distance, divide this sum by the number of such vectors, and take the square root. Note, one of ordinary skill in the art will recognize that the vector values D1..Dz need to be recalculated after the rotation step 4006. In addition, one of ordinary skill in the art will recognize that there are other statistical calculations that can be used to provide quantitative values indicative of quality.
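  • The quality calculation described above amounts to a root-mean-square of the vectors within the epsilon distance; a minimal sketch:

```python
import numpy as np

def registration_quality(vectors, epsilon):
    """Square the magnitude of each vector within the epsilon distance, sum,
    divide by the number of such vectors, and take the square root."""
    norms = np.linalg.norm(np.asarray(vectors, dtype=float), axis=1)
    inside = norms[norms < epsilon]
    if inside.size == 0:
        return float("inf")          # no overlapping points to judge quality by
    return float(np.sqrt(np.sum(inside ** 2) / inside.size))
```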
  • at step 4098 it is determined whether the current quality of registration is improving. In a specific embodiment, this is determined by comparing the quality of the previous pass through the loop including step 4003 with the current quality. If the quality is not improving, the flow returns to the calling step with an indication that the registration was not successful. Otherwise, the flow proceeds to step 4003.
  • upon returning to step 4003, another registration iteration occurs, using the new frame location. Note that once the frame data has been scanned and stored, there is no need to perform the registration exactly in the order of scanning. Registration could start the other way around, or use any other order that makes sense. Especially when scanning is done in multiple passes, there is already knowledge of where a frame roughly belongs. Therefore, the registration of adjacent frames can be done independently of the order of imaging.
  • Figure 41 illustrates a specific embodiment of a method 4100 for Figure 38.
  • the method 4100 discloses a cumulative registration which attempts to combine all of the individual 3D frame models into a single cumulative 3D model.
  • Steps 4101-4103 are setup steps.
  • a variable x is set equal to 1
  • a variable x_last defines the total number of 3D model sets. Note, the number of 3D model sets is based upon the step 3908 of Figure 39.
  • a 3D cumulative model (3Dc) is initially defined to equal the first 3D frame of the current set of frames.
  • the 3D cumulative model is modified to include that information from subsequent frame models that is not already represented by the 3D cumulative model.
  • Y is set equal to 2
  • a variable Y_last is defined to indicate the total number of frames (3DF), or frame models, in the set Sx, where Sx represents the current set of frame models being registered.
  • the 3D cumulative model (3Dc) is modified to include additional information based upon the registration between the current 3D frame model being registered (Sx(3DFy)) and the 3D cumulative model (3DC).
  • the current 3D frame model is referenced as Sx(3Dy), where 3Dy indicates the frame model and Sx indicates the frame set.
  • a specific embodiment for performing the registration of step 4104 is further described by the method illustrated in Figures 42-43.
  • at step 4199 it is determined whether the current 3D frame model is the last 3D frame model of the current set. In accordance with a specific implementation of Figure 41, this can be accomplished by determining if the variable Y is equal to the value Y_last. When Y is equal to Y_last, the flow proceeds to step 4198. Otherwise, the flow proceeds to step 4106, where Y is incremented, prior to returning to step 4104 for further registration of 3D frame models associated with the current set Sx.
  • at step 4198 it is determined whether the current set of frames is the last set of frames. In accordance with the specific implementation of Figure 41, this can be accomplished by determining if the variable x is equal to the value x_last. The flow proceeds to step 4105 when x is equal to x_last. Otherwise, the flow proceeds to step 4107, where x is incremented, prior to returning to step 4103 for further registration using the next set. The nested loop is sketched below.
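  • A compact sketch of the nested loop of method 4100 follows, reading the flow as keeping a single running cumulative model across all frame sets; register_into() is an assumed helper that returns the updated cumulative model, or None when the frame cannot be registered (the break case of step 4207).

```python
def cumulative_registration(frame_sets, register_into):
    """Fold every frame of every set into a cumulative model 3Dc, starting from
    the first frame of the first set; on a registration failure, save the
    current cumulative model and start a new one with the failing frame."""
    cumulative_models = []
    cumulative = frame_sets[0][0]                    # 3Dc starts as the first frame
    for x, frames in enumerate(frame_sets):          # x = 1 .. x_last
        start = 1 if x == 0 else 0                   # the seed frame is skipped once
        for frame in frames[start:]:                 # y = 2 .. y_last within each set
            updated = register_into(cumulative, frame)
            if updated is None:                      # step 4207: begin a new model
                cumulative_models.append(cumulative)
                cumulative = frame
            else:
                cumulative = updated
    cumulative_models.append(cumulative)
    return cumulative_models
```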
  • Step 4105 reports the results of the registration of method 4100, and performs any other cleanup operations. For example, while ideally the method 4100 results in a single 3D cumulative model, in reality multiple 3D cumulative models can be generated (see discussion at step 4207 of Figure 43). When this occurs, step 4105 can report the resulting number of 3D cumulative models to the user, or to a subsequent routine for handling. As a part of step 4105, the user can have an option to assist in registering the multiple 3D models to each other. For example, if two 3D cumulative models are generated, the user can manipulate the 3D cumulative models graphically to assist identification of an entry point, which can be used for performing a registration between the two 3D cumulative models.
  • a second cumulative registration process can be performed using the resulting matrices from the first cumulative registration as entry points for the new calculations.
  • an enlarged number of entry points can be used, or a higher percentage of points can be used.
  • Figures 42-43 illustrate a specific embodiment of registration associated with step 4104 of Figure 41.
  • Step 4201 is similar to step 4002 of Figure 40, where each point (P1..Pm) of the current frame Sx(3Dy) is analyzed to determine the shape of the cumulative model that is the closest shape.
  • Step 4202 defines vectors for each point of the current frame in a manner similar to that previously described with reference to step 4003 of Figure 40.
  • Steps 4203 through 4206 move the current 3D frame model in the manner described at steps 4004-4006 of Figure 40, where the first model of method 4000 is the cumulative model and the second model of method 4000 is the current frame.
  • One method of determining quality improvement is to compare a quality value based on the current position of the model being registered to the quality value based on the previous position of the model. As previously discussed with reference to Figure 40, the quality value can be determined using the standard deviation, or another quality calculation based on the D vectors. Note, by default, a first pass through steps 4202-4206 for each model 3Dy results in an improved alignment. If an improved alignment has occurred, the flow returns to step 4202; otherwise, the flow proceeds to step 4298 of Figure 43.
  • the flow control for the cumulative registration method of Figure 42 is different from the flow control for the single frame registration method of Figure 40. Specifically, the cumulative flow continues until no improvement in quality is realized, while the single frame flow stops once a specified quality is reached. Other embodiments of controlling the flow within the registration routines are anticipated.
  • the registration iteration process continues as long as a convergence criterion is met.
  • the convergence criterion is considered met as long as an improvement in quality of greater than a fixed percentage is realized.
  • Such a percentage can be in the range of 0.5-10%.
  • a stationary iteration is a pass through the registration routine once the quality level has stopped improving, or has met a predefined criterion.
  • a number of stationary iterations can be fixed. For example, 3 to 10 additional iterations can be specified.
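  • One plausible reading of the convergence control described above is sketched here; the 2% threshold and five stationary iterations are illustrative values within the ranges mentioned, and step() is an assumed callable performing one registration pass and returning the new quality value (lower being better).

```python
def iterate_until_converged(step, initial_quality, min_improvement=0.02,
                            stationary_iterations=5):
    """Iterate while the relative quality improvement exceeds min_improvement,
    then run a fixed number of additional 'stationary' iterations."""
    quality = initial_quality
    while True:
        new_quality = step()
        if quality <= 0 or (quality - new_quality) / quality <= min_improvement:
            break
        quality = new_quality
    for _ in range(stationary_iterations):
        quality = step()
    return quality
```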
  • at step 4207 it has been determined that the current frame model cannot be successfully registered into the cumulative 3D model. Therefore, the current cumulative 3D model is saved, and a new cumulative 3D model is started with the current frame. As previously described, because a new 3D cumulative model has been started, the current 3D frame model, which is a point model, is converted to a primitive model before returning to the calling step.
  • the movement of the frame during steps 4004, 4006, 4203, and 4205 may include an acceleration, or over movement, component.
  • an analysis may indicate that a movement in a specific direction needs to be 1mm.
  • the frame can be moved by 1.5mm, or some other scaled factor.
  • Subsequent movements of the frame can use a similar or different acceleration factor.
  • a smaller acceleration value can be used as registration progresses.
  • the use of an acceleration factor helps compensate for local minima which result when no overlapping features happen to align. When this happens, a small movement value can result in a lower quality level.
  • with acceleration, it is more likely that the misalignment can be overcome.
  • acceleration can be beneficial to overcome "bumpiness" in a feature.
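  • A small sketch of the acceleration (over-movement) factor is given below; the 1.5 starting factor matches the example above, while the linear decay toward 1.0 as registration progresses is an assumption added for illustration.

```python
def accelerated_movement(movement, progress, start_factor=1.5, end_factor=1.0):
    """Scale a computed movement by an over-movement factor that shrinks as
    registration progresses (progress runs from 0.0 to 1.0), helping the frame
    escape shallow local minima."""
    factor = start_factor + (end_factor - start_factor) * progress
    return [factor * m for m in movement]

# Early in registration a computed 1 mm movement becomes a 1.5 mm applied movement:
# accelerated_movement([1.0, 0.0, 0.0], progress=0.0)  ->  [1.5, 0.0, 0.0]
```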
  • systems for scanning and/or registering of scanned data will include generic or specific processing modules and memory.
  • the processing modules can be based on a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, microcontroller, digital processor, microcomputer, a portion of a central processing unit, a state machine, logic circuitry, and/or any device that manipulates signals.
  • the manipulation of these signals is generally based upon operational instructions represented in a memory.
  • the memory may be a single memory device or a plurality of memory devices.
  • Such a memory device (machine readable media) may be a read only memory, a random access memory, a floppy disk memory, magnetic tape memory, erasable memory, a portion of a system memory, and/or any other device that stores operational instructions in a digital format.
  • when the processing module implements one or more of its functions via a state machine and/or other logic circuitry, the memory storing the corresponding operational instructions may be embedded within the circuitry comprising the state machine and/or other logic circuitry.
  • the present invention has been described with reference to specific embodiments. In other embodiments, more than two registration processes can be used. For example, if the cumulative registration process has breaks resulting in multiple cumulative models, a subsequent registration routine can be used to attempt registration between the multiple cumulative models.
  • Figures 44-52 illustrate a specific method and apparatus using three-dimensional scan data of an anatomical structure, which may be obtained in the specific manner described herein.
  • the three-dimensional scan data is transmitted to a remote facility for further use.
  • the three-dimensional scan data can represent the anatomy of an anatomical structure which is used to design an anatomical device, manufacture an anatomical device, monitor structural changes of the anatomy, archive data pertaining to the anatomical structure, perform a closed-loop iterative analysis of the anatomical structure, perform an interactive consultation of the structure, perform simulations based upon the structure, make a diagnosis related to the anatomical structure, or determine a treatment plan based on the anatomical structure.
  • an anatomical device is defined to include devices that actively or passively supplement or modify an anatomical structure.
  • anatomical devices include orthotic and prosthetic devices, and anatomical appliances.
  • Anatomical appliances include orthodontic appliances, which may be active or passive, and can include, but are not limited to, such items as braces, retainers, brackets, wires and positioners.
  • Examples of other anatomical appliances include splints, and stents.
  • Examples of orthotic and prosthetic anatomical devices include removable prosthetic devices, fixed prosthetic devices, and implantable devices.
  • Removable prosthetic devices include dental structures such as dentures and partial dentures, and prosthetic structures for other body parts, such as prosthetic devices that serve as artificial body parts including limbs, eyes, implants (including cosmetic implants), hearing aids, spectacle frames, and the like.
  • Fixed prosthetic anatomical devices include caps, crowns and other non-dental anatomical replacement structures.
  • Implantable prosthetic devices include endosseous implants and orthodontic implants and fixture devices such as plates used for holding and reducing fractures.
  • Figure 44 illustrates a flow in accordance with the present invention. Specifically, Figure 44 illustrates the scanning of an anatomical structure 4400 by a scanning device 4401 at a facility 4441.
  • any scanner type or method capable of generating digital data for the purposes put forth herein can be used.
  • Direct three-dimensional surface scanning indicates that some or all of the anatomical structure can be scanned directly.
  • the scan is a surface scan, whereby the scanning device 4401 detects signals and/or patterns reflected from at or near the surface of the structure 4400.
  • a specific surface scanning method and device has been described previously herein. Other scanning methods can be used as well.
  • the surface scan of the anatomical structure will be a direct scan of the anatomical structure.
  • a direct scan refers to a scan of the actual anatomical structure (in-vivo).
  • an indirect scan of the anatomical structure can also be made and integrated with the direct scan.
  • An indirect scan refers to scanning a representation of the actual original anatomical structure (in-vitro).
  • Digital data 4405 is generated at the facility (location) 4441 based on the direct scan of anatomical structure 4400.
  • the digital data 4405 represents raw scan data, which is generally a two-dimensional cloud of points, generated by the scanning device 4401.
  • the digital data 4405 represents a three-dimensional point model, which is generally generated based upon a two-dimensional cloud of points.
  • the digital data 4405 represents a three-dimensional primitive model. Note that the digital data 4405 can be a composite of multiple independent scans, which may be performed at approximately the same or different points in times, as well as at the same or different locations.
  • the actual data type of digital data 4405 is determined by the amount of processing done to the raw scan data at the location 4441.
  • the data received directly from the scanner 4401 is a two-dimensional cloud of points. Therefore, when no processing is performed at the facility 4441, the digital data 4405 is a two-dimensional cloud of points. Three-dimensional point models and three-dimensional primitive models are typically generated by further processing of the two-dimensional point cloud.
  • Facility 4441 represents a location where the physical scanning of the anatomical structure occurs. In one embodiment, facility 4441 is a location that is dedicated to, or primarily dedicated to, scanning anatomical structures. In this embodiment, the facility would be located where it is easily accessible by large numbers of clients (patients) needing scans.
  • a kiosk in a mall, or a location in a strip mall can be dedicated to performing scans.
  • a facility may perform a broad variety of scans, or may specialize in specific types of scans, such as scans of facial structures or dental structures.
  • scans can be performed at home by a user.
  • a user can be provided with a portable scanner for remote use to scan the anatomical structure to generate scan data that can be used to monitor the progress of a treatment plan, for diagnostic purposes, or for surveillance or monitoring purposes.
  • the facility 4441 is a location that scans anatomical structures and performs other value-added services related to generating the digital data 4405.
  • Other value-added services include designing, or partially designing, anatomical devices based upon the scan data to generate the digital data 4405, or installation of such anatomical devices. In one implementation, no value-added services beyond the generation of the digital data 4405 are performed at the facility 4441.
  • the digital data 4405 can be provided to the client.
  • the connection 4406 represents the digital data being provided to a third party. This step of providing can be done by the client, the facility 4441, or any other intermediate source. Generally, the client will specify the third party where the data is to be sent.
  • the digital data 4405 can be provided to the facility 4442 either physically, i.e. by mail or courier, or remotely, i.e. by transmission.
  • the digital data 4405 can be physically provided on a non-volatile storage device, such as a portable magnetic media, a read-only fuse device, or a programmable non-volatile device.
  • the digital data can be transmitted to the client or a third party by a direct connection, the internet, a local area network, a wide area network, a wireless connection, and/or any device that enables the transfer of digital information from one computing system to another.
  • either all or some of the digital data need be transmitted. For example, where the scan is of a patient's teeth and associated structures, such as the gums, a portion of the teeth may be transmitted.
  • the digital data 4405 received at the facility 4442 is used to design an anatomical device at step 4415.
  • Figure 45 illustrates a method having two alternate embodiments of step 4415.
  • a first embodiment, which begins at step 4501, designs an anatomical device using a physical model of the anatomical structure
  • a second embodiment, which begins at step 4511, designs the device using a virtual model of the anatomical structure.
  • a virtual model of the anatomical device will generally be a computer generated virtual model.
  • at step 4501, generation of a three-dimensional physical model of the anatomical structure occurs using the digital data 4405.
  • the physical model of the scanned object is generated using numerically controlled processing techniques, such as three-dimensional printing, automated milling, laser sintering, stereo lithography, injection molding, and extrusion.
  • the three-dimensional physical model is used to design the anatomical device. For example, by using the physical model, practitioners will generate anatomical devices for use by the client.
  • anatomical devices are custom designed based upon the physical model.
  • standard orthodontic devices are selected based upon the physical model of the anatomical structure. These standard devices may be modified as required to form semi-custom devices.
  • the manufacture of the anatomical device can be based upon the physical model. Where physical models are used, the step 4502 of designing and the step 4503 of manufacturing are often combined, whereby the design and manufacturing processes occur simultaneously. In other embodiments, moldings or specifications of the desired anatomical device are made and sent to processing centers for custom design and/or manufacturing.
  • a virtual three- dimensional model of the anatomical structure is used to design an anatomical device.
  • a virtual three-dimensional model refers to a model generated by a numerically controlled device, such as a computer, and either includes the digital data 4405, or is generated based upon the digital data 4405.
  • the virtual three-dimensional model is included as part of the digital data 4405 provided to a design center.
  • the three-dimensional model is generated using the digital data 4405 received at step 4511.
  • an alternate-three-dimensional model is generated at step 4511 based on a three-dimensional model included as part of the digital data 4405.
  • multiple three-dimensional models can be stitched together from a plurality of scans. For example, data from multiple scan sessions can be used.
  • a virtual anatomical device is designed (modeled) using the virtual three-dimensional model.
  • the virtual device can be designed using standard or custom design software for specifying virtual devices. Examples of such design software include commercially available products such as AutoCAD, Alias, Inc., and ProEngineer.
  • the design software can be used to design a virtual crown using the three-dimensional virtual model of the anatomical structure, or to select a near-custom, standard, or virtual device from a library of devices that represent actual devices. Subsequent to selecting a standard device, customizations can be made.
  • the anatomical device can be manufactured directly based upon a virtual specification of the device.
  • the anatomical device can be generated using numerically controlled processing techniques, such as three-dimensional printing, automated milling, laser sintering, stereo lithography, injection molding, extrusion, and casting techniques. It will be appreciated that the manufacture of the anatomical device includes partially manufacturing the device, as well as manufacturing the device at multiple locations.
  • the manufactured anatomical device is scanned.
  • a simulation can be performed to verify the relationship between the anatomical device as manufactured and the anatomical structure, thereby, providing closed loop feedback to assure proper manufacture of the device.
  • the completed anatomical device is sent to a specified location for installation.
  • the anatomical device is sent to facility 4444, where installation occurs at step 4435.
  • the anatomical device is installed at step 4435 by a practitioner, such as a dentist, orthodontist, physician, or therapist.
  • the patient can install the anatomical device.
  • a patient can install some orthodontic devices, such as retainers or similar positioning devices.
  • the anatomical device is designed or manufactured at a remote location relative to where the digital data 4405 is received or generated.
  • the digital data is received at the location 4441.
  • the digital data is received by the scanning the anatomical structure 4400.
  • the digital data is transmitted to location 4442, which is a remote location relative to the location 4441, where an anatomical device is at least partially designed.
  • a remote location is one that is disassociated in some manner from another location.
  • the remote location can be a location that is physically separate from the other location.
  • the scanning facility can be in a different room, building, city, state, country, or other location.
  • the remote location can be a functionally independent location.
  • one location can be used to perform one specific function, or set of functions, while another location can be used to perform a different function. Examples of different functions include scanning, designing, and manufacturing.
  • Remote locations will generally be supported by separate infrastructures, such as personnel and equipment.
  • the digital data 4405 at facility 4441 includes a partially designed anatomical device.
  • the anatomical device is further designed at the remote facility 4442.
  • facility 4442 can represent one or more remote facilities that can be used in parallel or in series to determine a final anatomical device, make a diagnosis, form a treatment plan, monitor progress, or design a device based upon cost, expertise, ease of transmission, and turnaround time required.
  • An example of parallel facilities is further illustrated in Figure 48.
  • Figure 46 illustrates another embodiment of the present invention.
  • the flow of Figure 46 is similar to the flow of Figure 44, with an additional intermediate step 4615.
  • the intermediate step 4615 indicates that the digital data 4405 does not need to be received directly from the facility 4441 where the data was scanned.
  • the digital data 4405 can be generated at the first facility (sending facility) by scanning, and the digital data can be provided to a second facility 4641 (receiving facility) where the intermediate step 4615 occurs.
  • the digital data 4405, or modified digital data that is a representation of the digital data, can be transmitted to a third facility (remote facility) that is remote relative to at least one of the first and second facilities.
  • the scan data can be processed to provide a three-dimensional virtual model of the anatomical structure; data can be added to the digital data, including image data of the scanned anatomical structure 4400, video and/or photo data containing color information, diagnosis information, treatment information, audio data, text data, X-ray data, anatomical device design information, and any other data which may be pertinent to the design or manufacture of the anatomical device.
  • the intermediary step 4615 need not alter the digital data 4405.
  • Figure 47 illustrates an alternate embodiment of the present invention where the digital data 4405 is received at facility 4742 for forensic evaluations at step 4741.
  • An example of a forensic evaluation includes identification of victims based on the anatomical structure scanned. Such identifications will generally be made based upon matching a specific anatomical structure to an anatomical structure contained within a target database, where the target database can contain a single structure or a plurality of structures.
  • the target database could be a centrally located database containing archived data.
  • Figure 48 illustrates an embodiment of the present invention where digital data 4405, or its representation, is transmitted to one or more remote facilities 4843 for diagnostic purposes or treatment planning at steps 4844 and 4845.
  • the ability to transmit the data for diagnostic purposes allows three-dimensional information of an anatomical structure to be provided to other practitioners, such as specialists, without the patient having to be physically present. This ability improves the overall speed and convenience of treatment, as well as accuracy, since multiple diagnoses can be made in parallel.
  • the ability to send the digital data 4405, or its representation, to multiple facilities for treatment planning and diagnosis allows multiple opinions to be obtained. Once a specific treatment plan is selected, any one of the devices specified as part of the treatment plan can be selected for manufacturing.
  • price quotes are obtained from each of the facilities.
  • the price quotes can be based upon a specific treatment specified by a requesting party, where the treatment relates to the anatomical structure.
  • the price quotes can be based upon a desired result specified by the requesting party, where the treatment definition and its associated implementation costs are determined by the facility providing the quotes. In this manner, a patient or a patient's representative can obtain competitive bids in an efficient manner.
  • Figure 49 illustrates an alternate embodiment of the present invention where the digital data 4405, or its representation, is received at facility 4942 so that the data can be used at step 4941 for educational purposes.
  • educational techniques can be performed in a standardized manner not possible using previous methods. Examples of educational purposes include self-learning purposes, education monitoring purposes, and the ability to provide standardized applied exams, which were not possible before.
  • case facts for a specific patient can be matched to previous, or present, case histories of other patients, where the case histories are stored or archived.
  • Figure 50 illustrates an embodiment where scan data can be archived at step 5001, which occurs at location 5002, for easy retrieval by authorized individuals.
  • such archives would be provided as a service, whereby the data would be commonly maintained, thereby allowing a common site-independent "gold standard" copy of the digital data to be obtained.
  • Figure 51 illustrates a specific embodiment of the present invention where the digital data obtained by scanning the anatomical structure is used in a closed-loop iterative system.
  • the flow of Figure 51 may also be interactive. Specifically, changes in the anatomical structure, whether intentional or unintentional, can be monitored and controlled as part of the closed-loop system.
  • the closed-loop system in accordance with the present invention is deterministic because the scan data is measurable in three-dimensional space, thereby allowing a standard reference, in the form of a three-dimensional model, to be used for analysis.
  • the three-dimensional scan data of the anatomical device is obtained.
  • at step 5102 the data from the scan of step 5101, or a representation of the data, is transmitted to a remote facility.
  • at step 5103 a design/evaluation of the transmitted data is performed.
  • a treatment plan, diagnosis, and a design for an anatomical device are determined during a first pass through the loop including step 5103.
  • the status or progress of the treatment or device is monitored and changes are made as necessary.
  • the monitoring is performed by comparing the current scan data to an expected result, which has been simulated, or to the previous history, or against matched case histories.
  • at step 5104 the device or treatment plan is implemented or installed as appropriate. Any manufacturing of a device is also performed as part of step 5104.
  • at step 5105 it is determined whether an additional pass through the closed-loop system of Figure 51 is needed. If so, the flow proceeds to step 5101.
  • a closed-loop feedback loop can exist between any of the steps illustrated in Figure 51
  • one method of obtaining fixed reference points for an orthodontic structure includes selecting orientation reference points based on physical attributes of the orthodontic structure.
  • the orientation reference points can subsequently be used to map the digital image of the orthodontic structure into a three- dimensional coordinate system.
  • the frenum can be selected to be one of the orientation reference points and the rugae can be selected as the other reference point.
  • the frenum is a fixed point in the orthodontic patient that will not change, or change minimally, during the course of treatment.
  • the frenum is a triangular shaped tissue in the upper-portion of the gum of the upper-arch.
  • the rugae is a cavity in the roof of the mouth 68 in the upper-arch. The rugae will also not change its physical position through treatment.
  • the frenum and the rugae are fixed physical points in the orthodontic patient that will not change during treatment. As such, by utilizing these as the orientation reference points, a three-dimensional coordinate system may be mapped thereto, as sketched below.
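  • As a rough sketch only: two fixed landmarks define an origin and one axis, but a third reference direction is needed to fix a full three-dimensional frame; the occlusal-plane normal used below is an assumption added here for illustration, not something specified in the description.

```python
import numpy as np

def dental_coordinate_system(frenum, rugae, occlusal_normal):
    """Build an orthonormal coordinate frame with the frenum as origin and the
    frenum-to-rugae direction as one axis; the assumed occlusal-plane normal
    fixes the remaining two axes."""
    origin = np.asarray(frenum, dtype=float)
    y_axis = np.asarray(rugae, dtype=float) - origin
    y_axis /= np.linalg.norm(y_axis)
    z_axis = np.asarray(occlusal_normal, dtype=float)
    z_axis = z_axis - y_axis * np.dot(z_axis, y_axis)   # orthogonalise against y
    z_axis /= np.linalg.norm(z_axis)
    x_axis = np.cross(y_axis, z_axis)
    return origin, np.stack([x_axis, y_axis, z_axis])   # origin and rotation rows
```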
  • Figure 52 illustrates that iterative feedback steps can take place within and between any combination of the steps illustrated herein.
  • an iterative and/or interactive loop can reside between the manufacturing step
  • fees for the use of such scan data 4405 may be fixed or variable fees based upon the use of the data, the cost of a service provided, the anatomical device being generated, or the value added to a device or service based upon the data.
  • the steps and methods described herein may be executed on a processing module (not shown).
  • the processing module may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, microcomputer, digital signal processor, central processing unit of a computer or work station, digital circuitry, state machine, and/or any device that manipulates signals (e.g., analog and/or digital) based on operational instructions.
  • the processing module's operation is generally controlled by data stored in memory. For example, where a microprocessor is used, a bus of the microprocessor is connected to a bus of a memory to access instructions.
  • Examples of memory include single or multiple memory devices, such as random access memory, read-only memory, floppy disk memory, hard drive memory, extended memory, magnetic tape memory, zip drive memory, and/or any device that stores digital information. Such memory devices can be local (i.e. connected directly to the processing device), or at physically different locations (i.e. at a site connected to the Internet.) Note that when the processing module implements one or more of its functions, via a state machine or logic circuitry, the memory storing the corresponding operational instructions is embedded within the circuitry comprising the state machine or logic circuitry.
  • the anatomical structure being scanned may have one or more associated anatomical devices or appliances.
  • the present invention provides a deterministic method for diagnosing, treating, monitoring, designing and manufacturing anatomical devices.
  • the present embodiments can be used to provide an interactive method for communication between various parties designing and manufacturing the prosthetic device. Such an interactive method can be implemented in real-time.
  • the methods described herein permit data to be archived in such a manner that others can obtain the actual information and knowledge gained from the experience of others.
  • the present embodiments allow for the circumvention of traditional labs used to generate anatomical devices. Specifically, support facilities used for the generation of anatomical devices may now be numerous and remote relative to the patient. This can reduce the overall cost to the practitioner and patient. The patient does not have to visit the practitioner to have the status or progress of a device monitored, since the scanning location can be remote from other support locations. Overall, the fixed, archivable nature of the digital data of the present embodiments allows for a low cost of generating identical duplicate models from a gold standard model, thereby reducing the likelihood of lost and inaccurate data.
  • treatment costs are reduced.
  • the accessibility of multiple opinions (quotes, treatment plans, diagnoses, etc.) increases without the additional inconvenience to the patient associated with the prior art methods.
  • Competitive quoting is also readily obtainable using the specific embodiments indicated.
  • the digital data may be a composite of a direct scan of the anatomical structure and an indirect scan of the structure. This may occur when a portion of the anatomical structure is not viewable by the scanner 4401, so an impression is made of at least that portion of the anatomical structure that is not viewable. The impression, or a model made from the impression, is then scanned and "stitched" into the direct scan data to form a complete scan.
  • the digital data 4405 can be used in combination with other traditional methods.
  • the digital data described herein can be compressed, or secured using an encryption method.
  • When encrypted, one or more of the patient, the scanning facility, the archival facility, or a representative of the patient can have encryption keys for the digital data.

Abstract

An image is projected upon a surface of a three-dimensional object (611); the image can include a pattern having a plurality of individual shapes used to measure and map the surface. The image further comprises a feature containing encoding information for identifying the plurality of shapes individually. The feature containing the encoding information is oriented such that the encoding information is retrieved along a line perpendicular to a plane formed by the projection axis and a point along the view axis (613). The feature is used to perform multiframe independent scanning (612).

Description

METHOD AND SYSTEM FOR SCANNING A SURFACE AND GENERATING A THREE-DIMENSIONAL OBJECT
RELATED APPLICATIONS
This application claims priority to U.S. Application No. 09/560,131 filed on April 28, 2000; U.S. Application No. 09/560,132 filed on April 28, 2000; U.S. Application No. 09/560,583 filed on April 28, 2000; U.S. Application No. 09/616,093 filed on July 13, 2000; U.S. Application No. 09/560,645 filed on April 28, 2000; U.S. Application No. 09/560,644 filed on April 28, 2000; U.S. Application No. 09/560,584 filed on April 28, 2000; and U.S. Application No. 09/560,133 filed on April 28, 2000, the entire teachings of which are incorporated herein by reference.
Field of the Invention
The present invention relates generally to the mapping of objects, and more specifically to creating three-dimensional models of objects, to registering object portions to generate a model of the object, to registering scanned portions of the object to provide a three-dimensional object, to scanning anatomical structures to treat and diagnose, as well as to develop and manufacture medical and dental devices and appliances, and to providing specific images to aid the mapping of objects.
Background of the Invention
The ability to generate anatomical devices such as prosthetics, orthotics, and appliances such as orthodontics is well known. Current methods of generating anatomical devices are subjective, whereby a practitioner specifies, or designs, the anatomical device based upon subjective criteria such as the practitioner's view of the anatomical structure, the location where a device is to be used, and the practitioner's experience and recall of similar situations. Such subjective criteria result in the development of an anatomical device that can vary significantly from practitioner to practitioner, and prevent the acquisition of a knowledge database that can be used by others.
One attempt to make the development of an anatomical device less subjective includes taking an impression of the anatomical structure. From the impression, which is a negative representation of the anatomical structure, a positive model of the anatomical structure can be made. However, impressions, and models made from impressions, are subject to distortion, wear, and damage, have a limited shelf life, are imprecise, require additional cost to generate multiple copies, and have an accuracy that is not readily verifiable. Therefore, whether an impression or a model of a structure is a true representation of the anatomical structure is not readily verifiable. Furthermore, impression processes are generally uncomfortable and inconvenient for patients, require a visit to a practitioner's office, and are time consuming. Furthermore, where multiple models are needed, either multiple impressions must be made, or multiple molds must be made from a single impression. In either case, no reliable standard reference is available to guarantee the similarity of each of the models. Furthermore, the mold still must be visually interpreted by the solo practitioner, resulting in a subjective process.
Another attempt to make development less subjective includes using two-dimensional images. However, the use of two-dimensional images as known cannot provide precise structural information, and the images still must be subjectively interpreted by the practitioner. Furthermore, the manufacturing of the device is still based upon a subjective interpretation.
When an impression is shipped from the practitioner to a manufacturing facility, communication between the practitioner and the technicians about issues pertaining to the model or device being manufactured is impeded, since the three-dimensional model which is being used to design a prosthetic device is available only to the manufacturing facility. Even if multiple molds exist, they cannot be viewed simultaneously from the same perspective, as they are physically separate objects, nor is there an interactive way of referencing the multiple models to one another.
Other types of records, in addition to molds and impressions, that are maintained by practitioners, such as dentists and orthodontists, are subject to being lost or damaged, and are costly to duplicate. Therefore, a method or system that overcomes these disadvantages would be useful.
The use of scanning techniques to map surfaces of objects is well known. Prior art Figure 1 illustrates an object 100 having visible surfaces 101-104. Generally, the visible surfaces 101-103 form a rectangular shape residing on top of a generally planar surface 104.
Projected onto the object 100 is an image, which includes the line 110. In operation, the image of line 110 is received by a viewing device, such as a camera (not shown), and processed in order to determine the shape of that portion of object 100 where the line 110 resides. By moving the line 110 across the object 100, it is possible to map the entire object 100. Limitations associated with using an image comprising a single line 110 are that a significant amount of time is needed to scan the object 100 to provide an accurate map, and that a fixed reference point is needed at either the scanner or the object.
Figure 2 illustrates a prior art solution to reduce the amount of time taken to scan an object. Specifically, Figure 2 illustrates an image including lines 121 through 125. By providing multiple lines, it is possible to scan a greater surface area at once, thus allowing for more efficient processing of data associated with the object 100. Limitations of using patterns such as are illustrated in Figure 2 include the need for a fixed reference point, and that the surface resolution capable of being mapped can be reduced because of the potential for improper processing of data due to overlapping of the discrete portions of the image.
In order to better understand the concept of overlapping, it is helpful to understand the scanning process. Prior art Figure 3 illustrates the shapes of Figures 1 and 2 from a side view such that only surface 102 is visible. For discussion purposes, the projection device (not illustrated) projects a pattern in a direction perpendicular to the surface 101, which forms the top edge of surface 102 in Figure 3. The line from the center of the projection lens to the surface is referred to as the projection axis, the rotational axis of the projection lens, or the centerline of the projection lens. Likewise, an imaginary line from a center point of the viewing device (not shown), referred to as the view axis, the rotational axis of the view device, or the centerline of the view device, extends in the direction in which the viewing device is oriented.
The physical relationship of the projection axis and the view axis with respect to each other is generally known. In the specific illustration of Figure 3, the projection axis and the view axis reside in a common plane. The relationship between the projection system and the view system is physically calibrated, such that the relationship between the projector and the view device is known. Note that the term "point of reference" is used to describe the reference from which a third person, such as the reader, is viewing an image. For example, for Figure 2, the point of reference is above and to the side of the point that is formed by surfaces 101, 102, and 103.
Figure 4 illustrates the object 100 with the image of Figure 2 projected upon it where the point of reference is equal to the projection angle. When the point of reference is equal to the projection angle, no discontinuities appear in the projected image. In other words, the lines 121-125 appear to be straight lines upon the object 100. However, where the point of reference is equal to the projection axis, no useful data for mapping objects is obtained, because the lines appear to be undistorted. Figure 5 illustrates the object 100 from a point of reference equal to the view angle fleet of Figure 2. In Figure 5, the surfaces 104, 103 and 101 are visible because the view axis is substantially perpendicular to the line formed by surfaces 101 and 103, and is to the right of the plane formed by surface 102, see Figure 2, which is therefore not illustrated in Figure 5. Because of the angle at which the image is being viewed, or received by the viewing device, the lines
121 and 122 appear to be a single continuous straight line. Likewise, line pairs
122 and 123, and 123 and 124, coincide to give the impression that they are single continuous lines. Because line 125 is projected upon a single level surface elevation, surface 104, line 125 is a continuous single line.
When the pattern of Figure 5 is received by a processing device to perform a mapping function, the line pairs 121 and 122, 122 and 123, and 123 and 124, will be improperly interpreted as single lines. As a result, the two- tiered object illustrated in Figure 2 may actually be mapped as a single level surface, or otherwise inaccurately displayed because the processing steps can not distinguish between the line pairs.
Figure 6 illustrates a prior art solution for overcoming the problem described in Figure 5. Specifically, Figure 6 illustrates the shape 100 having an image projected upon it whereby a plurality of lines having different line widths, or thickness, are used. Figure 7 illustrates the pattern of Figure 6 from the same point of reference as that of Figure 5.
As illustrated in Figure 7, it is now possible for a processing element analyzing the received data to distinguish between the previously indistinguishable line pairs. Referring to Figure 7, line 421 is still lined up with line 422 to form what appears to be a continuous line. However, because line 421 and line 422 have different thicknesses, it is now possible for an analysis of the image to determine the correct identity of the specific line segments. In other words, the analysis of the received image can now determine that line 422 projected on surface 104, and line 422 projected on surface 101, are actually a common line. Utilizing this information, the analysis of the received image can determine that a step-type feature occurs on the object being scanned, resulting in the incongruity between the two segments of line 422.
While the use of varying line thickness, as illustrated in Figure 7, assists identifying line segments, objects that have varying features of the type illustrated can still result in errors during the analysis of the received image.
Figure 8 illustrates from a side point of reference a structure having a surface 710 with sharply varying features. The surface 710 is illustrated to be substantially perpendicular to the point of reference of Figure 8. In addition, the object 700 has side surfaces 713 and 715, and top surfaces 711 and 712. From the point of reference of Figure 8, the actual surfaces 711, 712, 713 and 715 are not viewed, only their edges are represented. The surface 711 is a relatively steep sloped surface, while the surface 712 is a relatively gentle sloped surface.
Further illustrated in Figure 8 are three projected lines 721 through 723 having various widths. A first line 721 has a width of four. A second projected line 722 has a width of one. A third projected line 723 has a width of eight. The line 721, having a width of four, is projected onto a relatively flat surface 714. Because of the angle between the projection axis and the view axis, the actual line 721 width viewed at the flat surface 714 is approximately two. If the lines 722 and 723 where also projected upon the relatively flat surface 714 their respected widths would vary by approximately the same proportion amount as that of 721, such that the thickness can be detected during the analysis steps of mapping the surface. However, because line 722 is projected onto the angled surface 711, the perspective from the viewing device along the viewing axis is such that the line 722 has a viewed width of two.
Line 722 appears to have a width of two because the steep angle of the surface 711 allows a greater portion of the projected line 722 to be spread over a greater area of the surface 711. It is this greater area of the surface 711 that is viewed to give the perception that the projected line 722 has a thickness of two.
In a manner opposite to how line 722 is affected by surface 711, line 723 is affected by surface 712 to give the perception that the projected line 723, having an actual width of eight, has a width of two. This occurs because the angle of the surface 712 relative to the viewing device allows the surface area with the projected line 723 to appear to have a width of two. The results of this phenomenon are further illustrated in Figure 9.
Figure 9 illustrates the shape 700 of Figure 8 from the point of reference of the view axis. From the point of reference of the view axis, the lines 721-723 are projected onto the surface 714 in such a manner that the differences between the line thicknesses can be readily determined. Therefore, when an analysis of the surface area 714 occurs, the lines are readily discernible based upon the viewed image. However, when an analysis includes the surfaces 711 and 712, the line 722 can be erroneously identified as being line 721 because not only are the viewed widths the same, but line 722 on surface 711 lines up with line 721 on surface 714. Likewise, the line 723, having a projected width of eight, has a viewed width of two. Therefore, during the analysis of the received images, it may not be possible to distinguish between lines 721, 722, and 723 on surfaces 711 and 712. The inability to distinguish between such lines can result in an erroneous analysis of the surfaces.
One proposed method of scanning, disclosed in foreign patent DE 198 21 611.4, used a pattern that had rows of black and white triangles and squares running parallel to a plane of triangulation. The rows used measuring features that include a digitally encrypted pattern. However, when a surface being scanned causes shadowing and/or undercuts, a break in the sequence can result due to a portion of the pattern being hidden. Furthermore, the disclosed encrypted pattern is such that breaks in the sequence can result in the inability to decode the pattern, since it may not be possible to know which portion of the pattern is missing. A further limitation of the type of encoding described is that distortion can cause one encoding feature to look like another. For example, a triangle can be made to look like a square.
Therefore, a method and apparatus capable of overcoming the problems associated with the prior art mapping of objects would be advantageous.
Brief Description of the Drawings
Figure 1 illustrates an object being scanned by a single line in accordance with the prior art;
Figure 2 illustrates an object being scanned by a plurality of lines in accordance with the prior art;
Figure 3 illustrates a projection axis and a view axis associated with the lines of Figure 2 in accordance with the prior art;
Figure 4 illustrates the object of Figure 1 from a point of reference equal to the projection axis of Figure 3;
Figure 5 illustrates the object of Figure 3 from the view axis of Figure 3;
Figure 6 illustrates an object having a plurality of lines of varying thickness projected upon it in accordance with the prior art;
Figure 7 illustrates the object of Figure 6 from a point of reference equal to the view axis as shown in Figure 3;
Figure 8 illustrates an object from a side view having varying projected line thickness in accordance with the prior art;
Figure 9 illustrates the object of Figure 8 from a point of reference equal to the view axis of Figure 8;
Figure 10 illustrates a system in accordance with the present invention;
Figure 11 illustrates a portion of the system of Figure 10 in accordance with the present invention;
Figure 12 illustrates, in flow diagram form, a method in accordance with the present invention;
Figure 13 illustrates the object of Figure 3 from a point of reference equal to the view axis of Figure 3 in accordance with the present invention;
Figure 14 illustrates the object of Figure 3 from a point of reference equal to the view axis of Figure 3 in accordance with the present invention;
Figure 15 illustrates an object having a pattern projected upon it in accordance with the present invention;
Figure 16 illustrates a table identifying various types of pattern components in accordance with the present invention;
Figure 17 illustrates a set of unique identifiers in accordance with the present invention;
Figure 18 illustrates a set of repeating identifiers in accordance with the present invention;
Figures 19-22 illustrate, in flow diagram form, a method in accordance with the present invention;
Figure 23 illustrates a sequence of images to be projected upon an object in accordance with an embodiment of the present invention;
Figure 24 illustrates an image having varying features in accordance with an embodiment of the present invention;
Figure 25 illustrates a projected image feature being reflected off surfaces at different depths in accordance with a preferred embodiment of the present invention;
Figure 26 illustrates the projected image of Figure 25 as viewed at the different depths;
Figures 27-30 illustrate a dentition object from various perspectives in accordance with preferred embodiments of the present invention;
Figure 31 illustrates a method in accordance with a specific embodiment of the present invention;
Figures 32 and 33 illustrate a dentition object being scanned from various perspectives in accordance with preferred embodiments of the present invention;
Figure 34 illustrates primitive shapes for modeling a dentition object;
Figures 35 and 36 illustrate methods in accordance with a specific embodiment of the present invention;
Figure 37 illustrates a graphical representation of a method for selecting various entry points for registration in accordance with a preferred embodiment of the present invention;
Figures 38-43 illustrate methods in accordance with a specific embodiment of the present invention; and
Figures 44-52 illustrate specific flows in accordance with specific embodiments of the present invention.
Detailed Description of Preferred Embodiments
In accordance with a specific embodiment of the present invention, an image is projected upon a surface. The image can include a pattern having a plurality of individual shapes used to measure and map the surface. The plurality of individual shapes include features that are detectable in a direction parallel to the plane formed by a projection axis of the projected shapes and a point associated with a view axis. The image further comprises a feature containing encoding information for identifying the plurality of shapes individually. The encoding feature varies in a direction substantially orthogonal to a plane formed by the projection axis and a point of a view axis, and can be a separate feature from each of the plurality of individual shapes, can be a feature integral to the plurality of individual shapes, and/or can be displayed at different time intervals from the plurality of individual shapes. The feature containing encoding information is oriented such that the encoding information is retrieved along a line substantially perpendicular to a plane formed by the projection axis and the point along the view axis. The feature is used to perform multi-frame reference independent scanning. In a specific embodiment, scanned frames are registered to one another.
Specific embodiments of the present invention are best understood with reference to the accompanying Figures 10-24. Figures 10 and 11 represent a system for implementing a specific embodiment of the present invention,
Figures 12 and 19-22 illustrate specific methods in accordance with the present invention, and Figures 13-18, 23, and 24 illustrate specific implementations of the methods in combination with the system.
Figures 44-52 illustrate a specific method and apparatus of another inventive embodiment that uses three-dimensional scan data of an anatomical structure, which may be obtained in the specific manner described herein. The three-dimensional scan data is transmitted to a remote facility for further use. For example, the three-dimensional scan data can represent the anatomy of an anatomical structure which is used to design an anatomical device, manufacture an anatomical device, monitor structural changes of the anatomy, archive data pertaining to the anatomical structure, perform a closed-loop iterative analysis of the anatomical structure, perform an interactive consultation of the structure, perform simulations based upon the structure, make a diagnosis related to the anatomical structure, or determine a treatment plan based on the anatomical structure.
Figure 10 illustrates a system controller 951 that provides control signals to the scanning device 980. The scanning device 980 projects an image bound by lines 962 and 963, and retrieves, or views, the images within the reflected lines 972 and 973.
In one operation, the system controller 951 provides specific information to the scanner 980 specifying a specific image to be projected upon the surface 991 of the object 990. The reflected image is captured by the scanning device 980, which in turn provides the captured information back to the system controller 951. The captured information can be provided back to the system controller 951 automatically, or can be stored within the scanning device 980 and retrieved by the system controller 951. The image data, once received by the system controller 951, is analyzed in order to determine the shape of the surface 991. Note that the analysis of the received data can be performed either by the system controller 951, or by an external processing device that is not shown.
Further illustrated in Figure 10 is the scanning device 980, which includes a projecting device (projector) 960 and a viewing device (viewer) 970. The projector 960 is oriented such that the image is projected on the object 990. The projector 960 has a projection axis 961. The projection axis 961 begins at the center of the lens projecting the image and is representative of the direction of projection. Likewise, the viewer 970 has a view axis 971 that extends from the center of the lens associated with the viewer 970 and represents the direction from which images are being received.
Once the scanning device is calibrated, analysis of the received signals can be performed to map the scanned surface. One skilled in the art will recognize that the angles represented in the Figures herein are represented as such for illustrative purposes only. The actual angles and distances may vary substantially from those illustrated.
Figure 11 illustrates in greater detail the system controller 951 of Figure 10. The system controller 951 further includes a data processor 952, a projection image representation 953, a projector controller 954, and a viewer controller 955. The viewer controller 955 provides the interface needed to receive data from the viewer 970 representing the reflected image data. The reflected image data is received from the viewer 970 at the viewer controller 955, and subsequently provided to the data processor 952. In a similar manner, the projector controller 954 provides the interface necessary to control the projector 960. The projector controller 954 provides the projector 960 with the image to be projected in a format supported by the projector. In response, the projector 960 projects the image onto the surface of the object. The projector controller 954 receives or accesses the projection image representation 953 in order to provide the projector with the image.
In the embodiment illustrated, the projection image representation 953 is an electronic representation of the image stored in a memory location. The stored image can represent a bit-mapped image, or another standard or custom protocol used to define the image to be projected by the projector 960. Where the projection image is a digital image (electrically generated), the representation can be stored in memory by the data processor 952. Because the data processor 952 can modify the projection image representation, it is possible to vary the image as necessary in accordance with the present invention.
In another embodiment, the projection image representation 953 need not be present. Instead, the projection controller 954 may select one or more transparencies (not illustrated) associated with the projector 960. Such transparencies can include any combination of films, plates, or other types of reticle devices that project images.
The data processor 952 controls the projection and reception of data through the controllers 954 and 955, respectively.
Figure 12 illustrates a method in accordance with the present invention that will be discussed with reference to the system of Figure 10 and the accompanying Figures. In order to better understand the methods discussed herein, terminology and characteristics unique to the present invention are described. The term "projection/view plane" refers to a plane formed by the projection axis and at least one point of the view axis. The term projection/view plane is best understood with reference to Figure 3. Assuming that Figure 3 represents a cross section of the object 100, the projection axis illustrated is directed such that it lies entirely within the plane formed by the sheet of paper including Figure 3. Likewise, the view axis of Figure 3 is also lying entirely within the plane represented by the sheet of paper of Figure 3. In this example, the projection/view plane formed by the projection axis of Figure 3 and at least one point of the view axis of Figure 3 includes the sheet of paper on which the Figure is drawn.
However, if the view axis of Figure 3 was actually oriented such that the endpoint near the viewing device is on the plane of the paper, while the arrow end of the view axis representation is pointing out of the paper towards the reader, it would not be possible to form a plane that includes the entire view axis and projection axis. Therefore, the projection/view plane can be described to contain substantially all of the projection axis and at least one point of the view axis, or all of the view axis and at least one point of the projection axis. For purposes of discussion herein, it will be assumed that the point of the view axis nearest the view device is the point to be included within that projection/view plane. For example, referring to prior art Figure 4, the projection/view plane described with reference to Figure 3 would be substantially orthogonal to the surface 104, and orthogonal to each of the lines 121-125. The projection/view plane is represented by line 99, which represents the plane from an edge view intersecting the lines 121-125.
At step 611 of Figure 12, an image is projected having an encoding (variable) feature with a component, or components, that varies orthogonal to the projection/view plane. With respect to Figure 13, the projection/view plane is illustrated by the line 936, indicating that the orientation of the view/projection plane is on edge such that the plane appears to be a line, and each of the shapes or patterns 931-935 represents an encoding feature.
Each of the individual features 931-935 has a component(s) that varies in a direction orthogonal to the projection/view plane. For example, feature 933 varies orthogonal to the projection plane such that three individual lines can be identified. By varying the thicknesses of the three individual lines, a unique pattern is associated with each of the features 931-935. For example, the bar code feature 933 varies orthogonally between no line, thin line, no line, thick line, no line, thin line, and no line. The individual lines of the feature 933 are projected parallel to the projection/view plane. Projecting lines parallel to the projection/view plane reduces, or eliminates, the viewed distortion effects of surface topology on the width of the lines. Therefore, because the viewed widths of the individual lines making up the feature 933 do not distort substantially, the thickness, or relative thickness, of each individual line of the feature 933 can be readily identified independent of surface topology. As a result, the feature 933 can be identified substantially independent of surface topology.
Figure 13 displays a specific embodiment of an image having five separate lines (measuring features) 431-435. The lines 431-435 illustrated have lengths that run substantially orthogonal to the projection/view plane, and are uniformly spaced from each other in a direction parallel to the projection/view plane. By providing a plurality of lines which are detectable in the direction parallel to the projection/view plane, multiple measuring lines can be viewed and analyzed simultaneously. In addition to the lines 431-435, five unique bar codes 931-935 are also illustrated. Each of the unique bar codes (variable features) 931-935 is associated with, and repeated along, a respective measuring feature 431-435. In other implementations, each bar code can be repeated along a measuring feature more than the two times illustrated. Note that the bar codes are illustrated as repeating sets. In other implementations, the bar codes would not need to be grouped in sets.
In a specific embodiment, the lines 431-435 and bar codes 931-935 are generated using visible light that is low-intensity, such that the pattern is eye-tolerant and skin-tolerant. For example, the lines 431-435 can be viewed as white lines, and the bar codes 931-935 can be viewed as specific colors or combinations of colors. In another embodiment, high-intensity or laser light can also be used depending upon the application.
By associating bar codes to specific lines in the manner illustrated, it is possible to distinguish lines from one another even when they appear to be linearly coincident. For example, the lines 432 and 433 appear to be a continuous line at the edge of object 101. However, the lines 432 and 433 can be distinguished from each other by analyzing the (encoding feature) barcodes associated with each line. In other words, where line 432 and line 433 appear to the viewer to be a common line, it can now be readily determined that they are two different lines because the bar code associated with line 432 on the left would not be the same as the bar code associated with line 433 on the right.
In the specific example illustrated in Figure 13, the analysis of the retrieved images would determine that there is a discontinuity somewhere between the left most bar code 932 and the right most bar code 933 causing the line segments 432 and 433 to appear as a common line. In a specific embodiment, the location of such an edge can be determined with greater precision by providing repeating bar code patterns in relatively close proximity to one another. For example, the edge where surface 102 meets surface 101 can be determined only to an accuracy equal to the spacing between adjacent bar codes. This is because when the analysis encounters what appears to be a single line having two different bar codes it is unknown where between the two bar codes the discontinuity has occurred. Therefore, by repeating the bar code more frequently along the measuring lines of Figure 13 the location of discontinuities can be more accurately identified.
The encoding features 931-935 of Figure 13 are non-repeating in that no two bar codes are the same. However, an encoding value, or sequence, can be repeated within a projected image as long as ambiguity is avoided. For example, if the image includes 60 lines (measuring features) using a binary encoding, 6 bits of data are needed to identify each line uniquely. However, due to the fact that the range of focus of the scanner is limited by the depth of field, each individual line of the 60 lines can show up as a recognizable image only within a certain range.
Figures 25 and 26 better illustrate how the depth of field affects the repeating of features. Figure 25 illustrates a projector projecting a SHAPE along a path 2540. When the SHAPE is projected onto a surface its image is reflected along a reflection path to a viewing device 2506. For example, reflection path 2544 results when the SHAPE is reflected off a surface at the location 2531, a reflection path 2541 results when the SHAPE is reflected off a surface at the location 2532, a reflection path 2542 results when the SHAPE is reflected off a surface at the location 2533, and a reflection path 2543 results when the SHAPE is reflected off a surface at the location 2534.
Figure 26 represents the SHAPE as the viewer 2506 would view it.
Specifically, the image reflected off of the surface 2531, which is the surface closest to the projector, is viewed as the right-most image in Figure 26, while the image reflected off of the surface 2534, which is the surface furthest from the projector, is viewed as the left-most image in Figure 26. However, it should be noted that the left-most and right-most images, which are furthest from and closest to the projector 2505, respectively, are out of focus. Because they are out of focus, they cannot be accurately detected based upon the image received by the viewing device 2506.
Referring back to Figure 25, any surface closer to the projection device 2505 than the plane 2525, or further from the projection device 2505 than the plane 2526, is not capable of reflecting a usable SHAPE because it is outside the viewable range 2610, or field of view. Therefore, the SHAPE can be repeated and still be uniquely identified, so long as the repeated SHAPE cannot be viewed within the range 2610 of Figure 26.
In a specific embodiment, a projector projects approximately 80 lines. Each of the 80 lines has a color-coded encoding sequence. For example, if three colors are used (red, blue, green), an encoding feature having three color locations could uniquely identify 27 different lines. This coding sequence of 27 lines can be repeated three times to cover all 80 lines, provided the field of view is such that lines having the same encoding cannot be viewed at the same location. In another embodiment, five color locations can be added, with or without increasing the number of lines in a sequence, to provide recognition capability where a specific color location may be lost.
This means that coding features may be repeated, as long as the fields of view in which each of the repeating features may be viewed do not overlap. Thus, a sequence of 12 unique encoding features, requiring only four bits of binary data, can be repeated five times to encode all 60 lines, provided there is no chance for features to be viewed at the same location.
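The relationship between the number of code symbols, the code length, and the number of lines that can be uniquely identified, together with the constraint that repeated codes must not be viewable within the same field of view, can be illustrated with the short sketch below. The sketch is a minimal Python illustration; the helper name, the color names, and the lines_per_view parameter are assumptions for illustration and are not taken from the disclosure.

from itertools import product

def assign_line_codes(num_lines, symbols, code_length, lines_per_view):
    """Assign a repeating sequence of codes to projected lines.

    symbols        -- e.g. ("red", "blue", "green")
    code_length    -- number of color locations per encoding feature
    lines_per_view -- maximum number of adjacent lines visible within the field of
                      view (depth-of-field limit); identical codes must be spaced
                      farther apart than this number of lines.
    """
    codes = list(product(symbols, repeat=code_length))   # 3 symbols, length 3 -> 27 codes
    period = len(codes)
    if period < lines_per_view:
        raise ValueError("repeated codes could appear inside one field of view")
    # Repeat the code sequence along the full set of projected lines.
    return [codes[i % period] for i in range(num_lines)]

# 80 lines, 3 colors, 3 color locations: 27 unique codes repeated roughly 3 times.
line_codes = assign_line_codes(80, ("red", "blue", "green"), 3, lines_per_view=20)
print(len(set(line_codes)), "unique codes for", len(line_codes), "lines")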
By providing a pattern having a large number of measuring features with associated coding features, reference independent scanning is achieved. Specifically, neither the object nor the scanner needs to be fixed in space, nor with reference to each other. Instead, on a frame-by-frame basis, the reference independent scanner retrieves enough measuring information (a 3D cloud), which is accurate due to the encoding feature, to permit registration to its adjacent frame. Registration is the process which determines the overlapping features on adjacent frames to form an integrated map of the object.
Figure 14 illustrates the object of Figure 13 whereby the measuring lines 431-435 have varying thicknesses. However, the thickness of lines 431-435 is subject to distortion, thereby making identification of the individual lines 431-435 based upon their thickness alone prone to error. This is better illustrated with reference to Figure 15.
Figure 15 represents the object 700 of Figures 8 and 9 having a pattern in accordance with the present invention projected upon its surface. Figure 15 illustrates the projection of lines 721-723 having varying widths. As previously discussed, the lines 722 and 723, when projected onto the surfaces 711 and 712 respectively, appear to have the same line thickness as line 721. Therefore, merely having measuring lines of varying thickness will not allow an analysis of the images to determine which line is which. However, by further incorporating the encoding features 451-453, such that they have a component that varies orthogonal to the projection view plane, identification of the lines 721-723, and the subsequent mapping analysis, is improved over the prior art.
One skilled in the art will recognize that the specific implementations illustrated, whereby an encoding feature is projected to have a portion perpendicular to a projection/view plane, are advantageous over the prior art in that they allow an analysis of the received images to more accurately identify specific lines associated with the pattern. One skilled in the art will further recognize and understand that the specific implementation described herein has been described with reference to lines and bar codes. However, other patterns, shapes and features can also be used.
Referring to Figure 16, a table is illustrated identifying a specific set of shapes used in a direction orthogonal to the projection/view plane. Column 1 of the table represents unique feature identifiers. Columns 2-4 of the table illustrate specific manners in which each feature identifier can be represented. Column 2 indicates bar codes. Column 3 indicates colors capable of being used either alone or with other encoding features. Note that some types of encoding features, including color features, can be implemented as an integral part of a measuring feature as well as an encoding feature separate from the measuring feature. Likewise, other types of encoding can be based upon the intensity at which a measuring feature and/or its encoding feature is projected. Column 4 represents patterns that can be utilized either independently from the shape to identify the shape, or in combination as part of a shape. In other words, a line comprising a repeating pattern sequence of the type illustrated in Column 4 can be provided. In this manner, the change of pattern in a direction orthogonal to the projection/view plane can be relative to the actual shape itself. In addition, one of ordinary skill in the art will recognize that many variations of the variable components are anticipated by the present invention.
Figure 17 illustrates, in tabular form, the use of unique non-repeating identifiers for each line. For example, referring to the first row of Figure 17, the sequence 0-F is presented sequentially. In one implementation, each of the values from 0 through F will represent a unique code associated with a specific line. One skilled in the art will recognize that in order to identify the specific codes, some type of spacer may need to exist between each individual code. For example, a long space, or a unique code, can be used.
In a system used to project and analyze four lines, each with one of the sequences illustrated in Figure 17, it would be possible to identify which one of the four lines is being analyzed once a sequence of three codes has been retrieved. Generally, because the codes vary orthogonal to the projection/view plane missing codes may not pose a problem of misidentification.
Figure 18 illustrates four unique repeating code sequences. The letter S in table 18 is utilized to represent a spacer used between repeating sequences. A spacer can be some unique identifier specifying where each of the repeating codes of the encoding sequence begins and/or ends.
Returning to the flow of Figure 12, once the image has been projected having an encoding feature orthogonal to the projection/view plane, a representation of the surface image is received at a viewer. This is analogous to the discussion of Figure 10 whereby the viewer 970 receives the reflected image. Next, at step 613, the location of a point associated with an object is determined based upon the orthogonally varying feature. In a specific embodiment of the present invention, the point is based upon the variable component because each one of the shapes, e.g., lines, is qualified by a unique code pattern prior to being used for object analysis.
Figure 19 illustrates sub-steps to be associated with step 611 of Figure 12. At step 621, a first image is projected, while at step 622 a second feature is projected. Referring to Figure 14, the first image can be analogous to the combination of the measuring line 431 and its associated encoding features 931. In a similar manner, the second feature could be represented by the combination of the measuring line 432 and its encoding features 932. Note that in addition to being able to analyze line 431 with respect to the features 931, it would also be possible in another embodiment to determine the identity of line 431 based upon the encoding features 932. In other words, a specific line in a group of lines, such as illustrated in Figure 14, can be identified based on more than one of the various encoding features. However, in a specific embodiment, only the adjacent set of encoding features, or adjacent sets of encoding features, would be utilized. In addition, steps 621 and 622 can occur at different times as discussed with reference to Figure 23.
Figure 21 illustrates another method in accordance with the present invention. At step 631, a plurality of first features, and a plurality of second features are projected. These features may be projected simultaneously, or at separate locations.
At step 632, one of the plurality of first features is determined, or identified, based upon the second features. Referring to Figure 14, the plurality of first features would include the lines measuring 431-435. By utilizing the second features, the bar code 931-935, a specific one of the lines 431-435 can be identified.
At step 633, the location of a point at the surface is determined based upon the specific one of the plurality of parallel first features.
This specific embodiment is advantageous over the prior art in that a line identified by the analysis of the received shape is not utilized until its identity is verified based upon the encoding information.
Figure 22 illustrates another method in accordance with the present invention. At step 641 parallel first and second discrete shapes are projected. Examples of such discrete shapes would include the lines 431 and 432 of Figure 14. However, one of ordinary skill in the art will recognize that a variety of other parallel shapes could be projected.
At step 642, an encoding feature relative to the first discrete shape is projected. Again, referring to Figure 14, the encoding feature relative to the line 432 could include the encoding feature 932 or even an encoding feature 933. At step 643, an encoding feature relative to the second discrete shape is projected.
At step 644 the first discrete shape is identified based upon the first encoding feature. This is accomplished in a manner similar to as discussed previously.
At step 645 a location of a specific point of an object is determined based upon the first discrete shape.
Figure 23 illustrates another embodiment of the present invention. Specifically, Figure 23 illustrates a series of images projected at times T1, T2, T3 and T4. At time T1, the image projected includes measuring features 1011 through 1013. During time T1, no encoding feature is projected. During time T2, an image containing encoding features 1021-1023 is projected. The patterns of times T1 and T2 are repeated during times T3 and T4, respectively. The result of alternating the projection of encoding and measuring features is that denser patterns can be used, allowing for more information to be obtained. Note that the image of time T4 shows the encoding features 1021-1023 overlying the measuring features 1011-1013. However, in one embodiment, the measuring features have been included for illustration purposes only, and would not generally be present at the same time as the encoding features.
In yet another embodiment of the present invention, Figure 24 illustrates an image having features with different characteristics. Specifically, Figure 24 illustrates an image 1100 having lines 1131 through 1134 with a distance X between the individual lines, while the lines 1134, 1135, and 1136 are separated by a substantially greater distance Y. By allowing for features having different isolation characteristics, it is possible to provide for a high-resolution feature. In other words, the line 1135 can be used to map surface features that otherwise may not be mappable. Note that the pattern 1100 could be used with or without the coding techniques described herein.
Once a scanner receives, or views, a projected frame pattern, the frame pattern is digitized into a plurality of 2D points (2D image frame). Because the projection and view axis of the scanner are fixed and known, each 2D point of the 2D image frame can be converted into a 3D point using conventional 3D imaging techniques, provided each 2D point of the 2D image frame can be correlated to a projected point. The use of a projected frame pattern that has encoding features enables correlation of the points of the 2D image to a respective projected point.
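A generic sketch of this 2D-to-3D conversion is given below. It assumes a simple pinhole camera model and a projector whose decoded pattern line defines a plane in space; intersecting the camera ray through a 2D image point with that plane yields the 3D surface point. The calibration numbers shown are placeholders, not parameters of the disclosed scanner.

import numpy as np

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Camera ray (unit direction) through pixel (u, v) for a pinhole camera."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def triangulate_point(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect the viewing ray with the plane defined by a projected pattern line."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the projected light plane; no usable intersection
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    return ray_origin + t * ray_dir

# Placeholder calibration: camera at the origin, light plane of one decoded line.
camera_origin = np.zeros(3)
ray = pixel_to_ray(412.0, 300.0, fx=800.0, fy=800.0, cx=320.0, cy=240.0)
plane_point = np.array([0.0, 0.0, 50.0])        # a point on the projected light plane
plane_normal = np.array([0.22, 0.0, -0.97])     # plane orientation fixed by calibration
point_3d = triangulate_point(camera_origin, ray, plane_point, plane_normal)
print(point_3d)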
Multi-frame reference independent scanning is described herein in accordance with another aspect of the present disclosure. In a specific embodiment, multiple 3D image frames are received by using a hand-held scanner to scan an object one frame at a time to obtain a plurality of frames, where each frame captures only a portion of the object. With reference to multiple frames, reference independent scanning has a spatial position that is frame-by-frame variable relative to the object being scanned, and whose spatial position is not fixed, or tracked, relative to a reference point. For example, there is no fixed reference point relative to the object being scanned. One type of reference independent scanner disclosed herein includes a hand-held scanner that projects a pattern in successive frames having measuring features and encoding features. This allows each viewed point of a frame to have a known corresponding projected point, thereby enabling the 2D frame data to be converted into 3D frame data.
Figures 27-28 are used to discuss multiple frame reference independent scanning.
Figures 27, 28, and 30 illustrate an object 2700 from different points of view. As illustrated in Figure 27, the object 2700 includes three teeth 2710, 2720, and 2730, and a gum portion 2740 that is adjacent to the three teeth.
The Figure 27 point-of-view is such that a plurality of non continuous surface portions are viewed. For example, from the Figure 27 point-of-view three noncontiguous surface portions 2711-2713 are viewed. The surface portion 2713 represents a side portion of the tooth 2710. The surface portion 2711 represents a portion of the tooth 2710 biting surface that is not continuous with surface portion 2713. The surface portion 2712 represents another portion of the tooth 2710 biting surface that is not continuous with either portion 2711 or 2713. In a similar manner, tooth 2720 has four surface portions 2721-2724, and tooth 2730 has four surface portions 2731-2734.
Figure 28 illustrates the object 2700 from a slightly different point-of-view (the Figure 28 point-of-view). The point-of-view change from Figure 27 to Figure 28 is the result of the viewer, i.e., the scanner, moving in a direction that allows a greater portion of the upper teeth surfaces to be viewed. The change in point-of-view has resulted in variations to a plurality of viewed surface portions. With respect to tooth 2710, tooth portion 2813 now represents a smaller 2D surface than did its corresponding tooth portion 2713, while tooth portions 2811 and 2812 are now viewed as larger 2D surfaces than their corresponding portions 2711 and 2712 of Figure 27.
With respect to tooth 2720, surface 2824 is now viewed as a smaller 2D surface than its corresponding tooth surface 2724 of Figure 27. Also with respect to tooth 2720, tooth surface 2821 represents a continuously viewed tooth surface that includes both of the surfaces 2721 and 2723 from the Figure 27 point-of-view.
With respect to tooth 2730, the viewed 2D surfaces 2832 and 2835 each include portions of surface 2732 and previously unviewed surface area. This is the result of a topographical feature of the tooth 2730, which resulted in the inability of the surface 2732 to be viewed continuously from the second frame point-of-view.
The relationship of the tooth portions of Figure 27 to the tooth portions of Figure 28 is better understood with reference to Figure 29. Specifically, Figure 29 is from the same point-of-view as Figure 28, with the viewed surface portions of Figure 27 indicated as shaded areas. For example, surface portion 2711 of Figure 27 is represented as a shaded portion within the surface portion 2811. As illustrated, the change in the point-of-view between Figure 27 and Figure 28 results in a viewed surface portion 2811 that encompasses the smaller viewed surface portion 2711. Likewise, the change in perspective has resulted in different surface portions being viewed.
Figure 30 illustrates the object 2700 from another point-of-view. Specifically, the Figure 30 point-of-view is from directly over the teeth 2710-2730. Superimposed onto Figure 30 are the viewed surface portions of Figure 28. The object 2700 illustrated in Figure 27 will be referenced further herein to describe a specific embodiment of multi-frame reference independent scanning.
Figure 31 illustrates a method 3100 in accordance with a specific embodiment of reference independent scanning. At step 3101 the object is scanned to obtain a 2D cloud of data. The 2D cloud of data includes a plurality of frames. Each of the frames has a plurality of 2D points, which, if viewed, would represent a 2D image.
At step 3102, a first frame of the 2D cloud of data is converted to a 3D frame model. In one embodiment, a 3D frame model is a 3D point model, which includes a plurality of points in three-dimensional space. The actual conversion to a 3D frame point model is performed on some or all of the frame's 2D cloud of data using conventional techniques for converting a scanned 2D cloud of data into a 3D point model. In a specific embodiment using encoding features, as disclosed herein, objects with non-continuous viewed surfaces, such as the teeth 2710-2730 of Figure 27, can be successfully scanned frame-by-frame. Figures 32 and 33 further illustrate the object 2700 being scanned from the Figure 27 and Figure 28 points-of-view, respectively. In Figure 32, the scan pattern includes scan lines 3221-3223. Any scan line portion outside the frame boundary 3210 is not capable of being properly scanned. Within the boundary 3210, each scan line, when sensed at the CCD (charge-coupled device) chip of the scanner, is converted to a plurality of 2D points (cloud of data). Some or all points of a scan line can be used in accordance with the present invention. For example, every other, or every third, point of a scan line can be used depending upon the desired resolution of a final 3D model. Figure 32 illustrates four points (A-D) of each line being identified. A 2D coordinate value, such as an X-Y coordinate, is determined for each of these points.
In a specific embodiment of scanning, a scan rate of 1 to 20 frames per second is used. Greater scan rates can be used. In a specific embodiment, the scan rate is chosen to allow for real-time viewing of a three-dimensional image. The pulse time during which each frame is captured is a function of the speed at which the scanner is expected to be moving. For dentition structures, a maximum pulse width has been determined to be approximately 140 microseconds, although much faster pulse widths, i.e., 3 microseconds, are likely to be used. In addition, in a specific embodiment, the teeth 2710-2730 are coated with a substance that results in a surface that is more opaque than the teeth themselves.
In a specific embodiment, each point of the cloud of data is analyzed during the various steps and functions described herein. In another embodiment, only a portion of the cloud of data may be analyzed. For example, it may be determined that only every third or fourth point needs to be analyzed for a desired resolution to be met. In another embodiment, a portion of the frame data can be a bounding box that is smaller than the entire frame of data, such that only a specific spatial portion of the cloud of data is used; for example, only a center portion of the cloud of data is included within the bounding box. By using a subset of the cloud of data, it is possible to increase the speed of various routines described herein.
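A minimal sketch of these two data-reduction strategies, keeping only every Nth point and restricting the analysis to a centered bounding box, is given below; the array layout and the parameter values are illustrative assumptions only.

import numpy as np

def reduce_frame(points_2d, step=3, box_fraction=0.6):
    """Keep every `step`-th 2D point and then clip to a centered bounding box.

    points_2d    -- (N, 2) array of x, y coordinates from one frame
    step         -- use only every 3rd (or 4th, ...) point for the target resolution
    box_fraction -- fraction of the frame extent kept around the frame center
    """
    pts = np.asarray(points_2d, dtype=float)[::step]          # subsample
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    center, half = (lo + hi) / 2.0, (hi - lo) * box_fraction / 2.0
    inside = np.all(np.abs(pts - center) <= half, axis=1)     # central bounding box
    return pts[inside]

frame = np.random.rand(5000, 2) * [640, 480]   # stand-in for one frame's 2D cloud of data
print(reduce_frame(frame).shape)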
Figure 33 illustrates the object 2700 being scanned from the Figure 28 point of view. As such, the viewed pattern including lines 3321-3323 are positioned differently on the teeth 2710-2730. In addition, the frame boundary 3310 has moved to include most of the tooth 2720.
Figure 34 illustrates another embodiment of a 3D frame model referred to herein as a 3D primitive model. A 3D primitive model includes a plurality of primitive shapes based upon the frame's 3D points. In the specific embodiment illustrated adjacent points from the 3D point model are selected to form triangles, including triangle PS1-PS3 as primitive shapes. Other implementations can use different or varied primitive shapes.
The use of primitive shapes to perform registration is advantageous over registration techniques that attempt to get the points of two point clouds as close as possible to each other, because using a primitive surface representation of one of the point clouds allows a lower resolution model to be used, resulting in a faster registration without the disadvantage of undesirable offset error. For example, if a scan resolution of 1 mm is used for point-to-point registration, the best guaranteed alignment between two frames is 0.5 mm. This is due to the fact that the hand-held scanner randomly captures which points of the surface are mapped. Using point-to-surface registration provides a more accurate result, since the registration can occur to any point of the surface, not just the vertices.
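Point-to-surface registration relies on finding, for a frame point, the closest point anywhere on a triangle of the cumulative surface rather than only the nearest vertex. One standard geometric routine for that computation is sketched below; it is a general-purpose formulation, not code taken from the disclosure.

import numpy as np

def closest_point_on_triangle(p, a, b, c):
    """Closest point to p on triangle (a, b, c); standard region-based test."""
    ab, ac, ap = b - a, c - a, p - a
    d1, d2 = ab @ ap, ac @ ap
    if d1 <= 0 and d2 <= 0:
        return a                                  # closest to vertex a
    bp = p - b
    d3, d4 = ab @ bp, ac @ bp
    if d3 >= 0 and d4 <= d3:
        return b                                  # closest to vertex b
    vc = d1 * d4 - d3 * d2
    if vc <= 0 and d1 >= 0 and d3 <= 0:
        return a + (d1 / (d1 - d3)) * ab          # closest to edge ab
    cp = p - c
    d5, d6 = ab @ cp, ac @ cp
    if d6 >= 0 and d5 <= d6:
        return c                                  # closest to vertex c
    vb = d5 * d2 - d1 * d6
    if vb <= 0 and d2 >= 0 and d6 <= 0:
        return a + (d2 / (d2 - d6)) * ac          # closest to edge ac
    va = d3 * d6 - d5 * d4
    if va <= 0 and (d4 - d3) >= 0 and (d5 - d6) >= 0:
        return b + ((d4 - d3) / ((d4 - d3) + (d5 - d6))) * (c - b)   # closest to edge bc
    denom = va + vb + vc                          # projection falls inside the triangle
    return a + ab * (vb / denom) + ac * (vc / denom)

p = np.array([0.4, 0.4, 1.0])
tri = (np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
print(closest_point_on_triangle(p, *tri))   # projects onto the triangle's interior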
At step 3103 of Figure 31, a second 3D frame model is generated from the second frame of the cloud of data. Depending upon the specific implementation, the second 3D frame model may be a point model or a primitive model.
At step 3104 a registration is performed between the first frame model and the second frame model to generate a cumulative model. "Registration" refers to the process of aligning the first model to the second model to determine a best fit by using those portions of the second model which overlap the first model. Those portions of the second model that do not overlap the first model are portions of the scanned object not yet mapped, and are added to the first model to create a cumulative model. Registration is better understood with reference to the method of Figure 35.
Figure 35 includes a registration method 3500 that, in a specific embodiment, would be called by one of the registration steps of Figure 31. At step 3501 of Figure 35 an entry point into registration is determined. The entry point into registration defines an initial guess of the alignment of the overlapping portions of the two models. The specific embodiment of choosing an entry point will be discussed in greater detail with reference to Figure 36. At step 3502, a registration of the two shapes is attempted. If an overlap is detected meeting a defined closeness of fit, or quality, the registration is successful. When the registration is successful, the flow returns to the calling step of Figure 31. When a registration is not successful, the flow proceeds to step 3598 where a decision whether to continue is made.
A decision to continue can be made based on a number of factors. In one embodiment, the decision to continue is made based upon the number of registration entry points that have been tried. If the decision at step 3598 is quit registration attempts, the flow proceeds to step 3503 where registration error handling occurs. Otherwise the flow continues at step 3501.
Figure 36 illustrates a specific method for choosing a registration entry point. At step 3699 a determination is made whether this is the first entry point for a specific registration attempt of a new frame. If so, the flow proceeds to step 3601; otherwise, the flow proceeds to step 3698.
At step 3601 the X and Y components of the entry point are determined based upon a two-dimensional analysis of the 2D cloud of data for each of the two frames. In a specific embodiment, the two-dimensional analysis performs a cross-correlation of the 2D images. These 2D images do not have to be from the 2D cloud of data; instead, data associated with a plain video image of the object, with no pattern, can be used for the cross-correlation. In this way, a probable movement of the scanner can be determined. For example, the cross-correlation is used to determine how the pixels have moved, and thus how the scanner has probably been moved. In another embodiment, a rotational analysis is possible; however, for a specific embodiment this is not done because it tends to be time consuming, and having the correct entry point in the X- and Y-coordinate directions allows the registration algorithm described herein to handle rotations.
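A minimal sketch of such a two-dimensional cross-correlation is shown below, using an FFT-based correlation of two plain video frames to estimate the most probable in-plane shift of the scanner; the image sizes and the helper name are illustrative assumptions.

import numpy as np

def estimate_xy_shift(image_a, image_b):
    """Estimate the (dx, dy) pixel shift from frame a to frame b by cross-correlation.

    The peak of the FFT-based cross-correlation indicates the most probable
    in-plane movement of the scanner between the two frames, which can then be
    scaled into an X/Y entry point for registration.
    """
    a = image_a - image_a.mean()
    b = image_b - image_b.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shape = np.array(corr.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]   # wrap to signed displacements
    dy, dx = peak
    return dx, dy

a = np.random.rand(128, 128)
b = np.roll(a, shift=(5, -3), axis=(0, 1))   # frame b is frame a moved 5 rows down, 3 columns left
print(estimate_xy_shift(a, b))               # approximately (-3.0, 5.0)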
At step 3602, a probable movement in the Z direction is determined.
In one specific embodiment, the previous frame's Z-coordinate is used, and any change in the Z-direction is calculated as part of the registration. In another embodiment, a probable Z-coordinate is calculated as part of the entry point. For example, the optical parameters of the system can be used to "zoom" the second frame in relation to the first one until the best fit is obtained. The zoom factor that achieves the best fit indicates how far the two surfaces are from each other in Z. In a specific embodiment, the X, Y and Z coordinates can be aligned so that the Z-coordinate is roughly parallel to the view axis.
At step 3606, the entry point value is returned.
At step 3698 a determination is made whether all entry point variations have been tried for the registration steps 3601 and 3602. If not, the flow proceeds to step 3603; otherwise, the flow proceeds to step 3697.
At step 3603 the next entry point variation is selected. Figure 37 illustrates a specific method for selecting the registration entry point variations. Specifically, Figure 37 illustrates the initial entry point E1 and subsequent entry points E2-E9. The entry points E2-E9 are selected sequentially in any predetermined order. The specific embodiment of Figure 37 illustrates the registration entry points E2-E9 as various points of a circle 3720 having a radius 3710. In accordance with a specific embodiment, the dimensions of the entry point variations are two-dimensional, for example the X and Y dimensions. In other embodiments, the entry points can vary in three dimensions. Note that varying numbers of entry points, i.e., subsets of entry points, can be used to speed up the registration process. For example, single frame registration as used herein could use fewer than the nine entry points indicated. Likewise, cumulative registration, described herein, could benefit by using more than the nine points illustrated.
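The circular arrangement of candidate entry points can be generated as in the short sketch below; the radius and the number of candidates are illustrative assumptions, and the first value corresponds to the initial entry point E1.

import math

def entry_point_variations(x0, y0, radius=1.0, count=8):
    """Yield the initial entry point followed by `count` points on a circle around it."""
    yield (x0, y0)                                  # E1: initial guess from cross-correlation
    for i in range(count):                          # E2..E9: evenly spaced on the circle
        angle = 2.0 * math.pi * i / count
        yield (x0 + radius * math.cos(angle), y0 + radius * math.sin(angle))

for e in entry_point_variations(12.5, -3.0, radius=0.8):
    print(e)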
Returning to step 3698 of Figure 36, the flow proceeds to step 3697 once all variations of the first identified entry point have been tried. At step 3697, all entry points associated with the first identified entry point have been tried, and it is determined whether a second identified entry point has been identified by step 3604. If not, the flow proceeds to step 3604 where the second entry point is defined. Specifically, at step 3604 the scanner movement between two previous frame models is determined. Next, an assumption is made that the scanner movement is constant for at least one additional frame. Using these assumptions, the entry point at step 3604 is defined to be the location of the previous frame plus the calculated scanner movement. The flow proceeds to step 3606, which returns the entry point to the calling step of Figure 31. In another embodiment, an assumption can be made that the direction of the scanner movement remained the same but that it accelerated at a different rate. If the second identified entry point of step 3604 has been previously determined, the flow from step 3697 will proceed to step 3696. At step 3696, a determination is made whether an additional registration entry point variation for the second identified entry point exists. If so, the flow proceeds to step 3605; otherwise, the flow returns to the calling step of Figure 31 at step 3607 and indicates that selection of a new entry point was unsuccessful. At step 3605 the next entry point variation of the second identified entry point is identified and the flow returns to the calling step of Figure 31.
Different entry point routines can be used depending upon the type of registration being performed. For example, for a registration process that is not tolerant of breaks in frame data, it will be necessary to try more entry points before discarding a specific frame. For a registration process that is tolerant of breaks in frame data, simpler or fewer entry points can be attempted, thereby speeding up the registration process.
Returning to Figure 31, at step 3105 the next 3D model portion is generated from the next frame's cloud of data.
At step 3106, registration is performed between the next 3D model portion and the cumulative model to update the cumulative model. In a specific implementation, the cumulative model is updated by adding all the new points from the frame to the existing cumulative model to arrive at a new cumulative model. In other implementations, a new surface can be stored that is based on the 3D points acquired so far, thereby reducing the amount of data stored. If all frames have been registered, the method 3100 is completed; otherwise, the flow proceeds through steps 3105 to 3199 until each frame's cloud of points has been registered. As a result of the registration process described in method 3100, it is possible to develop a model for the object 2700 from a plurality of smaller frames, such as frames 3210 and 3310. By being able to register a plurality of frames, highly accurate models of large objects can be obtained. For example, a model of a patient's entire dentition structure, including gums, teeth, and orthodontic and prosthetic structures, can be obtained. In another embodiment, a model of the patient's face can be obtained.
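At a high level, the loop of steps 3104-3106 can be summarized as in the sketch below. The register helper is a hypothetical stand-in for the registration of Figure 35, and the frames are assumed to have already been converted to 3D point arrays.

import numpy as np

def build_cumulative_model(frames_3d, register):
    """Register each 3D frame into a growing cumulative point model (steps 3104-3106).

    frames_3d -- list of (N, 3) arrays, one per frame, already converted from 2D
    register  -- hypothetical helper returning (success, aligned_points) for a frame
                 registered against the current cumulative model
    """
    cumulative = np.array(frames_3d[0], dtype=float)     # the first frame seeds the model
    for frame in frames_3d[1:]:
        success, aligned = register(frame, cumulative)
        if not success:
            continue                                     # left to registration error handling
        cumulative = np.vstack([cumulative, aligned])    # new points extend the model
    return cumulative

# Trivial stand-in registration: accept every frame unchanged (assumption for illustration).
frames = [np.random.rand(50, 3) for _ in range(4)]
model = build_cumulative_model(frames, lambda f, cum: (True, f))
print(model.shape)   # (200, 3)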
Figure 38 illustrates a method 3800, which is an alternate method of registering an object using a plurality of frames from a reference independent scanner. Specifically, at step 3801 the object is scanned to receive a cloud of data for the object. As previously described, the cloud of data includes data from a plurality of frames, with each frame including a plurality of points.
At step 3802 a single frame registration is performed. A single frame registration performs a registration between adjacent frames of the scanned image without generating a cumulative model. Instead, in a specific implementation, a cumulative image of the single frame registration process is displayed. The image formed by the single frame registration process can be used to assist in the scanning process. For example, the image displayed as a result of the single frame registration, while not as accurate as a cumulative model, can be used by the scanner's operator to determine areas where additional scanning is needed. The single frame registration process is such that any error introduced between any two frames is "extended" to all subsequent frames of a 3D model generated using single frame registration. However, the level of accuracy is adequate to assist an operator during the scanning process. For example, the registration results, which describe the movement from one frame to another, can be used as an entry point for the cumulative registration process. Single frame registration is discussed in greater detail with reference to Figure 39.
At step 3803, a cumulative registration is performed. The cumulative registration creates a cumulative 3D model by registering each new frame into the cumulative model. For example, if 1000 individual frames were captured at step 3801, representing 1000 reference independent 3D model portions (frames), the cumulative registration step 3803 would combine the 1000 reference independent 3D model portions into a single cumulative 3D model representing the object. For example, where each of the 1000 reference independent 3D model portions represents a portion of one or more teeth, including frames 3210 and 3310 of Figures 32 and 33, the single cumulative 3D model will represent an entire set of teeth including teeth 2710-2730.
At step 3804, the results of the registration are reported. This will be discussed in further detail below.
Figure 39 describes a method 3900 that is specific to a single frame rendering implementation for step 3802 of Figure 38. At step 3903 a variable x is set equal to 2. At step 3904 a registration between the current frame (3DFx) and the immediately, or first, previous adjacent frame (3DFx-1) is performed.
Registration between two frames is referred to as single frame registration. A specific embodiment of registration between two models is discussed in greater detail with reference to the method illustrated in Figure 40.
At step 3999 it is determined whether or not the single frame registration of step 3904 was successful. In a specific implementation, a registration method, such as the method of Figure 40, provides a success indicator which is evaluated at step 3999. The flow proceeds to step 3905 when registration is successful, otherwise the flow proceeds to step 3907.
The flow proceeds to step 3905 when it is determined at step 3999 that the registration was successful. At step 3905 the current 3D frame (3DFx) is added to the current frame set of 3D frames. Note that this set will generally be a set of transformation matrices. The current frame set of 3D frames is a sequential set of frames, where each frame in the sequence has a high degree of likelihood of being successfully registered with both of its two adjacent frames. In addition, the newly registered frame can be displayed relative to the previous frame that is already being displayed.
At step 3998 a determination is made whether the variable x has a value equal to n, where n is the total number of frames to be evaluated. If x is equal to n, single frame registration is complete and the flow can return to Figure 38 at step 3910. If x is less than n, single frame registration continues at step 3906, where x is incremented before proceeding to step 3904. Returning to step 3999, the flow proceeds to step 3907 if the registration of step 3904 was not successful. At step 3907 a registration is attempted between the current frame (3DFx) and the second previously adjacent frame (3DFx-2). Step 3997 directs the flow to step 3905 if the registration of step 3907 was successful. Otherwise, step 3997 directs the flow to step 3908, thereby indicating an unsuccessful registration of the current frame (3DFx).
When the current frame cannot be registered, step 3908 saves the current frame set, i.e., the set of matrices, and a new current frame set is begun. Flow from step 3908 proceeds to step 3905 where the current frame is added to the current frame set, which was newly created at step 3908. Therefore, it is possible for the single frame registration step 3802 to identify multiple frame sets.
Generation of multiple frame sets during cumulative registration is not desirable due to the amount of intervention required to reconcile multiple cumulative models. However, breaks in single frame registration are generally acceptable because the purpose of single frame registration is to assist the operator and define entry points for cumulative registration. One method of dealing with breaks during single frame registration is to merely display the first frame after the break at the same location as the last frame before the break, thereby allowing the operator to continue to view an image.
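The control flow of steps 3903-3908 (register against the previous frame, fall back to the second previous frame, and begin a new frame set on failure) can be expressed compactly as in the sketch below; register_pair is a hypothetical stand-in for the two-frame registration of Figure 40, and the toy frames in the usage example exist only to exercise the flow.

def single_frame_registration(frames, register_pair):
    """Group frames into sets in which each frame registers to an adjacent frame.

    frames        -- ordered list of 3D frame models
    register_pair -- hypothetical helper returning (success, transform) for two frames
    """
    frame_sets = [[frames[0]]]                       # current frame set starts with frame 1
    for x in range(1, len(frames)):
        ok, _ = register_pair(frames[x], frames[x - 1])        # step 3904
        if not ok and x >= 2:
            ok, _ = register_pair(frames[x], frames[x - 2])    # step 3907: second previous frame
        if not ok:
            frame_sets.append([])                    # step 3908: save the set, begin a new one
        frame_sets[-1].append(frames[x])             # step 3905: add frame to the current set
    return frame_sets

# Illustrative run: frames are labels, and every third pairing "fails" to register.
result = single_frame_registration(
    list("ABCDEFG"),
    lambda a, b: ((ord(a) - ord("A")) % 3 != 0, None),
)
print([len(s) for s in result])   # [3, 3, 1]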
In accordance with step 4001 of Figure 40, a first model is a 3D primitive shape model, while the second model is a 3D point model. For reference purposes, the primitive shapes in the first 3D model are referenced as S1..Sn, where n is the total number of shapes in the first model, and the points in the second 3D model are referenced as P1..Pz, where z is the total number of points in the second model.
At step 4002, each individual point of the second model P1..Pz is analyzed to determine the shape closest to its location. In a specific embodiment, for a point P1, the closest shape among S1..Sn is the shape having a surface location closer to P1 than any other surface location of any other shape. The shape closest to point P1 is referred to as Sc1, while the shape closest to point Pz is referred to as Scz.
In another embodiment, only points that are located directly above or below a triangle are associated with that triangle, and points that are not located directly above or below a triangle surface are associated with a line formed between two triangles, or with a point formed by multiple triangles. Note that, in a broad sense, the lines that form the triangles and the points forming the corner points of the triangles can also be regarded as shapes.
At step 4003, vectors D1..Dz are calculated for each of the points P1..Pz. In a specific implementation, each vector, for example D1, has a magnitude and direction defined by the minimum distance from its corresponding point, for example P1, to the closest point of its closest shape, for example Sc1. Generally, only a portion of the points P1..Pz overlap the cumulative image. The non-overlapping points, which are not needed for registration, have an associated vector with a comparatively larger magnitude than that of an overlapping point, or may not reside directly above or below a specific triangle. Therefore, in a specific embodiment, only those vectors having a magnitude less than a predefined value (an epsilon value) are used for further registration.
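A minimal sketch of this vector calculation and epsilon filter follows. It assumes the primitive shapes can be approximated by a set of sample vertices; the names point_vectors, shape_vertices, and epsilon are illustrative, and a full implementation would use point-to-triangle distances rather than point-to-vertex distances.

```python
import numpy as np

def point_vectors(points, shape_vertices, epsilon):
    """Sketch of steps 4002-4003: for each point of the second model, find the nearest
    location on the first (primitive) model and keep only vectors whose magnitude is
    below the epsilon value. Shapes are approximated here by sample vertices."""
    kept_points, kept_vectors = [], []
    for p in points:
        deltas = shape_vertices - p          # vectors from the point to every candidate location
        dists = np.linalg.norm(deltas, axis=1)
        i = int(np.argmin(dists))            # index of the closest shape location (Sc)
        if dists[i] < epsilon:               # discard likely non-overlapping points
            kept_points.append(p)
            kept_vectors.append(deltas[i])   # D vector: from the point toward the shape
    return np.array(kept_points), np.array(kept_vectors)
```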
In addition to eliminating points that are not likely to be overlapping points, the use of epsilon values can further reduce the risk of decoding errors. For example, if one of the measuring lines of the pattern is misinterpreted to be a different line, the misinterpretation can result in a large error in the Z-direction. For a typical distance between adjacent pattern lines of approximately 0.3 mm and an angle of triangulation of approximately 13°, an error in the X-direction of 0.3 mm results in a three-dimensional transformation error of approximately 1.3 mm (0.3 mm / tan 13°) in the Z-direction. If the epsilon distance is kept below 0.5 mm, we can be sure that there is no influence from surface areas farther away from each other than 0.5 mm. Note that in a specific embodiment, the epsilon value is first selected to be a value greater than 0.5 mm, such as 2.0 mm, and after reaching a certain quality the value is reduced.
At step 4004, in a specific embodiment, the vectors D1..Dz are treated as spring forces to determine movement of the second 3D model frame. In a specific embodiment, the second 3D model is moved in a linear direction defined by the sum of all force vectors D1..Dz divided by the number of vectors.
At step 4005, the vectors D1..Dz are recalculated for each point of the second 3D model.
At step 4006, the vectors D1..Dz are treated as spring forces to determine movement of the second 3D model. For a specific embodiment of step 4006, the second 3D model frame is rotated about its center of mass based upon the vectors D1..Dz. For example, the second 3D model is rotated about its center of mass until the spring forces are minimized.
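The translation of step 4004 and the rotation of step 4006 might be sketched as follows, with the vectors recalculated (step 4005) between the two calls. The small-angle, torque-driven rotation update and the angle_scale parameter are assumptions chosen for brevity; the specification only requires rotating until the spring forces are minimized.

```python
import numpy as np

def translate_step(points, kept_vectors):
    """Step 4004: move the whole second model linearly by the sum of the spring-force
    vectors divided by their number (i.e. their mean)."""
    return points + kept_vectors.mean(axis=0)

def rotate_step(points, kept_points, kept_vectors, angle_scale=0.01):
    """Step 4006: rotate the whole second model about its center of mass in the sense
    of the net torque that the spring forces exert on the overlapping points."""
    if len(kept_points) == 0:
        return points                                  # no overlapping points to act on
    center = points.mean(axis=0)
    torque = np.cross(kept_points - center, kept_vectors).sum(axis=0)
    norm = float(np.linalg.norm(torque))
    if norm == 0.0:
        return points                                  # no net torque, nothing to rotate
    axis = torque / norm
    angle = angle_scale * norm / max(len(kept_points), 1)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)   # Rodrigues' formula
    return (points - center) @ R.T + center
```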
At step 4007, the quality of the registration is determined with respect to the current orientation of the second 3D model. One of ordinary skill in the art will recognize that various methods can be used to define the quality of the registration. For example, a standard deviation of the vectors D1..Dz having a magnitude less than epsilon can be used. In another embodiment quality is calculated using the following steps: square the distance of the vectors, sum the squared distances of all vectors within the epsilon distance, divide this sum by the number of vectors, and take the square root. Note that one of ordinary skill in the art will recognize that the vector values D1..Dz need to be recalculated after the rotation step 4006. In addition, one of ordinary skill in the art will recognize that there are other statistical calculations that can be used to provide quantitative values indicative of quality.
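The root-mean-square style quality measure described above could be sketched as follows; the function name and the infinite value returned when no vectors fall within epsilon are assumptions made for the example.

```python
import numpy as np

def registration_quality(vectors, epsilon):
    """Step 4007 (one embodiment): square the lengths of all vectors within the epsilon
    distance, sum them, divide by the number of such vectors, and take the square root."""
    lengths = np.linalg.norm(vectors, axis=1)
    inside = lengths[lengths < epsilon]
    if inside.size == 0:
        return float("inf")      # no overlapping points: treat as the worst possible quality
    return float(np.sqrt(np.sum(inside ** 2) / inside.size))
```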
At step 4099, a determination is made whether the quality determined at step 4007 meets a desired quality level. If the quality is within the desired level, it indicates with a certain degree of confidence that a complete registration between the two frame models is achievable. By terminating the flow of method 4000 when a desired degree of quality is obtained, it is possible to quickly sort through all pairs of frames to provide an image to the user. By eliminating potential breaks in data at this point of the method, subsequent cumulative registration has a greater likelihood of producing a single cumulative model, as opposed to multiple segments of the cumulative model. If the current quality level meets the desired level, the flow returns to the appropriate calling step with a successful indicator. If the current quality level does not meet the desired level, the flow proceeds to step 4098.
It is determined at step 4098 whether the current quality of registration is improving. In a specific embodiment, this is determined by comparing the quality of the previous pass through the loop including step 4003 with the current quality. If the quality is not improving, the flow returns to the calling step with an indication that the registration was not successful. Otherwise, the flow proceeds to step 4003.
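Putting the pieces together, the iteration of steps 4003-4099/4098 might look like the sketch below, which reuses the point_vectors, translate_step, rotate_step, and registration_quality helpers sketched above. The loop structure and return values are illustrative assumptions, not the required implementation.

```python
def register_frame_pair(points, shape_vertices, epsilon, quality_target):
    """Sketch of method 4000: iterate translate/rotate passes until the quality target
    is met (success) or the quality stops improving (failure)."""
    prev_quality = float("inf")
    while True:
        kp, kv = point_vectors(points, shape_vertices, epsilon)     # step 4003
        if len(kv) == 0:
            return False, points                                    # nothing overlaps
        points = translate_step(points, kv)                         # step 4004
        kp, kv = point_vectors(points, shape_vertices, epsilon)     # step 4005
        points = rotate_step(points, kp, kv)                        # step 4006
        _, kv = point_vectors(points, shape_vertices, epsilon)      # recalculate after rotation
        quality = registration_quality(kv, epsilon)                 # step 4007
        if quality <= quality_target:                               # step 4099
            return True, points
        if quality >= prev_quality:                                 # step 4098: not improving
            return False, points
        prev_quality = quality
```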
Upon returning to step 4003, another registration iteration occurs, using the new frame location. Note that once the frame data has been scanned and stored, there is no need to perform the registration exactly in the order of scanning. Registration could start the other way around, or use any other order that makes sense. Especially when scanning is performed in multiple passes, there is already knowledge of where a frame roughly belongs. Therefore, the registration of adjacent frames can be done independently of the order of imaging.
Figure 41 illustrates a specific embodiment of a method 4100 for Figure 38. Specifically, the method 4100 discloses a cumulative registration which attempts to combine all of the individual 3D frame models into a single cumulative 3D model. Steps 4101-4103 are setup steps. At step 4101 a variable x is set equal to 1, and a variable x_last defines the total number of 3D model sets. Note that the number of 3D model sets is based upon step 3908 of Figure 39.
At step 4102 a 3D cumulative model (3Dc) is initially defined to equal the first 3D frame of the current set of frames. The 3D cumulative model is modified to include that information from subsequent frame models that is not already represented by the 3D cumulative model.
At step 4103, Y is set equal to 2, and a variable Y_last is defined to indicate the total number of frames (3DF), or frame models, in the set Sx, where Sx represents the current set of frame models being registered.
At step 4104, the 3D cumulative model (3Dc) is modified to include additional information based upon the registration between the current 3D frame model being registered (Sx(3DFy)) and the 3D cumulative model (3Dc). Note that in Figure 41 the current 3D frame model is referenced as Sx(3Dy), where 3Dy indicates the frame model and Sx indicates the frame set. A specific embodiment for performing the registration of step 4104 is further described by the method illustrated in Figures 42-43.
At step 4199 it is determined whether the current 3D frame model is the last 3D frame model of the current set. In accordance with a specific implementation of Figure 41, this can be accomplished by determining if the variable Y is equal to the value Y_last. When Y is equal to Y_last the flow proceeds to step 4198. Otherwise, the flow proceeds to step 4106, where Y is incremented, prior to returning to step 4104 for further registration of 3D frame models associated with the current set Sx.
At step 4198 it is determined whether the current set of frames is the last set of frames. In accordance with the specific implementation of Figure 41, this can be accomplished by determining if the variable x is equal to the value x_last. The flow proceeds to step 4105 when x is equal to x_last. Otherwise, the flow proceeds to step 4107, where x is incremented, prior to returning to step 4103 for further registration using the next set.
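The nested loop structure of steps 4101-4199 might be expressed as in the following sketch; the cumulative_registration name and the register_into_cumulative helper (corresponding to step 4104 and Figures 42-43) are assumptions made only for illustration.

```python
def cumulative_registration(frame_sets, register_into_cumulative):
    """Sketch of Figure 41: for each frame set from single frame registration, seed a
    cumulative model with the set's first frame and fold each remaining frame into it.
    `register_into_cumulative(cumulative, frame)` is assumed to return the updated model."""
    cumulative_models = []
    for frame_set in frame_sets:                      # x = 1 .. x_last (steps 4101, 4198)
        cumulative = frame_set[0]                     # step 4102: 3Dc starts as the first frame
        for frame in frame_set[1:]:                   # y = 2 .. y_last (steps 4103, 4199)
            cumulative = register_into_cumulative(cumulative, frame)   # step 4104
        cumulative_models.append(cumulative)
    return cumulative_models                          # step 4105: report the resulting model(s)
```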
All frames of all sets have been registered when the flow reaches step 4105. Step 4105 reports the results of the registration of the method 4100, as well as performing any other cleanup operations. For example, while ideally the method 4100 results in a single 3D cumulative model, in reality multiple 3D cumulative models can be generated (see discussion at step 4207 of Figure 43). When this occurs, step 4105 can report the resulting number of 3D cumulative models to the user, or to a subsequent routine for handling. As a part of step 4105, the user can have an option to assist in registering the multiple 3D models to each other. For example, if two 3D cumulative models are generated, the user can manipulate the 3D cumulative models graphically to assist identification of an entry point, which can be used for performing a registration between the two 3D cumulative models.
In accordance with another embodiment of the present invention, a second cumulative registration process can be performed using the resulting matrices from the first cumulative registration as entry points for the new calculations. In one embodiment, when the process encounters a point where frame(s) could not be successfully registered in the first attempt, an enlarged number of entry points can be used, or a higher percentage of points can be used.
Figures 42-43 illustrate a specific embodiment of the registration associated with step 4104 of Figure 41.
Step 4201 is similar to step 4002 of Figure 40, where each point (P1..Pm) of the current frame Sx(3Dy) is analyzed to determine the shape of the cumulative model that is the closest shape.
Step 4202 defines vectors for each point of the current frame in a manner similar to that previously described with reference to step 4003 of Figure 40.
Steps 4203 through 4206 move the current 3D frame model in the manner described at steps 4004-4006 of Figure 40, where the first model of method 4000 is the cumulative model and the second model of method 4000 is the current frame.
At step 4299 a determination is made whether the current pass through registration steps 4202-4206 has resulted in an improved alignment between the cumulative model and the current frame model. One method of determining quality improvement is to compare a quality value based on the current position of the model being registered to the quality value based on its previous position. As previously discussed with reference to Figure 40, the quality value can be determined using the standard deviation, or another quality calculation based on the D vectors. Note that, by default, a first pass through steps 4202-4206 for each model 3Dy results in an improved alignment. If an improved alignment has occurred, the flow returns to step 4202; otherwise, the flow proceeds to step 4298 of Figure 43.
Note that the flow control for the cumulative registration method of Figure 42 is different from the flow control for the single frame registration method of Figure 40. Specifically, the cumulative flow continues until no improvement in quality is realized, while the single frame flow stops once a specified quality is reached. Other embodiments of controlling the flow within the registration routines are anticipated.
In an alternate flow control embodiment, the registration iteration process continues as long as a convergence criterion is met. For example, the convergence criterion is considered met as long as an improvement in quality of greater than a fixed percentage is realized. Such a percentage can be in the range of 0.5-10%.
In another embodiment, even once a specific first criterion is met, such as convergence or no improvement in quality, additional stationary iterations can be used. A stationary iteration is a pass through the registration routine once the quality level has stopped improving, or has met a predefined criterion. In a specific implementation, the number of stationary iterations can be fixed. For example, 3 to 10 additional iterations can be specified. At step 4298 it is determined whether or not the current registration is successful. In a specific implementation, success is based solely upon whether the calculated quality value of the current model placement meets a predefined criterion. If so, the registration has been successful and the routine 4200 returns to the calling step. If the criterion is not met, the flow proceeds to step 4207.
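A stopping rule combining the convergence-percentage test and a fixed number of stationary iterations might be sketched as follows; the 1% threshold and the limit of 5 stationary iterations are illustrative values within the ranges named above, and the function name is an assumption.

```python
def keep_iterating(prev_quality, quality, stationary_done,
                   min_improvement=0.01, stationary_limit=5):
    """Return (continue?, updated stationary-iteration count). Iteration continues while
    the relative quality improvement exceeds min_improvement; once it stalls, up to
    stationary_limit extra passes are still allowed."""
    if prev_quality > 0 and (prev_quality - quality) / prev_quality > min_improvement:
        return True, 0                          # still converging: reset the stationary counter
    if stationary_done < stationary_limit:
        return True, stationary_done + 1        # quality stalled, but run a stationary pass
    return False, stationary_done               # stop iterating
```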
At step 4207, it has been determined that the current frame model cannot be successfully registered into the cumulative 3D model. Therefore, the current cumulative 3D model is saved, and a new cumulative 3D model is started with the current frame. As previously described, because a new 3D cumulative model has been started, the current 3D frame model, which is a point model, is converted to a primitive model before returning to the calling step.
Many other embodiments of the present invention exist. For example, the movement of the frame during steps 4004, 4006, 4203, and 4205 may include an acceleration, or over-movement, component. For example, an analysis may indicate that a movement in a specific direction needs to be 1 mm. However, to compensate for the size of the sample being calculated or other factors, the frame can be moved by 1.5 mm, or by some other scaled amount. Subsequent movements of the frame can use a similar or different acceleration factor. For example, a smaller acceleration value can be used as registration progresses. The use of an acceleration factor helps compensate for local minima which result when no overlapping features happen to align. When this happens, a small movement value can result in a lower quality level. However, by using acceleration it is more likely that the misalignment can be overcome. Generally, acceleration can be beneficial to overcome "bumpiness" in a feature.
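The acceleration, or over-movement, idea can be illustrated by scaling the computed move before applying it; the 1.5 factor mirrors the 1.5 mm example above and is otherwise an arbitrary illustrative choice.

```python
def accelerated_translate(points, kept_vectors, factor=1.5):
    """Apply the mean spring-force move scaled by an acceleration factor, so the model
    overshoots slightly and is less likely to settle into a local minimum. The factor
    can be reduced as registration progresses."""
    return points + factor * kept_vectors.mean(axis=0)
```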
It should be understood that the specific steps indicated in the methods herein, and/or the functions of specific modules herein, may generally be implemented in hardware and/or software. For example, a specific step or function may be performed using software and/or firmware executed on one or more processing modules.
Typically, systems for scanning and/or registering scanned data will include generic or specific processing modules and memory. The processing modules can be based on a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, microcontroller, digital processor, microcomputer, a portion of a central processing unit, a state machine, logic circuitry, and/or any device that manipulates signals.
The manipulation of these signals is generally based upon operational instructions represented in a memory. The memory may be a single memory device or a plurality of memory devices. Such a memory device (machine readable media) may be a read only memory, a random access memory, a floppy disk memory, magnetic tape memory, erasable memory, a portion of a system memory, and/or any other device that stores operational instructions in a digital format. Note that when the processing module implements one or more of its functions, it may do so where the memory storing the corresponding operational instructions is embedded within the circuitry comprising a state machine and/or other logic circuitry. The present invention has been described with reference to specific embodiments. In other embodiments, more than two registration processes can be used. For example, if the cumulative registration process has breaks resulting in multiple cumulative models, a subsequent registration routine can be used to attempt registration between the multiple cumulative models.
Figures 44-52 illustrate a specific method and apparatus using three-dimensional scan data of an anatomical structure, which may be obtained in the specific manner described herein. The three-dimensional scan data is transmitted to a remote facility for further use. For example, the three-dimensional scan data can represent the anatomy of an anatomical structure which is used to design an anatomical device, manufacture an anatomical device, monitor structural changes of the anatomy, archive data pertaining to the anatomical structure, perform a closed-loop iterative analysis of the anatomical structure, perform an interactive consultation of the structure, perform simulations based upon the structure, make a diagnosis related to the anatomical structure, or determine a treatment plan based on the anatomical structure.
As used herein, an anatomical device is defined to include devices that actively or passively supplement or modify an anatomical structure. Examples of anatomical devices include orthotic and prosthetic devices, and anatomical appliances. Anatomical appliances include orthodontic appliances, which may be active or passive, and can include, but are not limited to, such items as braces, retainers, brackets, wires and positioners. Examples of other anatomical appliances include splints and stents. Examples of orthotic and prosthetic anatomical devices include removable prosthetic devices, fixed prosthetic devices, and implantable devices. Removable prosthetic devices include dental structures such as dentures and partial dentures, and prosthetic structures for other body parts, such as prosthetic devices that serve as artificial body parts including limbs, eyes, implants (including cosmetic implants), hearing aids, and the like, such as spectacle frames. Fixed prosthetic anatomical devices include caps, crowns and other non-dental anatomical replacement structures. Implantable prosthetic devices include endosseous implants and orthodontic implants and fixture devices such as plates used for holding and reducing fractures.
Figure 44 illustrates a flow in accordance with the present invention. Specifically, Figure 44 illustrates the scanning of an anatomical structure 4400 by a scanning device 4401 at a facility 4441. In accordance with one aspect of the present invention, any scanner type or method capable of generating digital data for the purposes put forth herein can be used. Direct three-dimensional surface scanning indicates that some or all of the anatomical structure can be scanned directly. One embodiment of performing a direct three-dimensional surface scan is described previously herein. In one embodiment, the scan is a surface scan, whereby the scanning device 4401 detects signals and/or patterns reflected from at or near the surface of the structure 4400. A specific surface scanning method and device has been described previously herein. Other scanning methods can be used as well. Generally, the surface scan of the anatomical structure will be a direct scan of the anatomical structure. A direct scan refers to a scan of the actual anatomical structure (in-vivo). In an alternate embodiment, an indirect scan of the anatomical structure can also be made and integrated with the direct scan. An indirect scan refers to scanning a representation of the actual original anatomical structure (in-vitro).
Digital data 4405 is generated at the facility (location) 4441 based on the direct scan of the anatomical structure 4400. In one embodiment, the digital data 4405 represents raw scan data, which is generally a two-dimensional cloud of points, generated by the scanning device 4401. In another embodiment, the digital data 4405 represents a three-dimensional point model, which is generally generated based upon a two-dimensional cloud of points. In yet another embodiment, the digital data 4405 represents a three-dimensional primitive model. Note that the digital data 4405 can be a composite of multiple independent scans, which may be performed at approximately the same or different points in time, as well as at the same or different locations.
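The three forms the digital data 4405 may take could be represented by containers such as the following; the class and field names are assumptions introduced only to make the distinction concrete.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RawScanFrame:
    """Raw scan data: a two-dimensional cloud of points produced by the scanning device."""
    points_2d: List[Tuple[float, float]]

@dataclass
class PointModel:
    """Three-dimensional point model derived from the two-dimensional point clouds."""
    points_3d: List[Tuple[float, float, float]]

@dataclass
class PrimitiveModel:
    """Three-dimensional primitive model, e.g. a triangle mesh over the point model."""
    vertices: List[Tuple[float, float, float]]
    triangles: List[Tuple[int, int, int]]
```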
The actual data type of digital data 4405 is determined by the amount of processing done to the raw scan data at the location 4441. Generally, the data received directly from the scanner 4401 is a two-dimensional cloud of points. Therefore, when no processing is performed at the facility 4441, the digital data 4405 is a two-dimensional cloud of points. Three-dimensional point models and three-dimensional primitive models are typically generated by further processing of the two-dimensional point cloud. Facility 4441 represents a location where the physical scanning of the anatomical structure occurs. In one embodiment, facility 4441 is a location that is dedicated to, or primarily dedicated to, scanning anatomical structures. In this embodiment, the facility would be located where it is easily accessible by large numbers of clients (patients) needing scans. For example, a kiosk in a mall, or a location in a strip mall, can be dedicated to performing scans. Such a facility may perform a broad variety of scans, or may specialize in specific types of scans, such as scans of facial structures or dental structures. In an alternate embodiment, scans can be performed at home by a user. For example, a user can be provided with a portable scanner for remote use to scan the anatomical structure to generate scan data that can be used to monitor the progress of a treatment plan, for diagnostic purposes, or for surveillance or monitoring purposes.
In another embodiment the facility 4441 is a location that scans anatomical structures and performs other value-added services related to generating the digital data 4405. Examples of other value-added services include designing, or partially designing, anatomical devices based upon the scan data used to generate the digital data 4405, or installation of such anatomical devices. In one implementation, no value-added services beyond the generation of the digital data 4405 are performed at the facility 4441.
Once the digital data 4405 is generated at the facility 4441, the digital data can be provided to the client. The connection 4406 represents the digital data being provided to a third party. This step of providing can be done by the client, the facility 4441, or any other intermediate source. Generally, the client will specify the third party where the data is to be sent. The digital data 4405 can be provided to the facility 4442 either physically, i.e. by mail or courier, or remotely, i.e. by transmission. For example, the digital data 4405 can be physically provided on a non-volatile storage device, such as a portable magnetic medium, a read-only fuse device, or a programmable non-volatile device. In other embodiments, the digital data can be transmitted to the client or a third party by a direct connection, the internet, a local area network, a wide area network, a wireless connection, and/or any device that enables the transfer of digital information from one computing system to another. In specific embodiments, either all or only some of the digital data may be transmitted. For example, where the scan is of a patient's teeth and associated structures, such as the gums, only the portion representing the teeth may be transmitted.
In the specific embodiment of Figure 44, the digital data 4405 received at the facility 4442 (receiving facility) is used to design an anatomical device at step 4415. Figure 45 illustrates a method having two alternate embodiments of step 4415. A first embodiment, which begins at step 4501, designs an anatomical device using a physical model of the anatomical structure, while a second embodiment, which begins at step 4511, designs the device using a virtual model of the structure. The virtual model will generally be a computer-generated model.
At step 4501, generation of a three-dimensional physical model of the anatomical structure occurs using the digital data 4405. In a specific embodiment, the physical model of the scanned object is generated using numerically controlled processing techniques, such as three-dimensional printing, automated milling, laser sintering, stereolithography, injection molding, and extrusion.
At step 4502, the three-dimensional physical model is used to design the anatomical device. For example, by using the physical model, practitioners will generate anatomical devices for use by the client. In one embodiment, anatomical devices are custom designed based upon the physical model. In another embodiment, standard orthodontic devices are selected based upon the physical model of the anatomical structure. These standard devices may be modified as required to form semi-custom devices.
At step 4503 of Figure 45, the manufacture of the anatomical device can be based upon the physical model. Where physical models are used, the step 4502 of designing and the step 4503 of manufacturing are often concurrent, whereby the design and manufacturing processes occur simultaneously. In other embodiments, moldings or specifications of the desired anatomical device are made and sent to processing centers for custom design and/or manufacturing.
In an alternate embodiment, beginning at step 4511, a virtual three-dimensional model of the anatomical structure is used to design an anatomical device. A virtual three-dimensional model refers to a model generated by a numerically controlled device, such as a computer, and either includes the digital data 4405, or is generated based upon the digital data 4405. In one embodiment, the virtual three-dimensional model is included as part of the digital data 4405 provided to a design center. In another embodiment, the three-dimensional model is generated at step 4511 using the digital data 4405. In another embodiment, an alternate three-dimensional model is generated at step 4511 based on a three-dimensional model included as part of the digital data 4405. In another embodiment, multiple three-dimensional models can be stitched together from a plurality of scans. For example, data from multiple scan sessions can be used.
Furthermore, at step 4511, a virtual anatomical device is designed (modeled) using the virtual three-dimensional model. The virtual device can be designed using standard or custom design software for specifying virtual devices. Examples of such design software include commercially available products such as AutoCAD, Alias, and ProEngineer. For example, the design software can be used to design a virtual crown using the three-dimensional virtual model of the anatomical structure, or to select near-custom, standard, or virtual devices from a library of devices that represents actual devices. Subsequent to selecting a standard device, customizations can be made.
At step 4512, the anatomical device can be manufactured directly based upon a virtual specification of the device. For example, the anatomical device can be generated using numerically controlled processing techniques, such as three-dimensional printing, automated milling, laser sintering, stereolithography, injection molding, extrusion, and casting techniques. It will be appreciated that the manufacture of the anatomical device includes partially manufacturing the device, as well as manufacturing the device at multiple locations.
At step 4426, the manufactured anatomical device is scanned. By generating a virtual model of the manufactured anatomical device, a simulation can be performed to verify the relationship between the anatomical device as manufactured and the anatomical structure, thereby providing closed-loop feedback to assure proper manufacture of the device.
At step 4504, the completed anatomical device is sent to a specified location for installation. For example, returning to Figure 44, the anatomical device is sent to facility 4444, where installation occurs at step 4435. In one embodiment, the anatomical device is installed at step 4435 by a practitioner, such as a dentist, orthodontist, physician, or therapist. In another embodiment, the patient can install the anatomical device. For example, a patient can install some orthodontic devices, such as retainers or similar positioning devices.
In accordance with a specific embodiment of the present invention, the anatomical device is designed or manufactured at a remote location relative to where the digital data 4405 is received or generated. In one embodiment, the digital data is received at the location 4441. Specifically, the digital data is received by scanning the anatomical structure 4400. Once received, the digital data is transmitted to location 4442, which is a remote location relative to the location 4441, where an anatomical device is at least partially designed. A remote location (facility) is one that is disassociated in some manner from another location. For example, the remote location can be a location that is physically separate from the other location. For example, the scanning facility can be in a different room, building, city, state, country, or other location. In another embodiment, the remote location can be a functionally independent location. For example, one location can be used to perform one specific function, or set of functions, while another location can be used to perform a different function. Examples of different functions include scanning, designing, and manufacturing. Remote locations will generally be supported by separate infrastructures, such as personnel and equipment.
In another embodiment, the digital data 4405 at facility 4441 includes a partially designed anatomical device. The anatomical device is further designed at the remote facility 4442. Note that facility 4442 can represent one or more remote facilities that can be used in parallel or serially to determine a final anatomical device, make a diagnosis, form a treatment plan, monitor progress, or design a device based upon cost, expertise, ease of transmission, and the turnaround time required. An example of parallel facilities is further illustrated in Figure 48.
Figure 46 illustrates another embodiment of the present invention. The flow of Figure 46 is similar to the flow of Figure 44, with an additional intermediate step 4615. The intermediate step 4615 indicates that the digital data 4405 does not need to be received directly from the facility 4441 where the data was scanned. For example, the digital data 4405 can be generated at the first facility (sending facility) by scanning and provided to a second facility 4641 (receiving facility) where the intermediate step 4615 occurs. Once the intermediate step 4615 is completed, the digital data 4405, or a modified digital data that is a representation of the digital data, can be transmitted to a third facility (remote facility) that is remote relative to at least one of the first and second facilities. During the intermediate step 4615, other steps can modify the digital data 4405 before the data is sent to the third facility. For example, the scan data can be processed to provide a three-dimensional virtual model of the anatomical structure; data can be added to the digital data, including image data of the scanned anatomical structure 4400, video and/or photo data containing color information, diagnosis information, treatment information, audio data, text data, X-ray data, anatomical device design information, and any other data which may be pertinent to the design or manufacture of the anatomical device. In an alternate embodiment, the intermediate step 4615 need not alter the digital data 4405.
Figure 47 illustrates an alternate embodiment of the present invention where the digital data 4405 is received at facility 4742 for forensic evaluations at step 4741. An example of a forensic evaluation includes identification of victims based on the anatomical structure scanned. Such identifications will generally be made based upon matching a specific anatomical structure to an anatomical structure contained within a target database, where the target database can contain a single structure or a plurality of structures. In one embodiment, the target database could be a centrally located database containing archived data. Figure 48 illustrates an embodiment of the present invention where digital data 4405, or its representation, is transmitted to one or more remote facilities 4843 for diagnostic purposes or treatment planning at steps 4844 and 4845. The ability to transmit the data for diagnostic purposes allows three-dimensional information of an anatomical structure to be provided to other practitioners, such as specialists, without the patient having to be physically present. This ability improves the overall speed and convenience of treatment, as well as accuracy when multiple diagnoses can be made in parallel. The ability to send the digital data 4405, or its representation, to multiple facilities for treatment planning and diagnosis allows multiple opinions to be obtained. Once a specific treatment plan is selected, any one of the devices specified as part of the treatment plan can be selected for manufacturing.
In a specific implementation of Figure 48, price quotes are obtained from each of the facilities. The price quotes can be based upon a specific treatment specified by a requesting party, where the treatment relates to the anatomical structure. Alternatively, the price quotes can be based upon a desired result specified by the requesting party, where the treatment definition and its associated implementation costs are determined by the facility providing the quotes. In this manner, a patient or a patient's representative can obtain competitive bids in an efficient manner.
Figure 49 illustrates an alternate embodiment of the present invention where the digital data 4405, or its representation, is received at facility 4942 so that the data can be used at step 4941 for educational purposes. Because of the deterministic nature of the embodiment described herein, educational techniques can be performed in a standardized manner not possible using previous methods. Examples of educational purposes include self-learning, education monitoring, and the ability to provide standardized applied exams, which were not possible before. Furthermore, case facts for a specific patient can be matched to previous, or present, case histories of other patients, where the case histories are stored or archived.
Figure 50 illustrates an embodiment where scan data can be archived at step 5001, which occurs at location 5002, for easy retrieval by authorized individuals. In a specific embodiment, such archives would be provided as a service, whereby the data would be commonly maintained, thereby allowing a common, site-independent "gold standard" copy of the digital data to be obtained.
Figure 51 illustrates a specific embodiment of the present invention where the digital data obtained by scanning the anatomical structure is used in a closed-loop iterative system. The flow of Figure 51 may also be interactive. Specifically, changes in the anatomical structure, whether intentional or unintentional, can be monitored and controlled as part of the closed-loop system. The closed-loop system in accordance with the present invention is deterministic because the scan data is measurable in three-dimensional space, thereby allowing a standard reference, in the form of a three-dimensional model, to be used for analysis. At step 5101 the three-dimensional scan data of the anatomical device is obtained.
At step 5102 the data from the scan of step 5101, or a representation of the data, is transmitted to a remote facility.
At step 5103, a design/evaluation of the transmitted data is performed.
For example, a treatment plan, a diagnosis, and a design for an anatomical device are determined during a first pass through the loop including step 5103. During subsequent passes through step 5103, the status or progress of the treatment or device is monitored and changes are made as necessary. In one embodiment, the monitoring is performed by comparing the current scan data to an expected result, which has been simulated, or to the previous history, or against matched case histories.
At step 5104, the device or treatment plan is implemented or installed as appropriate. Any manufacturing of a device is also performed as part of step 5104.
It is determined whether an additional pass through the closed-loop system of Figure 51 is needed at step 5105. If so, the flow proceeds to step 5101. Otherwise, the flow terminates. As discussed with reference to Figure 51, a closed-loop feedback loop can exist between any of the steps illustrated in Figure 51.
The ability to use feedback to verify progress is an advantage over the prior art, where practitioners relied upon one or more of text notes, visual observations of models, and other images. However, these observations were made without fixed three-dimensional models that would allow the practitioner to be assured that the same perspective is being viewed. The use of the visual models described herein allows a fixed reference to be obtained. For example, one method of obtaining fixed reference points for an orthodontic structure includes selecting orientation reference points based on physical attributes of the orthodontic structure. The orientation reference points can subsequently be used to map the digital image of the orthodontic structure into a three-dimensional coordinate system. For example, the frenum can be selected to be one of the orientation reference points and the rugae can be selected as the other reference point. The frenum is a fixed point in the orthodontic patient that will not change, or will change minimally, during the course of treatment. The frenum is a triangular shaped tissue in the upper portion of the gum of the upper arch. The rugae is a cavity in the roof of the mouth in the upper arch. The rugae will also not change its physical position through treatment. As such, the frenum and the rugae are fixed physical points in the orthodontic patient that will not change during treatment. As such, by utilizing these as the orientation reference points, a three-dimensional coordinate system may be mapped thereto. Note that other physical attributes of the orthodontic patient, including the incisive papilla, cupid's bow, the inter-pupillar midpoint, the inter-commissural midpoint (e.g., between the lips), the inter-alar midpoint (e.g., between the sides of the nose), the prone nasale (e.g., the tip of the nose), the sub-nasale (e.g., the junction of the nose and the lip), a dental mid-line point, a point on the bone, or a fixed bone marker such as an implant (e.g., a screw from a root canal or oral surgery), may be used as the orientation reference points.
Figure 52 illustrates that iterative feedback steps can take place within and between any combination of the steps illustrated herein. For example, an iterative and/or interactive loop can reside between the manufacturing step 4425 and the design step 4415, or within a single step, as described with reference to step 4426 of Figure 44.
In addition to the many methods of using the scanned data derived in the steps herein, many methods for financing the use of the data are possible. For example, fees for the use of such scan data 4405 may be fixed or variable fees based upon the use of the data, the cost of a service provided, the anatomical device being generated, or the value added to a device or service based upon the data. In addition, it will be apparent that many other types of fees can be envisioned.
The steps and methods described herein may be executed on a processing module (not shown). The processing module may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, microcomputer, digital signal processor, central processing unit of a computer or work station, digital circuitry, state machine, and/or any device that manipulates signals (e.g., analog and/or digital) based on operational instructions. The processing module's operation is generally controlled by data stored in memory. For example, where a microprocessor is used, a bus of the microprocessor is connected to a bus of a memory to access instructions. Examples of memory include single or multiple memory devices, such as random access memory, read-only memory, floppy disk memory, hard drive memory, extended memory, magnetic tape memory, zip drive memory, and/or any device that stores digital information. Such memory devices can be local (i.e. connected directly to the processing device), or at physically different locations (i.e. at a site connected to the Internet.) Note that when the processing module implements one or more of its functions, via a state machine or logic circuitry, the memory storing the corresponding operational instructions is embedded within the circuitry comprising the state machine or logic circuitry.
One of ordinary skill in the art will appreciate that the specific embodiments described herein provide advantages over the known art. For example, the anatomical structure being scanned may have one or more associated anatomical devices or appliances. In addition, the present invention provides a deterministic method for diagnosing, treating, monitoring, designing and manufacturing anatomical devices. In addition, the present embodiments can be used to provide an interactive method for communication between various parties designing and manufacturing the prosthetic device. Such an interactive method can be implemented in real-time. The methods described herein permit data to be archived in such a manner that others can obtain the actual information and knowledge gained from the experience of others. Multiple consultants can have access to identical deterministic copies of the information independent of location, allowing objective design, manufacturing, and/or treatment-monitoring information to be obtained even when multiple independent sites are being used. The present embodiments allow for the circumvention of traditional labs used to generate anatomical devices. Specifically, support facilities used for the generation of anatomical devices may now be numerous and remote relative to the patient. This can reduce the overall cost to the practitioner and patient. The patient does not have to visit the practitioner to have the status or progress of a device monitored, since the scanning location can be remote from other support locations. Overall, the fixed, archivable nature of the digital data of the present embodiments allows for a low cost of generating identical duplicate models from a gold standard model, thereby reducing the likelihood of lost and inaccurate data. By reducing the amount of time and travel required of a patient for analysis of specific anatomical structures, treatment costs are reduced. The accessibility of multiple opinions (quotes, treatment plans, diagnoses, etc.) increases without the additional inconvenience to the patient associated with the prior art methods. Competitive quoting is also readily obtainable using the specific embodiments indicated.
In the foregoing specification, the invention has been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. For example, the digital data may be a composite of a direct scan of the anatomical structure and an indirect scan of the structure. This may occur when a portion of the anatomical structure is not viewable by the scanner 4401, so an impression is made of at least that portion of the anatomical structure that is not viewable. The impression, or a model made from the impression, is then scanned and "stitched" into the direct scan data to form a complete scan. In other embodiments, the digital data 4405 can be used in combination with other traditional methods. In other embodiments, the digital data described herein can be compressed, or secured using an encryption method. When encrypted, one or more of the patient, the scanning facility, the archival facility, or a representative of the patient can have encryption keys for the digital data. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. In the claims, means-plus-function clause(s), if any, cover the structures described herein that perform the recited function(s). The means-plus-function clause(s) also cover structural equivalents and equivalent structures that perform the recited function(s). Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims. One of ordinary skill in the art will recognize that the present invention is advantageous over the prior art, in that a reference independent scanner is disclosed that, in a specific embodiment, incorporates variable identifiers in a direction orthogonal to a projection/view plane. By providing variables in a direction orthogonal to the projection/view plane, the distortion of these variables, which is less than the distortion parallel to the projection/view plane, does not prohibit identification of specific shapes. As a result, greater accuracy of mapping of objects can be obtained.

Claims

What is claimed is:
1. A method of creating a three-dimensional model of an object, the method comprising the steps of: receiving a 3D primitive model of a first object portion, wherein the 3D primitive model includes a plurality of primitive shapes; receiving a 3D point model of a second object portion, wherein the 3D point model includes a plurality of points; determining a first set of points in the 3D point model that are likely to be valid and to overlap a portion of the 3D primitive model of the first object portion; and aligning the 3D point model to the 3D primitive model based upon the first set of points.
2. The method of claim 1, wherein the step of determining includes the substeps of: for x = 1 to n, where n is the number of points in the 3D point model: determining a minimum distance Dx from a point Px to the primitive model; and including the point Px in the first set of points if Dx is less than a predefined value.
3. An apparatus for scanning a surface of an object, the apparatus comprising in combination: a scanning device, wherein the scanning device includes a projecting device and a viewing device; and a system controller operably coupled to the scanning device, wherein the system controller comprises a first controller to control the projecting device and a second controller to control the viewing device.
4. A method for generating a three-dimensional model of an object, the method comprising: projecting an image having at least one encoded feature with at least one component that varies orthogonal to a plane formed by a projection axis and at least one point of a view axis; receiving a representation of a surface image at a viewing device; and determining a location of a point at the surface of the object based upon the at least one encoded feature.
5. A method for generating a three-dimensional model of an object from a reference independent scanning device, the method comprising: a) scanning and receiving a first data representing two dimensional object data, the first data comprising a plurality of frames; b) generating a first three-dimensional model from a first frame of the first data; c) generating a second three-dimensional model from a second frame of the first data; d) performing a process of aligning the first three-dimensional model to the second three-dimensional model to generate a cumulative model; and e) iterating the steps of b) through d) until the process of aligning and generating a cumulative model is complete for all frames of the first data.
6. A method of registering data, the method comprising the steps of: receiving reference independent data representing an object, wherein the reference independent data includes a plurality of frames F1 through Fn, where n is a positive integer greater than 1, and each of the frames F1 through Fn represents an object portion of the object; and performing a registration between the object portions of frame Fx and frame Fx+1, where x is an integer between 1 and n-1.
7. The method of claim 6, wherein the spatial location of the object portion of frame Fx is unknown, and the spatial location of the object portion of frame Fx+1 is unknown.
8. The method of claim 6, wherein the spatial location of a scanner when frame Fx was captured is unknown, and the spatial location of the scanner when frame Fx+1 was captured is unknown.
9. The method of claim 8, wherein the spatial location of the object portion of frame Fx is unknown, and the spatial location of the object portion of frame Fx+1 is unknown.
10. A method of registering data, the method comprising the steps of: receiving a first two-dimensional image representing a first portion of an object, wherein the first two-dimensional image is a reference independent image; receiving a second two-dimensional image representing a second portion of an object, wherein the second two-dimensional image is a reference independent image; and determining an original entry point to be used for performing a three-dimensional registration of the second image to the first image.
11. The method of claim 10, further comprising the step of: performing a three-dimensional registration based on the original entry point.
12. The method of claim 11, further comprising the steps of: defining an alternate entry point based upon the original entry point; and performing a three-dimensional registration based on the alternate entry point.
13. A method of registering data, the method comprising the steps of: receiving reference independent data representing an object, wherein the reference independent data includes a plurality of frames F1 through Fn, where n is a positive integer greater than 1, and each of the frames F1 through Fn represents an object portion of the object; performing a first registration using the reference independent data to create a first three-dimensional model; and performing a second registration using the reference independent data to create a second three-dimensional model.
14. The method of claim 13, wherein the first registration has a first characteristic and the second registration has a second characteristic.
15. The method of claim 14, wherein the first and second characteristics are accuracy characteristics.
16. The method of claim 15, wherein the first characteristic results in the second three-dimensional model having a greater accuracy than the first three-dimensional model.
17. The method of claim 16, wherein the step of performing the first registration takes less time than the step of performing the second registration.
18. The method of claim 13, wherein the first registration is a single frame registration that attempts to register each frame Fx to an adjacent frame Fx-1 for x = 2..n.
19. A method comprising the steps of: receiving digital data, wherein the digital data is based upon a direct three-dimensional surface scan of an anatomical structure; and transmitting a representation of the digital data to a remote location to have an anatomical device designed.
20. The method of claim 19, wherein the step of receiving further includes scanning the anatomical structure to directly receive the digital data.
21. The method of claim 19, wherein the step of receiving further includes receiving the digital data at a receiving facility from a sending facility, wherein the receiving facility, the sending facility, and the remote facility are different facilities.
22. The method of claim 19, wherein the step of transmitting to have an anatomical device designed includes having the device manufactured.
23. The method of claim 19, wherein the step of receiving includes the digital data being based upon a reference independent scan.
24. The method of claim 23, wherein the reference independent scan includes a multi-frame reference independent three-dimensional scan of an object.
25. The method of claim 19, wherein the step of receiving includes receiving the digital data at a facility that does not design the anatomical device.
26. A method comprising the steps of: receiving digital data, wherein the digital data is based upon a direct three-dimensional surface scan of an anatomical structure; and transmitting a representation of the digital data to a remote location for forensics.
27. A system comprising in combination: a processing module having a bus; a memory having a bus coupled to the bus of the processing module; wherein the memory stores instructions that cause the processing module to: receive digital data, wherein the digital data is based upon a three-dimensional surface scan of an anatomical structure; and transmit a representation of the digital data to a remote location to have an anatomical device designed.
28. A method of mapping a surface, the method comprising: during a first time period projecting a first image along a projection axis to an object to be mapped, the first image having a measuring feature; and during a second time period projecting a second image along the projection axis to the surface, the second image having an encoding feature which is used to identify the measuring feature of the first image, wherein the first time period and the second time period are adjacent in time, and substantially mutually exclusive.
29. The method of claim 28, wherein the step of projecting the second image includes the encoding feature being at least partially substantially orthogonal to a plane formed by the projection axis and a point of a view axis.
30. An apparatus for mapping an object, the apparatus comprising: a projector to transmit a first image having a measuring feature during a first time period along a projection axis to an object to be mapped, and to transmit a second image having an encoding feature during a second time period along the projection axis, the second image being at least partially orthogonal to the plane and is used to identify the measuring feature of the first image, wherein the first time period and the second time period are adjacent in time, and substantially mutually exclusive.
31. An apparatus for scanning an object, the system comprising: a data processor for executing instructions; a plurality of instructions capable of being executed on the data processor for executing the following operations: specifying a first time period during which a first image is to be projected along a projection axis to an object to be mapped, the first image having a measuring feature which is detectable in a direction parallel to a plane formed by the projection axis and a point of a view axis; and specifying a second time period during which a second image is to be projected along the projection axis to the surface, the second image having an encoding feature which is at least partially orthogonal to the plane and is used to identify the measuring feature of the first image, wherein the first time period and the second time period are adjacent in time, and substantially mutually exclusive.
32. A method of scanning, the method comprising the steps of: providing a patient; providing a scanner that uses a pattern with an encoding feature orthogonal to a plane of triangulation; and scanning a portion of the patient using multi-frame reference independent scanning to receive scan data.
33. The method of claim 32, wherein the portion of the patient is a dentition structure.
34. The method of claim 33, wherein the portion of the patient includes a tooth portion.
35. The method of claim 32, wherein during the step of scanning, the portion of the patient is not fixed.
36. A method of scanning a portion of a patient's mouth, the method comprising the steps of: receiving scan data having images with undercuts of a dentition structure, wherein there is no known spatial reference associated with the scan data reference point; and generating a three-dimensional model of the dentition structure using the scan data.
37. A method of mapping an object, the method comprising the steps of: projecting a first image along a projection axis onto a surface of the object, wherein the first image includes a measuring feature to map a first set of surface features having a first characteristic; and projecting a second image along a projection axis onto the surface of the object, wherein the second image includes a measuring feature to map a second set of surface features having a second characteristic.
38. The method of claim 37, wherein the step of projecting the second image further includes projecting the second image simultaneously with the first image.
39. The method of claim 38, wherein the first image further includes the first image having a plurality of parallel features, each of the plurality of parallel features separated by a predetermined distance X, and wherein the second image includes at least one feature separated from any other feature by a predetermined distance Y, where Y is greater than X.
40. The method of claim 37, further comprising the step of: determining the second set of surface features based upon projecting the first image feature, wherein the second set of surface features is not mapped using the first image feature.
41. An apparatus used to project an image for mapping an object, the apparatus comprising: a projector to project a first image along a projection axis onto a surface of the object, wherein the first image includes a measuring feature to map a first set of surface features having a first characteristic, and to project a second image along the projection axis onto the surface of the object, wherein the second image includes a measuring feature to map a second set of surface features having a second characteristic.
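Editorial note: claims 37-41 describe projecting a dense measuring image whose features are spaced a distance X apart together with a sparser image whose features are spaced Y > X apart, so that surfaces which blur or alias the dense set can still be mapped by the sparse one. The sketch below merely renders one hypothetical composite pattern with those properties; the spacings and intensity values are illustrative.

    import numpy as np

    def composite_pattern(width=240, height=120, x_spacing=6, y_spacing=30):
        """Render dense lines every x_spacing pixels and sparse, brighter lines every y_spacing."""
        assert y_spacing > x_spacing, "second-image spacing Y must exceed first-image spacing X"
        img = np.zeros((height, width), dtype=np.uint8)
        img[:, ::x_spacing] = 128                          # first image: dense measuring lines
        img[:, ::y_spacing] = 255                          # second image: sparse lines, brighter
        return img

    if __name__ == "__main__":
        row = composite_pattern()[0]
        print("dense lines:", int((row == 128).sum()), "sparse lines:", int((row == 255).sum()))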
42. A method of mapping a surface, the method comprising: projecting an image along a projection axis onto a surface to provide a surface image, wherein the image includes a variable feature which varies in a direction substantially orthogonal to a plane formed by the projection axis and a point of a view axis; receiving a representation of the surface image along the view axis; and determining a location at a point of the surface based at least partially upon the variable feature of the representation of the surface image.
43. The method of claim 42, wherein the image is a predefined pattern.
44. The method of claim 43, wherein the predefined pattern includes a first image portion having a first feature that includes at least the location at the point of the surface, and a second feature which includes at least a portion of the variable feature; a second image portion having a third feature, and a fourth feature; and the step of determining further includes using the second feature to identify the first feature.
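Editorial note: claims 42-44 determine the location of a surface point from a variable feature that identifies which projected ray illuminated it. The sketch below shows textbook active triangulation in the plane formed by the projection axis and the view axis, with an assumed focal length and baseline; it is a schematic example, not the disclosed scanner geometry.

    import numpy as np

    def intersect_rays(o1, d1, o2, d2):
        """Intersection of the 2-D rays o1 + s*d1 and o2 + t*d2."""
        s, _ = np.linalg.solve(np.column_stack([d1, -d2]), o2 - o1)
        return o1 + s * d1

    def triangulate(pixel_x, theta, focal=500.0, baseline=60.0):
        """pixel_x: image offset from the principal point; theta: projector ray angle (radians)."""
        cam_origin = np.array([0.0, 0.0])
        cam_dir = np.array([pixel_x / focal, 1.0])           # camera ray in the (X, Z) plane
        proj_origin = np.array([baseline, 0.0])
        proj_dir = np.array([np.sin(theta), np.cos(theta)])  # ray identified by the decoded feature
        return intersect_rays(cam_origin, cam_dir, proj_origin, proj_dir)

    if __name__ == "__main__":
        true_point = np.array([10.0, 100.0])                 # 100 mm out, 10 mm off the camera axis
        px = true_point[0] / true_point[1] * 500.0            # where that point images
        th = np.arctan2(true_point[0] - 60.0, true_point[1])  # which projector ray lights it
        print(triangulate(px, th))                            # ~ [ 10. 100.]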
45. A method of mapping a surface, the method comprising: projecting a pattern upon a surface, the pattern having a plurality of first features and a plurality of second features, wherein each of the plurality of second features is uniquely identifiable independent of the topology of the surface; and determining a specific feature of the plurality of first features based upon the plurality of second features.
46. The method of claim 45 further comprising the step of: calculating a relative location of a point of the surface based upon the specific feature.
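Editorial note: claims 45-46 identify a specific dense feature by reference to sparse features that are uniquely identifiable regardless of surface topology. The sketch below assumes detected stripe positions along one scanline and a hypothetical code-to-index table for the coded stripes, then labels every remaining stripe by counting from the nearest coded one; the positions and indices are made up for illustration.

    def identify_stripes(stripe_positions, coded_markers):
        """coded_markers maps a detected position to its absolute stripe index (assumed known)."""
        ordered = sorted(stripe_positions)
        labels = {}
        for pos in stripe_positions:
            anchor = min(coded_markers, key=lambda p: abs(p - pos))   # nearest uniquely coded stripe
            offset = ordered.index(pos) - ordered.index(anchor)       # how many stripes away it is
            labels[pos] = coded_markers[anchor] + offset
        return labels

    if __name__ == "__main__":
        detected = [12, 20, 29, 37, 46, 55]                 # dense measuring stripes (pixel columns)
        coded = {29: 102, 55: 105}                          # sparse encoded stripes with known indices
        print(identify_stripes(detected, coded))            # {12: 100, 20: 101, 29: 102, ...}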
47. A method for scanning an object, the method comprising the steps of: projecting a plurality of discrete shapes for scanning an object, wherein the plurality of discrete shapes include a first discrete shape and a second discrete shape; and projecting a first encoding feature relative to the first discrete shape.
48. An apparatus for scanning an object, the apparatus comprising: a data processor for executing instructions; a plurality of instructions capable of being executed on the data processor to execute the following operations: projecting a plurality of discrete shapes for scanning an object, wherein the plurality of discrete shapes include a first discrete shape and a second discrete shape; and projecting a first encoding feature relative to the first discrete shape.
PCT/US2001/012107 2000-04-28 2001-04-13 Method and system for scanning a surface and generating a three-dimensional object WO2001084479A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2001581218A JP4206213B2 (en) 2000-04-28 2001-04-13 Method and system for scanning a surface and creating a three-dimensional object
EP01925005A EP1287482A4 (en) 2000-04-28 2001-04-13 Method and system for scanning a surface and generating a three-dimensional object
AU2001251606A AU2001251606A1 (en) 2000-04-28 2001-04-13 Method and system for scanning a surface and generating a three-dimensional object

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
US09/560,644 2000-04-28
US09/560,583 US6738508B1 (en) 2000-04-28 2000-04-28 Method and system for registering data
US09/560,584 US7068836B1 (en) 2000-04-28 2000-04-28 System and method for mapping a surface
US09/560,645 2000-04-28
US09/560,644 US6413084B1 (en) 2000-04-28 2000-04-28 Method and system of scanning
US09/560,133 2000-04-28
US09/560,132 2000-04-28
US09/560,583 2000-04-28
US09/560,131 2000-04-28
US09/560,584 2000-04-28
US09/560,133 US6744932B1 (en) 2000-04-28 2000-04-28 System and method for mapping a surface
US09/560,132 US6771809B1 (en) 2000-04-28 2000-04-28 Method and system for registering data
US09/560,645 US6728423B1 (en) 2000-04-28 2000-04-28 System and method for mapping a surface
US09/560,131 US6744914B1 (en) 2000-04-28 2000-04-28 Method and system for generating a three-dimensional object
US09/616,093 2000-07-13
US09/616,093 US6532299B1 (en) 2000-04-28 2000-07-13 System and method for mapping a surface

Publications (1)

Publication Number Publication Date
WO2001084479A1 true WO2001084479A1 (en) 2001-11-08

Family

ID=27575480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/012107 WO2001084479A1 (en) 2000-04-28 2001-04-13 Method and system for scanning a surface and generating a three-dimensional object

Country Status (4)

Country Link
EP (1) EP1287482A4 (en)
JP (4) JP4206213B2 (en)
AU (1) AU2001251606A1 (en)
WO (1) WO2001084479A1 (en)

Cited By (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004037111A1 (en) * 2002-10-24 2004-05-06 Jemtab Systems Device for determining positions for one or more teeth in dental prosthesis arrangement
US6854973B2 (en) 2002-03-14 2005-02-15 Orametrix, Inc. Method of wet-field scanning
WO2009129067A1 (en) * 2008-04-16 2009-10-22 Biomet Manufacturing Corp. Patient-modified implant and manufacuring method for such an implant
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US8343159B2 (en) 2007-09-30 2013-01-01 Depuy Products, Inc. Orthopaedic bone saw and method of use thereof
US8357111B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Method and system for designing patient-specific orthopaedic surgical instruments
US8738165B2 (en) 2007-12-21 2014-05-27 3M Innovative Properties Company Methods of preparing a virtual dentition model and fabricating a dental retainer therefrom
US8808302B2 (en) 2010-08-12 2014-08-19 DePuy Synthes Products, LLC Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US8858561B2 (en) 2006-06-09 2014-10-14 Biomet Manufacturing, LLC Patient-specific alignment guide
US8864769B2 (en) 2006-02-27 2014-10-21 Biomet Manufacturing, Llc Alignment guides with patient-specific anchoring elements
US8903530B2 (en) 2011-06-06 2014-12-02 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US8900244B2 (en) 2006-02-27 2014-12-02 Biomet Manufacturing, Llc Patient-specific acetabular guide and method
US8956364B2 (en) 2011-04-29 2015-02-17 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US8992538B2 (en) 2008-09-30 2015-03-31 DePuy Synthes Products, Inc. Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US9005297B2 (en) 2006-02-27 2015-04-14 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9066727B2 (en) 2010-03-04 2015-06-30 Materialise Nv Patient-specific computed tomography guides
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
EP3202366A1 (en) * 2016-02-05 2017-08-09 Vatech Co. Ltd. Method, apparatus, and computer program for scanning an object in three dimensions using color dashed line pattern
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US10034753B2 (en) 2015-10-22 2018-07-31 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic instruments for component placement in a total hip arthroplasty
EP2303192B1 (en) * 2008-04-16 2018-11-14 Biomet Manufacturing, LLC Method for manufacturing an implant
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
WO2020145945A1 (en) * 2019-01-08 2020-07-16 Hewlett-Packard Development Company, L.P. Simulation-based capture system adjustments
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US11051829B2 (en) 2018-06-26 2021-07-06 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
US11368667B2 (en) 2009-06-17 2022-06-21 3Shape A/S Intraoral scanning apparatus
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
KR20230053738A (en) * 2021-10-14 2023-04-24 주식회사 메디트 Method, apparatus and recording medium storing commands for aligning scanned images of 3d scanner
US11701208B2 (en) 2014-02-07 2023-07-18 3Shape A/S Detecting tooth shade

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2001251606A1 (en) * 2000-04-28 2001-11-12 Orametirix, Inc. Method and system for scanning a surface and generating a three-dimensional object
DE102004037464A1 (en) * 2004-07-30 2006-03-23 Heraeus Kulzer Gmbh Arrangement for imaging surface structures of three-dimensional objects
DE102007001684B4 (en) * 2007-01-11 2023-01-26 Sicat Gmbh & Co. Kg Image Registration
US20090061381A1 (en) * 2007-09-05 2009-03-05 Duane Milford Durbin Systems and methods for 3D previewing
US20090232388A1 (en) * 2008-03-12 2009-09-17 Harris Corporation Registration of 3d point cloud data by creation of filtered density images
US8243289B2 (en) * 2009-05-29 2012-08-14 Perceptron, Inc. System and method for dynamic windowing
DE102012214467B4 (en) * 2012-08-14 2019-08-08 Sirona Dental Systems Gmbh Method for registering individual three-dimensional optical recordings of a dental object
ES2593800T3 (en) * 2012-10-31 2016-12-13 Vitronic Dr.-Ing. Stein Bildverarbeitungssysteme Gmbh Procedure and light pattern to measure the height or course of the height of an object
DE102012220048B4 (en) * 2012-11-02 2018-09-20 Sirona Dental Systems Gmbh Calibration device and method for calibrating a dental camera
KR101416985B1 (en) 2012-12-15 2014-08-14 주식회사 디오에프연구소 An auxiliary scan table for scanning 3D articulator dental model
KR101617738B1 (en) 2015-05-19 2016-05-04 모젼스랩(주) Real-time image mapping system and method for multi-object
KR102441485B1 (en) * 2020-05-29 2022-09-07 주식회사 메디트 METHOD AND APPARATUS FOR OBTAINING THREE Dimensional Data AND COMPUTER READABLE MEDIUM STORING A PROGRAM FOR PERFORMING THE SAME METHOD

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5256558A (en) * 1975-11-04 1977-05-10 Nippon Telegr & Teleph Corp <Ntt> Three-dimentional object measuring system
JPS5368267A (en) * 1976-11-30 1978-06-17 Nippon Telegr & Teleph Corp <Ntt> Object shape identifying system
JP2517062B2 (en) * 1988-04-26 1996-07-24 三菱電機株式会社 3D measuring device
JPH02110305A (en) * 1988-10-19 1990-04-23 Mitsubishi Electric Corp Three-dimensional measuring device
JPH03293507A (en) * 1990-04-11 1991-12-25 Nippondenso Co Ltd Three-dimensional shape measuring apparatus
SE469158B (en) * 1991-11-01 1993-05-24 Nobelpharma Ab DENTAL SENSOR DEVICE INTENDED TO BE USED IN CONNECTION WITH CONTROL OF A WORKING EQUIPMENT
JPH0666527A (en) * 1992-08-20 1994-03-08 Sharp Corp Three-dimensional measurement method
US5568384A (en) * 1992-10-13 1996-10-22 Mayo Foundation For Medical Education And Research Biomedical imaging and analysis
AT404638B (en) * 1993-01-28 1999-01-25 Oesterr Forsch Seibersdorf METHOD AND DEVICE FOR THREE-DIMENSIONAL MEASUREMENT OF THE SURFACE OF OBJECTS
DE4415834C2 (en) * 1994-05-05 2000-12-21 Breuckmann Gmbh Device for measuring distances and spatial coordinates
US5999840A (en) * 1994-09-01 1999-12-07 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets
DE19636354A1 (en) * 1996-09-02 1998-03-05 Ruedger Dipl Ing Rubbert Method and device for performing optical recordings
DE19638727A1 (en) * 1996-09-12 1998-03-19 Ruedger Dipl Ing Rubbert Method for increasing the significance of the three-dimensional measurement of objects
JPH10170239A (en) * 1996-10-08 1998-06-26 Matsushita Electric Ind Co Ltd Three-dimensional shape measuring device
JP3121301B2 (en) * 1997-10-28 2000-12-25 得丸 正博 Artificial tooth manufacturing system and method
DE19821611A1 (en) * 1998-05-14 1999-11-18 Syrinx Med Tech Gmbh Recording method for spatial structure of three-dimensional surface, e.g. for person recognition
JP2001204757A (en) * 2000-01-31 2001-07-31 Ecchandesu:Kk Artificial eyeball
AU2001251606A1 (en) * 2000-04-28 2001-11-12 Orametirix, Inc. Method and system for scanning a surface and generating a three-dimensional object
JP2008276743A (en) * 2000-04-28 2008-11-13 Orametrix Inc Method and system for scanning surface and preparing three-dimensional object
JP2001349713A (en) * 2000-06-06 2001-12-21 Asahi Hightech Kk Three-dimensional shape measuring device
JP2001356010A (en) * 2000-06-12 2001-12-26 Asahi Hightech Kk Three-dimensional shape measuring apparatus

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4508452A (en) * 1975-08-27 1985-04-02 Robotic Vision Systems, Inc. Arrangement for sensing the characteristics of a surface and determining the position of points thereon
US6211506B1 (en) * 1979-04-30 2001-04-03 Diffracto, Ltd. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
US4286852A (en) * 1979-05-23 1981-09-01 Solid Photography, Inc. Recording images of a three-dimensional surface by focusing on a plane of light irradiating the surface
US4294544A (en) * 1979-08-03 1981-10-13 Altschuler Bruce R Topographic comparator
US4616121A (en) * 1982-11-01 1986-10-07 National Research Development Corporation Automatic welding with imaging of workpiece surfaces and of the junction between the surfaces
US4745469A (en) * 1987-02-18 1988-05-17 Perceptron, Inc. Vehicle wheel alignment apparatus and method
US5028799A (en) * 1988-08-01 1991-07-02 Robotic Vision System, Inc. Method and apparatus for three dimensional object surface determination using co-planar data from multiple sensors
US4935635A (en) * 1988-12-09 1990-06-19 Harra Dale G O System for measuring objects in three dimensions
US5098426A (en) * 1989-02-06 1992-03-24 Phoenix Laser Systems, Inc. Method and apparatus for precision laser surgery
US5243665A (en) * 1990-03-07 1993-09-07 Fmc Corporation Component surface distortion evaluation apparatus and method
US5347454A (en) * 1990-04-10 1994-09-13 Mushabac David R Method, system and mold assembly for use in preparing a dental restoration
USRE35816E (en) * 1990-10-15 1998-06-02 Image Guided Technologies Inc. Method and apparatus for three-dimensional non-contact shape sensing
US5131844A (en) * 1991-04-08 1992-07-21 Foster-Miller, Inc. Contact digitizer, particularly for dental applications
US5214686A (en) * 1991-12-13 1993-05-25 Wake Forest University Three-dimensional panoramic dental radiography method and apparatus which avoids the subject's spine
US5724435A (en) * 1994-04-15 1998-03-03 Hewlett Packard Company Digital filter and method of tracking a structure extending in three spatial dimensions
US5513276A (en) * 1994-06-02 1996-04-30 The Board Of Regents Of The University Of Oklahoma Apparatus and method for three-dimensional perspective imaging of objects
US5880961A (en) * 1994-08-02 1999-03-09 Crump; Craig D. Appararus and method for creating three-dimensional modeling data from an object
US6205716B1 (en) * 1995-12-04 2001-03-27 Diane P. Peltz Modular video conference enclosure
US5985495A (en) * 1996-03-25 1999-11-16 Nikon Corporation Methods for measuring image-formation characteristics of a projection-optical system
US5988862A (en) * 1996-04-24 1999-11-23 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three dimensional objects
US5823778A (en) * 1996-06-14 1998-10-20 The United States Of America As Represented By The Secretary Of The Air Force Imaging method for fabricating dental devices
US5991437A (en) * 1996-07-12 1999-11-23 Real-Time Geometry Corporation Modular digital audio system having individualized functional modules
US6088695A (en) * 1996-09-17 2000-07-11 Kara; Salim G. System and method for communicating medical records using bar coding
US6167151A (en) * 1996-12-15 2000-12-26 Cognitens, Ltd. Apparatus and method for 3-dimensional surface geometry reconstruction
US6217334B1 (en) * 1997-01-28 2001-04-17 Iris Development Corporation Dental scanning method and apparatus
US5886775A (en) * 1997-03-12 1999-03-23 M+Ind Noncontact digitizing imaging system
US5848115A (en) * 1997-05-02 1998-12-08 General Electric Company Computed tomography metrology
US6100893A (en) * 1997-05-23 2000-08-08 Light Sciences Limited Partnership Constructing solid models using implicit functions defining connectivity relationships among layers of an object to be modeled
US6253164B1 (en) * 1997-12-24 2001-06-26 Silicon Graphics, Inc. Curves and surfaces modeling based on a cloud of points
US6201546B1 (en) * 1998-05-29 2001-03-13 Point Cloud, Inc. Systems and methods for generating three dimensional, textured models
US6124934A (en) * 1999-01-08 2000-09-26 Shahar; Arie High-accuracy high-stability method and apparatus for measuring distance from surface to reference plane
US6139499A (en) * 1999-02-22 2000-10-31 Wilk; Peter J. Ultrasonic medical system and associated method
US6227850B1 (en) * 1999-05-13 2001-05-08 Align Technology, Inc. Teeth viewing system

Cited By (155)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6854973B2 (en) 2002-03-14 2005-02-15 Orametrix, Inc. Method of wet-field scanning
WO2004037111A1 (en) * 2002-10-24 2004-05-06 Jemtab Systems Device for determining positions for one or more teeth in dental prosthesis arrangement
US9480490B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific guides
US10426492B2 (en) 2006-02-27 2019-10-01 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US10743937B2 (en) 2006-02-27 2020-08-18 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9480580B2 (en) 2006-02-27 2016-11-01 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US9662216B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific hip joint devices
US10603179B2 (en) 2006-02-27 2020-03-31 Biomet Manufacturing, Llc Patient-specific augments
US10507029B2 (en) 2006-02-27 2019-12-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9005297B2 (en) 2006-02-27 2015-04-14 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9539013B2 (en) 2006-02-27 2017-01-10 Biomet Manufacturing, Llc Patient-specific elbow guides and associated methods
US9522010B2 (en) 2006-02-27 2016-12-20 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US8864769B2 (en) 2006-02-27 2014-10-21 Biomet Manufacturing, Llc Alignment guides with patient-specific anchoring elements
US11534313B2 (en) 2006-02-27 2022-12-27 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US10390845B2 (en) 2006-02-27 2019-08-27 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9662127B2 (en) 2006-02-27 2017-05-30 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9913734B2 (en) 2006-02-27 2018-03-13 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US8900244B2 (en) 2006-02-27 2014-12-02 Biomet Manufacturing, Llc Patient-specific acetabular guide and method
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US10206695B2 (en) 2006-02-27 2019-02-19 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US9700329B2 (en) 2006-02-27 2017-07-11 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9993344B2 (en) 2006-06-09 2018-06-12 Biomet Manufacturing, Llc Patient-modified implant
US11576689B2 (en) 2006-06-09 2023-02-14 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US10206697B2 (en) 2006-06-09 2019-02-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9861387B2 (en) 2006-06-09 2018-01-09 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US8979936B2 (en) 2006-06-09 2015-03-17 Biomet Manufacturing, Llc Patient-modified implant
US10893879B2 (en) 2006-06-09 2021-01-19 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US8858561B2 (en) 2006-06-09 2014-10-14 Biomet Manufacturing, LLC Patient-specific alignment guide
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US11554019B2 (en) 2007-04-17 2023-01-17 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US8265949B2 (en) 2007-09-27 2012-09-11 Depuy Products, Inc. Customized patient surgical plan
US8357166B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Customized patient-specific instrumentation and method for performing a bone re-cut
US8357111B2 (en) 2007-09-30 2013-01-22 Depuy Products, Inc. Method and system for designing patient-specific orthopaedic surgical instruments
US10028750B2 (en) 2007-09-30 2018-07-24 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8377068B2 (en) 2007-09-30 2013-02-19 DePuy Synthes Products, LLC. Customized patient-specific instrumentation for use in orthopaedic surgical procedures
US11931049B2 (en) 2007-09-30 2024-03-19 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US11696768B2 (en) 2007-09-30 2023-07-11 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8361076B2 (en) 2007-09-30 2013-01-29 Depuy Products, Inc. Patient-customizable device and system for performing an orthopaedic surgical procedure
US10828046B2 (en) 2007-09-30 2020-11-10 DePuy Synthes Products, Inc. Apparatus and method for fabricating a customized patient-specific orthopaedic instrument
US8398645B2 (en) 2007-09-30 2013-03-19 DePuy Synthes Products, LLC Femoral tibial customized patient-specific orthopaedic surgical instrumentation
US8343159B2 (en) 2007-09-30 2013-01-01 Depuy Products, Inc. Orthopaedic bone saw and method of use thereof
US8738165B2 (en) 2007-12-21 2014-05-27 3M Innovative Properties Company Methods of preparing a virtual dentition model and fabricating a dental retainer therefrom
EP2303192B1 (en) * 2008-04-16 2018-11-14 Biomet Manufacturing, LLC Method for manufacturing an implant
EP3205293A1 (en) * 2008-04-16 2017-08-16 Biomet Manufacturing, LLC Patient-modified implant and manufacturing method for such an implant
WO2009129067A1 (en) * 2008-04-16 2009-10-22 Biomet Manufacturing Corp. Patient-modified implant and manufacuring method for such an implant
US10159498B2 (en) 2008-04-16 2018-12-25 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9492182B2 (en) 2008-09-30 2016-11-15 DePuy Synthes Products, Inc. Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US8992538B2 (en) 2008-09-30 2015-03-31 DePuy Synthes Products, Inc. Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US11671582B2 (en) 2009-06-17 2023-06-06 3Shape A/S Intraoral scanning apparatus
US11368667B2 (en) 2009-06-17 2022-06-21 3Shape A/S Intraoral scanning apparatus
US11831815B2 (en) 2009-06-17 2023-11-28 3Shape A/S Intraoral scanning apparatus
US11539937B2 (en) 2009-06-17 2022-12-27 3Shape A/S Intraoral scanning apparatus
US11622102B2 (en) 2009-06-17 2023-04-04 3Shape A/S Intraoral scanning apparatus
US9393028B2 (en) 2009-08-13 2016-07-19 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US10052110B2 (en) 2009-08-13 2018-08-21 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US9839433B2 (en) 2009-08-13 2017-12-12 Biomet Manufacturing, Llc Device for the resection of bones, method for producing such a device, endoprosthesis suited for this purpose and method for producing such an endoprosthesis
US11324522B2 (en) 2009-10-01 2022-05-10 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9456833B2 (en) 2010-02-26 2016-10-04 Biomet Sports Medicine, Llc Patient-specific osteotomy devices and methods
US9579112B2 (en) 2010-03-04 2017-02-28 Materialise N.V. Patient-specific computed tomography guides
US9066727B2 (en) 2010-03-04 2015-06-30 Materialise Nv Patient-specific computed tomography guides
US10893876B2 (en) 2010-03-05 2021-01-19 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9168048B2 (en) 2010-08-12 2015-10-27 DePuy Synthes Products, Inc. Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US8808302B2 (en) 2010-08-12 2014-08-19 DePuy Synthes Products, LLC Customized patient-specific acetabular orthopaedic surgical instrument and method of use and fabrication
US10098648B2 (en) 2010-09-29 2018-10-16 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US9271744B2 (en) 2010-09-29 2016-03-01 Biomet Manufacturing, Llc Patient-specific guide for partial acetabular socket replacement
US11234719B2 (en) 2010-11-03 2022-02-01 Biomet Manufacturing, Llc Patient-specific shoulder guide
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9445907B2 (en) 2011-03-07 2016-09-20 Biomet Manufacturing, Llc Patient-specific tools and implants
US9743935B2 (en) 2011-03-07 2017-08-29 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9241745B2 (en) 2011-03-07 2016-01-26 Biomet Manufacturing, Llc Patient-specific femoral version guide
US9717510B2 (en) 2011-04-15 2017-08-01 Biomet Manufacturing, Llc Patient-specific numerically controlled instrument
US10251690B2 (en) 2011-04-19 2019-04-09 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US9675400B2 (en) 2011-04-19 2017-06-13 Biomet Manufacturing, Llc Patient-specific fracture fixation instrumentation and method
US8956364B2 (en) 2011-04-29 2015-02-17 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US9474539B2 (en) 2011-04-29 2016-10-25 Biomet Manufacturing, Llc Patient-specific convertible guides
US9743940B2 (en) 2011-04-29 2017-08-29 Biomet Manufacturing, Llc Patient-specific partial knee guides and other instruments
US8903530B2 (en) 2011-06-06 2014-12-02 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9757238B2 (en) 2011-06-06 2017-09-12 Biomet Manufacturing, Llc Pre-operative planning and manufacturing method for orthopedic procedure
US9687261B2 (en) 2011-06-13 2017-06-27 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US9084618B2 (en) 2011-06-13 2015-07-21 Biomet Manufacturing, Llc Drill guides for confirming alignment of patient-specific alignment guides
US10492798B2 (en) 2011-07-01 2019-12-03 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US11253269B2 (en) 2011-07-01 2022-02-22 Biomet Manufacturing, Llc Backup kit for a patient-specific arthroplasty kit assembly
US9668747B2 (en) 2011-07-01 2017-06-06 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9173666B2 (en) 2011-07-01 2015-11-03 Biomet Manufacturing, Llc Patient-specific-bone-cutting guidance instruments and methods
US9427320B2 (en) 2011-08-04 2016-08-30 Biomet Manufacturing, Llc Patient-specific pelvic implants for acetabular reconstruction
US9066734B2 (en) 2011-08-31 2015-06-30 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9603613B2 (en) 2011-08-31 2017-03-28 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9295497B2 (en) 2011-08-31 2016-03-29 Biomet Manufacturing, Llc Patient-specific sacroiliac and pedicle guides
US9439659B2 (en) 2011-08-31 2016-09-13 Biomet Manufacturing, Llc Patient-specific sacroiliac guides and associated methods
US9386993B2 (en) 2011-09-29 2016-07-12 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US10456205B2 (en) 2011-09-29 2019-10-29 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US11406398B2 (en) 2011-09-29 2022-08-09 Biomet Manufacturing, Llc Patient-specific femoroacetabular impingement instruments and methods
US11419618B2 (en) 2011-10-27 2022-08-23 Biomet Manufacturing, Llc Patient-specific glenoid guides
US9351743B2 (en) 2011-10-27 2016-05-31 Biomet Manufacturing, Llc Patient-specific glenoid guides
US11602360B2 (en) 2011-10-27 2023-03-14 Biomet Manufacturing, Llc Patient specific glenoid guide
US9451973B2 (en) 2011-10-27 2016-09-27 Biomet Manufacturing, Llc Patient specific glenoid guide
US9936962B2 (en) 2011-10-27 2018-04-10 Biomet Manufacturing, Llc Patient specific glenoid guide
US9301812B2 (en) 2011-10-27 2016-04-05 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US9554910B2 (en) 2011-10-27 2017-01-31 Biomet Manufacturing, Llc Patient-specific glenoid guide and implants
US11298188B2 (en) 2011-10-27 2022-04-12 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US10426549B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Methods for patient-specific shoulder arthroplasty
US10426493B2 (en) 2011-10-27 2019-10-01 Biomet Manufacturing, Llc Patient-specific glenoid guides
US10842510B2 (en) 2011-10-27 2020-11-24 Biomet Manufacturing, Llc Patient specific glenoid guide
US9827106B2 (en) 2012-02-02 2017-11-28 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9237950B2 (en) 2012-02-02 2016-01-19 Biomet Manufacturing, Llc Implant with patient-specific porous structure
US9060788B2 (en) 2012-12-11 2015-06-23 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9597201B2 (en) 2012-12-11 2017-03-21 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US9204977B2 (en) 2012-12-11 2015-12-08 Biomet Manufacturing, Llc Patient-specific acetabular guide for anterior approach
US10441298B2 (en) 2013-03-11 2019-10-15 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US11617591B2 (en) 2013-03-11 2023-04-04 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9839438B2 (en) 2013-03-11 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid guide with a reusable guide holder
US9579107B2 (en) 2013-03-12 2017-02-28 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US9700325B2 (en) 2013-03-12 2017-07-11 Biomet Manufacturing, Llc Multi-point fit for patient specific guide
US10426491B2 (en) 2013-03-13 2019-10-01 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US10376270B2 (en) 2013-03-13 2019-08-13 Biomet Manufacturing, Llc Universal acetabular guide and associated hardware
US9498233B2 (en) 2013-03-13 2016-11-22 Biomet Manufacturing, Llc. Universal acetabular guide and associated hardware
US9826981B2 (en) 2013-03-13 2017-11-28 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US11191549B2 (en) 2013-03-13 2021-12-07 Biomet Manufacturing, Llc Tangential fit of patient-specific guides
US9517145B2 (en) 2013-03-15 2016-12-13 Biomet Manufacturing, Llc Guide alignment system and method
US11179165B2 (en) 2013-10-21 2021-11-23 Biomet Manufacturing, Llc Ligament guide registration
US11701208B2 (en) 2014-02-07 2023-07-18 3Shape A/S Detecting tooth shade
US11707347B2 (en) 2014-02-07 2023-07-25 3Shape A/S Detecting tooth shade
US11723759B2 (en) 2014-02-07 2023-08-15 3Shape A/S Detecting tooth shade
US10282488B2 (en) 2014-04-25 2019-05-07 Biomet Manufacturing, Llc HTO guide with optional guided ACL/PCL tunnels
US9408616B2 (en) 2014-05-12 2016-08-09 Biomet Manufacturing, Llc Humeral cut guide
US9839436B2 (en) 2014-06-03 2017-12-12 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9561040B2 (en) 2014-06-03 2017-02-07 Biomet Manufacturing, Llc Patient-specific glenoid depth control
US9833245B2 (en) 2014-09-29 2017-12-05 Biomet Sports Medicine, Llc Tibial tubercule osteotomy
US9826994B2 (en) 2014-09-29 2017-11-28 Biomet Manufacturing, Llc Adjustable glenoid pin insertion guide
US11026699B2 (en) 2014-09-29 2021-06-08 Biomet Manufacturing, Llc Tibial tubercule osteotomy
US10335162B2 (en) 2014-09-29 2019-07-02 Biomet Sports Medicine, Llc Tibial tubercle osteotomy
US9820868B2 (en) 2015-03-30 2017-11-21 Biomet Manufacturing, Llc Method and apparatus for a pin apparatus
US10925622B2 (en) 2015-06-25 2021-02-23 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10568647B2 (en) 2015-06-25 2020-02-25 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10226262B2 (en) 2015-06-25 2019-03-12 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US11801064B2 (en) 2015-06-25 2023-10-31 Biomet Manufacturing, Llc Patient-specific humeral guide designs
US10034753B2 (en) 2015-10-22 2018-07-31 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic instruments for component placement in a total hip arthroplasty
EP3202366A1 (en) * 2016-02-05 2017-08-09 Vatech Co. Ltd. Method, apparatus, and computer program for scanning an object in three dimensions using color dashed line pattern
US20170230636A1 (en) * 2016-02-05 2017-08-10 Vatech Co., Ltd. Scanning an object in three dimensions using color dashed line pattern
US11012678B2 (en) 2016-02-05 2021-05-18 Vatech Co., Ltd. Scanning an object in three dimensions using color dashed line pattern
US20190349570A1 (en) * 2016-02-05 2019-11-14 Vatech Co., Ltd. Scanning an object in three dimensions using color dashed line pattern
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US11950786B2 (en) 2018-06-26 2024-04-09 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
US11051829B2 (en) 2018-06-26 2021-07-06 DePuy Synthes Products, Inc. Customized patient-specific orthopaedic surgical instrument
US11785333B2 (en) 2019-01-08 2023-10-10 Hewlett-Packard Development Company, L.P. Simulation-based capture system adjustments
WO2020145945A1 (en) * 2019-01-08 2020-07-16 Hewlett-Packard Development Company, L.P. Simulation-based capture system adjustments
KR102615021B1 (en) 2021-10-14 2023-12-20 주식회사 메디트 Method, apparatus and recording medium storing commands for aligning scanned images of 3d scanner
KR20230053738A (en) * 2021-10-14 2023-04-24 주식회사 메디트 Method, apparatus and recording medium storing commands for aligning scanned images of 3d scanner

Also Published As

Publication number Publication date
AU2001251606A1 (en) 2001-11-12
EP1287482A1 (en) 2003-03-05
JP5325366B2 (en) 2013-10-23
JP2005230530A (en) 2005-09-02
JP2005201896A (en) 2005-07-28
JP4206213B2 (en) 2009-01-07
EP1287482A4 (en) 2007-07-11
JP5362166B2 (en) 2013-12-11
JP2005214965A (en) 2005-08-11
JP4989848B2 (en) 2012-08-01
JP2003532125A (en) 2003-10-28

Similar Documents

Publication Publication Date Title
US6532299B1 (en) System and method for mapping a surface
WO2001084479A1 (en) Method and system for scanning a surface and generating a three-dimensional object
US11672629B2 (en) Photo realistic rendering of smile image after treatment
CN112739287B (en) Providing a simulated effect of a dental treatment of a patient
US11033368B2 (en) Methods and systems for dental procedures
EP3600130B1 (en) Generating a virtual depiction of an orthodontic treatment of a patient
US7305110B2 (en) Scanning system and calibration method for capturing precise three-dimensional information of objects
US7027642B2 (en) Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects
Vág et al. A novel method for complex three-dimensional evaluation of intraoral scanner accuracy
US6512994B1 (en) Method and apparatus for producing a three-dimensional digital model of an orthodontic patient
EP2258303B1 (en) System for creating an individual three-dimensional virtual tooth model
US7080979B2 (en) Method and workstation for generating virtual tooth models from three-dimensional tooth data
CA2739586C (en) A method of producing dental prosthetic items or making tooth restorations using electronic dental representations
WO2006065955A2 (en) Image based orthodontic treatment methods
JP2008276743A (en) Method and system for scanning surface and preparing three-dimensional object
EP2626036A2 (en) Virtually designing a post and core restoration using a digital 3D shape
US20060127858A1 (en) Producing accurate base for a dental arch model
Zhang et al. Reconstruction-based digital dental occlusion of the partially edentulous dentition
US20160228211A1 (en) Crown assistance device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref country code: JP

Ref document number: 2001 581218

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 2001925005

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2001925005

Country of ref document: EP