CA2524213A1 - Intra-oral imaging system - Google Patents
- Publication number
- CA2524213A1 CA002524213A
- Authority
- CA
- Canada
- Prior art keywords
- image
- sensor
- imaging system
- display
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/24—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Abstract
An intra-oral imaging system (100) has an imaging device (102), a processor (104), and a head mounted display (HMD) (106). The HMD (106) may be worn by an operator (112). The intra-oral imaging system (100) displays a computer-generated image in the HMD (106) which illustrates a tangible object (108) in the operator's view. The computer-generated image may be projected in the field of view of the operator (112).
Description
INTRA-ORAL IMAGING SYSTEM
PRIORITY AND CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit under 35 U.S.C. § 119(e) of co-pending provisional application no. 60/466,549, filed on April 30, 2003, for Digitizing/Imaging System with Head-Mounted Display For Dental Applications, which is incorporated in its entirety herein by reference.
BACKGROUND OF THE INVENTION
Related Field
[0002] The invention relates to three-dimensional imaging of objects. In particular, the invention relates to displaying a three-dimensional image of an intra-oral (in vivo) dental item that may include dentition, prepared dentition, restorations, impression materials and the like.
Description of the Related Art
[0003] Existing intra-oral imaging systems may use a Moiré imaging technique. With Moiré imaging, a three-dimensional ("3D") image of a physical object may be generated by scanning the object with white light. The 3D image may be viewed on a display or video monitor. Operators may evaluate the 3D image only through the display, which may require the operator to look away from the object. In addition, there may be little or no feedback as to whether the image is suitable for its intended purpose.
SUMMARY OF THE INVENTION
[0004] An imaging embodiment projects or displays a computer-generated visual image in a field of view of an operator. The systems, methods, apparatuses, and techniques digitize physical objects, such as dental items. The image may be displayed on and viewed through a head-mounted display ("HMD"), which displays computer-generated images that are easily viewed by the operator. The image also may be displayed on a computer monitor, screen, display, or the like.
[0005] A computer-generated image may correspond to an image of a real-world object. The image may be captured with an imaging device, such as an intra-oral imaging system. The intra-oral imaging embodiment projects structured light toward tissue in an oral cavity so that the light is reflected from a surface of that tissue. The tissue may include a tooth, multiple teeth, a preparation, a restoration or other dentition. The intra-oral imaging embodiment detects the reflected white light and generates a dataset related to characteristics of the tissue.
The dataset is then processed by a controller to generate a visual image. The controller-generated visual image may be displayed on a screen in the HMD. The image may be displayed at a position and/or orientation corresponding to the position and/or orientation of the tissue within the field of view of an operator. The imaging embodiment senses changes in a field of view of an operator, such as by movement of the operator's head, and adjusts the position and/or orientation of the image to correspond with the changes in the field of view of the operator.
[0006] An exemplary intra-oral imaging system includes an imaging device, a processor and a head mounted display. The imaging device may project light towards or onto a surface of the object so that the light is reflected from the object.
The imaging system generates a dataset that represents some or substantially all of the surface characteristics of the object. The imaging system may include a tracking sensor that tracks a position of the imaging system relative to the head-mounted display. The tracking sensor may detect an orientation of the imaging system to provide temporal orientation information. The tracking sensor also may detect a position of the imaging device to provide temporal position information. The orientation information may include data related to various angles of the imaging device relative to a predetermined origin in free space. The position information may include data related to a distance or position measurement of the imaging device relative to a predetermined origin in free space. The orientation information may include data for multiple angles, such as three angles, and the position may include measurements along multiple axes, such as three axes. Accordingly, the tracking sensor may provide information for multiple degrees of freedom, such as the six degrees of freedom described above. The dataset generated by the imaging system may also correspond to a two-dimensional or a three-dimensional representation of the surfaces of an object.
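The six-degree-of-freedom pose described above, three position measurements and three angles, is commonly packed into a single 4x4 homogeneous transform. A minimal sketch of this bookkeeping (the function name and Z-Y-X angle convention are illustrative assumptions, not part of the disclosure):

```python
import math

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a six-degree-of-freedom
    pose: three positions (x, y, z) and three angles in radians
    (roll about X, pitch about Y, yaw about Z), composed in Z-Y-X order."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll); translation in last column.
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]
```

Each tracker update would then yield one such matrix, giving both the position and orientation of the device for that instant.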
[0007] The imaging device may manipulate the properties of white light through Moiré or image encoding, laser triangulation, confocal or coherence tomography, or wave front sensing. The coherence tomography imaging may digitize a surface representation of the object that may be visually occluded.
For example, an imaging device based on coherence tomography may capture an image of the tooth structure behind soft tissues such as the underlying gum tissue, other soft matter such as tartar, food particles, or any other material.
[0008] A processor may receive the dataset from the imaging device. Based on the information contained in the dataset, the processor may generate signals representative of a visual image of the surface of the object. The processor may generate signals substantially simultaneously with the generation of the dataset by the imaging system. The processor also may generate signals in response to receiving the dataset or as the dataset is received. The processor may be coupled to the imaging system through a link that may include wires, cables, radio frequency, infrared, microwave communications and/or some other technology that does not require a physical connection between the processor and imaging system. The processor may be portable and may be worn by the operator.
[0009] The HMD may be fitted or otherwise coupled to the head of an operator. The HMD receives the signals from the processor. Based on the signals received from the processor, the HMD may project the image onto a screen positioned in the field of view of an operator. The HMD may project the image to be seen by one or both eyes of the operator. The HMD may project a single image or a stereoscopic image.
[0010] The HMD may include a HMD position sensor. The position sensor may track the HMD's position relative to a predetermined origin or reference point. The position sensor also may detect an orientation of the HMD to provide HMD orientation information as a function of time. The position sensor may also detect a position of the HMD to provide position information of the HMD as a function of time. The orientation information may include data related to various angles of the HMD relative to the predetermined origin. The position information may include data related to a distance or position measurement of the HMD relative to the predetermined origin. The orientation information may include data for one or more angles and the position may include measurements along one or more axes. Accordingly, the sensor may provide information for at least one or more degrees of freedom. The HMD position sensor may include optical tracking, acoustic tracking, inertial tracking, accelerometer tracking, magnetic field-based tracking and measurement, or any combination thereof.
[0011] The HMD also may include one or more eye tracking sensors that track the limbus or pupil, with video images or infrared emitters and transmitters. The location and/or orientation of the operator's pupil are transmitted at frequent intervals to a processing system such as a computer coupled to an intra-oral probe.
[0012] The intra-oral probe may include a multi-dimensional tracking device such as a 3D tracking device. A 3D location of the probe may be transmitted to a controller to track the orientation and location of the probe. A 3D visualization of an image of the object may be displayed to the operator so that the operator can view the image over at least a portion of the actual object being digitized. The operator may progressively digitize portions of the surface of the object, including various surface patches. Each portion, or patch, may be captured in a sufficiently brief time period to eliminate, or substantially reduce, effects of relative motion between the intra-oral probe and the object.
[0013] Overlapping data between patches and a 3D localization relationship between patches may be determined based on the localization information received from the tracking sensor and the HMD position sensor. In addition, overlap between the digitized image of the object and the operator's eye may also be determined. Simultaneous, or substantially instant, feedback of the 3D image may be transmitted to the HMD to allow the image to be displayed in real-time. The computer-generated image may be displayed localized in the operator's field of view in about the same location as the actual object being digitized. The generated image also may be displayed with scaling and orientation factors corresponding to the actual object being digitized. Gaps in the imaged surface, as well as crucial features, may be enhanced to alert the operator to potential issues. Triangulation shadowing and other issues may be communicated to the operator in a visual and/or intuitive way. The intra-oral imaging system may provide substantially instant and direct feedback to an operator regarding the object being imaged.
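The patch-combining step above can be illustrated under the assumption that each patch arrives with the tracker pose at capture time as a 4x4 matrix; mapping every patch into one shared frame is what makes overlap between patches detectable. The names `transform_points` and `merge_patches` are hypothetical:

```python
def transform_points(pose, points):
    """Map 3D points from a patch's local frame into the common (tracker)
    frame using the 4x4 pose matrix reported for that patch."""
    out = []
    for px, py, pz in points:
        v = (px, py, pz, 1.0)
        out.append(tuple(sum(pose[r][c] * v[c] for c in range(4))
                         for r in range(3)))
    return out

def merge_patches(patches):
    """Each patch is (pose_matrix, local_points); returns one point cloud
    in the shared frame, the basis for determining overlap between patches."""
    cloud = []
    for pose, pts in patches:
        cloud.extend(transform_points(pose, pts))
    return cloud
```

A real system would follow this with overlap detection and surface fusion; the sketch shows only the localization step the paragraph describes.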
[0014] Other systems, methods, features and advantages of the invention will be, or will become, apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the following claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The invention can be better understood with reference to the following drawings and description. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Moreover, in the figures, like referenced numerals designate corresponding parts throughout the different views.
[0016] Figure 1 illustrates an example of the intra-oral digitizing embodiment.
[0017] Figure 2 illustrates an operator wearing a head mounted display.
[0018] Figure 3 illustrates a side view of the operator wearing the head mounted display.
DETAILED DESCRIPTION OF THE INVENTION
[0019] Figure 1 illustrates an exemplary intra-oral imaging system 100 having an imaging device 102, a processor 104, and a head mounted display (HMD) 106. The HMD may be worn by an operator 112 of the intra-oral imaging system 100. The intra-oral imaging system 100 displays a computer-generated image in the HMD 106. The computer-generated image may illustrate a tangible object 108 in an operator's view. The object 108 may be intra-oral tissue, such as all or portions of a tooth, multiple teeth, a preparation, a restoration, or any other dentition or combination. The computer-generated image may be projected in the field of view of the operator 112.
[0020] The imaging device 102 may capture an image of the object 108. The imaging device 102 may be an intra-oral imaging device, such as the Laser Digitizer System For Dental Application disclosed in co-owned application no. , referenced by attorney docket number 12075/37, filed on March 19, 2004, the disclosure of which is incorporated by reference in its entirety.
The imaging device 102 also may be an intra-oral imaging device, such as the Laser Digitizer System For Dental Application disclosed in co-owned application no. 10/749,579, filed on December 30, 2003, the disclosure of which is also incorporated by reference in its entirety. The imaging device 102 projects structured light towards the object 108 so that the light is reflected therefrom. The imaging device 102 scans a surface of the object with the structured light so that the reflected structured light may be detected. The imaging device 102 detects the reflected light from the object 108. Based on the detected light, the imaging device 102 generates a dataset related to surface characteristics of an object. The imaging device may include a processor and memory devices that generate a dataset. The dataset may relate to a two-dimensional image of the object 108, the scanned surface of the object, or one or more portions thereof. The dataset also may relate to a three-dimensional image of the object, a scanned surface of the object, or one or more portions thereof.
[0021] The imaging device 102 may generate the dataset based on many white light projection techniques, such as Moiré or laser triangulation. The imaging device 102 may generate the dataset based on image encoding such as light intensity or wavelength encoding. The imaging device 102 also may generate the dataset based on laser triangulation, confocal or coherence tomography, wave front sensing, or any other technique.
[0022] In an embodiment based on coherence tomography, the dataset generated by the imaging device 102 includes data related to a surface of the object 108 that may be visually occluded behind other surfaces or materials. For example, the imaging device 102 based on coherence tomography may generate a dataset that includes information related to a surface of the tooth structure behind soft tissues such as the underlying gum tissue, or other soft matter such as tartar, food particles, and/or any other materials.
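One of the techniques named above, laser triangulation, recovers depth from a known emitter-detector baseline and two measured angles: the illuminated surface point closes a triangle with the baseline, so the law of sines yields its range and hence its depth. A hedged sketch (the angle convention, measured from the baseline, is an assumption):

```python
import math

def triangulate_depth(baseline_mm, emit_angle_deg, detect_angle_deg):
    """Laser triangulation: emitter and detector sit a known baseline
    apart; each observes the illuminated point at a known angle measured
    from the baseline. Law of sines gives the emitter-to-point range,
    whose vertical component is the perpendicular depth."""
    a = math.radians(emit_angle_deg)
    b = math.radians(detect_angle_deg)
    range_mm = baseline_mm * math.sin(b) / math.sin(a + b)
    return range_mm * math.sin(a)
```

For instance, with a 10 mm baseline and both angles at 45 degrees, the point lies 5 mm above the baseline midpoint.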
[0023] The imaging device 102 may include a tracking sensor 110. The tracking sensor 110 senses the position of the imaging device in free space, for example in three degrees of freedom. The tracking sensor 110 may be a magnetic field sensor, an acoustical tracking sensor, an optical tracking sensor such as a photogrammetry sensor, an active IR marker, a passive IR marker, or any other tracking sensor.
The tracking sensor 110 may include one or more sensors positioned on the imaging device 102. An example of a tracking sensor 110 includes the Liberty Electromagnetic tracking system, by Polhemus of Colchester, VT, which may produce a data stream of at least 100 updates per second, where each update includes information concerning the location in a multi-dimensional space of each of a number of sensors placed on the imaging device 102. By tracking the position coordinates of each of the sensors placed on the imaging device 102, the imaging device 102 may be sensed in six degrees of freedom. The six degrees of freedom may specify the position and orientation of the imaging device 102 for each update period.
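A stream of this kind, on the order of 100 updates per second, is typically filtered before the pose is used to place the displayed image; otherwise sensor jitter makes the overlay shimmer. A minimal exponential-smoothing sketch over position updates (the smoothing factor is illustrative, and this is not a technique the disclosure itself specifies):

```python
def smooth_stream(updates, alpha=0.5):
    """Exponentially smooth a stream of (x, y, z) position updates,
    as one might when consuming ~100 tracker updates per second.
    alpha near 1.0 trusts new samples; near 0.0 damps them heavily."""
    sx = sy = sz = None
    out = []
    for x, y, z in updates:
        if sx is None:
            sx, sy, sz = x, y, z          # first sample seeds the filter
        else:
            sx = alpha * x + (1 - alpha) * sx
            sy = alpha * y + (1 - alpha) * sy
            sz = alpha * z + (1 - alpha) * sz
        out.append((sx, sy, sz))
    return out
```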
[0024] A processor 104 may be coupled to the imaging device 102. The processor 104 may be a component of or a unitary part of the imaging device 102.
The processor 104 and the imaging device 102 may be coupled through a data link including wires, cables, radio frequency, infrared, microwave communications, or other wireless links. The processor may also include communications devices that provide a wireless communication protocol, such as wireless TCP/IP, for transmission of bidirectional data. The processor 104 may be portable and may be worn around any portion of an operator or carried by the operator.
[0025] The processor 104 may receive datasets from the imaging device 102.
Based on the dataset, the processor 104 may generate image signals. The image signal may be characterized as a digital or logic signal, or an analog signal. The processor 104 generates the image signal based on the captured images from the imaging device 102. The image signal represents a computer-generated image, or visual representation, of a captured image of the object 108, an image of the surface of the object 108, or a portion thereof.
[0026] In one embodiment, the processor 104 may generate the image signal in response to, and substantially simultaneously with, the generation of the dataset by the imaging system 102. The processor 104 also may generate the image signals when receiving the dataset.
[0027] The processor 104 also may receive tracking information from a tracking sensor 110. Based on the information received from the tracking sensor 110, the processor 104 may align or calibrate a projected image of the object with a captured image of the object 108. The processor 104 may include a wireless transmitter and antenna 28 for wireless connectivity to an open or private network or to a remote computer or terminal.
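Alignment of this kind reduces to composing rigid transforms: the object's pose, known in the imaging device's frame via the tracking sensor, is re-expressed in the HMD's frame. A sketch assuming all poses are 4x4 rigid transforms in a shared world frame (function names are illustrative):

```python
def invert_rigid(m):
    """Invert a 4x4 rigid (rotation + translation) transform:
    R^-1 = R^T, t^-1 = -R^T t."""
    inv = [[0.0] * 4 for _ in range(4)]
    for i in range(3):
        for j in range(3):
            inv[i][j] = m[j][i]
        inv[i][3] = -sum(m[j][i] * m[j][3] for j in range(3))
    inv[3][3] = 1.0
    return inv

def matmul4(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def object_in_hmd_frame(hmd_pose, device_pose, object_in_device):
    """Express the scanned object's pose in the HMD's frame so the
    rendered image can be aligned with the real object:
    T_hmd_obj = T_world_hmd^-1 . T_world_dev . T_dev_obj."""
    return matmul4(matmul4(invert_rigid(hmd_pose), device_pose),
                   object_in_device)
```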
[0028] The HMD 106 is coupled to the processor 104 to receive the image signal generated by the processor 104. The HMD 106 may be coupled to the processor through a data link including wires, cables, radio frequency, infrared, microwave communications or other wireless links. The processor 104 also may be a unitary part of the HMD 106.
[0029] The HMD 106 receives the image signals from the processor 104. Based on the image signals, the HMD 106 may display a controller-generated image to the operator 112. The HMD 106 may use an image display system positioned in the line of sight of the operator 112. Alternatively, the display system may project the controller-generated image in a field of view of the operator 112. An example of such an image display is the Nomad display sold by Microvision Inc. of Bothell, WA. The image may include detailed information about the image capture process, including a visualization of the object 108 or a portion thereof. The information also may include analysis of the dataset.
[0030] Figure 2 illustrates an example of the HMD 106 worn by an operator 112. The HMD 106 includes a screen 116 that may display the controller-generated image. The screen 116 may include transparent, or semi-transparent, material that reflects or directs the controller-generated image towards the operator 112. The HMD 106 may be positioned so that the operator 112 can view images displayed on the screen 116. The image may be projected on the screen 116 in the field of view of the operator 112. The image may be projected on the screen in a position and orientation that overlays the object 108 within the field of view of the operator 112. By projecting the image onto the screen 116, the operator's view may be augmented or enhanced. The image may also include graphics, data, and textual information.
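Overlaying the image on the object amounts to projecting the object's 3D position, expressed in the eye/HMD frame, onto the display surface. A pinhole-model sketch, with the focal length and principal point as assumed calibration values rather than anything the disclosure specifies:

```python
def project_to_screen(point_hmd, focal_px, cx, cy):
    """Project a 3D point (in the HMD/eye frame, z pointing forward)
    onto the display with a pinhole model, giving the pixel at which
    the overlay must be drawn to appear on top of the real object."""
    x, y, z = point_hmd
    if z <= 0:
        return None  # behind the viewer; nothing to draw
    return (cx + focal_px * x / z, cy + focal_px * y / z)
```

A point straight ahead lands on the principal point; points off-axis shift in proportion to x/z and y/z, which is why the overlay must be re-projected whenever the head pose changes.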
[0031] In a second embodiment, a headband 114 is used to position the HMD 106 on the operator's head so that the screen 116 is in the field of view of the operator 112. The screen may be positioned in front of, or before, at least one of the operator's eyes. The processor 104 also may be affixed to the headband 114. In one embodiment with the processor 104 coupled to the headband 114, the headband 114 may provide a channel for routing wires between the processor 104 and the HMD 106.
[0032] In a third embodiment, the intra-oral imaging system 100 includes an eye tracking sensor 118. Figure 3 illustrates a side view of the HMD 106 worn by an operator 112 having an eye tracking sensor 118. The eye tracking sensor 118 may be affixed to the HMD 106. The eye tracking sensor 118 may be coupled to or a unitary part of the HMD 106.
[0033] The eye tracking sensor 118 may track or detect movement, location, or orientation of the operator's eye 122. By tracking the operator's eye 122, the eye tracking sensor may provide feedback on the operator's line of vision. The eye tracking sensor 118 may also detect the operator's line of vision with respect to an object 108 or with respect to the operator's environment. The eye tracking sensor 118 provides a signal to the processor 104 corresponding to the operator's line of sight. The processor 104 receives the signal from the eye tracking sensor 118 and may store the position and view of the eye 122 relative to the image displayed on the screen 116. The processor may also store the position and view of the eye 122 relative to the actual scene. Alternatively, the eye tracking sensor 118 may register the operator's line of sight with respect to the screen 116.
[0034] The eye tracking sensor 118 may track various areas of the eye 122 such as the limbus, the cornea, retina, pupil, sclera, fovea, lens, iris, or other parts of the eye 122. In one embodiment, the eye tracking sensor 118 employs a video camera to track the eye 122. In another embodiment, the eye tracking sensor may use infrared emitters and detectors to track the eye 122. Location and orientation parameters are provided to the processor 104 at predetermined frequent intervals to provide substantially real-time feedback to the processor 104. An example of an eye and head tracking system that measures eye movement and point-of-regard data substantially in real time is the VisionTrak head mounted eye tracking system sold by Polhemus of Colchester, VT.
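One common way an infrared eye tracker of the kind described in paragraph [0034] turns raw sensor measurements into a point of regard is pupil-center/corneal-reflection (PCCR) mapping: the offset between the pupil center and the corneal glint is mapped to screen coordinates via a polynomial fitted during calibration. The sketch below is a minimal illustration of that technique under stated assumptions; the function names, the second-order polynomial model, and the screen geometry are illustrative choices, not taken from the patent.

```python
import numpy as np

def fit_gaze_map(pupil_glint_vectors, screen_points):
    """Fit a 2nd-order polynomial map from pupil-glint offset vectors
    (x, y) to screen coordinates, by least squares over calibration targets."""
    v = np.asarray(pupil_glint_vectors, dtype=float)
    x, y = v[:, 0], v[:, 1]
    # Design matrix with basis [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float), rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def map_gaze(coeffs, vec):
    """Map one pupil-glint offset vector to a screen point."""
    x, y = vec
    basis = np.array([1.0, x, y, x * y, x**2, y**2])
    return basis @ coeffs

# Calibration: the operator fixates a 3x3 grid of known on-screen targets
# (here generated from an exactly linear mapping, for illustration).
calib_vecs = [(vx, vy) for vy in (-1, 0, 1) for vx in (-1, 0, 1)]
calib_pts = [(vx * 400 + 512, vy * 300 + 384) for vx, vy in calib_vecs]
coeffs = fit_gaze_map(calib_vecs, calib_pts)
print(map_gaze(coeffs, (0.5, -0.5)))  # estimated point of regard in pixels
```

In practice the calibration targets would be displayed on the see-through screen 116 and the offset vectors measured by the sensor 118; the fitted map then runs at the sensor's reporting rate to give the processor the near-real-time line-of-sight feedback described above.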
[0035] The HMD also may include one or more position sensors 120. The position sensor 120 may provide a position signal to the processor 104 related to the position, location, and orientation of the operator's head. The position sensor 120 may produce position information in multiple degrees of freedom. The signal provided by the position sensor 120 allows accurate alignment of the projected and captured images. The position sensor may be a magnetic field tracking sensor, an acoustical tracking sensor, or an optical tracking sensor such as a photogrammetry sensor, or active and passive IR markers.
[0036] Based on the signals received from the tracking sensor 110, the eye tracking sensor 118, and the position sensor 120, the processor 104 may determine a spatial relationship among the object 108, the eye 122 of the operator, and the imaging device 102. Scan information of the object 108 may be displayed at a location in the operator's line of sight. The image may be perceived by the operator 112 as an overlay to the object 108.
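The registration step in paragraph [0036] amounts to composing rigid-body transforms: the object's pose (from the tracking sensor) and the head's pose (from the HMD position sensor), both expressed in a common tracker frame, are combined so the object can be located in the display frame and drawn as an overlay. The following sketch illustrates that composition with homogeneous 4x4 matrices; the frame names, pinhole projection, and numeric parameters are assumptions for illustration only, not the patent's method.

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def object_in_head_frame(T_tracker_head, T_tracker_object):
    """Pose of the object relative to the head-mounted display:
    T_head_object = inv(T_tracker_head) @ T_tracker_object."""
    return np.linalg.inv(T_tracker_head) @ T_tracker_object

def project_point(p_head, f=800.0, cx=512.0, cy=384.0):
    """Pinhole projection of a 3D point in the head frame (z forward)
    onto the see-through screen, in pixel coordinates."""
    x, y, z = p_head
    return np.array([f * x / z + cx, f * y / z + cy])

# Example: head 0.4 m behind the tracker origin looking along +z,
# object 0.1 m in front of the origin and 5 cm to the right.
T_th = pose_matrix(np.eye(3), [0.0, 0.0, -0.4])
T_to = pose_matrix(np.eye(3), [0.05, 0.0, 0.1])
T_ho = object_in_head_frame(T_th, T_to)
print(project_point(T_ho[:3, 3]))  # pixel where the overlay is anchored
```

Re-evaluating this composition whenever any sensor reports keeps the overlay registered to the object as the operator's head, eye, and probe move.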
[0037] The imaging system 100 may also include additional tracking devices for tracking movement of the upper and/or lower jaw. Such tracking device(s) may provide additional position information relating the HMD 106 and the object 108.
These tracking sensors also may utilize magnetic field tracking technology, active or passive infrared tracking technology, acoustic tracking technology, optical tracking technology, photogrammetry technology, or any combination thereof.
[0038] Although embodiments of the invention are described in detail, it should be understood that various changes, substitutions and alterations can be made hereto without departing from the spirit and scope of the invention as described by the appended claims. An example of the intra-oral imaging system described above may include a three-dimensional imaging device transmitting modulated laser light from a light source at high frequency for the purpose of reducing coherence of the laser source and reducing speckle. The intra-oral imaging system may focus light onto an area of an object to image a portion of the object. The HMD may include a corrective lens on which the computer-generated image is projected or displayed, where the corrective lens corrects the vision of the operator. The HMD may include a monochromatic or a color display.
[0039] While various embodiments of the invention have been described, it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the invention.
Accordingly, the invention is not to be restricted except in light of the attached claims and their equivalents.
Claims (20)
1. An imaging system comprising:
a three dimensional (3D) imaging device configured to generate a dataset representative of characteristics of at least a portion of a surface of an object;
a processor coupled to the 3D imaging device, the processor being configured to receive the dataset from the 3D imaging device and generate signals representative of a visual image of the surface of the object, the signals being based on the dataset as the dataset is received by the processor; and a display configured to receive the signals representative of the visual image and to display the visual image in a field of view of an operator.
2. The imaging system of claim 1 where the three-dimensional imaging system comprises an intra-oral probe configured to capture a three dimensional image.
3. The imaging system of claim 1 where the display is coupled to a forward-most part of an operator's body and is configured to display the visual image substantially simultaneously as the dataset is generated.
4. The imaging system of claim 1 where the 3D imaging device comprises a tracking sensor configured to generate signals representative of a position of the 3D imaging device relative to an origin.
5. The imaging system of claim 4 where the tracking sensor is further configured to generate signals representative of an orientation of the intra-oral device relative to the origin.
6. The imaging system of claim 5 where the tracking sensor is one of a magnetic field sensor, an acoustical tracking sensor, an optical tracking sensor or any combination thereof.
7. The imaging system of claim 1 where the display comprises at least one position sensor configured to generate signals representative of a position of the display relative to an origin.
8. The imaging system of claim 7 where the at least one position sensor is further configured to generate signals representative of an orientation of the display relative to the origin.
9. The imaging system of claim 8 where the position sensor is one of a magnetic field sensor, an acoustical sensor, an optical sensor or any combination thereof.
10. A head-mounted display, comprising:
a see-through display configured to display a computer-generated image in a field of view of a user, the computer-generated image representing at least a surface characteristic of an object and being displayed in the field of view of the user having an orientation and scale corresponding to a view of the object in the field of view of the user; and a wearable processor configured to generate the computer-generated image based on a dataset generated by an intra-oral imaging system.
11. The head-mounted display of claim 10 further comprising a headband configured to position the see-through display in the field of view of the user.
12. The head-mounted display of claim 10 where the computer-generated image comprises a three-dimensional representation of at least a surface characteristic of the object.
13. The head-mounted display of claim 10 where the wearable processor is mounted with a headband.
14. The head-mounted display of claim 10 where the intra-oral imaging system comprises a three dimensional (3D) intra-oral imaging device configured to scan a surface of a dental item with light and to generate a dataset representative of surface characteristics of the scanned dental item in response to detecting light reflected from the scanned surface.
15. The head-mounted display of claim 14 where the intra-oral imaging system further comprises at least one tracking sensor configured to generate signals representative of a position and an orientation of the intra-oral device relative to an origin and the head-mounted display comprises at least one position sensor configured to generate signals representative of a position of the head-mounted display relative to the origin, the wearable processor being configured to generate the computer-generated image in the field of view of the user based on the signals representative of a position of the intra-oral device and the electrical signals representative of a position of the head-mounted display.
16. The imaging system of claim 15 where the at least one tracking sensor is one of a magnetic field sensor, an acoustical sensor, an optical sensor or a combination thereof and the position sensor is one of a magnetic field sensor, an acoustical tracking sensor, an optical tracking sensor or any combination thereof.
17. A method of displaying a visual image, the method comprising:
projecting structured light towards a surface of an object;
detecting the structured light reflected from the surface of the object;
generating a dataset representative of the surface in response to the detecting of light reflected from the scanned surface of the object;
generating an image of the object based on the dataset;
displaying the image in a field of view of an operator on a see-through screen, the field of view of the operator including a view of the object, the image being displayed at a position and orientation in the field of view of the operator.
18. The method for displaying a visual image of claim 17, where the act of displaying the image further comprises:
detecting the position of the field of view of the operator;
detecting the position of the intra-oral imaging device; and determining a shape and orientation of the image based on detecting the position of the field of view and detecting the position of the intra-oral imaging device.
19. The method for displaying a visual image of claim 17, further comprising:
determining an area of interest in the image; and displaying the area of interest through the display.
20. The method for displaying a visual image of claim 17 where the image comprises a three-dimensional visual representation of at least a portion of the object.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US46654903P | 2003-04-30 | 2003-04-30 | |
US60/466,549 | 2003-04-30 | ||
PCT/US2004/013632 WO2004100067A2 (en) | 2003-04-30 | 2004-04-30 | Intra-oral imaging system |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2524213A1 true CA2524213A1 (en) | 2004-11-18 |
Family
ID=33434954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002524213A Abandoned CA2524213A1 (en) | 2003-04-30 | 2004-04-30 | Intra-oral imaging system |
Country Status (6)
Country | Link |
---|---|
US (1) | US20050020910A1 (en) |
EP (1) | EP1617759A4 (en) |
JP (1) | JP2007528743A (en) |
AU (1) | AU2004237202A1 (en) |
CA (1) | CA2524213A1 (en) |
WO (1) | WO2004100067A2 (en) |
Families Citing this family (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070081123A1 (en) | 2005-10-07 | 2007-04-12 | Lewis Scott W | Digital eyewear |
US11428937B2 (en) * | 2005-10-07 | 2022-08-30 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
US9658473B2 (en) | 2005-10-07 | 2017-05-23 | Percept Technologies Inc | Enhanced optical and perceptual digital eyewear |
US7698014B2 (en) | 2006-01-20 | 2010-04-13 | 3M Innovative Properties Company | Local enforcement of accuracy in fabricated models |
US7840042B2 (en) | 2006-01-20 | 2010-11-23 | 3M Innovative Properties Company | Superposition for visualization of three-dimensional data acquisition |
US8566894B2 (en) | 2006-02-10 | 2013-10-22 | Scott W. Lewis | Method and system for distribution of media |
CN2872361Y (en) * | 2006-02-28 | 2007-02-21 | 南京德朔实业有限公司 | Object detector |
WO2007137059A2 (en) * | 2006-05-17 | 2007-11-29 | Kent Moore | Stereovideoscope and method of using the same |
JP2008194108A (en) * | 2007-02-09 | 2008-08-28 | Shiyoufuu:Kk | Three-dimensional characteristic measuring and displaying apparatus with positional direction detecting function |
WO2009006273A2 (en) * | 2007-06-29 | 2009-01-08 | 3M Innovative Properties Company | Synchronized views of video data and three-dimensional model data |
EP2012170B1 (en) | 2007-07-06 | 2017-02-15 | Harman Becker Automotive Systems GmbH | Head-tracking system and operating method thereof |
DE102007060263A1 (en) * | 2007-08-16 | 2009-02-26 | Steinbichler Optotechnik Gmbh | Scanner for scanning e.g. teeth, in mouth of patient, has image optics arranged at distance to each other, where distance of optics and directions of optical axes are selected such that optics and axes are oriented to common area of tooth |
DE102007043366A1 (en) * | 2007-09-12 | 2009-03-19 | Degudent Gmbh | Method for determining the position of an intraoral measuring device |
JP5135007B2 (en) * | 2008-03-10 | 2013-01-30 | オリンパスメディカルシステムズ株式会社 | Capsule guidance system |
US8878905B2 (en) | 2009-06-17 | 2014-11-04 | 3Shape A/S | Focus scanning apparatus |
US20120056993A1 (en) * | 2010-09-08 | 2012-03-08 | Salman Luqman | Dental Field Visualization System with Improved Ergonomics |
AU2011336647B2 (en) | 2010-12-02 | 2015-02-12 | Ultradent Products, Inc. | System and method of viewing and tracking stereoscopic video images |
US8964008B2 (en) * | 2011-06-17 | 2015-02-24 | Microsoft Technology Licensing, Llc | Volumetric video presentation |
US9069164B2 (en) * | 2011-07-12 | 2015-06-30 | Google Inc. | Methods and systems for a virtual input device |
US8228315B1 (en) * | 2011-07-12 | 2012-07-24 | Google Inc. | Methods and systems for a virtual input device |
JP6351579B2 (en) | 2012-06-01 | 2018-07-04 | ウルトラデント プロダクツ インク. | Stereoscopic video imaging |
JP5903018B2 (en) | 2012-09-26 | 2016-04-13 | ソニー株式会社 | Head mounted display |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US20140272766A1 (en) * | 2013-03-14 | 2014-09-18 | Ormco Corporation | Head mounted display for an intra-oral imaging system |
CN105142498B (en) * | 2013-03-15 | 2022-10-11 | 感知技术有限公司 | Enhanced optical and perceptual digital eyewear |
US11181740B1 (en) | 2013-03-15 | 2021-11-23 | Percept Technologies Inc | Digital eyewear procedures related to dry eyes |
US10010387B2 (en) | 2014-02-07 | 2018-07-03 | 3Shape A/S | Detecting tooth shade |
FR3021518A1 (en) * | 2014-05-27 | 2015-12-04 | Francois Duret | VISUALIZATION DEVICE FOR FACILITATING MEASUREMENT AND 3D DIAGNOSIS BY OPTICAL FOOTPRINT IN DENTISTRY |
US20160178906A1 (en) * | 2014-12-19 | 2016-06-23 | Intel Corporation | Virtual wearables |
WO2016106614A1 (en) * | 2014-12-30 | 2016-07-07 | 深圳市圆梦精密技术研究院 | Wearable visual structure for dental surgery |
DE102015212806A1 (en) | 2015-07-08 | 2017-01-12 | Sirona Dental Systems Gmbh | System and method for scanning anatomical structures and displaying a scan result |
ITUB20155830A1 (en) * | 2015-11-23 | 2017-05-23 | R A W Srl | "NAVIGATION, TRACKING AND GUIDE SYSTEM FOR THE POSITIONING OF OPERATOR INSTRUMENTS" |
CA2958010C (en) | 2016-02-19 | 2021-09-07 | Covidien Lp | System and methods for video-based monitoring of vital signs |
WO2017144934A1 (en) * | 2016-02-26 | 2017-08-31 | Trophy | Guided surgery apparatus and method |
JP6229747B2 (en) * | 2016-03-10 | 2017-11-15 | ソニー株式会社 | Endoscope system and head mounted display |
WO2017222497A1 (en) * | 2016-06-20 | 2017-12-28 | Carestream Health, Inc. | Dental restoration assessment and manufacturing using virtual model |
US11457998B2 (en) | 2016-07-29 | 2022-10-04 | Ivoclar Vivadent Ag | Recording device |
US10299741B2 (en) * | 2016-09-14 | 2019-05-28 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor and state-based operation of an imaging system including a multiple-dimension imaging sensor |
US10213180B2 (en) | 2016-09-14 | 2019-02-26 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with operation based on magnetic field detection |
US10932733B2 (en) | 2016-09-14 | 2021-03-02 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with operation based on movement detection |
US10657726B1 (en) | 2017-10-02 | 2020-05-19 | International Osseointegration Ai Research And Training Center | Mixed reality system and method for determining spatial coordinates of dental instruments |
US10939824B2 (en) | 2017-11-13 | 2021-03-09 | Covidien Lp | Systems and methods for video-based monitoring of a patient |
CA3086527A1 (en) | 2018-01-08 | 2019-07-11 | Covidien Lp | Systems and methods for video-based non-contact tidal volume monitoring |
WO2019240991A1 (en) | 2018-06-15 | 2019-12-19 | Covidien Lp | Systems and methods for video-based patient monitoring during surgery |
EP3833241A1 (en) | 2018-08-09 | 2021-06-16 | Covidien LP | Video-based patient monitoring systems and associated methods for detecting and monitoring breathing |
US11617520B2 (en) | 2018-12-14 | 2023-04-04 | Covidien Lp | Depth sensing visualization modes for non-contact monitoring |
US11357593B2 (en) * | 2019-01-10 | 2022-06-14 | Covidien Lp | Endoscopic imaging with augmented parallax |
US11315275B2 (en) | 2019-01-28 | 2022-04-26 | Covidien Lp | Edge handling methods for associated depth sensing camera devices, systems, and methods |
US11484208B2 (en) | 2020-01-31 | 2022-11-01 | Covidien Lp | Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods |
Family Cites Families (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3003435A1 (en) * | 1980-01-31 | 1981-08-06 | Becker Dental-Labor Gmbh, 5100 Aachen | METHOD AND DEVICE FOR PRODUCING A CROWN PART |
US4575805A (en) * | 1980-12-24 | 1986-03-11 | Moermann Werner H | Method and apparatus for the fabrication of custom-shaped implants |
FR2525103B1 (en) * | 1982-04-14 | 1985-09-27 | Duret Francois | IMPRESSION TAKING DEVICE BY OPTICAL MEANS, PARTICULARLY FOR THE AUTOMATIC PRODUCTION OF PROSTHESES |
FR2536654B1 (en) * | 1982-11-30 | 1987-01-09 | Duret Francois | METHOD FOR PRODUCING A DENTAL PROSTHESIS |
US4663720A (en) * | 1984-02-21 | 1987-05-05 | Francois Duret | Method of and apparatus for making a prosthesis, especially a dental prosthesis |
EP0163076B1 (en) * | 1984-04-17 | 1991-11-13 | Kawasaki Jukogyo Kabushiki Kaisha | Apparatus for producing a three-dimensional copy of an object |
US4798534A (en) * | 1984-08-03 | 1989-01-17 | Great Lakes Orthodontic Laboratories Inc. | Method of making a dental appliance |
CH663891A5 (en) * | 1984-10-24 | 1988-01-29 | Marco Dr Sc Techn Brandestini | DEVICE FOR THE SHAPING PROCESSING OF A BLANK MADE OF DENTAL CERAMIC OR DENTAL COMPOSITE MATERIAL AND METHOD FOR THE OPERATION THEREOF. |
US4936862A (en) * | 1986-05-30 | 1990-06-26 | Walker Peter S | Method of designing and manufacturing a human joint prosthesis |
US4816920A (en) * | 1986-11-18 | 1989-03-28 | General Scanning, Inc. | Planar surface scanning system |
FR2610821B1 (en) * | 1987-02-13 | 1989-06-09 | Hennson Int | METHOD FOR TAKING MEDICAL IMPRESSION AND DEVICE FOR IMPLEMENTING SAME |
US5186623A (en) * | 1987-05-05 | 1993-02-16 | Great Lakes Orthodontics, Ltd. | Orthodontic finishing positioner and method of construction |
US4856991A (en) * | 1987-05-05 | 1989-08-15 | Great Lakes Orthodontics, Ltd. | Orthodontic finishing positioner and method of construction |
US4935635A (en) * | 1988-12-09 | 1990-06-19 | Harra Dale G O | System for measuring objects in three dimensions |
US5011405A (en) * | 1989-01-24 | 1991-04-30 | Dolphin Imaging Systems | Method for determining orthodontic bracket placement |
SE464908B (en) * | 1989-03-23 | 1991-07-01 | Nobelpharma Ab | METHOD FOR MANUFACTURING ARTIFICIAL DENTAL CHRONICLES OF ONLINE TYPE OR INPUT |
US5151044A (en) * | 1989-05-12 | 1992-09-29 | Rotsaert Henri L | Blanks for the manufacture of artificial teeth and crowns |
US5027281A (en) * | 1989-06-09 | 1991-06-25 | Regents Of The University Of Minnesota | Method and apparatus for scanning and recording of coordinates describing three dimensional objects of complex and unique geometry |
US5447432A (en) * | 1990-01-19 | 1995-09-05 | Ormco Corporation | Custom orthodontic archwire forming method and apparatus |
US5395238A (en) * | 1990-01-19 | 1995-03-07 | Ormco Corporation | Method of forming orthodontic brace |
US5533895A (en) * | 1990-01-19 | 1996-07-09 | Ormco Corporation | Orthodontic appliance and group standardized brackets therefor and methods of making, assembling and using appliance to straighten teeth |
US5431562A (en) * | 1990-01-19 | 1995-07-11 | Ormco Corporation | Method and apparatus for designing and forming a custom orthodontic appliance and for the straightening of teeth therewith |
US5139419A (en) * | 1990-01-19 | 1992-08-18 | Ormco Corporation | Method of forming an orthodontic brace |
EP0519915B1 (en) * | 1990-03-13 | 1995-01-11 | Comdent Gmbh | Process and device for measuring the dimensions of a space, in particular a buccal cavity |
US5569578A (en) * | 1990-04-10 | 1996-10-29 | Mushabac; David R. | Method and apparatus for effecting change in shape of pre-existing object |
US5545039A (en) * | 1990-04-10 | 1996-08-13 | Mushabac; David R. | Method and apparatus for preparing tooth or modifying dental restoration |
US5224049A (en) * | 1990-04-10 | 1993-06-29 | Mushabac David R | Method, system and mold assembly for use in preparing a dental prosthesis |
US5347454A (en) * | 1990-04-10 | 1994-09-13 | Mushabac David R | Method, system and mold assembly for use in preparing a dental restoration |
US5452219A (en) * | 1990-06-11 | 1995-09-19 | Dentsply Research & Development Corp. | Method of making a tooth mold |
US5340309A (en) * | 1990-09-06 | 1994-08-23 | Robertson James G | Apparatus and method for recording jaw motion |
ES2080206T3 (en) * | 1990-10-10 | 1996-02-01 | Mikrona Technologie Ag | RAW PIECE FOR THE MANUFACTURE OF A MACHINED PIECE OF DENTAL TECHNIQUE AND CLAMPING DEVICE FOR THE SAME. |
US5198877A (en) * | 1990-10-15 | 1993-03-30 | Pixsys, Inc. | Method and apparatus for three-dimensional non-contact shape sensing |
SE468198B (en) * | 1990-12-12 | 1992-11-23 | Nobelpharma Ab | PROCEDURE AND DEVICE FOR MANUFACTURE OF INDIVIDUALLY DESIGNED THREE-DIMENSIONAL BODIES USEFUL AS TENDERS, PROTESTES, ETC |
SE469158B (en) * | 1991-11-01 | 1993-05-24 | Nobelpharma Ab | DENTAL SENSOR DEVICE INTENDED TO BE USED IN CONNECTION WITH CONTROL OF A WORKING EQUIPMENT |
CH686657A5 (en) * | 1991-11-17 | 1996-05-31 | Liconic Ag | Process for producing a partial replacement of a tooth and means for implementing the method. |
JPH05269146A (en) * | 1992-03-23 | 1993-10-19 | Nikon Corp | Extracting method for margin line at the time of designing crown |
ES2127785T3 (en) * | 1992-04-06 | 1999-05-01 | Elephant Dental Bv | DENTAL PROSTHESIS AND METHOD FOR MAKING A DENTAL PROSTHESIS. |
DE4214876C2 (en) * | 1992-05-05 | 2000-07-06 | Kaltenbach & Voigt | Optical measurement of teeth without a matt surface treatment |
FR2693096B1 (en) * | 1992-07-06 | 1994-09-23 | Deshayes Marie Josephe | Process for modeling the cranial and facial morphology from an x-ray of the skull. |
WO1994010935A1 (en) * | 1992-11-09 | 1994-05-26 | Ormco Corporation | Custom orthodontic appliance forming method and apparatus |
US5526812A (en) * | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
SE501411C2 (en) * | 1993-07-12 | 1995-02-06 | Nobelpharma Ab | Method and apparatus for three-dimensional body useful in the human body |
SE501410C2 (en) * | 1993-07-12 | 1995-02-06 | Nobelpharma Ab | Method and apparatus in connection with the manufacture of tooth, bridge, etc. |
NL9301308A (en) * | 1993-07-26 | 1995-02-16 | Willem Frederick Van Nifterick | Method of securing a dental prosthesis to implants in a patient's jawbone and using means thereof. |
US5382164A (en) * | 1993-07-27 | 1995-01-17 | Stern; Sylvan S. | Method for making dental restorations and the dental restoration made thereby |
US5338198A (en) * | 1993-11-22 | 1994-08-16 | Dacim Laboratory Inc. | Dental modeling simulator |
SE502427C2 (en) * | 1994-02-18 | 1995-10-16 | Nobelpharma Ab | Method and device utilizing articulator and computer equipment |
SE503498C2 (en) * | 1994-10-04 | 1996-06-24 | Nobelpharma Ab | Method and device for a product intended to be part of the human body and a scanning device for a model for the product |
US5549476A (en) * | 1995-03-27 | 1996-08-27 | Stern; Sylvan S. | Method for making dental restorations and the dental restoration made thereby |
JP3727660B2 (en) * | 1995-07-21 | 2005-12-14 | カデント・リミテッド | Method for obtaining a three-dimensional tooth image |
US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
EP0781530B1 (en) * | 1995-12-19 | 2004-09-01 | Ivoclar Vivadent AG | Method for manufacturing tooth crowns and/or dental bridges |
US5725376A (en) * | 1996-02-27 | 1998-03-10 | Poirier; Michel | Methods for manufacturing a dental implant drill guide and a dental implant superstructure |
US6382975B1 (en) * | 1997-02-26 | 2002-05-07 | Technique D'usinage Sinlab Inc. | Manufacturing a dental implant drill guide and a dental implant superstructure |
US6044170A (en) * | 1996-03-21 | 2000-03-28 | Real-Time Geometry Corporation | System and method for rapid shape digitizing and adaptive mesh generation |
US5831719A (en) * | 1996-04-12 | 1998-11-03 | Holometrics, Inc. | Laser scanning system |
US5870220A (en) * | 1996-07-12 | 1999-02-09 | Real-Time Geometry Corporation | Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation |
US5812269A (en) * | 1996-07-29 | 1998-09-22 | General Scanning, Inc. | Triangulation-based 3-D imaging and processing method and system |
JPH1075963A (en) * | 1996-09-06 | 1998-03-24 | Nikon Corp | Method for designing dental prosthetic appliance model and medium recording program for executing the method |
DE19638758A1 (en) * | 1996-09-13 | 1998-03-19 | Rubbert Ruedger | Method and device for three-dimensional measurement of objects |
AUPO280996A0 (en) * | 1996-10-04 | 1996-10-31 | Dentech Investments Pty Ltd | Creation and utilization of 3D teeth models |
AU1198797A (en) * | 1996-12-20 | 1998-07-17 | Elypse | Method for producing a dental prosthesis |
US5813859A (en) * | 1997-01-23 | 1998-09-29 | Hajjar; Victor J. | Method and apparatus for tooth restoration |
US6217334B1 (en) * | 1997-01-28 | 2001-04-17 | Iris Development Corporation | Dental scanning method and apparatus |
US6175415B1 (en) * | 1997-02-19 | 2001-01-16 | United Technologies Corporation | Optical profile sensor |
SE509005C2 (en) * | 1997-02-24 | 1998-11-23 | Dentronic Ab | Method and arrangement for non-contact measurement of the three-dimensional shape of detail objects |
US5879158A (en) * | 1997-05-20 | 1999-03-09 | Doyle; Walter A. | Orthodontic bracketing system and method therefor |
US6409504B1 (en) * | 1997-06-20 | 2002-06-25 | Align Technology, Inc. | Manipulating a digital dentition model to form models of individual dentition components |
US5975893A (en) * | 1997-06-20 | 1999-11-02 | Align Technology, Inc. | Method and system for incrementally moving teeth |
AU744385B2 (en) * | 1997-06-20 | 2002-02-21 | Align Technology, Inc. | Method and system for incrementally moving teeth |
US5882192A (en) * | 1997-10-30 | 1999-03-16 | Ortho-Tain, Inc. | Computerized orthodontic diagnosis and appliance dispenser |
DK0913130T3 (en) * | 1997-10-31 | 2003-06-16 | Dcs Forschungs & Entwicklungs | Method and apparatus for making a tooth replacement part |
US6227850B1 (en) * | 1999-05-13 | 2001-05-08 | Align Technology, Inc. | Teeth viewing system |
US6514074B1 (en) * | 1999-05-14 | 2003-02-04 | Align Technology, Inc. | Digitally modeling the deformation of gingival |
US6406292B1 (en) * | 1999-05-13 | 2002-06-18 | Align Technology, Inc. | System for determining final position of teeth |
US6227851B1 (en) * | 1998-12-04 | 2001-05-08 | Align Technology, Inc. | Manipulable dental model system for fabrication of a dental appliance |
JP2000201925A (en) * | 1999-01-12 | 2000-07-25 | Toshiba Corp | Three-dimensional ultrasonograph |
US6532299B1 (en) * | 2000-04-28 | 2003-03-11 | Orametrix, Inc. | System and method for mapping a surface |
US6431870B1 (en) * | 1999-11-30 | 2002-08-13 | Ora Metrix, Inc. | Method and apparatus for generating a desired three-dimensional digital model of an orthodontic structure |
US6413084B1 (en) * | 2000-04-28 | 2002-07-02 | Ora Metrix, Inc. | Method and system of scanning |
US6512994B1 (en) * | 1999-11-30 | 2003-01-28 | Orametrix, Inc. | Method and apparatus for producing a three-dimensional digital model of an orthodontic patient |
US6594539B1 (en) * | 1999-03-29 | 2003-07-15 | Genex Technologies, Inc. | Three-dimensional dental imaging method and apparatus having a reflective member |
US6503195B1 (en) * | 1999-05-24 | 2003-01-07 | University Of North Carolina At Chapel Hill | Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction |
US6350120B1 (en) * | 1999-11-30 | 2002-02-26 | Orametrix, Inc. | Method and apparatus for designing an orthodontic apparatus to provide tooth movement |
US6540512B1 (en) * | 1999-11-30 | 2003-04-01 | Orametrix, Inc. | Method and apparatus for treating an orthodontic patient |
US6554613B1 (en) * | 2000-04-19 | 2003-04-29 | Ora Metrix, Inc. | Method and apparatus for generating an orthodontic template that assists in placement of orthodontic apparatus |
US6250918B1 (en) * | 1999-11-30 | 2001-06-26 | Orametrix, Inc. | Method and apparatus for simulating tooth movement for an orthodontic patient |
US6568936B2 (en) * | 2000-01-05 | 2003-05-27 | Pentron Laboratory Technologies, Llc. | Method and apparatus for preparing dental restorations |
US6402707B1 (en) * | 2000-06-28 | 2002-06-11 | Denupp Corporation Bvi | Method and system for real time intra-orally acquiring and registering three-dimensional measurements and images of intra-oral objects and features |
US6386878B1 (en) * | 2000-08-16 | 2002-05-14 | Align Technology, Inc. | Systems and methods for removing gingiva from teeth |
US6386867B1 (en) * | 2000-11-30 | 2002-05-14 | Duane Milford Durbin | Method and system for imaging and modeling dental structures |
US6364660B1 (en) * | 2000-10-25 | 2002-04-02 | Duane Milford Durbin | Method and system for imaging and modeling dental structures |
US6506054B2 (en) * | 2001-03-20 | 2003-01-14 | Itzhak Shoher | Method of forming a dental coping |
US6919867B2 (en) * | 2001-03-29 | 2005-07-19 | Siemens Corporate Research, Inc. | Method and apparatus for augmented reality visualization |
2004
- 2004-04-30 WO PCT/US2004/013632 patent/WO2004100067A2/en active Application Filing
- 2004-04-30 US US10/837,089 patent/US20050020910A1/en not_active Abandoned
- 2004-04-30 AU AU2004237202A patent/AU2004237202A1/en not_active Abandoned
- 2004-04-30 JP JP2006514220A patent/JP2007528743A/en active Pending
- 2004-04-30 EP EP04751156A patent/EP1617759A4/en not_active Withdrawn
- 2004-04-30 CA CA002524213A patent/CA2524213A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20050020910A1 (en) | 2005-01-27 |
EP1617759A4 (en) | 2009-04-22 |
WO2004100067A2 (en) | 2004-11-18 |
WO2004100067A3 (en) | 2005-04-21 |
AU2004237202A1 (en) | 2004-11-18 |
JP2007528743A (en) | 2007-10-18 |
EP1617759A2 (en) | 2006-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20050020910A1 (en) | Intra-oral imaging system | |
US11412993B2 (en) | System and method for scanning anatomical structures and for displaying a scanning result | |
US11856178B2 (en) | Stereoscopic video imaging | |
RU2621488C2 (en) | Display fixed on head and method of controlling display fixed on head | |
CN107209950B (en) | Automatic generation of virtual material from real world material | |
US20180275755A1 (en) | Method of interaction by gaze and associated device | |
EP3076892B1 (en) | A medical optical tracking system | |
EP2825087B1 (en) | Otoscanner | |
US9715274B2 (en) | Method for determining the direction in which a user is looking | |
Hennessey et al. | Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions | |
EP3420538A1 (en) | Guided surgery apparatus and method | |
US20060176242A1 (en) | Augmented reality device and method | |
KR101772187B1 (en) | Method and device for stereoscopic depiction of image data | |
JP2008018015A (en) | Medical display unit and system | |
JP3707830B2 (en) | Image display device for surgical support | |
CN114846518A (en) | Augmented reality headset for medical imaging | |
US11698535B2 (en) | Systems and methods for superimposing virtual image on real-time image | |
JP2005312605A (en) | Gaze position display device | |
Thoma et al. | [POSTER] augmented reality for user-friendly intra-oral scanning | |
US20200305702A1 (en) | Dental observation device and method for displaying dental image | |
JP2021045272A (en) | Echo guide system for acupuncture | |
JP2018125727A (en) | Face image processing apparatus | |
WO2021079504A1 (en) | Image processing method, image processing device, and program | |
JP2004077278A (en) | Recognizing method of optical fluctuation and device for outputting optical fluctuation as audibly recognizable signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | | |
| FZDE | Discontinued | | |