US20010027272A1 - Navigation apparatus and surgical operation image acquisition/display apparatus using the same - Google Patents

Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Info

Publication number
US20010027272A1
US20010027272A1 (application US09/875,628)
Authority
US
United States
Prior art keywords
section
navigation
target
image
orientation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/875,628
Other versions
US6456868B2
Inventor
Akito Saito
Takao Shibasaki
Takeo Asano
Hiroshi Matsuzaki
Yukihito Furuhashi
Akio Kosaka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Optical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP8940599A (JP2000279425A)
Priority claimed from JP11163964A (JP2000350734A)
Application filed by Olympus Optical Co Ltd
Priority to US09/875,628 (granted as US6456868B2)
Publication of US20010027272A1
Application granted
Publication of US6456868B2
Assigned to Olympus Corporation (change of address)
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/064: Determining position of a probe within the body employing means separate from the probe, using markers
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00043: Operational features of endoscopes provided with output arrangements
    • A61B 1/00045: Display arrangement
    • A61B 1/0005: Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/20: Surgical microscopes characterised by non-optical aspects
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery

Definitions

  • This invention relates to a navigation apparatus and, more particularly, to a navigation apparatus adapted to modify navigation-related information according to the relative position and orientation of the object of navigation and the target within a three-dimensional space.
  • This invention also relates to a surgical operation image acquisition/display apparatus and, more particularly, to an operation image acquisition/display apparatus adapted to acquire and display images of a plurality of observation systems used in surgical operations.
  • the medical navigation system disclosed in Jpn. Pat. Appln. KOKAI Publication No. 9-173352 is adapted to display information (profile information, medical image information) on the desired part of the object of examination specified by a specifying section for specifying a desired part of the object of examination.
  • The surgical operation assisting apparatus disclosed in Jpn. Pat. Appln. KOKAI Publication No. 10-5245 is adapted to display, in an overlaid manner, the current position of the surgical instrument being used in a surgical operation and the blood vessel located closest to the instrument on a tomographic image of the area of surgical operation. It does so by using the image data on the tomographic image and the surgical instrument being used, a blood vessel detection section for detecting the blood vessel located closest to the surgical instrument, a position detection section for detecting the current position of the surgical instrument, an arithmetic computing section for computationally determining the position of the front end of the surgical instrument and the direction in which the surgical instrument is inserted, an image selection section for selecting the image data on the image being acquired for the area where the front end of the surgical instrument is located, and an image synthesizing section for synthetically combining the image selected by the image selection section and a predetermined pattern indicating the front end of the surgical instrument in an overlaid manner.
  • the above described arrangement is intended to allow the operator to visually confirm the position of the front end of the surgical instrument inserted into the body of the patient on the tomographic image being displayed.
  • However, this known medical navigation system makes it difficult for the surgeon to perceive distances in the displayed information along the depth direction, that is, the direction connecting the eyes of the surgeon and the display screen and perpendicular to the screen.
  • This known medical navigation system also makes it difficult for the surgeon to determine the route of navigation on the basis of the displayed information when both the object of examination and the section for specifying the desired part of the object of examination are located at positions that are within the measurable area but outside the displayable area of the system.
  • The surgical operation assisting system disclosed in Jpn. Pat. Appln. KOKAI Publication No. 10-5245 is, in turn, cumbersome in that the user has to be constantly aware of the distance between the position of the front end of the surgical instrument on the displayed tomographic image and the position of the detected blood vessel in order to know the distance between the blood vessel and the surgical instrument.
  • In micro-surgery, a surgical microscope is generally used to observe an enlarged view of the area of surgical operation.
  • Jpn. Pat. Appln. KOKAI Publication No. 5-203881 proposes an integrated image system comprising a plurality of CCD cameras connected to respective observation systems, each including a surgical microscope, an endoscope and other instruments, a CCD camera controller for controlling the operation of selectively using any of the observation systems and a view finder controller so that the user may select any of the observation systems by means of the CCD camera controller in the course of the ongoing surgical operation.
  • Jpn. Pat. Appln. KOKAI Publication No. 7-261094 discloses a surgical microscope with which the user can switch from the image of the surgical microscope to that of the endoscope or vice versa or overlay one on the other whenever necessary.
  • Jpn. Pat. Appln. KOKAI Publication No. 7-261094 involves the use of a mode switch with which the surgical operator can switch the display mode whenever necessary.
  • It is therefore an object of the present invention to provide a navigation apparatus with which the user can easily and visually realize the distance between a target and an object of navigation, by modifying the obtained navigation-related information according to the relative position and orientation of the object of navigation and the target within a three-dimensional space, and with which the user can easily obtain navigation-related information of the type the user needs.
  • Another object of the invention is to provide an operation image acquisition/display apparatus adapted to acquire and display images of a plurality of observation systems used in surgical operations without requiring the operator to manually switch from one observation system to another, so that the ongoing surgical operation may proceed smoothly.
  • the above first object is achieved by providing a navigation apparatus comprising:
  • a navigation-related information generating section for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate the object to the target;
  • a display section for displaying the navigation-related information generated by the navigation-related information generating section in different modes according to the relative position and orientation of the object and the target.
  • Thus, there is provided a navigation apparatus adapted to display navigation-related information in different modes according to the relative position and orientation of the object and the target within a three-dimensional space.
  • Such a navigation apparatus will be described hereinafter in terms of the first and second embodiments.
  • While the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance.
  • While the object of navigation may normally refer to an endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above display section may normally refer to a liquid crystal monitor, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the above expression “in different modes” refers to differences in color, in the thickness of line, in dimensions and in the density of drawing.
  • a surgical operation image acquisition/display apparatus comprising:
  • an observation section having a plurality of observation systems and adapted to modify its position and orientation;
  • an image display section adapted to alternatively or synthetically display the images obtained by the plurality of observation systems of the observation section; and
  • an indication section for indicating the images to be displayed.
  • FIG. 1 is a schematic illustration of the first embodiment of the invention which is a navigation apparatus, showing its configuration
  • FIG. 2 is a schematic illustration of a distance map that can be used for a navigation apparatus according to the invention
  • FIG. 3 is a schematic illustration of a distance map that can be used for a navigation apparatus according to the invention.
  • FIG. 4 is a schematic illustration of the relationship between data on an object of examination and the object of examination itself;
  • FIG. 5 is a schematic illustration of a coordinate transformation matrix for correlating data on an object of examination and the object of examination itself;
  • FIG. 6 is a schematic illustration of a coordinate transformation matrix for transforming a coordinate system defined by a sensing plate fitted to an endoscope into a coordinate system to be used by a camera model expressing the optical system of the endoscope and a coordinate transformation matrix for transforming the coordinate system of a camera model into the coordinate system of a liquid crystal monitor;
  • FIG. 7 is a schematic illustration of a coordinate transformation matrix for transforming a coordinate system defined by a sensing plate fitted to the head of an object of examination into the coordinate system defined by a sensing plate fitted to an endoscope.
  • FIG. 8 is a schematic illustration of a transformation using a plurality of coordinate transformation matrices for transforming data on a target area into positional data on a liquid crystal monitor;
  • FIGS. 9A through 9D schematically illustrate examples of images that may be displayed on a liquid crystal monitor, of which FIG. 9A is an image obtained by overlaying a wireframe image as navigation information on an image obtained by means of the optical system of an endoscope, FIG. 9B is an image obtained by overlaying an internal tomographic image of three-dimensional volume data as navigation information on an image obtained by means of the optical system of an endoscope, FIG. 9C is an image obtained when no target area is found within the effective area of measurement of an endoscope and FIG. 9D is an image obtained when the apparatus is inoperative for measurement;
  • FIG. 10 is a flow chart of a display operation for displaying an image as shown in FIG. 9C;
  • FIG. 11 is a schematic illustration of the second embodiment of the invention which is also a navigation apparatus, showing its configuration
  • FIG. 12 is a schematic illustration of an example of an image displayed by the second embodiment and a coordinate transformation matrix that can be used for the display;
  • FIG. 13 is a schematic illustration of an operation of modifying the thickness of lines of an orthogonally projected image of an endoscope as a function of the relative distance between the target area and the front end of an endoscope;
  • FIG. 14 is a schematic block diagram of the third embodiment of the invention which is a surgical operation image acquisition/display apparatus, showing its configuration
  • FIGS. 15A through 15F are schematic illustrations of a plurality of display modes that can be realized by the video mixer 143 of FIG. 14;
  • FIG. 16 is a schematic illustration of an operation of correlating data on the operation area of a patient 146 and data on the characteristic points of a model data coordinate system m;
  • FIG. 17 is a schematic illustration of a mode of computationally obtaining a coordinate transformation matrix pHe for transforming the patient coordinate system p defined by the sensing plate 145 b fitted to the head of a patient 146 to the endoscope coordinate system e defined by the sensing plate 145 c fitted to an endoscope 142 ;
  • FIG. 18 is a flow chart of the operation of the image controller 147 of FIG. 14.
  • FIG. 19 is a flow chart of the operation of the image controller 147 of the fourth embodiment of the invention which is also a surgical operation image acquisition/display apparatus.
  • FIG. 1 is a schematic illustration of the first embodiment of the invention which is a navigation apparatus, showing its configuration.
  • The object of examination 1, or patient, is lying flat on an operating table, facing upward.
  • a hard sensing plate 2 carrying three LEDs for emitting infrared rays that are arranged at the respective corners of a triangle is securely fitted to the head of the object of examination 1 in such a way that its position and orientation relative to the head would not change easily.
  • Another hard sensing plate 4 carrying three LEDs for emitting infrared rays arranged at the respective corners of a triangle is securely fitted to an endoscope 3 .
  • the sensor information storage section 5 is connected to sensor control section 6 .
  • image acquisition type sensor assembly 7 is arranged at a position where the sensing plates 2 and 4 are found within its effective area of measurement.
  • the three-dimensional position and orientation information obtained by the three-dimensional position and orientation measuring section is sent to navigation-related information control section 8 .
  • The information, including profile information and internal tomographic image information on the object of examination, the tumor thereof to be surgically treated and the parts thereof requiring special attention during a surgical operation, obtained in advance by measurement using CT and/or MRI, is divided into low resolution information (e.g., a resolution level of 32×32×32 voxels), medium resolution information (e.g., a resolution level of 128×128×128 voxels) and high resolution information (e.g., a resolution level of 512×512×512 voxels), then transformed into wireframe three-dimensional model data 10 (high resolution wireframe three-dimensional model data 10a, medium resolution wireframe three-dimensional model data 10b, low resolution wireframe three-dimensional model data 10c) and three-dimensional volume data 11 (high resolution three-dimensional volume data 11a, medium resolution three-dimensional volume data 11b, low resolution three-dimensional volume data 11c), and stored in the navigation-related information storage section 9 as data.
  • The navigation-related information storage section 9 additionally stores a distance map 12 in advance.
  • The distance map 12 contains a three-dimensional array whose values represent the shortest distances from the surface of the target area (the object of examination, the tumor to be surgically treated or the parts of the body requiring special attention during a surgical operation), the indices of the array corresponding to the three-dimensional positional coordinates of the space where the target area is located.
  • One tenth (1/10) of an index number represents the corresponding coordinate value expressed in millimeters.
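  • As an illustration, the distance-map lookup described above can be sketched as follows; this is a minimal, hypothetical sketch assuming the distance map 12 is held as a NumPy array indexed at one tenth of a millimeter per step (the function and variable names are not from the patent).

```python
import numpy as np

def shortest_distance_to_target(distance_map: np.ndarray,
                                point_mm: np.ndarray) -> float:
    """Look up the shortest distance (in mm) from a point (x, y, z in mm,
    already expressed in the coordinate system of the distance map 12) to
    the surface of the target area."""
    idx = np.rint(point_mm * 10.0).astype(int)                # mm -> index (index / 10 = mm)
    idx = np.clip(idx, 0, np.array(distance_map.shape) - 1)   # stay inside the array
    return float(distance_map[tuple(idx)])
```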
  • The wireframe three-dimensional model data 10, the three-dimensional volume data 11 and the distance map 12 are subjected to a coordinate transforming operation so that they are expressed in terms of the same coordinate system.
  • the image obtained by way of the optical system of the endoscope 3 is taken into the navigation-related information control section 8 by way of a camera control unit and an image input board (not shown).
  • The navigation-related information generated by the navigation-related information control section 8 is displayed to the user on the information display section, which is a liquid crystal monitor 13.
  • data on the object of examination 1 and the object of examination 1 itself are correlated by measuring the coordinate value m of each characteristic point on the data and the coordinate values p of the corresponding characteristic point as defined by the sensing plate 2 and computing a coordinate transformation matrix (pHm) 14 .
  • the coordinate transformation matrix (pHm) 14 is stored in the above navigation-related information storage section 9 .
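  • The correlation step can be illustrated by a standard rigid point registration; the patent does not name the algorithm, so the SVD-based least-squares fit below is only one plausible sketch (function and variable names are assumptions).

```python
import numpy as np

def estimate_pHm(points_m: np.ndarray, points_p: np.ndarray) -> np.ndarray:
    """Estimate the 4x4 matrix pHm mapping model data coordinates m to the
    patient coordinate system p defined by the sensing plate 2, from N >= 3
    corresponding characteristic points (arrays of shape (N, 3))."""
    cm, cp = points_m.mean(axis=0), points_p.mean(axis=0)
    H = (points_m - cm).T @ (points_p - cp)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1.0
        R = Vt.T @ U.T
    T = cp - R @ cm
    pHm = np.eye(4)
    pHm[:3, :3], pHm[:3, 3] = R, T
    return pHm
```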
  • a coordinate transformation matrix is a 4-row and 4-column matrix comprising a rotational component R representing a rotary motion in a three-dimensional space, a translational component T representing a translation in the three-dimensional space and a constant component.
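  • In the usual homogeneous-coordinate convention implied by that description, such a matrix and its action on a point can be written as follows (a generic illustration, not a formula quoted from the patent):

```latex
H =
\begin{pmatrix}
  R_{3\times 3} & T_{3\times 1} \\
  \mathbf{0}^{\top} & 1
\end{pmatrix},
\qquad
\begin{pmatrix} x' \\ y' \\ z' \\ 1 \end{pmatrix}
= H \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}.
```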
  • a coordinate transformation matrix (cHe) 15 for transforming the coordinate system defined by the sensing plate 4 into the coordinate system to be used by a camera model expressing the optical system of the endoscope 3 and a coordinate transformation matrix (f_ctos) 16 for transforming the camera model coordinate system into the coordinate system on the actual liquid crystal monitor 13 are also determined and stored in the navigation-related information storage section 9 .
  • The sensor control section 6, which is a component of the three-dimensional position and orientation measuring section, measures the three-dimensional position of each of the infrared-emitting LEDs of the sensing plates 2 and 4 and then, using the LED definition data stored in the sensor information storage section 5, computationally determines the three-dimensional position and orientation information of each of the sensing plates 2 and 4 in terms of the coordinate values of the origin of the space defined by the sensing plate 4 within the three-dimensional space defined in turn by the sensing plate 2, and the values of the unit vectors along the X, Y and Z axes of the space defined by the sensing plate 4.
  • the coordinate transformation matrix (pHe) 17 from the sensing plate 2 attached to the head of the object of examination 1 to the sensing plate 4 attached to the endoscope 3 is computationally determined on the basis of the obtained three-dimensional position and orientation information.
  • The data of the target area are converted to positional data on the liquid crystal monitor 13.
  • The navigation-related information control section 8 generates navigation-related information by using the positional data obtained on the basis of the coordinate transformation matrix 17 and the coordinate transformation matrices 14, 15 and 16.
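  • Expressed as matrix products, the chain of FIG. 8 might be sketched as below; this is a hypothetical illustration in which the matrix names follow the reference numerals, the direction of pHe is assumed (its inverse is applied under the convention that aHb expresses frame b in frame a), and a simple pinhole projection stands in for the matrix (f_ctos) 16.

```python
import numpy as np

def to_homogeneous(p):
    return np.append(np.asarray(p, float), 1.0)

def project_target_point(x_m, pHm, pHe, cHe, fx, fy, cx, cy):
    """Map a target point given in model data coordinates onto the liquid
    crystal monitor 13: model -> patient (pHm, 14) -> endoscope plate
    (via pHe, 17) -> camera model (cHe, 15) -> monitor pixels."""
    x_p = pHm @ to_homogeneous(x_m)      # model data -> patient (plate 2)
    x_e = np.linalg.inv(pHe) @ x_p       # patient -> endoscope (plate 4)
    x_c = cHe @ x_e                      # endoscope plate -> camera model
    X, Y, Z = x_c[:3]
    u = fx * X / Z + cx                  # simplified stand-in for the
    v = fy * Y / Z + cy                  # camera-to-screen transform 16
    return u, v
```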
  • As the image formed by the optical system of the endoscope 3 is input to the navigation-related information control section 8, the navigation-related information and the image are displayed on the liquid crystal monitor 13 in an overlaid manner as shown in FIG. 9A.
  • the position of the front end of the endoscope 3 is subjected to an operation of coordinate transformation by using the above described coordinate transformation matrices 14 , 15 and 17 and the relative distance between the target area and the front end of the endoscope 3 is determined by referring to the distance map 12 .
  • In a surgical operation using an endoscope 3, the endoscope 3 has to be inserted toward the tumor from the outside of the object of examination 1, paying attention to the parts that should not be damaged, and then the tumor has to be surgically treated.
  • the model image of the target area is generated as a profiled, wireframe image 18 as shown in FIG. 9A.
  • the color and the thickness of the lines of the wireframe image 18 are made to vary as a function of the relative distance between the front end of the endoscope 3 and the surface of the target area as determined in a manner as described above.
  • For example, the color of the lines of the wireframe image 18 may be blue and the thickness of the lines may be equal to 1 pixel, while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally blue and the width of the bar 30 may be equal to 20 pixels.
  • In another state, the color of the lines of the wireframe image 18 may be yellow and the thickness of the lines may be equal to 2 pixels, while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally yellow and the width of the bar 30 may be equal to 30 pixels.
  • In yet another state, the color of the lines of the wireframe image 18 may be purple and the thickness of the lines may be equal to 2 pixels, while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally purple and the width of the bar 30 may be equal to 30 pixels.
  • both the color of the wireframe image 18 and the thickness of the lines of the wireframe image 18 may be made to change so that the user can visually recognize the distance between the surface of the target area and the front end of the endoscope 3 .
  • the wireframe image 18 of an area requiring special attention may be drawn with thick lines when the endoscope 3 is located close to the area and separated therefrom by a distance smaller than a predetermined value so that the user may visually recognize that the endoscope 3 is too close to the area.
  • the lines of the area requiring special attention of the wireframe image 18 may be made five times thicker than before.
  • high resolution wireframe three-dimensional model data 10 a will be used when the distance to the target area is less than 30 mm and medium resolution wireframe three-dimensional model data 10 b will be used when the distance to the target area is between 30 mm and 100 mm, whereas low resolution wireframe three-dimensional model data 10 c will be used when the distance to the target area is greater than 100 mm.
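  • The distance-dependent choice of drawing attributes and model resolution might be organized as in the sketch below; the 30 mm / 100 mm resolution thresholds follow the description, while the distance boundaries for the colors are not specified above and are therefore left as user-defined parameters (the assignment of blue/yellow/purple to particular ranges is an assumption).

```python
def select_wireframe_style(distance_mm, far_mm, near_mm):
    """Return (line color, line width in pixels, bar 30 width in pixels)
    for the wireframe image 18 and the distance bar, using the attribute
    sets given in the description (blue/1/20, yellow/2/30, purple/2/30)."""
    if distance_mm > far_mm:
        return "blue", 1, 20
    elif distance_mm > near_mm:
        return "yellow", 2, 30
    else:
        return "purple", 2, 30

def select_wireframe_model(distance_mm, models):
    """Pick the wireframe three-dimensional model data 10a/10b/10c."""
    if distance_mm < 30.0:
        return models["high"]       # 10a, closer than 30 mm
    elif distance_mm <= 100.0:
        return models["medium"]     # 10b, between 30 mm and 100 mm
    else:
        return models["low"]        # 10c, farther than 100 mm
```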
  • the model image of the object of examination 1 drawn by the embodiment is switched from the wireframe image 18 to the internal tomographic image 19 obtained by using the three-dimensional volume data of the object of examination 1 as shown in FIG. 9B depending on the relative distance between the endoscope 3 and the outermost zone of the target area.
  • the model image of the object of examination may be switched from the wireframe image 18 to an internal tomographic image 19 obtained by using the three-dimensional volume data of the object of examination that reflect the viewing direction of the endoscope.
  • The navigation-related information displayed to the user when the endoscope 3 is inserted into the object of examination includes the internal tomographic image obtained by using the three-dimensional volume data of the object of examination 1 and the wireframe image 20 of the area requiring special attention.
  • When no target area is found within the effective area of measurement of the endoscope 3, an arrow 21 indicates the direction in which the target area will be found, as shown in FIG. 9C.
  • Whether the model image is found within the drawable (displayable) range or not can be determined by checking if the coordinates of each and every point of the model image, as computed when drawing the model image, fall on the monitor to be used for displaying the image.
  • The coordinates of the model image are transformed into coordinates of the display screen (Step S1) and it is determined whether the coordinates of each transformed point fall within the displayable range of the display screen (Step S2).
  • If they do, the model image is displayed on the screen (Step S3).
  • Otherwise, in Step S4, the coordinate of a representative point of the model image is transformed into the corresponding coordinate of the display screen.
  • In Step S5, the coordinate of the front end of the endoscope 3 is transformed into the corresponding coordinate of the display screen.
  • In Step S6, the distance and the direction of the line connecting the representative point of the model image and the front end of the endoscope 3 are computationally determined.
  • the relative distance and the relative direction of the line connecting the center 22 of the endoscope image, or the front end of the endoscope 3 , and the target can be determined by transforming the coordinate values 23 on the model data coordinate system of the representative point of the target into the coordinate values on the liquid crystal monitor 13 by means of the above described coordinate transformation matrices 14 , 15 , 16 and 17 .
  • the user can visually comprehend the extent to which the endoscope 3 should be moved to bring the target into the effective area of measurement of the endoscope 3 by modifying the size of the arrow 21 indicating the target area in proportion to the obtained distance.
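  • Steps S4 through S6 and the proportional sizing of the arrow 21 can be sketched as follows (a hypothetical helper; the gain and the length limits are assumptions).

```python
import numpy as np

def offscreen_arrow(target_px, tip_px, gain=0.5, min_len=10.0, max_len=120.0):
    """From the screen coordinates of the target's representative point
    (Step S4) and of the image center 22 / endoscope front end (Step S5),
    return a unit direction vector and an arrow length proportional to the
    distance between them (Step S6)."""
    delta = np.asarray(target_px, float) - np.asarray(tip_px, float)
    dist = float(np.linalg.norm(delta))
    direction = delta / dist if dist > 0.0 else np.zeros(2)
    length = float(np.clip(gain * dist, min_len, max_len))
    return direction, length
```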
  • When the apparatus is incapable of measuring the distance and the direction, the sensor control section 6 outputs a message telling the user that the apparatus is incapable of measuring them, in place of three-dimensional position and orientation information.
  • the navigation-related information control section 8 erases the model image being displayed as navigation-related information and generates character information 28 of “unmeasurable condition” and a yellow pixel frame 29 having a width equal to 60 pixels, which are then displayed to the user on the liquid crystal monitor 13 as shown in FIG. 9D.
  • The user can easily comprehend that the endoscope 3 or the object of examination 1 is located at a position that makes the intended measurement impossible, so that the user can be effectively protected against the risk of operating the endoscope 3 in a wrong way according to navigation-related information that does not reflect the reality.
  • the area that provides the object of navigation is not limited to the target area and may alternatively be a plurality of any areas which are defined by information on the profile of the object of examination 1 or information on an internal tomographic image.
  • The three-dimensional position and orientation measuring section may be made to alternatively comprise a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers popularly used in ordinary three-dimensional position and orientation measuring systems.
  • the wireframe for expressing the profile information of the target area may be replaced by any known technique for graphic expression that is popularly used for three-dimensional computer graphics including polygons.
  • contour lines and equidistant curves relative to the viewing direction may be used.
  • the endoscope 3 that is the object of navigation may be replaced by a plurality of endoscopes.
  • the object of navigation may be a surgical instrument that is not provided with a section of observation.
  • the technique for determining the relative distance between the object of navigation and the target area is not limited to the above described one that uses a distance map and any known technique for computationally determining the distance between two points in a three-dimensional space may alternatively be used for determining the distance between a representative point of the object of navigation and a representative point of the target area for the purpose of the invention.
  • The color may be made to change continuously as a function of the distance instead of the above described use of a single boundary value.
  • The color may be made to change stepwise by providing a plurality of boundary values.
  • The line thickness may be made to change continuously instead of the above described use of a single boundary value.
  • The line thickness may be made to change stepwise by providing a plurality of boundary values.
  • A situation where there is no navigation-related information to be displayed may be indicated by making the color transparent and the lines practically invisible.
  • The density of lines for drawing the model image that varies as a function of the distance may be made to change continuously on the basis of a single set of data instead of selectively using a plurality of sets of data with different levels of density that are provided in advance as described above.
  • the pattern that is displayed when the target area is out of the effective area of measurement is not limited to the arrow 21 .
  • a triangle, a circle, a bar or some other figure may be used.
  • the distance may be expressed by the size of the figure.
  • The size of the arrow 21 may be made to vary stepwise by using a plurality of preselected values instead of making it vary continuously in a manner as described above.
  • the navigation-related information indicating a situation where the apparatus is incapable of measuring the distance and the direction may not require both character information 28 and a frame 29 . It may be sufficient to use only either character information 28 or a frame 29 to convey the information.
  • The user can define the color and the line thickness that are used as attributes of the navigation-related information, the density of lines for drawing the model image, the size of the displayed pattern, the boundary values for changing the color and the line thickness as a function of the distance, and the character string of the character information 28 indicating the incapability of measurement of the apparatus.
  • FIG. 11 is a schematic illustration of the second embodiment of the invention which is also a navigation apparatus, showing its configuration.
  • This second embodiment has a configuration same as the above described first embodiment except the following.
  • the endoscope 3 is not required to pass the imaging information obtained by the optical system to the navigation-related information control section 8 .
  • the navigation-related information storage section 9 stores in advance vectors 24 for expressing the route along which the endoscope 3 is inserted (minimal invasive route) as data.
  • the coordinate values 25 for the front end and the rear end of the endoscope 3 are determined in terms of the coordinate system defined by the sensing plate 4 rigidly fitted to the endoscope 3 and stored in the navigation-related information storage section 9 .
  • the sensor control section 6 measures the three-dimensional position of each of the LEDs that are emitting infrared rays of the sensing plates 2 and 4 and then computationally determines the three-dimensional position and orientation of each of the sensing plates 2 and 4 , by using the LED definition data stored in the sensor information storage section 5 .
  • the coordinate transformation matrix 17 from the sensing plate 2 fitted to the head of the object of examination 1 to the sensing plate 4 fitted to the endoscope 3 is computationally determined on the basis of the obtained three-dimensional position and orientation information.
  • the position and the orientation of the endoscope 3 is determined in terms of the data on the target area by using the coordinate transformation matrix 17 and the above described coordinate transformation matrices 14 , 15 .
  • the navigation-related information control section 8 generates a tri-sectional image 26 of the three-dimensional volume data 11 including those of the tumor and an orthogonal projection image 27 of the endoscope 3 projected on the cross section as navigation-related information and displays it on the liquid crystal monitor 13 .
  • the relative distance between the surface of the target area and the front end of the endoscope 3 is determined by referring to the above described distance map 12 , using the position of the front end of the endoscope 3 .
  • the thickness of the lines of the orthogonal projection image 27 of the endoscope 3 is continuously changed as a function of the distance between the surface of the target area and the front end of the endoscope 3 .
  • The orientation of the endoscope 3, determined by the coordinate values 25 of the positions of the front end and the rear end of the endoscope 3, is compared with the data of the vector indicating the direction in which the endoscope 3 is to be inserted and, if the orientation is inclined relative to the vector by more than a predetermined value (e.g., 10 degrees), the color and the line thickness of the orthogonal projection image 27 of the endoscope 3 will be changed.
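  • One way to realize the continuous thickness modulation and the 10 degree inclination check is sketched below (the thickness mapping and the warning color are assumptions, not values taken from the description).

```python
import numpy as np

def projection_style(tip, rear, target_dist_mm, insert_vector, max_angle_deg=10.0):
    """Return (line width in pixels, color) for the orthogonal projection
    image 27: the width grows as the front end approaches the target surface,
    and the color is switched when the endoscope axis deviates from the
    planned insertion vector 24 by more than the predetermined angle."""
    line_px = max(1, int(round(10.0 - target_dist_mm / 10.0)))   # assumed mapping

    axis = np.asarray(tip, float) - np.asarray(rear, float)
    axis /= np.linalg.norm(axis)
    route = np.asarray(insert_vector, float)
    route /= np.linalg.norm(route)
    angle = np.degrees(np.arccos(np.clip(axis @ route, -1.0, 1.0)))

    color = "red" if angle > max_angle_deg else "green"          # assumed colors
    return line_px, color
```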
  • the area that provides the object of navigation is not limited to the target area and may alternatively be a plurality of any areas which are defined by information on the profile of the object of examination 1 or information on an internal tomographic image.
  • The three-dimensional position and orientation measuring section may be made to alternatively comprise a magnetic sensor or a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems.
  • When the object of examination is immovable, it is sufficient to measure the three-dimensional position and orientation of the object of examination 1 in advance, store the information in the sensor information storage section and utilize it in the computational operation for determining the relative three-dimensional position and orientation of the object of examination and the object of navigation, so that it is only necessary to measure the three-dimensional position and orientation of the object of navigation when the system is in operation.
  • the endoscope 3 that is the object of navigation may be replaced by a plurality of endoscopes.
  • While the object of navigation may normally refer to an endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps so long as the mechanical profile thereof can be determined by measurement.
  • the coordinate of the front end of the endoscope 3 does not need to agree with the actual front end and data may be manipulated to make the endoscope 3 virtually have an extended front end.
  • the coordinate of the front end of the endoscope 3 may be defined by means of an operational formula using the extension of the front end from the actual front end as parameter so that the coordinate of the front end may be determined successively on the basis of the extension specified by the user.
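  • The virtually extended front end mentioned above amounts to offsetting the measured tip along the endoscope axis; a minimal sketch, with the parameter name extension_mm chosen here for illustration:

```python
import numpy as np

def virtual_tip(tip, rear, extension_mm):
    """Shift the measured front-end coordinate of the endoscope 3 forward
    along its axis by the user-specified extension (plate-4 coordinates)."""
    axis = np.asarray(tip, float) - np.asarray(rear, float)
    axis /= np.linalg.norm(axis)
    return np.asarray(tip, float) + extension_mm * axis
```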
  • the color may be made to change on the basis of a single boundary value instead of making it change continuously as a function of the distance in a manner as described above.
  • the color may be made to change stepwise by providing a plurality of boundary values.
  • The line thickness may be made to change on the basis of a single boundary value instead of making it change continuously in a manner as described above.
  • the line thickness may be made to change stepwise by providing a plurality of boundary values.
  • the technique of determining the angle of inclination of the endoscope 3 is not limited to the one described above.
  • The user can define the color and the line thickness that are used as attributes of the navigation-related information, the density of lines for drawing the model image, the size of the displayed pattern, the boundary values for changing the color and the line thickness as a function of the distance, and the character string of the character information 28 indicating the incapability of measurement of the apparatus.
  • the object of navigation may be a microscope.
  • The position of the focal point can be defined as the object of navigation by obtaining the focal length of the microscope from the microscope main body and replacing the coordinate of the front end of the endoscope 3 by that of the focal point.
  • a model image of the object or the target, information on the direction of navigation and/or information on the distance between the object of navigation and the target are displayed whenever the position and orientation of the object in a three-dimensional space can be determined so that the user can easily comprehend the position and orientation of the object of navigation in the three-dimensional space.
  • a navigation apparatus according to claim 2 covers both the above described first and second embodiments.
  • While the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance.
  • While the object of navigation may normally be an endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above display section may normally refer to a liquid crystal monitor, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • While the model image refers to the wireframe model data 10 in the first embodiment, it may alternatively refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • While the model image refers to the three-dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • While the above described information on the direction of navigation refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • While the above described distance information refers to the numeral 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • The bar 30 may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • While the information indicating an unmeasurable condition refers to the character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • an image of an object acquired by the imaging section is displayed with other information on the object in an overlaid manner so that the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information.
  • a navigation apparatus covers both the above described first and second embodiments.
  • While the object refers to the endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • While the above display section normally refers to the liquid crystal monitor in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the information generating section of the navigation apparatus generates information necessary for navigating the object on the basis of the outcome of the measurement of the three-dimensional position and orientation measuring section.
  • the display section of the navigation apparatus displays navigation-related information in a display mode selected out of a plurality of different display modes according to at least any of distance information on the distance between the object and the target, direction information on the direction of the target as viewed from the object or information telling if the object or the target is found within the effective area of measurement of the three-dimensional position and orientation measuring section or not.
  • the user can easily comprehend the distance between the target and the object, the direction of the target as viewed from the object and if the object or the target is found within the effective area of measurement of the three-dimensional position and orientation measuring section or not.
  • a navigation apparatus covers both the above described first and second embodiments.
  • While the target is a patient 1, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation in the above first and second embodiments, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance.
  • While the above three-dimensional position and orientation measuring section refers to sensors using LEDs for emitting infrared rays (sensing plates 2, 4, sensor assembly 7, sensor information storage section 5 and sensor control section 6), it may alternatively refer to a sensing system using magnetic sensors or a sensing system using a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems.
  • the information generating section refers to the navigation-related information storage section 9 and the navigation-related information control section 8 .
  • While the display section normally refers to the liquid crystal monitor 13, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the expression of “a plurality of different display modes” refers to differences in color, in the thickness of line, in the dimensions of the drawing and in the density of drawing lines.
  • A display section as set forth in claim 4 displays at least profile information on the target or the object, internal tomographic information on the object, information on the direction of the target as viewed from the object or vice versa, or information on the distance between the target and the object when the target or the object is measurable by the three-dimensional position and orientation measuring section, but it displays information telling that neither the target nor the object can be measured when measurement is impossible.
  • a navigation apparatus covers both the above described first and second embodiments.
  • While the profile information refers to the wireframe model data 10 in the first embodiment, it may also refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • While the internal tomographic information refers to the three-dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • While the above described direction of the target refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • While the above described distance information refers to the numeral 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • The bar 30 may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • While the information indicating an unmeasurable condition refers to the character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • the object has an imaging section and the image acquired by the imaging section is displayed with other navigation-related information obtained by the information generating section in an overlaid manner.
  • the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information.
  • a navigation apparatus covers the above described first embodiment.
  • While the object having an imaging section refers to the endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • While the above display section normally refers to the liquid crystal monitor 13 in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the navigation-related information displayed on the display section changes its color as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • both the relative distance and the direction toward the target as viewed from the object are evaluated at the same time and the color of the displayed information is changed depending on the situation. Then, the user can visually comprehend both the relative distance and the direction with ease.
  • a navigation apparatus covers both the above described first and second embodiments.
  • While the color of the displayed navigation-related information refers to that of the wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the color of the arrow 21 of the first embodiment.
  • the thickness of the lines of the navigation-related information displayed on the display section changes as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • both the relative distance and the direction toward the target as viewed from the object are evaluated at the same time and the line thickness of the displayed information is changed depending on the situation. Then, the user can visually comprehend both the relative distance and the direction with ease.
  • a navigation apparatus according to claim 8 covers both the above described first and second embodiments.
  • While the line thickness of the displayed navigation-related information refers to that of the wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the line thickness of the arrow 21 of the first embodiment.
  • the profile model of the target and the internal tomographic image are switched from one to the other on the display section as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease that the object is located close to the target.
  • the user can get necessary information with ease depending on if the distance between the object and the target is smaller than a predetermined value or not.
  • a navigation apparatus covers the above described first embodiment.
  • While the above profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics, such as polygons, as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • the above internal tomographic image refers to the internal tomographic image 19 of the first embodiment.
  • The density of lines drawing the target model image is lowered when the relative distance between the target and the object is large and raised when the relative distance is small, so that the load of drawing the target image and the quantity of information used for displaying the image are well balanced.
  • The user can thus obtain an adequate amount of information that is displayed at an adequate drawing rate as a function of the relative distance between the target and the object.
  • a navigation apparatus according to claim 10 covers the above described first embodiment.
  • While the profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics, such as polygons, as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • The information generating section computationally determines the positional relationship between the image to be displayed and the display area of the display section on the basis of the relative distance between the target and the object and the relative direction of the target as viewed from the object, and simply indicates the direction of the target when no image of the target is displayed in the display area, so that the user can comprehend the relative positions of the target and the object and the direction of the target as viewed from the object without missing either of them.
  • a navigation apparatus covers the above described first embodiment.
  • While the profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics, such as polygons, as well as contour lines and equidistant curves drawn relative to the viewing direction, along with information on the internal tomographic image.
  • While the above described information on the direction of navigation refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • the relative distance between the target and the object is indicated by the size or the shape of a symbol so that the user can visually comprehend with ease not only the distance but also the direction of the target as viewed from the object.
  • a navigation apparatus according to claim 12 covers the above described first embodiment.
  • While the above described symbol refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone, or a set of visually recognizable image data.
  • the relative distance between the target and the object of navigation and their orientations in a three-dimensional space are determined by the three-dimensional position and orientation measuring section.
  • A computational information determining section generates and controls navigation-related information, such as three-dimensional position and orientation information on the target and the object of navigation, including the relative distance between the target and the object of navigation, their orientations and whether or not they are measurable.
  • the information display section displays the navigation-related information generated by the computational information determining section.
  • a navigation apparatus according to claim 13 covers both the first and second embodiments.
  • while the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance.
  • while the above object of navigation may normally refer to an endoscope 3 , it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • while the above three-dimensional position and orientation measuring section refers to sensors using LEDs for emitting infrared rays (sensing plates 2 , 4 , sensor assembly 7 , sensor information storage section 5 and sensor control section 6 ), it may alternatively refer to a sensing system using magnetic sensors or a sensing system using a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems.
  • the computational information determining section refers to the navigation-related information storage section 9 and the navigation-related information control section 8 .
  • while the information display section refers to the liquid crystal monitor 13 , it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the expression “attributes of navigation-related information” as used herein refers to the color, the thickness of line, the dimensions of the drawing and the density of drawing lines.
  • the navigation-related information includes a model image of the profile of the target and/or that of the object of navigation, a model image of the internal tomographic information of the target, a symbol pattern indicating the direction in which the target and/or the object of navigation will be found and/or a numerical value or a symbol pattern indicating the distance between the target and the object of navigation when the three-dimensional position and orientation measuring section is operating normally.
  • the navigation-related information refers to character information or a symbol pattern indicating that the three-dimensional position and orientation measuring section is inoperative.
  • a navigation apparatus according to claim 14 covers both the first and second embodiments.
  • while the model image of the profile refers to the wireframe model data 10 in the first embodiment, it may also refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • while the model image of the internal tomographic information refers to the three-dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • while the above described symbol pattern indicating the direction in which the object of navigation will be found refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • while the above described numerical value indicating the distance to the target refers to the numeral 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • the bar 30 may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • while the character information indicating an unmeasurable condition refers to the character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • the object of navigation has an observational function and the observed image obtained by means of the observational function is displayed with other navigation-related information obtained by the computational information determining section in an overlaid manner.
  • the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information.
  • a navigation apparatus according to claim 15 covers the above described first embodiment.
  • while the object of navigation having an observational function refers to the endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • while the above information display section normally refers to the liquid crystal monitor 13 in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • the navigation-related information displayed on the display section changes its color as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • the navigation-related information displayed on the display section changes its color as a function of the relative direction of the target and the object of navigation so that the user also can visually comprehend a situation where the relative direction is deviated from the right direction.
  • the color may be made to change simultaneously as a function of both the relative distance and the relative direction. Then, the user can visually comprehend both the relative distance and the direction with ease.
  • a navigation apparatus according to claim 16 covers both the above described first and second embodiments.
  • while the color of the displayed navigation-related information refers to that of the wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the color of the arrow 21 of the first embodiment.
  • the thickness of the lines of the navigation-related information obtained by the three-dimensional position and orientation measuring section is made to vary as a function of the relative distance between the target and the object of navigation so that the user can visually comprehend with ease a situation where the relative distance has become too small.
  • the thickness of the lines of the navigation-related information is made to vary as a function of the direction to the target as viewed from the object of navigation so that the user can also visually comprehend with ease a situation where the relative direction has deviated.
  • a navigation apparatus according to claim 17 covers both the first and second embodiments.
  • while the thickness of the lines of navigation-related information refers to that of the wireframe image 18 of the target area drawn on the monitor in the first embodiment, it may also include the arrow 21 in the first embodiment.
  • the profile model of the target and the internal tomographic image are switched from one to the other on the display section as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease that the object is located close to the target.
  • the user can get necessary information with ease depending on whether the distance between the object and the target is smaller than a predetermined value or not.
  • a navigation apparatus according to claim 18 covers the above described first embodiment.
  • while the profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygons as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • the above internal tomographic image refers to the internal tomographic image 19 of the first embodiment.
  • the density of the lines drawing the target model image is lowered when the relative distance between the target and the object is large and raised when the relative distance is small, so that the load of drawing the target image and the quantity of information used for displaying the image are well balanced.
  • the user can obtain an adequate amount of information displayed at an adequate drawing rate as a function of the relative distance between the target and the object.
  • a navigation apparatus covers the above described first embodiment.
  • while the profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygons as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • the computational information determining section computationally determines the positional relationship of the image to be displayed and the display area of the display section on the basis of the relative distance between the target and the object of navigation and the relative direction of the target as viewed from the object, and only a symbol pattern is displayed when no model image is displayed in the display area, so that the user can comprehend the relative positions of the target and the object and the direction of the target as viewed from the object without missing either of them by selecting the display of the direction when no image is displayed for the target.
  • a navigation apparatus according to claim 20 covers the above described first embodiment.
  • while the profile model refers to the wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygons as well as contour lines and equidistant curves drawn relative to the viewing direction, along with information on the internal tomographic image.
  • while the above described symbol pattern refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • the relative distance between the target and the object is indicated by the size of a pattern so that the user can visually comprehend with ease not only the distance but also the direction of the target as viewed from the object.
  • a navigation apparatus according to claim 21 covers the above described first embodiment.
  • while the above described symbol pattern refers to the arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • both the above described first and second embodiments of navigation apparatus according to the invention are adapted to modify the navigation-related information displayed on the display section as a function of the relative three-dimensional positions and orientations of the target and the object of navigation to make the user easily comprehend the distance between the target and the object and obtain navigation-related information of necessary type with ease.
  • FIG. 14 is a schematic block diagram of the third embodiment of the invention which is a surgical operation image acquisition/display apparatus, showing its configuration;
  • the third embodiment of surgical operation image acquisition/display apparatus according to the invention has a configuration as described below.
  • the surgical operation image acquisition/display apparatus comprises a surgical microscope 141 (first observation section) and an endoscope 142 (second observation section) for observing an area located in the dead angle of the surgical microscope 141 .
  • the surgical microscope 141 includes a microscope optical system 141 a mounted on a stand (not shown), a microscope camera 141 b attached to the microscope optical system 141 a and a microscope camera control unit (hereinafter referred to as microscope CCU) 141 c for converting the output of the microscope camera 141 b into a video signal.
  • the microscope optical system 141 a is provided with illumination light emitted from a light source (not shown) and guided by a light guide (not shown) for the purpose of observation.
  • the endoscope 142 includes an endoscope optical system 142 a , an endoscope camera 142 b attached to the endoscope optical system 142 a and an endoscope camera control unit (hereinafter referred to as endoscope CCU) 142 c for converting the output of the endoscope camera 142 b into a video signal.
  • the endoscope optical system 142 a is provided with illumination light emitted from a light source (not shown) and guided by a light guide (not shown) for the purpose of observation.
  • the video mixer 143 has a plurality of display modes 0 through 5 .
  • the video output of the endoscope CCU 142 c is dimensionally reduced and displayed in the video output of the microscope CCU 141 c.
  • the extent of dimensional reduction and the display position of the dimensionally reduced output are variable both in the display mode 2 and the display mode 3 .
  • in the display mode 4 (FIG. 15E), which is a variation of the display mode 2 in which the video output of the endoscope CCU 142 c is dimensionally reduced and displayed at the right top corner of the video output of the microscope CCU 141 c , the video output of the endoscope CCU 142 c is dimensionally reduced to the same extent but displayed at the left bottom corner of the video output of the microscope CCU 141 c.
  • in the display mode 5 , which is a variation of the display mode 3 in which the video output of the microscope CCU 141 c is dimensionally reduced and displayed at the right top corner of the video output of the endoscope CCU 142 c , the video output of the microscope CCU 141 c is dimensionally reduced to an extent smaller than that of FIG. 15D and displayed at the right top corner of the video output of the endoscope CCU 142 c.
  • the mode of display and the position and the size of the displayed image (or each of the displayed images) as defined by the video mixer 143 can be appropriately changed by means of an externally applied control signal. More specifically, the displayed image(s) can be enlarged or reduced independently with an appropriately selected magnification factor.
  • the output of the video mixer 143 is fed to a liquid crystal display 144 (display section) and displayed to the surgical operator for observation.
  • position and orientation sensor 145 (position and orientation detection section) comprises hard sensing plates 145 b and 145 c , each having three infrared light emitting diodes (LEDs) 145 a arranged at the respective corners of a triangle, a sensor assembly 145 d for detecting the quantity of light emitted from each of the infrared LEDs 145 a and a sensor controller 145 e for computationally determining the three-dimensional position and orientation of the sensing plate 145 b and that of the sensing plate 145 c from the output of the sensor assembly.
  • the position of each of the LEDs 145 a for emitting infrared rays is observed and determined in advance in terms of the coordinate system defined on each of the sensing plates 145 b and 145 c and stored in the sensor controller 145 e as LED definition data.
  • the sensing plate 145 b is attached to the head of the patient 146 in such a way that its position and orientation relative to the head would not change easily.
  • the other sensing plate 145 c is attached to the endoscope 142 by a mount section (not shown).
  • the sensor controller 145 e is connected to an image controller 147 (state of synthesis modification specifying section).
  • the image controller 147 is connected to the video mixer 143 .
  • the data on the area of surgical operation and the patient 146 are correlated by observing the coordinate values of the characteristic points in the model data coordinate system m and those of the characteristic points on the patient 146 in the patient coordinate system p defined by the sensing plate 145 b and computing a coordinate transformation matrix pHm.
  • the coordinate transformation matrix pHm is stored in the storage section of the image controller 147 .
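  • purely by way of illustration (the text does not specify how the matrix pHm is computed), a least-squares rigid registration of the paired characteristic points can be sketched as follows; the function name, the point format and the use of numpy are assumptions of the sketch:

      import numpy as np

      def estimate_rigid_transform(points_m, points_p):
          # Least-squares rigid fit (Kabsch / SVD) taking model-frame points to
          # patient-frame points; points_m and points_p are N x 3 arrays of
          # corresponding characteristic points.
          M = np.asarray(points_m, dtype=float)
          P = np.asarray(points_p, dtype=float)
          cm, cp = M.mean(axis=0), P.mean(axis=0)
          U, _, Vt = np.linalg.svd((P - cp).T @ (M - cm))
          D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # guard against a reflection
          R = U @ D @ Vt
          T = cp - R @ cm
          H = np.eye(4)
          H[:3, :3], H[:3, 3] = R, T
          return H    # candidate pHm: a homogeneous model point m maps to p as H @ (m, 1)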
  • the coordinate values of the front end and those of the rear end of the endoscope 142 are observed in terms of the endoscope coordinate system e defined by the sensing plate 145 c attached to the endoscope 142 and stored in the storage section of the image controller 147 .
  • the sensor controller 145 e drives the infrared LEDs 145 a to emit light sequentially and determines the three-dimensional position of each of the infrared LEDs 145 a on the basis of the output of the sensor assembly 145 d.
  • the sensor controller 145 e computationally determines the three-dimensional position and orientation of the sensing plate 145 b and that of the sensing plate 145 c , using the LED definition data stored in the sensor controller 145 e and outputs the obtained data to the image controller 147 upon request.
  • the image controller 147 computes the coordinate transformation matrix pHe from the sensing plate 145 b of the patient coordinate system p attached to the head of the patient 146 to the sensing plate 145 c of the endoscope coordinate system e attached to the endoscope 142 on the basis of the three-dimensional position and orientation information.
  • the image controller 147 also computationally determines the relative distance between the patient 146 and the endoscope 142 and the relative direction of the patient 146 as viewed from the endoscope 142 on the basis of the coordinate transformation matrix pHe and the above described coordinate transformation matrix pHm.
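  • this computation can be pictured roughly as follows with 4-by-4 homogeneous matrices; the world-frame sensor output, the variable names and the use of numpy are assumptions of the sketch, not part of the apparatus:

      import numpy as np

      def pose_of_endoscope_in_patient(wHp, wHe):
          # wHp, wHe: poses of the sensing plates 145b and 145c reported by the
          # sensor controller in a common measurement frame; the result expresses
          # the endoscope plate in the patient coordinate system p
          return np.linalg.inv(wHp) @ wHe

      def distance_and_direction(wHp, wHe, pHm, area_in_model, tip_in_endoscope):
          # area_in_model, tip_in_endoscope: homogeneous vectors (x, y, z, 1)
          pose = pose_of_endoscope_in_patient(wHp, wHe)
          tip_p = pose @ tip_in_endoscope       # endoscope tip in the patient frame
          area_p = pHm @ area_in_model          # area of operation in the patient frame
          v = area_p[:3] - tip_p[:3]
          d = float(np.linalg.norm(v))
          return d, (v / d if d > 0.0 else v)   # relative distance and unit direction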
  • the image controller 147 outputs a request signal cyclically with a predetermined period (e.g., 33 msec) to the sensor controller 145 e to receive the three-dimensional position and orientation information on the sensing plates 145 b and 145 c from the sensor controller 145 e (Step S 101 ).
  • the image controller 147 judges if the endoscope 142 is located close to the area of surgical operation (e.g., within a range of 50 mm) on the basis of the received three-dimensional position and orientation information (Step S 102 ).
  • when the image controller 147 judges that the endoscope 142 is located close to the area of surgical operation, it outputs an instruction for switching from the microscope image to the endoscope image (mode 1 ) to the video mixer 143 (Step S 103 ).
  • when the image controller 147 judges that the endoscope 142 is not located close to the area of surgical operation, it outputs an instruction for switching from the endoscope image to the microscope image (mode 0 ) to the video mixer 143 (Step S 104 ).
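  • the cycle of Steps S 101 through S 104 might then be sketched as follows; the 50 mm threshold and the 33 msec period are the example values given above, while the sensor and mixer interfaces (read_plate_poses, set_mode) are assumptions:

      import time

      CLOSE_RANGE_MM = 50.0       # example threshold used in Step S102
      REQUEST_PERIOD_S = 0.033    # request period of roughly 33 msec
      MODE_MICROSCOPE, MODE_ENDOSCOPE = 0, 1

      def image_controller_loop(sensor_controller, video_mixer, distance_mm):
          # distance_mm(wHp, wHe) returns the endoscope-to-operation-area distance,
          # for example computed as in the pose sketch above
          while True:
              wHp, wHe = sensor_controller.read_plate_poses()     # Step S101
              if distance_mm(wHp, wHe) <= CLOSE_RANGE_MM:         # Step S102
                  video_mixer.set_mode(MODE_ENDOSCOPE)            # Step S103 (mode 1)
              else:
                  video_mixer.set_mode(MODE_MICROSCOPE)           # Step S104 (mode 0)
              time.sleep(REQUEST_PERIOD_S)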
  • while the first observation section refers to the surgical microscope 141 in this embodiment, it may alternatively be the endoscope 142 or some other observation section or unit.
  • while the second observation section refers to the endoscope 142 in this embodiment, it may alternatively be the surgical microscope 141 or some other observation section or unit.
  • while the image synthesizing section refers to the video mixer 143 in this embodiment, it may be some other section for externally modifying the synthesized state of a plurality of images.
  • while the display section refers to the liquid crystal display 144 in this embodiment, it may alternatively be a CRT display, a head mounted display or a projector adapted to display video signals.
  • while the position and orientation detection section is the position and orientation sensor (comprising the infrared LEDs 145 a , the sensing plates 145 b , 145 c , the sensor assembly 145 d and the sensor controller 145 e ) in this embodiment, it may alternatively be any appropriate section for detecting the three-dimensional position and orientation of an object such as a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers.
  • while the position and orientation detection section of this embodiment detects the position and orientation of the endoscope 142 , it may alternatively detect the position and orientation of the microscope 141 or both the position and orientation of the endoscope 142 and that of the microscope 141 .
  • the fourth embodiment of surgical operation image acquisition/display apparatus has a configuration substantially the same as that of the above described third embodiment and hence the similar components in the graphic illustrations thereof will be denoted respectively by the same reference symbols and will not be described any further.
  • one of the obtained two images is dimensionally reduced and synthetically combined with the other image so that they may be displayed simultaneously on the display screen for observation.
  • the image controller 147 outputs a request signal cyclically with a predetermined period (e.g., 33 msec) to the sensor controller 145 e to receive the three-dimensional position and orientation information on the sensing plates 145 b and 145 c from the sensor controller 145 e (Step S 201 ).
  • the image controller 147 judges if the endoscope 142 is located close to the area of surgical operation (e.g., within a range of 50 mm) on the basis of the received three-dimensional position and orientation information (Step S 202 ).
  • when the image controller 147 judges that the endoscope 142 is located close to the area of surgical operation, it outputs an instruction for dimensionally reducing the microscope image and displaying it with the endoscope image (mode 3 ) to the video mixer 143 (Step S 203 ).
  • when the image controller 147 judges that the endoscope 142 is not located close to the area of surgical operation, it outputs an instruction for dimensionally reducing the endoscope image and displaying it with the microscope image (mode 2 ) to the video mixer 143 (Step S 204 ).
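  • compared with the sketch given for the third embodiment, only the pair of modes selected in Steps S 203 and S 204 changes, for example:

      MODE_ENDOSCOPE_INSET = 2    # mode 2: reduced endoscope image over the microscope image
      MODE_MICROSCOPE_INSET = 3   # mode 3: reduced microscope image over the endoscope image

      def choose_mode(distance_mm, close_range_mm=50.0):
          # Steps S202 to S204: keep both images on screen and reduce the one
          # corresponding to the currently less relevant view
          return MODE_MICROSCOPE_INSET if distance_mm <= close_range_mm else MODE_ENDOSCOPE_INSET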
  • while the first observation section refers to the surgical microscope 141 in these embodiments, it may alternatively be the endoscope 142 or some other observation section or unit.
  • while the second observation section refers to the endoscope 142 in these embodiments, it may alternatively be the surgical microscope 141 or some other observation section or unit.
  • while the image synthesizing section refers to the video mixer 143 in these embodiments, it is by no means limited thereto and may be some other section for externally modifying the synthesized state of a plurality of images.
  • while the display section refers to the liquid crystal display 144 in these embodiments, it may alternatively be a CRT display, a head mounted display or a projector adapted to display video signals.
  • while the position and orientation detection section is the position and orientation sensor (comprising the infrared LEDs 145 a , the sensing plates 145 b , 145 c , the sensor assembly 145 d and the sensor controller 145 e ) in these embodiments, it may alternatively be any appropriate section for detecting the three-dimensional position and orientation of an object such as a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers.
  • the state of synthesis modification specifying section refers to the image controller in these embodiments.
  • both the third and fourth embodiments of the invention provide a surgical operation image acquisition/display apparatus that efficiently assists a surgeon to smoothly carry out a surgical operation without requiring him or her to switch the observation system from one to another by means of a navigation apparatus when the surgical operation is conducted by using a plurality of observation systems including a surgical microscope and an endoscope.

Abstract

A navigation apparatus comprises a navigation-related information generating section and a display section. The navigation-related information generating section measures the position and orientation of an object and a target in a three-dimensional space and generates navigation-related information to be used for navigating the object toward the target. The display section displays the navigation-related information generated by the navigation-related information generating section in any of different modes depending on the relationship of the position and orientation of the object and that of the target. A surgical operation image acquisition/display apparatus comprises an observation section, an image display section and a specifying section. The observation section includes a plurality of observation sections whose positions and orientations are modifiable. The image display section is adapted to alternatively display any of the images obtained by the observation sections or synthetically combine and display the combined images. The specifying section specifies the image to be displayed to the image display section according to the position and orientation of the observation section.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based upon and claims the benefit of priority from the prior Japanese Patent Applications No. 11-089405, filed Mar. 30, 1999; and No. 11-163964, filed Jun. 10, 1999, the entire contents of which are incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • This invention relates to a navigation apparatus and, more particularly, to a navigation apparatus adapted to modify navigation-related information according to the relative position and orientation of the object of navigation and the target within a three-dimensional space. [0002]
  • This invention also relates to a surgical operation image acquisition/display apparatus and, more particularly, to an operation image acquisition/display apparatus adapted to acquire and display images of a plurality of observation systems used in surgical operations. [0003]
  • Various navigation apparatus have been proposed for applications in the field of surgical operations, including those disclosed in Jpn. Pat. Appln. KOKAI Publication Nos. 9-173352 and 10-5245. [0004]
  • The medical navigation system disclosed in Jpn. Pat. Appln. KOKAI Publication No. 9-173352 is adapted to display information (profile information, medical image information) on the desired part of the object of examination specified by a specifying section for specifying a desired part of the object of examination. [0005]
  • It is also adapted to display video information obtained by an appearance imaging section, profile information on the profile measured by a profile measuring section and medical image information obtained by a medical image acquisition section on an image display section in an overlaid way. [0006]
  • The surgical operation assisting apparatus disclosed in Jpn. Pat. Appln. KOKAI Publication No. 10-5245 is adapted to display the current position of the surgical instrument being used in a surgical operation and the blood vessel located closest to the instrument on a tomographic image of the area of surgical operation in an overlaid manner by using the image data on the tomographic image, the surgical instrument being used, a blood vessel detection section for detecting the blood vessel located closest to the surgical instrument, a position detection section for detecting the current position of the surgical instrument, an arithmetic computing section for computationally determining the position of the front end of the surgical instrument and the direction in which the surgical instrument is inserted, an image selection section for selecting the image data on the image being acquired for the area where the front end of the surgical instrument is located and an image synthesizing section for synthetically combining the image selected by the image selection section and a predetermined pattern indicating the front end of the surgical instrument in an overlaid manner. [0007]
  • The above described arrangement is intended to allow the operator to visually confirm the position of the front end of the surgical instrument inserted into the body of the patient on the tomographic image being displayed. [0008]
  • However, the medical navigation system and the surgical operation assisting apparatus as disclosed in the above patent documents are accompanied by the following problems. [0009]
  • As for the medical navigation system disclosed in Jpn. Pat. Appln. KOKAI Publication No. 9-173352, it simply displays information on a desired part of the object of examination specified by the section for specifying a desired part of the object of examination and it is difficult to navigate the section to the part desired by the user. [0010]
  • Additionally, with this known medical navigation system it is difficult for the surgeon to realize distances in the perspective of the displayed information along the direction connecting the eyes of the surgeon and the display screen, that is, the direction perpendicular to the screen. [0011]
  • Furthermore, with this known medical navigation system it is additionally difficult for the surgeon to determine the route of navigation on the basis of the displayed information when both the object of examination and the section for specifying the desired part of the object of examination are located at respective positions that are found within the measurable area but outside the displayable area of the system. [0012]
  • The surgical operation assisting system disclosed in Jpn. Pat. Appln. KOKAI Publication No. 10-5245 is accompanied by the cumbersome problem that the user has to be constantly aware of the distance between the position of the front end of the surgical instrument on the displayed tomographic image and the position of the detected blood vessel in order to know the distance between the blood vessel and the surgical instrument. [0013]
  • In recent years, micro-surgery has become popular as a result of the development of both surgical techniques and surgical instruments. [0014]
  • In micro-surgery, generally a surgical microscope is used to observe an enlarged view of the area of surgical operation. [0015]
  • Particularly, in the field of cranial nerve surgery and otorhinolaryngology, there frequently arise occasions where the area of operation can hardly be observed because it is at the so-called dead angle even if the surgical microscope is handled elaborately when the area is located deep in the body. [0016]
  • For observing an area at such a dead angle, normally a mirror or an endoscope is used. [0017]
  • When using an endoscope for micro-surgery, it has to be manipulated and placed accurately at the right position, which is located deep in the body having an exquisitely complicated three-dimensional structure, because the area of operation is always at the dead angle of the surgical microscope. [0018]
  • The manipulation has to be conducted carefully by the operator, while observing it through the surgical microscope so that any normal tissues of the patient would not be inadvertently damaged by the endoscope and, at the same time, the area of operation has to be visually confirmed by means of the endoscope. [0019]
  • While manipulating the endoscope, the operator has to select instantaneously either the image taken by the surgical microscope or the image acquired by way of the endoscope as object of observation and the selection has to be correct. [0020]
  • As an attempt for aiding a surgeon manipulating the endoscope, Jpn. Pat. Appln. KOKAI Publication No. 5-203881 proposes an integrated image system comprising a plurality of CCD cameras connected to respective observation systems, each including a surgical microscope, an endoscope and other instruments, a CCD camera controller for controlling the operation of selectively using any of the observation systems and a view finder controller so that the user may select any of the observation systems by means of the CCD camera controller in the course of the ongoing surgical operation. [0021]
  • Jpn. Pat. Appln. KOKAI Publication No. 7-261094 discloses a surgical microscope with which the user can switch from the image of the surgical microscope to that of the endoscope or vice versa or overlay one on the other whenever necessary. [0022]
  • However, with the known technique disclosed in the above described Jpn. Pat. Appln. KOKAI Publication No. 5-203881, the operator has to carry out the switching or overlaying operation at the cost of a smooth progress of the ongoing surgical operation. [0023]
  • Additionally, while the above patent document describes that the image may be switched from one to the other, it does not describe specifically how the switching operation proceeds. [0024]
  • On the other hand, the known technique disclosed in Jpn. Pat. Appln. KOKAI Publication No. 7-261094 involves the use of a mode switch with which the surgical operator can switch the display mode whenever necessary. [0025]
  • However, it is highly cumbersome for the operator to switch from the image of the surgical microscope to that of the endoscope or vice versa when he or she has to place the endoscope in a position deep in the body of the patient having an exquisitely complicated three-dimensional structure. Additionally, such a switching operation can obstruct the smooth progress of the surgical operation. [0026]
  • BRIEF SUMMARY OF THE INVENTION
  • In view of the above identified problems of the prior art, it is therefore the object of the present invention to provide a navigation apparatus with which the user can easily and visually realize the distance between a target and an object of navigation by modifying the obtained navigation-related information according to the relative position and orientation of the object of navigation and the target within a three-dimensional space and the user can easily obtain navigation-related information of the type necessary for the user. [0027]
  • Another object of the invention is to provide an operation image acquisition/display apparatus adapted to acquire and display images of a plurality of observation systems used in surgical operations without requiring the operator to manually switch from one observation system to another so that the ongoing surgical operation may proceed smoothly. [0028]
  • In the first aspect of the invention, the above first object is achieved by providing a navigation apparatus comprising: [0029]
  • a navigation-related information generating section for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate the object to the target; and [0030]
  • a display section for displaying the navigation-related information generated by the navigation-related information generating section in different modes according to the relative position and orientation of the object and the target. [0031]
  • Thus, a navigation apparatus according to the invention is adapted to display navigation-related information in different modes according to the relative position and orientation of the object and the target within a three-dimensional space. [0032]
  • A navigation apparatus according to the invention will be described hereinafter in terms of the first and second embodiments. While the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance. [0033]
  • While the above object may normally refer to an [0034] endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above display section may normally refer to a liquid crystal monitor, it may alternatively refer to some other video information display such as a CRT display or a head mount display. [0035]
  • For the purpose of the invention, the above expression “in different modes” refers to differences in color, in the thickness of line, in dimensions and in the density of drawing. [0036]
  • In the second aspect of the invention, the above second object is achieved by providing a surgical operation image acquisition/display apparatus comprising: [0037]
  • an observation section having a plurality of observation sections and adapted to modify its position and orientation; [0038]
  • an image display section adapted to alternatively or synthetically display the images obtained by the plurality of observation sections of the observation section; and [0039]
  • an indication section for indicating the images to be displayed. [0040]
  • Additional objects and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.[0041]
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate presently preferred embodiments of the invention, and together with the general description given above and the detailed description of the preferred embodiments given below, serve to explain the principles of the invention. [0042]
  • FIG. 1 is a schematic illustration of the first embodiment of the invention which is a navigation apparatus, showing its configuration; [0043]
  • FIG. 2 is a schematic illustration of a distance map that can be used for a navigation apparatus according to the invention; [0044]
  • FIG. 3 is a schematic illustration of a distance map that can be used for a navigation apparatus according to the invention; [0045]
  • FIG. 4 is a schematic illustration of the relationship between data on an object of examination and the object of examination itself; [0046]
  • FIG. 5 is a schematic illustration of a coordinate transformation matrix for correlating data on an object of examination and the object of examination itself; [0047]
  • FIG. 6 is a schematic illustration of a coordinate transformation matrix for transforming a coordinate system defined by a sensing plate fitted to an endoscope into a coordinate system to be used by a camera model expressing the optical system of the endoscope and a coordinate transformation matrix for transforming the coordinate system of a camera model into the coordinate system of a liquid crystal monitor; [0048]
  • FIG. 7 is a schematic illustration of a coordinate transformation matrix for transforming a coordinate system defined by a sensing plate fitted to the head of an object of examination into the coordinate system defined by a sensing plate fitted to an endoscope. [0049]
  • FIG. 8 is a schematic illustration of a transformation using a plurality of coordinate transformation matrices for transforming data on a target area into positional data on a liquid crystal monitor; [0050]
  • FIGS. 9A through 9D schematically illustrate examples of images that may be displayed on a liquid crystal monitor, of which FIG. 9A is an image obtained by overlaying a wireframe image as navigation information on an image obtained by means of the optical system of an endoscope, FIG. 9B is an image obtained by overlaying an internal tomographic image of three-dimensional volume data as navigation information on an image obtained by means of the optical system of an endoscope, FIG. 9C is an image obtained when no target area is found within the effective area of measurement of an endoscope and FIG. 9D is an image obtained when the apparatus is inoperative for measurement; [0051]
  • FIG. 10 is a flow chart of a display operation for displaying an image as shown in FIG. 9C; [0052]
  • FIG. 11 is a schematic illustration of the second embodiment of the invention which is also a navigation apparatus, showing its configuration; [0053]
  • FIG. 12 is a schematic illustration of an example of an image displayed by the second embodiment and a coordinate transformation matrix that can be used for the display; [0054]
  • FIG. 13 is a schematic illustration of an operation of modifying the thickness of lines of an orthogonally projected image of an endoscope as a function of the relative distance between the target area and the front end of an endoscope; [0055]
  • FIG. 14 is a schematic block diagram of the third embodiment of the invention which is a surgical operation image acquisition/display apparatus, showing its configuration; [0056]
  • FIGS. 15A through 15F are schematic illustrations of a plurality of display modes that can be realized by the [0057] video mixer 143 of FIG. 14;
  • FIG. 16 is a schematic illustration of an operation of correlating data on the operation area of a [0058] patient 146 and data on the characteristic points of a model data coordinate system m;
  • FIG. 17 is a schematic illustration of a mode of computationally obtaining a coordinate transformation matrix pHe for transforming the patient coordinate system p defined by the [0059] sensing plate 145 b fitted to the head of a patient 146 to the endoscope coordinate system e defined by the sensing plate 145 c fitted to an endoscope 142;
  • FIG. 18 is a flow chart of the operation of the [0060] image controller 147 of FIG. 14; and
  • FIG. 19 is a flow chart of the operation of the [0061] image controller 147 of the fourth embodiment of the invention which is also a surgical operation image acquisition/display apparatus.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Reference will now be made in detail to the presently preferred embodiments of the invention as illustrated in the several views of the accompanying drawing, in which like reference numerals designate like or corresponding parts. [0062]
  • (Embodiment 1) [0063]
  • FIG. 1 is a schematic illustration of the first embodiment of the invention which is a navigation apparatus, showing its configuration. [0064]
  • Referring to FIG. 1, object of [0065] examination 1, or patient, is lying flat on an operating table, facing upward.
  • A [0066] hard sensing plate 2 carrying three LEDs for emitting infrared rays that are arranged at the respective corners of a triangle is securely fitted to the head of the object of examination 1 in such a way that its position and orientation relative to the head would not change easily.
  • Another [0067] hard sensing plate 4 carrying three LEDs for emitting infrared rays arranged at the respective corners of a triangle is securely fitted to an endoscope 3.
  • The LEDs arranged on the [0068] sensing plate 2 and those arranged on the sensing plate 4 do not change their positional relationships.
  • The positions of the LEDs of each of the sensing plates 2 and 4 are observed and determined in advance in terms of the coordinate system defined on the sensing plate and stored in sensor information storage section 5 as LED definition data. [0069]
  • The sensor [0070] information storage section 5 is connected to sensor control section 6.
  • Then, image acquisition [0071] type sensor assembly 7 is arranged at a position where the sensing plates 2 and 4 are found within its effective area of measurement.
  • Then, a three-dimensional position and orientation measuring section is established as the [0072] sensing plates 2 and 4 and the sensor assembly 7 are connected to sensor control section 6.
  • The three-dimensional position and orientation information obtained by the three-dimensional position and orientation measuring section is sent to navigation-related [0073] information control section 8.
  • The information including profile information and internal tomographic image information on the object of examination, the tumor thereof to be surgically treated and the parts thereof requiring special attention during a surgical operation and obtained in advance by measurement using CT and/or MRI is divided into low resolution information (e.g., for a resolution level of 32×32×32 voxels), medium resolution information (e.g., for a resolution level of 128×128×128 voxels) and high resolution information (e.g., for a resolution level of 512×512×512) and then transformed into wireframe three-dimensional model data [0074] 10 (high resolution wireframe three-dimensional model data 10 a, medium resolution wireframe three-dimensional model data 10 b, low resolution wireframe three-dimensional model data 10 c) and three-dimensional volume data 11 (high resolution three-dimensional volume data 11 a, medium resolution three-dimensional volume data 11 b, low resolution three-dimensional volume data 11 c) and stored in the navigation-related information storage section 9 as data.
  • The navigation-related information storage section 9 additionally stores a distance map in advance. [0075]
  • As shown in FIGS. 2 and 3, a [0076] distance map 12 contains a three-dimensional array having values representing the shortest distances from the surface of the target area (the object of examination, the tumor to be surgically treated or the parts of the body requiring special attention during a surgical operation), the affixed numbers of the array being variables corresponding to the three-dimensional positional coordinate system of the space where the target area is located.
  • For example, when the smallest unit of division is 0.1 mm, 1/10 of an index number represents a corresponding coordinate value as expressed in terms of millimeters. [0077]
  • Assume that such a distance map is prepared for each target area in advance by means of a distance map preparing computer and stored in the navigation-related [0078] information storage section 9 as data.
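  • a look-up into such a distance map might be sketched as follows; the 0.1 mm unit is the example given above, while the map origin, the function name and the use of numpy are assumptions:

      import numpy as np

      VOXEL_MM = 0.1    # example smallest unit of division

      def shortest_distance_to_target(distance_map, point_mm, origin_mm=(0.0, 0.0, 0.0)):
          # point_mm: a position (in mm) in the coordinate system of the space where
          # the target area is located; the array indices are the position divided
          # by the unit of division
          idx = tuple(int(round((p - o) / VOXEL_MM)) for p, o in zip(point_mm, origin_mm))
          return float(distance_map[idx])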
  • Note that all the wireframe three-dimensional model data 10 , the three-dimensional volume data 11 and the distance map 12 are subjected to a coordinate transforming operation so that they are expressed in terms of a same coordinate system. [0079]
  • Then, the image obtained by way of the optical system of the [0080] endoscope 3 is taken into the navigation-related information control section 8 by way of a camera control unit and an image input board (not shown).
  • The navigation-related information generated by the navigation-related information control section 8 is displayed to the user on the information display section, which is a liquid crystal monitor 13. [0081]
  • As shown in FIG. 4, data on the object of [0082] examination 1 and the object of examination 1 itself are correlated by measuring the coordinate value m of each characteristic point on the data and the coordinate values p of the corresponding characteristic point as defined by the sensing plate 2 and computing a coordinate transformation matrix (pHm) 14.
  • The coordinate transformation matrix (pHm) [0083] 14 is stored in the above navigation-related information storage section 9.
  • As shown in FIG. 5, a coordinate transformation matrix is a 4-row and 4-column matrix comprising a rotational component R representing a rotary motion in a three-dimensional space, a translational component T representing a translation in the three-dimensional space and a constant component. [0084]
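  • for instance, such a matrix can be assembled as follows (a minimal sketch; the function name is arbitrary and numpy is used only as a convenient matrix library):

      import numpy as np

      def homogeneous_matrix(R, T):
          # 4-row, 4-column coordinate transformation matrix built from a 3 x 3
          # rotational component R and a 3 x 1 translational component T; the last
          # row (0 0 0 1) is the constant component
          H = np.eye(4)
          H[:3, :3] = np.asarray(R, dtype=float)
          H[:3, 3] = np.asarray(T, dtype=float).reshape(3)
          return H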
  • Additionally, as shown in FIG. 6 a coordinate transformation matrix (cHe) [0085] 15 for transforming the coordinate system defined by the sensing plate 4 into the coordinate system to be used by a camera model expressing the optical system of the endoscope 3 and a coordinate transformation matrix (f_ctos) 16 for transforming the camera model coordinate system into the coordinate system on the actual liquid crystal monitor 13 are also determined and stored in the navigation-related information storage section 9.
  • Now, the operation of the first embodiment of navigation apparatus according to the invention and having the above described configuration will be discussed below. [0086]
  • During the operation of the navigation apparatus, the sensor control section 6 that is a component of the three-dimensional position and orientation measuring section measures the three-dimensional position of each of the LEDs that are emitting infrared rays of the sensing plates 2 and 4 and then computationally determines the three-dimensional position and orientation information of each of the sensing plates 2 and 4 in terms of the coordinate values of the original point of the space defined by the sensing plate 4 on the three-dimensional space defined in turn by the sensing plate 2 and the values of the unit vectors along the X, Y and Z axes of the space defined by the sensing plate 4 by using the LED definition data stored in the sensor information storage section 5 . [0087]
  • Then, as shown in FIG. 7, the coordinate transformation matrix (pHe) [0088] 17 from the sensing plate 2 attached to the head of the object of examination 1 to the sensing plate 4 attached to the endoscope 3 is computationally determined on the basis of the obtained three-dimensional position and orientation information.
  • Then, as shown in FIG. 8, the data of the target area are converted into positional data on the liquid crystal monitor 13 , and the navigation-related information control section 8 generates navigation-related information by using the positional data obtained on the basis of the coordinate transformation matrix 17 and the coordinate transformation matrices 14 , 15 and 16 . [0089]
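  • the chain of transformations can be pictured roughly as follows, reading each matrix as a mapping from one coordinate system into the next; the direction conventions are an assumption inferred from the text, and the perspective division of the camera model is omitted:

      def model_to_monitor_matrix(pHm, pHe, cHe, f_ctos):
          # model data -> patient (pHm, matrix 14) -> endoscope plate (pHe, matrix 17)
          # -> camera model (cHe, matrix 15) -> liquid crystal monitor (f_ctos, matrix 16)
          return f_ctos @ cHe @ pHe @ pHm

      # a homogeneous model point (x, y, z, 1) is then mapped to monitor coordinates by
      #   monitor_point = model_to_monitor_matrix(pHm, pHe, cHe, f_ctos) @ model_point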
  • As the image formed by the optical system of the [0090] endoscope 3 is input to the navigation-related information control section 8, the navigation-related information and the image are displayed on the liquid crystal monitor 13 in an overlaid manner as shown in FIG. 9A.
  • Then, as shown in FIG. 7, the position of the front end of the [0091] endoscope 3 is subjected to an operation of coordinate transformation by using the above described coordinate transformation matrices 14, 15 and 17 and the relative distance between the target area and the front end of the endoscope 3 is determined by referring to the distance map 12.
  • Then, as shown in FIG. 9A, when the object of [0092] examination 1 and the endoscope 3 are in a measurable state, the relative distance between the target area and the front end of the endoscope 3 is displayed on the liquid crystal monitor 13 as the distance to the tumor in terms of the length of a bar 30 and a numerical value 31.
  • In a surgical operation using an [0093] endoscope 3, the endoscope 3 has to be inserted toward the tumor from the outside of the object of examination 1, paying attention to the parts that should not be damaged, and then the tumor has to be surgically treated.
  • When the [0094] endoscope 3 is located outside the object of examination 1, the model image of the target area is generated as a profiled, wireframe image 18 as shown in FIG. 9A.
  • In this embodiment, the color and the thickness of the lines of the [0095] wireframe image 18 are made to vary as a function of the relative distance between the front end of the endoscope 3 and the surface of the target area as determined in a manner as described above.
  • The color and the width of the [0096] bar 30 and those of the numerical value 31 showing the distance to the tumor are also made to vary along with those of the background.
  • For instance, when the relative distance is equal to or greater than 10 mm, the color of the lines of the wireframe image 18 may be blue and the thickness of the lines may be equal to 1 pixel while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally blue and the width of the bar 30 may be equal to 20 pixels. [0097]
  • When, on the other hand, the relative distance is equal to or greater than 0 mm and smaller than 10 mm, the color of the lines of the wireframe image 18 may be yellow and the thickness of the lines may be equal to 2 pixels while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally yellow and the width of the bar 30 may be equal to 30 pixels. [0098]
  • If the front end of the endoscope 3 is inserted by a distance equal to or greater than 0 mm and smaller than 10 mm, the color of the lines of the wireframe image 18 may be purple and the thickness of the lines may be equal to 2 pixels while both the color of the bar 30 representing the distance and that of the background of the numerical value 31 may be equally purple and the width of the bar 30 may be equal to 30 pixels. [0099]
  • In this way, when the front end of the endoscope 3 has traveled by a predetermined distance, both the color of the wireframe image 18 and the thickness of the lines of the wireframe image 18 may be made to change so that the user can visually recognize the distance between the surface of the target area and the front end of the endoscope 3. [0100]
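  • restating the example values above as a small selection routine (the color names and thresholds are taken from the text; the function name and the boolean insertion flag are assumptions):

      def drawing_attributes(distance_mm, inserted):
          # returns (line color, line thickness in pixels, width of the bar 30 in pixels)
          if inserted:                  # front end inserted into the object of examination
              return "purple", 2, 30
          if distance_mm >= 10.0:
              return "blue", 1, 20
          return "yellow", 2, 30        # 0 mm <= distance < 10 mm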
  • Additionally, the [0101] wireframe image 18 of an area requiring special attention may be drawn with thick lines when the endoscope 3 is located close to the area and separated therefrom by a distance smaller than a predetermined value so that the user may visually recognize that the endoscope 3 is too close to the area.
  • For instance, when the reference value of the [0102] distance map 12 for the area requiring special attention is less than 10 mm, the lines of the area requiring special attention of the wireframe image 18 may be made five times thicker than before.
  • The denseness or coarseness of the [0103] wireframe image 18 that is drawn in correspondence to the relative distance between the endoscope 3 and the surface of the target area is also made to vary.
  • More specifically, a set of more detailed wireframe three-dimensional model data 10 a will be selected as the relative distance is reduced, whereas a set of coarser wireframe three-dimensional model data 10 c will be selected as the relative distance is increased. [0104]
  • For instance, high resolution wireframe three-dimensional model data 10 a will be used when the distance to the target area is less than 30 mm and medium resolution wireframe three-dimensional model data 10 b will be used when the distance to the target area is between 30 mm and 100 mm, whereas low resolution wireframe three-dimensional model data 10 c will be used when the distance to the target area is greater than 100 mm. [0105]
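  • the same selection can be written as a simple level-of-detail rule; the distances are the example values given above and the argument names are arbitrary:

      def select_wireframe_data(distance_mm, data_high, data_medium, data_low):
          if distance_mm < 30.0:
              return data_high      # high resolution wireframe data 10a
          if distance_mm <= 100.0:
              return data_medium    # medium resolution wireframe data 10b
          return data_low           # low resolution wireframe data 10c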
  • With this arrangement, the problem of the prior art that coarse wireframe three-dimensional model data have to be used to reduce the time until the completion of drawing a wireframe image in order to save time when the endoscope is approaching the target area whereas dense wireframe three-dimensional model data are used to unnecessarily consume time before the completion of drawing a wireframe image when the endoscope is remote from the target area is successfully eliminated and a required level of detail and drawing rate can be realized depending on the distance between the endoscope and the target area. [0106]
  • Additionally, when the [0107] endoscope 3 is inserted into the object of examination 1 by using the above described embodiment, the model image of the object of examination 1 drawn by the embodiment is switched from the wireframe image 18 to the internal tomographic image 19 obtained by using the three-dimensional volume data of the object of examination 1 as shown in FIG. 9B depending on the relative distance between the endoscope 3 and the outermost zone of the target area.
  • The relative distance between the front end of the [0108] endoscope 3 and the surface of the target is determined in a manner as described above.
  • For instance, the model image of the object of examination may be switched from the [0109] wireframe image 18 to an internal tomographic image 19 obtained by using the three-dimensional volume data of the object of examination that reflect the viewing direction of the endoscope.
  • As a result of this switching operation, the user can easily acquire the internal [0110] tomographic image 19 that is very important after the insertion of the endoscope 3, instead of the wireframe image 18 of the object of examination that becomes unnecessary after the insertion of the endoscope 3, without being required to carry out the switching operation by him- or herself.
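A minimal sketch of the switching described above, assuming the switch is triggered when the front end of the endoscope 3 passes the outermost zone of the target area; the sign convention of the distance is an assumption.

```python
# Sketch of the model-image switching: wireframe image 18 outside the target
# area, internal tomographic image 19 once the endoscope has entered it.

def model_image_kind(distance_to_outermost_zone_mm):
    if distance_to_outermost_zone_mm > 0:
        return "wireframe_image_18"          # endoscope still outside the target area
    return "internal_tomographic_image_19"   # endoscope inserted into the target area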
  • Thus, the navigation-related information displayed to the user when the [0111] endoscope 3 is inserted into the object of examination includes the internal tomographic image obtained by using the three-dimensional volume data of the object of examination 1 and the wireframe image 20 of the area requiring special attention.
  • As the [0112] endoscope 3 is brought close to the target and separated from the latter by a distance smaller than a predetermined value under this condition, not only the drawing attributes of the wireframe image 20 but also the color of the internal tomographic image 19 drawn by using the three-dimensional volume data are made to change.
  • If, on the other hand, the target is not found within the effective area of measurement of the [0113] endoscope 3, arrow 21 indicates the direction in which the target area will be found as shown in FIG. 9C.
  • Whether the model image is found within the drawable (displayable) range or not can be determined by checking if the coordinate of each and every point of the model image as computed when drawing the model image is found as a point on the monitor to be used for displaying the image. [0114]
  • Referring to FIG. 10, the coordinate of the model image is transformed into the coordinate of the display screen (Step S[0115] 1) and it is determined if the coordinate of a transformed point is found within the displayable range of the display screen (Step S2).
  • If the coordinate is found within the displayable range, the model image is displayed on the screen (Step S[0116] 3).
  • If, on the other hand, the coordinate is not found within the display range, the coordinate of a representative point of the model image is transformed into the corresponding coordinate of the display screen (Step S[0117] 4) and, at the same time, the coordinate of the front end of the endoscope 3 is transformed into the corresponding coordinate of the display screen (Step S5). Then, the distance and the direction of the line connecting the representative point of the model image and the front end of the endoscope 3 are computationally determined (Step S6).
  • Thus, the relative distance and the relative direction of the line connecting the [0118] center 22 of the endoscope image, or the front end of the endoscope 3, and the target can be determined by transforming the coordinate values 23 on the model data coordinate system of the representative point of the target into the coordinate values on the liquid crystal monitor 13 by means of the above described coordinate transformation matrices 14, 15, 16 and 17.
  • Then, the user can visually comprehend the extent to which the [0119] endoscope 3 should be moved to bring the target into the effective area of measurement of the endoscope 3 by modifying the size of the arrow 21 indicating the target area in proportion to the obtained distance.
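The processing of Steps S1 through S6 and the scaling of the arrow 21 could be sketched as follows, assuming the coordinate transformation matrices 14 through 17 have already mapped the points onto the monitor coordinate system; the screen resolution and the arrow scaling factor are assumptions.

```python
# Sketch of Steps S1-S6 of FIG. 10: draw the model image when its transformed
# points fall inside the display range, otherwise draw the arrow 21 from the
# image centre (endoscope front end) toward the representative point of the target.
import math

SCREEN_W, SCREEN_H = 640, 480          # assumed monitor resolution

def on_screen(pt):
    x, y = pt
    return 0 <= x < SCREEN_W and 0 <= y < SCREEN_H

def navigation_overlay(model_points_screen, target_rep_screen, tip_screen):
    # Steps S1/S2/S3: display the model image if its points are within range.
    if all(on_screen(p) for p in model_points_screen):
        return {"draw": "model_image"}
    # Steps S4-S6: distance and direction from the endoscope front end to the
    # representative point of the target, both in screen coordinates.
    dx = target_rep_screen[0] - tip_screen[0]
    dy = target_rep_screen[1] - tip_screen[1]
    dist = math.hypot(dx, dy)
    angle = math.atan2(dy, dx)
    # The arrow 21 is scaled in proportion to the distance (factor assumed).
    return {"draw": "arrow", "angle_rad": angle, "length_px": 0.2 * dist}
```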
  • When the apparatus is incapable of measuring the distance and the direction, the [0120] sensor control section 6 outputs a message telling the user that the apparatus is incapable of measuring the distance and the direction in place of three-dimensional position and orientation information.
  • Upon receiving this message, the navigation-related [0121] information control section 8 erases the model image being displayed as navigation-related information and generates character information 28 of “unmeasurable condition” and a yellow pixel frame 29 having a width equal to 60 pixels, which are then displayed to the user on the liquid crystal monitor 13 as shown in FIG. 9D.
  • Then, the user can easily comprehend that the [0122] endoscope 3 or the object of examination 1 is located at a position that makes the intended measurement impossible, so that the user can be effectively protected against the risk of operating the endoscope 3 in a wrong way according to navigation-related information that does not reflect the reality.
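A sketch of the fallback behaviour described above; the display methods are hypothetical placeholders for whatever drawing interface the navigation-related information control section 8 uses.

```python
# Sketch: when the sensor control section reports an unmeasurable condition,
# clear the model image and show character information 28 and the yellow
# 60-pixel-wide frame 29 instead of navigation-related information.

def update_display(measurement, display):
    if measurement is None:                              # "unmeasurable" message received
        display.clear_model_image()
        display.show_text("unmeasurable condition")       # character information 28
        display.show_frame(color="yellow", width_px=60)   # pixel frame 29
    else:
        display.draw_navigation(measurement)
```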
  • It may be needless to say that the configuration of this embodiment can be modified and/or altered in various different ways. [0123]
  • For instance, the area that provides the object of navigation is not limited to the target area and may alternatively be a plurality of any areas which are defined by information on the profile of the object of [0124] examination 1 or information on an internal tomographic image.
  • It is also possible to carry out a simulation by fitting a sensing plate to the head of a virtual object of examination without using an actual object of [0125] examination 1.
  • The three-dimensional position and orientation measuring section may be made to alternatively comprise a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers popularly used in ordinary three-dimensional position and orientation measuring systems. [0126]
  • When the object of examination is immobile, it is sufficient to measure the three-dimensional position and orientation of the object of [0127] examination 1, store the information in the sensor information storage section and utilize it in the computational operation for determining the relative three-dimensional position and orientation of the object of examination and the object of navigation in advance so that it is only necessary to measure the three-dimensional position and orientation of the object of navigation when the system is in operation.
  • The wireframe for expressing the profile information of the target area may be replaced by any known technique for graphic expression that is popularly used for three-dimensional computer graphics including polygons. [0128]
  • Alternatively, contour lines and equidistant curves relative to the viewing direction may be used. [0129]
  • The [0130] endoscope 3 that is the object of navigation may be replaced by a plurality of endoscopes.
  • The object of navigation may be a surgical instrument that is not provided with a section of observation. [0131]
  • The technique used for determining if the target area is located within the effective area of measurement or not is not limited to the above described one. [0132]
  • The technique for determining the relative distance between the object of navigation and the target area is not limited to the above described one that uses a distance map and any known technique for computationally determining the distance between two points in a three-dimensional space may alternatively be used for determining the distance between a representative point of the object of navigation and a representative point of the target area for the purpose of the invention. [0133]
  • Additionally, the color may be made to change continuously as a function of the distance instead of the above described use of a single boundary value. Alternatively, the color may be made to change stepwise by providing a plurality of boundary values. [0134]
  • Similarly, the line thickness may be made to change continuously instead of the above described use of a single boundary value. Alternatively, the line thickness may be made to change stepwise by providing a plurality of boundary values. [0135]
  • A situation where there is no navigation-related information to be displayed may be indicated by making the color to be transparent and the lines to be practically invisible. [0136]
  • The density of lines for drawing the model image that varies as a function of the distance may be made to change continuously on the basis of a single set of data instead of selectively using a plurality of sets of data with different levels of density that are provided in advance as described above. [0137]
  • The pattern that is displayed when the target area is out of the effective area of measurement is not limited to the [0138] arrow 21. Alternatively, a triangle, a circle, a bar or some other figure may be used. The distance may be expressed by the size of the figure.
  • Furthermore, the size of the [0139] arrow 21 may be made to vary stepwise by using a plurality of preselected values instead of making it vary continuously in a manner as described above.
  • When a section for determining the density of lines for drawing the model image of the target area on the basis of a single set of data is provided, it is no longer necessary to store in advance a plurality of sets of data with different levels of density. [0140]
  • The navigation-related information indicating a situation where the apparatus is incapable of measuring the distance and the direction may not require both [0141] character information 28 and a frame 29. It may be sufficient to use only either character information 28 or a frame 29 to convey the information.
  • It may be so arranged that the user can define the color and the line thickness that are used as attributes of the navigation-related information, the density of lines for drawing the model image, the size of the displayed pattern, the boundary values for changing the color and the line thickness as a function of the distance and the character string of the [0142] character information 28 indicating the incapability of measurement of the apparatus.
  • (Embodiment 2) [0143]
  • Now, the second embodiment of the invention, which is a navigation apparatus, will be discussed below. [0144]
  • FIG. 11 is a schematic illustration of the second embodiment of the invention which is also a navigation apparatus, showing its configuration. [0145]
  • This second embodiment has a configuration same as the above described first embodiment except the following. [0146]
  • In this embodiment, the [0147] endoscope 3 is not required to pass the imaging information obtained by the optical system to the navigation-related information control section 8.
  • In this embodiment, the navigation-related [0148] information storage section 9 stores in advance vectors 24 for expressing the route along which the endoscope 3 is inserted (minimal invasive route) as data.
  • Then, the coordinate [0149] values 25 for the front end and the rear end of the endoscope 3 are determined in terms of the coordinate system defined by the sensing plate 4 rigidly fitted to the endoscope 3 and stored in the navigation-related information storage section 9.
  • Now, the operation of the embodiment having the above described configuration will be described below. [0150]
  • When the embodiment of navigation apparatus is in operation, the [0151] sensor control section 6 measures the three-dimensional position of each of the infrared-emitting LEDs of the sensing plates 2 and 4 and then computationally determines the three-dimensional position and orientation of each of the sensing plates 2 and 4 by using the LED definition data stored in the sensor information storage section 5.
  • Then, the coordinate [0152] transformation matrix 17 from the sensing plate 2 fitted to the head of the object of examination 1 to the sensing plate 4 fitted to the endoscope 3 is computationally determined on the basis of the obtained three-dimensional position and orientation information.
  • Then, the position and the orientation of the [0153] endoscope 3 is determined in terms of the data on the target area by using the coordinate transformation matrix 17 and the above described coordinate transformation matrices 14, 15.
  • Then, the navigation-related [0154] information control section 8 generates a tri-sectional image 26 of the three-dimensional volume data 11 including those of the tumor and an orthogonal projection image 27 of the endoscope 3 projected on the cross section as navigation-related information and displays it on the liquid crystal monitor 13.
  • If, for instance, the coordinate of a representative point of the tumor is expressed by ([0155] 260, 180, 280), the tri-sectional image 26 of the three-dimensional volume data 11 will have a YZ plane with x=260, a ZX plane with y=180 and an XY plane with z=280.
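The tri-sectional image 26 could be extracted from the three-dimensional volume data 11 as in the following sketch, which assumes the volume is held in memory as a NumPy array indexed as volume[x, y, z].

```python
# Sketch: slicing the three orthogonal sections of the tri-sectional image 26
# around a representative point of the tumor.
import numpy as np

def tri_sectional_images(volume, rep_point):
    x, y, z = rep_point                      # e.g. (260, 180, 280)
    return {
        "YZ": volume[x, :, :],               # YZ plane with x = 260
        "ZX": volume[:, y, :],               # ZX plane with y = 180
        "XY": volume[:, :, z],               # XY plane with z = 280
    }

# Example with a dummy volume:
# slices = tri_sectional_images(np.zeros((512, 512, 512), np.int16), (260, 180, 280))
```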
  • Then, the relative distance between the surface of the target area and the front end of the [0156] endoscope 3 is determined by referring to the above described distance map 12, using the position of the front end of the endoscope 3.
  • The thickness of the lines of the [0157] orthogonal projection image 27 of the endoscope 3 is continuously changed as a function of the distance between the surface of the target area and the front end of the endoscope 3.
  • Then, the user can easily comprehend a situation where the [0158] endoscope 3 is approaching the target area.
  • The orientation of the [0159] endoscope 3, determined by the coordinate values 25 of the positions of the front end and the rear end of the endoscope 3, is compared with the data of the vector indicating the direction in which the endoscope 3 is to be inserted and, if the orientation is inclined relative to the vector by more than a predetermined value (e.g., 10 degrees), the color and the line thickness of the orthogonal projection image 27 of the endoscope 3 will be changed, as illustrated by the sketch below.
  • Thus, the user can easily comprehend that the direction in which the [0160] endoscope 3 is currently inserted is deviating from the direction in which it is to be inserted.
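A sketch of the inclination check described above, assuming the endoscope axis is derived from the coordinate values 25 of the front end and the rear end and compared with the stored insertion-route vector 24; the colors used when the threshold is exceeded are assumptions, since the text only states that the color and the line thickness change.

```python
# Sketch: angle between the endoscope axis and the insertion-route vector 24,
# and the corresponding drawing attributes of the orthogonal projection image 27.
import numpy as np

def insertion_deviation_deg(front_end, rear_end, route_vector):
    axis = np.asarray(front_end, dtype=float) - np.asarray(rear_end, dtype=float)
    route = np.asarray(route_vector, dtype=float)
    cos_angle = np.dot(axis, route) / (np.linalg.norm(axis) * np.linalg.norm(route))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def projection_attributes(deviation_deg, threshold_deg=10.0):
    # Assumed colors; the text only states that color and line thickness change.
    return ("red", 3) if deviation_deg > threshold_deg else ("white", 1)
```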
  • It may be needless to say that the configuration of this embodiment can be modified and/or altered in various different ways. [0161]
  • For instance, the area that provides the object of navigation is not limited to the target area and may alternatively be a plurality of any areas which are defined by information on the profile of the object of [0162] examination 1 or information on an internal tomographic image.
  • It is also possible to carry out a simulation by fitting a sensing plate to the head of a virtual object of examination without using an actual object of [0163] examination 1.
  • The three-dimensional position and orientation measuring section may be made to alternatively comprise a magnetic sensor or a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems. [0164]
  • When the object of examination is immovable, it is sufficient to measure the three-dimensional position and orientation of the object of [0165] examination 1, store the information in the sensor information storage section and utilize it in the computational operation for determining the relative three-dimensional position and orientation of the object of examination and the object of navigation in advance so that it is only necessary to measure the three-dimensional position and orientation of the object of navigation when the system is in operation.
  • The [0166] endoscope 3 that is the object of navigation may be replaced by a plurality of endoscopes.
  • While the above object of navigation may normally refer to an [0167] endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps so long as the mechanical profile thereof can be determined by measurement.
  • The coordinate of the front end of the [0168] endoscope 3 does not need to agree with the actual front end and data may be manipulated to make the endoscope 3 virtually have an extended front end.
  • Alternatively, the coordinate of the front end of the [0169] endoscope 3 may be defined by means of an operational formula using the extension of the front end from the actual front end as parameter so that the coordinate of the front end may be determined successively on the basis of the extension specified by the user.
  • Additionally, the color may be made to change on the basis of a single boundary value instead of making it change continuously as a function of the distance in a manner as described above. Alternatively, the color may be made to change stepwise by providing a plurality of boundary values. [0170]
  • Similarly, the line thickness may be made to change on the basis of a single boundary value instead of making it change continuously in a manner as described above. Alternatively, the line thickness may be made to change stepwise by providing a plurality of boundary values. [0171]
  • The technique of determining the angle of inclination of the [0172] endoscope 3 is not limited to the one described above.
  • It may be so arranged that the user can define the color and the line thickness that are used as attributes of the navigation-related information, the density of lines for drawing the model image, the size of the displayed pattern, the boundary values for changing the color and the line thickness as a function of the distance and the character string of the [0173] character information 28 indicating the incapability of measurement of the apparatus.
  • The object of navigation may be a microscope. [0174]
  • Then, the position of the focal point can be defined as the object of navigation by obtaining the focal length of the microscope from the microscope main body and replacing the coordinate of the front end of the [0175] endoscope 3 by that of the focal point.
  • While a navigation apparatus according to the invention is described above in terms of the first and second embodiments, the present invention is by no means limited thereto and the embodiments can be modified and/or altered in various different ways without departing from the scope of the present invention. [0176]
  • With a navigation apparatus as set forth in [0177] claim 2 of the appended claims, a model image of the object or the target, information on the direction of navigation and/or information on the distance between the object of navigation and the target are displayed whenever the position and orientation of the object in a three-dimensional space can be determined so that the user can easily comprehend the position and orientation of the object of navigation in the three-dimensional space.
  • Additionally, when the apparatus is incapable of measuring the position and orientation, it displays so and, therefore, the user can easily be aware of the situation. [0178]
  • A navigation apparatus according to [0179] claim 2 covers both the above described first and second embodiments.
  • More specifically, while the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance. [0180]
  • While the above object may normally refer to an [0181] endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above display section may normally refer to a liquid crystal monitor, it may alternatively refer to some other video information display such as a CRT display or a head mount display. [0182]
  • While the above model image refers to [0183] wireframe model data 10 in the first embodiment, it may alternatively refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • Additionally, while the above model image refers to the three-[0184] dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • While the above described information on the direction of navigation refers to the [0185] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the above described distance information refers to the numeral [0186] 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • While it also refers to the [0187] bar 30 indicating the distance to the tumor in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the information indicating an unmeasurable condition refers to the [0188] character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • While it also refers to a [0189] yellow pixel frame 29 having a width equal to 60 pixels in the first embodiment, it may alternatively refer to any expression using symbols defined to indicate an unmeasurable condition.
  • With a navigation apparatus as set forth in [0190] claim 3 of the appended claims, an image of an object acquired by the imaging section is displayed with other information on the object in an overlaid manner so that the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information.
  • A navigation apparatus according to [0191] claim 3 covers both the above described first and second embodiments.
  • While the object refers to the [0192] endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • While the above display section normally refers to the liquid crystal monitor in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display. [0193]
  • With a navigation apparatus as set forth in [0194] claim 4 of the appended claims, the relative position and orientation of the target and those of the object in a three-dimensional space are measured by means of a three-dimensional position and orientation measuring section.
  • Then, the information generating section of the navigation apparatus generates information necessary for navigating the object on the basis of the outcome of the measurement of the three-dimensional position and orientation measuring section. [0195]
  • Then, the display section of the navigation apparatus displays navigation-related information in a display mode selected out of a plurality of different display modes according to at least any of distance information on the distance between the object and the target, direction information on the direction of the target as viewed from the object or information telling if the object or the target is found within the effective area of measurement of the three-dimensional position and orientation measuring section or not. [0196]
  • As a result, the user can easily comprehend the distance between the target and the object, the direction of the target as viewed from the object and if the object or the target is found within the effective area of measurement of the three-dimensional position and orientation measuring section or not. [0197]
  • A navigation apparatus according to [0198] claim 4 covers both the above described first and second embodiments.
  • Thus, while the target is a [0199] patient 1, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation in the above first and second embodiments, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance.
  • While the above object refers to an [0200] endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above three-dimensional position and orientation measuring section refers to sensors using LEDs for emitting infrared rays ([0201] sensing plates 2, 4, sensor assembly 7, sensor information storage section 5 and sensor control section 6), it may alternatively refer to a sensing system using magnetic sensors or a sensing system using a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems.
  • The information generating section refers to the navigation-related [0202] information storage section 9 and the navigation-related information control section 8.
  • While the display section normally refers to the [0203] liquid crystal monitor 13, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • For the purpose of the invention, the expression of “a plurality of different display modes” refers to differences in color, in the thickness of line, in the dimensions of the drawing and in the density of drawing lines. [0204]
  • With a navigation apparatus as set forth in [0205] claim 5 of the appended claims, a display section as set forth in claim 4 displays at least profile information on the target or the object, internal tomographic information on the target, information on the direction of the target as viewed from the object or vice versa, or information on the distance to the target when the target or the object is measurable by the three-dimensional position and orientation measuring section, whereas it displays information telling that neither the target nor the object can be measured when measurement is impossible.
  • A navigation apparatus according to [0206] claim 5 covers both the above described first and second embodiments.
  • More specifically, while the above profile information refers to [0207] wireframe model data 10 in the first embodiment, it may also refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • Additionally, while it refers to the lines drawing the [0208] orthogonal projection image 27 of the endoscope 3 in the second embodiment, it may also refer to expression techniques adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • Still additionally, while the internal tomographic information refers to the three-[0209] dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • While the above described direction of the target refers to the [0210] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the above described distance information refers to the numeral [0211] 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • While it also refers to the [0212] bar 30 indicating the distance to the tumor in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the information indicating an unmeasurable condition refers to the [0213] character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • While it also refers to a [0214] yellow pixel frame 29 having a width equal to 60 pixels in the first embodiment, it may alternatively refer to any expression using symbols defined to indicate an unmeasurable condition.
  • With a navigation apparatus as set forth in [0215] claim 6 of the appended claims, the object has an imaging section and the image acquired by the imaging section is displayed with other navigation-related information obtained by the information generating section in an overlaid manner.
  • Thus, the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information. [0216]
  • A navigation apparatus according to [0217] claim 6 covers the above described first embodiment.
  • While the object having an imaging section refers to the [0218] endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • While the above display section normally refers to the liquid crystal monitor [0219] 13 in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • With a navigation apparatus as set forth in [0220] claim 7 of the appended claims, the navigation-related information displayed on the display section changes its color as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • Alternatively, it may be so arranged that both the relative distance and the direction toward the target as viewed from the object are evaluated at the same time and the color of the displayed information is changed depending on the situation. Then, the user can visually comprehend both the relative distance and the direction with ease. [0221]
  • A navigation apparatus according to [0222] claim 7 covers both the above described first and second embodiments.
  • More specifically, while the color of the displayed navigation-related information refers to that of the [0223] wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the color of the arrow 21 of the first embodiment.
  • It may additionally refer to the color of the [0224] tri-sectional image 26 of the target area and the orthogonal projection image 27 of the endoscope 3 in the second embodiment.
  • With a navigation apparatus as set forth in [0225] claim 8 of the appended claims, the thickness of the lines of the navigation-related information displayed on the display section changes as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • Additionally, as the thickness of the lines of the navigation-related information displayed on the display section changes as a function of the direction of the target as viewed from the object, the user can visually comprehend with ease a situation where the relative direction is deviating from the right direction. [0226]
  • Alternatively, it may be so arranged that both the relative distance and the direction toward the target as viewed from the object are evaluated at the same time and the line thickness of the displayed information is changed depending on the situation. Then, the user can visually comprehend both the relative distance and the direction with ease. [0227]
  • A navigation apparatus according to [0228] claim 8 covers both the above described first and second embodiments.
  • More specifically, while the line thickness of the displayed navigation-related information refers to that of the [0229] wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the line thickness of the arrow 21 of the first embodiment.
  • It may additionally refer to the line thickness of the [0230] tri-sectional image 26 of the target area and the orthogonal projection image 27 of the endoscope 3 in the second embodiment.
  • With a navigation apparatus as set forth in [0231] claim 9 of the appended claims, the profile model of the target and the internal tomographic image are switched from one to the other on the display section as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease that the object is located close to the target.
  • Thus, the user can get necessary information with ease depending on if the distance between the object and the target is smaller than a predetermined value or not. [0232]
  • A navigation apparatus according to [0233] claim 9 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0234] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction. The above internal tomographic image refers to the internal tomographic image 19 of the first embodiment.
  • With a navigation apparatus as set forth in [0235] claim 10 of the appended claims, the density of lines drawing the target model image is lowered when the relative distance between the target and the object is large and raised when the relative distance is small so that the load of drawing the target image and the quantity of information used for displaying the image are well balanced.
  • As a result, the user can obtain an adequate amount of information that is displayed with an adequate drawing rate as a function of the relative distance between the target and the object. [0236]
  • A navigation apparatus according to claim [0237] 10 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0238] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • With a navigation apparatus as set forth in [0239] claim 11 of the appended claims, the information generating section computationally determines the positional relationship between the image to be displayed and the display area of the display section on the basis of the relative distance between the target and the object and the relative direction of the target as viewed from the object, and simply indicates the direction of the target when no image of the target is displayed in the display area, so that the user can comprehend the relative positions of the target and the object and the direction of the target as viewed from the object without missing either of them, because the display of the direction is selected when no image is displayed for the target.
  • A navigation apparatus according to claim [0240] 11 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0241] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction along with information on the internal tomographic image.
  • While the above described information on the direction of navigation refers to the [0242] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • With a navigation apparatus as set forth in [0243] claim 12 of the appended claims, the relative distance between the target and the object is indicated by the size or the shape of a symbol so that the user can visually comprehend with ease not only the distance but also the direction of the target as viewed from the object.
  • A navigation apparatus according to claim [0244] 12 covers the above described first embodiment.
  • More specifically, while the above described symbol refers to the [0245] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • With a navigation apparatus as set forth in [0246] claim 13 of the appended claims, the relative distance between the target and the object of navigation and their orientations in a three-dimensional space are determined by the three-dimensional position and orientation measuring section.
  • Then, a computational information determining section generates navigation-related information such as three-dimensional position and orientation information on the target and the object of navigation, including the relative distance between the target and the object of navigation and their orientations and whether they are measurable or not, and controls the generated information. [0247]
  • Then, the information display section displays the navigation-related information generated by the computational information determining section. [0248]
  • As a result, the user can easily comprehend the positional relationship between the target and the object of navigation including their orientations and if they are measurable or not. [0249]
  • A navigation apparatus according to claim [0250] 13 covers both the first and second embodiments.
  • More specifically, while the above target may normally be a patient, a tumor to be surgically treated of a patient or an area of the body of a patient requiring special attention during a surgical operation, it is by no means limited to an existing object of examination and may alternatively be a virtual target displayed as a two-dimensional or three-dimensional image of a model synthesized by using the video information of an existing target that is obtained in advance. [0251]
  • While the above object may normally refer to an [0252] endoscope 3, it may alternatively refer to some other surgical instrument such as a suction pipe or a pair of forceps.
  • While the above three-dimensional position and orientation measuring section refers to sensors using LEDs for emitting infrared rays ([0253] sensing plates 2, 4, sensor assembly 7, sensor information storage section 5 and sensor control section 6), it may alternatively refer to a sensing system using magnetic sensors or a sensing system using a set of mechanical links and joints, an encoder and a potentiometer popularly used in ordinary three-dimensional position and orientation measuring systems.
  • The computational information determining section refers to the navigation-related [0254] information storage section 9 and the navigation-related information control section 8.
  • While the information display section refers to the [0255] liquid crystal monitor 13, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • The expression “attributes of navigation-related information” as used herein refers to the color, the thickness of line, the dimensions of the drawing and the density of drawing lines. [0256]
  • With a navigation apparatus as set forth in [0257] claim 14 of the appended claims, the navigation-related information includes a model image of the profile of the target and/or that of the object of navigation, a model image of the internal tomographic information of the target, a symbol pattern indicating the direction in which the target and/or the object of navigation will be found and/or a numerical value or a symbol pattern indicating the distance between the target and the object of navigation when the three-dimensional position and orientation measuring section is operating normally.
  • When the three-dimensional position and orientation measuring section is inoperative, the navigation-related information refers to character information or a symbol pattern indicating that the three-dimensional position and orientation measuring section is inoperative. [0258]
  • A navigation apparatus according to claim [0259] 14 covers both the first and second embodiments.
  • More specifically, while the above model image of the profile refers to [0260] wireframe model data 10 in the first embodiment, it may also refer to model data adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • Additionally, while it refers to the lines drawing the [0261] orthogonal projection image 27 of the endoscope 3 in the second embodiment, it may also refer to expression techniques adapted to express a profile, including a popular data structure to be used for three-dimensional computer graphics.
  • Still additionally, while the model image of the internal tomographic information refers to the three-[0262] dimensional volume data 11 of the target area in the first and second embodiments, it may alternatively take a form where a plurality of two-dimensional pixel data exist.
  • While the above described symbol pattern indicating the direction in which the object of navigation will be found refers to the [0263] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the above described numerical value indicating the distance to the target refers to the numeral [0264] 31 indicating the distance to the tumor in the first embodiment, it may alternatively be a numeral indicating the distance to an appropriate target.
  • While it also refers to the [0265] bar 30 indicating the distance to the tumor in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • While the character information indicating an unmeasurable condition refers to the [0266] character information 28 of “unmeasurable condition” in the first embodiment, it may alternatively include any character information telling the user that the apparatus is in an unmeasurable condition.
  • While it also refers to a [0267] yellow pixel frame 29 having a width equal to 60 pixels in the first embodiment, it may alternatively refer to any expression using symbols defined to indicate an unmeasurable condition.
  • With a navigation apparatus as set forth in [0268] claim 15 of the appended claims, the object of navigation has an observational function and the observed image obtained by means of the observational function is displayed with other navigation-related information obtained by the computational information determining section in an overlaid manner.
  • Thus, the user can obtain an actual image of the object and navigation-related information simultaneously and hence comprehend the position, the profile and the condition of the object that he or she cannot see on the basis of the navigation-related information. [0269]
  • A navigation apparatus according to claim [0270] 15 covers the above described first embodiment.
  • While the object of navigation having an observational function refers to the [0271] endoscope 3 in the first embodiment, it may alternatively refer to a microscope or some other object.
  • While the above information display section normally refers to the liquid crystal monitor [0272] 13 in the first embodiment, it may alternatively refer to some other video information display such as a CRT display or a head mount display.
  • With a navigation apparatus as set forth in [0273] claim 16 of the appended claims, the navigation-related information displayed on the display section changes its color as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease a situation where the relative distance is made too small.
  • Alternatively, it may be so arranged that the navigation-related information displayed on the display section changes its color as a function of the relative direction of the target and the object of navigation so that the user also can visually comprehend a situation where the relative direction is deviated from the right direction. [0274]
  • Still alternatively, the color may be made to change simultaneously as a function of both the relative distance and the relative direction. Then, the user can visually comprehend both the relative distance and the direction with ease. [0275]
  • A navigation apparatus according to claim [0276] 16 covers both the above described first and second embodiments.
  • More specifically, while the color of the displayed navigation-related information refers to that of the [0277] wireframe image 18 of the target area and the internal tomographic image 19 displayed on the monitor in the first embodiment, it may also refer to the color of the arrow 21 of the first embodiment.
  • It may additionally refer to the color of the [0278] tri-sectional image 26 of the target area and the orthogonal projection image 27 of the endoscope 3 in the second embodiment.
  • With a navigation apparatus as set forth in [0279] claim 17 of the appended claims, the thickness of the lines of the navigation-related information obtained by the three-dimensional position and orientation measuring section is made to vary as a function of the relative distance between the target and the object of navigation so that the user can visually comprehend with ease a situation where the relative distance has become too small.
  • Alternatively, it may be so arranged that the thickness of the lines of the navigation-related information vary as a function of the direction to the target as viewed from the object of navigation so that the user can also visually comprehend with ease a situation where the relative direction has deviated. [0280]
  • A navigation apparatus according to claim [0281] 17 covers both the first and second embodiments.
  • While the thickness of the lines of navigation-related information refers to the [0282] wireframe image 18 of the target area drawn on the monitor in the first embodiment, it may also include the arrow 21 in the first embodiment.
  • It refers to the [0283] orthogonal projection image 27 of the endoscope 3 drawn on the monitor in the second embodiment.
  • With a navigation apparatus as set forth in [0284] claim 18 of the appended claims, the profile model of the target and the internal tomographic image are switched from one to the other on the display section as a function of the relative distance between the target and the object as measured by the three-dimensional position and orientation measuring section so that the user can visually comprehend with ease that the object is located close to the target.
  • Thus, the user can get necessary information with ease depending on if the distance between the object and the target is smaller than a predetermined value or not. [0285]
  • A navigation apparatus according to claim [0286] 18 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0287] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • The above internal tomographic image refers to the internal [0288] tomographic image 19 of the first embodiment.
  • With a navigation apparatus as set forth in [0289] claim 19 of the appended claims, the density of lines drawing the target model image is lowered when the relative distance between the target and the object is large and raised when the relative distance is small so that the load of drawing the target image and the quantity of information used for displaying the image are well balanced.
  • As a result, the user can obtain an adequate amount of information that is displayed with an adequate drawing rate as a function of the relative distance between the target and the object. [0290]
  • A navigation apparatus according to claim [0291] 19 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0292] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction.
  • With a navigation apparatus as set forth in [0293] claim 20 of the appended claims, the computational information determining section computationally determines the positional relationship between the image to be displayed and the display area of the display section on the basis of the relative distance between the target and the object of navigation and the relative direction of the target as viewed from the object, and only a symbol pattern is displayed when no model image is displayed in the display area, so that the user can comprehend the relative positions of the target and the object and the direction of the target as viewed from the object without missing either of them, because the display of the direction is selected when no image is displayed for the target.
  • A navigation apparatus according to claim [0294] 20 covers the above described first embodiment.
  • More specifically, while the above profile model refers to [0295] wireframe image 18 in the first embodiment, it may also refer to various profiles used in three-dimensional computer graphics such as polygon as well as contour lines and equidistant curves drawn relative to the viewing direction along with information on the internal tomographic image.
  • While the above described symbol pattern refers to the [0296] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • With a navigation apparatus as set forth in [0297] claim 21 of the appended claims, the relative distance between the target and the object is indicated by the size of a pattern so that the user can visually comprehend with ease not only the distance but also the direction of the target as viewed from the object.
  • A navigation apparatus according to claim [0298] 21 covers the above described first embodiment.
  • More specifically, while the above described symbol pattern refers to the [0299] arrow 21 in the first embodiment, it may alternatively be a two-dimensional geometric figure such as a triangle or a circle, a three-dimensional geometric figure such as a cone or a set of visually recognizable image data.
  • As described above, both the above described first and second embodiments of navigation apparatus according to the invention are adapted to modify the navigation-related information displayed on the display section as a function of the relative three-dimensional positions and orientations of the target and the object of navigation to make the user easily comprehend the distance between the target and the object and obtain navigation-related information of necessary type with ease. [0300]
  • Now, the third and fourth embodiments of the invention will be described. They are surgical operation image acquisition/display apparatus realized by applying a navigation apparatus according to the invention. [0301]
  • (Embodiment 3) [0302]
  • FIG. 14 is a schematic block diagram of the third embodiment of the invention, which is a surgical operation image acquisition/display apparatus, showing its configuration. The third embodiment of surgical operation image acquisition/display apparatus according to the invention has a configuration as described below. [0303]
  • Referring to FIG. 14, the surgical operation image acquisition/display apparatus comprises a surgical microscope [0304] 141 (first observation section) and an endoscope 142 (second observation section) for observing an area located in the dead angle of the surgical microscope 141.
  • The [0305] surgical microscope 141 includes a microscope optical system 141 a mounted on a stand (not shown), a microscope camera 141 b attached to the microscope optical system 141 a and a microscope camera control unit (hereinafter referred to as microscope CCU) 141 c for converting the output of the microscope camera 141 b into a video signal.
  • The microscope [0306] optical system 141 a is provided with illumination light emitted from a light source (not shown) and guided by a light guide (not shown) for the purpose of observation.
  • On the other hand, the [0307] endoscope 142 includes an endoscope optical system 142 a, an endoscope camera 142 b attached to the endoscope optical system 142 a and an endoscope camera control unit (hereinafter referred to as endoscope CCU) 142 c for converting the output of the endoscope camera into a video signal.
  • The endoscope [0308] optical system 142 a is provided with illumination light emitted from a light source (not shown) and guided by a light guide (not shown) for the purpose of observation.
  • The video output of the [0309] microscope CCU 141 c and that of the endoscope CCU 142 c are fed to a video mixer 143 (observed image synthesizing section).
  • As illustrated in FIGS. 15A through 15F, the [0310] video mixer 143 has a plurality of display modes 0 through 5.
  • In the display mode [0311] 0 (FIG. 15A), the video output of the microscope CCU 141 c is displayed without being processed.
  • In the display mode [0312] 1 (FIG. 15B), the video output of the endoscope CCU 142 c is displayed without being processed.
  • In the display mode [0313] 2 (FIG. 15C), the video output of the endoscope CCU 142 c is dimensionally reduced and displayed in the video output of the microscope CCU 141 c.
  • In the display mode [0314] 3 (FIG. 15D), the video output of the microscope CCU 141 c is dimensionally reduced and displayed in the video output of the endoscope CCU 142 c.
  • The extent of dimensional reduction and the display position of the dimensionally reduced output are variable both in the [0315] display mode 2 and the display mode 3.
  • In the display mode [0316] 4 (FIG. 15E), which is a variation of the display mode 2 in which the video output of the endoscope CCU 142 c is dimensionally reduced and displayed at the right top corner of the video output of the microscope CCU 141 c, the video output of the endoscope CCU 142 c is dimensionally reduced to the same extent and displayed at the left bottom corner of the video output of the microscope CCU 141 c.
  • Similarly, in the display mode [0317] 5 (FIG. 15F), which is a variation of the display mode 3 in which the video output of the microscope CCU 141 c is dimensionally reduced and displayed at the right top corner of the video output of the endoscope CCU 142 c, the video output of the microscope CCU 141 c is dimensionally reduced to an extent smaller than that of FIG. 15D and displayed at the right top corner of the video output of the endoscope CCU 142 c.
  • The mode of display and the position and the size of the displayed image (or each of the displayed images) as defined by the [0318] video mixer 143 can be appropriately changed by means of an externally applied control signal. More specifically, the displayed image(s) can be enlarged or reduced independently with an appropriately selected magnification factor.
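The display modes 0 through 5 of the video mixer 143 could be modelled as in the following sketch, where the video frames are assumed to be NumPy arrays and the reduction factors and inset positions are assumptions standing in for the externally controllable parameters.

```python
# Sketch of the video mixer display modes: full-screen microscope or endoscope
# video, or one feed dimensionally reduced and inset into the other.
import numpy as np

def reduce(frame, scale=0.25):
    step = max(1, int(round(1 / scale)))       # crude nearest-neighbour reduction
    return frame[::step, ::step]

def inset(base, small, corner="top_right"):
    out = base.copy()
    h, w = small.shape[0], small.shape[1]
    if corner == "top_right":
        out[:h, -w:] = small
    else:                                       # "bottom_left"
        out[-h:, :w] = small
    return out

def mix(mode, microscope_frame, endoscope_frame):
    if mode == 0:                               # mode 0: microscope video only
        return microscope_frame
    if mode == 1:                               # mode 1: endoscope video only
        return endoscope_frame
    if mode == 2:                               # mode 2: endoscope inset into microscope video
        return inset(microscope_frame, reduce(endoscope_frame))
    if mode == 3:                               # mode 3: microscope inset into endoscope video
        return inset(endoscope_frame, reduce(microscope_frame))
    if mode == 4:                               # mode 4: variation of mode 2, left bottom corner
        return inset(microscope_frame, reduce(endoscope_frame), corner="bottom_left")
    if mode == 5:                               # mode 5: variation of mode 3, smaller inset
        return inset(endoscope_frame, reduce(microscope_frame, scale=0.2))
    raise ValueError("unknown display mode")

# Example with dummy 480x640 RGB frames:
# out = mix(2, np.zeros((480, 640, 3), np.uint8), np.zeros((480, 640, 3), np.uint8))
```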
  • Then, the output of the [0319] video mixer 143 is fed to a liquid crystal display 144 (display section) and displayed to the surgical operator for observation.
  • The combination of the [0320] video mixer 143 and the liquid crystal display 144 corresponds to the image display section as used in the appended claims.
  • Referring to FIG. 14, position and orientation sensor [0321] 145 (position and orientation detection section) comprises hard sensing plates 145 b and 145 c, each having three infrared light emitting diodes (LEDs) 145 a arranged at the respective corners of a triangle, a sensor assembly 145 d for detecting the quantity of light emitted from each of the infrared LEDs 145 a and a sensor controller 145 e for computationally determining the three-dimensional position and orientation of the sensing plate 145 b and that of the sensing plate 145 c from the output of the sensor assembly.
  • Note that the position of each of the [0322] infrared LEDs 145 a is measured in advance in the coordinate system defined on each of the sensing plates 145 b and 145 c and is stored in the sensor controller 145 e as LED definition data.
  • Assume that the [0323] sensing plate 145 b is attached to the head of the patient 146 in such a way that its position and orientation relative to the head would not change easily.
  • The [0324] other sensing plate 145 c is attached to the endoscope 142 by a mount section (not shown).
  • The [0325] sensor controller 145 e is connected to an image controller 147 (state of synthesis modification specifying section).
  • The [0326] image controller 147 is connected to the video mixer 143.
  • Assume that the area of operation to be surgically treated and certain characteristic points on the patient are observed by means of CT or MRI and the three-dimensional positions thereof are computationally determined by an image processing computer (not shown) and stored in the [0327] image controller 147 as data on the area and the characteristic points.
  • At this time, as shown in FIG. 16, the data on the area of surgical operation and the [0328] patient 146 are correlated by observing the coordinate values of the characteristic points in the model data coordinate system m and those of the characteristic points on the patient 146 in the patient coordinate system p defined by the sensing plate 145 b and computing a coordinate transformation matrix pHm.
  • The coordinate transformation matrix pHm is stored in the storage section of the [0329] image controller 147.
  • Similarly, the coordinate values of the front end and those of the rear end of the [0330] endoscope 142 are observed in terms of the endoscope coordinate system e defined by the sensing plate 145 c attached to the endoscope 142 and stored in the storage section of the image controller 147.
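  • The coordinate transformation matrix pHm can be obtained from the paired characteristic points by a standard rigid point-correspondence fit, and the same kind of fit can recover the pose of each sensing plate from its three measured LED positions together with the LED definition data. The following sketch is an editor's illustration only; the function name rigid_transform and the sample points are assumptions, and the SVD-based (Kabsch) solution shown is one common way of performing the fit, not necessarily the one used by the image processing computer.

```python
# Editor's sketch of a rigid point-correspondence fit (SVD/Kabsch) of the kind
# that could produce the homogeneous transform pHm from characteristic points
# known both in the model data coordinate system m and in the patient coordinate
# system p. All names and the sample data are illustrative.
import numpy as np


def rigid_transform(points_src, points_dst):
    """Return a 4x4 matrix H with points_dst ~= (H @ [points_src, 1])[:3]."""
    src = np.asarray(points_src, dtype=float)   # (N, 3), e.g. points in system m
    dst = np.asarray(points_dst, dtype=float)   # (N, 3), the same points in system p
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets.
    K = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(K)
    # Guard against a reflection in the least-squares rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, t
    return H


if __name__ == "__main__":
    # Characteristic points measured in the model data coordinate system m (mm).
    pts_m = np.array([[0.0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]])
    # The same points as they might be observed in the patient coordinate system p.
    Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
    pts_p = pts_m @ Rz.T + np.array([10.0, 20, 5])
    pHm = rigid_transform(pts_m, pts_p)   # coordinate transformation from m to p
```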
  • Now, the operation of the third embodiment will be described below. [0331]
  • When the surgical operation image acquisition/display apparatus is used, the [0332] surgical microscope 141 and the endoscope 142 are combined for use as shown in FIG. 14.
  • When the surgical operation image acquisition/display apparatus is in operation, the [0333] sensor controller 145 e drives the infrared LEDs 145 a to emit sequentially and determines the three-dimensional position of each of the infrared LEDs 145 a from the output of the sensor assembly 145 d. At the same time, the sensor controller 145 e computationally determines the three-dimensional position and orientation of the sensing plate 145 b and that of the sensing plate 145 c, using the LED definition data stored in the sensor controller 145 e, and outputs the obtained data to the image controller 147 upon request.
  • The [0334] image controller 147 computes, on the basis of the three-dimensional position and orientation information, the coordinate transformation matrix pHe from the patient coordinate system p, defined by the sensing plate 145 b attached to the head of the patient 146, to the endoscope coordinate system e, defined by the sensing plate 145 c attached to the endoscope 142.
  • The [0335] image controller 147 also computationally determines the relative distance between the patient 146 and the endoscope 142 and the relative direction of the patient 146 as viewed from the endoscope 142 on the basis of the coordinate transformation matrix pHe and the above described coordinate transformation matrix pHm.
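  • In outline, the relative distance and direction follow from chaining the two transforms so that a target point defined in the model data coordinate system m is expressed in the endoscope coordinate system e. The sketch below assumes that pHe and pHm are 4 x 4 homogeneous matrices mapping endoscope and model coordinates, respectively, into patient coordinates p; that convention and the function names are the editor's assumptions.

```python
# Editor's sketch: express the area of surgical operation, given in model
# coordinates m, in the endoscope coordinate system e, assuming pHe and pHm are
# 4x4 homogeneous matrices mapping e- and m-coordinates into patient coordinates p.
import numpy as np


def target_in_endoscope_frame(pHe, pHm, target_m):
    """Target point (model coordinates) expressed in endoscope coordinates."""
    eHm = np.linalg.inv(pHe) @ pHm                        # m -> p -> e
    target_h = np.append(np.asarray(target_m, float), 1.0)
    return (eHm @ target_h)[:3]


def distance_and_direction(pHe, pHm, target_m):
    """Relative distance and unit direction of the target as viewed from the endoscope."""
    v = target_in_endoscope_frame(pHe, pHm, target_m)
    d = float(np.linalg.norm(v))
    direction = v / d if d > 0.0 else v
    return d, direction
```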
  • The processing operation of the [0336] image controller 147 will be described further by referring to the flow chart of FIG. 18.
  • Firstly, the [0337] image controller 147 outputs a request signal cyclically with a predetermined period (e.g., 33 msec) to the sensor controller 145 e to receive the three-dimensional position and orientation information on the sensing plates 145 b and 145 c from the sensor controller 145 e (Step S101).
  • Then, the [0338] image controller 147 judges if the endoscope 142 is located close to the area of surgical operation (e.g., within a range of 50 mm) on the basis of the received three-dimensional position and orientation information (Step S102).
  • If the [0339] image controller 147 judges that the endoscope 142 is located close to the area of surgical operation, it outputs an instruction for switching from the microscope image to the endoscope image (mode 1) to the video mixer 143 (Step S103).
  • If, on the other hand, the [0340] image controller 147 judges that the endoscope 142 is not located close to the area of surgical operation, it outputs an instruction for switching from the endoscope image to the microscope image (mode 0) to the video mixer 143 (Step S104).
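  • Steps S101 through S104 thus amount to a simple polling loop: every cycle the image controller requests the latest plate poses, evaluates the endoscope-to-target distance and commands the video mixer accordingly. A minimal sketch under assumed interfaces is given below; sensor.read_poses() and mixer.set_mode() are placeholders for the sensor controller 145 e and the video mixer 143, and only the 33 msec period and the 50 mm threshold come from the text's examples.

```python
# Editor's sketch of the image controller loop of FIG. 18 (Steps S101-S104).
# `sensor` and `mixer` stand in for the sensor controller 145e and the video
# mixer 143; read_poses() and set_mode() are assumed interfaces, not disclosed ones.
import time

import numpy as np

PERIOD_S = 0.033            # request period, e.g. 33 msec
NEAR_THRESHOLD_MM = 50.0    # "close to the area of surgical operation", e.g. 50 mm


def controller_loop(sensor, mixer, target_m, pHm):
    while True:
        # Step S101: poses of sensing plates 145b (patient) and 145c (endoscope)
        # in the sensor coordinate system, as 4x4 homogeneous matrices (assumed).
        wHp, wHe = sensor.read_poses()
        pHe = np.linalg.inv(wHp) @ wHe            # endoscope pose in the patient frame
        # Step S102: distance between the endoscope and the area of surgical operation.
        eHm = np.linalg.inv(pHe) @ pHm
        target_e = (eHm @ np.append(np.asarray(target_m, float), 1.0))[:3]
        dist_mm = float(np.linalg.norm(target_e))
        if dist_mm <= NEAR_THRESHOLD_MM:
            mixer.set_mode(1)                     # Step S103: endoscope image
        else:
            mixer.set_mode(0)                     # Step S104: microscope image
        time.sleep(PERIOD_S)
```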
  • When the [0341] endoscope 142 is moved away from the area of surgical operation and the area is observed mainly by the surgical microscope 141 as a result of the above judgment, the image observed through the surgical microscope 141 is displayed on the liquid crystal display 144. On the other hand, when the endoscope 142 is moved closer to the area of surgical operation, the image observed through the endoscope 142 is displayed on the liquid crystal display 144.
  • Because the image observed through the surgical microscope is displayed on the [0342] liquid crystal display 144 when the endoscope 142 is moved away from the area of surgical operation, the surgeon can see the desired image without having to switch manually between the microscope image and the endoscope image.
  • It may be needless to say that the configuration of this embodiment can be modified and/or altered in various different ways without departing from the scope of the invention. [0343]
  • For instance, while the first observation section refers to the [0344] surgical microscope 141 in this embodiment, it may alternatively be the endoscope 142 or some other observation sections or units.
  • While the second observation section refers to the [0345] endoscope 142 in this embodiment, it may alternatively be the surgical microscope 141 or some other observation sections or units.
  • While the image synthesizing section refers to the [0346] video mixer 143 in this embodiment, it may be some other section for externally modifying the synthesized state of a plurality of images.
  • While the display section refers to the [0347] liquid crystal display 144 in this embodiment, it may alternatively be a CRT display, a head mounted display or a projector adapted to display video signals.
  • While the position and orientation detection section is the position and orientation sensor (comprising the [0348] infrared LEDs 145 a, the sensing plates 145 b, 145 c, the sensor assembly 145 d and the sensor controller 145 e) in this embodiment, it may alternatively be any appropriate section for detecting the three-dimensional position and orientation of an object, such as a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers.
  • While the state of synthesis modification specifying section of this embodiment uses the distance between the area of surgical operation and the [0349] endoscope 142 for its judgment, it may alternatively use both the distance and the direction of the area as viewed from the endoscope 142.
  • While the position and orientation detection section of this embodiment detects the position and orientation of the [0350] endoscope 142, it may alternatively detect the position and orientation of the microscope 141 or both the position and orientation of the endoscope 142 and that of the microscope 141.
  • (Embodiment 4) [0351]
  • Now, the fourth embodiment of the invention, which is also a surgical operation image acquisition/display apparatus, will be described below. [0352]
  • The fourth embodiment of the surgical operation image acquisition/display apparatus has a configuration substantially the same as that of the above described third embodiment; hence the similar components in the graphic illustrations thereof are denoted by the same reference symbols and will not be described any further. [0353]
  • Additionally, since the operational function of the fourth embodiment of the surgical operation image acquisition/display apparatus is the same as that of the above described third embodiment except for the processing operation of the [0354] image controller 147, only the latter will be discussed hereinafter.
  • More specifically, in this embodiment, one of the two obtained images is dimensionally reduced and synthetically combined with the other image so that they may be displayed simultaneously on the display screen for observation. [0355]
  • The processing operation of the [0356] image controller 147 will be discussed below by referring to the flow chart of FIG. 19.
  • Firstly, the [0357] image controller 147 outputs a request signal cyclically with a predetermined period (e.g., 33 msec) to the sensor controller 145 e to receive the three-dimensional position and orientation information on the sensing plates 145 b and 145 c from the sensor controller 145 e (Step S201).
  • Then, the [0358] image controller 147 judges if the endoscope 142 is located close to the area of surgical operation (e.g., within a range of 50 mm) on the basis of the received three-dimensional position and orientation information (Step S202).
  • If the [0359] image controller 147 judges that the endoscope 142 is located close to the area of surgical operation, it outputs an instruction for dimensionally reducing the microscope image and displaying it with the endoscope image (mode 3) to the video mixer 143 (Step S203).
  • If, on the other hand, the [0360] image controller 147 judges that the endoscope 142 is not located close to the area of surgical operation, it outputs an instruction for dimensionally reducing the endoscope image and displaying it with the microscope image (mode 2) to the video mixer 143 (Step S204).
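  • Relative to the loop sketched for the third embodiment, only the pair of mixer commands changes: the controller requests the picture-in-picture modes 3 and 2 instead of switching between modes 1 and 0. A one-function sketch of that branch (names are the editor's assumptions):

```python
# Editor's sketch of the fourth embodiment's branch (Steps S203/S204): the loop
# of FIG. 19 is otherwise identical to FIG. 18, but picture-in-picture modes are chosen.
def select_mode_fourth_embodiment(dist_mm, near_threshold_mm=50.0):
    """Video mixer mode for the fourth embodiment."""
    # Step S203: endoscope near the area of operation -> reduced microscope image (mode 3).
    # Step S204: otherwise -> reduced endoscope image inside the microscope image (mode 2).
    return 3 if dist_mm <= near_threshold_mm else 2
```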
  • When the [0361] endoscope 142 is moved away from the area of surgical operation and the area is observed mainly by the surgical microscope 141 as a result of the above judgment, the dimensionally reduced image observed through the endoscope 142 is displayed with the image observed through the surgical microscope 141 on the liquid crystal display 144.
  • On the other hand, when the [0362] endoscope 142 is moved closer to the area of surgical operation, the dimensionally reduced image observed through the surgical microscope 141 is displayed with the image observed through the endoscope 142 on the liquid crystal display 144.
  • Because the dimensionally reduced image observed through the [0363] endoscope 142 is displayed with the image observed through the surgical microscope on the liquid crystal display 144 when the endoscope 142 is moved away from the area of surgical operation, the surgeon can see the desired plurality of images without having to switch manually between the microscope image and the endoscope image.
  • It may be needless to say that the configuration of this fourth embodiment can be modified and/or altered in various different ways without departing from the scope of the invention. [0364]
  • For instance, while the first observation section refers to the [0365] surgical microscope 141 in these embodiments, it may alternatively be the endoscope 142 or some other observation sections or units.
  • While the second observation section refers to the [0366] endoscope 142 in these embodiments, it may alternatively be the surgical microscope 141 or some other observation sections or units.
  • While the image synthesizing section refers to the [0367] video mixer 143 in these embodiments, it is by no means limited thereto and may be some other section for externally modifying the synthesized state of a plurality of images.
  • While the display section refers to the [0368] liquid crystal display 144 in these embodiments, it may alternatively be a CRT display, a head mounted display or a projector adapted to display video signals.
  • While the position and orientation detection section is the position and orientation sensor (comprising the [0369] infrared LEDs 145 a, the sensing plates 145 b, 145 c, the sensor assembly 145 d and the sensor controller 145 e) in these embodiments, it may alternatively be any appropriate section for detecting the three-dimensional position and orientation of an object, such as a magnetic sensor or a set of mechanical links and joints, encoders and potentiometers.
  • The state of synthesis modification specifying section refers to the image controller in these embodiments. [0370]
  • As described above, both the third and fourth embodiments of the invention provide a surgical operation image acquisition/display apparatus that, by means of a navigation apparatus, efficiently assists a surgeon in smoothly carrying out a surgical operation without requiring him or her to switch the observation system from one to another when the surgical operation is conducted by using a plurality of observation systems including a surgical microscope and an endoscope. [0371]
  • Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents. [0372]

Claims (42)

What is claimed is:
1. A navigation apparatus comprising:
a navigation-related information generating section for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate said object to said target; and
a display section for displaying said navigation-related information generated by said navigation-related information generating section in different modes according to the relative position and orientation of said object and said target.
2. A navigation apparatus comprising:
a navigation-related information generating section for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate said object to said target; and
a display section for displaying at least a model image of the object or the target, information on the direction of navigation or information on the distance between the object of navigation and the target when said navigation-related information generating section can measure said position and orientation in a three-dimensional space but displaying information indicating a measurement incapable situation when said navigation-related information generating section cannot measure said position and orientation.
3. A navigation apparatus according to claims 1 and 2, wherein
said object has an image acquisition function; and
said display section can display the image acquired by the image acquisition function of said object with other navigation-related information in an overlaid manner.
4. A navigation apparatus adapted to generate information for navigating an object to a target comprising:
a three-dimensional position and orientation measuring section for measuring the three-dimensional position and orientation of said object and that of said target;
an information generating section for generating information necessary for the navigation on the basis of the obtained result of said three-dimensional position and orientation measuring section; and
a display section for displaying the information generated by said information generating section;
wherein said display section displays navigation-related information according to at least information on the distance between said object and said target, information on the direction toward said target as viewed from said object and information on whether either said object or said target is located within the effective area of measurement of said three-dimensional position and orientation measuring section in a display mode selected out of a plurality of display modes.
5. A navigation apparatus according to
claim 4
, wherein
said display section displays at least profile information on said target or said object, internal tomographic information on said target, information on the direction toward said target as viewed from said object or vice versa or information on the distance between said object and said target when said target and said object are measurable by said three-dimensional position and orientation measuring section but displays information indicating a measurement incapable situation when neither said target nor said object is measurable by said three-dimensional position and orientation measuring section.
6. A navigation apparatus according to
claim 4
, wherein
said object has an image acquisition function; and
said display section can display the image acquired by the image acquisition function of said object with other navigation-related information in an overlaid manner.
7. A navigation apparatus according to
claim 4
, wherein
said navigation-related information displayed on the display section changes its color as a function of the relative distance between said target and said object or the direction toward said target as viewed from said object.
8. A navigation apparatus according to
claim 4
, wherein
the thickness of the lines of the navigation-related information displayed on the display section changes as a function of the relative distance between said target and said object or the direction toward said target as viewed from said object.
9. A navigation apparatus according to
claim 4
, wherein
a profile model of the target and an internal tomographic image are displayed alternatively on said display section as a function of the distance between said target and said object.
10. A navigation apparatus according to
claim 4
, wherein
the density of lines displayed on said display section for drawing said target model image is varied as a function of the distance between said target and said object.
11. A navigation apparatus according to
claim 4
, wherein
an image representing said target and directional information of said target are displayed alternatively on said display section as a function of the distance between said target and said object or the direction toward said target as viewed from said object.
12. A navigation apparatus according to
claim 4
, wherein
direction information of said target is displayed on said display section in the form of a symbol at least whose size or shape varies as a function of the distance between said target and said object or the direction toward said target as viewed from said object.
13. A navigation apparatus comprising:
a target of navigation;
an object to be navigated to said target;
a three-dimensional position and orientation measuring section for measuring the three-dimensional position and orientation of at least said target or said object of navigation;
a computational information determining section for generating navigation-related information on the basis of the information obtained by the measurement of said three-dimensional position and orientation measuring section and controlling the generated information; and
an information display section for displaying the navigation-related information generated and controlled by said computational information determining section;
wherein the relative distance between the target and the object of navigation and their orientations in a three-dimensional space are determined by the three-dimensional position and orientation measuring section;
wherein said computational information determining section modifies either the attribute or the type of said navigation-related information as a function of at least the relative distance between said target and said object of navigation or the direction toward said target as viewed from said object measured by said three-dimensional position and orientation measuring section or the immeasurability thereof so as to make it visibly reflect the outcome of measurement.
14. A navigation apparatus according to
claim 13
, wherein
said navigation-related information includes a model image of the profile of the target and/or that of the object of navigation, a model image of the internal tomographic information of the target, a symbol pattern indicating the direction in which the target and/or the object of navigation will be found and/or a numerical value or a symbol pattern indicating the distance between the target and the object of navigation when said three-dimensional position and orientation measuring section is operating normally whereas it includes character information or a symbol pattern indicating an immeasurable situation when the three-dimensional position and orientation measuring section is inoperative.
15. A navigation apparatus according to
claim 13
, wherein
said object of navigation has an image acquisition function and the acquired image obtained by means of the image acquisition function is displayed with other navigation-related information obtained by said computational information determining section in an overlaid manner by means of said computational information determining section.
16. A navigation apparatus according to
claim 13
, wherein
the navigation-related information displayed on the display section changes its color as a function of the relative distance between said target and said object of navigation or the direction toward said target as viewed from said object of navigation.
17. A navigation apparatus according to
claim 13
, wherein
the thickness of the lines of the navigation-related information displayed on the display section changes as a function of the relative distance between said target and said object of navigation and/or the direction toward the target as viewed from the object of navigation.
18. A navigation apparatus according to
claim 13
, wherein
the profile model of said target and the internal tomographic image are switched from one to the other on the display section as a function of the relative distance between said target and said object of navigation.
19. A navigation apparatus according to
claim 13
, wherein
the density of lines of the displayed target model image is modified as a function of the relative distance between said target and said object of navigation.
20. A navigation apparatus according to
claim 13
, wherein
a model image of at least either said target or said object of navigation or a symbol pattern indicating the direction in which at least said target or said object of navigation is found is displayed alternatively as a function of the relative distance between said target and said object of navigation and the direction toward the target as viewed from the object of navigation.
21. A navigation apparatus according to
claim 13
, wherein
the size of the symbol pattern indicating the direction of said target as viewed from the object of navigation is modified as a function of the relative distance between said target and said object of navigation.
22. A surgical operation image acquisition/display apparatus comprising:
an observation section having a plurality of observation units and adapted to modify its position and orientation;
an image display section adapted to alternatively or synthetically display the images obtained by said plurality of observation units of said observation section; and
a specifying section for specifying the images to be displayed according to the position and orientation of said observation section.
23. A surgical operation image acquisition/display apparatus comprising:
an observation section having a plurality of observation units and adapted to modify its position and orientation;
an image display section adapted to display each of the images obtained by said plurality of observation units of said observation section in a size either enlarged or reduced with an independently selected magnitude; and
a specifying section for specifying the images to be synthetically combined according to the position and orientation of said observation section.
24. A surgical operation image acquisition/display apparatus comprising:
an observation section having a plurality of observation units and adapted to modify its position and orientation;
an image display section adapted to display an image selected from the images obtained by said plurality of observation units of said observation section; and
a specifying section for specifying the image to be selected according to the position and orientation of said observation section.
25. A surgical operation image acquisition/display apparatus comprising:
an observation section having a plurality of observation units and adapted to modify its position and orientation relative to an area of surgical operation;
an image display section adapted to synthetically display the images obtained by said plurality of observation units of said observation section in a mode selected out of a plurality of different modes; and
a specifying section for specifying a mode of image synthesis to said image display section according to the position and orientation of said observation section.
26. A surgical operation image acquisition/display apparatus comprising:
an observation section having a surgical microscope adapted to modify its position and orientation and an endoscope also adapted to modify its position and orientation;
a detection section for detecting the position and orientation of said observation section;
an image display section adapted to synthetically combine the image observed by said surgical microscope and the image observed by said endoscope and display the images in a mode selected out of a plurality of different modes; and
a specifying section for specifying the mode of image synthesis to said image display section according to the outcome of the detection of said detection section.
27. A surgical operation image acquisition/display apparatus comprising:
an observation section having a surgical microscope adapted to modify its position and orientation and an endoscope also adapted to modify its position and orientation;
a detection section for detecting the position and orientation of said observation section;
an image display section adapted to alternatively display the image observed by said surgical microscope or the image observed by said endoscope; and
a specifying section for specifying the image to be displayed according to the outcome of the detection of said detection section.
28. A surgical operation image acquisition/display apparatus according to
claim 22
, wherein said specifying section specifies in accordance with the distance between the area of surgical operation and the observation section.
29. A surgical operation image acquisition/display apparatus comprising:
a first observation section for observing an area of surgical operation;
a second observation section arranged apart from said first observation section and adapted to observe at least said area of surgical operation or the vicinity thereof;
an observed image synthesizing section for synthetically combining the image observed by said first observation section and the image observed by said second observation section;
a display section for displaying the image synthesized by said observed image synthesizing section;
a position and orientation detection section for detecting at least either the position and orientation of said first observation section or that of said second observation section; and
a state of synthesis modification specifying section for specifying the modification made to the state of synthesis of the observed images to said observed image synthesizing section according to the position and orientation information from said position and orientation detection section.
30. A surgical operation image acquisition/display apparatus according to
claim 29
, wherein said state of synthesis modification specifying section specifies for use either the image observed by said first observation section or the image observed by said second observation section according to the position and orientation information of either said first observation section or said second observation section fed from said position and orientation detection section and either the relative distance between said area of surgical operation and said first observation section or the direction of said area of surgical operation as viewed from said second observation section.
31. A surgical operation image acquisition/display apparatus according to
claim 29
, wherein said state of synthesis modification specifying section specifies reduction of either the image observed by said first observation section or the image observed by said second observation section according to the position and orientation information of either said first observation section or said second observation section fed from said position and orientation detection section and either the relative distance between said area of surgical operation and said first observation section or the direction of said area of surgical operation as viewed from said second observation section, and synthesis of the reduced image into the other image.
32. A navigation apparatus comprising:
navigation-related information generating means for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate said object to said target; and
display means for displaying said navigation-related information generated by said navigation-related information generating means in different modes according to the relative position and orientation of said object and said target.
33. A navigation apparatus comprising:
navigation-related information generating means for generating navigation-related information by measuring the relative position and orientation of an object and a target in a three-dimensional space in order to navigate said object to said target; and
display means for displaying at least a model image of the object or the target, information on the direction of navigation or information on the distance between the object of navigation and the target when said navigation-related information generating means can measure said position and orientation in a three-dimensional space but displaying information indicating a measurement incapable situation when said navigation-related information generating means cannot measure said position and orientation.
34. A navigation apparatus adapted to generate information for navigating an object to a target comprising:
three-dimensional position and orientation measuring means for measuring the three-dimensional position and orientation of said object and that of said target;
information generating means for generating information necessary for the navigation on the basis of the obtained result of said three-dimensional position and orientation measuring means; and
display means for displaying the information generated by said information generating means;
wherein said display means displays navigation-related information according to at least information on the distance between said object and said target, information on the direction toward said target as viewed from said object and information on whether either said object or said target is located within the effective area of measurement of said three-dimensional position and orientation measuring means in a display mode selected out of a plurality of display modes.
35. A navigation apparatus comprising:
a target of navigation;
an object to be navigated to said target;
three-dimensional position and orientation measuring means for measuring the three-dimensional position and orientation of at least said target or said object of navigation;
computational information determining means for generating navigation-related information on the basis of the information obtained by the measurement of said three-dimensional position and orientation measuring means and controlling the generated information; and
information display means for displaying the navigation-related information generated and controlled by said computational information determining means;
wherein the relative distance between the target and the object of navigation and their orientations in a three-dimensional space are determined by the three-dimensional position and orientation measuring means;
wherein said computational information determining means modifies either the attribute or the type of said navigation-related information as a function of at least the relative distance between said target and said object of navigation or the direction of said target as viewed from said object measured by said three-dimensional position and orientation measuring means or the immeasurability thereof so as to make it visibly reflect the outcome of measurement.
36. A surgical operation image acquisition/display apparatus comprising:
observation means having a plurality of observation sections and adapted to modify its position and orientation;
image display means adapted to alternatively or synthetically display the images obtained by said plurality of observation sections of said observation means; and
specifying means for specifying the images to be displayed according to the position and orientation of said observation means.
37. A surgical operation image acquisition/display apparatus comprising:
observation means having a plurality of observation sections and adapted to modify its position and orientation;
image display means adapted to display each of the images obtained by said plurality of observation sections of said observation means in a size either enlarged or reduced with an independently selected magnitude; and
specifying means for specifying the images to be synthetically combined according to the position and orientation of said observation means.
38. A surgical operation image acquisition/display apparatus comprising:
observation means having a plurality of observation sections and adapted to modify its position and orientation;
image display means adapted to display an image selected from the images obtained by said plurality of observation sections of said observation means; and
specifying means for specifying the image to be selected according to the position and orientation of said observation means.
39. A surgical operation image acquisition/display apparatus comprising:
observation means having a plurality of observation sections and adapted to modify its position and orientation relative to an area of surgical operation;
image display means adapted to synthetically display the images obtained by said plurality of observation sections of said observation means in a mode selected out of a plurality of different modes; and
specifying means for specifying a mode of image synthesis to said image display means according to the position and orientation of said observation means.
40. A surgical operation image acquisition/display apparatus comprising:
observation means having a surgical microscope adapted to modify its position and orientation and an endoscope also adapted to modify its position and orientation;
a detection means for detecting the position and orientation of said observation means;
image display means adapted to synthetically combine the image observed by said surgical microscope and the image observed by said endoscope and display the images in a mode selected out of a plurality of different modes; and
specifying means for specifying the mode of image synthesis to said image display means according to the outcome of the detection of said detection means.
41. A surgical operation image acquisition/display apparatus comprising:
observation means having a surgical microscope adapted to modify its position and orientation, and an endoscope also adapted to modify its position and orientation;
detection means for detecting the position and orientation of said observation means;
image display means adapted to alternatively display the image observed by said surgical microscope or the image observed by said endoscope; and
specifying means for specifying the image to be displayed according to the outcome of the detection of said detection means.
42. A surgical operation image acquisition/display apparatus comprising:
first observation means for observing an area of surgical operation;
second observation means arranged apart from said first observation means and adapted to observe at least said area of surgical operation or the vicinity thereof;
observed image synthesizing means for synthetically combining the image observed by said first observation means and the image observed by said second observation means;
display means for displaying the image synthesized by said observed image synthesizing means;
position and orientation detection means for detecting at least either the position and orientation of said first observation means or that of said second observation means; and
state of synthesis modification specifying means for specifying the modification made to the state of synthesis of the observed images to said observed image synthesizing means according to the position and orientation information from said position and orientation detection means.
US09/875,628 1999-03-30 2001-06-06 Navigation apparatus and surgical operation image acquisition/display apparatus using the same Expired - Lifetime US6456868B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/875,628 US6456868B2 (en) 1999-03-30 2001-06-06 Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP11-089405 1999-03-30
JP8940599A JP2000279425A (en) 1999-03-30 1999-03-30 Navigation device
JP11-163964 1999-06-10
JP11163964A JP2000350734A (en) 1999-06-10 1999-06-10 Operation imaging display apparatus
US09/533,651 US6466815B1 (en) 1999-03-30 2000-03-22 Navigation apparatus and surgical operation image acquisition/display apparatus using the same
US09/875,628 US6456868B2 (en) 1999-03-30 2001-06-06 Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/533,651 Division US6466815B1 (en) 1999-03-30 2000-03-22 Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Publications (2)

Publication Number Publication Date
US20010027272A1 true US20010027272A1 (en) 2001-10-04
US6456868B2 US6456868B2 (en) 2002-09-24

Family

ID=26430826

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/533,651 Expired - Lifetime US6466815B1 (en) 1999-03-30 2000-03-22 Navigation apparatus and surgical operation image acquisition/display apparatus using the same
US09/875,628 Expired - Lifetime US6456868B2 (en) 1999-03-30 2001-06-06 Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/533,651 Expired - Lifetime US6466815B1 (en) 1999-03-30 2000-03-22 Navigation apparatus and surgical operation image acquisition/display apparatus using the same

Country Status (1)

Country Link
US (2) US6466815B1 (en)

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020191814A1 (en) * 2001-06-14 2002-12-19 Ellis Randy E. Apparatuses and methods for surgical navigation
EP1306050A1 (en) * 2001-10-24 2003-05-02 BrainLAB AG Microprobe with navigation system
US20040034282A1 (en) * 2002-03-06 2004-02-19 Quaid Arthur E. System and method for using a haptic device as an input device
US20060033908A1 (en) * 2004-07-21 2006-02-16 The Boeing Company Rotary borescopic optical dimensional mapping tool
US20060100526A1 (en) * 2004-10-26 2006-05-11 Yukari Yamamoto Optical bioinstrumentation for living body
US20060281971A1 (en) * 2005-06-14 2006-12-14 Siemens Corporate Research Inc Method and apparatus for minimally invasive surgery using endoscopes
US20070060792A1 (en) * 2004-02-11 2007-03-15 Wolfgang Draxinger Method and apparatus for generating at least one section of a virtual 3D model of a body interior
US20080030192A1 (en) * 2004-07-21 2008-02-07 Hitachi Medical Corporation Tomographic System
US20090190808A1 (en) * 2008-01-28 2009-07-30 Advanced Medical Optics, Inc. User adjustment measurement scale on video overlay
WO2009150647A2 (en) * 2008-06-11 2009-12-17 Dune Medical Devices Ltd. Double registration
US20100094085A1 (en) * 2007-01-31 2010-04-15 National University Corporation Hamamatsu University School Of Medicine Device for Displaying Assistance Information for Surgical Operation, Method for Displaying Assistance Information for Surgical Operation, and Program for Displaying Assistance Information for Surgical Operation
US20100228257A1 (en) * 2000-01-14 2010-09-09 Bonutti Peter M Joint replacement component
WO2010111090A1 (en) * 2009-03-26 2010-09-30 Intuitive Surgical Operations, Inc. System for providing visual guidance for steering a tip of an endoscopic device towards one or more landmarks and assisting an operator in endoscopic navigation
US20110016306A1 (en) * 2009-07-15 2011-01-20 Kabushiki Kaisha Toshiba Medical image display system and medical image communication method
US20120051606A1 (en) * 2010-08-24 2012-03-01 Siemens Information Systems Ltd. Automated System for Anatomical Vessel Characteristic Determination
US20130096424A1 (en) * 2010-06-22 2013-04-18 Koninklijke Philips Electronics N.V. System and method for real-time endoscope calibration
CN103315696A (en) * 2012-03-21 2013-09-25 柯惠Lp公司 System and method for determining camera angles by using virtual planes derived from actual images
JP2013190766A (en) * 2012-02-17 2013-09-26 Morita Mfg Co Ltd Medical treatment device
US8623030B2 (en) 2001-08-28 2014-01-07 Bonutti Skeletal Innovations Llc Robotic arthroplasty system including navigation
US8666476B2 (en) 2009-03-01 2014-03-04 National University Corporation Hamamatsu University School Of Medicine Surgery assistance system
US8801601B2 (en) 2009-03-26 2014-08-12 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
US20150182415A1 (en) * 2013-12-31 2015-07-02 John David Olkowski Eyelid Care Appliance
US20150287192A1 (en) * 2012-12-20 2015-10-08 Olympus Corporation Image processing device, electronic device, endoscope apparatus, information storage device, and image processing method
EP2950130A1 (en) * 2014-05-27 2015-12-02 Carl Zeiss Meditec AG Microscope system with depth preview
US20160295199A1 (en) * 2013-11-29 2016-10-06 Allm Inc. Microscope video processing device and medical microscope system
CN106308944A (en) * 2016-08-16 2017-01-11 中国医学科学院北京协和医院 Endoscope and microscope unified body
US20170251900A1 (en) * 2015-10-09 2017-09-07 3Dintegrated Aps Depiction system
US9757034B2 (en) 2010-10-08 2017-09-12 Koninklijke Philips N.V. Flexible tether with integrated sensors for dynamic instrument tracking
US9763746B2 (en) 2011-04-07 2017-09-19 3Shape A/S 3D system and method for guiding objects
DE102016117263A1 (en) 2016-09-14 2018-03-15 Carl Zeiss Meditec Ag Optical observation device system
US10004387B2 (en) 2009-03-26 2018-06-26 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
EP3351201A3 (en) * 2012-05-22 2018-10-24 Covidien LP Surgical navigation system
CN108882964A (en) * 2015-10-09 2018-11-23 柯惠Lp公司 Make body cavity visualization method with robotic surgical system using angled endoscope
US11020144B2 (en) 2015-07-21 2021-06-01 3Dintegrated Aps Minimally invasive surgery system
US11033182B2 (en) 2014-02-21 2021-06-15 3Dintegrated Aps Set comprising a surgical instrument
US20220015982A1 (en) * 2018-11-30 2022-01-20 University Of Southern California Double-blinded, randomized trial of augmented reality low-vision mobility and grasp aid
US20220044398A1 (en) * 2020-08-10 2022-02-10 Kunnskap Medical, LLC Endoscopic system with medium management system control
US11331120B2 (en) 2015-07-21 2022-05-17 3Dintegrated Aps Cannula assembly kit
WO2023078803A1 (en) * 2021-11-02 2023-05-11 B. Braun New Ventures GmbH Surgical navigation system having improved instrument tracking and navigation method
US11684491B2 (en) 2003-01-30 2023-06-27 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US11707329B2 (en) 2018-08-10 2023-07-25 Covidien Lp Systems and methods for ablation visualization

Families Citing this family (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020138008A1 (en) * 2000-01-13 2002-09-26 Kazuhiro Tsujita Method and apparatus for displaying fluorescence images and method and apparatus for acquiring endoscope images
JP4822634B2 (en) * 2000-08-31 2011-11-24 シーメンス アクチエンゲゼルシヤフト A method for obtaining coordinate transformation for guidance of an object
FR2816200A1 (en) 2000-11-06 2002-05-10 Praxim DETERMINING THE POSITION OF A KNEE PROSTHESIS
AU2002361572A1 (en) * 2001-10-19 2003-04-28 University Of North Carolina At Chapel Hill Methods and systems for dynamic virtual convergence and head mountable display
AU2003212481A1 (en) * 2002-02-28 2003-09-09 Ekos Corporation Ultrasound assembly for use with a catheter
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
US8010180B2 (en) 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
US11202676B2 (en) 2002-03-06 2021-12-21 Mako Surgical Corp. Neural monitor-based dynamic haptics
US7319897B2 (en) * 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Localization device display method and apparatus
US8276091B2 (en) * 2003-09-16 2012-09-25 Ram Consulting Haptic response system and method of use
DE10350011A1 (en) * 2003-10-26 2005-06-16 Lb Medical Gmbh Medical training and patient treatment system for use in hospital, has coordinate measuring systems aimed at landmark indicators on patient and has base station and computer with display
US7542602B2 (en) * 2004-11-19 2009-06-02 Carestream Health, Inc. Digital image processing of medical images
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
AT502919B1 (en) 2005-12-14 2010-11-15 Univ Innsbruck MEDICAL NAVIGATION SYSTEM
US20070271122A1 (en) * 2006-05-02 2007-11-22 Siemens Medical Solutions Usa, Inc. Patient Video and Audio Monitoring System
JP2009537231A (en) 2006-05-19 2009-10-29 マコ サージカル コーポレーション Method and apparatus for controlling a haptic device
US8560047B2 (en) 2006-06-16 2013-10-15 Board Of Regents Of The University Of Nebraska Method and apparatus for computer aided surgery
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
DE502006004198D1 (en) * 2006-10-17 2009-08-20 Brainlab Ag Marker navigation system for detecting and displaying the location of marker devices
EP1913890B1 (en) * 2006-10-20 2009-04-15 BrainLAB AG Marker-navigation device in particular for medical use
US7794407B2 (en) 2006-10-23 2010-09-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US8388546B2 (en) 2006-10-23 2013-03-05 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US8214016B2 (en) * 2006-12-12 2012-07-03 Perception Raisonnement Action En Medecine System and method for determining an optimal type and position of an implant
CN101925333B (en) 2007-11-26 2014-02-12 C·R·巴德股份有限公司 Integrated system for intravascular placement of catheter
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US10751509B2 (en) 2007-11-26 2020-08-25 C. R. Bard, Inc. Iconic representations for guidance of an indwelling medical device
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
WO2009094646A2 (en) 2008-01-24 2009-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8219179B2 (en) * 2008-03-06 2012-07-10 Vida Diagnostics, Inc. Systems and methods for navigation within a branched structure of a body
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US8437833B2 (en) 2008-10-07 2013-05-07 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
WO2010048475A1 (en) * 2008-10-23 2010-04-29 Immersion Corporation Systems and methods for ultrasound simulation using depth peeling
KR100961661B1 (en) * 2009-02-12 2010-06-09 주식회사 래보 Apparatus and method of operating a medical navigation system
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US9521994B2 (en) * 2009-05-11 2016-12-20 Siemens Healthcare Gmbh System and method for image guided prostate cancer needle biopsy
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
JP5795576B2 (en) 2009-06-12 2015-10-14 バード・アクセス・システムズ,インコーポレーテッド Method of operating a computer-based medical device that uses an electrocardiogram (ECG) signal to position an intravascular device in or near the heart
WO2011019760A2 (en) 2009-08-10 2011-02-17 Romedex International Srl Devices and methods for endovascular electrography
US9282947B2 (en) 2009-12-01 2016-03-15 Inneroptic Technology, Inc. Imager focusing based on intraoperative data
EP2531098B1 (en) 2010-02-02 2020-07-15 C.R. Bard, Inc. Apparatus and method for catheter navigation and tip location
EP2549943B1 (en) * 2010-03-22 2018-01-31 Brainlab AG Controlling a surgical microscope
MX2012013858A (en) 2010-05-28 2013-04-08 Bard Inc C R Insertion guidance system for needles and medical components.
WO2011150376A1 (en) 2010-05-28 2011-12-01 C.R. Bard, Inc. Apparatus for use with needle insertion guidance system
MX338127B (en) 2010-08-20 2016-04-04 Bard Inc C R Reconfirmation of ecg-assisted catheter tip placement.
CN103189009B (en) 2010-10-29 2016-09-07 C·R·巴德股份有限公司 The bio-impedance auxiliary of Medical Devices is placed
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
US9921712B2 (en) 2010-12-29 2018-03-20 Mako Surgical Corp. System and method for providing substantially stable control of a surgical tool
US11911117B2 (en) 2011-06-27 2024-02-27 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CN106913366B (en) 2011-06-27 2021-02-26 内布拉斯加大学评议会 On-tool tracking system and computer-assisted surgery method
DE102011078212B4 (en) 2011-06-28 2017-06-29 Scopis Gmbh Method and device for displaying an object
KR20140051284A (en) 2011-07-06 2014-04-30 씨. 알. 바드, 인크. Needle length determination and calibration for insertion guidance system
EP2750620B1 (en) 2011-09-02 2017-04-26 Stryker Corporation Surgical instrument including a cutting accessory extending from a housing and actuators that establish the position of the cutting accessory relative to the housing
DE102011114146A1 (en) * 2011-09-23 2013-03-28 Scopis Gmbh Method for representing e.g. head region of human for controlling operation to remove tumor, involves producing point set in coordinate system, and imaging coordinates of points of set in another coordinate system using determined image
CN103313642B (en) * 2011-09-29 2016-01-20 奥林巴斯株式会社 Endoscope apparatus
WO2013116240A1 (en) 2012-01-30 2013-08-08 Inneroptic Technology, Inc. Multiple medical device guidance
US9439623B2 (en) 2012-05-22 2016-09-13 Covidien Lp Surgical planning system and navigation system
US9498182B2 (en) 2012-05-22 2016-11-22 Covidien Lp Systems and methods for planning and navigation
US9439627B2 (en) 2012-05-22 2016-09-13 Covidien Lp Planning system and navigation system for an ablation procedure
US8750568B2 (en) 2012-05-22 2014-06-10 Covidien Lp System and method for conformal ablation planning
US9820818B2 (en) 2012-08-03 2017-11-21 Stryker Corporation System and method for controlling a surgical manipulator based on implant parameters
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
AU2013296278B2 (en) 2012-08-03 2018-06-14 Stryker Corporation Systems and methods for robotic surgery
US9008757B2 (en) 2012-09-26 2015-04-14 Stryker Corporation Navigation system including optical and non-optical sensors
AU2014240998B2 (en) 2013-03-13 2018-09-20 Stryker Corporation System for arranging objects in an operating room in preparation for surgical procedures
EP2996611B1 (en) 2013-03-13 2019-06-26 Stryker Corporation Systems and software for establishing virtual constraint boundaries
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
DE102013211055B3 (en) 2013-06-13 2014-09-18 Scopis Gmbh Adapter for receiving a medical device and a position detection system
JP6265630B2 (en) * 2013-06-13 2018-01-24 オリンパス株式会社 Endoscope apparatus and method for operating endoscope apparatus
ES2811323T3 (en) 2014-02-06 2021-03-11 Bard Inc C R Systems for the guidance and placement of an intravascular device
GB2524955A (en) 2014-04-01 2015-10-14 Scopis Gmbh Method for cell envelope segmentation and visualisation
US9754367B2 (en) 2014-07-02 2017-09-05 Covidien Lp Trachea marking
US9770216B2 (en) 2014-07-02 2017-09-26 Covidien Lp System and method for navigating within the lung
US20160000414A1 (en) 2014-07-02 2016-01-07 Covidien Lp Methods for marking biopsy location
WO2016004310A2 (en) 2014-07-02 2016-01-07 Covidien Lp Real-time automatic registration feedback
EP3164073B1 (en) 2014-07-02 2020-07-01 Covidien LP System and method for detecting trachea
CA2953146A1 (en) 2014-07-02 2016-01-07 Covidien Lp System and method for segmentation of lung
US9603668B2 (en) 2014-07-02 2017-03-28 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
GB201501157D0 (en) 2015-01-23 2015-03-11 Scopis Gmbh Instrument guidance system for sinus surgery
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10973584B2 (en) 2015-01-19 2021-04-13 Bard Access Systems, Inc. Device and method for vascular access
GB2536650A (en) 2015-03-24 2016-09-28 Augmedics Ltd Method and system for combining video-based and optic-based augmented reality in a near eye display
WO2016210325A1 (en) 2015-06-26 2016-12-29 C.R. Bard, Inc. Connector interface for ecg-based catheter positioning system
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
JP6586824B2 (en) 2015-08-27 2019-10-09 富士通株式会社 Image processing apparatus, image processing method, and image processing program
US10986990B2 (en) 2015-09-24 2021-04-27 Covidien Lp Marker placement
US10709352B2 (en) 2015-10-27 2020-07-14 Covidien Lp Method of using lung airway carina locations to improve ENB registration
EP3397188B1 (en) 2015-12-31 2020-09-09 Stryker Corporation System and methods for preparing surgery on a patient at a target site defined by a virtual object
US11000207B2 (en) 2016-01-29 2021-05-11 C. R. Bard, Inc. Multiple coil system for tracking a medical device
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
WO2018112025A1 (en) 2016-12-16 2018-06-21 Mako Surgical Corp. Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US11224392B2 (en) 2018-02-01 2022-01-18 Covidien Lp Mapping disease spread
US10992079B2 (en) 2018-10-16 2021-04-27 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
WO2020097425A2 (en) 2018-11-09 2020-05-14 Vida Diagnostics, Inc. Cut-surface display of tubular structures
US11382712B2 (en) 2019-12-22 2022-07-12 Augmedics Ltd. Mirroring in image guided surgery
WO2021207289A1 (en) 2020-04-07 2021-10-14 Vida Diagnostics, Inc. Subject specific coordinatization and virtual navigation systems and methods
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5662111A (en) * 1991-01-28 1997-09-02 Cosman; Eric R. Process of stereotactic optical navigation
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
JPH05203881A (en) 1992-01-28 1993-08-13 Ito Takayuki Image system for stereoscopy and image integration system for stereoscopy
US5633675A (en) * 1993-02-16 1997-05-27 Welch Allyn, Inc. Shadow probe
JPH07261094A (en) 1994-03-25 1995-10-13 Olympus Optical Co Ltd Microscope for surgical operation
US5770190A (en) * 1995-07-14 1998-06-23 Schering Corporation Method of treatment of acute leukemia with interleukin-10
EP0845959A4 (en) * 1995-07-16 1998-09-30 Ultra Guide Ltd Free-hand aiming of a needle guide
US5638819A (en) * 1995-08-29 1997-06-17 Manwaring; Kim H. Method and apparatus for guiding an instrument to a target
JPH09173352A (en) 1995-12-25 1997-07-08 Toshiba Medical Eng Co Ltd Medical navigation system
JPH105245A (en) 1996-06-25 1998-01-13 Shimadzu Corp Surgical operation support system
US6097994A (en) * 1996-09-30 2000-08-01 Siemens Corporate Research, Inc. Apparatus and method for determining the correct insertion depth for a biopsy needle
US6026315A (en) * 1997-03-27 2000-02-15 Siemens Aktiengesellschaft Method and apparatus for calibrating a navigation system in relation to image data of a magnetic resonance apparatus
US5978696A (en) * 1997-10-06 1999-11-02 General Electric Company Real-time image-guided placement of anchor devices
US6236878B1 (en) * 1998-05-22 2001-05-22 Charles A. Taylor Method for predictive modeling for planning medical interventions and simulating physiological conditions

Cited By (87)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100228257A1 (en) * 2000-01-14 2010-09-09 Bonutti Peter M Joint replacement component
US9101443B2 (en) 2000-01-14 2015-08-11 Bonutti Skeletal Innovations Llc Methods for robotic arthroplasty
US9192459B2 (en) 2000-01-14 2015-11-24 Bonutti Skeletal Innovations Llc Method of performing total knee arthroplasty
US8784495B2 (en) 2000-01-14 2014-07-22 Bonutti Skeletal Innovations Llc Segmental knee arthroplasty
US8632552B2 (en) 2000-01-14 2014-01-21 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US9795394B2 (en) 2000-01-14 2017-10-24 Bonutti Skeletal Innovations Llc Method for placing implant using robotic system
US8425522B2 (en) 2000-01-14 2013-04-23 Bonutti Skeletal Innovations Llc Joint replacement method
US6990220B2 (en) * 2001-06-14 2006-01-24 Igo Technologies Inc. Apparatuses and methods for surgical navigation
US20020191814A1 (en) * 2001-06-14 2002-12-19 Ellis Randy E. Apparatuses and methods for surgical navigation
US8623030B2 (en) 2001-08-28 2014-01-07 Bonutti Skeletal Innovations Llc Robotic arthroplasty system including navigation
US9763683B2 (en) 2001-08-28 2017-09-19 Bonutti Skeletal Innovations Llc Method for performing surgical procedures using optical cutting guides
US10321918B2 (en) 2001-08-28 2019-06-18 Bonutti Skeletal Innovations Llc Methods for robotic surgery using a cannula
US8858557B2 (en) 2001-08-28 2014-10-14 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US8840629B2 (en) 2001-08-28 2014-09-23 Bonutti Skeletal Innovations Llc Robotic arthroplasty system including navigation
US8834490B2 (en) 2001-08-28 2014-09-16 Bonutti Skeletal Innovations Llc Method for robotic arthroplasty using navigation
US10231739B1 (en) 2001-08-28 2019-03-19 Bonutti Skeletal Innovations Llc System and method for robotic surgery
US10470780B2 (en) 2001-08-28 2019-11-12 Bonutti Skeletal Innovations Llc Systems and methods for ligament balancing in robotic surgery
US8641726B2 (en) 2001-08-28 2014-02-04 Bonutti Skeletal Innovations Llc Method for robotic arthroplasty using navigation
US9060797B2 (en) 2001-08-28 2015-06-23 Bonutti Skeletal Innovations Llc Method of preparing a femur and tibia in knee arthroplasty
US6694162B2 (en) 2001-10-24 2004-02-17 Brainlab Ag Navigated microprobe
EP1306050A1 (en) * 2001-10-24 2003-05-02 BrainLAB AG Microprobe with navigation system
US8391954B2 (en) 2002-03-06 2013-03-05 Mako Surgical Corp. System and method for interactive haptic positioning of a medical device
US8095200B2 (en) 2002-03-06 2012-01-10 Mako Surgical Corp. System and method for using a haptic device as an input device
US20040034282A1 (en) * 2002-03-06 2004-02-19 Quaid Arthur E. System and method for using a haptic device as an input device
US11707363B2 (en) * 2003-01-30 2023-07-25 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US11684491B2 (en) 2003-01-30 2023-06-27 Medtronic Navigation, Inc. Method and apparatus for post-operative tuning of a spinal implant
US20070060792A1 (en) * 2004-02-11 2007-03-15 Wolfgang Draxinger Method and apparatus for generating at least one section of a virtual 3D model of a body interior
US7794388B2 (en) * 2004-02-11 2010-09-14 Karl Storz Gmbh & Co. Kg Method and apparatus for generating at least one section of a virtual 3D model of a body interior
US7349083B2 (en) 2004-07-21 2008-03-25 The Boeing Company Rotary borescopic optical dimensional mapping tool
US20080030192A1 (en) * 2004-07-21 2008-02-07 Hitachi Medical Corporation Tomographic System
US8306605B2 (en) * 2004-07-21 2012-11-06 Hitachi Medical Corporation Tomographic system
US20060033908A1 (en) * 2004-07-21 2006-02-16 The Boeing Company Rotary borescopic optical dimensional mapping tool
US20060100526A1 (en) * 2004-10-26 2006-05-11 Yukari Yamamoto Optical bioinstrumentation for living body
US7613502B2 (en) * 2004-10-26 2009-11-03 Hitachi, Ltd. Optical bioinstrumentation for living body
US9289267B2 (en) * 2005-06-14 2016-03-22 Siemens Medical Solutions Usa, Inc. Method and apparatus for minimally invasive surgery using endoscopes
US20060281971A1 (en) * 2005-06-14 2006-12-14 Siemens Corporate Research Inc Method and apparatus for minimally invasive surgery using endoscopes
US8251893B2 (en) 2007-01-31 2012-08-28 National University Corporation Hamamatsu University School Of Medicine Device for displaying assistance information for surgical operation, method for displaying assistance information for surgical operation, and program for displaying assistance information for surgical operation
US20100094085A1 (en) * 2007-01-31 2010-04-15 National University Corporation Hamamatsu University School Of Medicine Device for Displaying Assistance Information for Surgical Operation, Method for Displaying Assistance Information for Surgical Operation, and Program for Displaying Assistance Information for Surgical Operation
US8194949B2 (en) * 2008-01-28 2012-06-05 Abbott Medical Optics Inc. User adjustment measurement scale on video overlay
US20090190808A1 (en) * 2008-01-28 2009-07-30 Advanced Medical Optics, Inc. User adjustment measurement scale on video overlay
WO2009150647A3 (en) * 2008-06-11 2010-03-18 Dune Medical Devices Ltd. Double registration
WO2009150647A2 (en) * 2008-06-11 2009-12-17 Dune Medical Devices Ltd. Double registration
US9008756B2 (en) 2008-06-11 2015-04-14 Dune Medical Devices Ltd. Mapping system and method for mapping a target containing tissue
US8666476B2 (en) 2009-03-01 2014-03-04 National University Corporation Hamamatsu University School Of Medicine Surgery assistance system
CN102449666A (en) * 2009-03-26 2012-05-09 直观外科手术操作公司 System for providing visual guidance for steering a tip of an endoscopic device towards one or more landmarks and assisting an operator in endoscopic navigation
US10524641B2 (en) 2009-03-26 2020-01-07 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
US11744445B2 (en) 2009-03-26 2023-09-05 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
US8801601B2 (en) 2009-03-26 2014-08-12 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
WO2010111090A1 (en) * 2009-03-26 2010-09-30 Intuitive Surgical Operations, Inc. System for providing visual guidance for steering a tip of an endoscopic device towards one or more landmarks and assisting an operator in endoscopic navigation
US10856770B2 (en) 2009-03-26 2020-12-08 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device towards one or more landmarks in a patient
US10004387B2 (en) 2009-03-26 2018-06-26 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
US8918481B2 (en) * 2009-07-15 2014-12-23 Kabushiki Kaisha Toshiba Medical image display system and medical image communication method
US20110016306A1 (en) * 2009-07-15 2011-01-20 Kabushiki Kaisha Toshiba Medical image display system and medical image communication method
US20130096424A1 (en) * 2010-06-22 2013-04-18 Koninklijke Philips Electronics N.V. System and method for real-time endoscope calibration
US8553954B2 (en) * 2010-08-24 2013-10-08 Siemens Medical Solutions Usa, Inc. Automated system for anatomical vessel characteristic determination
US20120051606A1 (en) * 2010-08-24 2012-03-01 Siemens Information Systems Ltd. Automated System for Anatomical Vessel Characteristic Determination
US9757034B2 (en) 2010-10-08 2017-09-12 Koninklijke Philips N.V. Flexible tether with integrated sensors for dynamic instrument tracking
US10716634B2 (en) 2011-04-07 2020-07-21 3Shape A/S 3D system and method for guiding objects
US9763746B2 (en) 2011-04-07 2017-09-19 3Shape A/S 3D system and method for guiding objects
US10582972B2 (en) 2011-04-07 2020-03-10 3Shape A/S 3D system and method for guiding objects
US10299865B2 (en) 2011-04-07 2019-05-28 3Shape A/S 3D system and method for guiding objects
JP2013190766A (en) * 2012-02-17 2013-09-26 Morita Mfg Co Ltd Medical treatment device
CN103315696A (en) * 2012-03-21 2013-09-25 柯惠Lp公司 System and method for determining camera angles by using virtual planes derived from actual images
EP2641561A1 (en) * 2012-03-21 2013-09-25 Covidien LP System and method for determining camera angles by using virtual planes derived from actual images
EP3351201A3 (en) * 2012-05-22 2018-10-24 Covidien LP Surgical navigation system
US20150287192A1 (en) * 2012-12-20 2015-10-08 Olympus Corporation Image processing device, electronic device, endoscope apparatus, information storage device, and image processing method
US9736464B2 (en) * 2013-11-29 2017-08-15 Allm Inc. Microscope video processing device and medical microscope system
US20160295199A1 (en) * 2013-11-29 2016-10-06 Allm Inc. Microscope video processing device and medical microscope system
US20150182415A1 (en) * 2013-12-31 2015-07-02 John David Olkowski Eyelid Care Appliance
US10314763B2 (en) * 2013-12-31 2019-06-11 Teeny Clean, Llc Eyelid care appliance
US11033182B2 (en) 2014-02-21 2021-06-15 3Dintegrated Aps Set comprising a surgical instrument
EP2950130A1 (en) * 2014-05-27 2015-12-02 Carl Zeiss Meditec AG Microscope system with depth preview
US11331120B2 (en) 2015-07-21 2022-05-17 3Dintegrated Aps Cannula assembly kit
US11020144B2 (en) 2015-07-21 2021-06-01 3Dintegrated Aps Minimally invasive surgery system
US20170251900A1 (en) * 2015-10-09 2017-09-07 3Dintegrated Aps Depiction system
US11039734B2 (en) * 2015-10-09 2021-06-22 3Dintegrated Aps Real time correlated depiction system of surgical tool
CN108289598B (en) * 2015-10-09 2022-02-18 3D集成公司 Depiction system
EP3206559A4 (en) * 2015-10-09 2017-11-29 3dintegrated ApS A depiction system
CN108882964A (en) * 2015-10-09 2018-11-23 柯惠Lp公司 Method of using an angled endoscope for visualizing a body cavity with a robotic surgical system
CN108289598A (en) * 2015-10-09 2018-07-17 3D集成公司 Depiction system
CN112294449A (en) * 2016-08-16 2021-02-02 中国医学科学院北京协和医院 Endoscopic microscope combination for operation
CN106308944A (en) * 2016-08-16 2017-01-11 中国医学科学院北京协和医院 Integrated endoscope and microscope unit
DE102016117263A1 (en) 2016-09-14 2018-03-15 Carl Zeiss Meditec Ag Optical observation device system
US11707329B2 (en) 2018-08-10 2023-07-25 Covidien Lp Systems and methods for ablation visualization
US20220015982A1 (en) * 2018-11-30 2022-01-20 University Of Southern California Double-blinded, randomized trial of augmented reality low-vision mobility and grasp aid
US20220044398A1 (en) * 2020-08-10 2022-02-10 Kunnskap Medical, LLC Endoscopic system with medium management system control
WO2023078803A1 (en) * 2021-11-02 2023-05-11 B. Braun New Ventures GmbH Surgical navigation system having improved instrument tracking and navigation method

Also Published As

Publication number Publication date
US6466815B1 (en) 2002-10-15
US6456868B2 (en) 2002-09-24

Similar Documents

Publication Publication Date Title
US6466815B1 (en) Navigation apparatus and surgical operation image acquisition/display apparatus using the same
US10733700B2 (en) System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
KR100439756B1 (en) Apparatus and method for displaying virtual endoscopy display
US8248413B2 (en) Visual navigation system for endoscopic surgery
US7945310B2 (en) Surgical instrument path computation and display for endoluminal surgery
US8248414B2 (en) Multi-dimensional navigation of endoscopic video
US7824328B2 (en) Method and apparatus for tracking a surgical instrument during surgery
US6049622A (en) Graphic navigational guides for accurate image orientation and navigation
US20080071141A1 (en) Method and apparatus for measuring attributes of an anatomical feature during a medical procedure
US6690964B2 (en) Method and device for visualization of positions and orientation of intracorporeally guided instruments during a surgical intervention
US20070161854A1 (en) System and method for endoscopic measurement and mapping of internal organs, tumors and other objects
US20050187432A1 (en) Global endoscopic viewing indicator
RU2707369C1 (en) Method for preparing and performing a surgical operation using augmented reality and a complex of equipment for its implementation
JP2008532602A (en) Surgical navigation and microscopy visualization method and apparatus
JPH08107875A (en) Endoscope shape detector
JP2000279425A (en) Navigation device
US20220215539A1 (en) Composite medical imaging systems and methods
JP4159396B2 (en) Endoscope shape detection device
KR101464330B1 (en) Method of comparing preoperative respiratory level with intraoperative respiratory level
US20210378748A1 (en) Anatomical structure visualization systems and methods
JP2002017751A (en) Surgery navigation device
KR20140128131A (en) Method of comparing preoperative respiratory level with intraoperative respiratory level

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: CHANGE OF ADDRESS;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:039344/0502

Effective date: 20160401