US20140052005A1 - Endoscope apparatus and measuring method - Google Patents


Info

Publication number
US20140052005A1
US20140052005A1 (US 2014/0052005 A1); application US 14/061,530
Authority
US
United States
Prior art keywords
pattern
light
endoscope apparatus
subject
insertion section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/061,530
Inventor
Masayoshi Yokota
Current Assignee
Olympus Corp
Original Assignee
Olympus Corp
Priority date
Filing date
Publication date
Priority claimed from JP2011099889A external-priority patent/JP5893264B2/en
Priority claimed from JP2011099890A external-priority patent/JP6032870B2/en
Application filed by Olympus Corp filed Critical Olympus Corp
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YOKOTA, MASAYOSHI
Publication of US20140052005A1 publication Critical patent/US20140052005A1/en
Priority to US15/423,043 priority Critical patent/US10342459B2/en
Priority to US16/427,001 priority patent/US10898110B2/en
Abandoned legal-status Critical Current

Classifications

    • A61B 5/1077 Measuring of profiles
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000095 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
    • A61B 1/00045 Display arrangement (operational features of endoscopes provided with output arrangements)
    • A61B 1/00096 Optical elements (constructional details of the distal tip of the insertion part of the endoscope body)
    • A61B 1/00179 Optical arrangements characterised by the viewing angles for off-axis viewing
    • A61B 1/00193 Optical arrangements adapted for stereoscopic vision
    • A61B 1/00194 Optical arrangements adapted for three-dimensional imaging
    • A61B 1/05 Endoscopes combined with photographic or television appliances, characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B 1/0605 Illuminating arrangements for spatially modulated illumination
    • A61B 1/0676 Endoscope light sources at distal tip of an endoscope
    • A61B 5/0084 Measuring for diagnostic purposes using light, adapted for introduction into the body, e.g. by catheters
    • A61B 5/1076 Measuring physical dimensions inside body cavities, e.g. using catheters
    • G02B 23/2423 Instruments for viewing the inside of hollow bodies; optical details of the distal end
    • G02B 23/2469 Illumination using optical fibres
    • G06T 7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • H04N 23/56 Cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • H04N 23/555 Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes

Definitions

  • the present invention relates to an endoscope apparatus and a measuring method, and more particularly, to an endoscope apparatus that projects a pattern, such as a fringe, onto a subject in order to measure the three-dimensional shape of the surface of the subject, and to a measuring method that does the same.
  • in order to inspect a subject, there are endoscopes (endoscope apparatuses) that include an elongated insertion section and have observation means, such as an optical system and an imaging element, at the tip of the insertion section.
  • for example, there is an endoscope that acquires a plurality of fringe images obtained by projecting a fringe onto a subject while shifting the phase of the fringe, and that calculates the three-dimensional shape of the subject by the well-known phase shift method using the plurality of fringe images.
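As background, the well-known N-step phase shift computation recovers the wrapped phase of the fringe at each pixel from N images whose fringe phase is stepped by 2π/N. The sketch below is a generic illustration only, not the patent's implementation; the function name is hypothetical:

```python
import numpy as np

def phase_from_shifted_fringes(images):
    """Recover the wrapped phase map from N fringe images whose
    fringe phase is shifted by 2*pi/N between captures (N >= 3)."""
    n = len(images)
    deltas = 2 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(d) for img, d in zip(images, deltas))
    den = sum(img * np.cos(d) for img, d in zip(images, deltas))
    # Wrapped phase in (-pi, pi]; unwrapping and calibration are
    # still needed before converting phase to height.
    return -np.arctan2(num, den)
```

For I_k = A + B·cos(φ + 2πk/N), the sums reduce to −(NB/2)·sin φ and (NB/2)·cos φ, so the arctangent isolates φ independently of the background A and modulation B.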
  • United States Patent Application Publication No. 2009/0225321 discloses an endoscope apparatus in which two projection windows used to project a fringe are provided in a tip surface of the insertion section.
  • an endoscope apparatus is provided to measure a subject using a pattern projection image of the subject on which a light and dark pattern of light is projected.
  • the endoscope apparatus according to the first aspect of the invention includes an insertion section, an imaging section, an illumination section, and a pattern projection section.
  • the imaging section is provided at a tip portion of the insertion section to acquire the image of the subject.
  • the illumination section emits illumination light that illuminates an observation visual field of the imaging section.
  • the pattern projection section projects the light and dark pattern onto the subject.
  • a tip surface of the insertion section is provided with an objective optical system that forms the image of the subject on the imaging section, one or more illumination windows through which the illumination light is emitted, and a projection window through which the light and dark pattern is projected onto the subject from the pattern projection section.
  • the pattern projection section includes a pattern generator which generates the light and dark pattern.
  • the light and dark pattern is a pattern with an intensity distribution in which a light portion and a dark portion are alternately arranged.
  • the objective optical system according to the first aspect of the present invention may be arranged so that an optical axis on an emission side that is directed to the imaging section from the objective optical system among optical axes of the objective optical system is parallel to and eccentric from a central axis of the insertion section.
  • the objective optical system according to the second aspect of the present invention may be a direct-view-type objective optical system in which both an optical axis on an incidence side and the optical axis on the emission side are parallel to the central axis.
  • the objective optical system may be provided at the tip surface of the tip portion of the insertion section and is arranged at a position eccentric from the central axis.
  • the projection window according to the third aspect of the present invention may be provided at the tip surface of the tip portion of the insertion section and is arranged at a position eccentric from the central axis of the insertion section.
  • the objective optical system according to the second aspect of the present invention may be a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section and has an optical axis on an incidence side arranged at a twisted position with respect to the central axis of the insertion section.
  • the projection window according to the fifth aspect of the present invention may be exposed to an outer peripheral surface of the tip portion of the insertion section, and a centerline extending in the thickness direction of the projection window through the center of the projection window when the projection window is viewed from the thickness direction of the projection window may be arranged at a twisted position with respect to the central axis of the insertion section.
  • the objective optical system according to the second aspect of the present invention may be a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section and has an optical axis on an incidence side arranged to intersect the central axis of the insertion section.
  • the projection window may be arranged in the outer peripheral surface of the tip portion of the insertion section so that the center of the projection window when the projection window is viewed from the thickness direction of the projection window is present in a plane defined by the central axis of the insertion section and the optical axis on the incidence side.
  • the pattern projection section according to the first aspect of the present invention may have one or more linear parallel patterns.
  • the pattern projection section according to the first aspect of the present invention may include a projecting light source, and a pattern generator that changes the intensity distribution of the light emitted from the projecting light source and generates the light and dark pattern.
  • the endoscope apparatus may further include an optical fiber that guides the light emitted from the projecting light source to the pattern generator.
  • the projecting light source may be provided on a base end side of the insertion section, and the pattern generator may be provided at the tip portion of the insertion section.
  • the projecting light source and the pattern generator according to the ninth aspect of the present invention may be provided at the tip portion of the insertion section.
  • the endoscope apparatus may further include an optical fiber that guides the light and dark pattern emitted from the projecting light source and generated by the pattern generator to a tip side of the insertion section.
  • the projecting light source and the pattern generator may be provided on a base end side of the insertion section.
  • the endoscope apparatus may further include an optical adapter capable of being detachably mounted on the tip portion of the insertion section, and the pattern generator may be provided in the optical adapter.
  • the projecting light source according to the thirteenth aspect of the present invention may be provided in the optical adapter.
  • the endoscope apparatus may further include switching means that switches between the light for projecting the light and dark pattern and the illumination light.
  • a measuring method is provided to perform the three-dimensional shape measurement of a subject using an endoscope (an endoscope apparatus).
  • the measuring method according to the sixteenth aspect of the present invention includes projecting a predetermined light and dark pattern onto the subject from one place of the endoscope; imaging a portion of the subject onto which the light and dark pattern is projected, and acquiring at least one sheet of a pattern projection image; and using the pattern projection image to perform a three-dimensional shape measurement of the portion onto which the light and dark pattern is projected.
  • a measuring method is provided to perform the three-dimensional shape measurement of a subject using an endoscope apparatus.
  • the measuring method according to the seventeenth aspect of the present invention includes projecting a predetermined fringe pattern onto the subject from one place of the endoscope apparatus; imaging a portion of the subject onto which the fringe pattern is projected, and acquiring one sheet of a fringe image; and measuring the three-dimensional shape of the portion onto which the fringe pattern is projected, from the one sheet of fringe image, using a spatial phase shift method or a Fourier transform method.
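The Fourier transform method named in this aspect recovers phase from a single fringe image by isolating the carrier lobe of the fringe in the spectrum. The one-row sketch below is a generic illustration, not the patent's implementation; the function name and parameters are hypothetical:

```python
import numpy as np

def phase_by_fourier_method(row, carrier_freq, bandwidth):
    """Recover the wrapped phase along one image row from a single
    fringe pattern I(x) = a(x) + b(x)*cos(2*pi*f0*x + phi(x)).
    carrier_freq (f0) and bandwidth are in cycles per sample."""
    n = len(row)
    spectrum = np.fft.fft(row)
    freqs = np.fft.fftfreq(n)
    # Band-pass: keep only the positive carrier lobe, discarding the
    # DC term and the mirrored negative-frequency lobe.
    mask = np.abs(freqs - carrier_freq) < bandwidth
    analytic = np.fft.ifft(spectrum * mask)
    # Remove the carrier, leaving the phase modulation phi(x).
    x = np.arange(n)
    return np.angle(analytic * np.exp(-2j * np.pi * carrier_freq * x))
```

Because a single image suffices, this is what allows the measurement from "one sheet of a fringe image" described above, at the cost of requiring the carrier lobe to be separable from the background spectrum.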
  • the measuring method according to the seventeenth aspect of the present invention may further include acquiring at least one sheet of a bright field image of the portion onto which the fringe pattern is projected, at least either before or after the one sheet of fringe image is acquired; selecting at least two sheets of images from the one sheet of fringe image and the bright field image; and detecting that a position of the endoscope apparatus has deviated when there is a positional deviation equal to or more than a predetermined amount in the two sheets of images.
  • the measuring method according to the eighteenth aspect of the present invention may further include acquiring at least one sheet of the bright field images before and after the one sheet of fringe image is acquired.
  • At least two sheets of images selected to detect that the position of the endoscope apparatus has deviated are selected from the bright field images.
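One generic way to detect the positional deviation described above is phase correlation between two of the acquired images. The sketch below is illustrative only; the function name and the pixel threshold are hypothetical, and the patent does not specify this particular algorithm:

```python
import numpy as np

def detect_position_deviation(img_a, img_b, max_shift_px=2.0):
    """Estimate the translation between two equally sized grayscale
    images by phase correlation; report True when the shift
    magnitude exceeds max_shift_px."""
    h, w = img_a.shape
    cross = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    cross /= np.abs(cross) + 1e-12  # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint correspond to negative shifts.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return bool(np.hypot(dy, dx) > max_shift_px), (int(dy), int(dx))
```

Phase correlation is well suited here because it estimates a rigid translation robustly even when the two images differ in overall brightness, as a bright field image and a fringe image may.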
  • the diameter of the insertion section can be reduced.
  • the three-dimensional shape measurement can be performed with high precision even in an endoscope apparatus in which the diameter of the insertion section is reduced.
  • the three-dimensional shape measurement can be performed by analyzing one sheet of a fringe image captured using the endoscope apparatus.
  • the three-dimensional shape measurement can be performed in a short period of time using the endoscope apparatus.
  • FIG. 1 is a block diagram showing the configuration of an endoscope apparatus according to a first embodiment and a second embodiment of the present invention.
  • FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus according to the first and the second embodiments of the present invention.
  • FIG. 3 is a flowchart showing a measuring method according to the first embodiment of the present invention.
  • FIG. 4 is a schematic view showing a first example of the configuration of a tip surface of an insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 5 is a schematic view showing a second example of the configuration of the tip surface of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 6 is a schematic view showing a third example of the configuration of the tip surface of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 7 is a schematic view showing a first example of a configuration in the vicinity of the tip of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 8 is a schematic view showing a second example of the configuration in the vicinity of the tip of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 9 is a schematic view showing a light and dark pattern projected by an endoscope apparatus of a modification according to the first embodiment of the present invention.
  • FIG. 10 is a schematic view showing the configuration of a tip surface of an insertion section in another modification of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 11 is a view showing the configuration of an insertion section in a still further modification of the endoscope apparatus according to the first embodiment of the present invention, that is, a schematic view of an insertion section of an endoscope apparatus capable of observation in a direction perpendicular to the central axis of the insertion section.
  • FIG. 12A is a top view of a tip surface on which a cover member of a prism according to the modification shown in FIG. 11 is put.
  • FIG. 12B is a top view of a tip surface on which the cover member of the prism according to the modification shown in FIG. 11 is put.
  • FIG. 12C is a schematic view of the modification shown in FIG. 12A as viewed from a direction D.
  • FIG. 12D is a schematic view of the modification shown in FIG. 12B as viewed from the direction D.
  • FIG. 13 is a flowchart showing a measuring method according to the second embodiment of the invention.
  • FIG. 1 is a block diagram showing the configuration of the endoscope apparatus 1 of the present embodiment.
  • FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus 1 .
  • the endoscope apparatus 1 is used for internal observation of a subject, observation of a subject at a position that an ordinary observation instrument cannot easily access, or the like.
  • the endoscope apparatus 1 includes an elongated insertion section 10 and a body section 20 to which a base end of the insertion section 10 is connected.
  • the insertion section 10 is formed in a tubular shape, and inserted into the inside of a subject or an access path to a subject.
  • the insertion section 10 is provided with an imaging section 30 that acquires the image of a subject, an illumination section 40 that illuminates an observation visual field in front of the insertion section 10 , and a pattern projection section 50 that projects a light and dark pattern onto a subject.
  • the pattern projection section 50 projects a fringe pattern onto a subject as the light and dark pattern.
  • a tip surface 10 a of the insertion section 10 is provided with an opening 11 for making external light incident on an objective optical system 32 of the imaging section 30 therethrough, an illumination window 12 through which the illumination light from the illumination section 40 is irradiated toward the front of the insertion section, and a projection window 13 through which the fringe from the pattern projection section 50 is irradiated toward the front of the insertion section.
  • the imaging section 30 includes an imager 31 arranged in the vicinity of the tip of the insertion section 10 , the objective optical system 32 arranged in front of the imager 31 , and an imager controller 33 connected to the imager 31 .
  • the objective optical system 32 is arranged within the opening 11 of the insertion section 10 .
  • the objective optical system has a predetermined angle of view, causes the reflected light within an observation visual field defined by the angle of view to be incident on the imager 31 , and causes the image of a subject to be formed on the imager. Additionally, the objective optical system 32 has a light-transmissive cover member 32 a that seals the opening 11 .
  • the imager controller 33 is provided within the body section 20 , and is connected to the imager 31 by a wiring line 34 extending within the insertion section 10 .
  • the imager controller 33 performs various kinds of control of the imager 31 , such as drive settings and acquisition of its video signals.
  • the illumination section 40 includes a first light source 41 , an illumination optical system 42 , a first fiber bundle 43 that guides the light from the first light source 41 to the illumination optical system 42 , and a first incidence optical system 44 arranged between the first light source 41 and the first fiber bundle 43 .
  • the first light source 41 is a general white light source, and is arranged inside the body section 20 .
  • as the first light source 41 , a light-emitting element such as an LED or a laser, a halogen lamp, or the like can be adopted.
  • the illumination optical system 42 is attached to the tip of the insertion section 10 or the vicinity of the tip.
  • the illumination optical system 42 has a light-transmissive cover member 42 a provided within the illumination window 12 of the insertion section 10 , and a lens group that is not shown.
  • the illumination optical system 42 broadens the light irradiated from the first light source 41 to a visual field range suitable for the angle of view of the objective optical system 32 and causes the light to be emitted from the illumination window 12 , and illuminates the observation visual field thoroughly.
  • the first fiber bundle 43 extends from the vicinity of the illumination optical system 42 through the insertion section 10 to the first light source 41 within the body section 20 .
  • the type of the first fiber bundle 43 is not particularly limited, and a general light guide can be used.
  • the first incidence optical system 44 converges the light emitted from the first light source 41 up to a diameter nearly equal to the diameter of the first fiber bundle 43 , and efficiently introduces the light into the first fiber bundle 43 .
  • the pattern projection section 50 includes a second light source 51 (projecting light source), a projection optical system 52 , a second fiber bundle 53 that guides the light of the second light source 51 to the projection optical system 52 , a second incidence optical system 54 arranged between the second light source 51 and the second fiber bundle 53 , and a pattern generator 55 arranged on an optical path for the light emitted from the second light source 51 .
  • the second light source 51 is a white light source similar to the first light source 41 , and is arranged inside the body section 20 .
  • the second light source 51 may be a light source that emits light with a wavelength different from that of the first light source 41 .
  • the projection optical system 52 is attached to the tip of the insertion section 10 or the vicinity of the tip.
  • the projection optical system 52 has a light-transmissive cover member 52 a provided within the projection window 13 of the insertion section 10 .
  • the cover member 52 a provided in the projection window 13 may be lens-shaped.
  • the projection optical system 52 expands the light irradiated from the second light source 51 to a visual field range suitable for the angle of view of the objective optical system 32 , and projects the light into an observation visual field from one projection window 13 .
  • the second fiber bundle 53 extends from the vicinity of the projection optical system 52 through the insertion section 10 to the vicinity of the second light source 51 within the body section 20 .
  • a general light guide can be used, similar to the first fiber bundle 43 .
  • the second incidence optical system 54 converges the light emitted from the second light source 51 up to a diameter nearly equal to the diameter of the second fiber bundle 53 , and efficiently introduces the light into the second fiber bundle 53 .
  • as the pattern generator 55 , a well-known configuration capable of forming a plurality of phase-shifted fringe patterns can be used.
  • for example, a configuration in which a slit plate having a plurality of slits is moved by an actuator, or a configuration in which a transparent plate made of glass or resin, on which a plurality of mutually phase-shifted fringe patterns are drawn, is moved by an actuator can be used.
  • alternatively, a liquid crystal shutter module capable of switching between transmission and non-transmission of light for every element, a MEMS (micro electro mechanical systems) mirror module including a fine reflective mirror for every element, or the like may be used as the pattern generator 55 .
  • since every element is controlled individually, a plurality of phase-shifted fringe patterns can be formed without moving the entire pattern generator 55 . Therefore, there is an advantage that the configuration of the pattern projection section 50 can be simplified.
  • the switching among the fringe patterns is performed by a pattern controller 56 connected to the pattern generator 55 .
  • the shape of the light and dark pattern is not limited to the fringe pattern, and may be a plurality of linear parallel lines as shown in FIG. 2 . Additionally, one line (to be described below) as shown in FIG. 9 may be provided as another example. Additionally, a pattern of a plurality of points, a grid-like pattern in which a plurality of vertical lines and horizontal lines intersect each other, a concentric pattern, or the like may be adopted.
  • the first light source 41 and the second light source 51 are connected to a light source controller 21 that controls ON/OFF of the light sources.
  • the imager controller 33 , the pattern controller 56 , and the light source controller 21 are connected to a main controller 22 that controls the entire endoscope apparatus 1 .
  • An operation section 23 that allows a user to perform various kinds of input to the endoscope apparatus 1 is connected to the main controller 22 .
  • the main controller 22 is connected to a main storage device (RAM 24 ).
  • an auxiliary storage device 25 , such as a device having a rewritable nonvolatile memory or a magnetic storage device, is electrically connected to the main controller 22 .
  • a ROM 26 (or EPROM, EEPROM, or the like) on which firmware or the like is recorded may be connected to the main controller 22 .
  • the video processor 27 that processes video signals acquired by the imager 31 is connected to the imager controller 33 and the main controller 22 .
  • a monitor 28 that displays video signals processed by the video processor 27 as an image is connected to the video processor 27 .
  • the measuring method of the first embodiment of the present invention is a measuring method of performing the three-dimensional shape measurement of a subject, using the endoscope apparatus 1 .
  • a user inserts the insertion section 10 into the inside of the subject or into an access path to the subject, such as a conduit, and advances the tip of the insertion section 10 to a predetermined observation region.
  • the user performs inspection or the like of the subject by switching between an observation mode in which a desired region of the subject is observed and a measurement mode in which the three-dimensional shape of the region is measured, as necessary.
  • the light source controller 21 receives the command from the main controller 22 to ON-control the first light source 41 and OFF-control the second light source 51 .
  • a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field (hereinafter, this illumination state is referred to as an “observation state”).
  • the image of the illuminated subject is formed on the imager 31 through the objective optical system 32 .
  • Video signals sent from the imager 31 are processed by the video processor 27 and displayed on the monitor 28 . The user can observe the subject from the image of the subject displayed on the monitor 28 , or can save the image if necessary.
  • when switching is made from the observation mode to the measurement mode, the user inputs a mode switching instruction.
  • a well-known input device can be used as an input device that inputs the mode switching instruction.
  • measurement image capturing processing (refer to FIG. 3 ) is started in the main controller 22 .
  • first, it is determined whether or not the endoscope apparatus 1 has entered the observation state (Step S 1 shown in FIG. 3 ).
  • when it is determined in Step S 1 that the endoscope apparatus has entered the observation state, the processing proceeds to Step S 3 , and when the endoscope apparatus is in a state other than the observation state (for example, a measurement state to be described below), the processing proceeds to Step S 2 .
  • Step S 1 is ended by this.
  • Step S 2 is a step where the endoscope apparatus 1 is switched to being in the observation state.
  • in Step S 2 , the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • Step S 2 is ended by this, and the processing proceeds to Step S 3 .
  • Step S 3 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • in Step S 3 , an image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40 (hereinafter, the image captured in the observation state is referred to as a “bright field image”).
  • the bright field image captured in Step S 3 is temporarily stored in the RAM 24 .
  • Step S 3 is ended by this, and the processing proceeds to Step S 4 .
  • Step S 4 is a branch step for capturing a desired number of sheets of pattern projection images.
  • in Step S 4 , the predetermined number N of sheets of pattern projection images scheduled to be captured is compared with the number of sheets of pattern projection images stored in the RAM 24 at this time.
  • when the number of sheets of pattern projection images that have been captured is less than N, the processing proceeds to Step S 5 .
  • when the number of sheets of pattern projection images that have been captured has reached N, the processing proceeds to Step S 7 .
  • Step S 4 is ended by this.
  • Step S 5 is a step where a fringe pattern is projected onto the subject.
  • in Step S 5 , on the basis of the command from the main controller 22 , the first light source 41 is OFF-controlled, and the second light source 51 is ON-controlled. Then, the white light irradiated from the illumination section 40 is turned off, and a fringe pattern is projected onto the subject from the pattern projection section 50 .
  • the fringe pattern projected onto the subject is a pattern in which a bright portion R 1 formed by the white light source and a dark portion R 2 shaded by the pattern generator 55 are alternately arranged. Additionally, the pattern generator 55 operates the actuator to set the phase of the fringe pattern to an appropriate phase. In this state (hereinafter referred to as a “pattern projection state”), an appropriate fringe is projected onto the subject from one place.
  • Step S 5 is ended by this, and the processing proceeds to Step S 6 .
  • Step S 6 is a step where a pattern projection image is captured in the pattern projection state.
  • the fringe pattern projected onto the subject is a pattern that has changed according to the three-dimensional shape of the subject.
  • an image is acquired by the imager 31 of the imaging section 30 (hereinafter, the image captured in the pattern projection state is referred to as a “pattern projection image”).
  • the pattern projection image captured in Step S 6 is temporarily stored in the RAM 24 .
  • Step S 6 is ended by this, and the processing returns to Step S 4 .
  • Steps S 4 to S 6 are repeated until the number of sheets of pattern projection images to be captured reaches the number N of sheets of images scheduled to be captured.
  • in each repetition of Step S 5 , the phase of the fringe pattern is appropriately changed, and the images of the subject onto which fringes with different phases are projected are captured one by one, for a total of N sheets.
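The capture sequence of Steps S 3 to S 8 can be sketched as follows. This is an illustrative simplification in Python, not part of the patent; the function name, the return format, and the equal phase spacing are assumptions (the text only says the phase is "appropriately changed" each repetition):

```python
import math

def capture_sequence(n_pattern_images):
    """Return the capture schedule of Steps S3 to S8 as
    (illumination, fringe_phase) pairs: one bright field image, N
    phase-shifted pattern projection images, and a second bright
    field image."""
    shots = [("white", None)]  # Step S3: bright field image under white light
    for k in range(n_pattern_images):  # Steps S4 to S6 repeated N times
        phase = 2.0 * math.pi * k / n_pattern_images  # Step S5: shift the fringe
        shots.append(("pattern", phase))  # Step S6: pattern projection image
    shots.append(("white", None))  # Steps S7 and S8: bright field image again
    return shots
```

The two bright field captures bracketing the pattern projection captures are what Step S 9 later uses for deviation detection.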
  • Step S 7 is a step where the endoscope apparatus 1 is switched to being in the observation state.
  • in Step S 7 , the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • Step S 7 is ended by this, and the processing proceeds to Step S 8 .
  • Step S 8 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • in Step S 8 , a bright field image is captured by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40 .
  • the bright field image captured in Step S 8 is temporarily stored in the RAM 24 .
  • Step S 8 is ended by this, and the processing proceeds to Step S 9 .
  • Step S 9 is a step where the relative movement (hereinafter referred to as “deviation”) between the insertion section 10 and the subject from Step S 3 to Step S 8 is detected on the basis of the images (the bright field image and the pattern projection image) captured from Step S 3 to Step S 8 .
  • in Step S 9 , first, two sheets of images are selected from among the bright field images and the pattern projection images stored in the RAM 24 .
  • for example, a bright field image captured before the N sheets of pattern projection images are captured and a bright field image captured after the N sheets of pattern projection images are captured are selected, and a feature point of the subject is detected in each of the two images.
  • Step S 9 is ended by this, and the processing proceeds to Step S 10 .
  • Step S 10 is a step where a deviation of the two images is determined using the feature point detected in Step S 9 and the processing branches.
  • in Step S 10 , if the coordinates of the feature point in the two sheets of images are the same coordinates in the respective images, it is determined that no deviation has occurred between the first image and the next image, and the processing proceeds to Step S 11 . On the contrary, if the coordinates of the feature point in the two sheets of images are different coordinates in the respective images, it is determined that deviation has occurred between the first image and the next image. Since the deviation has occurred, a message showing that another capturing is required is displayed on the monitor 28 (Step S 14 ), and a series of processing is ended.
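The branch in Step S 10 can be sketched as a coordinate comparison. The text compares the feature point coordinates for equality; the `tolerance` parameter below is a hypothetical allowance for sub-pixel detection noise, not something the patent specifies:

```python
def deviation_occurred(point_before, point_after, tolerance=0.0):
    """Step S10 sketch: compare the (x, y) coordinates of the same
    feature point detected in the bright field images captured before
    and after the pattern projection images (Step S9). Returns True
    when the point has moved by more than `tolerance` pixels."""
    dx = point_after[0] - point_before[0]
    dy = point_after[1] - point_before[1]
    return (dx * dx + dy * dy) ** 0.5 > tolerance
```

When this returns True, the apparatus would display the re-capture message of Step S 14 instead of proceeding to analysis.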
  • Step S 10 is ended by this.
  • Step S 11 is a step where the user is made to select whether three-dimensional measurement using the captured pattern projection image is performed at that time or later.
  • in Step S 11 , for example, an inquiry such as “Perform measurement?” is displayed on the monitor 28 , and the user is prompted to input whether or not the three-dimensional measurement using the captured pattern projection images is to be performed.
  • when there is an input that the measurement is to be performed, the processing proceeds to Step S 12 .
  • when there is an input that the measurement is not to be performed, the processing proceeds to Step S 15 .
  • Step S 11 is ended by this.
  • Step S 12 is a step where analysis is performed for the three-dimensional measurement.
  • in Step S 12 , the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24 .
  • the three-dimensional shape of the subject is analyzed, for example, by the well-known time phase shift method, using the N sheets of pattern projection images with different phases.
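The time phase shift analysis can be illustrated at a single pixel. The patent only names the well-known time phase shift method; the four-step variant sketched here (phases shifted by π/2, wrapped phase from an arctangent) is one common instance, not necessarily the one used by the apparatus:

```python
import math

def phase_from_four_steps(i0, i1, i2, i3):
    """Wrapped phase at one pixel by the four-step phase shift formula
    phi = atan2(I3 - I1, I0 - I2), assuming the fringe phase was
    shifted by pi/2 between the four captures."""
    return math.atan2(i3 - i1, i0 - i2)

# Synthetic check: intensities I_k = A + B*cos(phi + k*pi/2) of a
# fringe with phase 0.7 rad, sampled at the four shifted captures.
samples = [100.0 + 50.0 * math.cos(0.7 + k * math.pi / 2.0) for k in range(4)]
recovered = phase_from_four_steps(*samples)
```

Applying this per pixel over the N images yields a wrapped phase map, which is then unwrapped and converted to depth using the optical parameters mentioned in Step S 15.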
  • Step S 12 may be performed as background processing during Step S 11 simultaneously with the start of Step S 11 .
  • Step S 12 is ended by this, and the processing proceeds to Step S 13 .
  • Step S 13 is a step where the display on the monitor 28 is shifted to a screen of various measurement modes, and a measurement result is displayed on the monitor 28 , using the information saved in Step S 12 .
  • in Step S 13 , the three-dimensional shape of the subject is displayed on the monitor 28 by overlaying the result analyzed in Step S 12 on the bright field image acquired in Step S 3 (or the bright field image acquired in Step S 8 ). This enables the user to identify the three-dimensional shape of the subject.
  • Step S 13 is ended by this, and a series of processing is ended.
  • Step S 15 is a step that branches from the above Step S 11 , and is a step that performs information processing required to display the measurement result later.
  • in Step S 15 , similarly to the above Step S 12 , the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24 .
  • the three-dimensional shape of the subject is analyzed by the well-known time phase shift method, using the N sheets of pattern projection images with different phases.
  • the bright field images, the pattern projection images, the analysis result of the three-dimensional shape, and the optical parameters used for the analysis are saved as binary files or text files, respectively, in the auxiliary storage device 25 .
  • these files are saved in the auxiliary storage device 25 so that the files can be collectively read later.
  • Step S 15 is ended by this, and a series of processing is ended.
  • the projection window 13 of the pattern projection section 50 is provided in one place of the tip surface 10 a of the insertion section 10 .
  • the diameter of the insertion section 10 can be reduced as compared to a case where two projection windows 13 are provided in the tip surface 10 a of the insertion section 10 .
  • when the occupying area of the projection windows 13 in the tip surface 10 a of the insertion section 10 of the endoscope apparatus 1 is large, it is difficult to increase the occupying area of the illumination window 12 and the objective optical system 32 .
  • if the occupying area of the illumination window 12 is small, the quantity of illumination light may be insufficient.
  • if the occupying area of the objective optical system 32 is small, it may be difficult to increase the aperture of a lens, and an image may become dark.
  • the number of projection windows 13 through which a fringe pattern is projected is one.
  • the occupying area of the illumination window 12 or the objective optical system 32 is capable of being increased.
  • a brighter image can be acquired even in the insertion section 10 with a thickness equal to that of the endoscope according to related art.
  • an image with a brightness equal to or higher than that of the endoscope according to related art is capable of being obtained.
  • a three-dimensional shape can be measured with high precision.
  • deviation is detected using bright field images before and after a pattern projection image is captured, and a three-dimensional shape is analyzed when it is determined that there is no deviation.
  • analysis is not performed with fringe patterns on a plurality of pattern projection images deviated. For this reason, the analysis precision of the three-dimensional shape can be enhanced.
  • the positional deviations are capable of being reduced.
  • the present modification 1 is different from the above-described first embodiment in that the present modification 1 includes a pattern generator 55 A (refer to FIG. 1 ).
  • the pattern generator 55 A is not capable of projecting light and dark patterns with different phases. However, the pattern generator 55 A is configured so that a light and dark pattern with a specific phase can be projected onto a subject. That is, the pattern generator 55 A of the present modification 1 does not include an actuator that moves the slit plate or the like, and is configured with a small size.
  • a measuring method of the three-dimensional shape measurement of a subject is also different.
  • the measuring method of the present modification 1 will be described below mainly about points that are different from the above-described first embodiment in terms of processing.
  • since the number N of sheets of images scheduled to be captured in Step S 4 is 1, one sheet of a pattern projection image is captured without the repetition of Step S 4 to Step S 6 in the above-described embodiment, and the processing proceeds to Step S 7 .
  • the analysis method of the three-dimensional shape measurement in Step S 12 and Step S 15 is also different from that of the above-described first embodiment.
  • the three-dimensional shape is analyzed by a space phase shift method or a Fourier transform method in Step S 12 and Step S 15 using one sheet of a pattern projection image.
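For orientation, the Fourier transform method applied to a single image row can be sketched as below. The function name, the carrier frequency parameter, and the width of the band kept around the carrier are illustrative assumptions; the patent does not specify the implementation:

```python
import numpy as np

def phase_by_fourier(row, carrier):
    """Fourier transform method on one image row: keep only the
    spectral lobe around the positive fringe carrier, inverse
    transform, and take the angle. The result is the wrapped phase
    including the carrier ramp; `carrier` is the assumed fringe
    frequency in cycles per row."""
    spectrum = np.fft.fft(np.asarray(row, dtype=float))
    band = np.zeros(len(row))
    band[carrier // 2 : carrier + carrier // 2 + 1] = 1.0  # positive lobe only
    analytic = np.fft.ifft(spectrum * band)
    return np.angle(analytic)
```

Subtracting the known carrier ramp and unwrapping then isolates the phase modulation caused by the subject's shape, which is why a single pattern projection image suffices.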
  • the three-dimensional shape is analyzed using one sheet of the pattern projection image.
  • the time until an analysis result is obtained after capturing of an image is started can be shortened.
  • the measuring method of the present modification 1 is similarly applicable even to a configuration having the pattern generator 55 including the actuator that moves the slit plate or the like, and can analyze a three-dimensional shape more rapidly than the time phase shift method using a plurality of sheets of pattern projection images.
  • the present modification 2 includes the pattern generator 55 A (refer to FIG. 1 ), and the pattern projection section 50 is configured so that the pattern projection section is capable of projecting one light or dark linear pattern as shown in FIG. 9 onto a subject.
  • one stripe-shaped light portion R 1 may be projected into the dark portion R 2 .
  • the pattern itself projected from the pattern projection section 50 is neither moved in terms of projection place or direction nor deformed in terms of shape.
  • the pattern generator 55 A of the present modification 2 does not include the actuator that moves the slit plate or the like, and is configured with a small size.
  • the measuring method of the three-dimensional shape measurement of a subject is also different.
  • the measuring method of the present modification 2 will be described below, mainly regarding the points that are different from Modification 1 of the first embodiment as described above in terms of processing.
  • the three-dimensional shape is analyzed in Step S 12 and Step S 15 by an optical cutting method, using one sheet of a pattern projection image.
  • the three-dimensional shape is analyzed on one pattern, using one sheet of a pattern projection image. Therefore, as compared to the case where the entire surface of one sheet of a pattern projection image is analyzed in the above-described first embodiment, a portion where the three-dimensional shape measurement is capable of being performed is limited, but analysis time can be significantly shortened.
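The optical cutting (light-section) analysis reduces to triangulation along the single projected line. The sketch below assumes an idealized parallel-axis geometry (depth = baseline × focal length / disparity); the function and its parameters are illustrative, since a real endoscope would use calibrated optical parameters:

```python
def depth_by_light_section(pixel_shift, baseline_mm, focal_px):
    """Minimal optical cutting sketch: the projected line shifts
    laterally in the image in proportion to subject depth at each
    point along it. `pixel_shift` is the line's disparity from its
    reference position, `baseline_mm` the projector-to-lens base
    length, `focal_px` the focal length in pixels."""
    if pixel_shift == 0:
        raise ValueError("line at its reference position; depth is unresolved")
    return baseline_mm * focal_px / pixel_shift
```

Because only the pixels on the line are analyzed, the measurable region is limited to that line, which is exactly the trade-off against analysis time described above.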
  • the measuring method of the present modification 2 is similarly applicable even to a configuration in which the pattern generator 55 includes the actuator that moves the slit plate or the like.
  • the three-dimensional shape can be rapidly analyzed not only in a portion of a visual field range (on the screen) but also in a plurality of different portions (positions).
  • the present modification 3 does not include the second light source 51 , but includes switching means that makes the light emitted from the first light source 41 incident on the second fiber bundle 53 .
  • as the switching means, for example, a device, such as a MEMS mirror module, that switches the optical path of the light emitted from the first light source 41 among a plurality of directions can be adopted.
  • the same effects as the endoscope apparatus 1 described in the above-described first embodiment are exhibited. Additionally, since only one light source is required, the number of parts of the endoscope apparatus 1 is capable of being reduced.
  • the configuration of the tip surface 10 a of the endoscope apparatus 1 is different from that of the above-described first embodiment.
  • FIGS. 4 to 6 are views showing the configuration of the direct-view-type insertion section 10 including the illumination window 12 , the projection window 13 , and the like in the tip surface 10 a.
  • FIGS. 4 to 6 there are various embodiments in the configuration of respective elements in the tip surface 10 a of the insertion section 10 of the endoscope apparatus 1 .
  • the objective optical system 32 is arranged on a central axis O of the insertion section 10 .
  • the illumination window 12 is provided so as to surround approximately half of the outer periphery of the objective optical system 32 .
  • the projection window 13 is arranged opposite to the objective optical system 32 with respect to the illumination window 12 . In such an arrangement, the occupying area of the illumination window 12 is capable of being increased.
  • the shape of the objective optical system 32 is generally a circle or a shape close to a circle.
  • the illumination window 12 and the projection window 13 can be efficiently arranged at the tip portion of the endoscope with limited arrangement area, and the diameter of the tip portion of an endoscope is easily reduced. Moreover, since the center of an endoscope image and the central axis O of the insertion section 10 coincide with each other, an operator can insert the endoscope without a sense of incongruity while observing the image of the subject on the monitor.
  • the pattern generator 55 is provided on the depth side of the projection window 13 .
  • the pattern generator 55 is arranged so that a linear pattern is located in a direction perpendicular to the arrangement direction of the projection window 13 and the objective optical system.
  • such an arrangement secures the longest possible distance (hereinafter referred to as a base length) from the central point of the objective optical system perpendicular to the linear pattern, while keeping the arrangement interval between the projection window and the objective optical system the closest. Since measurement precision improves as the base length becomes longer, according to the present modification example, the three-dimensional shape can be measured with high precision even in the endoscope apparatus in which the diameter of the insertion section is reduced.
  • the objective optical system 32 is arranged at a position eccentric from the central axis O of the insertion section 10 unlike the arrangement shown in FIG. 4 .
  • the objective optical system 32 is arranged so that an optical axis on the emission side where the reflected light within the observation visual field is directed to the imager 31 from the objective optical system 32 is parallel to and eccentric from the central axis O.
  • the opening 11 in which the objective optical system 32 is arranged, the illumination window 12 , and the projection window 13 may be arranged at positions eccentric from the central axis O of the insertion section 10 .
  • a vertical axis P 1 and a left-right axis Q 1 that pass through the optical axis L of the objective optical system 32 may be arranged at positions where these axes do not overlap a vertical axis P 2 and a left-right axis Q 2 that pass through the central axis of the insertion section 10 .
  • the diameter of the insertion section 10 can be further reduced, for example, as compared to a related-art endoscope apparatus in which the objective optical system 32 is provided on the central axis O of the insertion section 10 .
  • the first light source 41 and the second light source 51 are arranged in the vicinity of the tip of the insertion section 10 .
  • the first fiber bundle 43 is not included, and the light from the first light source 41 is directly irradiated toward the illumination window 12 ; similarly, the second fiber bundle 53 is not included, and the light from the second light source 51 is directly irradiated toward the pattern generator 55 .
  • as shown in FIG. 8 , it is also possible to adopt a configuration having the first light source 41 , the second light source 51 , and the imager 31 in the vicinity of the tip of the insertion section 10 , and having an optical adapter 10 A that is attachable to and detachable from the tip portion of the insertion section 10 .
  • a tip surface 10 a 1 of the optical adapter 10 A corresponds to the tip surface 10 a of the insertion section 10 in the above-described first embodiment.
  • the first light source 41 and the illumination window 12 are connected together by an optical fiber 43 A arranged within the optical adapter 10 A. Additionally, the second light source 51 and the pattern generator 55 are connected together by an optical fiber 53 A arranged within the optical adapter 10 A.
  • since the first light source 41 and the second light source 51 are provided in the vicinity of the tip of the insertion section 10 , when the insertion section 10 has, for example, a length exceeding several tens of meters, there is little loss of light, and a brighter image is capable of being acquired compared to a case where the first fiber bundle 43 and the second fiber bundle 53 are used.
  • FIG. 10 shows a modification of another arrangement in which the arrangement of FIG. 6 is further modified.
  • This modification is an example of the endoscope apparatus of a configuration having the optical adapter 10 A.
  • FIG. 10 shows a view of the tip surface of the optical adapter.
  • the objective optical system 32 is arranged on the central axis O of the insertion section 10 , and the illumination window 12 and the projection window 13 are respectively arranged on both sides of the objective optical system 32 .
  • Contact pins 14 used to supply electric power to the first light source or the second light source from the body are provided on the back of the optical adapter 10 A.
  • a positioning groove 15 or a structure in place of the positioning groove is provided on the back of the optical adapter 10 A.
  • Such contact pins 14 and positioning groove 15 are respectively provided on sides where the illumination window 12 and the projection window 13 are not arranged, with respect to the objective optical system 32 . This allows the contact pins 14 , the positioning groove 15 , the illumination window 12 , and the projection window 13 to be arranged in a small-diameter endoscope tip portion without interfering with each other, even in an optical adapter type.
  • FIG. 11 shows Modification 7, a tip portion of an endoscope apparatus capable of performing observation in a direction perpendicular to the central axis of the insertion section.
  • a tip surface 10 b whose normal line is a straight line orthogonal to the central axis of the insertion section 10 is formed in a portion of an outer peripheral surface of the tip portion of the insertion section 10 .
  • the illumination window 12 , the projection window 13 , and the cover member 32 a are all arranged at the tip surface 10 b.
  • the objective optical system 32 has a prism 16 in which an optical axis L 1 on the incidence side is directed to a direction that intersects the optical axis L 2 on the emission side that turns from the objective optical system 32 to the imager 31 .
  • the prism 16 is one of optical elements that configure the objective optical system 32 .
  • the optical axis L 1 on the incidence side is an optical axis when the reflected light within the observation visual field is incident on the prism 16
  • the optical axis L 2 on the emission side is an optical axis when the reflected light within the observation visual field is incident on the imager 31 from the prism 16 .
  • the optical axis L 1 on the incidence side is at a twisted position with respect to the central axis O of the insertion section 10 .
  • the optical axis L 2 on the emission side is parallel to the central axis O of the insertion section 10 .
  • FIG. 12A and FIG. 12B are views of the tip surface 10 b in the endoscope of FIG. 11 as viewed from a direction perpendicular to the tip surface 10 b, and are plan views showing an arrangement example of the illumination window 12 , the projection window 13 , and the cover member 32 a.
  • the tip surface 10 b is a substantially flat plane.
  • the cover member 32 a and the projection window 13 are arranged on the central axis O of the insertion section 10 in plan view.
  • Two illumination windows 12 are arranged on the sides of the cover member 32 a. Both the cover member 32 a and the projection window 13 are exposed to the tip surface 10 b that is the outer peripheral surface of the tip portion of the insertion section.
  • a line obtained by projecting the central axis O perpendicularly onto the tip surface 10 b is defined as a virtual centerline PL.
  • the cover member 32 a and the projection window 13 are arranged at positions where the centers of these and the virtual centerline PL intersect each other.
  • the projection window 13 has a positional relationship in which the center of the projection window when the projection window 13 is viewed from the thickness direction of the projection window 13 is present within a plane defined by the central axis O of the insertion section 10 and the optical axis L 1 on the incidence side.
  • the cover member 32 a may be arranged at a position where the center thereof does not intersect the virtual centerline PL
  • the illumination window 12 may be arranged on the virtual centerline PL of the insertion section 10
  • the projection window 13 may be arranged at a position where the center thereof does not intersect the virtual centerline PL.
  • the center of the cover member 32 a and the optical axis L 1 of the prism 16 are arranged so as to coincide with each other, and the objective optical system 32 is arranged so that the optical axis L 1 thereof does not intersect the virtual centerline PL.
  • FIG. 12C and FIG. 12D are respectively schematic views of the insertion section 10 as viewed from a direction shown by symbol D in FIG. 12A and FIG. 12B , and are front views of the insertion section 10 .
  • the objective optical system 32 and the imager 31 are arranged so that the optical axis L 2 when the reflected light within the observation visual field is incident on the imager 31 from the prism 16 and the central axis O of the insertion section 10 are eccentric from each other.
  • the diameter of the tip portion is easily reduced because the number of projection windows is one, compared to the case where projection windows 13 used for projecting a light and dark pattern are provided in a plurality of places as in the related art.
  • the shapes and arrangement positions of the respective elements arranged at the tip surface are not limited only to the examples of FIG. 12A and FIG. 12B .
  • although a configuration in which the optical axis L 1 and the optical axis L 2 are orthogonal to each other is illustrated in the present modification example, a configuration in which the optical axis L 1 and the optical axis L 2 intersect each other at an angle other than a right angle may be adopted.
  • control operation performed by the main controller 22 is different from those of the above-described first embodiment and Modifications 1 to 7 of the first embodiment.
  • the second light source 51 is provided on the tip side of the insertion section 10 , and is, for example, a high-luminance light source, such as a laser.
  • a state where a suitable fringe is projected onto a subject from one place (a “fringe projection state”) may be entered by ON-controlling the second light source 51 while the first light source 41 is kept ON-controlled, on the basis of the command of the main controller 22 .
  • fringe images with different fringe luminance may be captured by controlling the second light source 51 to thereby change the quantity of light, without changing the phase of a light and dark pattern.
  • control operation performed by the main controller 22 is different from those of the above-described first embodiment and Modifications 1 to 8 of the first embodiment.
  • in Step S 9 , a bright field image captured before the N sheets of fringe images are captured and a bright field image captured after the N sheets of fringe images are captured are selected, and the total of the differences in luminance value between the two sheets of images is calculated.
  • Step S 10 if the total of the differences in luminance value calculated in Step S 9 is smaller than a threshold, it is determined that any deviation does not occur in a first image and the next image, and the processing proceeds to Step S 11 . On the contrary, when the total of the differences in luminance value calculated in Step S 9 is larger than the threshold, it is determined that deviation has occurred in the first image and the next image. Since the deviation has occurred, a message showing that another capturing is required is displayed on the monitor 28 (Step S 14 ), and a series of processing is ended.
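The comparison described in Steps S9 and S10 can be sketched as follows. This is an illustrative sketch only, not part of the specification; the threshold value, the grayscale image format, and the function name are assumptions:

```python
import numpy as np

def deviation_occurred(before_img, after_img, threshold=1000.0):
    """Compare the bright field images captured before and after the
    fringe images; return True when the total luminance difference
    exceeds the threshold, i.e. the probe or subject has moved."""
    # Sum of absolute per-pixel luminance differences over the whole image.
    total_diff = np.abs(before_img.astype(np.float64)
                        - after_img.astype(np.float64)).sum()
    return total_diff > threshold
```

Restricting the comparison to a sub-region of the image, as in the modification, amounts to slicing both arrays before the subtraction.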
  • the differences are calculated over the entire image.
  • alternatively, the processing may be performed using only a certain portion of an image.
  • alternatively, the differences in luminance may be calculated using one bright field image and one fringe image.
  • the control operation performed by the main controller 22 is different from that of the above-described first embodiment and Modifications 1 to 9.
  • the second light source 51 (refer to FIG. 7) includes a plurality of minute light-emitting elements.
  • the plurality of light-emitting elements provided in the second light source 51 are lighting-controlled in two or more groups.
  • if the groups of light-emitting elements are arranged along the phase direction of the fringe pattern provided on the light and dark pattern generator 55, the light and dark pattern generator 55 may be a plate with simple slits that cannot arbitrarily change the phase of a fringe, or a plate similar thereto.
  • in Step S5, several different fringes are projected onto the subject by switching, in order, the groups of light-emitting elements to be turned on.
  • these respective fringe images can be captured in Step S6.
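If the light-emitting groups produce N fringes whose phases are evenly shifted by 2π/N, the wrapped fringe phase at each pixel can be recovered by the standard N-step phase shift formula. The following is a hypothetical sketch (the evenly spaced phase steps and the function name are assumptions, not details fixed by the specification):

```python
import numpy as np

def nstep_phase(images):
    """Recover the wrapped fringe phase from N fringe images whose
    phases are shifted by 2*pi/N between successive captures (N >= 3)."""
    n = len(images)
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(s) for img, s in zip(images, shifts))
    den = sum(img * np.cos(s) for img, s in zip(images, shifts))
    # For I_k = A + B*cos(phi + shift_k), this evaluates to phi (wrapped
    # into (-pi, pi]); phase unwrapping is a separate later step.
    return np.arctan2(-num, den)
```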
  • fringe images may also be used as the images for detecting deviation.
  • more than two bright field images may be captured. If more than two bright field images are available, deviation can be detected by selecting the required number of images from among them as necessary.
  • the measuring method of the present embodiment is a method of performing the three-dimensional shape measurement of a subject, using an endoscope apparatus.
  • FIG. 1 is a block diagram showing the configuration of the endoscope apparatus 1 of the present embodiment.
  • FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus 1 .
  • the configuration of the endoscope apparatus of the second embodiment is the same as the configuration of the endoscope apparatus of the first embodiment. Accordingly, the same elements as those of the first embodiment will be designated by the same reference numerals, and a detailed description thereof will be omitted here.
  • a user inserts the insertion section 10 into the inside of a subject or into an access path to the subject, such as a conduit, and advances the tip of the insertion section 10 to a predetermined observation region.
  • the user performs inspection or the like of the subject by switching, as necessary, between an observation mode where a desired region of the subject is observed and a measurement mode where the three-dimensional shape measurement of the region is performed.
  • the light source controller 21 receives the command of the main controller 22 to ON-control the first light source 41 and OFF-control the second light source 51 .
  • a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field (hereinafter, this illumination state is referred to as an “observation state”).
  • the image of the illuminated subject is formed on the imager 31 through the objective optical system 32 .
  • Video signals sent from the imager 31 are processed by the video processor 27 and displayed on the monitor 28 . The user can observe the subject from the image of the subject displayed on the monitor 28 , or save the image if necessary.
  • when switching is made from the observation mode to the measurement mode, the user inputs a mode switching instruction.
  • a well-known input device can be used as the input device on which the mode switching instruction is input.
  • for example, the operation section 23 may be provided with a switch, or the monitor 28 may be changed to a touch panel so as to provide a software switch.
  • measurement image capturing processing (refer to FIG. 13) is started in the main controller 22.
  • in Step S1, it is determined whether or not the endoscope apparatus 1 is in the observation state (Step S1 shown in FIG. 13).
  • when it is determined in Step S1 that the endoscope apparatus is in the observation state, the processing proceeds to Step S3; when the endoscope apparatus is in a state other than the observation state (for example, a measurement state to be described below), the processing proceeds to Step S2.
  • Step S1 is thereby ended.
  • Step S2 is a step where the endoscope apparatus 1 is switched to the observation state.
  • in Step S2, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50, and white light is irradiated from the illumination section 40 to illuminate the observation visual field.
  • Step S2 is thereby ended, and the processing proceeds to Step S3.
  • Step S3 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • in Step S3, an image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40 (hereinafter, the image captured in the observation state is referred to as a "bright field image").
  • the bright field image captured in Step S3 is temporarily stored in the RAM 24. Step S3 is thereby ended, and the processing proceeds to Step S16.
  • Step S16 is a step where a predetermined fringe pattern is projected onto the subject from one place of the endoscope apparatus 1.
  • in Step S16, on the basis of the command of the main controller 22, the first light source 41 is OFF-controlled, and the second light source 51 is ON-controlled. Then, the white light irradiated from the illumination section 40 is turned off, and a fringe pattern is projected onto the subject from the pattern projection section 50.
  • the fringe pattern projected onto the subject is a pattern in which a light portion R1 produced by the white light source and a dark portion R2 shaded by the pattern generator 55 are alternately arranged (hereinafter, this state is referred to as a "pattern projection state").
  • Step S16 is thereby ended, and the processing proceeds to Step S17.
  • Step S17 is a step where a pattern projection image is captured in the pattern projection state.
  • the fringe pattern projected onto the subject is deformed according to the three-dimensional shape of the subject.
  • in this state, one image is acquired by the imager 31 of the imaging section 30 (hereinafter, the image captured in the pattern projection state is referred to as a "pattern projection image").
  • the pattern projection image captured in Step S17 is temporarily stored in the RAM 24.
  • Step S17 is thereby ended, and the processing proceeds to Step S18.
  • Step S18 is a step where the endoscope apparatus 1 is switched to the observation state.
  • in Step S18, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50, and white light is irradiated from the illumination section 40 to illuminate the observation visual field.
  • Step S18 is thereby ended, and the processing proceeds to Step S19.
  • Step S19 is a step where a fringe pattern is not projected and the image of the subject illuminated with white light from the illumination section 40 is captured.
  • in Step S19, a bright field image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with white light from the illumination section 40.
  • the bright field image captured in Step S19 is temporarily stored in the RAM 24.
  • Step S19 is thereby ended, and the processing proceeds to Step S20.
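The light-source switching and image acquisition in Steps S3 and S16 to S19 can be summarized in controller-style pseudocode. This is purely an illustrative sketch; the object names and methods (`on`, `off`, `grab`) are assumptions, not APIs of the apparatus:

```python
class TwoStateLight:
    """Minimal stand-in for a controllable light source."""
    def __init__(self):
        self.lit = False

    def on(self):
        self.lit = True

    def off(self):
        self.lit = False

def capture_measurement_set(white_light, pattern_light, camera):
    """Run the capture sequence of Steps S3 and S16-S19:
    bright field image, pattern projection image, bright field image."""
    white_light.on()                  # observation state (Steps S2/S3)
    pattern_light.off()
    bright_before = camera.grab()     # Step S3
    white_light.off()                 # pattern projection state (Step S16)
    pattern_light.on()
    pattern_image = camera.grab()     # Step S17
    white_light.on()                  # observation state again (Step S18)
    pattern_light.off()
    bright_after = camera.grab()      # Step S19
    return bright_before, pattern_image, bright_after
```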
  • Step S20 is a step where the relative movement (hereinafter referred to as "deviation") between the insertion section 10 and the subject from Step S3 to Step S19 is detected on the basis of the images (the bright field images and the pattern projection image) captured from Step S3 to Step S19.
  • in Step S20, first, two images are selected from at least either of the bright field images and the pattern projection image that are stored in the RAM 24.
  • for example, the bright field image captured before the pattern projection image and the bright field image captured after the pattern projection image are selected.
  • then, the same feature point is detected in the two selected images, and the coordinates of the feature point in each of the two images are calculated.
  • Step S20 is thereby ended, and the processing proceeds to Step S21.
  • Step S21 is a step where the deviation between the two images is determined using the feature point detected in Step S20, and the processing branches.
  • in Step S21, if the coordinates of the feature point are the same in the two images, it is determined that no deviation has occurred between the first image and the next image, and the processing proceeds to Step S22. On the contrary, if the coordinates of the feature point differ between the two images, it is determined that deviation has occurred between the first image and the next image. In this case, a message showing that another capturing is required is displayed on the monitor 28 (Step S25), and the series of processing is ended.
  • Step S21 is thereby ended.
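One simple way to realize the feature point comparison of Steps S20 and S21 is to take a small patch around a feature in the first image, locate the best-matching position of that patch in the second image, and compare the coordinates. The sketch below uses a brute-force sum-of-squared-differences search over grayscale NumPy arrays; it is an illustration under those assumptions, not the method fixed by the specification:

```python
import numpy as np

def locate_patch(image, patch):
    """Brute-force template match: return the (row, col) where `patch`
    best matches `image`, scored by the sum of squared differences."""
    ph, pw = patch.shape
    ih, iw = image.shape
    best, best_pos = None, (0, 0)
    for r in range(ih - ph + 1):
        for c in range(iw - pw + 1):
            ssd = np.sum((image[r:r + ph, c:c + pw] - patch) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def deviation_between(img_a, img_b, top_left, size):
    """Take a feature patch from img_a at `top_left`, find it in img_b,
    and report the coordinate shift between the two images."""
    r, c = top_left
    patch = img_a[r:r + size, c:c + size]
    rb, cb = locate_patch(img_b, patch)
    return (rb - r, cb - c)   # (0, 0) means the feature did not move
```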
  • Step S22 is a step where the user selects whether the three-dimensional measurement using the captured pattern projection image is performed now or later.
  • in Step S22, for example, an inquiry such as "Perform measurement?" is displayed on the monitor 28, and the user is prompted to input whether or not the three-dimensional measurement using the captured pattern projection image is to be performed.
  • when there is an input indicating that the measurement is to be performed, the processing proceeds to Step S23.
  • when there is an input indicating that the measurement is not to be performed, the processing proceeds to Step S26.
  • Step S22 is thereby ended.
  • Step S23 is a step where analysis is performed for the three-dimensional measurement.
  • in Step S23, the three-dimensional shape is analyzed on the basis of the pattern projection image stored in the RAM 24.
  • the three-dimensional shape of the subject is analyzed, for example, by the well-known spatial phase shift method or Fourier transform method, using the one pattern projection image.
  • Step S23 may be performed as background processing of Step S22, simultaneously with the start of Step S22.
  • Step S23 is thereby ended, and the processing proceeds to Step S24.
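For reference, the "Fourier transform method" mentioned for Step S23 is commonly realized as single-image fringe demodulation: the carrier lobe of the fringe is isolated in the frequency domain, and the phase of the inverse transform encodes the shape. A one-dimensional sketch per image row follows (hypothetical; in practice the carrier frequency and filter width would come from the optical parameters, and here they are assumptions):

```python
import numpy as np

def fourier_demodulate_row(row, carrier, half_width):
    """Recover the wrapped fringe phase along one image row.
    `carrier` is the fringe frequency in cycles per row, and
    `half_width` is the half-width of the band-pass window around it."""
    spectrum = np.fft.fft(row)
    freqs = np.fft.fftfreq(row.size, d=1.0 / row.size)  # cycles per row
    # Keep only the positive carrier lobe, zero everything else.
    mask = np.abs(freqs - carrier) <= half_width
    analytic = np.fft.ifft(spectrum * mask)
    # The angle of the analytic signal is carrier phase + shape phase.
    return np.angle(analytic)
```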
  • Step S24 is a step where the display on the monitor 28 is shifted to a screen of the various measurement modes and a measurement result is displayed on the monitor 28, using the information saved in Step S23.
  • in Step S24, the three-dimensional shape of the subject is displayed on the monitor 28 by overlaying the result analyzed in Step S23 on the bright field image acquired in Step S3 (or the bright field image acquired in Step S19). This enables the user to know the three-dimensional shape of the subject.
  • Step S24 is thereby ended, and the series of processing is ended.
  • Step S26 is a step that branches from the above Step S22 and performs the information processing required to display the measurement result later.
  • in Step S26, similarly to the above Step S23, the three-dimensional shape is analyzed on the basis of the pattern projection image stored in the RAM 24.
  • the three-dimensional shape of the subject is analyzed by the well-known spatial phase shift method or Fourier transform method, using the one pattern projection image.
  • the bright field images, the pattern projection image, the analysis result of the three-dimensional shape, and the optical parameters used for the analysis are saved, as binary files or text files, respectively, in the auxiliary storage device 25.
  • these files are saved in the auxiliary storage device 25 so that they can be collectively read later.
  • Step S26 is thereby ended, and the series of processing is ended.
  • as described above, the three-dimensional shape of a subject can be measured on the basis of one pattern projection image captured in a state where a predetermined fringe pattern is projected onto the subject.
  • thus, the three-dimensional shape can be measured in a short time using the endoscope apparatus 1.
  • since deviation can be detected by capturing bright field images in addition to the one pattern projection image and using two images selected from the pattern projection image and the bright field images, the measurement precision of the three-dimensional shape can be enhanced.
  • deviation is detected using the bright field images captured before and after the pattern projection image, and the three-dimensional shape is analyzed only when it is determined that there is no deviation.
  • accordingly, analysis is not performed on pattern projection images whose fringe patterns have deviated. For this reason, the analysis precision of the three-dimensional shape can be enhanced.
  • the positional deviation when the measurement result obtained using the pattern projection image is overlaid and displayed on a bright field image can also be reduced.
  • in the present modification, the measuring method of the three-dimensional shape measurement of a subject is different.
  • the measuring method of the present modification will be described below, mainly regarding the points where its processing contents differ from those of the above-described second embodiment.
  • in the present modification, the three-dimensional shape is analyzed in Step S23 and Step S26 by an optical cutting method, using one pattern projection image.
  • in the optical cutting method, the three-dimensional shape is analyzed along one pattern line, using one pattern projection image. Therefore, compared to the case where the entire surface of one pattern projection image is analyzed as in the above-described embodiment, the portion where the three-dimensional shape can be measured is limited, but the analysis time can be significantly shortened.
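The optical cutting (light-section) analysis along one pattern line can be sketched as locating the projected line in each image row and converting its displacement into height. The sketch below is illustrative only; the centroid-based line localization and the single calibration factor `k` are simplifying assumptions, and a real analysis would use the calibrated optical parameters of the apparatus:

```python
import numpy as np

def stripe_centers(image):
    """For each image row, locate the projected stripe by the intensity
    centroid of that row (a sub-pixel estimate of the peak position)."""
    cols = np.arange(image.shape[1])
    weights = image.astype(np.float64)
    return (weights * cols).sum(axis=1) / weights.sum(axis=1)

def depth_from_shift(centers, reference_centers, k):
    """Convert the stripe displacement relative to a flat reference
    plane into relative height, with `k` a calibration factor obtained
    from the triangulation geometry (assumed known here)."""
    return k * (centers - reference_centers)
```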
  • substantially one pattern projection image may be acquired and used for the analysis by capturing a plurality of pattern projection images and selecting the one whose state is good.
  • pattern projection images may also be used as the images for detecting deviation.
  • more than two bright field images may be captured. If more than two bright field images are available, deviation can be detected by selecting the required number of images from among them as necessary.

Abstract

An endoscope apparatus and a measuring method measure a subject using a pattern projection image obtained by projecting a fringe pattern onto a subject. The endoscope apparatus includes an elongated insertion section, an imaging section at a tip portion of the insertion section to acquire the image of a subject, an illumination section at the tip portion of the insertion section to emit the illumination light that illuminates an observation visual field of the imaging section, and a pattern projection section at the tip portion of the insertion section to project a fringe pattern onto the subject. A tip surface of the insertion section includes an objective optical system that forms the image of the subject on the imaging section, one or more illumination windows through which illumination light is emitted, and one projection window through which a fringe pattern is projected onto the subject from the pattern projection section.

Description

  • This application is a continuation application based on a PCT Patent Application No. PCT/JP2012/060832, filed on Apr. 23, 2012, whose priority is claimed on Japanese Patent Application No. 2011-099889, filed on Apr. 27, 2011 and Japanese Patent Application No. 2011-099890, filed on Apr. 27, 2011. The contents of both the PCT Application and the Japanese Applications are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an endoscope apparatus and a measuring method, and more particularly, to an endoscope apparatus that projects patterns, such as a fringe, onto a subject, to measure the three-dimensional shape of the surface of the subject, and a method of projecting patterns, such as a fringe, onto a subject, to measure the three-dimensional shape of the surface of the subject.
  • 2. Description of Related Art
  • In the related art, in order to inspect a subject, there are endoscopes (endoscope apparatuses) including an elongated insertion section and having observation means, such as an optical system and an imaging element, at the tip of an insertion section. Among such endoscopes, there is known an endoscope that acquires a plurality of fringe images obtained by projecting a fringe onto a subject while shifting the phase of the fringe, and that calculates the three-dimensional shape of the subject by a well-known phase shift method using the plurality of fringe images. For example, United States Patent Application, Publication No. 2009-0225321 discloses an endoscope apparatus in which two projection windows used to project a fringe are provided in a tip surface of the insertion section.
  • SUMMARY OF THE INVENTION
  • According to a first aspect of the present invention, an endoscope apparatus is provided to measure a subject using a pattern projection image of the subject on which a light and dark pattern of light is projected. The endoscope apparatus according to the first aspect of the invention includes an insertion section, an imaging section, an illumination section, and a pattern projection section. The imaging section is provided at a tip portion of the insertion section to acquire the image of the subject. The illumination section emits illumination light that illuminates an observation visual field of the imaging section. The pattern projection section projects the light and dark pattern onto the subject. A tip surface of the insertion section is provided with an objective optical system that forms the image of the subject on the imaging section, one or more illumination windows through which the illumination light is emitted, and a projection window through which the light and dark pattern is projected onto the subject from the pattern projection section. The pattern projection section includes a pattern generator which generates the light and dark pattern. The light and dark pattern is a pattern with an intensity distribution in which a light portion and a dark portion are alternately arranged.
  • According to a second aspect of the present invention, the objective optical system according to the first aspect of the present invention may be arranged so that an optical axis on an emission side that is directed to the imaging section from the objective optical system among optical axes of the objective optical system is parallel to and eccentric from a central axis of the insertion section.
  • According to a third aspect of the present invention, the objective optical system according to the second aspect of the present invention may be a direct-view-type objective optical system in which both an optical axis on an incidence side and the optical axis on the emission side are parallel to the central axis. Moreover, the objective optical system may be provided at the tip surface of the tip portion of the insertion section and is arranged at a position eccentric from the central axis.
  • According to a fourth aspect of the present invention, the projection window according to the third aspect of the present invention may be provided at the tip surface of the tip portion of the insertion section and is arranged at a position eccentric from the central axis of the insertion section.
  • According to a fifth aspect of the present invention, the objective optical system according to the second aspect of the present invention may be a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section and has an optical axis on an incidence side arranged at a twisted position with respect to the central axis of the insertion section.
  • According to a sixth aspect of the present invention, the projection window according to the fifth aspect of the present invention may be exposed to an outer peripheral surface of the tip portion of the insertion section, and a centerline extending in the thickness direction of the projection window through the center of the projection window when the projection window is viewed from the thickness direction of the projection window may be arranged at a twisted position with respect to the central axis of the insertion section.
  • According to a seventh aspect of the present invention, the objective optical system according to the second aspect of the present invention may be a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section and has an optical axis on an incidence side arranged to intersect the central axis of the insertion section. Moreover, the projection window may be arranged in the outer peripheral surface of the tip portion of the insertion section so that the center of the projection window when the projection window is viewed from the thickness direction of the projection window is present in a plane defined by the central axis of the insertion section and the optical axis on the incidence side.
  • According to an eighth aspect of the present invention, the pattern projection section according to the first aspect of the present invention may have one or more linear parallel patterns.
  • According to a ninth aspect of the present invention, the pattern projection section according to the first aspect of the present invention may include a projecting light source, and a pattern generator that changes the intensity distribution of the light emitted from the projecting light source and generates the light and dark pattern.
  • According to a tenth aspect of the present invention, the endoscope apparatus according to the ninth aspect of the present invention may further include an optical fiber that guides the light emitted from the projecting light source to the pattern generator. Moreover, the projecting light source may be provided on a base end side of the insertion section, and the pattern generator may be provided at the tip portion of the insertion section.
  • According to an eleventh aspect of the present invention, the projecting light source and the pattern generator according to the ninth aspect of the present invention may be provided at the tip portion of the insertion section.
  • According to a twelfth aspect of the present invention, the endoscope apparatus according to the ninth aspect of the present invention may further include an optical fiber that guides the light and dark pattern emitted from the projecting light source and generated by the pattern generator to a tip side of the insertion section. Additionally, the projecting light source and the pattern generator may be provided on a base end side of the insertion section.
  • According to a thirteenth aspect of the present invention, the endoscope apparatus according to the ninth aspect of the present invention may further include an optical adapter capable of being detachably mounted on the tip portion of the insertion section, and the pattern generator may be provided in the optical adapter.
  • According to a fourteenth aspect of the present invention, the projecting light source according to the thirteenth aspect of the present invention may be provided in the optical adapter.
  • According to a fifteenth aspect of the present invention, the endoscope apparatus according to any one of the first to fourteenth aspects of the present invention may further include switching means that switches between the light for projecting the light and dark pattern and the illumination light.
  • According to a sixteenth aspect of the present invention, a measuring method is provided to perform the three-dimensional shape measurement of a subject using an endoscope (an endoscope apparatus). The measuring method according to the sixteenth aspect of the present invention includes projecting a predetermined light and dark pattern onto the subject from one place of the endoscope; imaging a portion of the subject onto which the light and dark pattern is projected, and acquiring at least one sheet of a pattern projection image; and using the pattern projection image to perform a three-dimensional shape measurement of the portion onto which the light and dark pattern is projected.
  • According to a seventeenth aspect of the present invention, a measuring method is provided to perform the three-dimensional shape measurement of a subject using an endoscope apparatus. The measuring method according to the seventeenth aspect of the present invention includes projecting a predetermined fringe pattern onto the subject from one place of the endoscope apparatus; imaging a portion of the subject onto which the fringe pattern is projected, and acquiring one sheet of a fringe image; and measuring the three-dimensional shape of the portion onto which the fringe pattern is projected, from the one sheet of fringe image, using a spatial phase shift method or a Fourier transform method.
  • According to an eighteenth aspect of the present invention, the measuring method according to the seventeenth aspect of the present invention may further include acquiring at least one sheet of a bright field image of the portion onto which the fringe pattern is projected, at least either before or after the one sheet of fringe image is acquired; selecting at least two sheets of images from the one sheet of fringe image and the bright field image; and detecting that a position of the endoscope apparatus has deviated when there is a positional deviation equal to or more than a predetermined amount in the two sheets of images.
  • According to a nineteenth aspect of the present invention, the measuring method according to the eighteenth aspect of the present invention may further include acquiring at least one sheet of the bright field images before and after the one sheet of fringe image is acquired.
  • According to a twentieth aspect of the present invention, in the measuring method according to the nineteenth aspect of the present invention, at least two sheets of images selected to detect that the position of the endoscope apparatus has deviated are selected from the bright field images.
  • EFFECTS OF THE INVENTION
  • According to the endoscope apparatuses according to all of the aspects of the present invention, the diameter of the insertion section can be reduced.
  • According to the measuring methods according to the aspects of the present invention, the three-dimensional shape measurement is capable of being performed with high precision even in the endoscope apparatus in which the diameter of the insertion section is reduced.
  • According to the measuring methods of the three-dimensional shape measurement according to the aspects of the present invention, the three-dimensional shape measurement can be performed by analyzing one sheet of a fringe image captured using the endoscope apparatus. Thus, the three-dimensional shape measurement can be performed in a short period of time using the endoscope apparatus.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the configuration of an endoscope apparatus according to a first embodiment and a second embodiment of the present invention.
  • FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus according to the first and the second embodiments of the present invention.
  • FIG. 3 is a flowchart showing a measuring method according to the first embodiment of the present invention.
  • FIG. 4 is a schematic view showing a first example of the configuration of a tip surface of an insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 5 is a schematic view showing a second example of the configuration of the tip surface of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 6 is a schematic view showing a third example of the configuration of the tip surface of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 7 is a schematic view showing a first example of a configuration in the vicinity of the tip of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 8 is a schematic view showing a second example of the configuration in the vicinity of the tip of the insertion section of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 9 is a schematic view showing a light and dark pattern projected by an endoscope apparatus of a modification according to the first embodiment of the present invention.
  • FIG. 10 is a schematic view showing the configuration of a tip surface of an insertion section in another modification of the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 11 is a view showing the configuration of an insertion section in a still further modification of the endoscope apparatus according to the first embodiment of the present invention, and is a schematic view of the insertion section of the endoscope apparatus capable of observation in a direction perpendicular to the central axis of the insertion section.
  • FIG. 12A is a top view of a tip surface on which a cover member of a prism according to the modification shown in FIG. 11 is put.
  • FIG. 12B is a top view of a tip surface on which the cover member of the prism according to the modification shown in FIG. 11 is put.
  • FIG. 12C is a schematic view of the modification shown in FIG. 12A as viewed from a direction D.
  • FIG. 12D is a schematic view of the modification example shown in FIG. 12B as viewed from the direction D.
  • FIG. 13 is a flowchart showing a measuring method according to the second embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION First Embodiment
  • An endoscope apparatus 1 and a measuring method of the first embodiment of the invention will be described below.
  • First, the configuration of the endoscope apparatus 1 of the present embodiment will be described. FIG. 1 is a block diagram showing the configuration of the endoscope apparatus 1 of the present embodiment. FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus 1.
  • The endoscope apparatus 1 is used for internal observation of a subject, observation of a subject at a position that an ordinary observation instrument cannot easily access, or the like. The endoscope apparatus 1 includes an elongated insertion section 10 and a body section 20 to which a base end of the insertion section 10 is connected.
  • As shown in FIG. 1, the insertion section 10 is formed in a tubular shape, and inserted into the inside of a subject or an access path to a subject. The insertion section 10 is provided with an imaging section 30 that acquires the image of a subject, an illumination section 40 that illuminates an observation visual field in front of the insertion section 10, and a pattern projection section 50 that projects a light and dark pattern onto a subject. In the present embodiment, the pattern projection section 50 projects a fringe pattern onto a subject as the light and dark pattern.
  • Additionally, a tip surface 10 a of the insertion section 10 is provided with an opening 11 through which the reflected light from a subject is incident on an objective optical system 32 of the imaging section 30, an illumination window 12 through which the illumination light from the illumination section 40 is irradiated toward the front of the insertion section, and a projection window 13 through which the fringe pattern from the pattern projection section 50 is irradiated toward the front of the insertion section.
  • The imaging section 30 includes an imager 31 arranged in the vicinity of the tip of the insertion section 10, the objective optical system 32 arranged in front of the imager 31, and an imager controller 33 connected to the imager 31.
  • As the imager 31, various well-known configurations including various image sensors, such as a CCD and a CMOS, can be appropriately selected and used.
  • The objective optical system 32 is arranged within the opening 11 of the insertion section 10. The objective optical system has a predetermined angle of view, causes the reflected light within an observation visual field defined by the angle of view to be incident on the imager 31, and causes the image of a subject to be formed on the imager. Additionally, the objective optical system 32 has a light-transmissive cover member 32 a that seals the opening 11.
  • The imager controller 33 is provided within the body section 20, and is connected to the imager 31 by a wiring line 34 extending within the insertion section 10. The imager controller 33 performs various kinds of control of the imager 31, such as driving the imager and setting acquisition of its video signals.
  • The illumination section 40 includes a first light source 41, an illumination optical system 42, a first fiber bundle 43 that guides the light from the first light source 41 to the illumination optical system 42, and a first incidence optical system 44 arranged between the first light source 41 and the first fiber bundle 43.
  • The first light source 41 is a general white light source, and is arranged inside the body section 20. As the first light source 41, light-emitting elements, such as an LED and a laser, a halogen lamp, or the like can be adopted.
  • The illumination optical system 42 is attached to the tip of the insertion section 10 or the vicinity of the tip. The illumination optical system 42 has a light-transmissive cover member 42 a provided within the illumination window 12 of the insertion section 10, and a lens group that is not shown. The illumination optical system 42 broadens the light irradiated from the first light source 41 to a visual field range suitable for the angle of view of the objective optical system 32 and causes the light to be emitted from the illumination window 12, and illuminates the observation visual field thoroughly.
  • The first fiber bundle 43 extends from the vicinity of the illumination optical system 42 through the insertion section 10 to the first light source 41 within the body section 20. The type of the first fiber bundle 43 is not particularly limited, and a general light guide can be used.
  • The first incidence optical system 44 converges the light emitted from the first light source 41 up to a diameter nearly equal to the diameter of the first fiber bundle 43, and efficiently introduces the light into the first fiber bundle 43.
  • The pattern projection section 50 includes a second light source 51 (projecting light source), a projection optical system 52, a second fiber bundle 53 that guides the light of the second light source 51 to the projection optical system 52, a second incidence optical system 54 arranged between the second light source 51 and the second fiber bundle 53, and a pattern generator 55 arranged on an optical path for the light emitted from the second light source 51.
  • The second light source 51 is a white light source similar to the first light source 41, and is arranged inside the body section 20. In addition, the second light source 51 may be a light source that emits light with a wavelength different from that of the first light source 41.
  • The projection optical system 52 is attached to the tip of the insertion section 10 or the vicinity of the tip. The projection optical system 52 has a light-transmissive cover member 52 a provided within the projection window 13 of the insertion section 10. The cover member 52 a provided in the projection window 13 may be lens-shaped. The projection optical system 52 expands the light irradiated from the second light source 51 to a visual field range suitable for the angle of view of the objective optical system 32, and projects the light into an observation visual field from one projection window 13.
  • The second fiber bundle 53 extends from the vicinity of the projection optical system 52 through the insertion section 10 to the vicinity of the second light source 51 within the body section 20. As the second fiber bundle 53, a general light guide can be used, similar to the first fiber bundle 43.
  • The second incidence optical system 54 converges the light emitted from the second light source 51 up to a diameter nearly equal to the diameter of the second fiber bundle 53, and efficiently introduces the light into the second fiber bundle 53.
  • As the pattern generator 55, a well-known configuration capable of forming a plurality of phase-shifted fringe patterns can be used. For example, a configuration in which a slit plate having a plurality of slits is moved by an actuator, or a configuration in which a transparent plate made of glass or resin, on which a plurality of mutually phase-shifted fringe patterns are drawn, is moved by the actuator is used.
  • In addition, a liquid crystal shutter module capable of switching between transmission and non-transmission of light for every element, a MEMS (micro-electro-mechanical systems) mirror module including a fine reflective mirror for every element, or the like may be used as the pattern generator 55. In this case, since every element is controlled individually, a plurality of phase-shifted fringe patterns can be formed without moving the entire pattern generator 55. Therefore, there is an advantage that the configuration of the pattern projection section 50 can be simplified. The switching among the fringe patterns is performed by a pattern controller 56 connected to the pattern generator 55.
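For an element-controlled pattern generator, the phase-shifted fringes can be precomputed as images. The following is a minimal sketch, not part of the patent, of generating N mutually phase-shifted sinusoidal fringe patterns with NumPy; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def fringe_patterns(width, height, period_px, n_steps):
    """Generate n_steps mutually phase-shifted sinusoidal fringe patterns.

    Each pattern is a (height, width) array with values in [0, 1];
    bright columns correspond to the light portions of the projected fringe.
    """
    x = np.arange(width)
    patterns = []
    for i in range(n_steps):
        shift = 2.0 * np.pi * i / n_steps       # equal phase steps
        row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period_px + shift)
        patterns.append(np.tile(row, (height, 1)))
    return patterns
```

A slit plate moved by an actuator produces binary (square-wave) fringes instead; the sinusoidal form here is the one usually assumed by phase shift analysis.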
  • The shape of the light and dark pattern is not limited to the fringe pattern, and may be a plurality of parallel straight lines as shown in FIG. 2. Additionally, one line (to be described below) as shown in FIG. 9 may be provided as another example. Additionally, a plurality of points, a grid-like pattern in which a plurality of vertical lines and horizontal lines intersect each other, a concentric pattern, or the like may be adopted.
  • Other mechanisms provided within the body section 20 will be described. The first light source 41 and the second light source 51 are connected to a light source controller 21 that controls ON/OFF of the light sources. The imager controller 33, the pattern controller 56, and the light source controller 21 are connected to a main controller 22 that controls the entire endoscope apparatus 1. An operation section 23 that allows a user to perform various kinds of input to the endoscope apparatus 1 is connected to the main controller 22. Additionally, the main controller 22 is connected to a main storage device (RAM 24). In the present embodiment, an auxiliary storage device 25, such as a storage device having a rewritable nonvolatile memory or a magnetic storage device, is electrically connected to the main controller 22.
  • If necessary, a ROM 26 (or EPROM, EEPROM, or the like) on which firmware or the like is recorded may be connected to the main controller 22.
  • Moreover, a video processor 27 that processes video signals acquired by the imager 31 is connected to the imager controller 33 and the main controller 22. A monitor 28 that displays video signals processed by the video processor 27 as an image is connected to the video processor 27.
  • Next, the measuring method of the first embodiment of the present invention will be described through an example in which measurement is performed using the above-described endoscope apparatus 1.
  • The measuring method of the first embodiment of the present invention is a measuring method of performing the three-dimensional shape measurement of a subject, using the endoscope apparatus 1. When the endoscope apparatus 1 is used, first, a user inserts the insertion section 10 into the inside of the subject, an access path to the subject such as a conduit, or the like, and advances the tip of the insertion section 10 to a predetermined observation region. The user performs inspection or the like of the subject by switching, if necessary, between an observation mode in which a desired region of the subject is observed and a measurement mode in which the three-dimensional shape of the region is measured.
  • In the observation mode, the light source controller 21 receives the command from the main controller 22 to ON-control the first light source 41 and OFF-control the second light source 51. As a result, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field (hereinafter, this illumination state is referred to as an “observation state”). The image of the illuminated subject is formed on the imager 31 through the objective optical system 32. Video signals sent from the imager 31 are processed by the video processor 27 and displayed on the monitor 28. The user can observe the subject from the image of the subject displayed on the monitor 28, or can save the image if necessary.
  • When switching is made from the observation mode to the measurement mode, the user inputs a mode switching instruction. A well-known input device can be used as an input device that inputs the mode switching instruction. For example, it is possible to adopt a configuration in which the operation section 23 is provided with a switch or a configuration in which the monitor 28 is changed to a touch panel so as to provide a software switch.
  • If an instruction to switch from the observation mode to the measurement mode is input by the user, measurement image capturing processing (refer to FIG. 3) is started in the main controller 22.
  • In the measurement image capturing processing, first, it is determined whether or not the endoscope apparatus 1 has entered the observation state (Step S1 shown in FIG. 3).
  • When it is determined in Step S1 that the endoscope apparatus has entered the observation state, the processing proceeds to Step S3; when the endoscope apparatus is in a state other than the observation state (for example, a measurement state to be described below), the processing proceeds to Step S2.
  • This ends Step S1.
  • Step S2 is a step where the endoscope apparatus 1 is switched to the observation state.
  • In Step S2, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • This ends Step S2, and the processing proceeds to Step S3.
  • Step S3 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • In Step S3, an image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40 (hereinafter, the image captured in the observation state is referred to as a “bright field image”).
  • The bright field image captured in Step S3 is temporarily stored in the RAM 24.
  • This ends Step S3, and the processing proceeds to Step S4.
  • Step S4 is a branch step for capturing the desired number of pattern projection images.
  • In Step S4, the predetermined number N of pattern projection images scheduled to be captured is compared with the number of pattern projection images stored in the RAM 24 at this time. When the number of pattern projection images stored in the RAM 24 is less than the number N of images scheduled to be captured, the processing proceeds to Step S5. When the number of pattern projection images stored in the RAM 24 has reached the number N of images scheduled to be captured, the processing proceeds to Step S7.
  • This ends Step S4.
  • Step S5 is a step where a fringe pattern is projected onto the subject.
  • In Step S5, on the basis of the command from the main controller 22, the first light source 41 is OFF-controlled, and the second light source 51 is ON-controlled. Then, the white light irradiated from the illumination section 40 is turned off, and a fringe pattern is projected onto the subject from the pattern projection section 50. The fringe pattern projected onto the subject, as shown in FIG. 2, is a pattern in which a bright portion R1 produced by the white light source and a dark portion R2 shaded by the pattern generator 55 are alternately arranged. Additionally, the pattern generator 55 operates the actuator to set the phase of the fringe pattern to an appropriate phase. In this state (hereinafter referred to as a "pattern projection state"), an appropriate fringe is projected onto the subject from one place.
  • This ends Step S5, and the processing proceeds to Step S6.
  • Step S6 is a step where a pattern projection image is captured in the pattern projection state.
  • In Step S6, the fringe pattern projected onto the subject is deformed according to the three-dimensional shape of the subject. In this state, an image is acquired by the imager 31 of the imaging section 30 (hereinafter, the image captured in the pattern projection state is referred to as a "pattern projection image").
  • The pattern projection image captured in Step S6 is temporarily stored in the RAM 24.
  • This ends Step S6, and the processing returns to Step S4.
  • Steps S4 to S6 are repeated until the number of captured pattern projection images reaches the number N of images scheduled to be captured. At this time, in Step S5, the phase of the fringe pattern is appropriately changed, and images of the subject onto which fringes with different phases are projected are captured one by one until a total of N images is obtained.
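The capture sequence of Steps S3 through S8 can be sketched as a simple control loop. The stub classes and method names below (`grab`, `set_phase`, and so on) are hypothetical stand-ins for the imager controller, light source controller, and pattern controller; they illustrate only the ordering of illumination, projection, and capture.

```python
import numpy as np

class StubLight:
    """Stand-in for the illumination light source control (ON/OFF)."""
    def __init__(self): self.lit = False
    def on(self):  self.lit = True
    def off(self): self.lit = False

class StubProjector(StubLight):
    """Stand-in for the pattern projection section with a settable phase."""
    def __init__(self):
        super().__init__()
        self.phase = 0.0
    def set_phase(self, phase): self.phase = phase

class StubCamera:
    """Returns a dummy frame; a real imager would return the sensor image."""
    def grab(self): return np.zeros((4, 4))

def capture_measurement_images(n_patterns, camera, light, projector):
    light.on(); projector.off()
    bright_before = camera.grab()                 # Step S3: bright field image
    pattern_images = []
    for i in range(n_patterns):                   # Steps S4-S6 repeated N times
        light.off()
        projector.set_phase(2.0 * np.pi * i / n_patterns)   # Step S5
        projector.on()
        pattern_images.append(camera.grab())      # Step S6: pattern projection image
        projector.off()
    light.on()                                    # Step S7: back to observation state
    bright_after = camera.grab()                  # Step S8: second bright field image
    return bright_before, pattern_images, bright_after
```

The two bracketing bright field images are what the deviation check of Steps S9 and S10 later compares.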
  • Step S7 is a step where the endoscope apparatus 1 is switched to the observation state.
  • In Step S7, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • This ends Step S7, and the processing proceeds to Step S8.
  • Step S8 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • In Step S8, a bright field image is captured by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40.
  • The bright field image captured in Step S8 is temporarily stored in the RAM 24.
  • This ends Step S8, and the processing proceeds to Step S9.
  • Step S9 is a step where the relative movement (hereinafter referred to as “deviation”) between the insertion section 10 and the subject from Step S3 to Step S8 is detected on the basis of the images (the bright field image and the pattern projection image) captured from Step S3 to Step S8.
  • In Step S9, first, two images are selected from among the bright field images and the pattern projection images stored in the RAM 24. For example, in the first embodiment, a bright field image captured before the N pattern projection images are captured and a bright field image captured after the N pattern projection images are captured are selected.
  • Subsequently, the same feature point is detected in the two selected images, and the coordinates of the feature point in each of the two images are calculated.
  • This ends Step S9, and the processing proceeds to Step S10.
  • Step S10 is a step where whether deviation has occurred between the two images is determined using the feature point detected in Step S9, and the processing branches.
  • In Step S10, if the coordinates of the feature point are the same in the two images, it is determined that no deviation has occurred between the first image and the next image, and the processing proceeds to Step S11. Conversely, if the coordinates of the feature point differ between the two images, it is determined that deviation has occurred between the first image and the next image. In that case, a message indicating that another capture is required is displayed on the monitor 28 (Step S14), and the series of processing ends.
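A minimal sketch of the deviation check of Steps S9 and S10: locate the same feature point in the two bright field images and compare its coordinates. Here a small template patch found by brute-force sum-of-squared-differences search stands in for a real feature detector; all names and the tolerance are illustrative assumptions, not the patent's method.

```python
import numpy as np

def find_patch(image, patch):
    """Locate `patch` in `image` by exhaustive SSD search.

    Returns the (row, col) of the best-matching top-left corner; a real
    implementation would use a proper feature detector instead.
    """
    H, W = image.shape
    h, w = patch.shape
    best, best_pos = None, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = float(np.sum((image[r:r+h, c:c+w] - patch) ** 2))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def deviation_occurred(img_before, img_after, patch, tol_px=0.0):
    """Compare the feature-point coordinates in the two bright field images."""
    p0 = find_patch(img_before, patch)
    p1 = find_patch(img_after, patch)
    return np.hypot(p1[0] - p0[0], p1[1] - p0[1]) > tol_px
```

When this check reports deviation, the fringe phases of the N pattern projection images no longer correspond pixel-to-pixel, so recapture is the safe choice.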
  • This ends Step S10.
  • Step S11 is a step where the user selects whether the three-dimensional measurement using the captured pattern projection images is performed at that time or later.
  • In Step S11, for example, an inquiry such as "Perform measurement?" is displayed on the monitor 28, and the user is prompted to input whether or not to perform the three-dimensional measurement using the captured pattern projection images.
  • When an input to perform the measurement is made, the processing proceeds to Step S12.
  • When an input not to perform the measurement is made, the processing proceeds to Step S15.
  • This ends Step S11.
  • Step S12 is a step where analysis is performed for the three-dimensional measurement.
  • In Step S12, the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24. For example, in the first embodiment, the three-dimensional shape of the subject is analyzed by the well-known time phase shift method, using the N pattern projection images with different phases.
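As an illustration of the N-step phase shift computation, assuming the standard model I_i = A + B·cos(φ + 2πi/N), which the patent does not spell out, the wrapped fringe phase can be recovered per pixel as follows (a sketch, not the patent's implementation):

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped fringe phase from N >= 3 equally phase-shifted images.

    Assumes I_i = A + B*cos(phi + 2*pi*i/N); the recovered phi is
    wrapped into (-pi, pi] and must still be unwrapped and converted
    to height via the system's optical parameters.
    """
    n = len(images)
    deltas = 2.0 * np.pi * np.arange(n) / n
    num = sum(I * np.sin(d) for I, d in zip(images, deltas))
    den = sum(I * np.cos(d) for I, d in zip(images, deltas))
    return np.arctan2(-num, den)
```

The arctangent form cancels both the background intensity A and the fringe contrast B, which is why the method tolerates uneven illumination of the subject.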
  • The analysis result of the three-dimensional shape is generated as a text file or a binary file, and is saved together with the N pattern projection images in the auxiliary storage device 25. Step S12 may be performed as background processing during Step S11, starting simultaneously with Step S11.
  • This ends Step S12, and the processing proceeds to Step S13.
  • Step S13 is a step where the display on the monitor 28 is shifted to a screen of various measurement modes, and a measurement result is displayed on the monitor 28, using the information saved in Step S12.
  • In Step S13, the result analyzed in Step S12 is overlaid on the bright field image acquired in Step S3 (or the bright field image acquired in Step S8), and the three-dimensional shape of the subject is thereby displayed on the monitor 28 together with the bright field image. This enables the user to identify the three-dimensional shape of the subject.
  • This ends Step S13, and the series of processing ends.
  • Step S15 is a step that branches from the above Step S11, and is a step that performs the information processing required to display the measurement result later. In Step S15, similar to the above Step S12, the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24. For example, in the first embodiment, the three-dimensional shape of the subject is analyzed by the well-known time phase shift method, using the N pattern projection images with different phases.
  • Additionally, the bright field images, the pattern projection images, the analysis result of the three-dimensional shape, and the optical parameters used for the analysis are saved as binary files or text files, respectively, in the auxiliary storage device 25. In this case, by making portions of the file names common or by collectively saving these files in one directory (folder), the files are saved in the auxiliary storage device 25 so that they can be collectively read later.
  • This ends Step S15, and the series of processing ends.
  • As described above, according to the endoscope apparatus 1 of the first embodiment of the invention, the projection window 13 of the pattern projection section 50 is provided at one place on the tip surface 10 a of the insertion section 10. Thus, the diameter of the insertion section 10 can be reduced as compared to a case where two projection windows 13 are provided in the tip surface 10 a of the insertion section 10.
  • Additionally, if projection windows 13 for projecting a fringe pattern are provided at a plurality of places as in the related art, the area occupied by the projection windows 13 in the tip surface 10 a of the insertion section 10 of the endoscope apparatus 1 is large, and it is difficult to increase the area occupied by the illumination window 12 and the objective optical system 32. For example, if the area occupied by the illumination window 12 is small, the quantity of illumination light may be insufficient. Additionally, if the area occupied by the objective optical system 32 is small, it may be difficult to increase the aperture of a lens, and an image may become dark.
  • In contrast, in the endoscope apparatus 1 of the first embodiment of the present invention, the number of projection windows 13 through which a fringe pattern is projected is one. Thus, the area occupied by the illumination window 12 or the objective optical system 32 can be increased. As a result, a brighter image can be acquired even in an insertion section 10 with a thickness equal to that of the related-art endoscope. Additionally, even in an insertion section 10 whose diameter is reduced compared to the related-art endoscope, an image with a brightness equal to or higher than that of the related-art endoscope can be obtained.
  • According to the measuring method of the first embodiment of the present invention, even in an environment where a fringe pattern is projected from one projection window 13 in the endoscope apparatus 1 in which the diameter of the insertion section 10 is reduced, a three-dimensional shape can be measured with high precision.
  • According to the measuring method of the first embodiment of the present invention, deviation is detected using the bright field images captured before and after the pattern projection images, and the three-dimensional shape is analyzed when it is determined that there is no deviation. Thus, analysis is not performed with the fringe patterns on a plurality of pattern projection images deviated from one another. For this reason, the analysis precision of the three-dimensional shape can be enhanced. Moreover, when the measurement result obtained using the pattern projection images is overlaid on a bright field image and displayed, the positional deviation can be reduced.
  • Modification 1 of the First Embodiment
  • Next, a modification 1 of the endoscope apparatus 1 and the measuring method according to the above-described first embodiment will be described.
  • The present modification 1 is different from the above-described first embodiment in that it includes a pattern generator 55A (refer to FIG. 1) instead of the pattern generator 55. The pattern generator 55A is not capable of projecting light and dark patterns with different phases; however, it is configured so that a light and dark pattern with a specific phase can be projected onto a subject. That is, the pattern generator 55A of the present modification 1 does not include an actuator that moves the slit plate or the like, and is configured in a small size.
  • In the present modification 1, the measuring method of the three-dimensional shape measurement of a subject is also different. The measuring method of the present modification 1 will be described below, mainly regarding the points of processing that differ from the above-described first embodiment.
  • In the measuring method of the present modification 1, the number N of images scheduled to be captured in Step S4 is 1; one pattern projection image is captured without repeating Steps S4 to S6 of the above-described embodiment, and the processing proceeds to Step S7.
  • Additionally, the analysis method of the three-dimensional shape measurement in Step S12 and Step S15 is also different from that of the above-described first embodiment. In the present modification 1, the three-dimensional shape is analyzed in Step S12 and Step S15 by a space phase shift method or a Fourier transform method, using the single pattern projection image.
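The Fourier transform method can be sketched for a single fringe row: isolate the band around the positive carrier peak of the spectrum, inverse-transform to obtain the analytic signal, and take its angle. The band limits and carrier handling below are deliberately simplistic assumptions for illustration, not the patent's processing.

```python
import numpy as np

def fourier_phase_row(row, carrier_bin):
    """Wrapped phase of one fringe row by the Fourier transform method.

    Keeps only a band around the positive carrier peak, inverse-transforms
    to get the analytic signal, and takes its angle. The returned phase
    still contains the linear carrier term 2*pi*carrier_bin*x/n, which
    must be subtracted to leave only the shape-induced modulation.
    """
    n = len(row)
    spectrum = np.fft.fft(row)
    band = np.zeros(n)
    band[carrier_bin // 2 : 3 * carrier_bin // 2] = 1.0   # crude band-pass
    analytic = np.fft.ifft(spectrum * band)
    return np.angle(analytic)
```

Because only one image is needed, this approach trades some robustness (the band-pass can leak or clip detail) for the shorter capture time the modification aims at.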
  • In the present modification 1, the three-dimensional shape is analyzed using a single pattern projection image. Thus, as compared to the case of the above-described first embodiment in which N fringe images are acquired, the time from the start of image capturing until an analysis result is obtained can be shortened.
  • The measuring method of the present modification 1 is similarly applicable to a configuration having the pattern generator 55 including the actuator that moves the slit plate or the like, and can analyze a three-dimensional shape more rapidly than the time phase shift method using a plurality of pattern projection images.
  • Modification 2 of the First Embodiment
  • Next, a modification 2 of the endoscope apparatus 1 and the measuring method according to the above-described first embodiment will be described. The present modification 2 includes the pattern generator 55A (refer to FIG. 1), and the pattern projection section 50 is configured to be capable of projecting a single light or dark linear pattern as shown in FIG. 9 onto a subject. FIG. 9 shows a case where a stripe-shaped (straight) dark portion R2 is projected into a light portion R1. Alternatively, one stripe-shaped light portion R1 may be projected into the dark portion R2.
  • The pattern projected from the pattern projection section 50 is neither moved in projection position or direction nor deformed in shape.
  • That is, the pattern generator 55A of the present modification 2 does not include the actuator that moves the slit plate or the like, and is configured in a small size.
  • In the present modification 2, the measuring method of the three-dimensional shape measurement of a subject is also different. The measuring method of the present modification 2 will be described below, mainly regarding the points of processing that differ from the above-described modification 1 of the first embodiment.
  • In the present modification 2, the three-dimensional shape is analyzed in Step S12 and Step S15 by an optical cutting method, using a single pattern projection image. In the present modification 2, the three-dimensional shape is analyzed on one pattern in the single pattern projection image. Therefore, as compared to the case of the above-described first embodiment in which the entire surface of a pattern projection image is analyzed, the portion in which the three-dimensional shape measurement can be performed is limited, but the analysis time can be significantly shortened.
  • The measuring method of the present modification 2 is similarly applicable even if the pattern generator 55 includes the actuator that moves the slit plate or the like. By using a plurality of pattern projection images, the three-dimensional shape can be rapidly analyzed not only in one portion of the visual field range (on the screen) but also in a plurality of different portions (positions).
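The triangulation underlying an optical cutting (light-section) computation can be sketched under a simplified parallel-axis model, in which the lateral shift of the projected line in the image acts like a stereo disparity. The model, units, and function name are illustrative assumptions, not the patent's calibration.

```python
def light_section_depth(shift_px, baseline_mm, focal_px):
    """Depth of one point on the projected line by triangulation.

    Parallel-axis model: the lateral shift (in pixels) of the projected
    line from its reference position relates to depth as
    Z = baseline * focal / shift, as with stereo disparity.
    """
    if shift_px <= 0:
        raise ValueError("line shift must be positive")
    return baseline_mm * focal_px / shift_px
```

Scanning along the detected line and applying this per point yields a depth profile for the single stripe, which is exactly the limited-but-fast measurement the modification describes.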
  • Modification 3 of the First Embodiment
  • Next, a modification 3 of the endoscope apparatus 1 according to the above-described first embodiment will be described.
  • The present modification 3 does not include the second light source 51, but includes switching means that makes the light emitted from the first light source 41 incident on the second fiber bundle 53.
  • As the switching means, for example, devices, such as a MEMS mirror module, which switch an optical path for the light emitted from the first light source 41 to a plurality of directions, can be adopted.
  • Even in such a configuration, the same effects as those of the endoscope apparatus 1 described in the above-described first embodiment are exhibited. Additionally, since only one light source is required, the number of parts of the endoscope apparatus 1 can be reduced.
  • Modification 4 of the First Embodiment
  • Next, a modification 4 of the endoscope apparatus 1 according to the above-described first embodiment will be described.
  • In the present modification 4, the configuration of the tip surface 10 a of the endoscope apparatus 1 is different from that of the above-described first embodiment.
  • FIGS. 4 to 6 are views showing the configuration of the direct-view-type insertion section 10 including the illumination window 12, the projection window 13, and the like in the tip surface 10 a.
  • As shown in FIGS. 4 to 6, there are various embodiments in the configuration of respective elements in the tip surface 10 a of the insertion section 10 of the endoscope apparatus 1.
  • For example, as shown in FIG. 4, the objective optical system 32 is arranged on a central axis O of the insertion section 10. The illumination window 12 is provided so as to surround the objective optical system 32 over half of the outer periphery of the objective optical system 32. The projection window 13 is arranged opposite to the objective optical system 32 with respect to the illumination window 12. In such an arrangement, the area occupied by the illumination window 12 can be increased. Additionally, the shape of the objective optical system 32 is generally a circle or a shape close to a circle. Therefore, by arranging the illumination window 12 and the projection window 13 around the objective optical system 32, the illumination window and the projection window can be efficiently arranged at the tip portion of the endoscope, which has a limited arrangement area, and the diameter of the tip portion of the endoscope is easily reduced. Moreover, since the center of an endoscope image and the central axis O of the insertion section 10 coincide with each other, an operator can insert the endoscope without a sense of discomfort while observing the image of the subject on the monitor.
  • The pattern generator 55 is provided on the depth side of the projection window 13. The pattern generator 55 is arranged so that its linear pattern extends perpendicularly to the direction in which the projection window 13 and the objective optical system are arranged. Such an arrangement secures the longest possible distance (hereinafter referred to as a base length) from the central point of the objective optical system measured perpendicularly to the linear pattern, while the projection window and the objective optical system are arranged at the closest interval. Since measurement precision improves as the base length becomes longer, according to the present modification example, the three-dimensional shape can be measured with high precision even in an endoscope apparatus in which the diameter of the insertion section is reduced.
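The effect of base length on measurement precision can be illustrated with a simple triangulation sketch. This is a hedged illustration only: the pinhole-camera simplification, the numerical values, and the function names are assumptions for exposition and are not part of the disclosed apparatus.

```python
def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Pinhole-model triangulation: z = f * b / d."""
    return focal_px * baseline_mm / disparity_px

def depth_error_mm(baseline_mm, focal_px, disparity_px, disparity_err_px=0.5):
    """Depth uncertainty for a given disparity error: dz ~= z**2 / (f * b) * dd."""
    z = depth_from_disparity(baseline_mm, focal_px, disparity_px)
    return z * z / (focal_px * baseline_mm) * disparity_err_px

# Same subject distance (50 mm in this model), but the base length is doubled:
err_short = depth_error_mm(baseline_mm=1.0, focal_px=500, disparity_px=10)
err_long = depth_error_mm(baseline_mm=2.0, focal_px=500, disparity_px=20)
# err_long is half of err_short: a longer base length improves depth precision.
```

In this model, doubling the base length at the same subject distance halves the depth uncertainty, which is why the arrangement above maximizes the base length within the limited tip diameter.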
  • As shown in FIG. 5, the objective optical system 32 is arranged at a position eccentric from the central axis O of the insertion section 10 unlike the arrangement shown in FIG. 4. As shown in FIG. 5, it is also possible to provide an arrangement in which the illumination windows 12 are provided in two places between which the objective optical system 32 and the projection window 13 are sandwiched. The objective optical system 32 is arranged so that an optical axis on the emission side where the reflected light within the observation visual field is directed to the imager 31 from the objective optical system 32 is parallel to and eccentric from the central axis O.
  • As shown in FIG. 6, in the tip surface 10 a of the insertion section 10, the opening 11 in which the objective optical system 32 is arranged, the illumination window 12, and the projection window 13 may be arranged at positions eccentric from the central axis O of the insertion section 10. Additionally, a vertical axis P1 and a left-right axis Q1 that pass through the optical axis L of the objective optical system 32 may be arranged at positions where these axes do not overlap a vertical axis P2 and a left-right axis Q2 that pass through the central axis of the insertion section 10.
  • Since the opening 11, the illumination window 12, and the projection window 13 are provided at the positions eccentric from the central axis O of the insertion section 10, the diameter of the insertion section 10 can be further reduced, for example, as compared to a related-art endoscope apparatus in which the objective optical system 32 is provided on the central axis O of the insertion section 10.
  • Modification 5 of the First Embodiment
  • Next, a modification 5 of the endoscope apparatus 1 according to the above-described first embodiment will be described.
  • In the present modification 5, the first light source 41 and the second light source 51 are arranged in the vicinity of the tip of the insertion section 10.
  • For example, as shown in FIG. 7, in the present modification 5, the first fiber bundle 43 is not included, the light from the first light source 41 is directly irradiated toward the illumination window 12, the second fiber bundle 53 is not included, and the light from the second light source 51 is directly irradiated toward the fringe pattern generator 55.
  • As shown in FIG. 8, it is also possible to adopt a configuration having the first light source 41, the second light source 51, and the imager 31 in the vicinity of the tip of the insertion section 10 and having an optical adapter 10A that is attachable to and detachable from a tip portion of the insertion section 10.
  • Portions of the illumination window 12, the projection window 13, and the objective optical system 32 are accommodated in the optical adapter 10A. Additionally, a tip surface 10 a 1 of the optical adapter 10A corresponds to the tip surface 10 a of the insertion section 10 in the above-described first embodiment.
  • The first light source 41 and the illumination window 12 are connected together by an optical fiber 43A arranged within the optical adapter 10A. Additionally, the second light source 51 and the pattern generator 55 are connected together by an optical fiber 53A arranged within the optical adapter 10A.
  • Even in configurations as shown in FIGS. 7 and 8, the same effects as those described in the above-described first embodiment are exhibited.
  • Since the first light source 41 and the second light source 51 are provided in the vicinity of the tip of the insertion section 10, when the insertion section 10 has, for example, a length exceeding several tens of meters, there is little loss of light and a brighter image can be acquired, compared to a case where the first fiber bundle 43 and the second fiber bundle 53 are used.
  • Modification 6 of the First Embodiment
  • Next, a modification 6 of the endoscope apparatus 1 according to the above-described first embodiment will be described.
  • FIG. 10 shows another arrangement in which the arrangement of FIG. 6 is further modified. This modification is an example of an endoscope apparatus of a configuration having the optical adapter 10A. FIG. 10 is a view of the tip surface of the optical adapter.
  • The objective optical system 32 is arranged on the central axis O of the insertion section 10, and the illumination window 12 and the projection window 13 are respectively arranged on both sides of the objective optical system 32. Contact pins 14 used to supply electric power to the first light source or the second light source from the body are provided on the back of the optical adapter 10A.
  • A positioning groove 15, or a structure in place of the positioning groove, is provided on the back of the optical adapter 10A in order to perform positioning in the rotational direction about the central axis of the insertion section when the optical adapter 10A is attached to the tip portion of the insertion section 10.
  • Such contact pins 14 and positioning groove 15 are respectively provided on sides where the illumination window 12 and the projection window 13 are not arranged, with respect to the objective optical system 32. This allows the contact pins 14, the positioning groove 15, the illumination window 12, and the projection window 13 to be arranged in a small-diameter endoscope tip portion without interfering with each other, even in an optical adapter type.
  • Modification 7 of the First Embodiment
  • Next, a modification 7 of the endoscope apparatus 1 according to the above-described first embodiment will be described.
  • FIG. 11 shows the modification 7 of the tip portion in the endoscope apparatus capable of performing observation in a perpendicular direction to the central axis of the insertion section.
  • In the present modification 7, in place of the tip surface 10 a, a tip surface 10 b in which a straight line orthogonal to the central axis of the insertion section 10 becomes a normal line is formed in a portion of an outer peripheral surface of the tip portion of the insertion section 10. The illumination window 12, the projection window 13, and the cover member 32 a are all arranged at the tip surface 10 b.
  • The objective optical system 32 has a prism 16 in which an optical axis L1 on the incidence side is directed to a direction that intersects the optical axis L2 on the emission side that turns from the objective optical system 32 to the imager 31. In the present modification 7, the prism 16 is one of optical elements that configure the objective optical system 32.
  • The optical axis L1 on the incidence side is an optical axis when the reflected light within the observation visual field is incident on the prism 16, and the optical axis L2 on the emission side is an optical axis when the reflected light within the observation visual field is incident on the imager 31 from the prism 16.
  • In the present modification 7, the optical axis L1 on the incidence side is at a twisted position with respect to the central axis O of the insertion section 10. Moreover, the optical axis L2 on the emission side is parallel to the central axis O of the insertion section 10.
  • FIG. 12A and FIG. 12B are views of the tip surface 10 b in the endoscope of FIG. 11 as viewed from a direction perpendicular to the tip surface 10 b, and are plan views showing an arrangement example of the illumination window 12, the projection window 13, and the cover member 32 a.
  • As shown in FIGS. 11 and 12A, the tip surface 10 b is a substantially flat plane. As shown in FIG. 12A, in the present modification 7, the cover member 32 a and the projection window 13 are arranged on the central axis O of the insertion section 10 in plan view. Two illumination windows 12 are arranged on the sides of the cover member 32 a. Both the cover member 32 a and the projection window 13 are exposed to the tip surface 10 b that is the outer peripheral surface of the tip portion of the insertion section. In the present modification 7, a line obtained by projecting the central axis O perpendicularly onto the tip surface 10 b is defined as a virtual centerline PL. That is, in FIG. 12A, the cover member 32 a and the projection window 13 are arranged at positions where their centers intersect the virtual centerline PL. In this case, the projection window 13 has a positional relationship in which the center of the projection window, when the projection window 13 is viewed from its thickness direction, is present within a plane defined by the central axis O of the insertion section 10 and the optical axis L1 on the incidence side.
  • As shown in FIG. 12B, the cover member 32 a may be arranged at a position where the center thereof does not intersect the virtual centerline PL, the illumination window 12 may be arranged on the virtual centerline PL of the insertion section 10, and the projection window 13 may be arranged at a position where the center thereof does not intersect the virtual centerline PL.
  • At this time, the center of the cover member 32 a and the optical axis L1 of the prism 16 are arranged so as to coincide with each other, and the objective optical system 32 is arranged so that the optical axis L1 thereof does not intersect the virtual centerline PL.
  • FIG. 12C and FIG. 12D are respectively schematic views of the insertion section 10 as viewed from a direction shown by symbol D in FIG. 12A and FIG. 12B, and are front views of the insertion section 10. As shown in FIG. 12C and FIG. 12D, in the present modification 7, the objective optical system 32 and the imager 31 are arranged so that the optical axis L2 when the reflected light within the observation visual field is incident on the imager 31 from the prism 16 and the central axis O of the insertion section 10 are eccentric from each other.
  • As shown in FIG. 11 and FIG. 12A to FIG. 12D, even in the example of the endoscope that performs observation in the lateral direction, the diameter of the tip portion is easily reduced because the number of projection windows is one, compared to the case where projection windows 13 used for projecting a light and dark pattern are provided in a plurality of places as in the related art.
  • The shapes and arrangement positions of the respective elements arranged at the tip surface are not limited to the examples of FIG. 12A and FIG. 12B. For example, although the case where the optical axis L1 and the optical axis L2 are orthogonal to each other is illustrated in the present modification example, a configuration may be adopted in which the optical axis L1 and the optical axis L2 intersect each other at an angle other than a right angle.
  • Modification 8 of the First Embodiment
  • Next, a modification example 8 of the endoscope apparatus 1 described in the above-described first embodiment will be described.
  • In the present modification example, the control operation performed by the main controller 22 is different from those of the above-described first embodiment and Modifications 1 to 7 of the first embodiment.
  • In the present modification 8, the second light source 51 is provided on the tip side of the insertion section 10 as shown in FIG. 7, and is, for example, a high-luminance light source such as a laser. In this case, in Step S5, a "fringe projection state" in which a suitable fringe is projected onto the subject from one place may be entered by ON-controlling the second light source 51 while the first light source 41 remains ON-controlled, on the basis of the command of the main controller 22. Additionally, when the captured images are the nth and (n+1)th images in the above-described Step S5, fringe images with different fringe luminance may be captured by controlling the second light source 51 to change the quantity of light, without changing the phase of the light and dark pattern.
  • Modification 9 of the First Embodiment
  • Next, a modification 9 of the endoscope apparatus 1 described in the above-described first embodiment will be described.
  • In the present modification 9, the control operation performed by the main controller 22 is different from those of the above-described first embodiment and Modifications 1 to 8 of the first embodiment.
  • In the present modification example, in the above-described Step S9, a bright field image captured before the N sheets of fringe images are captured and a bright field image captured after the N sheets of fringe images are captured are selected, and the total of the differences in luminance value between the two sheets of images is calculated.
  • Moreover, in the above-described Step S10, if the total of the differences in luminance value calculated in Step S9 is smaller than a threshold, it is determined that no deviation has occurred between a first image and the next image, and the processing proceeds to Step S11. On the contrary, when the total of the differences in luminance value calculated in Step S9 is larger than the threshold, it is determined that deviation has occurred between the first image and the next image. Since the deviation has occurred, a message showing that another capturing is required is displayed on the monitor 28 (Step S14), and the series of processing is ended.
  • The above example is one in which the differences are calculated over the entire image. Alternatively, the processing may be performed using only a certain portion of an image as the object. Additionally, the differences in luminance may be calculated using one sheet of a bright field image and one sheet of a fringe image.
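The luminance-difference test of Steps S9 and S10 can be sketched as follows. This is a minimal illustration under assumptions: the function names, the tiny 2x2 frames, and the threshold value are hypothetical, and a real implementation would operate on the images stored in the RAM 24.

```python
def total_luminance_difference(image_a, image_b):
    """Sum of absolute per-pixel luminance differences between two frames."""
    return sum(abs(a - b)
               for row_a, row_b in zip(image_a, image_b)
               for a, b in zip(row_a, row_b))

def deviation_occurred(image_a, image_b, threshold):
    """True when the total difference exceeds the threshold (Step S10 analogue)."""
    return total_luminance_difference(image_a, image_b) > threshold

before = [[10, 10], [10, 10]]
after_still = [[10, 11], [10, 10]]  # nearly identical frames: no deviation
after_moved = [[90, 10], [10, 90]]  # large change: relative movement suspected
```

Restricting the sum to a sub-region of the image, as the modification suggests, only changes which pixels the generator expression iterates over.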
  • Modification 10 of the First Embodiment
  • Next, a still further modification example of the endoscope apparatus 1 described in the above-described first embodiment will be described.
  • In the present modification 10, the control operation performed by the main controller 22 is different from those of the above-described first embodiment and Modifications 1 to 9.
  • In the present modification 10, the second light source 51 (refer to FIG. 7) is configured by a plurality of minute light-emitting elements. The plurality of light-emitting elements provided in the second light source 51 are lighting-controlled in two or more groups.
  • For example, even if the light and dark pattern generator 55 is a plate with simple slits that cannot arbitrarily change the phase of a fringe, or a plate similar thereto, a plurality of different fringes can be projected onto the subject in Step S5 by arranging the groups of light-emitting elements in the phase direction of the fringe pattern provided on the light and dark pattern generator 55 and switching the groups of light-emitting elements to be turned on in order. Moreover, these respective fringe images can be captured in Step S6.
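The group-switching scheme of this modification can be sketched as follows. This is an illustrative model only: the element count, the number of groups, and the function name are assumptions, and the interleaved layout stands in for whatever physical arrangement the light-emitting elements actually have.

```python
def lit_pattern(num_elements, num_groups, active_group):
    """ON/OFF state of each light-emitting element when one group is lit.

    Elements are interleaved along the phase direction of the fringe, so
    lighting successive groups shifts the projected fringe by one element
    pitch without moving any mechanical part.
    """
    return [i % num_groups == active_group for i in range(num_elements)]

# Three groups -> three fringes whose phases differ by one element pitch,
# captured one after another as in Steps S5 and S6.
frames = [lit_pattern(9, 3, g) for g in range(3)]
```

Cycling `active_group` through 0, 1, 2 plays the role that a phase-shifting pattern generator would otherwise perform.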
  • Although the first embodiment of the invention has been described above in detail with reference to the drawings, specific configuration is not limited to the embodiment, and design changes are also included without departing from the scope of the invention.
  • For example, although the example in which the two sheets of bright field images are selected as the images used to detect any deviation is shown in the above-described first embodiment, fringe images may be used as the images for detecting deviation. Additionally, more than two bright field images may be captured. If there are more than two bright field images, any deviation can be detected by selecting a required number of images from the bright field images as necessary.
  • Additionally, the elements shown in the above-described first embodiment and respective modifications can be suitably combined.
  • Second Embodiment
  • A measuring method of a second embodiment of the invention will be described below.
  • The measuring method of the present embodiment is a method of performing the three-dimensional shape measurement of a subject, using an endoscope apparatus.
  • First, the configuration of the endoscope apparatus 1 to which the measuring method of the present embodiment is applied will be described. FIG. 1 is a block diagram showing the configuration of the endoscope apparatus 1 of the present embodiment. FIG. 2 is a schematic view showing a light and dark pattern projected by the endoscope apparatus 1.
  • The configuration of the endoscope apparatus of the second embodiment is the same as the configuration of the endoscope apparatus of the first embodiment. Accordingly, the same elements as those of the first embodiment will be designated by the same reference numerals, and a detailed description thereof will be omitted here.
  • Next, the measuring method of the second embodiment of the present invention will be described through an example in which measurement is performed using the above-described endoscope apparatus 1.
  • Similar to the first embodiment, in the second embodiment, when the endoscope apparatus 1 is used, first, a user inserts the insertion section 10 into the inside of a subject, an access path to the subject, such as a conduit, or the like, and advances the tip of the insertion section 10 to a predetermined observation region. The user performs inspection or the like of the subject by switching to an observation mode where a desired region of the subject is observed and a measurement mode where the three-dimensional shape measurement of the region is performed, if necessary.
  • In the observation mode, the light source controller 21 receives the command of the main controller 22 to ON-control the first light source 41 and OFF-control the second light source 51. As a result, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field (hereinafter, this illumination state is referred to as an “observation state”). The image of the illuminated subject is formed on the imager 31 through the objective optical system 32. Video signals sent from the imager 31 are processed by the video processor 27 and displayed on the monitor 28. The user can observe the subject from the image of the subject displayed on the monitor 28, or save the image if necessary.
  • When switching is made from the observation mode to the measurement mode, the user inputs a mode switching instruction. A well-known input device can be used as an input device on which the mode switching instruction is input. For example, it is possible to adopt a configuration in which the operation section 23 is provided with a switch or a configuration in which the monitor 28 is changed to a touch panel so as to provide a software switch.
  • If a mode switching instruction from the observation mode to the measurement mode is input by the user, measurement image capturing processing (refer to FIG. 13) is started in the main controller 22.
  • In the measurement image capturing processing, first, it is determined whether or not the endoscope apparatus 1 is brought into the observation state (Step S1 shown in FIG. 13).
  • When it is determined in Step S1 that the endoscope apparatus has been brought into the observation state, the processing proceeds to Step S3, and when the endoscope apparatus is in a state other than the observation state in Step S1 (for example, a measurement state to be described below), the processing proceeds to Step S2.
  • Step S1 is ended by this.
  • Step S2 is a step where the endoscope apparatus 1 is switched to the observation state.
  • In Step S2, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • Step S2 is ended by this, and the processing proceeds to Step S3.
  • Step S3 is a step where a fringe pattern is not projected and the image of the subject illuminated with the white light from the illumination section 40 is captured.
  • In Step S3, an image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with the white light from the illumination section 40 (hereinafter, the image captured in the observation state is referred to as a “bright field image”).
  • The bright field image captured in Step S3 is temporarily stored in the RAM 24. Step S3 is ended by this, and the processing proceeds to Step S16.
  • Step S16 is a step where a predetermined fringe pattern is projected onto the subject from one place of the endoscope apparatus 1.
  • In Step S16, on the basis of the command of the main controller 22, the first light source 41 is OFF-controlled, and the second light source 51 is ON-controlled. Then, the white light irradiated from the illumination section 40 is turned off, and a fringe pattern is projected onto the subject from the pattern projection section 50. The fringe pattern projected onto the subject, as shown in FIG. 2, is a pattern in which a light portion R1 by a white light source and a dark portion R2 shaded by the pattern generator 55 are alternately arranged. (Hereinafter, this state is referred to as a “pattern projection state”).
  • Step S16 is ended by this, and the processing proceeds to Step S17.
  • Step S17 is a step where a pattern projection image is captured in the pattern projection state.
  • In Step S17, the fringe pattern projected onto the subject is a pattern that has changed according to the three-dimensional shape of the subject. In this state, one sheet of an image is acquired by the imager 31 of the imaging section 30 (hereinafter, the image captured in the pattern projection state is referred to as a “pattern projection image”).
  • The pattern projection image captured in Step S17 is temporarily stored in the RAM 24.
  • Step S17 is ended by this, and the processing proceeds to Step S18.
  • Step S18 is a step where the endoscope apparatus 1 is switched to the observation state.
  • In Step S18, the first light source 41 is ON-controlled, and the second light source 51 is OFF-controlled. Accordingly, a fringe pattern is not projected from the pattern projection section 50 and white light is irradiated to the observation visual field from the illumination section 40 to illuminate the observation visual field.
  • Step S18 is ended by this, and the processing proceeds to Step S19.
  • Step S19 is a step where a fringe pattern is not projected and the image of the subject illuminated with white light from the illumination section 40 is captured.
  • In Step S19, a bright field image is acquired by the imager 31 of the imaging section 30 in a state where the subject is illuminated with white light from the illumination section 40.
  • The bright field image captured in Step S19 is temporarily stored in the RAM 24.
  • Step S19 is ended by this, and the processing proceeds to Step S20.
  • Step S20 is a step where the relative movement (hereinafter referred to as “deviation”) between the insertion section 10 and the subject from Step S3 to Step S19 is detected on the basis of the images (the bright field image and the pattern projection image) captured from Step S3 to Step S19.
  • In Step S20, first, two sheets of images are selected from at least any of the bright field image and the pattern projection image that are stored in the RAM 24. For example, in the second embodiment, a bright field image captured before one sheet of a pattern projection image is captured, and a bright field image captured after the one sheet of pattern projection image is captured are selected.
  • Subsequently, the same feature point is detected from the two sheets of selected images, and the coordinates of the feature point in the two sheets of images are calculated.
  • Step S20 is ended by this, and the processing proceeds to Step S21.
  • Step S21 is a step where the deviation of the two images is determined using the feature point detected in Step S20 and the processing branches.
  • In Step S21, if the coordinates of the feature point in the two sheets of images are the same coordinates in the respective images, it is determined that no deviation occurs between a first image and the next image, and the processing proceeds to Step S22. On the contrary, if the coordinates of the feature point in the two sheets of images are different coordinates in the respective images, it is determined that deviation has occurred between the first image and the next image. Since the deviation has occurred, a message showing that another capturing is required is displayed on the monitor 28 (Step S25), and a series of processing is ended.
  • Step S21 is ended by this.
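The feature-point comparison of Steps S20 and S21 can be sketched as follows. This is a hedged illustration: the function name, the coordinate values, and the tolerance parameter are assumptions, and a practical system would first detect and match the feature points (e.g. with a corner detector) rather than receive their coordinates directly.

```python
def deviation_between(points_a, points_b, tolerance=0.0):
    """Compare coordinates of the same feature points in two images.

    points_a, points_b: lists of (x, y) coordinates of matched features.
    Returns True when any matched point has moved by more than `tolerance`.
    """
    for (xa, ya), (xb, yb) in zip(points_a, points_b):
        if ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 > tolerance:
            return True
    return False

# Same coordinates in both images: no deviation, proceed to Step S22.
assert deviation_between([(12, 34)], [(12, 34)]) is False
# The feature point moved: deviation, prompt another capture (Step S25).
assert deviation_between([(12, 34)], [(15, 30)]) is True
```

A nonzero `tolerance` would let small, sub-pixel coordinate differences pass as "no deviation".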
  • Step S22 is a step where the user is made to select whether three-dimensional measurement using the captured pattern projection image is performed now or later.
  • In Step S22, for example, an inquiry of “Perform measurement?” or the like is displayed on the monitor 28, and the user is urged to make an input on whether or not the three-dimensional measurement using the captured pattern projection image is performed.
  • When there is an input indicating that the measurement is to be performed, the processing proceeds to Step S23.
  • When there is an input indicating that the measurement is not to be performed now, the processing proceeds to Step S26.
  • Step S22 is ended by this.
  • Step S23 is a step where analysis is performed for the three-dimensional measurement.
  • In Step S23, the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24. For example, in the second embodiment, the three-dimensional shape of the subject is analyzed, for example, by the well-known spatial phase shift method or Fourier transform method, using one sheet of a pattern projection image.
  • The analysis result of the three-dimensional shape is generated as a text file or a binary file, and is saved together with the pattern projection image in the auxiliary storage device 25. In addition, Step S23 may be performed as background processing of Step S22 simultaneously with the start of Step S22.
  • Step S23 is ended by this, and the processing proceeds to Step S24.
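A reduced, single-scan-line sketch of the Fourier-transform analysis mentioned in Step S23 follows (Fourier-transform profilometry). This is illustrative only: the synthetic fringe, the carrier frequency, and the filter half-width are assumptions; in practice the carrier and the phase-to-height conversion come from calibration of the apparatus.

```python
import numpy as np

def phase_from_fringe_row(row, carrier_bin, half_width=2):
    """Recover the wrapped phase of a sinusoidal fringe from one scan line.

    row: 1-D intensity samples containing a fringe of known carrier frequency.
    carrier_bin: index of the carrier peak in the FFT of `row`.
    """
    spectrum = np.fft.fft(row)
    # Keep only a band around the positive carrier peak; this removes the
    # background (DC) term and the negative-frequency mirror component.
    band = np.zeros_like(spectrum)
    lo, hi = carrier_bin - half_width, carrier_bin + half_width + 1
    band[lo:hi] = spectrum[lo:hi]
    analytic = np.fft.ifft(band)
    return np.angle(analytic)  # wrapped phase; unwrap, then convert to height

# Synthetic fringe: 8 cycles over 256 samples with a constant phase offset.
n, cycles, offset = 256, 8, 0.7
x = np.arange(n)
row = 1.0 + 0.5 * np.cos(2 * np.pi * cycles * x / n + offset)
phase = phase_from_fringe_row(row, carrier_bin=cycles)
```

Because only one captured image is needed, this kind of analysis supports the short measurement time described for the second embodiment.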
  • Step S24 is a step where the display on the monitor 28 is shifted to a screen of various measurement modes, and a measurement result is displayed on the monitor 28, using the information saved in Step S23.
  • In Step S24, the three-dimensional shape of the subject is displayed on the monitor 28 by overlaying the result analyzed in Step S23 on the bright field image acquired in Step S3 (or the bright field image acquired in Step S19). This enables the user to know the three-dimensional shape of the subject.
  • Step S24 is ended by this, and a series of processes is ended.
  • Step S26 is a step that branches from the above Step S22, and is a step that performs information processing required to display the measurement result later.
  • In Step S26, similar to the above Step S23, the three-dimensional shape is analyzed on the basis of the pattern projection images stored in the RAM 24. For example, in the second embodiment, the three-dimensional shape of the subject is analyzed by the well-known spatial phase shift method or Fourier transform method, using one sheet of a pattern projection image.
  • Additionally, analysis results of the bright field image, the pattern projection image, and the three-dimensional shape and optical parameters used for the analysis are saved as binary files or text files, respectively, in the auxiliary storage device 25. In this case, by making portions of file names common or collectively saving these files in one directory (folder), these files are saved in the auxiliary storage device 25 so that the files can be collectively read later.
  • Step S26 is ended by this, and a series of processing is ended.
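The file-grouping convention described in Step S26 (common portions of file names, or one directory per measurement) can be sketched as follows. The directory layout, stem, and suffixes are hypothetical examples, not names used by the apparatus.

```python
import os
import tempfile

def save_measurement_set(directory, stem, files):
    """Save related analysis files with a common name stem so that they can
    be collectively read later (e.g. meas001.result.txt, meas001.phase.bin)."""
    os.makedirs(directory, exist_ok=True)
    paths = []
    for suffix, data in files.items():
        path = os.path.join(directory, stem + "." + suffix)
        with open(path, "wb") as f:
            f.write(data)
        paths.append(path)
    return paths

out_dir = tempfile.mkdtemp()
saved = save_measurement_set(out_dir, "meas001",
                             {"result.txt": b"analysis", "phase.bin": b"\x00\x01"})
```

Sharing the stem `meas001` lets a later viewing step collect the bright field image, the pattern projection image, and the analysis results with a single glob-style lookup.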
  • As described above, according to the measuring method of the second embodiment of the present invention, the three-dimensional shape of a subject can be measured on the basis of one sheet of a pattern projection image captured in a state where a predetermined fringe pattern is projected onto the subject. Thus, the three-dimensional shape can be measured in a short time using the endoscope apparatus 1.
  • According to the measuring method of the second embodiment of the present invention, even in an environment where a fringe pattern is projected from one projection window 13 in the endoscope apparatus 1 in which the diameter of the insertion section 10 is reduced, a three-dimensional shape can be measured with high precision.
  • Since deviation can be detected by capturing a bright field image in addition to one sheet of a pattern projection image and using two sheets of images selected from the pattern projection image and the bright field image, the measurement precision of a three-dimensional shape can be enhanced.
  • Since bright field images are respectively captured before and after one sheet of a pattern projection image is captured, and used for deviation detection, the absence and presence of deviation can be determined with high precision.
  • In the measuring method of the second embodiment of the present invention, deviation is detected using bright field images before and after a pattern projection image is captured, and a three-dimensional shape is analyzed when it is determined that there is no deviation. Thus, analysis is not performed with fringe patterns on a plurality of pattern projection images deviated. For this reason, the analysis precision of the three-dimensional shape can be enhanced. Moreover, the positional deviation when the measurement result using the pattern projection image is overlaid and displayed on the bright field images can also be reduced.
  • Modification of the Second Embodiment
  • Next, a modification of the measuring method according to the above-described second embodiment will be described.
  • In the present modification, the method of measuring the three-dimensional shape of a subject is different. The measuring method of the present modification will be described below, mainly regarding the points of processing that differ from the above-described second embodiment.
  • In the present modification example, the three-dimensional shape is analyzed in Step S23 and Step S26 by an optical cutting method, using one sheet of a pattern projection image. In the present modification, the three-dimensional shape is analyzed along one pattern line, using one sheet of a pattern projection image. Therefore, compared to the case where the entire surface of one sheet of a pattern projection image is analyzed as in the above-described embodiment, the portion where the three-dimensional shape can be measured is limited, but the analysis time can be significantly shortened.
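The light-section idea behind the optical cutting method can be sketched as follows. This is a hedged illustration under an assumed calibrated geometry: the helper names, the 2x4 toy image, and the baseline/focal/reference values are all hypothetical. One projected line is located in the image, and the lateral displacement of each detected position is converted to depth by triangulation.

```python
def stripe_positions(image):
    """Column index of the brightest pixel in each row (the projected line)."""
    return [max(range(len(row)), key=row.__getitem__) for row in image]

def depths_along_stripe(positions, baseline_mm, focal_px, reference_col):
    """Convert the stripe's lateral displacement into a depth per image row."""
    return [focal_px * baseline_mm / (reference_col - col)
            if col != reference_col else float('inf')
            for col in positions]

image = [
    [0, 9, 0, 0],  # stripe detected at column 1
    [0, 0, 9, 0],  # stripe detected at column 2 -> a different depth
]
cols = stripe_positions(image)                   # [1, 2]
depths = depths_along_stripe(cols, 1.0, 500, 3)  # [250.0, 500.0]
```

Only the rows crossed by the single projected line yield depth values, which is exactly the trade-off stated above: a limited measurable portion in exchange for a much shorter analysis time.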
  • Although the second embodiment of the present invention has been described above in detail with reference to the drawings, the specific configuration is not limited to the embodiment, and design changes that do not depart from the scope of the present invention are also included.
  • For example, the above-described second embodiment has been described using an example in which a single pattern projection image is captured. However, substantially a single pattern projection image may be acquired and used for analysis by capturing a plurality of pattern projection images and selecting the one in the best condition.
  • Although the above-described second embodiment shows an example in which two bright field images are selected as the images used to detect deviation, pattern projection images may also be used for this purpose. Additionally, more than two bright field images may be captured; in that case, deviation can be detected by selecting the required number of images from among them as necessary.
  • The elements shown in the above-described second embodiment and the respective modifications can be combined as appropriate.
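Claim 17 recites analyzing a single pattern projection image by a spatial phase shift method or a Fourier transform method. As background, the Fourier-transform variant can be sketched as follows: band-pass one carrier sideband of the fringe image in the frequency domain, then take the angle of the inverse transform to obtain a wrapped phase map that encodes the surface shape. The parameter names and the vertical-fringe geometry are illustrative assumptions:

```python
import numpy as np

def wrapped_phase(fringe_img, carrier_col, half_width):
    """Single-shot Fourier-transform fringe analysis: isolate the carrier
    lobe of a vertical fringe pattern in the frequency domain and return
    the wrapped phase map, which encodes surface height modulation."""
    F = np.fft.fft2(fringe_img)
    F = np.fft.fftshift(F, axes=1)                   # center the column spectrum
    mask = np.zeros_like(F)
    c = fringe_img.shape[1] // 2 + carrier_col       # carrier lobe position
    mask[:, c - half_width:c + half_width + 1] = 1   # band-pass one sideband
    analytic = np.fft.ifft2(np.fft.ifftshift(F * mask, axes=1))
    return np.angle(analytic)                        # wrapped to (-pi, pi]
```

The wrapped phase would still need unwrapping and calibration against the projection geometry before yielding metric depth; those steps are outside this sketch.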

Claims (21)

1. An endoscope apparatus that measures a subject using a pattern projection image of the subject on which a light and dark pattern of light is projected, the endoscope apparatus comprising:
an insertion section;
an imaging section that is provided at a tip portion of the insertion section to acquire the image of the subject;
an illumination section that emits illumination light that illuminates an observation visual field of the imaging section; and
a pattern projection section that projects the light and dark pattern onto the subject,
wherein a tip surface of the insertion section is provided with:
an objective optical system that forms the image of the subject on the imaging section;
one or more illumination windows through which the illumination light is emitted; and
a projection window through which the light and dark pattern is projected onto the subject from the pattern projection section;
the pattern projection section includes a pattern generator which generates the light and dark pattern; and
the light and dark pattern is a pattern with an intensity distribution in which a light portion and a dark portion are alternately arranged.
2. The endoscope apparatus according to claim 1,
wherein the objective optical system is arranged so that an optical axis on an emission side that is directed to the imaging section from the objective optical system among optical axes of the objective optical system is parallel to and eccentric from a central axis of the insertion section.
3. The endoscope apparatus according to claim 2,
wherein the objective optical system is a direct-view-type objective optical system in which both an optical axis on an incidence side and the optical axis on the emission side are parallel to the central axis, and
wherein the objective optical system is provided at the tip surface of the tip portion of the insertion section and is arranged at a position eccentric from the central axis.
4. The endoscope apparatus according to claim 3,
wherein the projection window is provided at the tip surface and is arranged at a position eccentric from the central axis of the insertion section.
5. The endoscope apparatus according to claim 2,
wherein the objective optical system is a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section and has an optical axis on an incidence side arranged at a twisted position with respect to the central axis.
6. The endoscope apparatus according to claim 5,
wherein the projection window is exposed to an outer peripheral surface of the tip portion of the insertion section, and
wherein a centerline extending in the thickness direction of the projection window through the center of the projection window when the projection window is viewed from the thickness direction of the projection window is arranged at a twisted position with respect to the central axis of the insertion section.
7. The endoscope apparatus according to claim 2,
wherein the objective optical system is a side-view-type objective optical system that is exposed to an outer peripheral surface of the tip portion of the insertion section, the objective optical system having an optical axis on an incidence side arranged to intersect the central axis of the insertion section, and
wherein the projection window is arranged in the outer peripheral surface of the tip portion of the insertion section so that the center of the projection window is present in a plane defined by the central axis of the insertion section and the optical axis on the incidence side when the projection window is viewed from the thickness direction of the projection window.
8. The endoscope apparatus according to claim 1,
wherein the pattern projection section includes one or more linear parallel patterns.
9. The endoscope apparatus according to claim 1,
wherein the pattern projection section includes a projecting light source, and a pattern generator that changes the intensity distribution of the light emitted from the projecting light source and generates the light and dark pattern.
10. The endoscope apparatus according to claim 9, further comprising:
an optical fiber that guides the light emitted from the projecting light source, to the pattern generator,
wherein the projecting light source is provided on a base end side of the insertion section, and
wherein the pattern generator is provided at the tip portion of the insertion section.
11. The endoscope apparatus according to claim 9,
wherein the projecting light source and the pattern generator are provided at the tip portion of the insertion section.
12. The endoscope apparatus according to claim 9, further comprising:
an optical fiber that guides the light and dark pattern emitted from the projecting light source and generated by the pattern generator to a tip side of the insertion section,
wherein the projecting light source and the pattern generator are provided on a base end side of the insertion section.
13. The endoscope apparatus according to claim 9, further comprising:
an optical adapter capable of being detachably mounted on the tip portion of the insertion section,
wherein the pattern generator is provided at the optical adapter.
14. The endoscope apparatus according to claim 13,
wherein the projecting light source is provided at the optical adapter.
15. The endoscope apparatus according to claim 1, further comprising:
switching means for switching between the light for projecting the light and dark pattern and the illumination light.
16. A measuring method that performs the three-dimensional shape measurement of a subject using an endoscope apparatus, the method comprising:
projecting a predetermined light and dark pattern onto the subject from one place of the endoscope apparatus;
imaging the portion of the subject onto which the light and dark pattern is projected, and acquiring at least one sheet of a pattern projection image; and
performing the three-dimensional shape measurement of the portion onto which the light and dark pattern is projected, using the pattern projection image.
17. A measuring method that measures a subject using a pattern projection image of the subject onto which a light and dark pattern of light is projected, the method comprising:
projecting the predetermined light and dark pattern onto the subject from one place of the endoscope apparatus;
imaging the portion of the subject onto which the light and dark pattern is projected, and acquiring one sheet of a pattern projection image; and
performing the three-dimensional shape measurement of the portion onto which the light and dark pattern is projected, according to the one sheet of pattern projection image, using a spatial phase shift method or a Fourier transform method.
18. The measuring method according to claim 17, further comprising:
acquiring at least one sheet of a bright field image of the portion onto which the light and dark pattern is projected, at least either before or after the one sheet of pattern projection image is acquired;
selecting at least two sheets of images from the one sheet of pattern projection image and the bright field images; and
detecting that the position of the endoscope apparatus has deviated when there is a positional deviation equal to or more than a predetermined amount between the two sheets of images.
19. The measuring method according to claim 18, further comprising:
acquiring at least one sheet of the bright field images before and after the one sheet of pattern projection image is acquired, respectively.
20. The measuring method according to claim 19,
wherein at least two sheets of images selected to detect that the position of the endoscope apparatus has deviated are selected from the bright field images.
21. An endoscope apparatus that measures a subject using a pattern projection image of the subject on which a light and dark pattern of light is projected, the endoscope apparatus comprising:
an insertion section;
imaging means for acquiring the image of the subject;
illumination means for emitting illumination light that illuminates an observation visual field of the imaging means; and
pattern projection means for projecting the light and dark pattern onto the subject,
wherein the insertion section includes:
means for forming the image of the subject on the imaging means;
means for allowing the illumination light to be emitted therethrough; and
means for projecting the light and dark pattern on to the subject from the pattern projection means,
wherein the pattern projection means includes a pattern generating means for generating the light and dark pattern; and
wherein the light and dark pattern is a pattern with an intensity distribution in which a bright portion and a dark portion are alternately arranged.
US14/061,530 2011-04-27 2013-10-23 Endoscope apparatus and measuring method Abandoned US20140052005A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/423,043 US10342459B2 (en) 2011-04-27 2017-02-02 Endoscope apparatus and measuring method
US16/427,001 US10898110B2 (en) 2011-04-27 2019-05-30 Endoscope apparatus and measuring method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2011-099890 2011-04-27
JP2011099889A JP5893264B2 (en) 2011-04-27 2011-04-27 Endoscope device
JP2011-099889 2011-04-27
JP2011099890A JP6032870B2 (en) 2011-04-27 2011-04-27 Measuring method
PCT/JP2012/060832 WO2012147679A1 (en) 2011-04-27 2012-04-23 Endoscopic device and measurement method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/060832 Continuation WO2012147679A1 (en) 2011-04-27 2012-04-23 Endoscopic device and measurement method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/423,043 Division US10342459B2 (en) 2011-04-27 2017-02-02 Endoscope apparatus and measuring method

Publications (1)

Publication Number Publication Date
US20140052005A1 true US20140052005A1 (en) 2014-02-20

Family

ID=47072198

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/061,530 Abandoned US20140052005A1 (en) 2011-04-27 2013-10-23 Endoscope apparatus and measuring method
US15/423,043 Active 2032-10-05 US10342459B2 (en) 2011-04-27 2017-02-02 Endoscope apparatus and measuring method
US16/427,001 Active US10898110B2 (en) 2011-04-27 2019-05-30 Endoscope apparatus and measuring method

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/423,043 Active 2032-10-05 US10342459B2 (en) 2011-04-27 2017-02-02 Endoscope apparatus and measuring method
US16/427,001 Active US10898110B2 (en) 2011-04-27 2019-05-30 Endoscope apparatus and measuring method

Country Status (3)

Country Link
US (3) US20140052005A1 (en)
EP (1) EP2689708B1 (en)
WO (1) WO2012147679A1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150238276A1 (en) * 2012-09-30 2015-08-27 M.S.T. Medical Surgery Technologies Ltd. Device and method for assisting laparoscopic surgery - directing and maneuvering articulating tool
WO2015151098A3 (en) * 2014-04-02 2015-12-30 M.S.T. Medical Surgery Technologies Ltd. An articulated structured light based-laparoscope
US9581802B2 (en) 2011-05-24 2017-02-28 Olympus Corporation Endoscope device, and measurement method
US9622644B2 (en) 2011-05-24 2017-04-18 Olympus Corporation Endoscope
US9757206B2 (en) 2011-08-21 2017-09-12 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery—rule based approach
US9757204B2 (en) 2011-08-21 2017-09-12 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery rule based approach
US9795282B2 (en) 2011-09-20 2017-10-24 M.S.T. Medical Surgery Technologies Ltd Device and method for maneuvering endoscope
US20180042466A1 (en) * 2016-08-12 2018-02-15 The Johns Hopkins University Compact endoscope design for three-dimensional surgical guidance
US9943372B2 (en) 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
WO2018171851A1 (en) 2017-03-20 2018-09-27 3Dintegrated Aps A 3d reconstruction system
US10342459B2 (en) 2011-04-27 2019-07-09 Olympus Corporation Endoscope apparatus and measuring method
US10866783B2 (en) 2011-08-21 2020-12-15 Transenterix Europe S.A.R.L. Vocally activated surgical control system
US10925465B2 (en) 2019-04-08 2021-02-23 Activ Surgical, Inc. Systems and methods for medical imaging
US10972675B2 (en) * 2017-06-12 2021-04-06 Olympus Corporation Endoscope system
US20210141597A1 (en) * 2011-08-21 2021-05-13 Transenterix Europe S.A.R.L. Vocally actuated surgical control system
US11020144B2 (en) 2015-07-21 2021-06-01 3Dintegrated Aps Minimally invasive surgery system
US11033182B2 (en) 2014-02-21 2021-06-15 3Dintegrated Aps Set comprising a surgical instrument
US11039734B2 (en) 2015-10-09 2021-06-22 3Dintegrated Aps Real time correlated depiction system of surgical tool
US11045081B2 (en) * 2017-06-12 2021-06-29 Olympus Corporation Endoscope system
US11070739B2 (en) 2017-06-12 2021-07-20 Olympus Corporation Endoscope system having a first light source for imaging a subject at different depths and a second light source having a wide band visible band
US11179218B2 (en) 2018-07-19 2021-11-23 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
US11324385B2 (en) 2017-06-12 2022-05-10 Olympus Corporation Endoscope system for processing second illumination image using image information other than image information about outermost surface side of subject among three image information from at least four images of first illumination images
US11331120B2 (en) 2015-07-21 2022-05-17 3Dintegrated Aps Cannula assembly kit
US11805988B2 (en) 2018-06-05 2023-11-07 Olympus Corporation Endoscope system
US11871906B2 (en) 2018-06-05 2024-01-16 Olympus Corporation Endoscope system

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6535020B2 (en) 2014-03-02 2019-06-26 ブイ.ティー.エム.(バーチャル テープ メジャー)テクノロジーズ リミテッド System for measuring 3D distance and dimensions of visible objects in endoscopic images
JP7048628B2 (en) 2016-11-28 2022-04-05 アダプティブエンドウ エルエルシー Endoscope with separable disposable shaft
WO2018144360A1 (en) * 2017-02-01 2018-08-09 Boston Scientific Scimed, Inc. Endoscope having multiple viewing directions
US11531112B2 (en) 2019-06-20 2022-12-20 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a hyperspectral, fluorescence, and laser mapping imaging system
US11931009B2 (en) 2019-06-20 2024-03-19 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a hyperspectral imaging system
US11903563B2 (en) 2019-06-20 2024-02-20 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11550057B2 (en) 2019-06-20 2023-01-10 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
USD1018844S1 (en) 2020-01-09 2024-03-19 Adaptivendo Llc Endoscope handle

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6464633B1 (en) * 1999-08-23 2002-10-15 Olympus Optical Co., Ltd. Light source device for endoscope using DMD

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0664243B2 (en) * 1986-04-30 1994-08-22 オリンパス光学工業株式会社 Endoscope
JPH07104493B2 (en) 1987-02-17 1995-11-13 オリンパス光学工業株式会社 Endoscope device
JPH07104491B2 (en) 1988-02-17 1995-11-13 株式会社東芝 Endoscope with measuring function
JPH0285706A (en) 1988-09-22 1990-03-27 Toshiba Corp Measurable endoscope
US4995396A (en) * 1988-12-08 1991-02-26 Olympus Optical Co., Ltd. Radioactive ray detecting endoscope
JPH02287311A (en) * 1989-04-28 1990-11-27 Toshiba Corp Endoscope device with measuring mechanism
JPH03128043A (en) 1989-10-16 1991-05-31 Toshiba Corp Shape measurement endoscope device
US5434669A (en) * 1990-10-23 1995-07-18 Olympus Optical Co., Ltd. Measuring interferometric endoscope having a laser radiation source
JPH0545132A (en) 1991-08-15 1993-02-23 Olympus Optical Co Ltd Endoscope device for measurement
US5436655A (en) * 1991-08-09 1995-07-25 Olympus Optical Co., Ltd. Endoscope apparatus for three dimensional measurement for scanning spot light to execute three dimensional measurement
JP3126065B2 (en) 1991-12-09 2001-01-22 オリンパス光学工業株式会社 Measurement endoscope device
JPH0961132A (en) 1995-08-28 1997-03-07 Olympus Optical Co Ltd Three-dimensional-shape measuring apparatus
JP3816624B2 (en) * 1997-02-28 2006-08-30 オリンパス株式会社 3D measuring device
JP3670789B2 (en) 1997-02-28 2005-07-13 オリンパス株式会社 3D shape measuring device
US6419626B1 (en) 1998-08-12 2002-07-16 Inbae Yoon Surgical instrument endoscope with CMOS image sensor and physical parameter sensor
DE10104483A1 (en) * 2001-01-31 2002-10-10 Forschungszentrum Fuer Medizin Device for 3D measurement of surfaces in especially organic hollow volumes has optical imaging channel(s), projection channel(s) in endoscope shaft, fed out of shaft laterally at distal end
US7385708B2 (en) 2002-06-07 2008-06-10 The University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
JP4229791B2 (en) 2003-09-19 2009-02-25 真 金子 Endoscope device
JP4916160B2 (en) * 2005-11-14 2012-04-11 オリンパス株式会社 Endoscope device
JP2007144024A (en) 2005-11-30 2007-06-14 National Univ Corp Shizuoka Univ Three-dimensional measurement endoscope using self-mixing laser
JP4751443B2 (en) 2006-03-07 2011-08-17 富士通株式会社 Imaging apparatus and imaging method
JP5436757B2 (en) 2007-03-20 2014-03-05 オリンパス株式会社 Fluorescence observation equipment
JP5049677B2 (en) * 2007-07-10 2012-10-17 京セラドキュメントソリューションズ株式会社 Waste toner collecting apparatus and image forming apparatus equipped with the same
JP2009019941A (en) * 2007-07-11 2009-01-29 Nikon Corp Shape measuring method
JP2009061014A (en) 2007-09-05 2009-03-26 Fujifilm Corp Hardness measuring apparatus, hardness measuring method and endoscopic system
US7821649B2 (en) 2008-03-05 2010-10-26 Ge Inspection Technologies, Lp Fringe projection system and method for a probe suitable for phase-shift analysis
US8107083B2 (en) * 2008-03-05 2012-01-31 General Electric Company System aspects for a probe system that utilizes structured-light
JP2009240621A (en) 2008-03-31 2009-10-22 Hoya Corp Endoscope apparatus
JP5073564B2 (en) 2008-04-15 2012-11-14 オリンパス株式会社 Endoscope device for measurement and program
US8334900B2 (en) 2008-07-21 2012-12-18 The Hong Kong University Of Science And Technology Apparatus and method of optical imaging for medical diagnosis
JP5127639B2 (en) 2008-09-10 2013-01-23 富士フイルム株式会社 Endoscope system and method of operating the same
EP2272417B1 (en) 2009-07-10 2016-11-09 GE Inspection Technologies, LP Fringe projection system for a probe suitable for phase-shift analysis
JP2011229850A (en) 2010-04-30 2011-11-17 Fujifilm Corp Endoscopic system, method and program
EP2689708B1 (en) 2011-04-27 2016-10-19 Olympus Corporation Endoscopic apparatus and measurement method
JP5830270B2 (en) 2011-05-24 2015-12-09 オリンパス株式会社 Endoscope apparatus and measuring method
JP5846763B2 (en) 2011-05-24 2016-01-20 オリンパス株式会社 Endoscope device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6464633B1 (en) * 1999-08-23 2002-10-15 Olympus Optical Co., Ltd. Light source device for endoscope using DMD

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9943372B2 (en) 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
US10898110B2 (en) 2011-04-27 2021-01-26 Olympus Corporation Endoscope apparatus and measuring method
US10342459B2 (en) 2011-04-27 2019-07-09 Olympus Corporation Endoscope apparatus and measuring method
US9581802B2 (en) 2011-05-24 2017-02-28 Olympus Corporation Endoscope device, and measurement method
US9622644B2 (en) 2011-05-24 2017-04-18 Olympus Corporation Endoscope
US10368721B2 (en) 2011-05-24 2019-08-06 Olympus Corporation Endoscope
US20210141597A1 (en) * 2011-08-21 2021-05-13 Transenterix Europe S.A.R.L. Vocally actuated surgical control system
US9937013B2 (en) 2011-08-21 2018-04-10 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery—rule based approach
US11561762B2 (en) * 2011-08-21 2023-01-24 Asensus Surgical Europe S.A.R.L. Vocally actuated surgical control system
US9757204B2 (en) 2011-08-21 2017-09-12 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery rule based approach
US9757206B2 (en) 2011-08-21 2017-09-12 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery—rule based approach
US10866783B2 (en) 2011-08-21 2020-12-15 Transenterix Europe S.A.R.L. Vocally activated surgical control system
US9795282B2 (en) 2011-09-20 2017-10-24 M.S.T. Medical Surgery Technologies Ltd Device and method for maneuvering endoscope
US20150238276A1 (en) * 2012-09-30 2015-08-27 M.S.T. Medical Surgery Technologies Ltd. Device and method for assisting laparoscopic surgery - directing and maneuvering articulating tool
US11033182B2 (en) 2014-02-21 2021-06-15 3Dintegrated Aps Set comprising a surgical instrument
US20170172382A1 (en) * 2014-04-02 2017-06-22 M.S.T. Medical Surgery Technologies Ltd An articulated structured light based-laparoscope
US11116383B2 (en) 2014-04-02 2021-09-14 Asensus Surgical Europe S.à.R.L. Articulated structured light based-laparoscope
WO2015151098A3 (en) * 2014-04-02 2015-12-30 M.S.T. Medical Surgery Technologies Ltd. An articulated structured light based-laparoscope
US11331120B2 (en) 2015-07-21 2022-05-17 3Dintegrated Aps Cannula assembly kit
US11020144B2 (en) 2015-07-21 2021-06-01 3Dintegrated Aps Minimally invasive surgery system
US11039734B2 (en) 2015-10-09 2021-06-22 3Dintegrated Aps Real time correlated depiction system of surgical tool
US20180042466A1 (en) * 2016-08-12 2018-02-15 The Johns Hopkins University Compact endoscope design for three-dimensional surgical guidance
WO2018171851A1 (en) 2017-03-20 2018-09-27 3Dintegrated Aps A 3d reconstruction system
US11045081B2 (en) * 2017-06-12 2021-06-29 Olympus Corporation Endoscope system
US11070739B2 (en) 2017-06-12 2021-07-20 Olympus Corporation Endoscope system having a first light source for imaging a subject at different depths and a second light source having a wide band visible band
US10972675B2 (en) * 2017-06-12 2021-04-06 Olympus Corporation Endoscope system
US11324385B2 (en) 2017-06-12 2022-05-10 Olympus Corporation Endoscope system for processing second illumination image using image information other than image information about outermost surface side of subject among three image information from at least four images of first illumination images
US11805988B2 (en) 2018-06-05 2023-11-07 Olympus Corporation Endoscope system
US11871906B2 (en) 2018-06-05 2024-01-16 Olympus Corporation Endoscope system
US11179218B2 (en) 2018-07-19 2021-11-23 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
US11857153B2 (en) 2018-07-19 2024-01-02 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
US10925465B2 (en) 2019-04-08 2021-02-23 Activ Surgical, Inc. Systems and methods for medical imaging
US11389051B2 (en) 2019-04-08 2022-07-19 Activ Surgical, Inc. Systems and methods for medical imaging
US11754828B2 (en) 2019-04-08 2023-09-12 Activ Surgical, Inc. Systems and methods for medical imaging

Also Published As

Publication number Publication date
US20170143237A1 (en) 2017-05-25
EP2689708A4 (en) 2014-06-18
EP2689708B1 (en) 2016-10-19
WO2012147679A1 (en) 2012-11-01
US10342459B2 (en) 2019-07-09
EP2689708A1 (en) 2014-01-29
US20190274591A1 (en) 2019-09-12
US10898110B2 (en) 2021-01-26

Similar Documents

Publication Publication Date Title
US10898110B2 (en) Endoscope apparatus and measuring method
US9451872B2 (en) Endoscope and image acquisition method
US10368721B2 (en) Endoscope
US9581802B2 (en) Endoscope device, and measurement method
JP4916160B2 (en) Endoscope device
JP5501848B2 (en) Digital microscope
JP2005013557A (en) Endoscope apparatus
JP2011237574A5 (en)
CN101762611A (en) Wiring pattern checking device
US20120071723A1 (en) Endoscope apparatus and measurement method
CN110021042B (en) Image inspection device and illumination device
JP5893264B2 (en) Endoscope device
JP6032870B2 (en) Measuring method
US10379331B2 (en) Magnifying observation apparatus
JP6574101B2 (en) Endoscope system
JP2018163107A (en) Lens meter
WO2018198908A1 (en) Hair observation method, phase difference microscope system, and preparation
US9757026B2 (en) Ophthalmic apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YOKOTA, MASAYOSHI;REEL/FRAME:031829/0940

Effective date: 20131028

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION