WO2000052643A1 - Endoscopic observation device - Google Patents

Endoscopic observation device

Info

Publication number
WO2000052643A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
class
points
information
Application number
PCT/CH2000/000096
Other languages
French (fr)
Inventor
Charles Baur
Gaëtan MARTI
Nicolas Chauvin
Original Assignee
2C3D Sa
Application filed by 2C3D Sa
Priority to EP00902532A, published as EP1155383A1
Publication of WO2000052643A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G06T 7/97 Determining parameters from multiple pictures
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing

Definitions

  • the present invention relates to a device for endoscopic observation of a three-dimensional (3D) field, intended in particular, but not exclusively, for surgery.
  • the object of the present invention is to allow determination, at high video frequency (that is to say at more than 25 Hz), of the distance separating a surgical tool from the various elements constituting the operating area, and thus to provide the surgeon with the spatial data necessary for the proper execution of operative gestures.
  • the endoscopic approach is a surgical technique of steadily growing importance.
  • the principle of this so-called "minimally invasive" approach consists in making short incisions through which, via a suitable device, the tools necessary for the intervention are introduced into the patient's body, together with a camera or endoscope providing on a screen the images which allow the operating field to be seen.
  • this technique has such potential to reduce operative trauma and, at the same time, the costs associated with the hospital stay that all branches of surgery are interested in it: digestive surgery, gynecology, ENT, cardiology, bone surgery, and so on.
  • a second approach is optical tracking, which consists of attaching to the instruments active elements (LEDs) that are monitored by cameras.
  • a third approach is passive tracking, which consists of mounting on the tools markers (generally three spheres) which a vision system follows. Crowding problems are certainly reduced, but they remain. Furthermore, it is not possible to completely eliminate the risk of losing position information owing to an obstruction of the camera-to-marker line of sight following, for example, a gesture by the personnel in the operating area.
  • the object of the present invention is to provide an endoscopic observation device free from the drawbacks and limitations of the devices currently known.
  • the device according to the invention is characterized in that it comprises:
  • tracking means for automatically identifying at least two points P1 and P2 common to the captured images, at least one of them being associated with one of said representations, and for producing information relating to the position of these points in three-dimensional space,
  • calculating means (26c) for determining, from said information, a value representative of the distance separating said points, and processing means (32) for transforming said representative value into signals, and
  • the two optics are rigidly joined to one another to form a stereoscopic endoscope. Moreover, they have parallel axes, spaced from one another by a distance D, equal focal lengths f and coplanar focal planes.
  • the information produced by the tracking means is then, for each of the two points, constituted by the coordinates xL, xR and y of its image in the focal plane of the corresponding optic, xL being the abscissa of the image of the point in the left image, xR the abscissa of the image of the point in the right image and y the ordinate of the image of the point in the left and right images.
  • the calculation means are advantageously arranged to carry out the operations of:
  • calculation of the coordinates XP1, YP1 and ZP1 of point P1 and XP2, YP2 and ZP2 of point P2 in a cyclopean space according to the formulas: X = (D / Δ) · (xL + xR), Y = D · y / Δ, Z = D · f / Δ, with Δ = xL − xR;
  • calculation of the differences XP1 − XP2, YP1 − YP2 and ZP1 − ZP2; and
  • determination, from these three differences, of the distance separating the two points in three-dimensional space by the formula: d12 = [(XP1 − XP2)² + (YP1 − YP2)² + (ZP1 − ZP2)²]^(1/2)
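As an illustration, the triangulation and distance computation described by these formulas can be sketched as follows in Python; the baseline D, the focal length f and the image coordinates are illustrative values, not taken from the patent:

```python
import numpy as np

# Illustrative stereo-rig constants (not taken from the patent):
D = 5.0e-3  # baseline between the two optical axes, in metres
f = 4.0e-3  # common focal length of the two optics, in metres

def cyclopean_coordinates(xL, xR, y):
    """Triangulate a point from its left/right image coordinates using the
    formulas X = (D/Δ)(xL + xR), Y = D·y/Δ, Z = D·f/Δ, with Δ = xL − xR."""
    delta = xL - xR  # disparity Δ
    if delta == 0:
        raise ValueError("zero disparity: point at infinity")
    return np.array([(D / delta) * (xL + xR),
                     D * y / delta,
                     D * f / delta])

def distance_3d(p1, p2):
    """Euclidean distance d12 between two triangulated points."""
    return float(np.linalg.norm(p1 - p2))

# Example: a mark on the organ (P1) and the tip of the tool (P2)
P1 = cyclopean_coordinates(xL=1.20e-3, xR=1.00e-3, y=0.50e-3)
P2 = cyclopean_coordinates(xL=0.90e-3, xR=0.65e-3, y=0.20e-3)
print(f"d12 = {distance_3d(P1, P2) * 1000:.1f} mm")
```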
  • the device according to the invention also has the following characteristics:
  • the means of communication include a video screen making it possible to display an image of the field observed;
  • the processing means are arranged to generate an image conveying said representative value and to superimpose it, on the video screen, on an image of the observed field; this signifying image can be, among many possible solutions, an index or an area presenting a color gradient;
  • a memory for storing the signals issued by the converters, and selection means, forming part of the processing electronics, for taking into consideration only part of the information issued by at least one of the converters, in order to reduce the volume of information processed,
  • correction means for processing the images so as to reduce the effect of camera aberrations
  • Figure 3 shows, in more detail, part of the structure of Figure 2;
  • FIGS. 5a, 6a, 7a and 8a represent class diagrams, respectively the main diagram of the application and the diagrams relating to acquisition, stereo and tool tracking, while FIGS. 5b, 6b, 7b and 8b are sequence diagrams corresponding to the class diagrams bearing the same number.
  • Figure 1 shows, very schematically, the means used, according to the invention, for a surgical operation using an endoscopic observation device.
  • the operation takes place in an operating field 10 internal to an organism and allows intervention on an organ 11 by means of a tool 12 introduced into the organism through an incision 14a.
  • the surface of the organ 11 is provided with marks 11a consisting, for example, of dots made with a biocompatible ink, of adhesive patches, or of certain parts of the organ itself which have a particular appearance.
  • Marks 12a, generally formed of colored dots, are advantageously placed on the tool 12, so as to facilitate its identification and, as will be explained below, the calculation of its position.
  • the observation of the operating field 10 is done by means of a conventional endoscope 16 with double optics, associated with a light source (not shown) and introduced into the organism by another incision 14b.
  • the endoscope 16 is connected to an image capture device 18 comprising two cameras 18a and 18b, called respectively the left camera L and the right camera R, which receive the light radiation captured by the two optics, so as to be able to process the images stereoscopically, as will be explained later.
  • the cameras 18a and 18b transform, in a conventional manner, the light radiation coming from the operating field 10 into an electrical signal which is applied, by two distinct channels L and R, to a processing electronics 20.
  • a switch 23 makes it possible to select one or the other of the cameras 18a and 18b or another image obtained after processing by the electronics 20.
  • the device according to the invention makes it possible to define the distance between two marks 11a and / or 12a visible in the operating field 10, but also between all points identifiable by their shape, their color, etc. Reference will now be made to FIG. 2 which represents the general structure of the device according to the invention.
  • Figure 2 shows the operating field 10 with the organ to be treated 11, the intervention tool 12, the endoscope 16, the image capture device 18, as well as the various modules constituting the processing electronics 20.
  • the field 10 is attached to a Cartesian frame of reference whose Z axis is parallel to the axes of the cameras.
  • the image capture device 18 comprises, in addition to the left camera 18a and the right camera 18b, an image synchronization circuit 18c and two analog-digital converters 18d and 18e.
  • the synchronization circuit 18c makes it possible to manage the images coming from the two cameras so that they are perfectly synchronized, which facilitates their comparison, as will be explained below, and improves the quality of the information relating to the third dimension.
  • the two analog-digital converters 18d and 18e transform the signals coming from the synchronization circuit 18c, of analog type, into signals of digital type.
  • the output of the device 18 is connected to the input of a digital video recorder 24, which can record all or part of the operation.
  • in a variant not shown in the drawing, it is also possible to use an analog recorder, which is then connected to the input of the converters 18d and 18e.
  • the output of the device 18 is also applied to the processing electronics 20 advantageously constituted by a computer in which the different modules are defined by a set of programs and subroutines described with reference to FIGS. 5 to 8.
  • This computer comprises conventional control means, not shown in the drawing, such as a keyboard and / or a mouse.
  • as a variant, the electronics 20 could be produced by means of separate electronic modules. This second solution, however, offers less flexibility of use.
  • the processing electronics 20 is arranged so as to process the signals from the two cameras in parallel. However, to keep the drawing legible, each of the modules or systems is shown only once.
  • the heart of the electronics 20 is constituted by a signal processing module 26, which includes object tracking systems 26a and 26b, intended to follow the possible movements of the objects (tools and organs) to which the marks 11a and 12a are linked, as well as a unit 26c for calculating the distance separating, in three-dimensional space, at least two points associated with these marks.
  • the latter are selected, for example by means of the computer mouse, from among a stored set of representations of objects likely to appear in the operating field. They are identified by the object tracking systems 26a and 26b.
  • the module 26 alone makes it possible to determine the position in space of points of interest identified by the operator using the marks 11a and 12a, as well as the distance separating them.
  • the electronics 20 further comprises:
  • an image acquisition module 28 comprising an interface 28a and a memory 28b; a correction module 30 comprising a filtering circuit 30a of the signals coming from at least one of the converters 18d and 18e, with a view to selecting a part thereof, and a circuit 30b for correcting the aberrations of the cameras; and a screen control module 32, which includes a system for combining images 32a and a screen control 32b.
  • the acquisition module 28 is connected, by its input, to the output of the image capture device 18. It makes it possible to store information and can thus play the role of a black box, recording in its memory 28b, whose content cannot be altered, all or part of the information relating to the progress of the operation. Its interface 28a makes it possible to transform the received information into a form compatible with the characteristics of the memory 28b.
  • the correction module 30 is connected, by its input, to the output of the acquisition module 28.
  • its filtering circuit 30a makes it possible to process the images to make them more readable. By applying a suitable algorithm, it is possible to keep only the outlines of the objects present in field 10, to represent only one of the colors of the images, to reinforce the contrast, etc. This gives a different view of the operating field and thus makes certain details easier to understand.
  • the function of the correction circuit 30b is to correct the aberrations of the endoscope optics. This correction is essential in order to obtain a good image: the optics of endoscopes are very small, which results in strong distortion of the images. To overcome this drawback, a plane transformation is established which makes it possible to find correspondences between the points of the ideal image and those of the distorted image. This method is described in detail in "Digital Image Warping", George Wolberg, IEEE Computer Society Press Monograph, 1994.
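The patent does not specify the distortion model behind this plane transformation; the following minimal sketch assumes a one-coefficient radial model and corrects it by inverse mapping:

```python
import numpy as np

def undistort_radial(img, k1, cx, cy):
    """Correct a simple radial distortion by inverse mapping: for every pixel
    of the ideal image, look up the corresponding pixel of the distorted one.
    k1 is the (assumed) radial coefficient; (cx, cy) the optical centre."""
    h, w = img.shape[:2]
    yi, xi = np.mgrid[0:h, 0:w].astype(np.float64)
    x, y = xi - cx, yi - cy
    r2 = x * x + y * y
    # position, in the distorted image, of each ideal-image pixel
    xd = np.clip((x * (1 + k1 * r2) + cx).round().astype(int), 0, w - 1)
    yd = np.clip((y * (1 + k1 * r2) + cy).round().astype(int), 0, h - 1)
    return img[yd, xd]  # nearest-neighbour resampling, for brevity
```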
  • the signals available at the output of the module 30 thus have characteristics making it possible to display, on the video screen 22, information which is easier for the surgeon to interpret. These signals are introduced into the processing module 26 which will be described in more detail with reference to FIG. 3.
  • the control module 32 manages the information displayed on the video screen 22, combining or not the images of the operating field 10 with information concerning the position of the various objects present in the field.
  • the image-combining system 32a is called "multimage", by contraction of the words "multiple" and "image" (in English, "overlay"). Its input is connected to the output of module 26, and it processes the signals supplied by that module together with the signals produced by camera 18a or 18b.
  • the screen control 32b, connected to the output of the system 32a, is connected by its output to the video screen 22, which makes it possible to view the operating field 10 as well as information relating to the organs and tools, in particular information relating to the distances.
  • the indication of distances can be done in different ways.
  • the value of the Z coordinate of the distance between the tool 12 and a selected mark 11a can be displayed in digital form or, for example, by means of two V-shaped pointers. It is also possible to represent the distance between a given point, for example the end of the tool 12, and all or part of the field 10 by a color gradient, blue corresponding, for example, to very distant areas and red to contact.
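A minimal sketch of such a color-gradient rendering, assuming a per-pixel depth map in millimeters; the near/far bounds are illustrative:

```python
import numpy as np

def depth_to_gradient(depth_mm, near=0.0, far=100.0):
    """Map a depth image (in mm) to a red-to-blue gradient: red at contact,
    blue at `far` and beyond. Returns an (H, W, 3) uint8 RGB image."""
    t = np.clip((depth_mm - near) / (far - near), 0.0, 1.0)
    rgb = np.zeros(depth_mm.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = ((1.0 - t) * 255).astype(np.uint8)  # red when close
    rgb[..., 2] = (t * 255).astype(np.uint8)          # blue when far
    return rgb
```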
  • Figure 3 provides a better understanding of how this is done. It represents the detail of the calculation unit 26c, which comprises, connected in series, a geometric correction entity 260 intended to process the images in epipolar geometry, a contour detection entity 262, a correlation entity 264, a distance determination entity 266 and a filter 268.
  • the geometric correction entity 260 makes it possible to process the images in epipolar geometry. For a full understanding of this geometry, reference is advantageously made to the article by Zhengyou Zhang entitled "Determining the Epipolar Geometry and its Uncertainty: A Review", published in the International Journal of Computer Vision, 1998.
  • the contour detection entity 262 makes it possible to select, identify and follow the marks 11a and 12a, as well as particular areas of the objects located in the field 10. The device can thus recognize the various objects present in the field and follow them in successive images.
  • the means used to carry out this recognition are fully described in the publication by Kurt Konolige entitled "Small Vision Systems: Hardware and Implementation", published in the proceedings of the Eighth International Symposium on Robotics Research, Hayama, Japan, 1997. From the information obtained by the contour detection entity 262, it is possible to consider the same two points in the two images, to define the distance separating them in each of these images, and to take the difference between these two distances. This difference, called disparity, is determined by the correlation entity 264.
  • the filter 268 ensures the elimination of inaccuracies, for example by applying a spatial and temporal interpolation method such as that defined in INRIA research report No. 2013 (1993), entitled "Real-time correlation-based stereo: algorithm, implementations and applications", established by Olivier Faugeras.
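The exact interpolation method is the one of the INRIA report; as a crude stand-in, a spatial box filter followed by a temporal average over successive disparity maps could look like this:

```python
import numpy as np

def smooth_disparity(frames, spatial=3):
    """Spatial box filtering of each disparity map, then temporal averaging
    over the sequence; `frames` is a list of equally-sized 2D arrays."""
    k = spatial // 2
    smoothed = []
    for frame in frames:
        padded = np.pad(frame.astype(np.float64), k, mode="edge")
        acc = np.zeros_like(frame, dtype=np.float64)
        for dy in range(spatial):
            for dx in range(spatial):
                acc += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
        smoothed.append(acc / (spatial * spatial))
    return np.mean(smoothed, axis=0)  # temporal average
```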
  • FIG. 4 makes it possible to understand the manner of determining the coordinates of two points common to the images captured by the cameras, then of calculating the distance which separates them.
  • this figure shows two optics L (left) and R (right), with the same focal length f, whose optical axes are parallel to each other and separated by a distance D. In addition, their focal planes are coplanar. It also shows a point P belonging to field 10, whose position is to be determined.
  • a third frame of reference has axes X, Y and Z which define a space called cyclopean.
  • the X-Y plane is parallel to the xL / xR-y plane located at the front, at a distance equal to f.
  • the Z axis is parallel to the optical axes and arranged in the same plane, in the middle position.
  • the cameras 18a and 18b permanently capture the images of the operating field 10 through the endoscope 16. These images are synchronized by the circuit 18c and converted from analog to digital by the converters 18d and 18e.
  • the signals are then sent to the recorder 24, to store the operation, and to the image acquisition module 28, which makes it possible, on the one hand, to modify them by means of the interface 28a, adjusting for example the contrast, the brightness, etc., and, on the other hand, to store them in the tamper-proof memory 28b, which can act as a black box.
  • the signals thus obtained are applied to the correction module 30 which eliminates certain faults affecting the quality of the images.
  • the essential functions of the device according to the invention can be provided by software, advantageously written in an object-oriented language.
  • this software is schematically represented by means of a class diagram in FIG. 5a and a sequence diagram of the main loop in FIG. 5b. These diagrams follow the UML methodology described in "UML, the unified object modeling notation, application in Java" by Michel Lai, InterEditions 1997, ISBN 2-7296-0659-9, and can be exploited using the "Rational Rose" software offered by Rational Software.
  • in what follows, "image" will designate a set of signals which, duly processed, makes it possible to form an image on a screen.
  • "multimage" will designate an approach allowing images to be superimposed.
  • "tracking" will designate the part of the program which makes it possible to follow an organ and a tool as long as they are in field 10, in order to be able to determine the distance between them.
  • the software is structured into classes forming, through reference links, the application, which is itself a class and bears the reference 40. Each class is composed of operations and attributes. This diagram shows an acquisition class 41, which will be described in more detail with reference to FIG. 6.
  • the acquisition class 41 contains the operation called "new images (right: Image, left: Image)", which makes it possible to process the images coming from the cameras 18a and 18b. This operation acquires the left and right images synchronously, corrects them and stores them. It will be described more precisely with reference to FIG. 6.
  • the display class 42 contains the "display (input: Image)" operation which allows the display screen to be controlled from the signals coming from the different classes making up the application.
  • the conversion class 44 contains the operations:
  • "RGBenY8 (input: Image, output: Image)", and
  • "RGBenHLS (input: Image, output: Image)".
  • the image class 46 contains the information relating to an image. Its attributes are:
  • "format: FormatImage", which contains the information on the internal format of the image, that is:
  • the user interface class 48 contains the "modifyParameters (newParams: SystemParameters)" operation, which manages the interaction between the user and the various system input devices (keyboard, mouse, voice recognition, etc.). This operation modifies, on the operator's order, the parameters of the stereo, of organ tracking (in particular the identification of points to follow), of tool tracking and of the absolute and relative "multimages", which will be described later with reference to classes 50, 52, 54, 58 and 60.
  • the "multimage” class 50 contains the “calculate (tools: ListTools, brands: ListBrands, image entry: Image, Stereo input: Image, output: Image) operation. Two subclasses, called “relative multimage” 52 and “absolute multimage” 54, are derived from the class "multimage” 50.
  • Class 50 allows the calculation of the superposition of images in absolute or relative mode according to selected parameters, as will be explained below, and more particularly to ensure the mixing of 2D type images, corresponding to a usual vision of the field 10 and to add representative images of the third dimension.
  • relative "multimage” the distances are defined between two marks present in the field 10
  • absolute "multimage” the distances are defined relative to the endoscope.
  • the relative "multimage” subclass 52 contains the "calculate (tools:
  • Image, output: Image "which calculates the superposition of the images in relative mode according to the related parameters. Its attributes "ParametersMultimageRel” parameters which are the choice of the type and the variables relating to this type.
  • This class it is possible to superimpose additional information on the image of the field by creating one or more hollow and virtual spheres centered on the end of the tools, for example, and by representing the geometric location of their intersections with the or the organs visible in the field 10 by a modification of the tone and / or the saturation of the corresponding pixels on the image.
  • the end of the tool can also be provided with a virtual light oriented in the extension thereof, which modifies the brightness, the tone or the saturation of the field 10. It is also possible to display the distance between a tool and a mark defined by the user by means of digital information, cursors or any other means, such as a color gradient.
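A sketch of the virtual-sphere overlay described above, assuming a per-pixel map of 3D scene coordinates from the stereo stage; the hue range (0-179, OpenCV convention) and the 90-step hue shift are arbitrary choices:

```python
import numpy as np

def sphere_intersection_overlay(points_3d, hue, tool_tip, radius, tol=0.5):
    """Shift the hue of the pixels whose 3D points lie on a virtual sphere of
    the given radius centred on the tool tip. points_3d: (H, W, 3) map of
    scene coordinates; hue: (H, W) hue channel; tool_tip: 3-vector."""
    d = np.linalg.norm(points_3d - np.asarray(tool_tip), axis=-1)
    on_sphere = np.abs(d - radius) < tol          # shell of thickness 2·tol
    out = hue.astype(np.int32)
    out[on_sphere] = (out[on_sphere] + 90) % 180  # arbitrary 90-step hue shift
    return out.astype(hue.dtype)
```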
  • the absolute "multimage” class 54 contains the “calculate (tools: List of Tools, brands: List of Brands, image input: Image, Stereo input: Image, output: Image)” operation which allows to calculate the superposition of images in absolute mode as defined. above. Its attributes are the “ParametersMultimageAbs” parameters which are the position and resolution of the "multimage” as well as the color gradient.
  • the stereo class 56 will be described in more detail with reference to FIG. 7. It can however already be specified that it contains the operations of:
  • Class 56 has for attributes the "StereoParameters" parameters which are the following:
  • the "follow-up" class of organs 58 contains the operations "addBrandAfollow (entry: Image, brand: Position2D)” and “followBrand (entry: Image): listBrand", which allow the user to add marks 11a to follow . Its attribute is the “ParametersSuriviOrg” parameters which are as follows:
  • the "tracking" class of tools 60 will be described in more detail with reference to FIG. 8. It can however already be noted that it contains the operation: “search (entry: Image): listTools” having the function of searching the tools in the picture. Its attribute is the parameters: “ParametersTrackingOut” which are the following: - Maximum number of tools in the field, and
  • the sequence shown begins with the acquisition of a new image, defined by the operation "nouvellesImages (Image, Image)" contained in the acquisition class 41.
  • the operation "RGBenY8 (Image, Image)" converts the received images to Y8 format, then the "reduce (float, Image, Image)" operation makes it possible to choose the resolution of the image.
  • these two formatting operations, carried out with a view to subsequent processing, are contained in the conversion class 44.
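A minimal sketch of these two formatting operations; the BT.601 luma weights and the subsampling scheme are assumptions, since the patent does not specify them:

```python
import numpy as np

def rgb_to_y8(rgb):
    """Convert an (H, W, 3) RGB image to 8-bit luminance (Y8), using the
    standard ITU-R BT.601 weights as an assumption."""
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return y.astype(np.uint8)

def reduce_resolution(factor, img):
    """Choose the resolution of the image by integer subsampling,
    e.g. factor=0.5 keeps every second pixel in each direction."""
    step = max(1, int(round(1.0 / factor)))
    return img[::step, ::step]
```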
  • the stereo class 56 makes it possible, through the "calculate (Image, Image, Image)" operation, to calculate a depth image and, from it, the distances between the different marks present in field 10.
  • the operations "followMark (Image)" and "search (Image)", contained respectively in the organ "tracking" class 58 and the tool "tracking" class 60, allow the tracking of the marks in the field, associated respectively with the organs and the tools therein.
  • the operation "display (Image)", contained in the display class 42, transmits to the screen the information allowing the field 10 covered by the cameras 18a and 18b to be seen, as well as the information relating to the distances.
  • the acquisition class 41 itself contains the classes called optical correction 411, filtering 412, storage 413, acquisition peripheral 414 and synchronizer 415.
  • the acquisition peripheral 414 is linked to the acquisition class 41 through the synchronizer class 415.
  • the optical correction class 411 contains the "correct (input: Image, output: Image)" operation which enables errors due to the optics of the endoscope to be corrected. Its attribute is the "CorrectionParameters" parameters.
  • the filtering class 412 contains the operation "filter (input: Image, output: Image)". Its attributes are the "FilterParameters" parameters. This operation processes the images from the cameras 18a and 18b, eliminates the artifacts linked to the even and odd (interlaced) lines and corrects the light intensity. The image obtained, after the operations of these two classes have been applied, is therefore of suitable quality for further processing.
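As an illustration of what this filtering operation might do, here is a sketch that rebuilds the odd (interlaced) lines by averaging and then normalizes the light intensity; the patent does not specify the actual algorithm:

```python
import numpy as np

def deinterlace_and_normalize(img):
    """Rebuild the odd lines by averaging their even neighbours (suppressing
    even/odd-line artifacts), then rescale intensity to the 8-bit range."""
    out = img.astype(np.float64)
    out[1:-1:2] = 0.5 * (out[0:-2:2] + out[2::2])  # interpolate odd lines
    out -= out.min()
    if out.max() > 0:
        out *= 255.0 / out.max()
    return out.astype(np.uint8)
```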
  • the storage class 413 contains the operation "save (input: Image)", which records all or part of the images on a digital medium, in their state before and/or after processing by the operations of classes 411 and 412. It is also possible to memorize the different states of the system, the successive positions of the tools and the interactions of the various participants, notably the surgeon, with the system. This memorization makes it possible to preserve the images in the event of a problem.
  • the memory intended to receive them is of permanent type, so that its content cannot be modified and it can thus serve as evidence.
  • the acquisition peripheral class 414 contains the "acquire (): Image” operation. Its attributes are the "DeviceParameters" parameters which are as follows:
  • the video signal, initially of analog type, is transformed into digital form by the operation contained in this class.
  • the synchronizer class 415 contains the operation "acquire2Images (left: Image, right: Image)", which guarantees a synchronous acquisition of the left and right images obtained from the instances of the acquisition peripheral 414 and delivers the images thus acquired to the acquisition class 41.
  • the sequence diagram of the data acquisition is represented in Figure 6b, according to the same principles as those applied in Figure 5b. It defines the detail of the sequence, which takes place entirely around the acquisition class 41. It will be noted that the images coming respectively from the right and left cameras are acquired simultaneously but processed successively. This detail is, however, not shown in this figure.
  • the instances of the acquisition class 41 carry out the operation "acquire2Images (Image, Image)" and order the instances of the synchronizer class 415 to perform the "acquire ()" operation, for both the right and the left cameras, in synchronism.
  • These images are then put into memory by the “save (image)” operation contained in the storage class 413, then their geometry is corrected by the “correct (image, image)” operation. They are finally filtered by the “filter (image, Image)” operation.
  • the program then returns to the main loop of Figure 5b to perform the "RGBenY8 (image, Image)” operation.
  • FIG. 7a shows the classes derived from the stereo class 56, more particularly the epipolar correction class 561, the edge detection class 562, the correlation class 563 and the stereo conversion class 564.
  • the epipolar correction class 561 contains the operation "correct (input: Image, output: Image)", which transforms the epipolar lines into parallel lines. Its attributes are the "ParamGéomEpip" parameters necessary to ensure a correction of the images by means of epipolar geometry, namely the transformation matrices specific to a given optic. All the information needed for this purpose can be found in the article "Determining the Epipolar Geometry and Its Uncertainty: A Review", already cited above.
  • the edge detection class 562 contains the operation "filterLOG (input: Image, output: Image)", which filters the image by the method known as LOG (Laplacian of Gaussian), described in the publication entitled "A computational theory of human stereo vision", Proceedings of the Royal Society B 204, 1979, by Marr D. and Poggio T.
  • this class has as attributes the "FilterParameters" parameters, which are the coefficients of the LOG filter.
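A sketch of the LOG filtering, with the kernel coefficients computed from an assumed size and sigma (the patent leaves these as the "FilterParameters"):

```python
import numpy as np

def log_kernel(size=9, sigma=1.4):
    """Coefficients of a Laplacian-of-Gaussian (LOG) filter."""
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    r2 = x * x + y * y
    k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
    return k - k.mean()  # zero sum, so flat areas give no response

def filter_log(img, size=9, sigma=1.4):
    """Direct (unoptimized) convolution of a grayscale image with the LOG kernel."""
    k = log_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(size):
        for dx in range(size):
            out += k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out
```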
  • the correlation class 563 contains the operation "correlate (right: Image, left: Image, output: Image)", which makes it possible to define the correspondence between the pixels of the right image and those of the left image, and to build a disparity image.
  • its attributes are the "CorrelationParameters" parameters, which are the following: correction offsets of the respective positions of the left and right images, which allow these images to be adjusted so that the correlation between them can be ensured,
  • the stereo conversion class 564 contains the operation "disparityInDistance (input: Image; output: Image)", which transforms the disparity into a distance in millimeters, taking into account the geometry of the system optics. Its attributes are the "ParamGéomOptique" parameters, which are the parameters of the geometry of the device optics for the conversion of the disparity image into a distance image.
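A sketch of these two steps: simple SAD block matching along the (rectified, horizontal) epipolar lines, followed by the Z = f · D / Δ conversion already used in the cyclopean formulas above; the window size and disparity range are illustrative, not the patent's:

```python
import numpy as np

def correlate(right, left, max_disp=32, win=5):
    """Disparity image by SAD block matching along each row (the epipolar
    lines are horizontal after the rectification of class 561)."""
    h, w = left.shape
    pad = win // 2
    L = np.pad(left.astype(np.float64), pad, mode="edge")
    R = np.pad(right.astype(np.float64), pad, mode="edge")
    disp = np.zeros((h, w))
    for yy in range(h):
        for xx in range(w):
            patch = L[yy:yy + win, xx:xx + win]
            best, best_d = np.inf, 0
            for d in range(min(max_disp, xx) + 1):
                score = np.abs(patch - R[yy:yy + win, xx - d:xx - d + win]).sum()
                if score < best:
                    best, best_d = score, d
            disp[yy, xx] = best_d
    return disp

def disparity_to_distance(disp, f_pixels, D_mm):
    """Z = f · D / Δ, the same relation as in the cyclopean formulas."""
    z = np.full(disp.shape, np.inf)
    valid = disp > 0
    z[valid] = f_pixels * D_mm / disp[valid]
    return z
```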
  • FIG. 7b shows the sequence diagram for the calculation of the stereo, structured according to the same principles as for FIGS. 5b and 6b. In this figure, the operations performed on the left and right images have been shown in a specific manner, in order to better differentiate the operations relating to the left and right images from the common operations.
  • the application accesses stereo class 56 by the order "calculate (image, Image)".
  • the "correct (image, Image)” operation of the epipolar correction class 561 transforms the left image into epipolar coordinates.
  • the "filterLog (Image, Image)” operation of the contour detection class 562 makes it possible to define the contours of the various objects present in the field 10 and visible on the left image. The same operations are then performed on the right image.
  • the operation "correlate (image, Image, Image) of the correlation class 563 then ensures the correlation between different points of the left and right images.
  • the operation” disparityInDistance (Image, Image) "of the stereo conversion class 564 determines the distances from the disparity values The program then continues in the main loop, as shown in Figures 5a and 5b.
  • Figure 8a shows the complementary classes derived from the tool tracking class 60: the detection class 601, the tracking filter class 602, the tools class 603 and the marks class 604.
  • the detection class 601 performs the operation "detectMarks (input: Image): listMarks", which makes it possible to define the position of the marks in the filtered images using information relating to the geometry and colors of the tools and marks, and the operation "detectTools (marks: listMarks): listTools".
  • from these, the program determines the different significant points present in the field 10 and representative of a tool. It corrects their alignment and calculates the position and orientation of the tool in space in order to represent it. It can, moreover, smooth the position of the end of the tool, in order to have a more stable image.
  • the detection class 601 has for attributes the tools "listTools", defined in class 603 and which will be examined later.
  • the tracking filter class 602 performs the operation "filter (input: Image, output: Image)", which keeps only the image content relating to the marks and tools and erases the background. This filtering is done in HLS format. It also has the "listTools" tools as attributes.
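A minimal sketch of such HLS-based filtering, assuming blue-ish marks; the hue and saturation thresholds are illustrative, not the patent's:

```python
import colorsys
import numpy as np

def filter_marks_hls(rgb, hue_min=0.55, hue_max=0.75, sat_min=0.5):
    """Keep only the pixels whose hue/saturation match the colored marks
    (an assumed blue-ish range here) and erase the background to black."""
    out = np.zeros_like(rgb)
    for yy in range(rgb.shape[0]):
        for xx in range(rgb.shape[1]):
            r, g, b = rgb[yy, xx] / 255.0
            hue, _light, sat = colorsys.rgb_to_hls(r, g, b)
            if hue_min <= hue <= hue_max and sat >= sat_min:
                out[yy, xx] = rgb[yy, xx]
    return out
```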
  • the tools class 603 contains all the information enabling the different tools present in field 10 to be identified. Its attributes are the following parameters:
  • "type: TypeOutil", which contains all the information relating to the tools, such as their geometry and their marking,
  • "orientation: Orientation3D", relating to the information making it possible to define the orientation of the tool in space,
  • "position: Position3D", containing the information relating to the position of each of the tools in space, and
  • "end: Position3D", used to define the position of the end of the tools in space.
  • the marks class 604 has the following parameters as attributes:
  • "center: Position2D", giving information relating to the center of gravity of the marks present in field 10,
  • "geometry: Geometry", which contains information relating to the shape of each of the marks, and
  • "color: Color", relating to the color of the marks.
  • the tracking filter class 602 contains the operation "filter (Image, Image)", which makes it possible to obtain an image on which only the tools and the marks appear.
  • the detection class 601 contains the operations "detectMarks (Image)" and "detectTools (listMarks)"; the program then returns to the main loop.
  • the device according to the invention allows surgical operations by endoscopic technique, offering a maximum of information that can be called up on request, without requiring equipment that hinders the work of the surgeon and of his team. It can also be used in the field of orthopedic surgery.
  • the same concept is by no means limited to the medical field. It applies to many other situations, such as inspecting pipes or determining the exact position of fixed or moving objects in a given space that is difficult to access.
  • the information relating to the third dimension can also be transmitted by sound rather than optically, the device modifying, for example, the frequency of an emitted signal, or even stating the information explicitly.
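A sketch of such an audible rendering, mapping the tool-to-organ distance to a tone frequency; all bounds are illustrative:

```python
def distance_to_frequency(d_mm, near=5.0, far=100.0, f_near=2000.0, f_far=200.0):
    """Map a tool-to-organ distance to a tone frequency: the closer the tool,
    the higher the pitch. All bounds are illustrative."""
    t = min(max((d_mm - near) / (far - near), 0.0), 1.0)
    return f_near + t * (f_far - f_near)

# Example: a tool 10 mm from the organ gives a tone of about 1.9 kHz
print(f"{distance_to_frequency(10.0):.0f} Hz")
```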
  • in another variant, a grid pattern is projected in infrared light onto the operating field, invisible to the eye but detectable by the cameras. Such a grid can be used to improve the accuracy of the measurements, the number of marks thus being considerably increased.

Abstract

The invention concerns a device for observing a field (10) comprising: two video cameras (18a, 18b) for capturing a common area of the field, an electronic device (20) for processing signals supplied by the cameras, and communication means (22) for providing an operator, on the basis of the signals supplied by the electronic processing device, data concerning the observed field. Said electronic device (20) comprises: monitoring means for selecting and identifying two points common to the images acquired by the cameras, means for determining from the data supplied by the monitoring means, a value representing the distance separating said two points in a three-dimensional space, and processing means for transforming said value into signals which are supplied to the communication means to enable them to provide the operator with the data concerning the observed field.

Description

ENDOSCOPIC OBSERVATION DEVICE
The present invention relates to a device for endoscopic observation of a three-dimensional (3D) field, intended in particular, but not exclusively, for surgery.

The object of the present invention is to allow determination, at high video frequency (that is to say at more than 25 Hz), of the distance separating a surgical tool from the various elements constituting the operating area, and thus to provide the surgeon with the spatial data necessary for the proper execution of operative gestures.

The endoscopic approach is a surgical technique of steadily growing importance. The principle of this so-called "minimally invasive" approach consists in making short incisions through which, via a suitable device, the tools necessary for the intervention are introduced into the patient's body, together with a camera or endoscope providing on a screen the images which allow the operating field to be seen. This technique has such potential to reduce operative trauma and, at the same time, the costs associated with the hospital stay that all branches of surgery are interested in it: digestive surgery, gynecology, ENT, cardiology, bone surgery, and so on.
One of the greatest problems associated with endoscopic techniques is that the working environment is visualized in two dimensions (2D) whereas the operative gestures are executed in three-dimensional (3D) space. Several attempts have been made to date to capture and supply to the surgeon, in real time (25 Hz), the three-dimensional information necessary for the execution of a surgical procedure. A first known approach consists in using mechanical arms to whose ends the tools are fixed. It is penalized by an excessive restriction of the permitted movements and by considerable crowding of the operating area, in particular when several tools are used simultaneously.

A second approach is optical tracking, which consists in attaching to the instruments active elements (LEDs) monitored by cameras. However, the presence of cables, angular inflexibility and sterilization problems have meant that this method has seen little adoption.

A third approach is passive tracking, which consists in mounting on the tools markers (generally three spheres) which a vision system follows. Crowding problems are certainly reduced, but they remain. Moreover, it is not possible to completely eliminate the risk of losing position information owing to an obstruction of the camera-to-marker line of sight following, for example, a gesture by the personnel in the operating area.

Inspired by the development of the interfaces used in virtual reality to visualize computer-generated environments in 3D, new approaches have been proposed. Using various types of glasses (passive or active), they exploit the capacity of the human brain to reconstruct the notion of depth by presenting images whose parallax for the right eye differs from that for the left eye. Problems of discomfort associated with wearing the glasses, of potential interruption of the synchronization signal in the case of active glasses and, above all, the restitution of a subjective image have, in this case also, limited the adoption of such approaches.

Another, similar device is described in US patent 4,935,810. It comprises two cameras and two screens on which the respective images of the cameras appear. By selecting particular points on the screens, it is possible to calculate the distance separating them. Such a solution makes it possible to know, in an operating field, the distance separating an object, for example a scalpel, from an organ to be operated on. Unfortunately, since the operations of image capture, selection of the points considered and calculation of the distances are carried out sequentially, it is not possible to monitor the operating field continuously. Significant risks of handling errors result.
The object of the present invention is to provide an endoscopic observation device free from the drawbacks and limitations of the currently known devices.

More precisely, the device according to the invention is characterized in that it comprises:

- at least two video cameras (18a, 18b), each associated with an optic, making it possible to capture a common space of the field to be observed and delivering an electrical signal representative of the captured images,
- electronics (20) for processing the signals supplied by the two cameras,
- means for storing representations of objects likely to appear in said field,
- tracking means for automatically identifying at least two points P1 and P2 common to the captured images, at least one of them being associated with one of said representations, and for producing information relating to the position of these points in three-dimensional space,
- calculating means (26c) for determining, from said information, a value representative of the distance separating said points,
- processing means (32) for transforming said representative value into signals, and
- communication means (22) for supplying an operator, from the signals issued by the processing means, with information relating to this distance.

Preferably, the two optics are rigidly joined to one another to form a stereoscopic endoscope. Moreover, they have parallel axes, spaced from one another by a distance D, equal focal lengths f and coplanar focal planes. The information produced by the tracking means is then, for each of the two points, constituted by the coordinates xL, xR and y of its image in the focal plane of the corresponding optic, xL being the abscissa of the image of the point in the left image, xR the abscissa of the image of the point in the right image and y the ordinate of the image of the point in the left and right images.
The calculation means are advantageously arranged to carry out the operations of:

- calculation of the coordinates XP1, YP1 and ZP1 of point P1 and XP2, YP2 and ZP2 of point P2 in a cyclopean space according to the formulas:

X = (D / Δ) · (xL + xR)
Y = D · y / Δ
Z = D · f / Δ
with Δ = xL − xR

- calculation of the differences:

XP1 − XP2
YP1 − YP2
ZP1 − ZP2

- determination, from these three differences, of the distance separating the two points in three-dimensional space by the formula:

d12 = [(XP1 − XP2)² + (YP1 − YP2)² + (ZP1 − ZP2)²]^(1/2)
According to a preferred embodiment, the device according to the invention further has the following characteristics:

- the communication means comprise a video screen making it possible to display an image of the observed field;
- the processing means are arranged to generate an image conveying said representative value and to superimpose it, on the video screen, on an image of the observed field; this signifying image can be, among many possible solutions, an index or an area presenting a color gradient;
- it further comprises:
. means for synchronizing the images of the two cameras,
. an analog-digital converter interposed between each camera and the processing electronics,
. a memory for storing the signals issued by the converters,
. selection means, forming part of the processing electronics, for taking into consideration only part of the information issued by at least one of the converters, in order to reduce the volume of information processed,
. correction means for processing the images so as to reduce the effect of the camera aberrations, and
. a switch making it possible to limit the information displayed on the screen to the image from one of the cameras.
Other characteristics of the invention will emerge from the following description, made with reference to the appended drawing, in which:

- Figure 1 schematically illustrates an assembly comprising a device according to the invention applied to a surgical operation;
- Figure 2 presents the general structure of the device according to the invention;
- Figure 3 shows, in more detail, part of the structure of Figure 2;
- Figure 4 illustrates the principle of calculating the distance between two points in three-dimensional space;
- Figures 5a, 6a, 7a and 8a represent class diagrams, respectively the main diagram of the application and the diagrams relating to acquisition, stereo and tool tracking, while Figures 5b, 6b, 7b and 8b are sequence diagrams corresponding to the class diagrams bearing the same number.
Figure 1 shows, very schematically, the means implemented, according to the invention, for a surgical operation using an endoscopic observation device. The operation takes place in an operating field 10 inside an organism and permits an intervention on an organ 11 by means of a tool 12 introduced into the organism through an incision 14a. The surface of the organ 11 is provided with marks 11a consisting, for example, of dots made with a biocompatible ink, of adhesive patches, or of certain parts of the organ itself which have a particular appearance. Marks 12a, generally formed of colored dots, are advantageously placed on the tool 12 so as to facilitate its identification and, as will be explained below, the calculation of its position. In the device according to the invention, the observation of the operating field 10 is effected by means of a conventional endoscope 16 with double optics, associated with a light source (not shown) and introduced into the organism through another incision 14b. The endoscope 16 is connected to an image capture device 18 comprising two cameras 18a and 18b, called respectively the left camera L and the right camera R, which receive the light radiation captured by the two optics, so that the images can be processed stereoscopically, as will be explained later.

The cameras 18a and 18b transform, in a conventional manner, the light radiation coming from the operating field 10 into an electrical signal which is applied, via two distinct channels L and R, to processing electronics 20. A video screen 22, serving as communication device, displays at a frequency of 25 Hz the images received by the cameras 18a and 18b, before or after processing by the electronics 20. A switch 23 makes it possible to select one or the other of the cameras 18a and 18b, or an image obtained after processing by the electronics 20. As described below, the device according to the invention makes it possible to define the distance between two marks 11a and/or 12a visible in the operating field 10, but also between any points identifiable by their shape, their color, etc. Reference will now be made to Figure 2, which represents the general structure of the device according to the invention. It shows the operating field 10 with the organ to be treated 11, the intervention tool 12, the endoscope 16, the image capture device 18, as well as the various modules constituting the processing electronics 20. The field 10 is attached to a Cartesian frame of reference whose Z axis is parallel to the axes of the cameras.
The image capture device 18 comprises, in addition to the two cameras, left 18a and right 18b, an image synchronization circuit 18c and two analog-to-digital converters 18d and 18e.
The synchronization circuit 18c manages the images coming from the two cameras so that they are perfectly synchronous, which facilitates their comparison, as will be explained below, and improves the quality of the information relating to the third dimension.
The two analog-to-digital converters 18d and 18e transform the analog signals coming from the synchronization circuit 18c into digital signals.
The output of the device 18 is connected to the input of a digital video recorder 24 which can record all or part of the operation. In a variant not shown in the drawing, it is also possible to use an analog recorder, which is then connected to the input of the converters 18d and 18e.
The output of the device 18 is also applied to the processing electronics 20, advantageously constituted by a computer in which the various modules are defined by a set of programs and subroutines described with reference to figures 5 to 8. This computer comprises conventional control means, not shown in the drawing, such as a keyboard and/or a mouse.
As a variant, the electronics 20 could be realized by means of separate electronic modules. This second solution, however, offers less flexibility of use.
In order to facilitate understanding, the device according to the invention will be described, with reference to figures 2 and 3, on the basis of a modular construction in which each module performs a particular function. Figures 5 to 8 will then allow a better understanding of the structure of the software controlling the device.
The processing electronics 20 is arranged so as to process the signals from the two cameras in parallel. However, to keep the drawing simple, each of the modules or systems is shown only once.
The heart of the electronics 20 is a signal processing module 26, which comprises object tracking systems 26a and 26b, intended to follow any movements of the objects (tools and organs) to which the marks 11a and 12a are linked, as well as a unit 26c for calculating the distance separating, in three-dimensional space, at least two points associated with these marks. The marks are selected, for example by means of the computer mouse, from a stored set of representations of objects likely to appear in the operating field. They are identified by the object tracking systems 26a and 26b.
The module 26 alone makes it possible to determine the position in space of points of interest identified by the operator by means of the marks 11a and 12a, as well as the distance separating them. However, in order to give the device greater flexibility of use and optimal precision, the electronics 20 further comprises:
- an image acquisition module 28 comprising an interface 28a and a memory 28b;
- a correction module 30, comprising a circuit 30a for filtering the signals coming from at least one of the converters 18d and 18e, in order to select a part of them, and a circuit 30b for correcting the aberrations of the cameras; and
- a screen control module 32, which comprises an image combination system 32a and a screen driver 32b.
The acquisition module 28 is connected, by its input, to the output of the image capture device 18. It stores information and can thus act as a black box, recording in its memory 28b, whose content cannot be altered, all or part of the information relating to the progress of the operation. Its interface 28a transforms the information received into a form compatible with the characteristics of the memory 28b.
The correction module 30 is connected, by its input, to the output of the acquisition module 28. Its filtering circuit 30a processes the images to make them more readable. By applying a suitable algorithm to them, it is possible to keep only the outline of the objects present in the field 10, or to represent only one of the colors of the images, to reinforce the contrast, etc. This provides a different view of the operating field and thus makes it easier to grasp certain details. As for the correction circuit 30b, its function is to correct the aberrations of the optics of the endoscope. This correction is essential in order to have a good image: the optics of endoscopes are of very small dimensions, which results in strong distortion of the images. To overcome this drawback, a plane transformation is established which makes it possible to find correspondences between the points of the ideal image and those of the distorted image. This method is described in detail in "Digital Image Warping", George Wolberg, IEEE Computer Society Press Monograph, 1994.
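By way of illustration, such a plane transformation can be implemented as an inverse coordinate mapping followed by resampling. The sketch below, in Java (the language in which the software's classes are formulated), is only a minimal example: the single-coefficient radial model, the coefficient k1 and the nearest-neighbor sampling are assumptions made for brevity, not details taken from the patent.

    // Minimal sketch of distortion correction by inverse mapping, under a
    // hypothetical one-coefficient radial model (k1); a real system would
    // use a calibrated mapping and bilinear resampling.
    public final class Undistort {
        public static byte[] correct(byte[] src, int w, int h, double k1) {
            byte[] dst = new byte[w * h];
            double cx = w / 2.0, cy = h / 2.0;
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    // For each point of the ideal image, find the matching
                    // point in the distorted image.
                    double nx = (x - cx) / cx, ny = (y - cy) / cy;
                    double r2 = nx * nx + ny * ny;
                    double s = 1.0 + k1 * r2;        // radial scale factor
                    int sx = (int) Math.round(cx + nx * s * cx);
                    int sy = (int) Math.round(cy + ny * s * cy);
                    if (sx >= 0 && sx < w && sy >= 0 && sy < h) {
                        dst[y * w + x] = src[sy * w + sx];
                    }
                }
            }
            return dst;
        }
    }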
The signals available at the output of the module 30 thus have characteristics making it possible to display, on the video screen 22, information that is easier for the surgeon to interpret. These signals are fed into the processing module 26, which will be described in more detail with reference to figure 3.
Finally, the control module 32 manages the information displayed on the video screen 22, combining or not the images of the operating field 10 with information concerning the position of the various objects present in the field 10. The image combination system 32a is called "multimage", by contraction of the words "multiple" and "image" ("overlay" in English). Its input is connected to the output of the module 26, and it processes the signals supplied by the latter together with the signals produced by the camera 18a or 18b. The screen driver 32b, connected to the output of the system 32a, is connected by its output to the video screen 22, which makes it possible to view the operating field 10 as well as information relating to the organs and tools, in particular information relating to distances.
The distances can be indicated in various ways. The value of the Z coordinate of the distance between the tool 12 and a selected mark 11a can be displayed in numerical form or, for example, by means of two V-shaped pointers. It is also possible to represent the distance between a given point, for example the tip of the tool 12, and all or part of the field 10 by a color gradient, blue corresponding, for example, to very distant areas and red to contact.
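As an illustration, such a gradient can be produced by interpolating linearly between two colors as a function of the measured distance. In the following sketch, the blue-to-red mapping, the clamping bound maxMm and the 0x00RRGGBB packing are arbitrary choices, not values prescribed by the patent.

    // Minimal sketch: map a distance to a color, red at contact (0 mm)
    // and blue at or beyond a chosen maximum distance maxMm.
    public final class DepthColor {
        public static int toRgb(double distMm, double maxMm) {
            double t = Math.min(Math.max(distMm / maxMm, 0.0), 1.0); // clamp
            int red  = (int) Math.round(255 * (1.0 - t));  // near -> red
            int blue = (int) Math.round(255 * t);          // far  -> blue
            return (red << 16) | blue;      // packed 0x00RRGGBB, green = 0
        }
    }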
When the tool 12 must approach a particular, identified point with care, it is also possible to display a part of the tool 12 in a color corresponding to the measured distance.
The choice of one or the other solution, possible at any time, is made according to the problems the surgeon is facing, giving more or less importance to the information relating to the value of the Z component or, on the contrary, to the operating field.
It is also possible to add complementary images coming from a database, not shown in the drawing, which facilitate a diagnosis, for example reticles, masks or any other image likely to facilitate the surgeon's work. The procedure for obtaining this superimposition of images will be specified with reference to figure 5.
To determine the distance between two marks belonging to objects located in the field 10, it is necessary to establish a correlation between the various objects visible in the two images captured by the cameras 18a and 18b. Since the images are taken from two distinct points, the distances between the projections of two marks onto the planes of the cameras, transposed into a common frame of reference, are different. By computing the difference of the coordinates of these projections in the common frame of reference, it is possible to calculate the distance separating these marks in three-dimensional space.
Figure 3 gives a better understanding of the procedure. It shows the detail of the calculation unit 26c, which comprises, connected in series, a geometric correction entity 260 intended to process the images in epipolar geometry, a contour detection entity 262, a correlation entity 264, a distance determination entity 266 and a filter 268.
The geometric correction entity 260 processes the images in epipolar geometry. For a good understanding of this geometry, reference is advantageously made to the article by Zhengyou Zhang entitled "Determining the Epipolar Geometry and its Uncertainty: A Review", published in the International Journal of Computer Vision, 1998.
The contour detection entity 262 makes it possible to select, identify and follow the marks 11a and 12a, as well as particular zones of the objects located in the field 10. The device can thus recognize the various objects present in the field and follow them in successive images. The means used to perform this recognition are fully described in the publication by Kurt Konolige entitled "Small Vision Systems: Hardware and Implementation", published in the proceedings of the Eighth International Symposium on Robotics Research, Hayama, Japan, 1997. From the information obtained by the contour detection entity 262, it is possible to examine the same two points in the two images, to determine the distance separating them in each of these images and to take the difference between these two distances. This difference, called disparity, is determined by the correlation entity 264.
Knowing the characteristics of the optics and the disparity, it is possible to establish a relation between the disparity and the distance between the two points. This operation is performed in the distance determination entity 266 and will be described in more detail with reference to figure 4. Finally, the filter 268 eliminates inaccuracies, for example by applying a spatial and temporal interpolation method such as the one defined in INRIA research report 2013 (1993), entitled "Real-time correlation-based stereo: algorithm, implementations and applications", by Olivier Faugeras. Figure 4 explains how the coordinates of two points common to the images captured by the cameras are determined, and how the distance separating them is then calculated. This figure shows two optics L (left) and R (right), with the same focal length f, whose optical axes are parallel to each other and a distance D apart; moreover, their focal planes are coplanar. It also shows a point P belonging to the field 10, whose position is to be determined.
For this purpose, two plane reference frames are available, one left and the other right, whose respective abscissae, aligned on the same axis, are xL and xR, and whose ordinate is y. The distance between the origins of these reference frames is equal to D. They make it possible to define, respectively, the position pL of the image of the point P seen by the left camera and its position pR seen by the right camera. It is thus possible to measure the coordinates of the images of the point P in the left and right reference frames and so to define its coordinates xLp, xRp and yp. It will be noted that the coordinates in the left and right reference frames are determined simply by identifying the pixels of the cameras on which the image of the points considered is formed. The units of xL, xR and y are therefore pixels. The focal length f must also be expressed in this unit.
To define the position of the point P in the field, a third frame of reference is used, whose axes X, Y and Z define a space called cyclopean. The X-Y plane is parallel to the xL/xR-y plane and placed in front of it, at a distance equal to f. The Z axis is parallel to the optical axes and lies in the same plane, in the median position.
To calculate the coordinates of the point P in the cyclopean space, one begins by defining the disparity Δ of its images according to the formula: Δ = xL - xR
The coordinates of the point P in the cyclopean space can then be defined according to the formulas:

Xp = (D / 2Δ) · (xL + xR)
Yp = D · y / Δ
Zp = D · f / Δ
To calculate the distance between two points P1 and P2, one must therefore first determine, according to the method above, the coordinates Xp1, Yp1 and Zp1 of the point P1 and the coordinates Xp2, Yp2 and Zp2 of the point P2. The distance d12 separating them is then obtained by the formula:

d12 = [(Xp1 - Xp2)² + (Yp1 - Yp2)² + (Zp1 - Zp2)²]^(1/2)
If, in the above formulas, D is expressed in mm, the values of X, Y and Z will also be.
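These formulas translate directly into code. The following sketch computes the cyclopean coordinates of a point from the pixel coordinates of its two images, and the distance between two such points; the class and variable names are illustrative only.

    // Minimal sketch of the triangulation described above. xL, xR and y
    // are pixel coordinates in the left and right images; D is the
    // distance between the optical axes (in mm) and f the focal length
    // (in pixels).
    public final class Cyclopean {
        public static double[] toXyz(double xL, double xR, double y,
                                     double D, double f) {
            double delta = xL - xR;                   // disparity Δ = xL - xR
            double X = (D / (2 * delta)) * (xL + xR);
            double Y = D * y / delta;
            double Z = D * f / delta;                 // in mm if D is in mm
            return new double[] { X, Y, Z };
        }

        public static double distance(double[] p1, double[] p2) {
            double dx = p1[0] - p2[0], dy = p1[1] - p2[1], dz = p1[2] - p2[2];
            return Math.sqrt(dx * dx + dy * dy + dz * dz);   // d12
        }
    }

With D in millimeters and f in pixels, distance() returns d12 in millimeters, in accordance with the remark above.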
In the device according to the invention which has just been described, the cameras 18a and 18b permanently capture images of the operating field 10 through the endoscope 16. These images are synchronized by the circuit 18c and converted from analog to digital form by the converters 18d and 18e. The signals are then sent to the recorder 24, to record the operation, and to the image acquisition module 28, which makes it possible, on the one hand, to modify them by means of the interface 28a, acting for example on the contrast, the brightness, etc., and, on the other hand, to store them in the tamper-proof memory 28b, which can act as a black box. The signals thus obtained are applied to the correction module 30, which eliminates certain defects affecting the quality of the images. They are then sent to the processing module 26, which computes the distances and tracks the organs, and then to the control module 32, which superimposes the images before driving the video screen 22. As explained above, the essential functions of the device according to the invention can be provided by software, advantageously written in an object-oriented language. This software is schematically represented by means of a class diagram in figure 5a and a sequence diagram of the main loop in figure 5b. These diagrams rely on analysis using the UML methodology described in "UML, la notation unifiée de modélisation objet, application en Java" by Michel Lai, InterEditions 1997, ISBN 2-7296-0659-9, and can be produced with the "Rational Rose" software from Rational Software.
In the remainder of the description, "image" denotes a set of signals which, duly processed, make it possible to form an image on a screen. Furthermore, "multimage" denotes an approach making it possible to superimpose images, while the term "suivi" (tracking) designates the part of the program which follows an organ and a tool as long as they are in the field 10, in order to determine the distance separating them. In the diagram of figure 5a, the software is structured into classes forming, through reference links, the application, which is itself a class and bears the reference 40. Each class is composed of operations and attributes. This diagram shows an acquisition class 41, which will be described more precisely with reference to figure 6, and the classes display 42, conversion 44, image 46, user interface 48, "multimage" 50, relative "multimage" 52 and absolute "multimage" 54, stereo 56, organ "tracking" 58 and tool "tracking" 60. The stereo class 56 and the tool "tracking" class 60 are shown in more detail in figures 7 and 8 respectively.
In some of the classes mentioned above, one or more parameters contained in the attributes can be adjusted, automatically or deliberately, which gives the device great flexibility of use.
The various attributes and operations contained in the classes are formulated in the Java language, in the UML notation cited above.
The acquisition class 41 contains the operation called "nouvellesImages (droite: Image, gauche: Image)", which processes the images coming from the cameras 18a and 18b. This operation acquires the left and right images synchronously, corrects them and stores them. It will be described more precisely with reference to figure 6.
The display class 42 contains the operation "afficher (entrée: Image)", which drives the display screen from the signals coming from the various classes making up the application.
The conversion class 44 contains the operations:
- "RGBenYδ (entrée: Image, sortie: Image)" qui permet de convertir une image couleur au format RGB en image noir et blanc au format Y8, - "RGBenHLS (entrée: Image, sortie: Image)" qui permet de convertir une image RGB en image HLS (Hue, Luminance, Saturation), et- "RGBenYδ (input: Image, output: Image)" which allows you to convert a color image in RGB format to black and white image in Y8 format, - "RGBenHLS (input: Image, output: Image)" which allows you to convert RGB image to HLS (Hue, Luminance, Saturation) image, and
- "réduire (facteur: float, entrée: Image, sortie: Image)" dont le facteur mentionné définit un taux de réduction de la résolution d'une image.- "reduce (factor: float, input: Image, output: Image)" whose mentioned factor defines a reduction rate of the resolution of an image.
The image class 46 contains the information relating to an image. Its attributes are:
- "format: Formatlmage", qui contient les informations du format interne de l'image, soit:- "format: Formatlmage", which contains the information of the internal format of the image, that is:
- le format Y8 (tons de gris)- Y8 format (shades of gray)
- le format RGB (couleur 16, 24 ou 36 bits), et - le format HLS (couleur 36 bits) - "largeur : int", qui permet de définir la largeur de l'image,- RGB format (16, 24 or 36 bit color), and - HLS format (36 bit color) - "width: int", which defines the width of the image,
- "hauteur : int", qui permet de définir la hauteur de l'image, et- "height: int", which defines the height of the image, and
- "tableau : Byte*" qui tient lieu de pointeur sur les pixels de l'image.- "table: Byte * " which acts as a pointer to the pixels of the image.
Concerning the RGB and HLS formats, reference is advantageously made to the publication by Ken Fishkin entitled "A Fast HLS-to-RGB Transform", published in Graphics Gems, 1990, pages 448-449.
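For reference, a plain (non-optimized) HLS-to-RGB conversion can be written as below. This is the textbook formulation rather than the faster transform of the cited publication, and the value ranges (H in degrees, L and S in [0, 1]) are assumptions.

    // Minimal sketch of a standard HLS-to-RGB conversion.
    // h in [0, 360), l and s in [0, 1]; returns a packed 0x00RRGGBB value.
    public final class HlsToRgb {
        public static int convert(double h, double l, double s) {
            double c = (1 - Math.abs(2 * l - 1)) * s;        // chroma
            double hp = h / 60.0;
            double x = c * (1 - Math.abs(hp % 2 - 1));
            double r = 0, g = 0, b = 0;
            if      (hp < 1) { r = c; g = x; }
            else if (hp < 2) { r = x; g = c; }
            else if (hp < 3) { g = c; b = x; }
            else if (hp < 4) { g = x; b = c; }
            else if (hp < 5) { r = x; b = c; }
            else             { r = c; b = x; }
            double m = l - c / 2;                            // lightness offset
            return ((int) Math.round((r + m) * 255) << 16)
                 | ((int) Math.round((g + m) * 255) << 8)
                 |  (int) Math.round((b + m) * 255);
        }
    }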
The user interface class 48 contains the operation "modifierParamètres (nouveauxParams: ParamètresSystème)", which manages the interaction between the user and the various input peripherals of the system (keyboard, mouse, voice recognition, etc.). On the operator's order, this operation modifies the parameters of the stereo, of the organ tracking (in particular the identification of the points to follow), of the tool tracking and of the absolute and relative "multimages", which will be described later with reference to classes 50, 52, 54, 58 and 60. The "multimage" class 50 contains the operation "calculer (outils: ListeOutils, marques: ListeMarques, entréeImage: Image, entréeStéréo: Image, sortie: Image)". Two subclasses, called relative "multimage" 52 and absolute "multimage" 54, are derived from the "multimage" class 50.
The class 50 computes the superposition of the images in absolute or relative mode according to selected parameters, as will be explained below, and more particularly mixes the 2D-type images, corresponding to the usual view of the field 10, with images representative of the third dimension. In the relative "multimage", the distances are defined between two marks present in the field 10, whereas in the absolute "multimage", the distances are defined with respect to the endoscope.
La sous-classe "multimage" relatif 52 contient l'opération "calculer (outils:The relative "multimage" subclass 52 contains the "calculate (tools:
ListeOutils, marques: ListeMarques, entréelmage: Image, entréeStéréo:ToolsList, brands: BrandsList, image input: Image, Stereo input:
Image, sortie: Image)" qui permet de calculer la superposition des images en mode relatif en fonction des paramètres s'y rapportant. Elle a pour attributs les paramètres "ParamètresMultimageRel" qui sont le choix du type et les variables relatives à ce type. Avec cette classe, il est possible de superposer à l'image du champ des informations complémentaires en créant une ou plusieurs sphères creuses et virtuelles centrées sur l'extrémité des outils, par exemple, et en représentant le lieu géométrique de leurs intersections avec le ou les organes visibles dans le champ 10 par une modification du ton et/ou de la saturation des pixels correspondants sur l'image. L'extrémité de l'outil peut aussi être munie d'une lumière virtuelle orientée dans le prolongement de celui-ci, qui modifie la luminosité, le ton ou la saturation du champ 10. Il est également possible d'afficher la distance entre un outil et une marque définie par l'utilisateur au moyen d'une information numérique, de curseurs ou de tout autre moyen, tel qu'un dégradé de couleur.Image, output: Image) "which calculates the superposition of the images in relative mode according to the related parameters. Its attributes "ParametersMultimageRel" parameters which are the choice of the type and the variables relating to this type. With this class, it is possible to superimpose additional information on the image of the field by creating one or more hollow and virtual spheres centered on the end of the tools, for example, and by representing the geometric location of their intersections with the or the organs visible in the field 10 by a modification of the tone and / or the saturation of the corresponding pixels on the image. The end of the tool can also be provided with a virtual light oriented in the extension thereof, which modifies the brightness, the tone or the saturation of the field 10. It is also possible to display the distance between a tool and a mark defined by the user by means of digital information, cursors or any other means, such as a color gradient.
La classe "multimage" absolu 54 contient l'opération "calculer (outils: ListeOutils, marques: ListeMarques, entréelmage: Image, entréeStéréo: Image, sortie: Image)" qui permet de calculer la superposition d'images en mode absolu tel que défini ci-dessus. Elle a pour attributs les paramètres "ParamètresMultimageAbs" qui sont la position et la résolution du "multimage" ainsi que le dégradé de couleurs.The absolute "multimage" class 54 contains the "calculate (tools: List of Tools, brands: List of Brands, image input: Image, Stereo input: Image, output: Image)" operation which allows to calculate the superposition of images in absolute mode as defined. above. Its attributes are the "ParametersMultimageAbs" parameters which are the position and resolution of the "multimage" as well as the color gradient.
The stereo class 56 will be described in more detail with reference to figure 7. It can already be stated, however, that it contains the operations:
- "calculer (entréeDroite: Image, entréeGauche: Image, sortie: Image)", which computes the stereo image, also called "range image", according to the parameters below, and
- "fusionner (petiteEntrée: Image, grandeEntrée: Image, sortie: Image)", which merges two disparity images that do not have the same resolution.
The class 56 has as attributes the "ParamètresStéréo" parameters, which are the following:
- parameters of the geometry of the system for converting the disparity image into a distance image,
- parameters for the correction of the epipolar geometry,
- offsets correcting the respective positions of the two images,
- the disparity search space,
- the size of the search windows for pattern recognition,
- the confidence threshold,
- parameters for multi-resolution (respective position and resolution of the different images), and
- parameters for post-filtering.
In the present description, "pattern recognition" denotes a method for establishing correspondences between the equivalent points of the two images, left and right. This method is described in the publication by Kurt Konolige already cited.
La classe "suivi" d'organes 58 contient les opérations "ajouterMarqueASuivre (entrée: Image, marque: Position2D)" et "suivreMarques (entrée: Image) : listeMarques", qui permettent à l'utilisateur d'ajouter des marques 11a à suivre. Elle a pour attribut les paramètres "ParamètresSuiviOrg" qui sont les suivants:The "follow-up" class of organs 58 contains the operations "addBrandAfollow (entry: Image, brand: Position2D)" and "followBrand (entry: Image): listBrand", which allow the user to add marks 11a to follow . Its attribute is the "ParametersSuriviOrg" parameters which are as follows:
- Nombre de marques à suivre,- Number of brands to follow,
- Forme des marques à suivre,- Form of marks to follow,
- Position précédente des marques à suivre, et- Previous position of the brands to follow, and
- Position courante des marques à suivre. La classe "suivi" d'outils 60 sera décrite de manière plus détaillée en référence à la figure 8. On peut toutefois déjà relever qu'elle contient l'opération: "chercher (entrée: Image) : listeOutils" ayant pour fonction de chercher les outils dans l'image. Elle a pour attribut les paramètres: "ParamètresSuiviOut" qui sont les suivants: - Nombre maximum d'outils dans le champ, et- Current position of the brands to follow. The "tracking" class of tools 60 will be described in more detail with reference to FIG. 8. It can however already be noted that it contains the operation: "search (entry: Image): listTools" having the function of searching the tools in the picture. Its attribute is the parameters: "ParametersTrackingOut" which are the following: - Maximum number of tools in the field, and
- Types d'outils possibles.- Types of tools possible.
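One simple way to implement such a search, given that the tools carry colored pads (marks 12a), is to threshold the image on hue and take the centroid of the matching pixels. This is only an illustrative guess at a possible technique; the hue window and the per-pixel hue array are assumptions, not the patent's method.

    // Minimal sketch of a tool search by pad color: return the centroid
    // of the pixels whose hue falls within tol of the pad's target hue.
    public final class ToolSearch {
        // hue[] holds one hue value per pixel of a w-pixel-wide image.
        public static int[] findPad(double[] hue, int w,
                                    double target, double tol) {
            long sx = 0, sy = 0, n = 0;
            for (int i = 0; i < hue.length; i++) {
                if (Math.abs(hue[i] - target) < tol) { // pixel matches pad
                    sx += i % w;
                    sy += i / w;
                    n++;
                }
            }
            return n == 0 ? null
                          : new int[] { (int) (sx / n), (int) (sy / n) };
        }
    }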
In the software, the interaction of the instances of the classes defined above is implemented by applying their operations, according to a sequential procedure represented in the logic diagram of figure 5b. The classes are shown on the abscissa and bear the same references and names as in figure 5a. The operations carrying out the application are shown on the ordinate, identified by the names of the parameters used to perform them, written above an arrow connecting the application 40 to the class containing that parameter.
The sequence shown begins with the acquisition of a new image, defined by the operation "nouvellesImages (Image, Image)" contained in the acquisition class 41. The operation "RGBenY8 (Image, Image)" converts the received images into the Y8 format, then the operation "réduire (float, Image, Image)" makes it possible to choose the resolution of the image. These two formatting operations, in view of the subsequent processing, are contained in the conversion class 44. The stereo class 56 makes it possible, through the operation "calculer (Image, Image, Image)", to compute a depth image and, from it, the distances between the various marks present in the field 10.
The two following operations, "suivreMarques (Image)" and "chercher (Image)", contained respectively in the organ "tracking" class 58 and the tool "tracking" class 60, provide the tracking of marks in the field, associated respectively with the organs and the tools located there.
Les opérations"RGBenHLS (Image, Image)" et "calculer (ListeOutils, ListeMarques, Image, Image, Image)", contenues respectivement dans les classes conversion 44 et "multimage" 50, permettent de créer une image qui, superposée à l'image du champ 10, donne une information relative aux distances séParant les marques associées aux organes et aux outils.The operations "RGBenHLS (Image, Image)" and "calculate (List of Tools, List of Brands, Image, Image, Image)", contained respectively in the conversion classes 44 and "multimage" 50, allow to create an image which, superimposed on the image of the field 10, provides information on sé distances P Arant brands associated with organs and tools.
Finally, the operation "afficher (Image)", contained in the display class 42, transmits to the screen the information allowing the field 10 covered by the cameras 18a and 18b to be seen, together with the information relating to the distances.
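The sequence just described can be rendered compactly in Java. In the sketch below, the Image type, the module interfaces and the reduction factor are stand-ins introduced for illustration; only the order of the calls reflects the sequence diagram of figure 5b.

    // Illustrative, self-contained rendering of one pass of the main loop.
    final class Image { byte[] pixels = new byte[0]; }
    final class ListeMarques {}
    final class ListeOutils {}

    interface Acquisition  { void nouvellesImages(Image droite, Image gauche); }
    interface Conversion   { void rgbEnY8(Image in, Image out);
                             void reduire(float facteur, Image in, Image out);
                             void rgbEnHLS(Image in, Image out); }
    interface Stereo       { void calculer(Image d, Image g, Image sortie); }
    interface SuiviOrganes { ListeMarques suivreMarques(Image in); }
    interface SuiviOutils  { ListeOutils chercher(Image in); }
    interface Multimage    { void calculer(ListeOutils o, ListeMarques m,
                                           Image img, Image stereo, Image out); }
    interface Affichage    { void afficher(Image in); }

    final class BouclePrincipale {
        void tour(Acquisition acq, Conversion conv, Stereo stereo,
                  SuiviOrganes organes, SuiviOutils recherche,
                  Multimage multi, Affichage ecran) {
            Image d = new Image(), g = new Image();
            Image dY8 = new Image(), gY8 = new Image();
            Image dRed = new Image(), gRed = new Image();
            Image prof = new Image(), hls = new Image(), sortie = new Image();
            acq.nouvellesImages(d, g);           // synchronous acquisition
            conv.rgbEnY8(d, dY8);                // color to gray (Y8)
            conv.rgbEnY8(g, gY8);
            conv.reduire(0.5f, dY8, dRed);       // choose working resolution
            conv.reduire(0.5f, gY8, gRed);
            stereo.calculer(dRed, gRed, prof);   // depth ("range") image
            ListeMarques marques = organes.suivreMarques(prof); // organ marks
            ListeOutils outils = recherche.chercher(prof);      // tool search
            conv.rgbEnHLS(d, hls);
            multi.calculer(outils, marques, hls, prof, sortie); // overlay
            ecran.afficher(sortie);              // field + distance display
        }
    }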
As shown in figure 6a, the acquisition class 41 itself contains classes called optical correction 411, filtering 412, storage 413, acquisition peripheral 414 and synchronizer 415. The acquisition peripheral class 414 is connected to the acquisition class 41 through the class 415. The optical correction class 411 contains the operation "corriger (entrée: Image, sortie: Image)", which corrects the errors due to the optics of the endoscope. Its attribute is the "ParamètresCorrection" parameters. Through this class, the information coming from the cameras 18a and 18b is processed so that its x and y coordinates are corrected according to the optical characteristics of the endoscopes, on the basis of an optical distortion correction algorithm described in the publication entitled "Digital Image Processing (second edition)", IEEE Computer Society Press Monograph, 1994, by William K. Pratt. The filtering class 412 contains the operation "filtrer (entrée: Image, sortie: Image)". Its attributes are the "ParamètresFiltre" parameters. This operation processes the images coming from the cameras 18a and 18b, eliminates the artifacts linked to the even and odd lines and corrects the light intensity. The image obtained after the operations of these two classes have been applied is thus of suitable quality for further processing.
The storage class 413 contains the operation "enregistrer (entrée: Image)", which records all or part of the images on a digital medium, in their state before and/or after processing by the operations of the classes 411 and 412. It is also possible to store the various states of the system, the successive positions of the tools and the interactions of the various participants, notably the surgeon, with the system. This storage makes it possible to keep the images in the event of a problem. Advantageously, the memory intended to receive them is of the permanent type, so that its content cannot be modified and can thus serve as evidence. The acquisition peripheral class 414 contains the operation "acquérir (): Image". Its attributes are the "ParamètresDuPériphérique" parameters, which are the following:
- the type of signal,
- the format of the image, and
- the resolution and position of the image.
The video signal, initially of analog type, is converted into digital form by the operation contained in this class.
The synchronizer class 415 contains the operation "acquérir2Images (gauche: Image, droite: Image)", which guarantees a synchronous acquisition of the left and right images obtained after processing by the instances of the acquisition peripheral class 414, and delivers the images thus acquired to the acquisition class 41.
The sequence diagram of the data acquisition is shown in figure 6b, according to the same principle as that applied in figure 5b. It details the sequence, which takes place entirely around the acquisition class 41. It will be noted that the images coming respectively from the right and left cameras are acquired simultaneously but processed successively; this detail is, however, not shown in the figure.
After the operation "nouvellesImages (Image, Image)" has been performed, the instances of the acquisition class 41 carry out the operation "acquérir2Images (Image, Image)" and order the instances of the synchronizer class 415 to perform the operation "acquérir ()", for both the right and the left camera, in synchronism. These images are then stored by the operation "enregistrer (Image)" contained in the storage class 413, their geometry is corrected by the operation "corriger (Image, Image)", and they are finally filtered by the operation "filtrer (Image, Image)". The program then returns to the main loop of figure 5b to perform the operation "RGBenY8 (Image, Image)".
Figure 7a shows the classes derived from the stereo class 56, and more particularly the epipolar correction class 561, the edge detection class 562, the correlation class 563 and the stereo conversion class 564.
The epipolar correction class 561 contains the operation "corriger (entrée: Image, sortie: Image)", which transforms the epipolar lines into parallel lines. Its attributes are the "ParamGéomEpip" parameters needed to correct the images by means of epipolar geometry, namely the transformation matrices specific to a given optic. All useful information on this subject can be found in the paper "Determining the Epipolar Geometry and its Uncertainty: A Review", cited above.
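As an illustrative sketch only: if the "ParamGéomEpip" transformation matrices are taken to be rectifying homographies, the correction can be applied as a perspective warp. OpenCV is used here as one convenient implementation, not as the method of the patent.

```python
import cv2
import numpy as np

def corriger(image: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Warp an image so that its epipolar lines become parallel/horizontal.

    H is the 3x3 rectifying homography for this camera, assumed to be
    pre-computed during calibration (the "ParamGéomEpip" attribute).
    """
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```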
The edge detection class 562 contains the operation "filtrerLOG (entrée: Image, sortie: Image)", which filters the image by the method known as LoG (Laplacian of Gaussian), described in the publication entitled "A computational theory of human stereo vision", Proceedings of the Royal Society, B-204, 1979, by Marr D. and Poggio T. This class has as attributes the "ParamètresFiltres" parameters, which are the coefficients of the LoG filter. The correlation class 563 contains the operation "corréler (droite: Image, gauche: Image, sortie: Image)", which defines the correspondence between the pixels of the right image and those of the left image and builds a disparity image; an illustrative sketch of these two steps is given after the list below. Its attributes are the "ParamètresCorrélation" parameters, which are the following:
- offsets correcting the respective positions of the left and right images, which allow these images to be adjusted so that the correlation between them can be ensured,
- the disparity search space, confined to the only zone in which it is useful, thus avoiding needless computation,
- the size of the search windows for "pattern recognition", which likewise reduces the volume of information processed, and
- a confidence threshold, which makes it possible to impose a minimum quality on the distance measurements.
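The sketch announced above chains the two steps under stated assumptions: SciPy's Laplacian-of-Gaussian filter stands in for the explicit "ParamètresFiltres" coefficients, and a sum-of-absolute-differences window search stands in for the correlation; the numeric defaults are illustrative, not values from the document.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def filtrer_log(image: np.ndarray, sigma: float = 1.6) -> np.ndarray:
    """LoG filtering ("filtrerLOG"); sigma stands in for ParamètresFiltres."""
    return gaussian_laplace(image.astype(np.float32), sigma)

def correler(droite, gauche, d_max=32, win=5, seuil=1000.0):
    """Build a disparity image by SAD window matching on rectified images.

    d_max : disparity search space
    win   : half-size of the search window
    seuil : confidence threshold; unreliable matches are set to 0
    """
    h, w = gauche.shape
    disparite = np.zeros((h, w), dtype=np.float32)
    for y in range(win, h - win):
        for x in range(win + d_max, w - win):
            bloc = gauche[y-win:y+win+1, x-win:x+win+1]
            couts = [np.abs(bloc - droite[y-win:y+win+1,
                                          x-d-win:x-d+win+1]).sum()
                     for d in range(d_max)]
            d = int(np.argmin(couts))
            if couts[d] < seuil:        # keep confident matches only
                disparite[y, x] = d
    return disparite
```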
The stereo conversion class 564 contains the operation "disparitéEnDistance (entrée: Image; sortie: Image)", which transforms the disparity into a distance in millimetres, taking into account the geometry of the system's optics. Its attributes are the "ParamGéomOptique" parameters, namely the parameters of the geometry of the device's optics used to convert the disparity image into a distance image. Figure 7b shows the sequence diagram for the stereo computation, structured according to the same principles as Figures 5b and 6b. In this figure, the operations performed on the left and right images have been represented separately, in order to better distinguish the operations specific to the left and right images from the common operations.
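For the simple geometry of claim 3 below (parallel optical axes separated by a distance D, equal focal lengths f), the depth of a point follows from Z = D·f/Δ, Δ being the disparity. The sketch below assumes that geometry; the actual "ParamGéomOptique" parameters may encode a more general optical model, and the baseline and focal values shown are assumptions.

```python
import numpy as np

def disparite_en_distance(disparite, D_mm=4.0, f_px=800.0):
    """Convert a disparity image (pixels) into a depth image (millimetres).

    D_mm : baseline between the two optics (assumed value)
    f_px : focal length expressed in pixels (assumed value)
    Pixels with zero disparity (no confident match) are left at 0.
    """
    distance = np.zeros_like(disparite, dtype=np.float32)
    valides = disparite > 0
    distance[valides] = (D_mm * f_px) / disparite[valides]
    return distance
```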
The application accesses the stereo class 56 by the order "calculer(Image, Image)". The operation "corriger(Image, Image)" of the epipolar correction class 561 transforms the left image into epipolar coordinates. The operation "filtrerLOG(Image, Image)" of the edge detection class 562 defines the contours of the various objects present in the field 10 and visible in the left image. The same operations are then performed on the right image. The operation "corréler(Image, Image, Image)" of the correlation class 563 then ensures the correlation between various points of the left and right images. The operation "disparitéEnDistance(Image, Image)" of the stereo conversion class 564 determines the distances from the disparity values. The program then continues in the main loop, as shown in Figures 5a and 5b.
Figure 8a shows the complementary classes derived from the tool tracking class 60: the detection class 601, the tracking filter class 602, the tools class 603 and the marks class 604.
The detection class 601 performs the operation "détecterMarques (entrée: Image): listeMarques", which defines the position of the marks in the filtered images using the information on the geometry and colours of the tools and marks, and the operation "détecterOutils (marques: listeMarques): listeOutils". By this second operation, the program determines the various significant points present in the field 10 that are representative of a tool. It corrects their alignment and computes the position and orientation of the tool in space in order to represent it. It can, in addition, stabilize the position of the tool tip, so as to provide a more stable image. The detection class 601 has as attributes the "listeOutils" tools, defined in class 603 and examined below. The tracking filter class 602 performs the operation "filtrer (entrée: Image, sortie: Image)", which keeps only the image content relating to the marks and the tools and erases the background; this filtering is carried out in the HLS colour space (a sketch of this filtering and detection is given after the description of Figure 8b below). This class also has the "listeOutils" tools as attributes. The tools class 603 contains all the information needed to identify the various tools present in the field 10. Its attributes are the following parameters:
- "type: TypeOutil" qui comporte toutes les informations relatives aux outils, telles que leur géométrie et leur marquage, - "orientation: Orientation 3D" relatif aux informations permettant de définir l'orientation de l'outil dans l'espace,- "type: TypeOutil" which contains all the information relating to the tools, such as their geometry and their marking, - "orientation: 3D orientation" relating to the information making it possible to define the orientation of the tool in space,
- "position : Position3D" contenant les informations relatives à la position de chacun des outils dans l'espace, et- "position: Position3D" containing information relating to the position of each of the tools in space, and
- "extrémité: Position 3D" permettant de définir la position de l'extrémité des outils dans l'espace.- "end: 3D position" used to define the position of the end of the tools in space.
Taking the tool tips into account specifically makes it possible to work with maximum precision. This is particularly important with scalpels.
The marks class 604 has as attributes the following parameters:
- "centre: Position2D", giving the information relating to the centre of gravity of each mark present in the field 10,
- "surface: int", which defines the surface area of each mark,
- "géométrie: Géométrie", which contains the information relating to the shape of each mark, and
- "couleur: Couleur", in which the colours of each mark are stored.
Referring to Figure 8b, it can be seen that the tool tracking class 60 is accessed by the order "chercher (Image)". The tracking filter class 602 contains the operation "filtrer (Image, Image)", which yields an image on which only the tools and the marks remain. The detection class 601 contains the operations "détecterMarques(Image)" and "détecterOutils(listeMarques)", after which the program returns to the main loop.
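The sketch announced above gives one plausible realization of "filtrer" and "détecterMarques": a hue/saturation threshold in the HLS colour space isolates the coloured marks, and connected-component labelling yields their centres of gravity and surface areas. The use of OpenCV and SciPy, and the threshold values, are assumptions of this sketch.

```python
import cv2
import numpy as np
from scipy import ndimage

def filtrer(image_bgr, teinte=(35, 85), s_min=80):
    """Keep only marks/tools: threshold in the HLS colour space."""
    hls = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HLS)
    h, l, s = cv2.split(hls)
    masque = (h >= teinte[0]) & (h <= teinte[1]) & (s >= s_min)
    return masque.astype(np.uint8)

def detecter_marques(masque, surface_min=20):
    """Return the centre of gravity and surface of each detected mark."""
    etiquettes, n = ndimage.label(masque)
    indices = range(1, n + 1)
    marques = []
    for centre, surface in zip(
            ndimage.center_of_mass(masque, etiquettes, indices),
            ndimage.sum(masque, etiquettes, indices)):
        if surface >= surface_min:   # reject isolated noise pixels
            marques.append({"centre": centre, "surface": int(surface)})
    return marques
```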
The device according to the invention, as just described, makes possible surgical operations by the endoscopic technique that offer a maximum of information available on demand, without requiring equipment that hinders the work of the surgeon and his team. It can also be used in the field of orthopaedic surgery.
The same concept is by no means limited to the medical field. It applies to many other situations, such as the inspection of pipework or the determination of the exact position of fixed or moving objects in a given space that is difficult to access.
It may also be noted that, in addition to the indication of information relating to the third dimension, other information may be associated with the displayed image without departing from the scope of the invention.
In a variant, not shown, the information relating to the third dimension may be conveyed acoustically rather than optically, the device modifying, for example, the frequency of an emitted signal, or even giving the information in plain form. In another variant, a grid pattern is projected in infrared light onto the operating field; it is invisible to the eye but detectable by the cameras. Such a grid can be used to improve the accuracy of the measurements, the number of marks thereby being considerably increased.

Claims

1. Device for observing a field (10), characterized in that it comprises:
- at least two video cameras (18a, 18b), each associated with an optic making it possible to apprehend a common space of the field to be observed and delivering an electrical signal representative of the captured images,
- electronics (20) for processing the signals supplied by the two cameras,
- means for storing representations of objects likely to appear in said field,
- tracking means for automatically identifying at least two points P1 and P2 common to the captured images, at least one of them being associated with one of said representations, and for producing information relating to the position of these points in three-dimensional space,
- calculation means (26c) for determining, from said information, a value representative of the distance separating said points,
- processing means (32) for transforming said representative value into signals, and
- communication means (22) for supplying an operator, from the signals issued by the processing means, with information relating to this distance.
2. Device according to claim 1, characterized in that the two optics are rigidly associated with each other so as to form a stereoscopic endoscope (16).
3. Device according to claim 2, characterized in that:
- said optics have parallel axes spaced apart by a distance D, equal focal lengths f and coplanar focal planes, and
- the information produced by said tracking means is, for each of said two points P1 and P2, constituted by the coordinates xL, xR and y of its image in the focal plane of the corresponding optic, xL being the abscissa of the image of the point in the left space, xR the abscissa of the image of the point in the right space, and y the ordinate of the image of the point in the left and right spaces.
4. Device according to claim 3, characterized in that said calculation means (26c) are arranged to perform the operations of:
- calculating the coordinates Xp1, Yp1 and Zp1 of the point P1 and Xp2, Yp2 and Zp2 of the point P2 in a cyclopean space according to the formulas:

X = (D / Δ) · (xL + xR)
Y = D · y / Δ
Z = D · f / Δ, with Δ = xL − xR

- calculating the differences:

Xp1 − Xp2
Yp1 − Yp2
Zp1 − Zp2

- determining, from these three differences, the distance separating the two points in three-dimensional space by the formula:

d12 = [(Xp1 − Xp2)² + (Yp1 − Yp2)² + (Zp1 − Zp2)²]^(1/2)
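As a worked illustration of claim 4 only, with an assumed baseline D and focal length f (the document specifies neither value here):

```python
import math

def point_cyclopeen(xL, xR, y, D, f):
    """Cyclopean coordinates of a point from its left/right image coordinates."""
    delta = xL - xR                   # disparity Δ
    return ((D / delta) * (xL + xR),  # X
            D * y / delta,            # Y
            D * f / delta)            # Z

def distance_P1_P2(p1, p2, D=4.0, f=800.0):
    """Distance d12 between two tracked points P1 and P2 (claim 4 sketch)."""
    X1, Y1, Z1 = point_cyclopeen(*p1, D, f)
    X2, Y2, Z2 = point_cyclopeen(*p2, D, f)
    return math.sqrt((X1 - X2)**2 + (Y1 - Y2)**2 + (Z1 - Z2)**2)

# Example: two points given by their (xL, xR, y) image coordinates
print(distance_P1_P2((120.0, 100.0, 40.0), (90.0, 75.0, -10.0)))
```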
5. Device according to any one of claims 1 to 4, characterized in that said communication means comprise a video screen (22) making it possible to display an image of the observed field (10).
6. Device according to claim 5, characterized in that said processing means (32) are arranged to generate an image signifying said representative value and to superimpose it, on the video screen, on an image of the observed field.
7. Device according to claim 6, characterized in that said signifying image is an index.
8. Device according to claim 6, characterized in that said signifying image is a zone presenting a colour gradient.
9. Device according to any one of claims 1 to 8, characterized in that it further comprises means (18c) for synchronizing the images of the two cameras.
10. Device according to any one of claims 1 to 9, characterized in that it further comprises an analog-to-digital converter (18d, 18e) interposed between each camera (18a, 18b) and the processing electronics (20).
11. Device according to claim 10, characterized in that it further comprises a memory for storing the signals issued by the converters.
12. Device according to either of claims 10 and 11, characterized in that the processing electronics further comprise selection means (30a) for taking into consideration only part of the information issued by at least one of the converters, in order to reduce the volume of information processed.
13. Device according to any one of claims 1 to 12, characterized in that the processing electronics further comprise correction means (30b) for processing said images in order to reduce the effect of the aberrations of the cameras.
14. Device according to any one of claims 1 to 13, characterized in that it further comprises a switch (23) making it possible to limit the information displayed on the screen to the image issued by one of the cameras.
PCT/CH2000/000096 1999-02-26 2000-02-22 Endoscopic observation device WO2000052643A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP00902532A EP1155383A1 (en) 1999-02-26 2000-02-22 Endoscopic observation device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR99/02534 1999-02-26
FR9902534A FR2790196A1 (en) 1999-02-26 1999-02-26 ENDOSCOPIC OBSERVATION DEVICE

Publications (1)

Publication Number Publication Date
WO2000052643A1 true WO2000052643A1 (en) 2000-09-08

Family

ID=9542681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CH2000/000096 WO2000052643A1 (en) 1999-02-26 2000-02-22 Endoscopic observation device

Country Status (3)

Country Link
EP (1) EP1155383A1 (en)
FR (1) FR2790196A1 (en)
WO (1) WO2000052643A1 (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4935810A (en) * 1988-10-26 1990-06-19 Olympus Optical Co., Ltd. Three-dimensional measuring apparatus


Also Published As

Publication number Publication date
EP1155383A1 (en) 2001-11-21
FR2790196A1 (en) 2000-09-01


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2000902532

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2000902532

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 2000902532

Country of ref document: EP