US6782123B1 - Method and device for mapping radiation sources - Google Patents

Method and device for mapping radiation sources

Info

Publication number
US6782123B1
Authority
US
United States
Legal status
Expired - Fee Related
Application number
US09/355,956
Inventor
François Guillon
Philippe Baussart
Thomas Dalancon
Current Assignee
Orano Cycle SA
Original Assignee
Compagnie Generale des Matieres Nucleaires SA
Application filed by Compagnie Generale des Matieres Nucleaires SA
Assigned to COMPAGNIE GENERALE DES MATIERES NUCLEAIRES. Assignment of assignors' interest (see document for details). Assignors: BAUSSART, PHILIPPE; DALANCON, THOMAS; GUILLON, FRANCOIS.
Application granted
Publication of US6782123B1
Anticipated expiration
Legal status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01T: MEASUREMENT OF NUCLEAR OR X-RADIATION
    • G01T 7/00: Details of radiation-measuring instruments

Abstract

A method and apparatus for precisely locating radioactive sources. A pair of visual cameras is oriented so that their fields of view overlap in whole or in part. The apparatus also includes an intermediate camera sensitive to the radiation to be measured. The visual cameras make it possible to define the positions of the details of the environment by triangulation, and a second triangulation, carried out after the apparatus has been moved, gives the positions of the sources. Photogrammetry software is used to accomplish these tasks easily.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention concerns a method and device for mapping radiation sources, to make it possible to locate such sources in a three-dimensional environment which may be known or unknown at the start.
2. Discussion of the Background
The idea of detecting radiation sources such as radioactive leaks by comparing an image of these sources, taken by a specialised item of equipment, with a visual or video image of the environment taken by an ordinary light-sensitive camera has already been applied. The images can be used empirically: in a rudimentary form of the method, the user finds the locations of the radioactive sources on the visual image by identifying the elements of the environment, shown on the visual image, which correspond to the places on the source image where a source has been recorded. This identification of the position of the sources is, however, not very precise and calls on the user's judgement and knowledge of the environment: he may, for instance, know in advance the places where a leak is liable to occur and identify them easily when one arises. Though such a method is sufficient in certain situations, it does not lend itself to automatic processing, notably if repair work is to be entrusted to a robot, which must know the position of the source in three-dimensional space precisely in order to reach it.
In an improved method, described in French patent No. 2 652 909, a camera is combined with shutters and means of converting the radioactive radiation, enabling it to record in turn a visual image and a radioactive-emission image of the environment. The two images are supplied to automatic processing means, which superimpose them and thus give, without error, the position of the sources in the field of the camera. However, as this work is carried out on a two-dimensional image, the problem mentioned above is not resolved: an automatic system cannot determine the distance of the source, nor whether it actually lies at the position of the environment detail on which it is superimposed or in front of or behind that detail along the camera's line of sight.
In addition, tomography methods are known which make it possible to determine the positions of radioactive sources in a three-dimensional environment; the cameras (or, more generally, the means of taking two-dimensional images) are moved around the object and the information read from the images is combined in order to derive therefrom the position of the sources. This amounts to inverting, directly or indirectly, a system of equations expressing the fact that the radiation received by each image point is the sum of the radiation emitted along the line of projection or sight of the camera which ends at this point. However, it is necessary to know the position of the camera each time an image is taken, which is not always possible in the situations envisaged here, since the robots are not always sufficiently precise, nor even provided with position encoders which indicate where they have arrived.
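The inversion just described can be made concrete with a toy example. The following is a minimal numerical sketch, not taken from the patent: a hypothetical system matrix A encodes which voxels each detector reading sees, and recovering the source distribution means solving the linear system b = Ax, which is only possible when the camera pose behind each row of A is known.

```python
# Minimal numerical sketch (not the patent's method): tomographic recovery as
# a linear inverse problem. Reading b_i of a detector pixel is modelled as the
# sum of the activities x_j of the voxels crossed by that pixel's line of
# sight, so b = A @ x. Building A requires the camera pose for every shot,
# which is precisely the information that may be missing here.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_readings = 16, 40

# Hypothetical system matrix: row i flags the voxels seen by reading i.
A = (rng.random((n_readings, n_voxels)) < 0.25).astype(float)

x_true = np.zeros(n_voxels)
x_true[[3, 11]] = 5.0            # two point sources in the voxel grid
b = A @ x_true                   # noiseless pixel readings

# Plain least squares keeps the sketch short; a physical solver would also
# impose non-negativity on the recovered activities.
x_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x_est, 2))        # peaks near voxels 3 and 11
```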
Finally, mention must be made of international patent application WO 96/20421, which describes a double method of tomography and superimposition of the two three-dimensional images thus obtained, one illustrating the visible details of the object examined and the other depicting a view of the object by X-rays or the like. However, the two images are calculated in the same way and separately (except for correction calculations applied for the effects of distortion, enlargement, etc. produced by each of the photographic means); they are placed in a relationship of equality and have the same importance.
This patent therefore does not suggest using a visual image of the environment to assist in determining the positions of point radiation sources in that environment, without it being necessary to have recourse to the conditions required for obtaining tomographic images.
SUMMARY OF THE INVENTION
The object of the invention is therefore to make it possible to completely and precisely locate radiation sources, radioactive or others, in a three-dimensional environment.
The essential idea of the invention is that a three-dimensional model of the environment is used, established in advance by taking visual images, on which there are placed the sources registered on other images, which are correlated with the visual images.
The purpose of the model is therefore not only to provide a graphical representation of the positions of the sources in the environment, but particularly to help to determine these positions.
In its most general form, the invention thus concerns a method for the three-dimensional mapping of sources of radiation in an environment, comprising a first taking of a visual image of the environment and a first taking of an image of the sources, characterised in that it comprises a second taking of a visual image of the environment and a second taking of an image of the sources; the establishment of a visual three-dimensional model of the environment by searching for and identifying homologous elements of the visual images and then by calculating the locations of these homologous elements; the location, in the model of the environment, of projection lines leading from the sources to the images of the sources; and the calculation of the positions of the points of intersection of the projection lines in the model.
A device for implementing this method comprises a means of taking images of the radiation and a pair of means of taking visual images of an environment of the radiation sources, the means of taking visual images being oriented in directions such that they have all or part of their field of vision in common and mounted, together with the means of taking images of the radiation, on a common rigid support which is non-deformable but adjustable; it further comprises photogrammetry means able to reconstitute a visual three-dimensional model of the environment from the visual images and to superimpose on that visual model a three-dimensional model of the radiation sources obtained from the images of the radiation.
BRIEF DESCRIPTION OF THE DRAWINGS
Concrete embodiments of the invention, given by way of illustration and applied to the detection of gamma radiation sources, will now be described by means of the following figures:
FIG. 1 is a general view of the mechanical means of the device,
FIGS. 2 and 3 are two details of the device,
FIG. 4 illustrates the mapping method,
FIG. 5 illustrates the formation of the images,
FIG. 6 depicts a calibration sight,
FIG. 7 shows the geometric definition of certain calibration parameters for the device,
FIGS. 8 and 9 are two flow diagrams setting out the methods of calibrating and servicing the device, and
FIG. 10 is a diagram of the means, notably computer means, serving the device.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Reference is now made to the figures. The first shows the general appearance of the device of the invention: a frame 1 comprises a central housing 2 from which extend two lateral arms 3 pointing in opposite directions and an upper appendage 4. The central housing 2 contains an alveolus, open towards the front, intended to house a gamma camera 6. Video cameras 7 are mounted at the ends of the lateral arms 3. Finally, a spotlight 8 is mounted at the top of the upper appendage 4.
The video cameras 7 are screwed onto rotating plates 9, each terminating (see FIG. 2) in a lower journal 10 engaged in a cup 11 in a ring 12 welded to the end of the associated arm 3. The ring 12 receives a lateral locking screw 13, the end of which emerges in the cup 11 and clamps the lower journal 10, holding it in place at the required orientation. Finally, a fixing screw 14 is engaged in a thread 15 in the plate 9 coaxial with the lower journal 10; this screw 14 passes through the lateral arm 3 and the ring 12 and holds the assembly clamped. This arrangement makes it possible, as will become apparent, to orient the video cameras 7 as required in azimuth within the same plane, in order to give them the desired angle of convergence.
The central housing 2 essentially comprises (see FIGS. 1 and 3) two opposite lateral walls 16 provided with a recess 17 for receiving a journal 18 supporting the gamma camera 6. Screws 19 are engaged through the lateral walls 16 and the journals 18 and screwed into the gamma camera 6 in order to clamp it in the required position: when the screws 19 are loosened, the camera can be rotated about the horizontal axis defined by the journals 18 and its orientation in elevation thereby adjusted. The frame 1 is mounted on a base 20 which can be designed to rotate and which is fixed to a support, not shown, such as a robot arm, according to the application.
The method of locating radioactive sources can be described fairly simply by means of FIG. 4: the device, brought to a suitable distance from the environment to be explored, takes a first series of images of it by means of the video cameras 7 and the gamma camera 6, without moving. The two visual images 25 and 26 of the environment thus obtained depict substantially the same subject from different angles of view. An automatic comparison of these visual images 25 and 26, consisting of identifying and matching the homologous points of the images, that is to say the points representing the same noteworthy detail of the environment, makes it possible to deduce the position of those points in the environment with respect to the device, including their distance. The environment can therefore be modelled in three dimensions and the position of the device in the environment determined. Seeking the homologous points on the two visual images 25 and 26 is carried out by one of the specialist software packages which are now commercially available.
There are even software packages capable of directly distinguishing complete details on an image, by shape recognition combined with image correlation, and of finding them again on an analogous image.
When a pair of homologous points is identified on the visual images 25 and 26, two lines are derived therefrom, ending at the video cameras 7 and through which the real point in the environment is projected onto the visual images 25 and 26. The separation and angle of convergence of the video cameras 7 being known, an elementary triangulation calculation gives the position of intersection of the projection lines, that is to say the position of the point in the environment.
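As an illustration of that elementary triangulation calculation, here is a minimal sketch; all coordinates and directions are made-up values, not taken from the patent. With noisy pixel measurements the two projection rays are generally skew, so the scene point is estimated as the midpoint of their common perpendicular.

```python
# Sketch of the elementary triangulation: the scene point is estimated as the
# midpoint of the common perpendicular of the two projection rays, which are
# generally skew once pixel noise enters. All numbers below are invented.
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    c = d1 @ d2                          # cosine of the convergence angle
    w = p2 - p1                          # baseline vector between the cameras
    denom = 1.0 - c * c                  # zero only for parallel rays
    t = (w @ d1 - (w @ d2) * c) / denom
    s = ((w @ d1) * c - w @ d2) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Cameras 0.5 m apart, converging rays: the point comes out at (0.2, 0, 2).
P1, P2 = np.array([0.0, 0.0, 0.0]), np.array([0.5, 0.0, 0.0])
print(triangulate(P1, np.array([0.1, 0.0, 1.0]), P2, np.array([-0.15, 0.0, 1.0])))
```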
An image 27 of the radioactive sources is taken by the gamma camera 6. However, a complete determination of the position of the sources requires taking another source image 28, obtained after moving the device whilst keeping it aimed at the same environment, and comparing the two source images 27 and 28 in order to assess the distance of the sources from the gamma camera 6.
A more detailed description of the method will now be given.
The work of locating the points of the visual and radioactive environment from the images 25 to 28 is effected by photogrammetry software in conjunction with triangulation calculations, but a preliminary calibration must first be undertaken in order to determine the external and internal parameters of the cameras 6 and 7, that is to say their relative positions and orientations on the one hand, and the way in which they render the environment on the images they produce on the other hand.
It is first of all necessary to know the internal parameters. The gamma camera 6 and the video cameras 7 can each be represented (see FIG. 5) by an image plane 30, on which the image 25, 26, 27 or 28 is formed, and an optical centre P in front of this image plane 30, through which pass all the rays 32 which impinge on the image plane 30. The optical centre is formed by the diaphragm for the video cameras 7, and for the gamma camera 6 by a collimator 33 preceding a pinhole enclosure 34, at the end of which the image plane 30 is situated.
An object 35 at which a camera is aimed appears on its image plane in the form of an object image 36, which may be affected by distortions; but if the object 35 is a sight of known form, and the position of the camera 6 or 7, and in particular of the image plane 30 and of the optical centre P, is calculated with respect to the object 35, it is possible to establish a correspondence between each point 37 of the object 35 and the point 38 by which it is represented on the image plane 30, by determining the ray 32 which connects them and which passes through the optical centre P; this ray 32 can be completely defined by the coordinates of the optical centre and of the point 37. The calibration consists precisely of drawing up a look-up table between each point 38 of the image taken by the camera 6 or 7 and the direction of sight (the ray 32) associated with this point and passing through the optical centre P. This table is immutable for given camera settings and serves to determine the directions of the points of unknown objects whose images are subsequently taken. The object 35 of known shape can be the sight illustrated in FIG. 6, composed of a lattice of intersecting but well-separated bars carrying points 39 which are identifiable with precision and which enable the photogrammetric software to find the noteworthy points in the image easily and to identify them with the respective points on the sight.
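A hedged sketch of what such a look-up table can look like in software follows; the image size and focal length are illustrative, and an ideal pinhole model stands in for the distortion-corrected directions that a real calibration against the sight of FIG. 6 would measure.

```python
# Hedged sketch of the calibration look-up table: one unit sight direction per
# pixel. A real table would be filled from images of the known sight (FIG. 6),
# correcting distortion; an ideal pinhole model stands in for that data here.
import numpy as np

W, H, f = 640, 480, 800.0                       # pixels; f in pixel units (assumed)
u, v = np.meshgrid(np.arange(W), np.arange(H))  # pixel coordinate grids
lut = np.dstack([u - W / 2.0, v - H / 2.0, np.full((H, W), f)])
lut /= np.linalg.norm(lut, axis=2, keepdims=True)   # H x W x 3 unit directions

def sight_direction(px, py):
    """Direction of the ray 32 through the optical centre P for pixel (px, py)."""
    return lut[py, px]

print(sight_direction(320, 240))   # central pixel: straight along the axis
```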
It will then be understood, returning to FIG. 4, that any point 40 on an object 41 in an environment which is unknown at the start and aimed at by the two video cameras 7 will appear on the visual images 25 and 26 as two points 42 and 43, whose positions on these images make it possible to derive the directions of the rays 44 and 45 from the optical centres (denoted here P1 and P2) of the two video cameras 7. The intersection of the rays 44 and 45 is then calculated, which gives the position of the point 40 with respect to the video cameras 7. Generalising this method to all the pairs of homologous points on the visual images 25 and 26 makes it possible to find the shape and position of the object 41 in three-dimensional space.
If a point 100 is a source, it also appears on the source image 27 of the gamma camera 6, as a point 106. The position of the cameras 6 and 7 with respect to the object makes it possible per se to know the direction of the point 100, which is situated on a ray 107. However, it is not possible to state with certainty that the origin of the point 106 is indeed the point 100 rather than another point on the ray 107. This is why the second source image 28 must be taken, in order to give a second ray 108 leading from the new position of the optical centre P3 of the gamma camera 6 to the point 100, which can then be identified as the source by calculating the intersection of the rays 107 and 108. It is also necessary to know the distance 109 between the two successive positions of the optical centre P3 of the gamma camera 6, that is to say the base of the triangulation undertaken on the rays 107 and 108, as well as the angle of rotation of the device between the two shots, in order to determine the position of the point 100 at their intersection. These two items of information can be obtained by a direct measurement of the movements of the device if it is carried by a robot arm whose articulations are provided with movement encoders; otherwise, an exploration of the environment and of the object 101 can be recommenced using new visual images 49 and 50, taken by the video cameras 7 at the second position of the device, in order to calculate that position with respect to the object 41.
The method then comprises a synthesis of the two models of the object 101, which can likewise be undertaken by the software for seeking and identifying noteworthy homologous points of these models, in order to evaluate the differences in location and orientation of the video cameras 7 between the two shooting phases. When this evaluation is completed, it is possible to calculate the positions of the two projection lines 107 and 108 of the source in one of the models, and therefore the position of their point of intersection 100 in that model.
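The synthesis step amounts to estimating the rigid displacement of the device between the two shooting phases from homologous model points. The patent does not spell out the algorithm; the sketch below uses the standard SVD-based (Kabsch) least-squares solution as one plausible way to obtain the rotation and translation that carry the second model onto the first.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares R, t with dst_i ~ R @ src_i + t (Kabsch / Procrustes).

    src, dst: (N, 3) arrays of homologous points taken from the two
    environment models. This stands in for the patent's model-matching
    software, whose internals are not described.
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T   # proper rotation, no reflection
    return R, cd - R @ cs

# Once R and t are known, the second projection line (ray 108) can be mapped
# into the first model's frame (origin R @ p + t, direction R @ d) and
# intersected with ray 107 exactly as in the triangulation sketch above.
```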
Another variant of the method consists of exploiting a portion of the gamma image 28 which represents the environment 41: the gamma images 28 obtained by the usual gamma cameras are in fact sensitive also to visible light, which means that an image of the environment 41 is superimposed on the image of the sources. This image of the environment is often too tenuous and fuzzy to provide a suitable model of the environment 41 (although exceptions may be envisaged, so that the invention could then be applied with a single camera, the visual images all being able to be superimposed on gamma images), but it can be correlated with the model already obtained by the video cameras 7, by means of the software already mentioned, in order to calculate here also the positions of the lines 107 and 108 of projection of the source in the model.
The method can, in general terms, be implemented with a single visual camera 7, provided that a measurement of the shooting positions is available to serve as a base for the triangulations used to construct the model of the visual environment, or that other information is available, a few examples of which are given below.
It is however preferred to apply the invention with the device described in detail, which comprises three cameras in total, since the model of the environment can be constructed much more quickly, without moving the device between two shots and using more effective software for performing the triangulation calculations.
It makes it possible to work “in real time” if, for example, it is mounted on a robot arm that has to manipulate the sources: the position of the sources is then calculated whilst the arm advances, and the two shooting times correspond to two phases of the advance of the arm. There is also the certainty that the two visual images 25 and 26, separated by a known base, will be similar and that seeking homologous points will almost always be fruitful.
The rules for determining the external parameters of the device will now be given; this determination is made before using the device according to FIG. 4 and immediately follows the determination, explained above, of the internal parameters of the device; it therefore constitutes a second part of the calibration of the device.
The modelling of the object 41, effected on the visual images 25 and 26, also required a triangulation and therefore a known triangulation base, which corresponded to the distance 51 between the video cameras 7. One variant consists of using multiple positions of the device, coupled on the one hand with known length information issuing from a plane or a reference placed in the scene, and on the other hand with bundle-adjustment calculation software. It was also necessary to know the position of the optical centre P3 of the gamma camera 6 in order to determine the position of the ray 47. These external parameters of the device can be expressed as a file of the coordinates of six points, as shown by FIG. 7: the three optical centres P1, P2 and P3 of the cameras 6 and 7, and three points aligned with these respective optical centres on the central axes of sight of the cameras 6 and 7; the latter points are numbered P4, P5 and P6 and can be situated at any distances, identical or not, from the points P1 to P3 with which they are respectively associated. In addition, there is no constraint on the relative orientations and positions of the central axes of sight, which may or may not intersect, although the axes of the video cameras 7 are supposed to intersect in the embodiment actually proposed; it is thus possible to adjust, without any particular constraint, the positions of the cameras 6 and 7 on the lateral arms 3 and the frame 1. The external parameters of the cameras can therefore be summarised in the following table:
x(P1) y(P1) z(P1)
x(P2) y(P2) z(P2)
x(P3) y(P3) z(P3)
x(P4) y(P4) z(P4)
x(P5) y(P5) z(P5)
x(P6) y(P6) z(P6)
where x, y and z designate the Cartesian coordinates of the point P under consideration. The reference frame for measuring the coordinates can be chosen as required, for example with its origin at P1, the x axis passing through P2 and the central axis of sight P1P4 included in the plane of the x and y axes. Under these circumstances, it is possible to write x(P1)=y(P1)=z(P1)=y(P2)=z(P2)=z(P4)=0. These constraints fix the seven degrees of freedom of the device, that is to say three rotations, three translations and one base distance. All these coordinates can be calculated by calibration triangulations undertaken by causing the device to turn about the sight of FIG. 6 and placing it at different positions in order to take series of shots: the distances from the optical centres P1, P2 and P3 of the cameras 6 and 7 to the sight are then known, and as soon as a noteworthy point of the sight is registered on the images of the cameras 6 and 7, the directions of the rays which connect it to the optical centres P1, P2 and P3 are determined by means of the look-up table, and finally the relative positions of the optical centres P1, P2 and P3, and then suitable positions for the points P4, P5 and P6. The calculations of the positions of the points P1 to P6 are undertaken from several noteworthy positions of the sight in order to have more numerous data available, of which the average is finally taken.
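For illustration, the external-parameter file can be as simple as a six-row coordinate table with the zeroed coordinates of the chosen reference frame built in. All numeric values and the file name below are invented, not calibration results.

```python
import numpy as np

# Hedged sketch of the six-point external-parameter file (FIG. 7). The frame
# choice described in the text zeroes the coordinates x(P1), y(P1), z(P1),
# y(P2), z(P2) and z(P4); the base distance P1P2 fixes the scale. Every
# numeric value below is invented for illustration only.
base = 0.5   # distance 51 between the video cameras, in metres (assumed)
P = np.array([
    [0.0,  0.0,   0.0 ],   # P1: optical centre of the first video camera (origin)
    [base, 0.0,   0.0 ],   # P2: optical centre of the second video camera, on x
    [0.25, 0.10, -0.05],   # P3: optical centre of the gamma camera (unconstrained)
    [0.30, 0.40,  0.0 ],   # P4: on the sight axis of P1, in the xy plane (z = 0)
    [0.20, 0.40,  0.10],   # P5: on the sight axis of P2
    [0.25, 0.50,  0.15],   # P6: on the sight axis of P3
])
np.savetxt("external_parameters.txt", P, fmt="%.4f")   # hypothetical file name
```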
FIGS. 8 and 9 are flow diagrams which set out the steps undertaken in order to calibrate the device and then to use it for effecting a mapping. A complete diagram of the operating system of the device is finally given in FIG. 10: it comprises a first shooting module 52, connected to the gamma camera 6, which records its images; a second shooting module 53, connected to the video cameras 7, which records their images; a module 54 for seeking homologous points, connected to the second shooting module 53, which seeks the homologous points, corresponding to one and the same point on the object sighted, present on the images of the two video cameras 7; a photogrammetry module 55, which essentially establishes the directions of the points of the object aimed at according to the positions of the images of these points on the views; a modelling module 56, which calculates the positions of the points of the object with respect to the device; a display module 57; and a control module 58 responsible for running the previous modules, managing their relationships and triggering the shots. All these modules can in reality be grouped together on the same computer and be embodied in particular by a software package.
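Purely as an illustration of how the FIG. 10 modules might be grouped on one computer, here is a hypothetical software skeleton; every name and signature in it is an assumption, since the patent only describes the modules' roles and relationships.

```python
# Hypothetical skeleton for the FIG. 10 operating system; module numbers map
# to the text, but all names and signatures here are assumptions.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ControlModule:                         # module 58
    gamma_images: List = field(default_factory=list)   # filled by module 52
    video_images: List = field(default_factory=list)   # filled by module 53

    def run_mapping(self,
                    take_gamma: Callable,    # shooting module 52
                    take_video: Callable,    # shooting module 53
                    match: Callable,         # homologous-point module 54
                    directions: Callable,    # photogrammetry module 55
                    model: Callable,         # modelling module 56
                    display: Callable):      # display module 57
        for _phase in range(2):              # the two shooting positions
            self.gamma_images.append(take_gamma())
            self.video_images.append(take_video())   # one pair of images 25, 26
        pairs = match(self.video_images)     # homologous points of the pairs
        rays = directions(pairs)             # sight directions through P1, P2
        scene = model(rays, self.gamma_images)   # 3D points and source positions
        display(scene)
```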

Claims (9)

What is claimed is:
1. A method for mapping sources of radiation in a three-dimensional environment, comprising:
providing an assembly comprising a support, means for taking visual images, and means for taking images of the sources, said means for taking visual images and said means for taking images of the sources being secured together to the support;
determining relationships between respective positions and orientations of said means for taking visual images and said means for taking images of the sources;
taking at least two visual images of the environment at different angles of view with said assembly displaced at one imaging position and at least two images of the sources with said assembly displaced to at least two imaging positions;
establishing a three-dimensional model of the environment from said at least two visual images, identifying homologous elements in the at least two visual images, and then computing locations of said homologous elements;
determining lines of projection of the sources leading to the means for taking images of the sources for the at least two imaging positions;
locating said lines of projection in said three-dimensional model based on said relationships; and
computing positions of intersection points of said lines of projection in said three-dimensional model, which are positions of the sources.
2. A method according to claim 1, wherein said three-dimensional model of the environment is established using measurements of translation and rotation displacements of said assembly between said at least two imaging positions.
3. A method according to claim 1, further comprising:
a preliminary calibration comprising correcting any distortion in images by establishing a look-up table between points on the images and directions of sight of a known object at a given position of said means for taking visual images and said means for taking images of the sources.
4. A method according to claim 1, further comprising:
a preliminary calibration comprising assessing positions of optical centers of said means for taking visual images and said means for taking images of the sources with respect to a known object and assessing directions of central axes of sight of said means for taking visual images and said means for taking images of the sources.
5. A method for mapping sources of radiation in a three-dimensional environment, comprising:
providing an assembly comprising a support, two means for taking visual images, and one means for taking images of the sources, said two means for taking visual images being oriented in directions such that they have at least a part of their field of vision in common, said two means for taking visual images and said one means for taking images of the sources being secured together to the support;
determining relationships between respective positions and orientations of said two means for taking visual images and said one means for taking images of the sources, and relationships between respective positions and orientations of said two means for taking visual images;
taking at least two visual images of the environment with each of said two means for taking visual images with said assembly displaced at least at two imaging positions and at least two images of the sources with said assembly displaced to the respective two imaging positions;
establishing three-dimensional models of the environment from said at least two visual images for each one of said two means for taking visual images, identifying homologous elements in the visual images, and then computing locations of said homologous elements for each of said imaging positions;
determining lines of projection of the sources leading to said one means for taking images of the sources for the respective two imaging positions;
locating in each three-dimensional model said lines of projection determined for a same imaging position by using said relationships;
combining the three-dimensional models into an overall model and locating said lines of projection in said overall model; and
computing positions of intersection points of said lines of projection in said overall model, which are positions of the sources.
6. A method according to claim 5, wherein one of said two means for taking visual images is said means for taking images of the sources.
7. A method according to claim 5, wherein said overall model of the environment is established using measurements of translation and rotation displacements of said assembly between said imaging positions.
8. A method according to claim 5, further comprising:
a preliminary calibration comprising correcting any distortion in images by establishing a look-up table between points on the images and directions of sight of a known object at a given position of said two means for taking visual images and said one means for taking images of the sources.
9. A method according to claim 5, further comprising:
a preliminary calibration comprising assessing positions of optical centers of said two means for taking visual images and said one means for taking images of the sources with respect to a known object and assessing directions of central axes of sight of said two means for taking visual images and said one means for taking images of the sources.
US09/355,956 1997-02-17 1998-02-16 Method and device for mapping radiation sources Expired - Fee Related US6782123B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR9701809A FR2759791B1 (en) 1997-02-17 1997-02-17 RADIATION SOURCE MAPPING DEVICE
FR9701809 1997-02-17
PCT/FR1998/000293 WO1998036321A1 (en) 1997-02-17 1998-02-16 Method and device for mapping radiation sources

Publications (1)

Publication Number Publication Date
US6782123B1 true US6782123B1 (en) 2004-08-24

Family

ID=9503787

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/355,956 Expired - Fee Related US6782123B1 (en) 1997-02-17 1998-02-16 Method and device for mapping radiation sources

Country Status (8)

Country Link
US (1) US6782123B1 (en)
EP (1) EP0958527B1 (en)
JP (1) JP4312833B2 (en)
DE (1) DE69808431T2 (en)
FR (1) FR2759791B1 (en)
RU (1) RU2204149C2 (en)
UA (1) UA66784C2 (en)
WO (1) WO1998036321A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050018897A1 (en) * 2003-07-23 2005-01-27 Samsung Electronics, Co., Ltd. Panel inspection apparatus
WO2008001210A2 (en) * 2006-06-29 2008-01-03 Pierre De Hill Survey instrument and a method of surveying
US20080135767A1 (en) * 2004-12-14 2008-06-12 Commissariat A L'energie Atomique Gamma Imagery Device
US20090231595A1 (en) * 2008-03-17 2009-09-17 Michael Petroff Mobile object position, motion and attitude detection in three dimension space
US20100045777A1 (en) * 2006-09-27 2010-02-25 Matthew Paul Mellor Radiation measurement
US20100104064A1 (en) * 2008-10-23 2010-04-29 General Electric Company System and method for threat detection
US20110024611A1 (en) * 2009-07-29 2011-02-03 Ut-Battelle, Llc Calibration method for video and radiation imagers
WO2012131329A3 (en) * 2011-03-31 2013-01-03 Babcock Nuclear Limited Improvements in and relating to methods and systems for investigating radioactive sources in locations
US20140085481A1 (en) * 2012-09-21 2014-03-27 Hitachi Consumer Electronics Co., Ltd. Radiation measurement apparatus and radiation measurement method
WO2015024694A1 (en) * 2013-08-23 2015-02-26 Stmi Societe Des Techniques En Milieu Ionisant 3d topographic and radiological modeling of an environment
US9091628B2 (en) 2012-12-21 2015-07-28 L-3 Communications Security And Detection Systems, Inc. 3D mapping with two orthogonal imaging views
US20170041535A1 (en) * 2015-08-03 2017-02-09 GE Lighting Solutions, LLC Method and system for imaging in a luminaire
WO2018146358A1 (en) 2017-02-10 2018-08-16 Consejo Superior De Investigaciones Cientificas (Csic) System and method for the volumetric and isotopic identification of radiation distribution in radioactive surroundings

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5465383B2 (en) * 2005-02-28 2014-04-09 アドバンスト フューエル リサーチ、インク. High energy particle detection method
RU2481597C1 (en) * 2011-11-02 2013-05-10 Федеральное Бюджетное Учреждение "33 Центральный Научно-Исследовательский Испытательный Институт" Министерства Обороны Российской Федерации Method of determining position of point source of gamma-radiation
JP2013113610A (en) * 2011-11-25 2013-06-10 Hitachi-Ge Nuclear Energy Ltd Method and apparatus for measuring radiation
JP5992351B2 (en) * 2013-03-14 2016-09-14 株式会社東芝 Radiation visualization apparatus and radiation visualization method
JP2015187567A (en) * 2014-03-27 2015-10-29 日立Geニュークリア・エナジー株式会社 radiation measuring device
KR101681130B1 (en) * 2014-07-11 2016-12-12 한국원자력연구원 Symmetrical-type mono-sensor three-dimensional radiation detection and visualization system and method thereof
RU2593523C2 (en) * 2014-12-29 2016-08-10 Федеральное государственное казенное военное образовательное учреждение высшего профессионального образования "Военный учебно-научный центр Военно-воздушных сил "Военно-воздушная академия имени профессора Н.Е. Жуковского и Ю.А. Гагарина" (г. Воронеж) Министерства обороны Российской Федерации Method of determining coordinates of incidence of ammunition
FR3043211B1 (en) * 2015-11-04 2019-05-17 Commissariat A L'energie Atomique Et Aux Energies Alternatives MEASURING DEVICE AND METHOD FOR CARTOGRAPHING AN ENVIRONMENT CARRYING OUT RADIOACTIVE SOURCES
RU2672305C2 (en) * 2016-06-01 2018-11-13 Федеральное государственное бюджетное образовательное учреждение высшего образования "Новосибирский государственный педагогический университет" (ФГБОУ ВО "НГПУ") Overview slit pinhole camera obscura
GB201611506D0 (en) * 2016-06-30 2016-08-17 Create Tech Ltd Radiation imaging apparatus

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4172226A (en) * 1977-03-29 1979-10-23 Saul Rubin Remote radiation detection system
US4727565A (en) * 1983-11-14 1988-02-23 Ericson Bjoern E Method of localization
US4791300A (en) * 1986-08-25 1988-12-13 Qtr Corporation Miniature gamma camera
US5197476A (en) * 1989-03-16 1993-03-30 Christopher Nowacki Locating target in human body
US5227969A (en) * 1988-08-01 1993-07-13 W. L. Systems, Inc. Manipulable three-dimensional projection imaging method
US5349378A (en) * 1992-12-21 1994-09-20 Robotic Vision Systems, Inc. Context independent fusion of range and intensity imagery
WO1997001769A1 (en) * 1995-06-24 1997-01-16 British Nuclear Fuels Plc Apparatus and methods for detecting and/or imaging gamma radiation
US5638461A (en) * 1994-06-09 1997-06-10 Kollmorgen Instrument Corporation Stereoscopic electro-optical system for automated inspection and/or alignment of imaging devices on a production assembly line
US5680474A (en) * 1992-10-27 1997-10-21 Canon Kabushiki Kaisha Corresponding point extraction method for a plurality of images
US5699444A (en) * 1995-03-31 1997-12-16 Synthonics Incorporated Methods and apparatus for using image data to determine camera location and orientation
US5930383A (en) * 1996-09-24 1999-07-27 Netzer; Yishay Depth sensing camera systems and methods
US6483948B1 (en) * 1994-12-23 2002-11-19 Leica Ag Microscope, in particular a stereomicroscope, and a method of superimposing two images

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9302271D0 (en) * 1993-02-05 1993-03-24 Robinson Max The visual presentation of information derived from a 3D image system

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4172226A (en) * 1977-03-29 1979-10-23 Saul Rubin Remote radiation detection system
US4727565A (en) * 1983-11-14 1988-02-23 Ericson Bjoern E Method of localization
US4791300A (en) * 1986-08-25 1988-12-13 Qtr Corporation Miniature gamma camera
US5227969A (en) * 1988-08-01 1993-07-13 W. L. Systems, Inc. Manipulable three-dimensional projection imaging method
US5197476A (en) * 1989-03-16 1993-03-30 Christopher Nowacki Locating target in human body
US5680474A (en) * 1992-10-27 1997-10-21 Canon Kabushiki Kaisha Corresponding point extraction method for a plurality of images
US5349378A (en) * 1992-12-21 1994-09-20 Robotic Vision Systems, Inc. Context independent fusion of range and intensity imagery
US5638461A (en) * 1994-06-09 1997-06-10 Kollmorgen Instrument Corporation Stereoscopic electro-optical system for automated inspection and/or alignment of imaging devices on a production assembly line
US6483948B1 (en) * 1994-12-23 2002-11-19 Leica Ag Microscope, in particular a stereomicroscope, and a method of superimposing two images
US5699444A (en) * 1995-03-31 1997-12-16 Synthonics Incorporated Methods and apparatus for using image data to determine camera location and orientation
US6080989A (en) * 1995-06-24 2000-06-27 British Nuclear Fuels Plc Apparatus and methods for detecting and/or imaging gamma radiation
WO1997001769A1 (en) * 1995-06-24 1997-01-16 British Nuclear Fuels Plc Apparatus and methods for detecting and/or imaging gamma radiation
US5930383A (en) * 1996-09-24 1999-07-27 Netzer; Yishay Depth sensing camera systems and methods

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7110104B2 (en) * 2003-07-23 2006-09-19 Samsung Electronics Co., Ltd. Panel inspection apparatus
US20050018897A1 (en) * 2003-07-23 2005-01-27 Samsung Electronics, Co., Ltd. Panel inspection apparatus
US7772563B2 (en) * 2004-12-14 2010-08-10 Commissariat A L'energie Atomique Gamma imagery device
US20080135767A1 (en) * 2004-12-14 2008-06-12 Commissariat A L'energie Atomique Gamma Imagery Device
CN101080650B (en) * 2004-12-14 2011-09-07 法国原子能委员会 Improved gamma imaging device
WO2008001210A2 (en) * 2006-06-29 2008-01-03 Pierre De Hill Survey instrument and a method of surveying
WO2008001210A3 (en) * 2006-06-29 2008-06-05 Hill Pierre De Survey instrument and a method of surveying
US20100045777A1 (en) * 2006-09-27 2010-02-25 Matthew Paul Mellor Radiation measurement
US8405786B2 (en) * 2006-09-27 2013-03-26 Create Technologies Limited Radiation measurement
US20090231595A1 (en) * 2008-03-17 2009-09-17 Michael Petroff Mobile object position, motion and attitude detection in three dimension space
WO2010048374A3 (en) * 2008-10-23 2011-05-19 Morpho Detection, Inc. System and method for threat detection
US20100104064A1 (en) * 2008-10-23 2010-04-29 General Electric Company System and method for threat detection
US20110024611A1 (en) * 2009-07-29 2011-02-03 Ut-Battelle, Llc Calibration method for video and radiation imagers
US7973276B2 (en) * 2009-07-29 2011-07-05 Ut-Battelle, Llc Calibration method for video and radiation imagers
GB2502501B (en) * 2011-03-31 2019-01-02 Cavendish Nuclear Ltd Improvements in and relating to methods and systems for investigating radioactive sources in locations
WO2012131329A3 (en) * 2011-03-31 2013-01-03 Babcock Nuclear Limited Improvements in and relating to methods and systems for investigating radioactive sources in locations
GB2502501A (en) * 2011-03-31 2013-11-27 Babcock Nuclear Ltd Improvements in and relating to methods and systems for investigating radioactive sources in locations
US20140085481A1 (en) * 2012-09-21 2014-03-27 Hitachi Consumer Electronics Co., Ltd. Radiation measurement apparatus and radiation measurement method
US9374537B2 (en) * 2012-09-21 2016-06-21 Hitachi Aloka Medical, Ltd. Radiation measurement apparatus and radiation measurement method
US9091628B2 (en) 2012-12-21 2015-07-28 L-3 Communications Security And Detection Systems, Inc. 3D mapping with two orthogonal imaging views
WO2015024694A1 (en) * 2013-08-23 2015-02-26 Stmi Societe Des Techniques En Milieu Ionisant 3d topographic and radiological modeling of an environment
US10733330B2 (en) 2013-08-23 2020-08-04 Orano Ds—Démantèlement Et Services 3D topographic and radiological modeling of an environment
US20170041535A1 (en) * 2015-08-03 2017-02-09 GE Lighting Solutions, LLC Method and system for imaging in a luminaire
US9918009B2 (en) * 2015-08-03 2018-03-13 GE Lighting Solutions, LLC Method and system for imaging in a luminaire
WO2018146358A1 (en) 2017-02-10 2018-08-16 Consejo Superior De Investigaciones Cientificas (Csic) System and metod for the volumetric and isotopic identification of radiation distribution in radioactive surroundings
US11022705B2 (en) * 2017-02-10 2021-06-01 Consejo Superior De Investigaciones Cientificas (Csic) System and method for the volumetric and isotopic identification of radiation distribution in radioactive surroundings

Also Published As

Publication number Publication date
EP0958527B1 (en) 2002-10-02
DE69808431T2 (en) 2003-08-28
FR2759791A1 (en) 1998-08-21
UA66784C2 (en) 2004-06-15
DE69808431D1 (en) 2002-11-07
RU2204149C2 (en) 2003-05-10
EP0958527A1 (en) 1999-11-24
JP4312833B2 (en) 2009-08-12
WO1998036321A1 (en) 1998-08-20
JP2001512570A (en) 2001-08-21
FR2759791B1 (en) 1999-04-09

Similar Documents

Publication Publication Date Title
US6782123B1 (en) Method and device for mapping radiation sources
US8699005B2 (en) Indoor surveying apparatus
US6731329B1 (en) Method and an arrangement for determining the spatial coordinates of at least one object point
US6243599B1 (en) Methods, systems and computer program products for photogrammetric sensor position estimation
US7136170B2 (en) Method and device for determining the spatial co-ordinates of an object
US6310644B1 (en) Camera theodolite system
CA2912859C (en) Apparatus and method for three dimensional surface measurement
US10830588B2 Surveying instrument for scanning an object and image acquisition of the object
US20150116691A1 (en) Indoor surveying apparatus and method
Zhang et al. A universal and flexible theodolite-camera system for making accurate measurements over large volumes
EP1672314B1 (en) Method for preparing a stereo image and corresponding system for preparing three-dimensional data
US20180238687A1 (en) Surveying instrument for scanning an object and image acquisition of the object
RU99119906A (en) METHOD AND DEVICE FOR MAPPING SOURCES OF RADIATION
JP2004163271A (en) Noncontact image measuring apparatus
CN111879354A Refined unmanned aerial vehicle measurement system
JP2000205821A (en) Instrument and method for three-dimensional shape measurement
Schneider et al. Combined bundle adjustment of panoramic and central perspective images
El-Sheimy et al. Kinematic positioning in three dimensions using CCD technology
JP3359241B2 (en) Imaging method and apparatus
Beyer Calibration of CCD-cameras for machine vision and robotics
CN114061738B (en) Wind turbine tower drum foundation ring vibration monitoring method based on calibration plate pose calculation
JP2005017288A (en) Calibration device and method for zoom lens, and photographing equipment
US20220375122A1 (en) Surveying device with image evaluator for determining a spatial pose of the target axis
US10885368B2 (en) Six-dimensional smart target
Mohammadi et al. Mounting Calibration of a Multi-View Camera System on a Uav Platform

Legal Events

Date Code Title Description
AS Assignment

Owner name: COMPAGNIE GENERALE DES MATIERES NUCLEAIRES, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUILLON, FRANCOIS;BAUSSART, PHILIPPE;DALANCON, THOMAS;REEL/FRAME:015584/0154

Effective date: 19991122

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20160824