CN105578985B - Object tracking device - Google Patents
- Publication number
- CN105578985B (application CN201480053435.0A)
- Authority
- CN
- China
- Prior art keywords
- imaging unit
- primary
- patient
- view data
- primary imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000003384 imaging method Methods 0.000 claims abstract description 354
- 238000000034 method Methods 0.000 claims description 23
- 230000003287 optical effect Effects 0.000 claims description 23
- 238000012544 monitoring process Methods 0.000 abstract description 8
- 238000012806 monitoring device Methods 0.000 abstract description 4
- 238000004590 computer program Methods 0.000 description 20
- 238000007689 inspection Methods 0.000 description 6
- 230000001419 dependent effect Effects 0.000 description 5
- 238000010586 diagram Methods 0.000 description 5
- 230000005855 radiation Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 238000002059 diagnostic imaging Methods 0.000 description 3
- 238000001356 surgical procedure Methods 0.000 description 3
- 230000001276 controlling effect Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 239000003550 marker Substances 0.000 description 2
- 230000007704 transition Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000037237 body shape Effects 0.000 description 1
- 238000002591 computed tomography Methods 0.000 description 1
- 230000002079 cooperative effect Effects 0.000 description 1
- 238000013480 data collection Methods 0.000 description 1
- 238000002224 dissection Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000006698 induction Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000013152 interventional procedure Methods 0.000 description 1
- 238000002357 laparoscopic surgery Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000007674 radiofrequency ablation Methods 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000002603 single-photon emission computed tomography Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000003595 spectral effect Effects 0.000 description 1
- 230000001225 therapeutic effect Effects 0.000 description 1
- 238000002560 therapeutic procedure Methods 0.000 description 1
- 230000000472 traumatic effect Effects 0.000 description 1
- 238000002604 ultrasonography Methods 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/547—Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/002—Monitoring the patient using a local or closed circuit, e.g. in a room or building
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
- A61B5/0035—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/04—Positioning of patients; Tiltable beds or the like
- A61B6/0407—Supports, e.g. tables or beds, for the body or parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4417—Constructional features of apparatus for radiation diagnosis related to combined acquisition of different diagnostic modalities
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4429—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
- A61B6/4435—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
- A61B6/4441—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/46—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
- A61B2090/3764—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT] with a rotating C-arm having a cone beam emitting source
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/397—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
Abstract
The present invention relates to an object tracking device for a medical imaging system for tracking a predetermined movable object. The predetermined movable object can be a medical instrument and/or a patient. The object tracking device comprises a primary imaging unit and a secondary imaging unit. The primary imaging unit is configured to provide first image data of the body of a patient; it can further be configured to provide image data of the interior of the patient's body. The primary imaging unit is movable between an imaging mode and a park mode. The secondary imaging unit is configured to provide second image data of the body of the patient; it can further be configured to provide image data of the exterior of the patient's body. The object tracking device also comprises a position monitoring device configured to monitor the position of the secondary imaging unit relative to the position of a reference point. The medical instrument and/or the patient are thereby trackable on the basis of the image data of the primary imaging unit in the imaging mode, and on the basis of the image data of the secondary imaging unit in the park mode.
Description
Technical field
The present invention relates to an object tracking device for a medical imaging system, to a medical imaging system, to an object tracking method for a medical imaging system, to a computer program element for controlling such a device, and to a computer-readable medium storing the computer program element.
Background art
In image-guided minimally invasive therapy, guidance based on continuous imaging can be used while a patient treatment is performed in a minimally invasive manner. A known principle for guidance based on continuous imaging involves systems based on a pivotable C-arm, in which video imaging is continuously fused with X-ray imaging.
WO 2013/102827 describes a position determining apparatus for determining the position of an interventional instrument within a patient. Based on a real image of the interventional instrument within the patient, preferably an X-ray image of a second part of the instrument, and a provided position of a first part of the instrument outside the patient, a spatial relationship between the position of the first part outside the patient and the position of the second part within the patient is determined. Once the spatial relationship has been determined, the position of the interventional instrument within the patient can be determined, while the instrument moves within the patient, on the basis of the determined spatial relationship and the determined actual position of the part outside the patient, without acquiring a further real image.
Such a principle continuously requires an X-ray detector with embedded video cameras to be positioned, in its C-arm, above, around and below the anatomy of interest, in order to continuously track the motion of the patient and of the instrument. However, because of its large size and its position above, around and below the patient, a C-arm with embedded video cameras can form an obstruction during the treatment procedure.
Summary of the invention
There may therefore be a need to provide object tracking that is less obstructive during the treatment procedure.
The object of the present invention is solved by the subject matter of the independent claims, wherein further embodiments are incorporated in the dependent claims. It should be noted that the following described aspects of the invention apply equally to the object tracking device, the medical imaging system, the object tracking method, the computer program element and the computer-readable medium.
According to the invention, an object tracking device for a medical imaging system, arranged to track a predetermined movable object, is proposed. The predetermined movable object can be a medical instrument and/or a patient. The object tracking device comprises a primary imaging unit and a secondary imaging unit. The primary imaging unit is configured to provide first image data of the body of a patient; it can further be configured to provide image data of the interior of the patient's body. The primary imaging unit is movable between an imaging mode and a park mode. In the imaging mode it can be close to the patient's body, and in the park mode further away from the patient's body.
The secondary imaging unit is configured to provide second image data of the body of the patient. It can further be configured to provide optical image data of the exterior of the patient's body, for example body shape, vital signs, movement, tissue perfusion, skin properties, etc. However, the secondary imaging unit can also be exclusively or additionally configured to provide image data of the interior of the patient's body.
The system, in particular the primary imaging unit, is further arranged to determine the position of the secondary imaging unit relative to the position of a reference point. The reference point is preferably the position of the primary imaging unit. However, the reference point can also be a point in the operating room whose position relative to the position of the primary imaging unit is known, and which can therefore be used to calculate the position of the secondary imaging unit relative to the primary imaging unit.
The predetermined movable object (such as a medical instrument or the patient) is then trackable on the basis of the image data of the primary imaging unit in the imaging mode, and on the basis of the image data of the secondary imaging unit in the park mode.
Because the primary imaging unit can be moved away from the surgical procedure into the park mode, an obstruction of the treatment procedure by the primary imaging unit is avoided, while the medical instrument or the patient remains trackable.
Specifically, when the primary imaging unit needs to be moved away from the surgical field, i.e. into the park mode, the primary imaging unit being arranged to determine the position of the secondary imaging unit enables a seamless transition of the object tracking from the primary imaging unit to the secondary imaging unit. A particularly efficient object tracking can thus be implemented without any additional global tracking means.
The term "imaging mode" relates to a position in which the primary imaging unit is arranged close to the object, in a suitable position, to provide the first image data of the object.

The term "park mode" relates to a position in which, compared to the imaging mode, the primary imaging unit is further away from the patient's body. In the park mode, the primary imaging unit may not be able to provide the first image data of the object; for example, there is no free line of sight to the surgical field, or the distance or angle is unsuitable for providing the first image data of the object.

The term "reference point" relates to a point in the operating room whose position relative to the position of the primary imaging unit is known, and which can therefore be used to calculate the position of the secondary imaging unit relative to the primary imaging unit. The reference point is preferably the position of the primary imaging unit.

The term "predetermined movable object" relates to a medical instrument and/or a patient.
In an example, the primary imaging unit comprises a 2D and/or 3D X-ray image acquisition system. The secondary imaging unit can comprise at least one camera capable of detecting externally visible body or instrument properties, such as a visible-light or infrared camera. In addition to the X-ray imaging unit, the primary imaging unit can also comprise an additional imaging unit comprising at least one further optical camera, likewise arranged to capture the surgical field on the patient table. In both imaging devices, the optical cameras can be positioned fixed to each other. Both imaging units can deliver continuous/current/live image data, and both imaging units can carry out a three-dimensional reconstruction of, for example, the body.
In a further example, the primary imaging unit is connected to a base with a C-arm, and the secondary imaging unit is connected to a fixture. The secondary imaging unit and/or the fixture is preferably at least temporarily fixedly attached to the object support (which can be a patient table), or to the ceiling, the floor, or anywhere else in the operating room.
In a preferred embodiment, the secondary imaging unit comprises a surgical task light, i.e. a surgical task light known per se in the art, preferably equipped with at least one camera to provide the second image data.
The secondary imaging unit can also be movable, and a movement sensor can therefore be provided to detect a movement of the secondary imaging unit. If a movement of the secondary imaging unit is detected, the user can be informed.
In an example, in the imaging mode, the current image data of the primary imaging unit and of the secondary imaging unit are fused. The result is preferably an X-ray image of the patient or of the medical instrument superimposed with an optical image for tracking. The preferred object tracking device for a medical imaging system thus provides not only a real-time display of patient and/or instrument motion outside the patient, guided via video, and a real-time display of patient and/or instrument motion inside the patient via X-ray, but also a fusion of both.
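The fusion of current primary and secondary image data described above can be illustrated by a simple alpha blend of a registered X-ray frame onto the optical frame. This is a minimal sketch that assumes the registration (warping the X-ray into the optical pixel grid) has already been performed; the names, image sizes and weights are illustrative.

```python
import numpy as np

def fuse(optical, xray_registered, alpha=0.5):
    """Alpha-blend a registered X-ray frame onto an optical frame.
    Both images are float arrays in [0, 1] on the same pixel grid."""
    return (1.0 - alpha) * optical + alpha * xray_registered

optical = np.full((4, 4), 0.8)                   # bright optical view
xray = np.zeros((4, 4)); xray[1:3, 1:3] = 1.0    # instrument visible in X-ray
fused = fuse(optical, xray, alpha=0.4)
print(round(float(fused[0, 0]), 2), round(float(fused[1, 1]), 2))  # → 0.48 0.88
```

The instrument region stands out in the blend while the optical context remains visible, which is the intent of the superimposed display.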
In a further example, in the park mode, only the secondary imaging unit provides current second image data of the exterior of the patient's body. However, this second image data can preferably be fused with previously captured and stored image data of the primary imaging unit. If there is a need for imaging guidance/update/inspection of the first image data, the primary imaging unit is positioned back from the park mode into the imaging mode and acquires new first image data of the interior of the body. The fusion of the first imaging data and the second imaging data is then automatically updated with the new information.
For a preferred fusion of the first image data from, for example, an X-ray system and the second image data from, for example, a video camera, the position of the secondary imaging unit relative to the primary imaging unit needs to be known and is therefore determined and monitored. Preferably, therefore, the position of the secondary imaging unit is monitored relative to the position of the primary imaging unit. The position monitoring can be realized by means of a position sensor monitoring the position of the secondary imaging unit, or it can be derived from the image data generated by the primary and secondary imaging units themselves. Such position monitoring can be based on the detection of characteristics or fiducials in both images, as explained below, or it can also comprise another optical or non-optical position monitoring system, such as an optical shape sensing system, an electromagnetic tracking system, an articulated arm, a radio-frequency tracking system, etc.
In an example, if the positions of the primary imaging unit and the secondary imaging unit are known relative to each other, the image data acquired by one imaging unit can be registered to and merged with the image data acquired by the other imaging unit. The object tracking can be taken over by one imaging unit from the other without a manual calibration of the position of the other imaging unit. No calibration is therefore needed. Preferably, the position of the secondary imaging unit is monitored relative to the position of the primary imaging unit. The image data acquired by the secondary imaging unit can then be registered to and merged with the image data acquired by the first imaging unit, without a time-consuming and complicated calibration of the secondary imaging unit.
According to an example, the primary imaging unit and the secondary imaging unit are configured to determine the position of a determined characteristic or fiducial on the object. The positions of the characteristic or fiducial in the first image data and in the second image data can then be determined relative to each other, from which the position of the secondary imaging unit relative to the primary imaging unit can be derived. In a preferred embodiment, for this purpose, the first optical image data provided by the camera of the additional imaging unit and the second optical image data provided by the camera of the secondary imaging unit are combined.

A characteristic can indeed be a physical feature, such as a fixed end of an instrument or a feature of the patient. A fiducial can be a passive marker that does not actively emit optical radiation towards the imaging units, or it can be an active marker, i.e. a light source emitting radiation towards the imaging units. For example, the fiducial can be an infrared light source, and the cameras in the secondary imaging unit and the additional imaging unit can be infrared-sensitive, in order to acquire infrared images showing the infrared-light fiducial.
The fiducial is preferably arranged on the patient or on the instrument. Preferably, several fiducials are used for determining the position of the secondary imaging unit, for example relative to the primary imaging unit. Moreover, the cameras of both imaging units are preferably aimed at the fiducials. The first optical image data thus acquired shows the position of the fiducial relative to the primary imaging unit, and the second optical image data acquired shows the position of the fiducial relative to the secondary imaging unit. By superimposing the relative positions of the fiducial in the primary image data and the secondary image data, the relative position and orientation of the primary imaging unit and the secondary imaging unit are calculated. In particular, the position of the secondary imaging unit relative to the position of the primary imaging unit is calculated. The same applies if a combination of characteristics and fiducials is used instead of fiducials alone. With this knowledge, the image data from both imaging units (in particular the X-ray image data from the primary imaging unit and the optical image data from the secondary imaging unit) can be fused.

On each update of the primary imaging data and the secondary imaging data, the relative position of the primary imaging unit and the secondary imaging unit is updated. When one (or both) of the imaging units is moved, the relative position of the primary imaging unit and the secondary imaging unit is recalculated, as long as both imaging units maintain a line of sight to the fiducials. If the line of sight of the primary imaging unit is lost, the tracking of the patient and/or instrument is taken over by the secondary imaging unit.
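The calculation of the relative position and orientation of the two imaging units from fiducials visible to both can be sketched as a rigid point-set registration (the Kabsch algorithm, one standard technique for this task); the fiducial coordinates and the transform below are made-up illustrative values, not values from the patent.

```python
import numpy as np

def rigid_transform(A, B):
    """Kabsch: find rotation R and translation t with B ≈ R @ A + t,
    where A, B are 3xN fiducial coordinates in the primary and
    secondary imaging unit frames, respectively."""
    cA, cB = A.mean(axis=1, keepdims=True), B.mean(axis=1, keepdims=True)
    H = (A - cA) @ (B - cB).T
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

# Four hypothetical fiducials on the patient, in the primary unit's frame.
A = np.array([[0.0, 0.1, 0.0, 0.2],
              [0.0, 0.0, 0.1, 0.1],
              [0.0, 0.0, 0.0, 0.05]])
# Same fiducials as the secondary unit sees them: rotated 90° about z,
# then shifted (the unknown relative pose we want to recover).
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([[0.5], [0.2], [1.0]])
B = Rz @ A + t_true

R, t = rigid_transform(A, B)
print(np.allclose(R, Rz), np.allclose(t, t_true))   # → True True
```

Re-running this on each image update gives the continuously recalculated relative pose described above; at least three non-collinear fiducials are needed for a unique solution.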
In a further example, the secondary imaging unit and/or the additional imaging unit comprises a hyperspectral camera that can monitor the exterior of the body in different spectral wavelength bands.

In a further example, the secondary imaging unit and/or the additional imaging unit can track a laparoscope, and the laparoscope view can be extended with X-ray view data obtained from one of the imaging units.
According to a further example, the primary imaging unit tracks at least one first object (for example, a medical instrument) and the secondary imaging unit tracks at least one second object (such as the patient or a part of him/her).
According to the invention, the medical imaging system can also comprise an image acquisition device, an object support (for example in the form of a patient table), a control unit and a display device. A display device arranged, for example, at the secondary imaging unit can comprise a monitor to present the data acquired by the first and/or secondary imaging unit. The medical imaging system, and in particular the secondary imaging unit, can also comprise a surgical light to illuminate the surgical area.
According to the invention, an object tracking method for a medical imaging system for tracking a predetermined movable object (such as a patient or a medical instrument) is also proposed. It comprises the following steps:

a) providing image data of the body of a patient by a primary imaging unit, the primary imaging unit being movable between an imaging mode and a park mode,

b) providing image data of the body of the patient by a secondary imaging unit, and

c) monitoring, by the primary imaging unit, the position of the secondary imaging unit relative to the position of a reference point, and

d) tracking the predetermined movable object in the following manner:

- when the primary imaging unit is in the imaging mode, based on the imaging of the primary imaging unit, and

- when the primary imaging unit is in the park mode, based on the imaging of the secondary imaging unit.
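The mode-dependent tracking of step d) can be sketched as a small handover routine; the class and method names below are illustrative, not part of the claimed method.

```python
from enum import Enum

class Mode(Enum):
    IMAGING = "imaging"   # C-arm close to the patient, X-ray available
    PARK = "park"         # C-arm parked away from the surgical field

class ObjectTracker:
    """Selects the image-data source used for tracking according to the
    current mode of the primary imaging unit (step d of the method)."""
    def __init__(self):
        self.mode = Mode.IMAGING

    def park_primary(self):
        self.mode = Mode.PARK       # C-arm moved away: hand tracking over

    def resume_imaging(self):
        self.mode = Mode.IMAGING    # C-arm back: hand tracking back

    def tracking_source(self):
        return "primary" if self.mode is Mode.IMAGING else "secondary"

tracker = ObjectTracker()
print(tracker.tracking_source())    # → primary
tracker.park_primary()
print(tracker.tracking_source())    # → secondary
tracker.resume_imaging()
print(tracker.tracking_source())    # → primary
```

Because the relative pose of the two units is monitored continuously (step c), this switch can be seamless: no recalibration is needed when the source changes.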
In an example, in the imaging mode, because the position of the secondary imaging unit relative to the primary imaging unit is known, the first image data from the X-ray system and the second image data from the optical camera can be fused. This preferably provides an X-ray image of the patient or of the medical instrument superimposed with an optical image for tracking.
In a further example, in the park mode, when the primary imaging unit is further away from the patient's body, only the secondary imaging unit provides current second image data of the patient's body, preferably optical image data. However, this second image data can still be fused with the previously captured and stored image data of the primary imaging unit. If there is a need for imaging guidance/update/inspection of the first image data, the base with the C-arm and the primary imaging unit is positioned back from the park mode into the imaging mode, and new first image data of the interior of the body is acquired. The fusion of the first imaging data and the second imaging data is then automatically updated with the new information.
In a further example of the invention, an object tracking computer program for tracking a predetermined movable object is proposed, wherein the computer program comprises program code means for causing the object tracking device according to the independent device claim to carry out the steps of the object tracking method according to the dependent method claim, when the computer program is run on a computer controlling the object tracking device.
According to an aspect of the invention, the primary imaging unit (X-ray unit) is supplemented by a secondary imaging unit, and the tracking function is provided by the secondary unit. Because the secondary unit can be made comparatively small, more free working area is made available. Advantageously, the secondary imaging unit is integrated with the surgical task light commonly used during interventional or surgical procedures, so that no extra equipment needs to be provided.
Moreover, it is preferred that the primary imaging unit is provided with an additional imaging unit comprising at least one camera; the additional imaging unit is, for example, integrated in the housing of the detector of the X-ray unit, so that auto-registration of the X-ray image and of the optical image provided by the additional imaging unit is obtained.
In other words, a slave imaging unit or secondary imaging unit is embedded in a slave tracking device, which is attached to the operating table and coupled to a video camera set positioned in the additional imaging unit of the main imaging unit or primary imaging unit. The master set of four video cameras in the additional imaging unit and the two cameras of the secondary imaging unit ensure that the spatial position of the slave imaging unit is established and determined spatially via continuous real-time communication between the sets.
As soon as the C-arm of the main imaging system is considered to obstruct the treatment procedure, tracking is handed over seamlessly to the slave imaging unit, thus allowing the C-arm to be parked temporarily away from the surgical field. As soon as 2D or 3D X-ray imaging needs to be provided, the C-arm can be moved back to the imaging or operating position, and the object tracking can be transferred back from the slave imaging unit to the main imaging unit.
The invention provides both patient and instrument tracking in a minimally invasive treatment environment, with continuous availability of imaging guidance/update/inspection by the closely positioned C-arm.
It shall be understood that the medical imaging system, the object tracking method for a medical imaging system, the computer program element for controlling such a device and the computer readable medium having stored such a computer program element according to the independent claims have similar and/or identical preferred embodiments, in particular as defined in the dependent claims. It shall further be understood that a preferred embodiment of the invention can also be any combination of the dependent claims with the respective independent claim.
These and other aspects of the invention will become apparent from the embodiments described hereinafter and will be elucidated with reference to them.
Brief description of the drawings
Exemplary embodiments of the invention will be described hereinafter with reference to the accompanying drawings:
Fig. 1 shows a schematic illustration of an example of a medical imaging system with two imaging units, wherein both imaging units are provided above, around and below the patient,
Fig. 2 shows a schematic illustration of an example of a medical imaging system, wherein only one unit (the secondary imaging unit) is above the patient and the other (the primary imaging unit) is parked aside,
Fig. 3 shows a schematic illustration of an example similar to the medical imaging system of Fig. 1, wherein fiducials are used to determine the position of the secondary imaging unit relative to the primary imaging unit, and
Fig. 4 shows basic steps of an example of an object tracking method for a medical imaging system to track predetermined movable objects.
Embodiment
Fig. 1 schematically and exemplarily shows an embodiment of a medical imaging system according to the invention. The medical imaging system comprises an object tracking device 1, an object holder in the form of, for example, a patient table 2, a control unit (not shown) and a display device (not shown). The object tracking device 1 according to the invention is used to track a predetermined movable object, for example a patient 3 or a medical instrument. It comprises a base 11 with a C-arm 12 and a primary imaging unit 13, and a fixture 14 with a secondary imaging unit 15.
The base 11 is movable relative to the patient table 2 and is mounted on wheels 19. The base 11 does not need to be attached to the patient table 2. In the position shown in Fig. 1, the C-arm 12 surrounds the patient table 2; in other words, it is arranged above, around and below the patient table 2. The C-arm 12 is rotatable relative to the base 11 and relative to the patient 3. An angular rotation of the C-arm 12 about a first axis A in the direction of arrow X and/or an orbital rotation about a second axis B in the direction of arrow Y is possible.
The primary imaging unit 13 is arranged in the object tracking device 1, in particular in the C-arm 12. It is configured to provide first image data, for example first image data of the interior of the body of the patient, as explained further below. For this purpose, the primary imaging unit 13 comprises a 2D and/or 3D X-ray system in the form of an X-ray detector 20 in the upper branch of the C-arm 12 and an X-ray source 21 in the lower branch of the C-arm 12. The 2D and/or 3D X-ray system can comprise one or more X-ray cameras. The X-ray source 21 and the X-ray detector 20 are controlled by the control unit (not shown). X-rays generated in the X-ray source 21 pass through the patient 3 and the patient table 2, and the transmitted X-rays are detected by the X-ray detector 20. The X-ray detector 20 and the X-ray source 21 are located at the ends of the C-arm 12. They are generally arranged to capture the surgical field on the patient table 2. The C-arm 12 is rotatable relative to the base 11 and relative to the patient 3, allowing the primary imaging unit 13 to provide a real image of a desired region in the patient 3, shown in a desired direction.
The first image data is a three-dimensional image data set, in this embodiment a three-dimensional computed tomography image data set. In other embodiments, the image data set can also be a two-dimensional image data set. Furthermore, the image data set can be an image data set of another imaging modality, for example magnetic resonance, ultrasound, single photon emission computed tomography or positron emission tomography.
An additional imaging unit 23, for example a camera sensitive to ultraviolet light (UV light), infrared light (IR light) and/or light of visible wavelengths, is also attached to the upper branch of the C-arm 12, at the side of the X-ray detector 20. This camera is likewise arranged to capture the surgical field on the patient table 2. Preferably, for example, a set of four cameras is arranged along different sides of the X-ray detector and integrated in its housing.
It must be noted that the C-arm is shown only as an example. Of course, other movable X-ray imaging units can also be provided, for example an X-ray system attached to a rail on, for example, the ceiling, a wall or the floor, or an X-ray system attached to a robot arm.
The fixture 14 with the secondary imaging unit 15 is attached to the patient table 2 and can capture the surgical field on the patient table 2. The secondary imaging unit 15 can also be arranged at other positions in the operating room. Preferably, the secondary imaging unit 15 also comprises a surgical task light.
The fixture 14 and/or the secondary imaging unit 15 can be movable relative to the patient table 2. The secondary imaging unit 15 is configured to provide second image data of the exterior of, for example, the body of the patient, preferably optical image data. For this purpose, it comprises at least one camera that can intermittently or continuously detect body properties visible from the outside. The secondary imaging unit 15 is also able to track instruments. Preferably, for example, the secondary imaging unit 15 comprises a set of two cameras. However, the secondary imaging unit 15 can also be exclusively or additionally configured to provide image data of the interior of the body of the patient.
The control unit (not shown) intermittently or continuously merges the first image data and the second image data and displays them on a monitor (not shown). It thereby provides X-ray imaging superimposed with video imaging, in order to track a predetermined movable object, for example the patient 3 or a medical instrument. The exemplary object tracking device 1 for a medical imaging system thus not only provides a real-time display of patient and/or instrument motion outside the patient via video guidance and a real-time display of patient and/or instrument motion inside the patient via X-ray, but also provides the merging of both.
The term "merging" relates to the integration of the first image data and the second image data into one view.
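As an illustration only, integrating two already-registered images into one view can be as simple as a weighted superposition. The function below is a sketch, not the patent's algorithm; it assumes both images have been resampled onto a common grid using the known relative pose of the two units:

```python
import numpy as np

def merge_views(xray: np.ndarray, optical: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Integrate two registered images into one view by weighted superposition."""
    if xray.shape != optical.shape:
        raise ValueError("register both images onto the same grid first")
    return alpha * xray + (1.0 - alpha) * optical

# Hypothetical example: overlay a uniform X-ray frame onto a dark optical frame.
overlay = merge_views(np.full((2, 2), 1.0), np.zeros((2, 2)), alpha=0.25)
```

In practice the control unit would perform such an integration intermittently or continuously for display on the monitor.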
In order to merge the first image data from the X-ray system and the second image data from the video camera, the position of the secondary imaging unit 15 relative to the primary imaging unit 13 needs to be known and is therefore monitored. In the embodiment shown, the position of the secondary imaging unit 15 is monitored relative to the position of the primary imaging unit 13. In another embodiment, the position of the secondary imaging unit 15 can be monitored relative to a reference point (for example a predetermined fixed point in the operating room) whose position relative to the position of the primary imaging unit 13 is known, and the position of the secondary imaging unit 15 relative to the position of the primary imaging unit 13 can then be calculated and used.
In other words, if the positions and orientations of the local coordinate systems of the primary imaging unit and the secondary imaging unit are known relative to each other, the image data acquired by one imaging unit can be merged with the image data acquired by the other imaging unit. In the object tracking method according to the invention, the position of one imaging unit can be monitored by the other imaging unit, so that no manual calibration of the position is needed. Calibration is therefore not required.
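With both local coordinate systems expressed relative to a common reference point, relating them needs only a chain of rigid transforms. A minimal sketch with 4x4 homogeneous matrices follows; the convention that `T_ref_x` maps points from unit x's frame into the reference frame, and all names, are assumptions for illustration:

```python
import numpy as np

def make_transform(R: np.ndarray, t) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def secondary_in_primary(T_ref_primary: np.ndarray, T_ref_secondary: np.ndarray) -> np.ndarray:
    # Chain the known poses: secondary frame -> reference frame -> primary frame.
    return np.linalg.inv(T_ref_primary) @ T_ref_secondary

T_rp = make_transform(np.eye(3), [1.0, 0.0, 0.0])  # primary unit pose in the room frame
T_rs = make_transform(np.eye(3), [1.0, 2.0, 0.0])  # secondary unit pose in the room frame
T_ps = secondary_in_primary(T_rp, T_rs)            # secondary pose seen from the primary
```

Points imaged by the secondary unit can then be mapped into the primary unit's coordinate system with `T_ps`, which is what makes the merging possible without manual calibration.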
In order to provide an object tracking device 1 for a medical imaging system that does not become an obstacle during the treatment procedure, the base 11 with the C-arm 12 is movable between the imaging mode M1 shown in Fig. 1, close to the patient table 2, and the park mode Mp shown in Fig. 2, further away from the patient table 2.
Accordingly, Fig. 2 shows a schematic illustration of the medical imaging system in a situation in which only the secondary imaging unit 15 is provided above the patient 3, and the base 11 with the C-arm 12 and the primary imaging unit 13 is parked aside. In this park mode, the base 11 with the C-arm 12 does not obstruct the physician and/or the medical staff during the treatment procedure. In the park mode, only the secondary imaging unit 15 provides current second image data of the exterior of the body of the patient. However, this second image data can still be merged with the previously captured and stored image data of the primary imaging unit 13. If imaging guidance/update/inspection of the first image data is needed, the base 11 with the C-arm 12 and the primary imaging unit 13 is positioned back from the park mode to the imaging mode, and new first image data of the interior of the body is acquired. The merging of the first imaging data and the second imaging data is then automatically updated with the new image information of the primary imaging unit 13.
Accordingly, the exemplary object tracking for a medical imaging system to track a predetermined movable object, such as the patient 3 or a medical instrument, comprises the following steps, as indicated in Figs. 1 and 2.
A) Providing image data of the body of the patient and/or of a medical instrument by the primary imaging unit 13 close to the patient body in the imaging mode, as shown in Fig. 1. The primary imaging unit 13 is an X-ray system configured to provide X-ray image data of the interior of the body of the patient. The primary imaging unit 13 is movable between the imaging mode M1 (Fig. 1) and the park mode Mp (Fig. 2) further away from the body of the patient.
B) Providing image data of the body of the patient and/or of a medical instrument by the secondary imaging unit 15. The secondary imaging unit 15 is a photo camera and/or video camera that can detect body properties visible from the outside.
C) Monitoring the position of the secondary imaging unit 15 relative to the position of a reference point, which can be the position of the primary imaging unit 13. The monitoring is done by a position monitoring device, which can be provided by means of fiducials 30, by sensors, or by the primary and secondary imaging units themselves, as detailed below with reference to Fig. 3.
D) Tracking the patient body and/or the medical instrument based on the imaging of the primary imaging unit 13 in the imaging mode M1 (Fig. 1), and based on the imaging of the secondary imaging unit 15 in the park mode Mp (Fig. 2).
In the imaging mode M1 (Fig. 1), because the position of the secondary imaging unit 15 relative to the primary imaging unit 13 is known, the first image data from the X-ray system and the second image data from the photo camera and/or video camera can be merged. This provides X-ray imaging superimposed with photo and/or video imaging to track the patient 3 or a medical instrument.
In the park mode Mp (Fig. 2), when the primary imaging unit 13 is further away from the body of the patient, only the secondary imaging unit 15 provides current second image data of the body of the patient. However, this second image data can still be merged with the previously captured and stored image data of the primary imaging unit 13. If imaging guidance/update/inspection of the first image data becomes necessary, the base 11 with the C-arm 12 and the primary imaging unit 13 is positioned back from the park mode (Fig. 2) to the imaging mode (Fig. 1), and new first image data of the interior of the body is acquired. The merging of the first imaging data and the second imaging data is then automatically updated with the new information.
Fig. 3 shows, in a schematic illustration similar to Fig. 1, a medical imaging system in which fiducials 30 are used to determine the position of the secondary imaging unit 15 relative to the primary imaging unit 13. The fiducials 30 can be passive markers that do not actively emit optical radiation towards the imaging units, or they can be active markers, i.e. light sources that emit radiation towards the imaging units. For example, the fiducials 30 can be infrared light sources, and the imaging units can be infrared-sensitive so as to acquire infrared images showing the infrared-light fiducials 30.
As shown in Fig. 3, preferably four fiducials 30 are used to form a fiducial model with which the position of the secondary imaging unit 15 can be determined reliably. The fiducials 30 are arranged on the patient. Both imaging units 13, 15 are aimed at the fiducials 30. The primary image data thus acquired shows the positions of the fiducials 30 relative to the primary imaging unit 13, and the acquired secondary image data shows the positions of the fiducials 30 relative to the secondary imaging unit 15. By registering the positions of the fiducials 30 in the primary image data and in the secondary image data relative to each other, the position of the secondary imaging unit 15 relative to the position of the primary imaging unit 13 is calculated.
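The registration of corresponding fiducial positions seen by both units amounts to a rigid point-set alignment. One common way to compute it, shown here purely as an illustrative sketch (the patent does not prescribe a particular algorithm), is the Kabsch/Procrustes method:

```python
import numpy as np

def estimate_rigid_pose(fid_secondary: np.ndarray, fid_primary: np.ndarray):
    """Rotation R and translation t with fid_primary ~= R @ fid_secondary + t.

    Rows are corresponding fiducial positions (N x 3); at least four
    non-coplanar fiducials give a reliable, unambiguous pose.
    """
    cs, cp = fid_secondary.mean(axis=0), fid_primary.mean(axis=0)
    H = (fid_secondary - cs).T @ (fid_primary - cp)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ cs
    return R, t

# Hypothetical check: recover a known pose from four non-coplanar fiducials.
pts_secondary = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90 deg about z
t_true = np.array([1., 2., 3.])
R_est, t_est = estimate_rigid_pose(pts_secondary, pts_secondary @ R_true.T + t_true)
```

The recovered transform plays the role of the registered relative pose of the two units in the merging described above.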
Using this knowledge, the image data from both imaging units 13, 15 can be merged, and if the primary imaging unit 13 needs to be moved away from the surgical field, a seamless transition of the object tracking from the primary imaging unit 13 to the secondary imaging unit 15 can be provided.
With each update of the primary and secondary imaging data, the relative position of the primary imaging unit 13 and the secondary imaging unit 15 can be updated. When one (or both) of the imaging units is moved, the relative position of the primary and secondary imaging units 13, 15 is recalculated, as long as both imaging units maintain a line of sight to the fiducials 30.
If the line of sight of the primary imaging unit 13 is lost, for example because the primary imaging unit is moved into the park mode, the tracking of the patient 3 and/or the instrument is taken over by the secondary imaging unit 15. Conversely, if the line of sight of the secondary imaging unit 15 is lost, the tracking of the patient 3 and/or the instrument is taken over by the primary imaging unit 13. The same effect can be achieved if particular characteristics of the object are used instead of the fiducials 30.
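The hand-over rule can be stated compactly. The sketch below reduces the description to boolean line-of-sight flags, which is a simplification of my own:

```python
def select_tracking_unit(primary_has_sight: bool, secondary_has_sight: bool) -> str:
    """Whichever unit keeps line of sight on the fiducials (or on the
    object's particular characteristics) performs the tracking."""
    if primary_has_sight:
        return "primary"
    if secondary_has_sight:
        return "secondary"
    return "tracking lost"

# Primary moved into the park mode and lost sight: the secondary takes over.
print(select_tracking_unit(False, True))  # prints: secondary
```

The symmetric case, the secondary losing sight while the primary keeps it, falls out of the same rule.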
In other words, the slave camera system of the slave imaging unit or secondary imaging unit can comprise two or more cameras rigidly connected to one another. The relation between the cameras and the intrinsic parameters of the cameras are calibrated in a manner similar to that of the master camera system attached to the detector. The slave camera system is attached to the ceiling, the floor or the patient table, or is integrated with the surgical task light, and is aimed at the fiducials 30 on the patient or on an instrument. With the calibrated slave camera system, the pose of the slave camera system relative to the fiducial model can be calculated. The fiducial model is created by the detector camera system from the fiducials attached on the patient or on the instrument. If the pose of the slave camera system relative to the fiducial model is known, the pose of the slave camera system relative to the X-ray system can also be calculated.
If the slave camera system is moved, its pose can be updated, as long as both camera systems have a line of sight to the fiducial model. If the slave camera system is positioned correctly, the detector camera system can be moved until it loses the line of sight to the fiducial model. If the line of sight of the detector camera system is lost, the tracking of the patient and/or the instrument is taken over by the slave camera system. A combination is also possible, for example tracking an instrument with the detector camera system and tracking the patient with the slave camera system.
If tracking is performed by the slave camera system, the slave cameras should not be moved while the detector camera system has no line of sight. It is therefore proposed to add a movement sensor to the slave camera system to detect movement. If movement of the slave camera system is detected, the user can be informed.
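The proposed movement sensor could feed a simple guard of the following kind. This is a sketch under assumed names and an assumed threshold; the patent only states that detected movement should be reported to the user:

```python
from typing import Optional

def movement_warning(motion_magnitude: float,
                     detector_has_sight: bool,
                     threshold: float = 0.1) -> Optional[str]:
    """Warn when the slave camera system moves while the detector camera
    system has no line of sight, i.e. while the slave pose cannot be updated."""
    if motion_magnitude > threshold and not detector_has_sight:
        return "warning: slave camera moved while its pose cannot be updated"
    return None

print(movement_warning(0.5, detector_has_sight=False))
```

While the detector camera system does have sight, movement is harmless because the relative pose is simply recalculated, so no warning is raised in that case.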
The fiducial model must comprise at least four fiducials in order to be able to determine the pose of the slave camera system reliably.
In this way, continuous real-time communication is realized between the preferably four video cameras at the X-ray detector 20 of the primary imaging unit 13 and the preferably two video cameras of the secondary imaging unit 15.
Both the primary imaging unit and the secondary imaging unit 15 can be operated directly and can be controlled by only one of the two imaging units. The controlling imaging unit is preferably the primary imaging unit 13. However, performing an intervention on the body of the patient guided by the secondary imaging unit 15 is also possible.
It is also possible that the primary imaging unit 13 tracks a first object (for example a medical instrument), while the secondary imaging unit 15 tracks a second object (for example the patient 3 or a part of him/her).
Embodiments of the invention can be used in minimally invasive procedures that require tracking an object by X-ray, such as needle biopsy, RF ablation and the like. The invention is contemplated for use in, for example, the following clinical settings: minimally invasive trauma surgery, minimally invasive orthopedic surgery, minimally invasive neurosurgical treatment, minimally invasive therapeutic laparoscopy, minimally invasive endoscopic treatment, minimally invasive gynecology, minimally invasive urology and minimally invasive bronchoscopy procedures.
An imaging system is continuously provided above the surgical/interventional field for the purpose of image-guided therapy. Patient and instrument tracking can be carried out continuously while the primary imaging unit is temporarily moved to its parked position. In addition, depending on the current clinical situation and without any interruption of the treatment, the primary imaging unit can be returned repeatedly to obtain patient and instrument navigation.
Fig. 4 shows a schematic overview of the steps of an object tracking method 100 for a medical imaging system to track a predetermined movable object. As indicated above, the method comprises the following steps, which need not be taken in this order:
- In a first step 102, image data of the body of a patient is provided by a primary imaging unit, the primary imaging unit being movable between an imaging mode and a park mode.
- In a second step 104, image data of the body of the patient is provided by a secondary imaging unit.
- In a third step 106, the position of the secondary imaging unit relative to the position of a reference point is determined by means of the primary imaging unit.
- In a fourth step 108, the predetermined movable object is tracked: i) based on the imaging of the primary imaging unit (13) in the imaging mode, and ii) based on the imaging of the secondary imaging unit (15) in the park mode.
The first step 102 is also referred to as step a), the second step 104 as step b), the third step 106 as step c), and the fourth step 108 as step d).
In other words, the object tracking method 100 for a medical imaging system is a method in which the primary imaging unit or main imaging unit of the medical system takes 3D images of the interior of a body, the main imaging unit determines the position of the secondary imaging unit or slave imaging unit, the cameras of the slave imaging unit take images of the exterior of the body, the medical system uses the position of the slave imaging unit relative to the main imaging unit to register the interior images and the exterior images with each other, the main imaging unit is positioned away from the body, and an intervention performed on the body is guided by the slave imaging unit.
In another exemplary embodiment of the invention, a computer program or a computer program element is provided that is characterized by being adapted to execute, on an appropriate system, the method steps of the method according to one of the preceding embodiments.
The computer program element might therefore be stored on a computer unit, which might also be part of an embodiment of the invention. This computing unit may be adapted to perform or induce the performing of the steps of the method described above. Moreover, it may be adapted to operate the components of the apparatus described above. The computing unit can be adapted to operate automatically and/or to execute the orders of a user. A computer program may be loaded into the working memory of a data processor. The data processor may thus be equipped to carry out the method of the invention.
This exemplary embodiment of the invention covers both a computer program that uses the invention from the very beginning and a computer program that, by means of an update, turns an existing program into a program using the invention.
Further on, the computer program element might be able to provide all necessary steps to fulfil the procedure of an exemplary embodiment of the method as described above.
According to a further exemplary embodiment of the present invention, a computer readable medium, such as a CD-ROM, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section.
A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with, or as part of, other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to a further exemplary embodiment of the present invention, a medium for making a computer program element available for downloading is provided, wherein the computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims, whereas other embodiments are described with reference to device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter, any combination between features relating to different subject matters is also considered to be disclosed with this application. All features can be combined, providing synergetic effects that are more than the simple summation of the features.
While the invention has been illustrated and described in detail in the drawings and the foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations of the disclosed embodiments can be understood and effected by those skilled in the art in practising the claimed invention, from a study of the drawings, the disclosure and the dependent claims.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
Claims (14)
1. A medical imaging system arranged to track a predetermined movable object, comprising:
- a primary imaging unit (13) for providing first image data of a body of a patient, the primary imaging unit being movable between an imaging mode (M1) and a park mode (Mp), and
- a secondary imaging unit (15) for providing second image data of the body of the patient,
wherein the primary imaging unit (13) is configured to track the predetermined movable object based on the first image data when the primary imaging unit is in the imaging mode,
wherein the secondary imaging unit (15) is configured to track the predetermined movable object based on the second image data when the primary imaging unit is in the park mode, and
wherein the primary imaging unit (13) is further arranged to determine a position of the secondary imaging unit (15) relative to a reference position.
2. The medical imaging system according to claim 1, wherein the primary imaging unit (13) comprises:
- an X-ray imaging unit (20), and
- an additional imaging unit (23) comprising at least one camera for providing first optical image data of the body of the patient.
3. The medical imaging system according to claim 2, wherein the secondary imaging unit (15) comprises at least one camera to provide second optical image data as the second image data.
4. The medical imaging system according to claim 3, wherein the additional imaging unit and the secondary imaging unit are configured to detect determined characteristics on an object or positions of fiducials (30), wherein the position of the secondary imaging unit is derived from the relative positions of the determined characteristics or fiducials in the first optical image data and the second optical image data.
5. The medical imaging system according to any one of claims 1-4, wherein the primary imaging unit (13) comprises a position sensor to monitor the position of the secondary imaging unit (15).
6. The medical imaging system according to any one of claims 1-4, wherein the reference position is the position of the primary imaging unit itself.
7. The medical imaging system according to any one of claims 1-4, wherein, when the primary imaging unit (13) is in the imaging mode, current image data of the primary imaging unit (13) and current image data of the secondary imaging unit (15) are configured to be merged.
8. The medical imaging system according to any one of claims 1-4, wherein, when the primary imaging unit (13) is in the park mode, previously captured image data of the primary imaging unit (13) and current image data of the secondary imaging unit (15) are configured to be merged.
9. The medical imaging system according to any one of claims 1-4, wherein the primary imaging unit (13) is configured to track at least a first object, and the secondary imaging unit (15) is configured to track at least a second object.
10. The medical imaging system according to any one of claims 1-4, wherein the secondary imaging unit further comprises a surgical task light.
11. The medical imaging system according to any one of claims 1-4, wherein the secondary imaging unit (15) is at least temporarily fixedly attachable to an object holder.
12. The medical imaging system according to any one of claims 1-4, wherein the primary imaging unit (13) is provided with a movement sensor to detect a movement of the secondary imaging unit (15).
13. An object tracking method (100) for a medical imaging system to track a predetermined movable object, wherein the method comprises the following steps:
a) providing (102) image data of a body of a patient by a primary imaging unit (13), the primary imaging unit being movable between an imaging mode and a park mode,
b) providing (104) image data of the body of the patient by a secondary imaging unit (15),
c) determining (106) a position of the secondary imaging unit (15) relative to a reference position by means of the primary imaging unit (13), and
d) tracking (108) the predetermined movable object:
- in the imaging mode, based on the imaging of the primary imaging unit (13), and
- in the park mode, based on the imaging of the secondary imaging unit (15).
14. An object tracking device for a medical imaging system to track a predetermined movable object, wherein the device comprises:
a module for providing image data of a body of a patient by a primary imaging unit (13), the primary imaging unit being movable between an imaging mode and a park mode,
a module for providing image data of the body of the patient by a secondary imaging unit (15),
a module for determining a position of the secondary imaging unit (15) relative to a reference position by means of the primary imaging unit (13), and
a module for tracking the predetermined movable object:
- in the imaging mode, based on the imaging of the primary imaging unit (13), and
- in the park mode, based on the imaging of the secondary imaging unit (15).
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13198609 | 2013-12-19 | ||
EP13198609.3 | 2013-12-19 | ||
PCT/EP2014/076662 WO2015091015A2 (en) | 2013-12-19 | 2014-12-05 | Object tracking device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105578985A CN105578985A (en) | 2016-05-11 |
CN105578985B true CN105578985B (en) | 2017-09-01 |
Family
ID=49920025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201480053435.0A Active CN105578985B (en) | 2013-12-19 | 2014-12-05 | Object tracking device |
Country Status (6)
Country | Link |
---|---|
US (1) | US10542959B2 (en) |
EP (1) | EP3086734B1 (en) |
JP (1) | JP6118465B2 (en) |
CN (1) | CN105578985B (en) |
RU (1) | RU2687883C2 (en) |
WO (1) | WO2015091015A2 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10886019B2 (en) * | 2016-03-22 | 2021-01-05 | Koninklijke Philips N.V. | Medical image orientation |
CN106137394B (en) * | 2016-06-15 | 2019-09-24 | 苏州铸正机器人有限公司 | A method of obtaining pedicle of vertebral arch standard axle bitmap |
EP3388987A1 (en) * | 2017-04-12 | 2018-10-17 | Deutsche Post AG | Support for the storage and location of objects in a storage facility |
EP3760127B1 (en) | 2017-09-25 | 2024-03-20 | Shanghai United Imaging Healthcare Co., Ltd. | System and method for locating a target subject |
EP3545896A1 (en) * | 2018-03-30 | 2019-10-02 | Koninklijke Philips N.V. | Monitoring of moving objects in an operation room |
JP7102219B2 (en) * | 2018-05-10 | 2022-07-19 | キヤノンメディカルシステムズ株式会社 | Nuclear medicine diagnostic equipment and position correction method |
EP3646790A1 (en) | 2018-10-31 | 2020-05-06 | Koninklijke Philips N.V. | Guidance during x-ray imaging |
JP6978454B2 (en) * | 2019-02-22 | 2021-12-08 | ファナック株式会社 | Object detector, control device and computer program for object detection |
JP7350519B2 (en) * | 2019-05-29 | 2023-09-26 | キヤノン株式会社 | Radiography system, radiography control device, control method thereof, and program |
EP3757940A1 (en) * | 2019-06-26 | 2020-12-30 | Siemens Healthcare GmbH | Determination of a patient's movement during a medical imaging measurement |
CN111292239B (en) * | 2020-01-21 | 2021-03-12 | 天目爱视(北京)科技有限公司 | Three-dimensional model splicing equipment and method |
JP7394645B2 (en) * | 2020-02-05 | 2023-12-08 | 富士フイルム株式会社 | Teacher image generation device, method and program, learning device, method and program, discriminator, and radiation image processing device, method and program |
US11475997B2 (en) * | 2020-02-21 | 2022-10-18 | Shanghai United Imaging Intelligence Co., Ltd. | Systems and methods for automated healthcare services |
DE102020205546A1 (en) * | 2020-04-30 | 2021-11-04 | Siemens Healthcare Gmbh | Monitoring procedure and medical system |
RU203631U1 (en) * | 2020-12-07 | 2021-04-14 | Федеральное государственное автономное образовательное учреждение высшего образования "Севастопольский государственный университет" | PATIENT POSITION TRACKING SYSTEM DURING OPERATIONS USING A ROBOTIC SURGICAL COMPLEX |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6527443B1 (en) * | 1999-04-20 | 2003-03-04 | Brainlab Ag | Process and apparatus for image guided treatment with an integration of X-ray detection and navigation system |
CN1933782A (en) * | 2004-03-23 | 2007-03-21 | 皇家飞利浦电子股份有限公司 | X-ray examination apparatus and method |
WO2007115825A1 (en) * | 2006-04-12 | 2007-10-18 | Nassir Navab | Registration-free augmentation device and method |
CN101623200A (en) * | 2008-07-09 | 2010-01-13 | 西门子公司 | X-ray system |
DE102008050572A1 (en) * | 2008-10-06 | 2010-04-15 | Siemens Aktiengesellschaft | Method for positioning medical imaging device at patient, involves determining current position of patient and imaging device fixed to carrier with navigation system by position markers fitted to patient and also to imaging device or it |
CN102711650A (en) * | 2010-01-13 | 2012-10-03 | 皇家飞利浦电子股份有限公司 | Image integration based registration and navigation for endoscopic surgery |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6405072B1 (en) * | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
US6205347B1 (en) * | 1998-02-27 | 2001-03-20 | Picker International, Inc. | Separate and combined multi-modality diagnostic imaging system |
JP2001137221A (en) * | 1999-11-12 | 2001-05-22 | Ge Medical Systems Global Technology Co Llc | Biplane angiography and ct apparatus |
DE10015815A1 (en) * | 2000-03-30 | 2001-10-11 | Siemens Ag | Image data set generating system for medical diagnostics - superimposes or merges image data obtained from X-ray and ultrasound systems, whose position was determined using navigation system |
JP4737808B2 (en) * | 2000-09-29 | 2011-08-03 | 株式会社東芝 | IVR-CT equipment |
US7505809B2 (en) * | 2003-01-13 | 2009-03-17 | Mediguide Ltd. | Method and system for registering a first image with a second image relative to the body of a patient |
EP1751712A2 (en) * | 2004-05-14 | 2007-02-14 | Philips Intellectual Property & Standards GmbH | Information enhanced image guided interventions |
DE102004046430A1 (en) | 2004-09-24 | 2006-04-06 | Siemens Ag | System for visual situation-based real-time based surgeon support and real-time documentation and archiving of the surgeon's visually perceived support-based impressions during surgery |
US8013607B2 (en) * | 2006-10-31 | 2011-09-06 | Koninklijke Philips Electronics N.V. | Magnetic shielding for a PET detector system |
JP5661264B2 (en) * | 2009-09-03 | 2015-01-28 | 株式会社日立メディコ | X-ray navigation device |
US10426554B2 (en) | 2011-04-29 | 2019-10-01 | The Johns Hopkins University | System and method for tracking and navigation |
WO2013055707A1 (en) * | 2011-10-09 | 2013-04-18 | Clear Guide Medical, Llc | Interventional in-situ image-guidance by fusing ultrasound video |
EP2800534B1 (en) | 2012-01-03 | 2021-04-28 | Koninklijke Philips N.V. | Position determining apparatus |
JP5444422B2 (en) | 2012-07-17 | 2014-03-19 | 株式会社東芝 | Intraoperative medical imaging system |
- 2014-12-05 EP EP14806660.8A patent/EP3086734B1/en active Active
- 2014-12-05 US US14/915,047 patent/US10542959B2/en active Active
- 2014-12-05 RU RU2016114514A patent/RU2687883C2/en not_active IP Right Cessation
- 2014-12-05 JP JP2016525596A patent/JP6118465B2/en active Active
- 2014-12-05 WO PCT/EP2014/076662 patent/WO2015091015A2/en active Application Filing
- 2014-12-05 CN CN201480053435.0A patent/CN105578985B/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10542959B2 (en) | 2020-01-28 |
US20160278731A1 (en) | 2016-09-29 |
EP3086734A2 (en) | 2016-11-02 |
WO2015091015A2 (en) | 2015-06-25 |
RU2687883C2 (en) | 2019-05-16 |
JP2016533795A (en) | 2016-11-04 |
JP6118465B2 (en) | 2017-04-19 |
EP3086734B1 (en) | 2018-02-21 |
RU2016114514A (en) | 2017-10-19 |
WO2015091015A3 (en) | 2015-09-03 |
RU2016114514A3 (en) | 2018-07-20 |
CN105578985A (en) | 2016-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105578985B (en) | Object tracking device | |
CN110831536B (en) | System and method for correcting for a non-commanded instrument roll | |
US20230190244A1 (en) | Biopsy apparatus and system | |
CN110831538B (en) | Image-based airway analysis and mapping | |
CN110913788B (en) | Electromagnetic distortion detection | |
CN110809452B (en) | Electromagnetic field generator alignment | |
KR102558061B1 (en) | A robotic system for navigating the intraluminal tissue network that compensates for physiological noise | |
CN110831537B (en) | Robotic system for determining pose of medical device in lumen network | |
US20200237449A1 (en) | Redundant reciprocal tracking system | |
US20200030038A1 (en) | Optical targeting and visualization of trajectories | |
CN110831535A (en) | Configuring a robotic system for navigation path tracking | |
US20200289207A1 (en) | Method of fluoroscopic surgical registration | |
US20230015717A1 (en) | Anatomical scanning, targeting, and visualization | |
WO2023049528A1 (en) | Anatomical scanning, targeting, and visualization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||