WO2016124539A1 - A system and a method for labeling objects in medical images - Google Patents


Info

Publication number: WO2016124539A1
Authority: WO — WIPO (PCT)
Prior art keywords: lesion, tool, biopsy, sites, image
Application number: PCT/EP2016/052061
Other languages: French (fr)
Inventors: Lu Wang, Zhenwei He, Payal Keswarpu, Sarif Kumar Naik, Subhendu Seth, Vipin Gupta
Original assignee: Koninklijke Philips N.V.
Priority date: 2015-02-04
Filing date: 2016-02-01
Publication date: 2016-08-11
Application filed by Koninklijke Philips N.V.
Publication of WO2016124539A1


Classifications

    • A — HUMAN NECESSITIES
        • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
                    • A61B 1/00002 — Operational features of endoscopes
                        • A61B 1/000094 — characterised by electronic signal processing of image signals during a use of the endoscope, extracting biological structures
                        • A61B 1/0004 — provided with input arrangements for the user, for electronic operation
                        • A61B 1/00043 — provided with output arrangements
    • G — PHYSICS
        • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H — HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H 30/00 — ICT specially adapted for the handling or processing of medical images
                    • G16H 30/20 — for handling medical images, e.g. DICOM, HL7 or PACS
                    • G16H 30/40 — for processing medical images, e.g. editing
                • G16H 15/00 — ICT specially adapted for medical reports, e.g. generation or transmission thereof


Abstract

The invention relates to a system for labeling objects in images. The system comprises a lesion tool for identifying and providing lesion sites from at least one of the images, and a biopsy tool for identifying biopsy sites from the said at least one image and/or the said lesion sites; the said lesion tool and biopsy tool are integrated together with a knowledge base and a Graphical User Interface (GUI) to identify the lesion sites and the biopsy sites therefrom. The lesion sites detected by the lesion tool are automatically compared with the images captured by the biopsy tool. The invention also relates to a method for labeling objects in images, performed by the system of the invention.

Description

A SYSTEM AND A METHOD FOR LABELING OBJECTS IN MEDICAL IMAGES
FIELD OF THE INVENTION
The invention relates to object labeling, more particularly to a system and a method for object labeling in medical images.
BACKGROUND OF THE INVENTION
In a medical procedure such as colposcopy, where biopsy sites are detected and confirmed, capturing and analyzing the images and the associated data is crucial. Performing such examinations digitally makes information management and image data management easier and more efficient, both quantitatively and qualitatively, and allows improved integration with other applications or devices such as the internet, printers, etc.
In a colposcopy procedure, lesion annotation and biopsy marking are essential to the meaningful and successful completion of the procedure. In this respect, lesion annotation is done to mark the areas of abnormality and provide descriptions of those areas. A lesion tool is used to annotate the lesions and the observations thereof, and the biopsy sites are thereafter indicated using a biopsy tool.
The biopsy sites are indicated based on the lesions marked in the lesion tool, which is done manually by the user. Accordingly, the same lesion must be indicated twice, once in each of the two tools, viz. the lesion tool and the biopsy tool. This requires the user to remember the markings of the lesion and reproduce them, at least in the biopsy tool. Moreover, annotating a lesion and providing its description are done independently, which makes these tasks more cumbersome: they must be cross-referenced, and an appropriate description has to be identified and adaptively provided for each marked lesion. Current devices provide only basic image capturing, image browsing and report generation.
Thus, although the procedure described above is digital, the labeling of objects in the medical images is still largely performed manually.
WO 2001/078607 identifies actual biopsy sites by comparing a sequence of images.
Therefore, there is a need for a solution that provides automated labeling of objects in images, in which the annotations and descriptions of the lesions are provided in conjunction and the biopsy sites are identified in relation to the identified lesions. The invention is aimed at providing such a solution and at overcoming the limitations of the above-described procedure.
OBJECTS OF THE INVENTION
It is an object of the invention to provide a system for annotating the lesions;
It is another object of the invention to provide a system for providing description for the annotations made therein for the lesions;
It is yet another object of the invention to provide a system that integrates the lesion tool and the biopsy tool.
A further object of the invention is to provide a method for labeling objects in images using the system of the invention.
SUMMARY OF THE INVENTION
The invention provides a system for labeling objects in images. The system of the invention comprises a lesion tool for identifying and providing lesion sites from at least one of the images, and a biopsy tool for identifying the biopsy sites from the said at least one image and / or the said lesion sites; the said lesion tool and biopsy tool are integrated together with a knowledge base and a Graphical User Interface (GUI) to identify the lesion sites and the biopsy sites therefrom.
In one preferred embodiment of the invention, a biopsy tool is provided which comprises a comparison module for comparing the said lesion sites with the said image and its image data. The biopsy sites are thereby automatically identified by the biopsy tool and affirmed in accordance with the identified lesion sites.
The invention also provides a method for labeling objects in images, performed by the system of the invention. The method of the invention comprises identifying and providing lesion sites from at least one of the images, by a lesion tool, and identifying biopsy sites from the said at least one image and / or the said lesion sites, by the biopsy tool; the said lesion tool and biopsy tool are integrated together with a knowledge base and a Graphical User Interface (GUI) to identify the lesion sites and the biopsy sites therefrom.
BRIEF DESCRIPTION OF THE DRAWINGS
With reference to the accompanying drawings in which:
Fig. 1 shows a system for labeling objects in images, in accordance with the invention;
Fig. 2a illustrates the method of the invention performed automatically;
Fig. 2b illustrates the method of the invention performed using manual inputs from the user; and Fig. 2c illustrates the method of the invention performed in real time.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The invention is further described herein after with reference to a non-exhaustive exemplary embodiment and with reference to Figs. 1 and 2a to 2c.
In Fig. 1, a system (100) for labeling objects in images is shown. The system (100) purports to provide biopsy sites in respect of a colposcopy examination and procedure. The images of the region of interest are captured by the colposcopy hardware (101), which typically comprises a camera with other supporting elements that enable it to capture images and provide them for further analysis and processing. The images may be captured digitally, which allows further integration with other devices to process the captured images and the related image data.
Inputs from the user (102) may be obtained along with the images from the colposcopy hardware (101). Such inputs include, but are not limited to, at least one of: manual drawings of lesion or landmark regions; manual annotation of biopsy sites; annotation of properties such as position, size, grade, opaqueness, etc. of lesions and landmarks; confirmation of computer-recommended lesions / landmarks / biopsy sites; and user modifications of any of the above.
A Graphical User Interface (GUI) (103) is provided to establish interactions and integration amongst one or more of the colposcopy hardware (101), the user (102), the lesion tool (104), the biopsy tool (105) and the knowledge base (106). The interactions between the graphical representation of the images and the image data may be enabled by the GUI (103).
The lesion tool (104) is provided to identify the lesion sites from at least one of the captured images. The graphical representation of the image and the image data are analyzed by the lesion tool (104), and the text description pertaining to the images under analysis is provided by the knowledge base (106).
The process in relation to the knowledge base (106) recognizes the observation terms the user most frequently uses, or uses a vocabulary the user has configured in advance. The process maps the terms the user uses onto properties of the features. Based on this mapping, the automatic description generation can selectively translate the quantitative parameters into a qualitative description. Rule-based reasoning plays a role here: for instance, if the user usually describes the Transformation Zone (TZ) type, then the translated properties include not only the TZ type but also the properties the TZ type depends on, such as the Squamous Columnar Junction (SCJ) margins.
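As an illustration of this rule-based translation, the following minimal sketch in Python expands a user's frequent terms with their dependent properties before rendering them in the user's vocabulary. The names (DEPENDENCY_RULES, VOCABULARY_MAP) and the toy rule set are assumptions; the patent does not specify its data structures.

```python
# Properties the user's preferred terms depend on (per the TZ/SCJ example).
DEPENDENCY_RULES = {
    "tz_type": ["scj_margins"],  # TZ type depends on SCJ margins
}

# Map internal property keys to the user's own vocabulary.
VOCABULARY_MAP = {
    "tz_type": "Transformation Zone (TZ) type",
    "scj_margins": "Squamous Columnar Junction (SCJ) margins",
}

def properties_to_describe(frequent_terms):
    """Expand the user's frequent terms with the properties they depend on."""
    selected = set(frequent_terms)
    for term in frequent_terms:
        selected.update(DEPENDENCY_RULES.get(term, []))
    return [VOCABULARY_MAP[key] for key in sorted(selected)]

print(properties_to_describe(["tz_type"]))
# -> ['Squamous Columnar Junction (SCJ) margins', 'Transformation Zone (TZ) type']
```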
The knowledge base (106) may comprise, in general, at least one of the following process data: text descriptions, information on the medical procedure to which the image labeling pertains, images and their corresponding data, semantic data, procedural data, user inputs, etc. These process data may be stored and / or updated by the lesion tool (104), the biopsy tool (105) or the user (102). The knowledge base (106) may also store one or more of: margins and properties (e.g., size, opacity, thickness, border, etc.) of landmarks and lesions; the temporal change of aceto-whiteness; geographic properties of the detected features; the timestamp of acetic acid application; a taxonomy of the descriptions for cervical features; and a map of the user's vocabulary and the corresponding properties of features.
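A sketch of how such a knowledge-base entry might be structured follows; the record layout and field names are illustrative assumptions drawn from the items listed above, not the patent's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record layout for entries in the knowledge base (106).
@dataclass
class CervicalFeatureRecord:
    margins: List[tuple]        # lesion / landmark margin points
    size_mm: float              # measured size
    opacity: str                # e.g. "dense", "thin"
    acetowhite_samples: List[float] = field(default_factory=list)  # temporal aceto-whiteness
    acetic_acid_timestamp: float = 0.0  # time of acetic acid application
    taxonomy_label: str = ""            # taxonomy label for the cervical feature

record = CervicalFeatureRecord(margins=[(3, 1.0)], size_mm=4.2, opacity="dense")
```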
The lesion tool (104) detects lesion / landmark margins and performs lesion mapping based on the four quadrants of a clock diagram of the image. The result is visualized via the GUI (103). The GUI (103) collects user input, such as confirmed margins, which in turn is fed back to the lesion tool (104). The detected lesion / landmark margins and their properties, with or without user input, are gathered in the lesion tool (104). The lesion tool (104) translates quantitative feature measurements, which include lesion and landmark segmentation, size measurement and lesion mapping, into a qualitative description. Provided with an output from the knowledge base (106), a final description tailored to the user's vocabulary is generated.
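The clock-diagram mapping can be illustrated with a small geometric sketch. The coordinate convention (cervical os at image coordinates (cx, cy), y growing downward) and the quadrant numbering are assumptions made for illustration; the patent does not define the exact geometry used by the lesion tool (104).

```python
import math

def clock_position(x, y, cx, cy):
    """Map a lesion centroid to a clock hour (1-12) and quadrant (1-4)."""
    # Angle measured clockwise from 12 o'clock, as on a clock face.
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360
    hour = round(angle / 30) % 12 or 12   # nearest hour mark; 0 maps to 12
    quadrant = int(angle // 90) + 1       # quadrant 1 spans 12 to 3 o'clock, etc.
    return hour, quadrant

print(clock_position(100, 20, 50, 50))  # upper-right point -> (2, 1)
```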
Lesions and their properties may be obtained either by manual input from the user (102) or from the lesion tool (104), which automatically recognizes lesions. Once the lesion information and properties are provided, the biopsy tool (105) accepts this input and uses the knowledge-based rules to decide the recommended biopsy sites and their properties. The knowledge-based rules make decisions based on the type and grade of the lesions and any other description provided by the user (102). For instance, if the user annotates a lesion 'aceto-white' at 3 o'clock and marks it as dense aceto-white, then the biopsy tool indicates a biopsy site at 3 o'clock accordingly. On the other hand, if the user annotates 'polyp' or 'transformation zone', the biopsy tool does not place any mark, because such a finding does not require a biopsy according to the knowledge rules.
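A minimal sketch of such knowledge-based rules follows, assuming a toy rule table and a hypothetical annotation schema; the patent's actual rule base is not disclosed.

```python
BIOPSY_REQUIRED = {"aceto-white"}               # findings that warrant a biopsy
NO_BIOPSY = {"polyp", "transformation zone"}    # findings the rules skip

def recommend_biopsy_sites(annotations):
    """annotations: dicts like {'finding': 'aceto-white', 'clock': 3, 'grade': 'dense'}."""
    sites = []
    for a in annotations:
        if a["finding"] in BIOPSY_REQUIRED:
            # Recommend a site at the same clock position as the lesion.
            sites.append({"clock": a["clock"], "reason": a["finding"]})
        # Findings in NO_BIOPSY produce no mark, per the knowledge rules.
    return sites

print(recommend_biopsy_sites([
    {"finding": "aceto-white", "clock": 3, "grade": "dense"},
    {"finding": "polyp", "clock": 7, "grade": None},
]))  # -> [{'clock': 3, 'reason': 'aceto-white'}]
```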
On the GUI layer, the biopsy tool (105) can be presented together with the lesion tool (104), either integrated into one clock diagram or as two clock diagrams of the image next to each other. Alternatively, it can be presented next to the video during the biopsy process, wherein lesion properties can be automatically computed by the lesion tool (104). It can serve not only as navigation guidance but also as a reminder.
Also, the biopsy tool (105) computes the difference between the recommended biopsy sites and the operated ones. Knowing the difference, the system (100) reminds the user of potentially overlooked sites. The difference can be computed by comparing the recommended sites with either the user's manual input in the biopsy tool or the actual biopsy sites automatically recognized by the biopsy tool (105).
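Representing each biopsy site by its clock position (an assumption made for illustration), the overlooked-site reminder reduces to a set difference, as in this sketch:

```python
def overlooked_sites(recommended, operated):
    """Return recommended clock positions with no corresponding operated site."""
    return sorted(set(recommended) - set(operated))

# Recommended at 3 and 9 o'clock, but the user biopsied only at 3 o'clock:
print(overlooked_sites({3, 9}, {3}))  # -> [9], so remind the user of 9 o'clock
```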
The lesion tool (104) and the biopsy tool (105) may co-exist with each other in the system (100) or may be integrated therein. The images and the corresponding data may be converted into semantic data by the lesion tool (104) and / or the biopsy tool (105), to provide more information about the images and their association with the process and the process data.
Fig. 2a illustrates one embodiment of the method of the invention wherein the detection of biopsy sites is performed automatically. Here, the user performs the colposcopy examination (201) and the system is adapted to detect the findings of the images and their position, type and grade (202). The user provides the annotation for the lesions (203), and the findings and properties are provided to the system. The system uses the knowledge base to decide on the recommended biopsy sites (204). The biopsy sites recommended by the system and those provided by the user are compared and confirmed (205). The system then uses the knowledge base to generate a description of the findings and biopsy sites, and updates the knowledge base as well (206).
In Fig. 2b, one embodiment of the method of the invention, wherein the detection of biopsy sites is performed using manual inputs from the user, is shown. Here, the user performs the colposcopy examination (201) and provides the markings, findings and other related information such as type, grade, etc. (203). These findings and other property-related information are provided to the system, and the knowledge base is used to decide on the biopsy sites that need to be recommended (204). The biopsy sites recommended by the system and those identified by the user based on the findings and properties are compared by the user (205), and confirmation of such biopsy sites is made thereupon.
Similarly, the method of the invention performed in real time is shown in Fig. 2c as one embodiment. Here, the user performs the colposcopy examination (201) and the system automatically detects the findings and other related properties (202). These findings and properties are provided to the system. The system uses the knowledge base to decide on the biopsy sites to be recommended (204). The system provides the recommended biopsy sites to the user (205a) and, upon receiving them, the user confirms the same during report generation or may add such sites as the case may be (205b).
The invention as described above thus provides a system and a method for labeling objects in images in an improved manner. The lesion tool and the biopsy tool as described herein may be integrated, and the recommendations of the biopsy sites are made available automatically. This improves the reliability and efficiency of the system and of the method for performing the colposcopy procedure and examination. The invention can be coextensively applied to colposcopy, sonography, endoscopy, cystoscopy, etc.
Only certain features of the invention have been specifically illustrated and described herein, and many modifications and changes will occur to those skilled in the art. The invention is not restricted to the preferred embodiment described herein. It is to be noted that the invention is explained by way of exemplary embodiment and is neither exhaustive nor limiting. Certain aspects of the invention that have not been elaborated herein are well understood by one skilled in the art. Also, terms used herein in the singular also include the plural, and vice versa, wherever applicable. Any relevant modification or variation not specifically described in the specification is to be construed as being well within the scope of the invention. The appended claims are intended to cover all such modifications and changes as fall within the spirit of the invention. Thus, it will be appreciated by those skilled in the art that the present invention can be embodied in other specific forms without departing from its spirit or essential characteristics. The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive. The scope of the invention is indicated by the appended claims rather than by the foregoing description, and all changes that come within their meaning and range of equivalence are intended to be embraced therein.

Claims

WE CLAIM:
1. A system for labeling objects in images comprising:
a lesion tool for identifying and providing lesion sites from at least one of the images; a biopsy tool for identifying the biopsy sites from the said at least one image and / or the said lesion sites; the said lesion tool and biopsy tool being integrated therein along with a knowledge base and a Graphical User Interface (GUI), to identify the lesion sites and the biopsy sites therefrom.
2. The system as claimed in claim 1, wherein the said lesion tool is provided for establishing interaction between the graphical representation of the said image and of the image data through the said GUI and the text description provided by the said knowledge base.
3. The system as claimed in claim 1, wherein said biopsy tool is provided for comparing the said lesion sites and the said image and of the image data thereof.
4. The system as claimed in claim 1, wherein the said lesion tool and / or the said biopsy tool are adapted to convert the image and the corresponding data, and the user inputs or the like, into semantic data.
5. The system as claimed in claims 1, 2 and 4, wherein the said knowledge base comprises at least one of the process data such as text description, information on the medical procedure pertaining to the labeling of the images, images and the corresponding data, semantic data, procedural data, user inputs, etc., as stored and as updated by at least one of the said lesion tool, biopsy tool or user.
6. The system as claimed in claim 1, wherein the said lesion tool and biopsy tool co-exist or are integrated therein.
7. A method for labeling objects in images, by the system as claimed in the preceding claims, wherein the said method comprising:
identifying and providing lesion sites from at least one of the images, by a lesion tool; identifying biopsy sites from the said at least one image and / or the said lesion sites, by the biopsy tool; the said lesion tool and biopsy tool being integrated therein along with a knowledge base and a Graphical User Interface (GUI), to identify the lesion sites and the biopsy sites therefrom.
8. The method as claimed in claim 7, wherein the said identifying and providing lesion sites comprises detecting lesion margins from the said image and mapping the said lesion margins on the schematic diagram of the graphical representation of the said image or the said image.
9. The method as claimed in claims 7 and 8, wherein the said identifying and providing lesion sites includes converting the quantitative measures of the said image and of the data thereof to qualitative measures.
10. The method as claimed in claim 9, wherein the said quantitative measures include lesion segmentation, size measurements, lesion mapping, etc.
11. The method as claimed in claim 8, wherein the said schematic diagram is a clock diagram or the like.
12. The method as claimed in any one of the preceding claims 7 to 11, wherein the said method is capable of being performed in real time.
13. A computer program product comprising code means for performing the method as claimed in claims 7 to 12 when executed on a computer processor.
14. A computer readable medium comprising the computer program product as claimed in claim 13.
PCT/EP2016/052061 (WO2016124539A1), priority date 2015-02-04, filed 2016-02-01 — A system and a method for labeling objects in medical images

Applications Claiming Priority (2)

Application Number — Priority Date
IN549CH2015 — 2015-02-04
IN549/CHE/2015 — 2015-02-04

Publications (1)

Publication Number — Publication Date
WO2016124539A1 (en) — 2016-08-11

Family

Family ID: 55436066

Family Applications (1)

PCT/EP2016/052061 (WO2016124539A1) — A system and a method for labeling objects in medical images — priority date 2015-02-04, filing date 2016-02-01

Country Status (1)

WO: WO2016124539A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112614572A (en) * 2020-12-28 2021-04-06 深圳开立生物医疗科技股份有限公司 Focus marking method and device, image processing equipment and medical system
US11195313B2 (en) 2016-10-14 2021-12-07 International Business Machines Corporation Cross-modality neural network transform for semi-automatic medical image annotation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001078607A1 (en) 2000-04-18 2001-10-25 Litton Systems, Inc. Enhanced visualization of in-vivo breast biopsy location for medical documentation
US20050059894A1 (en) * 2003-09-16 2005-03-17 Haishan Zeng Automated endoscopy device, diagnostic method, and uses
US20070237378A1 (en) * 2005-07-08 2007-10-11 Bruce Reiner Multi-input reporting and editing tool
US20080260218A1 (en) * 2005-04-04 2008-10-23 Yoav Smith Medical Imaging Method and System
US20090046905A1 (en) * 2005-02-03 2009-02-19 Holger Lange Uterine cervical cancer computer-aided-diagnosis (CAD)
US20100145720A1 (en) * 2008-12-05 2010-06-10 Bruce Reiner Method of extracting real-time structured data and performing data analysis and decision support in medical reporting
WO2013084123A2 (en) * 2011-12-05 2013-06-13 Koninklijke Philips Electronics N.V. Selection of images for optical examination of the cervix




Legal Events

Code — Title / Description
121 — EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 16706316; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — EP: PCT application non-entry in European phase (Ref document number: 16706316; Country of ref document: EP; Kind code of ref document: A1)