US20150154470A1 - Image matching method using feature point matching - Google Patents

Image matching method using feature point matching

Info

Publication number
US20150154470A1
US20150154470A1 (application US14/529,875)
Authority
US
United States
Prior art keywords
matching
search area
straight line
feature point
image
Prior art date
Legal status
Granted
Application number
US14/529,875
Other versions
US9824303B2
Inventor
Jaeyoon OH
Current Assignee
Hanwha Vision Co Ltd
Original Assignee
Samsung Techwin Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Techwin Co Ltd filed Critical Samsung Techwin Co Ltd
Assigned to SAMSUNG TECHWIN CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OH, JAEYOON
Publication of US20150154470A1
Assigned to HANWHA TECHWIN CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SAMSUNG TECHWIN CO., LTD.
Application granted
Publication of US9824303B2
Assigned to HANWHA AEROSPACE CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA TECHWIN CO., LTD.
Assigned to HANWHA AEROSPACE CO., LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NUMBER 10/853,669. IN ADDITION PLEASE SEE EXHIBIT A PREVIOUSLY RECORDED ON REEL 046927 FRAME 0019. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME. Assignors: HANWHA TECHWIN CO., LTD.
Assigned to HANWHA TECHWIN CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA AEROSPACE CO., LTD.
Assigned to HANWHA VISION CO., LTD. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HANWHA TECHWIN CO., LTD.
Status: Active
Adjusted expiration

Classifications

    • G06V10/40 Extraction of image or video features
    • G06K9/6201
    • G06V10/757 Matching configurations of points or features
    • G06F18/253 Fusion techniques of extracted features
    • G06T7/00 Image analysis
    • G06V10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level, of extracted features

Definitions

  • a matching system for matching a reference image and a target image by using a geometric relationship.
  • the matching system may include: a geometrical shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point in the reference search area, and generate a reference straight line connecting the predetermined feature point and the next feature point; a candidate generator configured to set a matching candidate search area corresponding to the reference search area from the target image, extract a plurality of feature points from the matching candidate search area, and generate a plurality of segments between the feature points extracted from the matching candidate search area; and a matcher configured to search the segments generated by the candidate generator for a matching straight line matching a length and an angle of the reference straight line, wherein the geometrical shape generator is configured to update the reference search area based on the next feature point.
  • a matching system for matching a reference image and a target image which may include: a geometric shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point from among the feature points and included in the reference search area, generate a reference straight line connecting the predetermined feature point and the next feature point, newly set the reference search area based on the next feature point, and generate an additional straight line by selecting a feature point closest to the next feature point; and a matcher configured to select a matching straight line corresponding to the reference straight line based on an angle and a length in the target image, and select an additional matching straight line, corresponding to the additional straight line, in the newly set reference search area.
  • FIG. 1 is a flowchart illustrating an image matching method according to an exemplary embodiment
  • FIG. 2 is a diagram for describing a process of selecting a first feature point and a reference search area from a reference image during the image matching method, according to an exemplary embodiment
  • FIG. 3 is a diagram for describing a process of selecting feature points and a first matching candidate search area from a target image during the image matching method, according to an exemplary embodiment
  • FIG. 4 is a diagram for describing a process of selecting a second feature point from the reference search area in the reference image during the image matching method, according to an exemplary embodiment
  • FIG. 5 is a diagram for describing a process of selecting a first straight line from the reference search area in the reference image, according to an exemplary embodiment
  • FIGS. 6 and 7 are diagrams for describing a process of searching the first matching candidate search area for a matching straight line corresponding to the first straight line of the reference search area, according to an exemplary embodiment
  • FIG. 8 is a diagram for describing a process of selecting a second straight line from the reference search area of the reference image, and searching the first matching candidate search area for a matching straight line corresponding to the second straight line, according to an exemplary embodiment
  • FIG. 9 is a diagram for describing a process of selecting a third straight line from the reference search area of the reference image, and searching the first matching candidate search area for a straight line corresponding to the third straight line during the image matching method, according to an exemplary embodiment.
  • FIG. 10 is a block diagram of a matching system that matches a reference image and a target image by using a geometric relationship between feature points, according to an exemplary embodiment.
  • FIG. 1 is a flowchart illustrating an image matching method according to an exemplary embodiment.
  • the image matching method uses a geometric relationship of feature points between a visible image captured by a visible light camera and a thermal image captured by a thermal imaging camera.
  • a first matching candidate search area corresponding to the first reference search area is set from a target image captured by an image capturing apparatus using a second sensor, and feature points are extracted from the first matching candidate search area in operations S 114 and S 116 , as will be described in detail later with reference to FIG. 3 .
  • a second feature point VP2 closest to the first feature point VP1 is selected from the first reference search area, and then a first straight line connecting the first and second feature points VP1 and VP2 is selected, as will be described in detail later with reference to FIG. 4 .
  • a straight line represents a shortest path between two distinct end points.
  • all possible segments generated in the first matching candidate search area are searched for a first matching straight line matching a length and an angle of the first straight line connecting the first and second feature points VP1 and VP2 in the first reference search area, in operation S 122 .
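The length-and-angle search of operation S 122 can be sketched as follows. The function names, coordinates, and tolerances are illustrative assumptions; the patent itself describes comparing the length and angle for a match rather than prescribing tolerance values.

```python
import math

def length_and_angle(seg):
    """Length of a segment and its angle relative to a horizontal line
    through its first end point (the patent's definition of a line's angle)."""
    (x1, y1), (x2, y2) = seg
    return math.hypot(x2 - x1, y2 - y1), math.atan2(y2 - y1, x2 - x1)

def find_matching_segment(ref_line, candidate_segments,
                          len_tol=1.0, ang_tol=math.radians(3)):
    """Search the candidate segments for one whose length and angle match
    the reference straight line; the tolerances stand in for the patent's
    equality test."""
    ref_len, ref_ang = length_and_angle(ref_line)
    for seg in candidate_segments:
        seg_len, seg_ang = length_and_angle(seg)
        if abs(seg_len - ref_len) <= len_tol and abs(seg_ang - ref_ang) <= ang_tol:
            return seg
    return None

# First straight line VL12 in the reference search area ...
ref = ((0, 0), (3, 4))                  # length 5, angle atan2(4, 3)
# ... and candidate segments generated in the matching candidate search area.
candidates = [((10, 10), (14, 10)),     # wrong angle
              ((10, 10), (13, 14)),     # length 5, same angle: the match
              ((10, 10), (10, 15))]     # wrong angle
assert find_matching_segment(ref, candidates) == ((10, 10), (13, 14))
```

Because only two coordinates per feature point are compared, no image patches or similarity windows are needed at this step.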
  • a second reference search area is selected based on the second feature point VP2 that is selected last from among the first and second feature points VP1 and VP2 forming the first straight line in the reference image, in operation S 130 . Then, a second matching candidate search area corresponding to the second reference search area is set from the target image captured by the image capturing apparatus using the second sensor, and feature points are extracted from the second matching candidate search area, in operation S 132 .
  • a third feature point VP3 closest to the second feature point VP2 is selected from the second reference search area, and then a second straight line connecting the second and third feature points VP2 and VP3 is selected, in operation S 134 . Then, a second matching straight line corresponding to the second straight line is searched for in the second matching candidate search area, in operation S 136 . Operations S 134 and S 136 will be described in detail later with reference to FIG. 8 .
  • operations S 130 through S 136 are repeatedly performed until a matched pair is found in the target image, in operation S 140 , to search for matching straight lines corresponding to the third through n-th straight lines, as will be described later with reference to FIG. 9 .
  • the image matching method may be performed by using a geometric relationship between feature points of images after the same target is photographed by using the same or different types of sensors.
  • FIG. 2 is a diagram for describing a process of selecting a first feature point VP1 S 211 and a reference search area SA1 S 210 from a reference image 210 , according to an exemplary embodiment.
  • the reference image 210 may be captured by using an image capturing apparatus using a first sensor and a target image 220 may be captured by using an image capturing apparatus using a second sensor.
  • the reference image 210 may be a visible image and the target image 220 may be a thermal image.
  • the reference image 210 and the target image 220 may be images obtained by photographing the same object by using the same type of sensors.
  • first through sixth feature points S 211 through S 216 are extracted from the reference image 210 . Then, an arbitrary feature point, for example, the first feature point VP1 S 211 , may be selected from the first through sixth feature points S 211 through S 216 .
  • the first reference search area SA1 S 210 including the first feature point VP1 S 211 is selected.
  • FIG. 3 is a diagram for describing a process of selecting feature points TP1 S 221 , TP2 S 222 , TP3 S 223 , and TP4 S 224 and a first matching candidate search area S 220 from the target image, according to an exemplary embodiment.
  • the first matching candidate search area S 220 corresponding to the first reference search area SA1 S 210 is set from the target image 220 .
  • the feature points TP1 S 221 through TP4 S 224 are extracted from the first matching candidate search area S 220 .
  • the second feature point VP2 S 212 closest to the first feature point VP1 S 211 is selected from the first reference search area SA1 S 210 .
  • VP2 (S 212) = MinDistance(VP1 (S 211)), i.e., the second feature point VP2 is the feature point at the minimum distance from the first feature point VP1 within the first reference search area.
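The MinDistance relation above amounts to a nearest-neighbor selection over the feature points of the reference search area. A minimal sketch, with illustrative coordinates and a hypothetical function name:

```python
import math

def nearest_feature_point(anchor, candidates):
    """Return the candidate feature point closest to `anchor`,
    i.e. the patent's MinDistance relation VP2 = MinDistance(VP1)."""
    return min(candidates, key=lambda p: math.dist(anchor, p))

# VP1 and the other feature points in the first reference search area.
vp1 = (10, 10)
others = [(40, 12), (13, 14), (30, 30)]
vp2 = nearest_feature_point(vp1, others)
assert vp2 == (13, 14)   # the closest point becomes VP2
```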
  • FIG. 5 is a diagram for describing a process of selecting a first straight line VL 12 S 510 from the first reference search area SA1 S 210 in the reference image 210 , according to an exemplary embodiment.
  • the first straight line VL 12 S 510 connecting the first feature point VP1 S 211 and the second feature point VP2 S 212 is selected.
  • when n feature points are extracted from the first matching candidate search area, the number of possible segments is n×(n−1).
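The n×(n−1) count corresponds to all ordered pairs of distinct feature points. A sketch of the segment generation, with an illustrative point set:

```python
from itertools import permutations

def all_segments(points):
    """All ordered segments between distinct feature points in the
    matching candidate search area; n points yield n*(n-1) segments."""
    return list(permutations(points, 2))

pts = [(0, 0), (1, 0), (0, 1), (1, 1)]    # n = 4 illustrative feature points
assert len(all_segments(pts)) == 4 * 3    # n*(n-1) = 12
```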
  • FIGS. 6 and 7 are diagrams for describing a process of searching the first matching candidate search area S 220 for a matching straight line corresponding to the first straight line VL 12 S 510 of the first reference search area SA1 S 210 , according to an exemplary embodiment.
  • an angle and a length of the first straight line VL 12 S 510 are used.
  • the angle of the first straight line VL 12 S 510 is an angle formed by the first straight line VL 12 S 510 and a horizontal line passing the first feature point VP1 S 211 .
  • a first matching straight line S 520 matching the first straight line VL 12 S 510 is searched for.
  • the second reference search area S 810 is newly set based on the second feature point VP2 S 212 that is selected last from among the first and second feature points VP1 S 211 and VP2 S 212 forming the first straight line VL 12 S 510 .
  • the second reference search area S 810 may be set to be the same as or different from the first reference search area SA1 S 210 .
  • a third feature point VP3 S 213 closest to the second feature point VP2 S 212 is selected from the second reference search area S 810 of the reference image 210 , and the second straight line VL 23 S 511 connecting the second feature point VP2 S 212 and the third feature point VP3 S 213 is generated.
  • a second matching candidate search area S 820 corresponding to the second reference search area S 810 is selected from the target image 220 , and feature points are extracted from the second matching candidate search area S 820 .
  • the second matching candidate search area S 820 may be set to be the same as or different from the first matching candidate search area S 220 .
  • a process of extracting the feature points from the second matching candidate search area S 820 may not be performed.
  • the second matching candidate search area S 820 is searched for a second matching straight line S 521 corresponding to the second straight line VL 23 of the second reference search area S 810 .
  • candidates for the second matching straight line S 521 corresponding to the second straight line VL 23 S 511 are a segment connecting the feature points TP2 S 222 and TP3 S 223 , a segment connecting the feature points TP2 S 222 and TP4 S 224 , and a segment connecting the feature points TP2 S 222 and TP5 S 225 .
  • the candidates for the second matching straight line S 521 are the three segments.
  • the second matching straight line S 521 matching the second straight line VL 23 is searched for based on an angle S 812 formed by the first straight line VL 12 S 510 and the second straight line VL 23 S 511 , and a length of the second straight line VL 23 S 511 .
  • angles between the first matching straight line S 520 , and the segment connecting the feature points TP2 S 222 and TP3 S 223 , the segment connecting the feature points TP2 S 222 and TP4 S 224 , and the segment connecting the feature points TP2 S 222 and TP5 S 225 , which are the candidates for the second matching straight line S 521 , are compared to the angle S 812 .
  • lengths of the segment connecting the feature points TP2 S 222 and TP3 S 223 , the segment connecting the feature points TP2 S 222 and TP4 S 224 , and the segment connecting the feature points TP2 S 222 and TP5 S 225 , which are the candidates for the second matching straight line S 521 , are compared to the length of the second straight line VL 23 S 511 .
  • the second matching straight line S 521 matching the angle S 812 and the length of the second straight line S 511 is selected.
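The chained search of FIG. 8, which uses the angle S 812 between the first and second straight lines together with the length of the second straight line, can be sketched as follows. The coordinates and tolerances are illustrative assumptions:

```python
import math

def direction(seg):
    """Direction of a directed segment relative to the horizontal."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def angle_between(seg1, seg2):
    """Unsigned angle between two segments, taken as directed."""
    d = abs(direction(seg1) - direction(seg2))
    return min(d, 2 * math.pi - d)

def match_next_line(ref_prev, ref_next, matched_prev, candidates,
                    len_tol=0.5, ang_tol=math.radians(3)):
    """Pick the candidate whose length matches the second straight line
    and whose angle to the already-matched line equals the angle between
    the first and second straight lines (tolerances are illustrative)."""
    target_ang = angle_between(ref_prev, ref_next)
    target_len = math.dist(*ref_next)
    for seg in candidates:
        if (abs(math.dist(*seg) - target_len) <= len_tol
                and abs(angle_between(matched_prev, seg) - target_ang) <= ang_tol):
            return seg
    return None

# VL12 and VL23 in the reference image; the matched VL12 counterpart in the target.
vl12, vl23 = ((0, 0), (3, 4)), ((3, 4), (7, 4))
matched_12 = ((10, 10), (13, 14))
candidates = [((13, 14), (13, 18)),   # right length, wrong relative angle
              ((13, 14), (16, 14)),   # right angle, wrong length
              ((13, 14), (17, 14))]   # matches both: the second matching line
assert match_next_line(vl12, vl23, matched_12, candidates) == ((13, 14), (17, 14))
```

Using the relative angle rather than an absolute one keeps the chained match consistent even if the two images are rotated with respect to each other.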
  • the image matching method is continuously performed as shown in FIG. 9 in a manner similar to that described above with reference to FIG. 8 .
  • FIG. 10 is a block diagram of a matching system 1000 that matches a reference image and a target image by using a geometric relationship between feature points, according to an embodiment.
  • the matching system 1000 includes a geometrical shape generator 1010 , a candidate generator 1020 , and a matcher 1030 .
  • the geometrical shape generator 1010 extracts feature points from the reference image, and selects a reference search area including a predetermined feature point from among the extracted feature points. Then, the geometrical shape generator 1010 selects a next feature point closest to the predetermined feature point from the reference search area, and generates a reference straight line connecting the predetermined feature point and the next feature point.
  • the geometrical shape generator 1010 selects the first reference search area SA1 S 210 including the first feature point VP1 S 211 . Then, the second feature point VP2 S 212 closest to the first feature point VP1 S 211 is selected from the first reference search area SA1 S 210 .
  • the predetermined feature point is referred to as the first feature point VP1 S 211
  • the next feature point closest to the predetermined feature point is referred to as the second feature point VP2 S 212
  • the reference search area including the predetermined feature point is referred to as the first reference search area SA1 S 210 .
  • a reference straight line i.e., the first straight line VL 12 S 510 , is selected by connecting the first feature point VP1 S 211 and the second feature point VP2 S 212 .
  • the matching system 1000 is illustrated to include the three different structures of the geometrical shape generator 1010 , the candidate generator 1020 , and the matcher 1030 which perform the above-described functions, respectively. According to another exemplary embodiment, however, any two or all of the three structures may be combined to constitute only two structures or one single structure which performs all of the functions described above.
  • the candidate generator 1020 sets the first matching candidate search area S 220 corresponding to the first reference search area S 210 from the target image 220 , extracts the feature points TP1 S 221 through TP4 S 224 from the first matching candidate search area S 220 , and generates all possible segments between the extracted feature points TP1 S 221 through TP4 S 224 .
  • the matcher 1030 searches for the first matching straight line S 520 matching the length and the angle of the first straight line VL 12 S 510 from among all the possible segments generated by the candidate generator 1020 .
  • the geometrical shape generator 1010 then updates the reference search area to obtain the second reference search area S 810 , based on the second feature point VP2 S 212 that is selected last from among the first and second feature points VP1 S 211 and VP2 S 212 forming the first straight line VL 12 .
  • an additional straight line i.e., the second straight line VL 23 S 511 , is selected by selecting the third feature point VP3 S 213 closest to the second feature point VP2 S 212 from the second reference search area S 810 .
  • the candidate generator 1020 newly sets a matching candidate search area, i.e., the second matching candidate search area S 820 , corresponding to the second reference search area S 810 , and selects an additional matching straight line, i.e., the second matching straight line S 521 corresponding to the second straight line VL 23 S 511 from the second matching candidate search area S 820 .
  • the candidate generator 1020 may generate possible segments, wherein one end point of each of the possible segments is a last selected feature point of the matching straight line and the other end point of each of the possible segments is one of feature points that lie in the newly set matching candidate search area.
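The candidate generation just described, with one end point pinned to the last selected feature point of the matching straight line, can be sketched with a hypothetical helper (names and coordinates are illustrative):

```python
def anchored_segments(last_matched, area_points, used=()):
    """Candidate segments whose one end point is the last matched feature
    point and whose other end lies in the newly set matching candidate
    search area, skipping segments already used for earlier matches."""
    return [(last_matched, p) for p in area_points
            if p != last_matched and (last_matched, p) not in used]

# TP2 is the last matched end point; the other points lie in the new area.
tp2 = (13, 14)
area = [(17, 14), (13, 18), (20, 20), tp2]
segs = anchored_segments(tp2, area)
assert len(segs) == 3 and all(s[0] == tp2 for s in segs)
```

Anchoring one end point this way shrinks the search from n×(n−1) segments to at most n−1 per step.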
  • the inventive concept can also be embodied as computer readable codes on a computer readable recording medium.
  • the computer readable recording medium is any data storage device that can store data which can be thereafter read by a computer system.
  • Examples of the computer readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, etc.
  • the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • At least one of the components, elements or units represented by a block as illustrated in FIG. 10 may be embodied as various numbers of hardware, software and/or firmware structures that execute respective functions described above, according to an exemplary embodiment.
  • at least one of these components, elements or units may use a direct circuit structure, such as a memory, a processor, logic, a look-up table, etc. that may execute the respective functions through controls of one or more microprocessors or other control apparatuses.
  • at least one of these components, elements or units may be specifically embodied by a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions.
  • At least one of these components, elements or units may further include a processor such as a CPU that performs the respective functions, a microprocessor, or the like.
  • Although a bus is not illustrated in FIG. 10 , communication between the components, elements or units may be performed through a bus.

Abstract

An image matching method includes: extracting a plurality of feature points from a reference image; selecting a first feature point from the feature points, and selecting a first reference search area comprising the first feature point; setting a first matching candidate search area corresponding to the first reference search area from a target image, and extracting a plurality of feature points from the first matching candidate search area; selecting a second feature point closest to the first feature point in the first reference search area, and selecting a first straight line connecting the first and second feature points; generating a plurality of segments from the feature points extracted from the first matching candidate search area; and determining a first matching straight line matching a length and an angle of the first straight line, from the segments generated from the feature points extracted from the first matching candidate search area.

Description

    CROSS-REFERENCE TO THE RELATED APPLICATION
  • This application claims priority from Korean Patent Application No. 10-2013-0147994, filed on Nov. 29, 2013, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
  • BACKGROUND
  • 1. Field
  • Methods and apparatuses consistent with exemplary embodiments relate to image matching.
  • 2. Description of the Related Art
  • In medical fields, an image matching method using a multi-sensor has been used to match complementary image information obtained using computerized tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET), etc. In contrast, in surveillance and security fields, an object is recognized and tracked by matching images obtained from a visible light sensor, an infrared sensor, etc. which use different wavelength bands.
  • According to a related-art image matching method using a multi-sensor, various candidate feature points are selected to search for matching feature points between feature points of a visible image and feature points of a thermal image.
  • For this image matching method, image patches having a predetermined size for measuring similarity between candidate feature points of the visible image and candidate feature points of the thermal image are required, and many complex calculations are required to calculate similarity between the image patches. Also, memory capacity may increase and an input/output bandwidth with an external memory may increase in order to process the image patches.
  • SUMMARY
  • One or more exemplary embodiments of the inventive concept provide an image matching method, wherein images are matched by using coordinate information about a plurality of feature points, without using a patch image.
  • Various aspects of the exemplary embodiment will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.
  • According to an aspect of an exemplary embodiment, there is provided an image matching method which may include: capturing, by an image capturing apparatus using a first sensor, a reference image; capturing, by the image capturing apparatus or another image capturing apparatus using a second sensor, a target image; selecting a first feature point from the reference image and selecting a first reference search area including the first feature point from the reference image; setting a first matching candidate search area corresponding to the first reference search area from the target image, and extracting a plurality of feature points from the first matching candidate search area; selecting a second feature point closest to the first feature point from the first reference search area, and selecting a first straight line connecting the first and second feature points; generating a plurality of segments between the feature points extracted from the first matching candidate search area; and determining a first matching straight line matching a length and an angle of the first straight line, from among the segments.
  • The image matching method may further include: selecting a second reference search area based on the second feature point forming the first straight line; setting a second matching candidate search area corresponding to the second reference search area from the target image, and extracting a plurality of feature points from the second matching candidate search area; selecting a third feature point closest to the second feature point from the second reference search area, and selecting a second straight line connecting the second and third feature points; and searching the second matching candidate search area for a second matching straight line corresponding to the second straight line.
  • The searching for the second matching straight line may include searching for the second matching straight line which satisfies a condition that an angle formed by the first and second straight lines and a length of the second straight line may be respectively the same as an angle formed by the first and second matching straight lines and a length of the second matching straight line.
  • The image matching method may further include, in order to search for the second matching straight line, generating a plurality of segments connecting a feature point, corresponding to the second feature point selected from the first reference search area, among feature points forming the first matching straight line and at least one another feature point among the plurality of feature points extracted from the second matching candidate search area, except a segment constituting the first matching straight line.
  • The second feature point may be a feature point closest to the first feature point in the first reference search area.
  • The angle of the first straight line may be an angle formed by the first straight line and a horizontal or vertical line passing the first feature point.
  • According to an aspect of another exemplary embodiment, there is provided an image matching method which may include: extracting a plurality of feature points from a reference image; selecting a first feature point from the feature points extracted from the reference image, and selecting a first reference search area comprising the first feature point; setting a first matching candidate search area corresponding to the first reference search area from a target image, and extracting a plurality of feature points from the first matching candidate search area; selecting a second feature point closest to the first feature point in the first reference search area, and selecting a first straight line connecting the first and second feature points; generating a plurality of segments from the feature points extracted from the first matching candidate search area; and determining a first matching straight line matching a length and an angle of the first straight line, from the segments generated from the feature points extracted from the first matching candidate search area.
  • According to an aspect of still another exemplary embodiment, there is provided a matching system for matching a reference image and a target image by using a geometric relationship. The matching system may include: a geometrical shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point, from among the feature points extracted from the reference image, in the reference search area, and generate a reference straight line connecting the predetermined feature point and the next feature point; a candidate generator configured to set a matching candidate search area corresponding to the reference search area from the target image, extract a plurality of feature points from the matching candidate search area, and generate a plurality of segments between the feature points extracted from the matching candidate search area; and a matcher configured to search the segments generated by the candidate generator for a matching straight line matching a length and an angle of the reference straight line, wherein the geometrical shape generator is configured to update the reference search area based on the next feature point and select an additional straight line by selecting another feature point closest to the next feature point from the updated reference search area, and then the candidate generator is configured to newly set a matching candidate search area corresponding to the updated reference search area and select an additional matching straight line corresponding to the additional straight line from the newly set matching candidate search area.
  • According to an aspect of still another exemplary embodiment, there is provided a matching system for matching a reference image and a target image which may include: a geometric shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point from among the feature points and included in the reference search area, generate a reference straight line connecting the predetermined feature point and the next feature point, newly set the reference search area based on the next feature point, and generate an additional straight line by selecting a feature point closest to the next feature point; and a matcher configured to select a matching straight line corresponding to the reference straight line based on an angle and a length in the target image, and select an additional matching straight line, corresponding to the additional straight line, in the newly set reference search area.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a flowchart illustrating an image matching method according to an exemplary embodiment;
  • FIG. 2 is a diagram for describing a process of selecting a first feature point and a reference search area from a reference image during the image matching method, according to an exemplary embodiment;
  • FIG. 3 is a diagram for describing a process of selecting feature points and a first matching candidate search area from a target image during the image matching method, according to an exemplary embodiment;
  • FIG. 4 is a diagram for describing a process of selecting a second feature point from the reference search area in the reference image during the image matching method, according to an exemplary embodiment;
  • FIG. 5 is a diagram for describing a process of selecting a first straight line from the reference search area in the reference image, according to an exemplary embodiment;
  • FIGS. 6 and 7 are diagrams for describing a process of searching the first matching candidate search area for a matching straight line corresponding to the first straight line of the reference search area, according to an exemplary embodiment;
  • FIG. 8 is a diagram for describing a process of selecting a second straight line from the reference search area of the reference image, and searching the first matching candidate search area for a matching straight line corresponding to the second straight line, according to an exemplary embodiment;
  • FIG. 9 is a diagram for describing a process of selecting a third straight line from the reference search area of the reference image, and searching the first matching candidate search area for a straight line corresponding to the third straight line during the image matching method, according to an exemplary embodiment; and
  • FIG. 10 is a block diagram of a matching system that matches a reference image and a target image by using a geometric relationship between feature points, according to an exemplary embodiment.
  • DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
  • Reference will now be made in detail to exemplary embodiments which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the exemplary embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the exemplary embodiments are merely described below, by referring to the figures, to explain various aspects of the inventive concept.
  • FIG. 1 is a flowchart illustrating an image matching method according to an exemplary embodiment.
  • According to an exemplary embodiment, when a visible light camera and a thermal imaging camera, preferably but not necessarily adjacent to the visible light camera, photograph the same target, the image matching method uses a geometric relationship of feature points between a visible image captured by the visible light camera and a thermal image captured by the thermal imaging camera.
  • According to an exemplary embodiment, in order to perform image matching, feature points are extracted from a reference image captured by an image capturing apparatus using a first sensor, in operation S110. A first feature point VP1 is arbitrarily selected from the reference image and a first reference search area including the first feature point VP1 is selected, in operation S112, as will be described in detail later with reference to FIG. 2.
  • Then, a first matching candidate search area corresponding to the first reference search area is set from a target image captured by an image capturing apparatus using a second sensor, and feature points are extracted from the first matching candidate search area in operations S114 and S116, as will be described in detail later with reference to FIG. 3.
  • In operation S118, a second feature point VP2 closest to the first feature point VP1 is selected from the first reference search area, and then a first straight line connecting the first and second feature points VP1 and VP2 is selected, as will be described in detail later with reference to FIG. 4. Here, a straight line represents a shortest path between two distinct end points.
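The nearest-point selection in operation S118 can be sketched in a few lines of Python. This is a minimal illustration, not the embodiment's implementation; the helper name and the coordinates are hypothetical, and Euclidean distance is assumed since the patent does not mandate a particular metric:

```python
import math

def nearest_feature_point(anchor, candidates):
    """Return the candidate feature point closest to the anchor point."""
    return min(candidates, key=lambda p: math.dist(anchor, p))

# Hypothetical feature-point coordinates inside a reference search area.
vp1 = (10.0, 10.0)
others = [(40.0, 5.0), (12.0, 14.0), (30.0, 30.0)]
vp2 = nearest_feature_point(vp1, others)  # (12.0, 14.0), the closest point
```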
  • Then, all possible segments between the feature points extracted from the first matching candidate search area are generated in operation S120. For example, if the number of feature points extracted from the first matching candidate search area is N, the number of possible segments is N×(N−1). Operation S120 will be described in detail later with reference to FIG. 7.
  • Then, all the possible segments generated in the first matching candidate search area are searched for a first matching straight line matching a length and an angle of the first straight line connecting the first and second feature points VP1 and VP2 in the first reference search area, in operation S122.
  • If the first matching straight line is found, a second reference search area is selected based on the second feature point VP2 that is selected last from among the first and second feature points VP1 and VP2 forming the first straight line in the reference image, in operation S130. Then, a second matching candidate search area corresponding to the second reference search area is set from the target image captured by the image capturing apparatus using the second sensor, and feature points are extracted from the second matching candidate search area, in operation S132.
  • A third feature point VP3 closest to the second feature point VP2 is selected from the second reference search area, and then a second straight line connecting the second and third feature points VP2 and VP3 is selected, in operation S134. Then, a second matching straight line corresponding to the second straight line is searched for in the second matching candidate search area, in operation S136. Operations S134 and S136 will be described in detail later with reference to FIG. 8.
  • Then, operations S130 through S136 are repeatedly performed until a matched pair is found in the target image, in operation S140, to search for matching straight lines corresponding to third through nth straight lines, as will be described later with reference to FIG. 9.
  • According to another exemplary embodiment, the image matching method may be performed by using a geometric relationship between feature points of images after the same target is photographed by using the same or different types of sensors.
  • The image matching method according to an exemplary embodiment will now be described in detail with reference to FIGS. 2 through 9.
  • FIG. 2 is a diagram for describing a process of selecting a first feature point VP1 S211 and a reference search area SA1 S210 from a reference image 210, according to an exemplary embodiment.
  • According to an exemplary embodiment, the reference image 210 may be captured by using an image capturing apparatus using a first sensor and a target image 220 may be captured by using an image capturing apparatus using a second sensor. According to an exemplary embodiment, the reference image 210 may be a visible image and the target image 220 may be a thermal image.
  • According to another exemplary embodiment, the reference image 210 and the target image 220 may be images obtained by photographing the same object by using the same type of sensors.
  • According to an exemplary embodiment, feature points S211 through S216 are extracted from the reference image 210. Then, an arbitrary feature point, for example, the first feature point VP1 S211, may be selected from the first through sixth feature points S211 through S216.
  • If the first feature point VP1 S211 is selected, the first reference search area SA1 S210 including the first feature point VP1 S211 is selected.
  • FIG. 3 is a diagram for describing a process of selecting feature points TP1 S221, TP2 S222, TP3 S223, and TP4 S224 and a first matching candidate search area S220 from the target image, according to an exemplary embodiment.
  • If the first feature point VP1 S211 and the first reference search area SA1 S210 are set from the reference image 210, the first matching candidate search area S220 corresponding to the first reference search area SA1 S210 is set from the target image 220.
  • If the first matching candidate search area S220 is set, the feature points TP1 S221 through TP4 S224 are extracted from the first matching candidate search area S220.
  • FIG. 4 is a diagram for describing a process of selecting a second feature point VP2 S212 from the first reference search area SA1 S210 in the reference image 210, according to an exemplary embodiment.
  • If the first feature point VP1 S211 and the first reference search area SA1 S210 are set from the reference image 210, and the feature points TP1 S221 through TP4 S224 are extracted from the first matching candidate search area S220 of the target image 220, the second feature point VP2 S212 closest to the first feature point VP1 S211 is selected from the first reference search area SA1 S210.

  • VP2(S212) = MinDistance(VP1(S211))
  • FIG. 5 is a diagram for describing a process of selecting a first straight line VL12 S510 from the first reference search area SA1 S210 in the reference image 210, according to an exemplary embodiment.
  • If the first feature point VP1 S211 and the second feature point VP2 S212 are selected from the first reference search area SA1 S210 of the reference image 210, the first straight line VL12 S510 connecting the first feature point VP1 S211 and the second feature point VP2 S212 is selected.
  • Then, all possible segments are generated based on the feature points TP1 S221 through TP4 S224 extracted from the first matching candidate search area S220 of the target image 220 (refer to target images 220 through 223 of FIG. 7).
  • For example, if the number of feature points extracted from the first matching candidate search area S220 is N, the number of possible segments is N×(N−1). In FIG. 5, the number of all possible segments generated based on the feature points TP1 S221 through TP4 S224 extracted from the first matching candidate search area S220 is 4×3=12.
  • FIGS. 6 and 7 are diagrams for describing a process of searching the first matching candidate search area S220 for a matching straight line corresponding to the first straight line VL12 S510 of the first reference search area SA1 S210, according to an exemplary embodiment.
  • Referring to FIG. 6, it is determined whether a segment corresponding to the first straight line VL12 S510 in the first reference search area SA1 S210 of the reference image 210 exists in the 12 possible segments between the feature points TP1 S221 through TP4 S224 extracted from the first matching candidate search area S220.
  • In order to search the target image 220 for the straight line matching the first straight line VL12 S510 in the reference image 210, an angle and a length of the first straight line VL12 S510 are used. The angle of the first straight line VL12 S510 is an angle formed by the first straight line VL12 S510 and a horizontal line passing the first feature point VP1 S211.
  • As shown in FIG. 7, a first matching straight line S520 matching the first feature point VP1 S211 is searched for.
  • FIG. 8 is a diagram for describing a process of selecting a second straight line VL23 S511 from a second reference search area S810 of the reference image 210, and searching the first matching candidate search area S220 for a matching straight line corresponding to the second straight line VL23 S511, according to an exemplary embodiment.
  • After the first matching straight line S520 matching the first straight line VL12 S510 of the reference image 210 is found in the target image 220 through the processes described above with reference to FIGS. 2 through 7, the second reference search area S810 is newly set based on the second feature point VP2 S212 that is selected last from among the first and second feature points VP1 S211 and VP2 S212 forming the first straight line VL12 S510. The second reference search area S810 may be set to be the same as or different from the first reference search area SA1 S210.
  • Then, a third feature point VP3 S213 closest to the second feature point VP2 S212 is selected from the second reference search area S810 of the reference image 210, and the second straight line VL23 S511 connecting the second feature point VP2 S212 and the third feature point VP3 S213 is generated.
  • A second matching candidate search area S820 corresponding to the second reference search area S810 is selected from the target image 220, and feature points are extracted from the second matching candidate search area S820. The second matching candidate search area S820 may be set to be the same as or different from the first matching candidate search area S220. When the second matching candidate search area S820 is set to be the same as the first matching candidate search area S220, a process of extracting the feature points from the second matching candidate search area S820 may not be performed.
  • Then, the second matching candidate search area S820 is searched for a second matching straight line S521 corresponding to the second straight line VL23 of the second reference search area S810.
  • In this case, unlike when the first matching straight line S520 is searched for, only feature points forming a straight line with the feature point TP2 S222 that is selected last from among the feature points TP1 S221 and TP2 S222 forming the first matching straight line S520 are compared.
  • Referring to FIG. 8, candidates for the second matching straight line S521 corresponding to the second straight line VL23 S511 are a segment connecting the feature points TP2 S222 and TP3 S223, a segment connecting the feature points TP2 S222 and TP4 S224, and a segment connecting the feature points TP2 S222 and TP5 S225.
  • In other words, the candidates for the second matching straight line S521 are the three segments.
  • The second matching straight line S521 matching the second straight line VL23 is searched for based on an angle S812 formed by the first straight line VL12 S510 and the second straight line VL23 S511, and a length of the second straight line VL23 S511.
  • For example, angles between the first matching straight line S520, and the segment connecting the feature points TP2 S222 and TP3 S223, the segment connecting the feature points TP2 S222 and TP4 S224, and the segment connecting the feature points TP2 S222 and TP5 S225, which are the candidates for the second matching straight line S521, are compared to the angle S812.
  • Then, lengths of the segment connecting the feature points TP2 S222 and TP3 S223, the segment connecting the feature points TP2 S222 and TP4 S224, and the segment connecting the feature points TP2 S222 and TP5 S225, which are the candidates for the second matching straight line S521, are compared to the length of the second straight line VL23 S511.
  • Then, the second matching straight line S521 matching the angle S812 and the length of the second straight line S511 is selected.
  • Then, the image matching method is continuously performed as shown in FIG. 9 in a manner similar to that described above with reference to FIG. 8.
  • FIG. 10 is a block diagram of a matching system 1000 that matches a reference image and a target image by using a geometric relationship between feature points, according to an exemplary embodiment.
  • The matching system 1000 includes a geometrical shape generator 1010, a candidate generator 1020, and a matcher 1030.
  • The geometrical shape generator 1010 extracts feature points from the reference image, and selects a reference search area including a predetermined feature point from among the extracted feature points. Then, the geometrical shape generator 1010 selects a next feature point closest to the predetermined feature point from the reference search area, and generates a reference straight line connecting the predetermined feature point and the next feature point.
  • Referring to FIGS. 2 and 5, the geometrical shape generator 1010 selects the first reference search area SA1 S210 including the first feature point VP1 S211. Then, the second feature point VP2 S212 closest to the first feature point VP1 S211 is selected from the first reference search area SA1 S210. For convenience of description, the predetermined feature point is referred to as the first feature point VP1 S211, the next feature point closest to the predetermined feature point is referred to as the second feature point VP2 S212, and the reference search area including the predetermined feature point is referred to as the first reference search area SA1 S210.
  • Then, a reference straight line, i.e., the first straight line VL12 S510, is selected by connecting the first feature point VP1 S211 and the second feature point VP2 S212.
  • In FIG. 10, the matching system 1000 is illustrated to include the three different structures of the geometrical shape generator 1010, the candidate generator 1020, and the matcher 1030 which perform the above-described functions, respectively. According to another exemplary embodiment, however, any two or all of the three structures may be combined to constitute only two structures or one single structure which performs all of the functions described above.
  • Referring to FIG. 7, the candidate generator 1020 sets the first matching candidate search area S220 corresponding to the first reference search area S210 from the target image 220, extracts the feature points TP1 S221 through TP4 S224 from the first matching candidate search area S220, and generates all possible segments between the extracted feature points TP1 S221 through TP4 S224.
  • The matcher 1030 searches for the first matching straight line S520 matching the length and the angle of the first straight line VL12 S510 from among the all possible segments generated by the candidate generator 1020.
  • Referring to FIG. 8, the geometrical shape generator 1010 then updates the reference search area to obtain the second reference search area S810, based on the second feature point VP2 S212 that is selected last from among the first and second feature points VP1 S211 and VP2 S212 forming the first straight line VL12.
  • Next, an additional straight line, i.e., the second straight line VL23 S511, is selected by selecting the third feature point VP3 S213 closest to the second feature point VP2 S212 from the second reference search area S810.
  • In this case, the candidate generator 1020 newly sets a matching candidate search area, i.e., the second matching candidate search area S820, corresponding to the second reference search area S810, and selects an additional matching straight line, i.e., the second matching straight line S521 corresponding to the second straight line VL23 S511 from the second matching candidate search area S820. In order to search for the additional matching straight line, the candidate generator 1020 may generate possible segments, wherein one end point of each of the possible segments is a last selected feature point of the matching straight line and the other end point of each of the possible segments is one of feature points that lie in the newly set matching candidate search area.
  • As described above, according to the above exemplary embodiments, since a patch image is not used during an image matching method, the number of complex operations may be reduced.
  • Also, according to the above exemplary embodiments, since images are matched based on coordinate information of feature points, the number of unnecessary operations may be reduced and performance may be increased.
  • The inventive concept can also be embodied as computer readable codes on a computer readable recording medium. The computer readable recording medium is any data storage device that can store data which can be thereafter read by a computer system.
  • Examples of the computer readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, etc. The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • At least one of the components, elements or units represented by a block as illustrated in FIG. 10 may be embodied as various numbers of hardware, software and/or firmware structures that execute respective functions described above, according to an exemplary embodiment. For example, at least one of these components, elements or units may use a direct circuit structure, such as a memory, processing, logic, a look-up table, etc. that may execute the respective functions through controls of one or more microprocessors or other control apparatuses. Also, at least one of these components, elements or units may be specifically embodied by a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions. Also, at least one of these components, elements or units may further include a processor such as a CPU that performs the respective functions, a microprocessor, or the like. Further, although a bus is not illustrated in FIG. 10, communication between the components, elements or units may be performed through the bus.
  • It should be understood that the exemplary embodiments described above should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.
  • While exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the inventive concept as defined by the following claims.

Claims (20)

What is claimed is:
1. An image matching method using feature point matching, the image matching method comprising:
capturing, by an image capturing apparatus using a first sensor, a reference image;
capturing, by the image capturing apparatus or another image capturing apparatus using a second sensor, a target image;
selecting a first feature point from the reference image and selecting a first reference search area including the first feature point from the reference image;
setting a first matching candidate search area corresponding to the first reference search area from the target image, and extracting a plurality of feature points from the first matching candidate search area;
selecting a second feature point closest to the first feature point from the first reference search area, and selecting a first straight line connecting the first and second feature points;
generating a plurality of segments between the feature points extracted from the first matching candidate search area; and
determining a first matching straight line matching a length and an angle of the first straight line, from among the segments.
2. The image matching method of claim 1, further comprising:
selecting a second reference search area based on the second feature point forming the first straight line;
setting a second matching candidate search area corresponding to the second reference search area from the target image, and extracting a plurality of feature points from the second matching candidate search area;
selecting a third feature point closest to the second feature point from the second reference search area, and selecting a second straight line connecting the second and third feature points; and
searching the second matching candidate search area for a second matching straight line corresponding to the second straight line.
3. The image matching method of claim 2, wherein the second reference search area is the same as or different from the first reference search area, and the second matching candidate search area is the same as or different from the first matching candidate search area, and
wherein if the second reference search area is the same as the first reference search area, and the second matching candidate search area is the same as the first matching candidate search area, the plurality of feature points extracted from the second matching candidate search area are the same as the plurality of feature points extracted from the first matching candidate search area.
4. The image matching method of claim 2, wherein the searching for the second matching straight line comprises searching for the second matching straight line which satisfies a condition that an angle formed by the first and second straight lines and a length of the second straight line are respectively the same as an angle formed by the first and second matching straight lines and a length of the second matching straight line.
5. The image matching method of claim 2, further comprising, in order to search for the second matching straight line, generating a plurality of segments connecting a feature point, corresponding to the second feature point selected from the first reference search area, among feature points forming the first matching straight line and at least one another feature point among the plurality of feature points extracted from the second matching candidate search area, except a segment constituting the first matching straight line.
6. The image matching method of claim 1, wherein if a number of the feature points extracted from the first matching candidate search area is N, a number of the segments generated between the feature points extracted from the first matching candidate search area is N×(N−1).
7. An image matching method for matching at least two images, the image matching method comprising:
extracting a plurality of feature points from a reference image;
selecting a first feature point from the feature points extracted from the reference image, and selecting a first reference search area comprising the first feature point;
setting a first matching candidate search area corresponding to the first reference search area from a target image, and extracting a plurality of feature points from the first matching candidate search area;
selecting a second feature point closest to the first feature point in the first reference search area, and selecting a first straight line connecting the first and second feature points;
generating a plurality of segments from the feature points extracted from the first matching candidate search area; and
determining a first matching straight line matching a length and an angle of the first straight line, from the segments generated from the feature points extracted from the first matching candidate search area.
8. The image matching method of claim 7, further comprising:
selecting a second reference search area based on the second feature point forming the first straight line;
setting a second matching candidate search area corresponding to the second reference search area from the target image, and extracting a plurality of feature points from the second matching candidate search area;
selecting a third feature point closest to the second feature point in the second reference search area, and selecting a second straight line connecting the second and third feature points; and
searching the second matching candidate search area for a second matching straight line corresponding to the second straight line.
9. The image matching method of claim 8, wherein the second reference search area is the same as or different from the first reference search area, and the second matching candidate search area is the same as or different from the first matching candidate search area, and
wherein if the second reference search area is the same as the first reference search area, and the second matching candidate search area is the same as the first matching candidate search area, the plurality of feature points extracted from the second matching candidate search area are the same as the plurality of feature points extracted from the first matching candidate search area.
10. The image matching method of claim 8, wherein the searching for the second matching straight line comprises searching for the second matching straight line which satisfies a condition that an angle formed by the first and second straight lines and a length of the second straight line are respectively the same as an angle formed by the first and second matching straight lines and a length of the second matching straight line.
11. The image matching method of claim 8, further comprising, in order to search for the second matching straight line, generating a plurality of segments connecting a feature point, corresponding to the second feature point selected from the first reference search area, among feature points forming the first matching straight line and at least one other feature point among the plurality of feature points extracted from the second matching candidate search area, except a segment constituting the first matching straight line.
12. The image matching method of claim 7, wherein if a number of the feature points extracted from the first matching candidate search area is N, a number of the segments generated from the feature points extracted from the first matching candidate search area is N×(N−1).
13. A matching system for matching a reference image and a target image by using a geometric relationship, the matching system comprising:
a geometrical shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point, from among the feature points extracted from the reference image, in the reference search area, and generate a reference straight line connecting the predetermined feature point and the next feature point;
a candidate generator configured to set a matching candidate search area corresponding to the reference search area from the target image, extract a plurality of feature points from the matching candidate search area, and generate a plurality of segments between the feature points extracted from the matching candidate search area; and
a matcher configured to search the segments generated by the candidate generator for a matching straight line matching a length and an angle of the reference straight line,
wherein the geometrical shape generator is configured to update the reference search area based on the next feature point and select an additional straight line by selecting another feature point closest to the next feature point from the updated reference search area, and then the candidate generator is configured to newly set a matching candidate search area corresponding to the updated reference search area and select an additional matching straight line corresponding to the additional straight line from the newly set matching candidate search area.
14. The matching system of claim 13, wherein the geometrical shape generator is configured to update the reference search area again based on the other feature point and add a straight line by selecting still another feature point closest to the other feature point, and then the candidate generator is configured to newly set another matching candidate search area corresponding to the reference search area updated again and select a matching straight line corresponding to the added straight line from the newly set other matching candidate search area.
15. The matching system of claim 13, wherein the updated reference search area is the same as or different from the reference search area, and the newly set matching candidate search area is the same as or different from the matching candidate search area.
16. The matching system of claim 13, wherein an angle formed by the reference straight line and the additional straight line and a length of the additional straight line are respectively the same as an angle formed by the matching straight line and the additional matching straight line and a length of the additional matching straight line.
17. The matching system of claim 13, wherein the reference image is captured by an image capturing apparatus using a first sensor, and the target image is captured by the image capturing apparatus or another image capturing apparatus using a second sensor.
18. The matching system of claim 13, wherein the reference image is a visible image and the target image is a thermal image.
19. The matching system of claim 13, wherein, in order to search for the additional matching straight line, the candidate generator is configured to generate a plurality of segments between a plurality of feature points extracted from the newly set matching candidate search area, and
wherein one end and the other end of each of the segments generated between the feature points extracted from the newly set matching candidate search area are the next feature point of the matching straight line and one of the feature points extracted from the newly set matching candidate search area, respectively.
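Claims 5, 11, and 19 describe generating the next round of candidate segments from a shared anchor: every new segment starts at the already-matched "next" feature point and ends at one of the other candidate points, excluding the segment that already forms the first matching straight line. A minimal sketch of that step, with hypothetical names and coordinates:

```python
def segments_from_anchor(anchor, candidate_points, exclude_endpoint):
    """Generate segments from the shared endpoint (the matched 'next'
    feature point) to every other candidate point, skipping the segment
    that already constitutes the first matching straight line."""
    return [(anchor, p) for p in candidate_points
            if p != anchor and p != exclude_endpoint]

anchor = (55, 52)  # shared endpoint of the first matching straight line
candidates = [(52, 48), (55, 52), (60, 48), (58, 55)]
# (52, 48) is the other endpoint of the first matching line, so it is excluded
segs = segments_from_anchor(anchor, candidates, exclude_endpoint=(52, 48))
assert segs == [((55, 52), (60, 48)), ((55, 52), (58, 55))]
```

Anchoring every new segment at the previously matched endpoint is what lets the method grow a chain of connected lines rather than re-searching all N×(N−1) segments at each step.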
20. A matching system for matching a reference image and a target image, the matching system comprising:
a geometric shape generator configured to extract a plurality of feature points from the reference image, select a reference search area comprising a predetermined feature point from among the extracted feature points, select a next feature point closest to the predetermined feature point from among the feature points included in the reference search area, generate a reference straight line connecting the predetermined feature point and the next feature point, newly set the reference search area based on the next feature point, and generate an additional straight line by selecting a feature point closest to the next feature point; and
a matcher configured to select a matching straight line corresponding to the reference straight line based on an angle and a length in the target image, and select an additional matching straight line, corresponding to the additional straight line, in the newly set reference search area.
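The chained condition in claims 4, 10, and 16 — that the angle formed by the two reference lines and the length of the second line equal their counterparts among the matching lines — can be sketched as a predicate. Function names and tolerances below are illustrative assumptions, not taken from the disclosure.

```python
import math

def vector(p, q):
    """Displacement vector of the line from point p to point q."""
    return (q[0] - p[0], q[1] - p[1])

def angle_between(u, v):
    """Unsigned angle between two vectors, in radians."""
    dot = u[0] * v[0] + u[1] * v[1]
    nu, nv = math.hypot(*u), math.hypot(*v)
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def chain_condition_holds(ref_line1, ref_line2, match_line1, match_line2,
                          tol_len=1.0, tol_ang=0.05):
    """Check that the angle formed by the two reference lines and the length
    of the second reference line equal (within tolerance) the angle formed by
    the two matching lines and the length of the second matching line."""
    ang_ref = angle_between(vector(*ref_line1), vector(*ref_line2))
    ang_match = angle_between(vector(*match_line1), vector(*match_line2))
    len_ref = math.hypot(*vector(*ref_line2))
    len_match = math.hypot(*vector(*match_line2))
    return abs(ang_ref - ang_match) <= tol_ang and abs(len_ref - len_match) <= tol_len

# Reference chain: line1 and line2 share the second feature point (3, 4);
# the matching chain is the same shape translated into the target image.
ok = chain_condition_holds(((0, 0), (3, 4)), ((3, 4), (7, 4)),
                           ((50, 50), (53, 54)), ((53, 54), (57, 54)))
assert ok
```

Using the inter-line angle (rather than each line's absolute orientation) makes the chained check tolerant of a global rotation between the reference and target images, which is presumably why the claims state the condition this way.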
US14/529,875 2013-11-29 2014-10-31 Image matching method using feature point matching Active 2035-02-09 US9824303B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130147994A KR102149274B1 (en) 2013-11-29 2013-11-29 Method for image matching using a feature matching of the image
KR10-2013-0147994 2013-11-29

Publications (2)

Publication Number Publication Date
US20150154470A1 true US20150154470A1 (en) 2015-06-04
US9824303B2 US9824303B2 (en) 2017-11-21

Family

ID=53265609

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/529,875 Active 2035-02-09 US9824303B2 (en) 2013-11-29 2014-10-31 Image matching method using feature point matching

Country Status (3)

Country Link
US (1) US9824303B2 (en)
KR (1) KR102149274B1 (en)
CN (1) CN104680514B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102132527B1 (en) * 2018-04-25 2020-07-09 한양대학교 산학협력단 Avm system and method for detecting matched point between images in avm system
CN109064401B (en) * 2018-07-26 2023-06-23 南京富士通南大软件技术有限公司 Splicing method of ultra-long shopping list
CN110261923A (en) * 2018-08-02 2019-09-20 浙江大华技术股份有限公司 Contraband detection method and device
KR102471205B1 (en) * 2020-05-15 2022-11-25 한국전자통신연구원 Method for rectification of 2d multi-view images and apparatus for the same
CN115830353A (en) * 2021-09-17 2023-03-21 北京极智嘉科技股份有限公司 Line segment matching method and device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6084989A (en) * 1996-11-15 2000-07-04 Lockheed Martin Corporation System and method for automatically determining the position of landmarks in digitized images derived from a satellite-based imaging system
US20090002475A1 (en) * 2007-06-27 2009-01-01 General Instrument Corporation Apparatus and System for Improving Image Quality
US20100310177A1 (en) * 2009-05-06 2010-12-09 University Of New Brunswick Method of interest point matching for images
US20120076409A1 (en) * 2010-09-29 2012-03-29 Hitachi, Ltd. Computer system and method of matching for images and graphs
US8229222B1 (en) * 1998-07-13 2012-07-24 Cognex Corporation Method for fast, robust, multi-dimensional pattern recognition
US20140118514A1 (en) * 2012-10-26 2014-05-01 Raytheon Company Method and apparatus for image stacking
US20140193068A1 (en) * 2010-05-28 2014-07-10 Zazzle.Com, Inc Using infrared imaging to create digital images for use in product customization

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001250122A (en) 2000-03-06 2001-09-14 Nippon Telegr & Teleph Corp <Ntt> Method for determining position and posture of body and program recording medium for the same
KR20050063991A (en) 2003-12-23 2005-06-29 한국전자통신연구원 Image matching method and apparatus using image pyramid
KR100986809B1 (en) 2008-07-17 2010-10-08 인하대학교 산학협력단 The Method of Automatic Geometric Correction for Multi-resolution Satellite Images using Scale Invariant Feature Transform
KR101117026B1 (en) 2009-12-15 2012-03-16 삼성메디슨 주식회사 Image registration system for performing image registration between different images
JP2012098984A (en) 2010-11-04 2012-05-24 Nomura Research Institute Ltd Business form data correction method and business form data correction program
KR101677561B1 (en) 2010-12-08 2016-11-18 한국전자통신연구원 Image registration device and image registration method thereof
KR101265694B1 (en) 2011-08-30 2013-05-21 위드로봇 주식회사 Image recognition system for DB efficiency and providing method thereof
CN102722731A (en) * 2012-05-28 2012-10-10 南京航空航天大学 Efficient image matching method based on improved scale invariant feature transform (SIFT) algorithm
CN103279955B (en) * 2013-05-23 2016-03-09 中国科学院深圳先进技术研究院 Image matching method and system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160196653A1 (en) * 2014-12-31 2016-07-07 Flir Systems, Inc. Systems and methods for dynamic registration of multimodal images
US9684963B2 (en) * 2014-12-31 2017-06-20 Flir Systems, Inc. Systems and methods for dynamic registration of multimodal images
CN105160308A (en) * 2015-08-20 2015-12-16 武汉大学 Airport target recognition method based on line classification and texture classification
CN106127755A (en) * 2016-06-21 2016-11-16 Feature-based image matching method and device
CN106991405A (en) * 2017-04-10 2017-07-28 Method for establishing a dynamic portrait database
CN109711321A (en) * 2018-12-24 2019-05-03 Structure-adaptive viewpoint-invariant line feature matching method for wide-baseline images
US11210551B2 (en) * 2019-07-29 2021-12-28 Hong Kong Applied Science And Technology Research Institute Co., Ltd. Iterative multi-directional image search supporting large template matching
CN113129634A (en) * 2019-12-31 2021-07-16 Parking space acquisition method, system and communication device
CN111476231A (en) * 2020-06-22 2020-07-31 努比亚技术有限公司 Image area identification method and device and computer readable storage medium

Also Published As

Publication number Publication date
KR20150062880A (en) 2015-06-08
CN104680514A (en) 2015-06-03
US9824303B2 (en) 2017-11-21
KR102149274B1 (en) 2020-08-28
CN104680514B (en) 2019-06-18

Similar Documents

Publication Publication Date Title
US9824303B2 (en) Image matching method using feature point matching
US8693785B2 (en) Image matching devices and image matching methods thereof
WO2021012484A1 (en) Deep learning-based target tracking method and apparatus, and computer readable storage medium
US8416989B2 (en) Image processing apparatus, image capture apparatus, image processing method, and program
JP2020519989A (en) Target identification method, device, storage medium and electronic device
US9361692B2 (en) Image registration device and operation method of the same
US7903840B2 (en) Image processing method, image processing apparatus, image processing program and program recording medium
US9558424B2 (en) On-road stereo visual odometry without explicit pose determinations
US20180089832A1 (en) Place recognition algorithm
CN110705574A (en) Positioning method and device, equipment and storage medium
KR102051032B1 (en) Object detection apparatus and controlling method thereof
CN111612852B (en) Method and apparatus for verifying camera parameters
CN111385490B (en) Video splicing method and device
Zhang et al. A new modified panoramic UAV image stitching model based on the GA-SIFT and adaptive threshold method
US20150228060A1 (en) Information processing apparatus, information processing method, information processing system, and non-transitory computer readable medium
JP2017130042A (en) Video processing apparatus, video processing method, and program
EP2828620A1 (en) Generating navigation data
CN113298187A (en) Image processing method and device, and computer readable storage medium
US20180033215A1 (en) Photographing system for long-distance running event and operation method thereof
CN108765277B (en) Image splicing method and device, computer equipment and storage medium
US8958651B2 (en) Tree-model-based stereo matching
EP2993623B1 (en) Apparatus and method for multi-object detection in a digital image
CN110647609A (en) Visual map positioning method and system
CN112750164B (en) Lightweight positioning model construction method, positioning method and electronic equipment
JP5991166B2 (en) 3D position measurement device, 3D position measurement method, and 3D position measurement program

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OH, JAEYOON;REEL/FRAME:034082/0504

Effective date: 20141029

AS Assignment

Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:SAMSUNG TECHWIN CO., LTD.;REEL/FRAME:036233/0327

Effective date: 20150701

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD;REEL/FRAME:046927/0019

Effective date: 20180401

AS Assignment

Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NUMBER 10/853,669. IN ADDITION PLEASE SEE EXHIBIT A PREVIOUSLY RECORDED ON REEL 046927 FRAME 0019. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:048496/0596

Effective date: 20180401

AS Assignment

Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HANWHA AEROSPACE CO., LTD.;REEL/FRAME:049013/0723

Effective date: 20190417

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: HANWHA VISION CO., LTD., KOREA, REPUBLIC OF

Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:064549/0075

Effective date: 20230228