US20100004856A1 - Positioning device - Google Patents

Positioning device

Info

Publication number
US20100004856A1
Authority
US
United States
Prior art keywords
planimetric feature
planimetric
unit
traffic light
positioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/305,397
Other versions
US8725412B2
Inventor
Norimasa Kobori
Kazunori Kagawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Corp
Original Assignee
Toyota Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp filed Critical Toyota Motor Corp
Assigned to TOYOTA JIDOSHA KABUSHIKI KAISHA. Assignment of assignors' interest (see document for details). Assignors: KAGAWA, KAZUNORI; KOBORI, NORIMASA
Publication of US20100004856A1
Application granted
Publication of US8725412B2
Legal status: Active
Adjusted expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Definitions

  • the present invention relates to positioning devices for detecting positions of moving objects, and more particularly to a positioning device for accurately correcting a position of a moving object that has been determined by an autonomous navigation method.
  • a navigation system locates the position of the vehicle in which it is installed (hereinafter, “self-vehicle”) based on electric waves from GPS (Global Positioning System) satellites, and applies the travel distance and the travel direction with the use of a vehicle speed sensor and a gyro sensor, to accurately estimate the present position of the self-vehicle.
  • Patent document 1 proposes a method of selecting, from map data, a road whose position and orientation best match the position and orientation detected by the autonomous navigation method, and correcting the detected position and orientation by associating them with the selected road.
  • map data included in typical commercially available navigation systems is not so accurate.
  • the road network is expressed by linear links joining intersections (nodes).
  • the map data may not match the actual roads. Accordingly, with the map matching method, the position of the self-vehicle may not be sufficiently corrected.
  • a navigation device for calculating the distance between the self-vehicle and an intersection when an intersection symbol such as a traffic light or a crosswalk is detected in an image photographed by a camera, and correcting the position of the self-vehicle in accordance with the calculated distance (see, for example, patent document 2).
  • the position of the self-vehicle with respect to the traveling direction can be corrected by calculating the distance between the self-vehicle and the intersection, even while traveling on a long straight road.
  • Patent Document 1 Japanese Laid-Open Patent Application No. 2002-213979
  • Patent Document 2 Japanese Laid-Open Patent Application No. H9-243389
  • however, when the distance between the self-vehicle and the intersection is calculated based on photographed image data and the calculated distance is directly used to correct the position of the self-vehicle as described in patent document 2, there may be errors in the distance calculated from the image data. Thus, the corrected position may not be accurate. For example, when the self-vehicle is traveling with pitching motions, a considerably erroneous distance may be calculated.
  • an object of the present invention is to provide a positioning device that can correct positioning results obtained by an autonomous navigation method for locating the position of a moving object with improved accuracy.
  • a positioning device includes a map data storing unit (for example, the map database 5) configured to store map data; autonomous sensors (for example, the vehicle speed sensor 2 and the yaw rate sensor 3) configured to detect behavior information of a moving object; an inertial positioning unit (for example, the INS positioning unit 82) configured to detect an estimated position of the moving object by applying the behavior information detected by the autonomous sensors to positioning results obtained by an electronic navigation positioning unit such as a GPS; a planimetric feature detecting unit (for example, the traffic light detecting unit 84) configured to detect a planimetric feature located around a road; a planimetric feature position identifying unit (for example, the traffic light position identifying unit 83) configured to identify a position of the planimetric feature; a planimetric feature reference positioning unit configured to estimate a planimetric feature estimated position of the moving object by using the position of the planimetric feature as a reference; and a position estimating unit configured to estimate a position of the moving object by applying the estimated position and the planimetric feature estimated position to a Kalman filter.
  • a positioning device which is capable of correcting positioning results obtained by an autonomous navigation method to locate the position of a moving object with improved accuracy.
  • FIG. 1 is a schematic block diagram of a navigation system to which a positioning device is applied;
  • FIG. 2A is a functional block diagram of the positioning device
  • FIG. 2B is for describing the operation of a position estimating unit in the positioning device shown in FIG. 2A ;
  • FIG. 3A illustrates the positional relationship between a traffic light and a self-vehicle
  • FIG. 3B illustrates the position of the traffic light identified by a least squares method
  • FIG. 3C indicates a final estimated position estimated based on the estimated position and the planimetric feature estimated position
  • FIG. 4A illustrates an example of an eye vector
  • FIG. 4B illustrates an example of identifying the position of the traffic light by a least squares method
  • FIG. 5A illustrates the relationship between the evaluation function and the positions of the traffic lights when the sample size is sufficiently large
  • FIG. 5B illustrates the relationship between the evaluation function and the positions of the traffic lights when the sample size is small
  • FIG. 6 is a flowchart of procedures performed by the positioning device for estimating the position of the self-vehicle with a positioning operation based on the autonomous navigation method and the position of the traffic light;
  • FIG. 7 illustrates an example of correcting the estimated position located by the autonomous navigation method to obtain the final estimated position
  • FIG. 8A illustrates positioning results obtained by the positioning device in a state where the GPS waves are blocked along an ordinary road
  • FIG. 8B is a diagram in which the estimated positions and the final estimated positions obtained by the autonomous navigation method are plotted on a photograph showing the plan view of an actual road.
  • FIG. 9 shows an example of a map in which positions of traffic lights are registered in a road network obtained from map data.
  • FIG. 1 is a schematic block diagram of a navigation system 10 to which a positioning device 9 according to the present embodiment is applied.
  • the navigation system 10 is controlled by a navigation ECU (Electrical Control Unit) 8 .
  • the navigation ECU 8 is a computer including a CPU for executing programs, a storage device for storing programs (hard disk drive, ROM), a RAM for temporarily storing data and programs, an input output device for inputting and outputting data, and a NV (NonVolatile)-RAM, which are connected to each other via a bus.
  • the units connected to the navigation ECU 8 include a GPS receiving device 1 for receiving electric waves from GPS (Global Positioning System) satellites, a vehicle speed sensor 2 for detecting the speed of the vehicle, a yaw rate sensor 3 (or gyro sensor) for detecting the rotational speed around the gravity center of the vehicle, a rudder angle sensor 4 for detecting the rudder angle of the steering wheel, a map DB (database) 5 for storing map data, an input device 6 for operating the navigation system 10 , and a display device 7 such as a liquid crystal device or a HUD (Heads Up Display) for displaying the present position in the map.
  • the map DB 5 is a table-type database configured by associating the actual road network with nodes (for example, points where roads intersect each other or points indicating predetermined intervals from an intersection) and links (roads connecting the nodes).
  • the navigation ECU 8 extracts the map data around the detected present position, and displays the map on the display device 7 provided in the vehicle interior at a specified scale size.
  • the navigation ECU 8 displays the present position of the vehicle by superposing it on the map according to need.
  • the navigation ECU 8 searches for the route from the detected present position to the destination by a known route searching method such as a Dijkstra method, displays the route by superposing it on a map, and guides the driver along the route by giving directions at intersections to turn left or right.
  • a camera 11 is fixed at the front part of the vehicle, more preferably on the backside of the rearview mirror or at the top of the windshield, in order to photograph a predetermined range in front of the vehicle.
  • the camera 11 has a photoelectric conversion element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and performs photoelectric conversion on the incident light with a photoelectric conversion element, reads and amplifies the accumulated electric charge as a voltage to perform A/D conversion, and then converts it into a digital image having predetermined brightness grayscale levels (for example, 256 levels of grayscale values).
  • the camera 11 preferably has a function of obtaining distance information, in order to detect the positional relationship between the self-vehicle and artificial planimetric features (traffic lights, signs, paint of crosswalks, electric utility poles, etc.) located along a road at intersections, etc.
  • examples of the camera 11 are a stereo camera including two cameras and a motion stereo camera for performing stereoscopic viewing with time-series imagery obtained with a single camera mounted on a moving object.
  • Another example of the camera 11 is one that obtains the distance by radiating near-infrared rays from an LED (light-emitting diode) at predetermined time intervals and measuring the time until the photoelectric conversion element receives the reflected rays.
  • FIG. 2A is a functional block diagram of the positioning device 9 .
  • the positioning device 9 includes a GPS positioning unit 81 for locating the position of the self-vehicle based on electric waves from GPS satellites, an INS (Inertial Navigation Sensor) positioning unit 82 for locating the position of the self-vehicle by the autonomous navigation method with the use of autonomous sensors (vehicle speed sensor 2, rudder angle sensor 4), a traffic light detecting unit 84 for detecting planimetric features such as traffic lights based on image data photographed by the camera 11, a traffic light position identifying unit 83 for identifying the position of a detected traffic light, a planimetric feature reference positioning unit 85 for locating the position of the self-vehicle based on the position of the traffic light, a position estimating unit 86 for outputting the maximum likelihood value of the position of the self-vehicle with a Kalman filter, and a map data registering unit 87 for registering, in the map DB 5, the position information of a planimetric feature identified by the traffic light position identifying unit 83, in association with the planimetric feature.
  • the positioning device 9 can perform accurate positioning by correcting the position of the vehicle located by the autonomous navigation method, even when it is difficult to capture the GPS satellite waves or the reliability of the GPS positioning operation is degraded.
  • FIG. 2B is for describing the operation of the position estimating unit 86 in the positioning device shown in FIG. 2A .
  • the positioning device 9 uses a Kalman filter for coupling the position located by the autonomous navigation method and the position located based on an eye vector to a planimetric feature such as a traffic light, in order to accurately estimate a position Y.
  • the GPS positioning unit 81 locates the position of the self-vehicle based on electric waves from the GPS satellites by a known method.
  • the GPS positioning unit 81 selects four or more GPS satellites that are within a predetermined elevation angle from the present position of the vehicle from among plural GPS satellites rotating along predetermined orbits, and receives electric waves from the selected GPS satellites.
  • the GPS positioning unit 81 measures the arrival time of each electric wave, and calculates the distance to the corresponding GPS satellite from the propagation time and the speed of light c. Accordingly, the point at which the three distances between the GPS satellites and the self-vehicle intersect each other is determined as the position of the self-vehicle.
  • the positioning device 9 locates the position of the self-vehicle at every predetermined time interval, while receiving GPS waves. When the GPS waves are blocked, the position most recently determined is set to be the initial position and the traveling direction at this point is determined to be the initial direction. Then, a positioning operation starts by performing an autonomous navigation method of applying traveling distances and traveling directions to the initial position and direction.
  • FIG. 3A illustrates a position detected by the autonomous navigation method.
  • the self-vehicle is traveling toward the intersection, and the GPS waves are blocked at an initial position 23 .
  • a node 22 of the intersection is extracted from the map DB 5 , and therefore the position of the node 22 is already known.
  • the INS positioning unit 82 detects the vehicle speed from the vehicle speed sensor 2 and the rudder angle from the rudder angle sensor 4 , applies the traveling distance and traveling direction to the initial position 23 and the initial direction, to estimate the position and direction by the autonomous navigation method (hereinafter, “estimated position” and “estimated direction”).
  • the autonomous sensor for detecting the traveling direction of the self-vehicle may be a gyro sensor or a yaw rate sensor.
  • the error variance of the estimated position is necessary for applying the Kalman filter.
  • the errors of the vehicle speed sensor 2 and the rudder angle sensor 4 are already known according to the speed and the time that the GPS waves are blocked.
  • the error variance of an estimated position 24 obtained by applying the distance based on these errors is already known.
  • the error variance of the estimated position 24 is indicated by an oval with dashed lines.
  • at this time (time t), the self-vehicle is at an actual position 25.
  • the traffic light detecting unit 84 detects the traffic light from the image data photographed by the camera 11 .
  • the detected position of the traffic light is used to estimate the position of the self-vehicle.
  • a traffic light 21 can be any planimetric feature as long as it can be used for estimating the position of the self-vehicle, such as a sign or an electric utility pole.
  • the traffic light detecting unit 84 detects the traffic light by a pattern matching method with the use of a reference pattern in which the position of the traffic light is stored beforehand.
  • the traffic light detecting unit 84 scans the pixel values of the image data (brightness) in the horizontal direction and the vertical direction, and extracts an edge portion having a gradient of more than or equal to a predetermined level. Adjacent edge portions are joined to extract the outline of the photograph object, and pattern matching is performed on the extracted outline with the use of the reference pattern.
  • the outline is a rectangle, and therefore pattern matching can be performed only on the edge portions corresponding to the outline of a predetermined horizontal to vertical ratio.
  • the traffic light detecting unit 84 compares the brightness of each pixel of a region surrounded by the outline with that of the reference pattern. When the brightness values correlate with each other by more than a predetermined level, it is determined that the traffic light is being photographed, and the traffic light is detected.
  • upon detecting the traffic light, the traffic light detecting unit 84 extracts distance information from the image data. As described above, the distance information is extracted from the parallax between two sets of image data, for example. From a pair of stereo images photographed by the camera 11, a part of the same photograph object (traffic light) is extracted. The same points of the traffic light included in the pair of stereo images are associated with each other, and the displacement amount (parallax) between the associated points is obtained, thereby calculating the distance between the self-vehicle and the traffic light. That is, when the pair of image data items are superposed, the traffic light appears to be displaced in the horizontal direction due to the parallax.
  • the position where the images best overlap is obtained by shifting one of the images by one pixel at a time, based on the correlation between the pixel values.
  • assuming that the number of shifted pixels is n, the focal length of the lens is f, the distance between the optical axes is m, and the pixel pitch is d, the distance L between the self-vehicle and the photograph object is calculated by the relational expression L = (f·m)/(n·d), where (n·d) represents the parallax.
  • the traffic light detecting unit 84 calculates the eye vector joining the detected traffic light and the self-vehicle. Assuming that the direction in which the camera 11 is fixed (the front direction of the camera 11) corresponds to zero, the direction θ of the eye vector is obtained from the distance L between the self-vehicle and the traffic light and from the position at which the traffic light is imaged on the photoelectric conversion element.
  • FIG. 4A illustrates an example of the eye vector.
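  • As a rough illustration of how such an eye vector could be computed, the sketch below assumes a simple pinhole camera model; the function names, camera parameters, and camera-fixed coordinate convention are illustrative assumptions, not taken from the patent.

```python
import math

def eye_vector_direction(u_pixel, u_center, pixel_pitch_m, focal_length_m):
    """Direction theta (rad) of the line of sight to the traffic light,
    measured from the camera's forward direction (pinhole-model sketch)."""
    offset_m = (u_pixel - u_center) * pixel_pitch_m   # horizontal offset on the sensor
    return math.atan2(offset_m, focal_length_m)

def eye_vector(distance_L, theta):
    """Eye vector from the self-vehicle to the traffic light in a
    camera-fixed frame (x forward, y sideways)."""
    return (distance_L * math.cos(theta), distance_L * math.sin(theta))

# Example: traffic light imaged 120 px right of center, 6 um pitch, 8 mm lens
theta = eye_vector_direction(760, 640, 6e-6, 0.008)
print(theta, eye_vector(35.0, theta))
```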
  • the map DB 5 stores the coordinates of nodes, information indicating whether there are intersections, and the types of intersections.
  • the map DB 5 may store information indicating whether a traffic light is placed and coordinates expressing where the traffic light is placed (hereinafter, “traffic light coordinates”). When traffic light coordinates are stored, the absolute position of the traffic light is already determined, and therefore the traffic light coordinates of the detected traffic light can be acquired.
  • the traffic light position identifying unit 83 identifies the position of the traffic light with the use of one of i) traffic light coordinates, ii) a least squares method, and iii) a least squares method to which a maximum grade method is applied.
  • FIG. 4B illustrates an example of identifying the position of the traffic light by the least squares method. Every time a traffic light is detected, distances L1, L2, . . . , Ln are obtained, and similarly, eye vector directions θ1, θ2, . . . , θn are obtained. Thus, a set of points (a, b) can be obtained from the pairs (L, θ), each consisting of a distance from the self-vehicle and a direction of the eye vector. The positions of the traffic lights in the height direction are substantially equal, and therefore (a, b) are coordinates in a plane parallel to the road.
  • a linear model is determined as ai ≈ k0 + k1·Li + k2·θi and bi ≈ m0 + m1·Li + m2·θi.
  • the square errors ε²k(k0, k1, k2) and ε²m(m0, m1, m2) are as follows, where the sample size is N and i runs from 1 through N: ε²k = (1/N)·Σ{ai − (k0 + k1·Li + k2·θi)}² and ε²m = (1/N)·Σ{bi − (m0 + m1·Li + m2·θi)}².
  • the parameters are obtained by partial differentiation (∂ in ∂ε/∂k denotes the partial derivative) and by setting each partial derivative to zero; bi can be obtained in a similar manner.
  • the above linear model is one example.
  • the relationship of (a, b) and (L, ⁇ ) can be nonlinear.
  • One example is ai ⁇ f(L)+g( ⁇ ).
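  • As a sketch of how the linear model above could be fit in practice, the following uses ordinary least squares over the N observed pairs (Li, θi); the data layout, the use of numpy.linalg.lstsq, and the way the fitted values are collapsed to a single traffic-light position are illustrative assumptions rather than the patent's exact procedure.

```python
import numpy as np

def fit_traffic_light(L, theta, a, b):
    """Least-squares fit of a_i ~ k0 + k1*L_i + k2*theta_i and
    b_i ~ m0 + m1*L_i + m2*theta_i from N >= 4 observations."""
    L = np.asarray(L, float)
    theta = np.asarray(theta, float)
    X = np.column_stack([np.ones_like(L), L, theta])   # rows: [1, L_i, theta_i]
    k, *_ = np.linalg.lstsq(X, np.asarray(a, float), rcond=None)
    m, *_ = np.linalg.lstsq(X, np.asarray(b, float), rcond=None)
    # One simple way to report a single (a, b): average the fitted coordinates.
    est = np.column_stack([X @ k, X @ m]).mean(axis=0)
    return k, m, est
```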
  • FIG. 3B illustrates the position of the traffic light 21 identified by a least squares method.
  • the position of the traffic light 21 can be identified based on the distance L from the self-vehicle to the traffic light detected until the vehicle reaches the estimated position 24 at the time t, and the direction of the eye vector ⁇ .
  • the sample size N needs to be four or more.
  • the traffic light position identifying unit 83 does not identify the position of the traffic light when the sample size N is less than four, i.e., when the number of times that the traffic signal is detected and the eye vector is calculated is less than four. Accordingly, it is possible to prevent the accuracy in locating the position of the self-vehicle from declining when the Kalman filter is applied. In this case, the positioning device 9 outputs the estimated position 24 and the estimated direction.
  • the concept of the least squares method is to set “the total sum of the square errors acquired each time the distance between the position of the vehicle and the position of the traffic signal is measured” as the evaluation function, and to obtain the position of the traffic light that minimizes this evaluation function.
  • a description is given of a method of obtaining the position of the traffic light with which the evaluation function is minimized based on the slope of the evaluation function, by applying the maximum grade method (steepest descent).
  • FIG. 5A illustrates the relationship between the evaluation function and the positions of the traffic lights.
  • the value of the evaluation function fluctuates according to the calculated position of the traffic light, and there is a minimum value of the evaluation function.
  • the maximum grade method starts with an appropriate initial value, and the parameter value is gradually changed in the direction opposite to that of the differential value to approach an optimum parameter.
  • the evaluation function corresponds to the least squares error, and therefore the partial differentiation of each parameter is calculated.
  • the following formulae are used as the updating formulae for parameters k 0 , k 1 , k 2 .
  • k0^(j+1) = k0^(j) + 2α·(1/N)·Σ{ai − (k0 + k1·Li + k2·θi)}
  • k1^(j+1) = k1^(j) + 2α·(1/N)·Σ{ai − (k0 + k1·Li + k2·θi)}·Li
  • k2^(j+1) = k2^(j) + 2α·(1/N)·Σ{ai − (k0 + k1·Li + k2·θi)}·θi
  • the superscript j starts with 0; when k0^(j+1) reaches the value at which the evaluation function becomes minimum, the calculation is aborted. Whether the evaluation function has become minimum can be determined by whether the differential value (slope) has become substantially zero. In this manner, the parameters k0, k1, k2 can be obtained.
  • the position of the traffic light at which the evaluation function becomes minimum is indicated as global min.
  • FIG. 5B illustrates the relationship between the evaluation function and the positions of the traffic light when the sample size is small.
  • here, global min indicates the position of the traffic light corresponding to the minimum value on the right side; however, the calculation may instead settle at the position corresponding to the minimum value on the left side (local min), or the value of the evaluation function is likely to diverge.
  • for this reason, a value obtained from the node position of the intersection is set as the initial value for the maximum grade method, which is the same as setting the node position as the initial value.
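  • A minimal sketch of the update formulas above, written as plain steepest descent on ε²k with the node-derived value as the initial parameter vector; the learning rate α, the stopping tolerance, and the function names are illustrative assumptions.

```python
import numpy as np

def maximum_grade_fit(L, theta, a, k_init, alpha=0.01, tol=1e-6, max_iter=10000):
    """Steepest descent on eps2 = (1/N) * sum_i {a_i - (k0 + k1*L_i + k2*theta_i)}^2,
    starting from k_init (e.g. a value obtained from the intersection node)."""
    L, theta, a = (np.asarray(v, float) for v in (L, theta, a))
    N = len(a)
    k = np.asarray(k_init, float).copy()
    for _ in range(max_iter):
        residual = a - (k[0] + k[1] * L + k[2] * theta)
        grad = (-2.0 / N) * np.array([residual.sum(),
                                      (residual * L).sum(),
                                      (residual * theta).sum()])
        k -= alpha * grad                  # same as k + 2*alpha*(1/N)*sum{...}
        if np.linalg.norm(grad) < tol:     # slope substantially zero -> converged
            break
    return k
```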
  • the planimetric feature reference positioning unit 85 locates the position of the self-vehicle (hereinafter, “planimetric feature estimated position” and “planimetric feature estimated direction”) based on the position of the traffic light 21 acquired with the use of one of i) traffic light coordinates, ii) a least squares method, and iii) a least squares method to which a maximum grade method is applied.
  • FIG. 3C indicates a planimetric feature estimated position 26 of the self-vehicle located based on the distance L from the position of the traffic light and the eye vector direction ⁇ .
  • the planimetric feature reference positioning unit 85 calculates the planimetric feature estimated position 26 and the planimetric feature estimated direction based on the distance L and the eye vector direction ⁇ at the time t.
  • the error variance of the planimetric feature estimated position 26 can be obtained from an error ⁇ L of the distance L and an error ⁇ of the eye vector ⁇ .
  • when the least squares method of ii) or iii) is used to identify the position of the traffic light, in addition to the process of i), it is possible to obtain the error variance from errors found in the course of calculating the parameters.
  • the error variance of the planimetric feature estimated position 26 is indicated by an oval with dashed lines.
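  • The following sketch shows one way the planimetric feature estimated position could be computed, by stepping back from the identified traffic-light position along the line of sight, together with a crude error-variance estimate from ΔL and Δθ; the frame conventions and the variance formula are illustrative assumptions, not the patent's exact computation.

```python
import math

def position_from_traffic_light(light_xy, distance_L, theta, heading):
    """Planimetric feature estimated position: start at the traffic light and
    step back by L along the line of sight (heading + theta, map frame)."""
    bearing = heading + theta
    return (light_xy[0] - distance_L * math.cos(bearing),
            light_xy[1] - distance_L * math.sin(bearing))

def position_error_variance(distance_L, dL, dtheta):
    """Rough variance of the estimate: dL acts along the line of sight and
    L*dtheta across it; here the two contributions are simply summed."""
    return dL ** 2 + (distance_L * dtheta) ** 2
```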
  • the position estimating unit 86 uses a Kalman filter for coupling the estimated position 24 and the planimetric feature estimated position 26 , and outputs the final estimated position and direction of the self-vehicle having the highest probability.
  • FIG. 3C indicates a final estimated position 27 that is estimated based on the estimated position 24 and the planimetric feature estimated position 26 .
  • each error variance is indicated with convexities at the estimated position 24 and the planimetric feature estimated position 26 .
  • the variance actually extends in a three-dimensional manner.
  • FIG. 2B illustrates how a maximum likelihood value Y is estimated with the dispersions and the Kalman filter.
  • with the Kalman filter, when the status of each system is estimated separately, the status having the highest probability (the status where the product of the distributions is maximum) is estimated based on the distribution of the probability density of the statuses. Accordingly, by coupling the two sets of positioning information with the use of the Kalman filter, it is possible to estimate the final estimated position 27 where the self-vehicle is most probably located.
  • assuming that the error variance of Z is R and the error variance of X is M, the maximum likelihood value Y is obtained by the following formula:
  • K(i) is the Kalman gain matrix, which can be expressed as follows.
  • K(i) = A(i)·Hi^T·Ri^(−1)
  • the error variance of the estimated position 24 and the error variance of the planimetric feature estimated position 26 are already obtained, and therefore the position estimating unit 86 outputs the final estimated position 27 based on these formulae.
  • the direction of the self-vehicle can be obtained in the same manner.
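  • Because the formula for Y is reproduced in the original only as an image, the sketch below uses the standard Kalman measurement update for two estimates of the same 2-D position with an identity observation matrix, so that K = M(M + R)⁻¹; treating H as the identity is an assumption made for illustration, and all names are placeholders.

```python
import numpy as np

def fuse(x_ins, M, z_feat, R):
    """Couple the estimated position x_ins (error variance M) with the
    planimetric feature estimated position z_feat (error variance R):
        K = M (M + R)^-1,   Y = x_ins + K (z_feat - x_ins)."""
    x_ins, z_feat = np.asarray(x_ins, float), np.asarray(z_feat, float)
    M, R = np.atleast_2d(M), np.atleast_2d(R)
    K = M @ np.linalg.inv(M + R)               # Kalman gain
    Y = x_ins + K @ (z_feat - x_ins)           # final estimated position
    P = (np.eye(K.shape[0]) - K) @ M           # variance of the fused estimate
    return Y, P

# Example: INS estimate with large variance, feature-based estimate with small variance
Y, P = fuse([105.0, 3.0], np.diag([9.0, 9.0]), [108.0, 2.0], np.diag([1.0, 1.0]))
```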
  • FIG. 6 is a flowchart of procedures performed by the positioning device 9 for estimating the position of the self-vehicle with a positioning operation based on the autonomous navigation method and the position of the traffic light.
  • the positioning device 9 determines whether the GPS waves have been blocked at each positioning interval of the GPS positioning unit 81 , for example (step S 1 ). When the GPS waves are not blocked (No in step S 1 ), the position of the self-vehicle is located by using the GPS waves (step S 2 ).
  • the GPS positioning unit 81 stores the initial position and the initial direction in the storage device of the navigation ECU 8 , and the INS positioning unit 82 applies traveling distances and traveling directions to the initial position 23 and the initial direction based on the vehicle speed and the rudder angle, to estimate the estimated position 24 and the estimated direction by the autonomous navigation method (step S 3 ).
  • the INS positioning unit 82 calculates the dispersions of the accumulated estimated position 24 and estimated direction.
  • the traffic light detecting unit 84 repeatedly detects the traffic light 21 in parallel with the positioning operation performed by the autonomous navigation method (step S 4 ). When the traffic light is detected, the traffic light detecting unit 84 extracts distance information from a pair of image data items. Furthermore, the traffic light detecting unit 84 calculates the eye vector joining the detected traffic light and the self-vehicle (step S 5 ).
  • the traffic light position identifying unit 83 refers to the distances and the eye vectors between the self-vehicle and the traffic light which have been obtained for the past N times (step S 6 ).
  • the traffic light position identifying unit 83 determines whether N is four or more (step S 7 ). When the distance and eye vector have not been obtained for four or more times (No in step S 7 ), the positioning device 9 outputs a positioning result obtained by the autonomous navigation method.
  • the traffic light position identifying unit 83 extracts, from the map DB 5 , the position information of the node of the intersection located in the traveling direction (step S 8 ).
  • the traffic light position identifying unit 83 calculates the position of the traffic light by the least squares method (step S 9 ).
  • the maximum grade method is performed using the position information of the node as the initial value. Accordingly, the position of the traffic light can be identified.
  • the planimetric feature reference positioning unit 85 calculates the planimetric feature estimated position 26 and the planimetric feature estimated direction from the distance L and the eye vector direction θ, using the identified position of the traffic light as the origin (step S 10 ).
  • the position estimating unit 86 uses a Kalman filter for coupling the estimated position 24 and the planimetric feature estimated position 26 , and outputs the final estimated position 27 (step S 11 ).
  • FIG. 7 illustrates an example of correcting the estimated position 24 located by the autonomous navigation method to obtain the final estimated position 27 .
  • FIG. 7 illustrates how the estimated position located by the autonomous navigation method is corrected every time the final estimated position 27 is output.
  • the position of the self-vehicle is located by using as a reference the position of the traffic light located in the traveling direction, and the position located by the autonomous navigation method is corrected. Therefore, the position particularly in the traveling direction can be accurately corrected.
  • FIG. 8 illustrates positioning results obtained by the positioning device 9 in a state where the GPS waves are blocked along an ordinary road.
  • FIG. 8A shows image data photographed by the camera 11 .
  • the traffic light 21 is detected at the top part of the image data.
  • FIG. 8B is a diagram in which the estimated positions 24 and the final estimated positions 27 obtained by the autonomous navigation method are plotted on a photograph showing the plan view of an actual road.
  • the self-vehicle traveled from the bottom part toward the top part of the photograph of the plan view.
  • the estimated position 24 becomes more and more displaced from the actual road.
  • the displacement from the road can be considerably reduced.
  • positions of landmarks such as traffic lights are detected in the course of correcting the position of the self-vehicle located by the autonomous navigation method.
  • the map DB 5 in a typical navigation system does not store position information of road infrastructure items (traffic lights, signs, etc.), as described above.
  • the positioning device 9 registers, in the map DB 5 , positions of traffic lights, etc., detected in the course of correcting the position of the self-vehicle, and positions of traffic lights, etc., detected when the GPS waves are not blocked. Accordingly, a database of position information of road infrastructure items can be created.
  • FIG. 9 shows an example of a map in which positions of traffic lights are registered in a road network obtained from map data.
  • the white circles indicate the registered traffic lights.
  • the positions of road infrastructure items can be detected by a camera installed in the vehicle quickly and at low cost. Thus, even if a new road infrastructure item is installed, this can be reflected in the database when the vehicle travels, thereby providing excellent redundancy.
  • the position of a planimetric feature such as a traffic light is identified, and the position of the self-vehicle is located by using the identified position of the planimetric feature as a reference.
  • the position of the self-vehicle located in this manner and a position located by the autonomous navigation method are applied to the Kalman filter. Accordingly, the positioning results obtained by the autonomous navigation method can be accurately corrected even if the GPS waves are blocked. In particular, the position in the traveling direction can be accurately corrected.
  • the map database in which position information of road infrastructure items is registered can be updated during usage.

Abstract

A positioning device 9 includes a map data storing unit 5 configured to store map data; autonomous sensors 2, 3 configured to detect behavior information of a moving object; an inertial positioning unit 82 configured to detect an estimated position of the moving object by applying the behavior information detected by the autonomous sensors to positioning results obtained by an electronic navigation positioning unit such as a GPS; a planimetric feature detecting unit 84 configured to detect a planimetric feature located around a road; a planimetric feature position identifying unit 83 configured to identify a position of the planimetric feature; a planimetric feature reference positioning unit 85 configured to estimate a planimetric feature estimated position of the moving object by using the position of the planimetric feature as a reference; and a position estimating unit 86 configured to estimate a position of the moving object by applying the estimated position and the planimetric feature estimated position to a Kalman filter.

Description

    TECHNICAL FIELD
  • The present invention relates to positioning devices for detecting positions of moving objects, and more particularly to a positioning device for accurately correcting a position of a moving object that has been determined by an autonomous navigation method.
  • BACKGROUND ART
  • A navigation system locates the position of the vehicle in which it is installed (hereinafter, “self-vehicle”) based on electric waves from GPS (Global Positioning System) satellites, and applies the travel distance and the travel direction with the use of a vehicle speed sensor and a gyro sensor, to accurately estimate the present position of the self-vehicle.
  • However, when electric waves cannot be received from the GPS satellites, the error in the position located by the autonomous navigation method is amplified along with the passage of time. Thus, the accuracy of the position gradually declines.
  • Accordingly, various methods have been proposed for correcting the position of the self-vehicle located by the autonomous navigation method. For example, map matching is for correcting the position located by the autonomous navigation method with the use of map data of the navigation system (see, for example, patent document 1). Patent document 1 proposes a method of selecting, from map data, a road whose position and orientation best match the position and orientation detected by the autonomous navigation method, and correcting the detected position and orientation by associating them with the selected road.
  • However, the map data included in typical commercially available navigation systems is not so accurate. Furthermore, in the map data, the road network is expressed by linear links joining intersections (nodes). Thus, the map data may not match the actual roads. Accordingly, with the map matching method, the position of the self-vehicle may not be sufficiently corrected.
  • Furthermore, there is proposed a navigation device for calculating the distance between the self-vehicle and an intersection when an intersection symbol such as a traffic light or a crosswalk is detected in an image photographed by a camera, and correcting the position of the self-vehicle in accordance with the calculated distance (see, for example, patent document 2). According to patent document 2, the position of the self-vehicle with respect to the traveling direction can be corrected by calculating the distance between the self-vehicle and the intersection, even while traveling on a long straight road.
  • Patent Document 1: Japanese Laid-Open Patent Application No. 2002-213979
  • Patent Document 2: Japanese Laid-Open Patent Application No. H9-243389
  • DISCLOSURE OF INVENTION Problems to be Solved by the Invention
  • However, when the distance between the self-vehicle and the intersection is calculated based on photographed image data, and the calculated distance is directly used to correct the position of the self-vehicle as described in patent document 2, there may be errors in the distance calculated based on photographed image data. Thus, the corrected position may not be accurate. For example, when the self-vehicle is traveling with pitching motions, a considerably erroneous distance may be calculated.
  • In view of the above problems, an object of the present invention is to provide a positioning device that can correct positioning results obtained by an autonomous navigation method for locating the position of a moving object with improved accuracy.
  • Means for Solving the Problems
  • In order to achieve the above objects, a positioning device according to the present invention includes a map data storing unit (for example, the map database 5) configured to store map data; autonomous sensors (for example, the vehicle speed sensor 2 and the yaw rate sensor 3) configured to detect behavior information of a moving object; an inertial positioning unit (for example, the INS positioning unit 82) configured to detect an estimated position of the moving object by applying the behavior information detected by the autonomous sensors to positioning results obtained by an electronic navigation positioning unit such as a GPS; a planimetric feature detecting unit (for example, the traffic light detecting unit 84) configured to detect a planimetric feature located around a road; a planimetric feature position identifying unit (for example, the traffic light position identifying unit 83) configured to identify a position of the planimetric feature; a planimetric feature reference positioning unit configured to estimate a planimetric feature estimated position of the moving object by using the position of the planimetric feature as a reference; and a position estimating unit configured to estimate a position of the moving object by applying the estimated position and the planimetric feature estimated position to a Kalman filter.
  • ADVANTAGEOUS EFFECT OF THE INVENTION
  • According to the present invention, a positioning device can be provided, which is capable of correcting positioning results obtained by an autonomous navigation method to locate the position of a moving object with improved accuracy.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic block diagram of a navigation system to which a positioning device is applied;
  • FIG. 2A is a functional block diagram of the positioning device;
  • FIG. 2B is for describing the operation of a position estimating unit in the positioning device shown in FIG. 2A;
  • FIG. 3A illustrates the positional relationship between a traffic light and a self-vehicle;
  • FIG. 3B illustrates the position of the traffic light identified by a least squares method;
  • FIG. 3C indicates a final estimated position estimated based on the estimated position and the planimetric feature estimated position;
  • FIG. 4A illustrates an example of an eye vector;
  • FIG. 4B illustrates an example of identifying the position of the traffic light by a least squares method;
  • FIG. 5A illustrates the relationship between the evaluation function and the positions of the traffic lights when the sample size is sufficiently large;
  • FIG. 5B illustrates the relationship between the evaluation function and the positions of the traffic lights when the sample size is small;
  • FIG. 6 is a flowchart of procedures performed by the positioning device for estimating the position of the self-vehicle with a positioning operation based on the autonomous navigation method and the position of the traffic light;
  • FIG. 7 illustrates an example of correcting the estimated position located by the autonomous navigation method to obtain the final estimated position;
  • FIG. 8A illustrates positioning results obtained by the positioning device in a state where the GPS waves are blocked along an ordinary road;
  • FIG. 8B is a diagram in which the estimated positions and the final estimated positions obtained by the autonomous navigation method are plotted on a photograph showing the plan view of an actual road; and
  • FIG. 9 shows an example of a map in which positions of traffic lights are registered in a road network obtained from map data.
  • EXPLANATION OF REFERENCES
    • 1 GPS receiving device
    • 2 vehicle speed sensor
    • 3 yaw rate sensor
    • 4 rudder angle sensor
    • 5 map database
    • 6 input device
    • 7 display device
    • 8 navigation ECU
    • 9 positioning device
    • 10 navigation system
    • 11 camera
    • 21 traffic light
    • 22 node of intersection
    • 23 initial position
    • 24 estimated position
    • 25 actual position of self vehicle
    • 26 planimetric feature estimated position
    • 27 final estimated position
    BEST MODE FOR CARRYING OUT THE INVENTION
  • The best mode for carrying out the invention is described based on the following embodiments with reference to the accompanying drawings.
  • FIG. 1 is a schematic block diagram of a navigation system 10 to which a positioning device 9 according to the present embodiment is applied. The navigation system 10 is controlled by a navigation ECU (Electrical Control Unit) 8. The navigation ECU 8 is a computer including a CPU for executing programs, a storage device for storing programs (hard disk drive, ROM), a RAM for temporarily storing data and programs, an input output device for inputting and outputting data, and a NV (NonVolatile)-RAM, which are connected to each other via a bus.
  • The units connected to the navigation ECU 8 include a GPS receiving device 1 for receiving electric waves from GPS (Global Positioning System) satellites, a vehicle speed sensor 2 for detecting the speed of the vehicle, a yaw rate sensor 3 (or gyro sensor) for detecting the rotational speed around the gravity center of the vehicle, a rudder angle sensor 4 for detecting the rudder angle of the steering wheel, a map DB (database) 5 for storing map data, an input device 6 for operating the navigation system 10, and a display device 7 such as a liquid crystal device or a HUD (Heads Up Display) for displaying the present position in the map.
  • The map DB 5 is a table-type database configured by associating the actual road network with nodes (for example, points where roads intersect each other or points indicating predetermined intervals from an intersection) and links (roads connecting the nodes).
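  • A minimal sketch of the kind of node/link table described above (all IDs, coordinates, and field names are invented for illustration):

```python
# Nodes: intersections, or points at predetermined intervals from an intersection.
nodes = {
    101: {"lat": 35.0000, "lon": 137.0000, "intersection": True,  "traffic_light": True},
    102: {"lat": 35.0009, "lon": 137.0000, "intersection": False, "traffic_light": False},
}

# Links: roads connecting the nodes.
links = [
    {"from": 101, "to": 102, "length_m": 100.0},
]
```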
  • The navigation ECU 8 extracts the map data around the detected present position, and displays the map on the display device 7 provided in the vehicle interior at a specified scale size. The navigation ECU 8 displays the present position of the vehicle by superposing it on the map according to need.
  • Furthermore, when the destination is input from the input device 6 such as a press-down-type keyboard or a remote controller, the navigation ECU 8 searches for the route from the detected present position to the destination by a known route searching method such as a Dijkstra method, displays the route by superposing it on a map, and guides the driver along the route by giving directions at intersections to turn left or right.
  • A camera 11 is fixed at the front part of the vehicle, more preferably on the backside of the rearview mirror or at the top of the windshield, in order to photograph a predetermined range in front of the vehicle.
  • The camera 11 has a photoelectric conversion element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and performs photoelectric conversion on the incident light with a photoelectric conversion element, reads and amplifies the accumulated electric charge as a voltage to perform A/D conversion, and then converts it into a digital image having predetermined brightness grayscale levels (for example, 256 levels of grayscale values).
  • The camera 11 preferably has a function of obtaining distance information, in order to detect the positional relationship between the self-vehicle and artificial planimetric features (traffic lights, signs, paint of crosswalks, electric utility poles, etc.) located along a road at intersections, etc. Accordingly, examples of the camera 11 are a stereo camera including two cameras and a motion stereo camera for performing stereoscopic viewing with time-series imagery obtained with a single camera mounted on a moving object. Another example of the camera 11 is one that obtains the distance by radiating near-infrared rays from an LED (light-emitting diode) at predetermined time intervals and measuring the time until the photoelectric conversion element receives the reflected rays.
  • The CPU of the navigation ECU 8 executes a program stored in the storage device to implement the positioning operation described in the present embodiment. FIG. 2A is a functional block diagram of the positioning device 9. The positioning device 9 includes a GPS positioning unit 81 for locating the position of the self-vehicle based on electric waves from GPS satellites, an INS (Inertial Navigation Sensor) positioning unit 82 for locating the position of the self-vehicle by the autonomous navigation method with the use of autonomous sensors (vehicle speed sensor 2, rudder angle sensor 4), a traffic light detecting unit 84 for detecting planimetric features such as traffic lights based on image data photographed by the camera 11, a traffic light position identifying unit 83 for identifying the position of a detected traffic light, a planimetric feature reference positioning unit 85 for locating the position of the self-vehicle based on the position of the traffic light, a position estimating unit 86 for outputting the maximum likelihood value of the position of the self-vehicle with a Kalman filter, and a map data registering unit 87 for registering, in the map DB 5, the position information of a planimetric feature such as a traffic light identified by the traffic light position identifying unit 83, in association with the planimetric feature. Hereinafter, a description is given of the positioning device 9 shown in FIG. 2A.
  • The positioning device 9 can perform accurate positioning by correcting the position of the vehicle located by the autonomous navigation method, even when it is difficult to capture the GPS satellite waves or the reliability of the GPS positioning operation is degraded. FIG. 2B is for describing the operation of the position estimating unit 86 in the positioning device shown in FIG. 2A.
  • An outline is described with reference to FIG. 2B. When the GPS waves are blocked, the positioning device 9 uses a Kalman filter for coupling the position located by the autonomous navigation method and the position located based on an eye vector to a planimetric feature such as a traffic light, in order to accurately estimate a position Y.
  • The GPS positioning unit 81 locates the position of the self-vehicle based on electric waves from the GPS satellites by a known method. The GPS positioning unit 81 selects four or more GPS satellites that are within a predetermined elevation angle from the present position of the vehicle from among plural GPS satellites rotating along predetermined orbits, and receives electric waves from the selected GPS satellites. The GPS positioning unit 81 measures the arrival time of each electric wave, and calculates the distance to the corresponding GPS satellite from the propagation time and the speed of light c. Accordingly, the point at which three of these distances between the GPS satellites and the self-vehicle intersect each other is determined as the position of the self-vehicle; the additional satellite is used to correct the error of the receiver clock, which is why four or more satellites are selected.
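  • The sketch below illustrates the distance-intersection idea in simplified form: given satellite positions and the distances derived from the propagation times, a Gauss-Newton loop finds the point most consistent with all of them. Ignoring the receiver clock bias is an illustrative simplification, and the function and variable names are not from the patent.

```python
import numpy as np

def locate_from_distances(sat_positions, distances, x0, iters=10):
    """Find the point whose distances to the given satellite positions best
    match the measured distances (clock bias ignored for simplicity)."""
    x = np.asarray(x0, float).copy()
    sats = np.asarray(sat_positions, float)
    d = np.asarray(distances, float)
    for _ in range(iters):
        diff = x - sats                          # vectors from each satellite to x
        rng = np.linalg.norm(diff, axis=1)       # predicted distances
        J = diff / rng[:, None]                  # d(range)/dx, one row per satellite
        dx, *_ = np.linalg.lstsq(J, d - rng, rcond=None)
        x += dx
    return x
```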
  • The positioning device 9 locates the position of the self-vehicle at every predetermined time interval, while receiving GPS waves. When the GPS waves are blocked, the position most recently determined is set to be the initial position and the traveling direction at this point is determined to be the initial direction. Then, a positioning operation starts by performing an autonomous navigation method of applying traveling distances and traveling directions to the initial position and direction.
  • FIG. 3A illustrates a position detected by the autonomous navigation method. The self-vehicle is traveling toward the intersection, and the GPS waves are blocked at an initial position 23. A node 22 of the intersection is extracted from the map DB 5, and therefore the position of the node 22 is already known.
  • The INS positioning unit 82 detects the vehicle speed from the vehicle speed sensor 2 and the rudder angle from the rudder angle sensor 4, applies the traveling distance and traveling direction to the initial position 23 and the initial direction, to estimate the position and direction by the autonomous navigation method (hereinafter, “estimated position” and “estimated direction”). The autonomous sensor for detecting the traveling direction of the self-vehicle may be a gyro sensor or a yaw rate sensor.
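  • The dead-reckoning update performed by the INS positioning unit 82 can be sketched as follows. The bicycle-model kinematics and the wheelbase value are assumptions, since the patent does not specify how the speed and rudder angle are integrated.

```python
import math

def dead_reckon(x, y, heading, speed, steer_angle, dt, wheelbase=2.7):
    """Advance the estimated pose by one time step (assumed bicycle model).

    x, y        -- estimated position [m], starting from the initial position
    heading     -- estimated direction [rad], starting from the initial direction
    speed       -- vehicle speed from the vehicle speed sensor [m/s]
    steer_angle -- rudder (steering) angle from the rudder angle sensor [rad]
    dt          -- time step [s]
    """
    x += speed * dt * math.cos(heading)
    y += speed * dt * math.sin(heading)
    heading += speed * dt * math.tan(steer_angle) / wheelbase
    return x, y, heading

# Example: apply traveling distances and directions to the initial pose.
pose = (0.0, 0.0, 0.0)            # initial position 23 and initial direction
for _ in range(10):               # ten sensor samples
    pose = dead_reckon(*pose, speed=10.0, steer_angle=0.02, dt=0.1)
print(pose)
```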
  • Furthermore, knowing the error variance of the estimated position is necessary for applying the Kalman filter. The errors of the vehicle speed sensor 2 and the rudder angle sensor 4 are already known, and they accumulate with the vehicle speed and the time during which the GPS waves are blocked. Thus, the error variance of the estimated position 24, obtained by accumulating these errors over the traveled distance, is also known. In FIG. 3A, the error variance of the estimated position 24 is indicated by an oval with dashed lines. At this time (time t), the self-vehicle is at an actual position 25.
  • While the position is being detected by the autonomous navigation method, the traffic light detecting unit 84 detects the traffic light from the image data photographed by the camera 11. The detected position of the traffic light is used to estimate the position of the self-vehicle. A traffic light 21 can be any planimetric feature as long as it can be used for estimating the position of the self-vehicle, such as a sign or an electric utility pole.
  • The traffic light detecting unit 84 detects the traffic light by a pattern matching method with the use of a reference pattern of the traffic light stored beforehand. The traffic light detecting unit 84 scans the pixel values (brightness) of the image data in the horizontal direction and the vertical direction, and extracts edge portions having a gradient of more than or equal to a predetermined level. Adjacent edge portions are joined to extract the outline of the photographed object, and pattern matching is performed on the extracted outline with the use of the reference pattern. In the case of a traffic light, the outline is a rectangle, and therefore pattern matching can be limited to outlines having a predetermined horizontal-to-vertical ratio. The traffic light detecting unit 84 compares the brightness of each pixel of the region surrounded by the outline with that of the reference pattern. When the brightness values correlate with each other by more than a predetermined level, it is determined that a traffic light is being photographed, and the traffic light is detected.
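  • A minimal sketch of this edge-extraction and template-correlation flow is shown below using OpenCV (version 4.x API assumed). The Canny thresholds, the horizontal-to-vertical ratio window, and the correlation threshold are illustrative values, not parameters from the patent.

```python
import cv2

def detect_traffic_light(gray, reference, min_corr=0.8):
    """Detect a rectangular object resembling the reference pattern (sketch)."""
    # Extract edge portions whose brightness gradient exceeds a threshold.
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        x, y, w, h = cv2.boundingRect(cnt)
        if h == 0:
            continue
        # Keep only outlines close to the expected horizontal-to-vertical ratio.
        if not (2.0 < w / h < 4.0):      # assumed ratio for a horizontal light
            continue
        roi = cv2.resize(gray[y:y + h, x:x + w],
                         (reference.shape[1], reference.shape[0]))
        # Compare brightness of the enclosed region with the reference pattern.
        corr = cv2.matchTemplate(roi, reference, cv2.TM_CCOEFF_NORMED)[0][0]
        if corr >= min_corr:
            return (x, y, w, h)           # traffic light detected
    return None
```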
  • Upon detecting the traffic light, the traffic light detecting unit 84 extracts distance information from the image data. As described above, the distance information is extracted from the parallax between two sets of image data, for example. From a pair of stereo images photographed by the camera 11, a part of the same photograph object (traffic light) is extracted. The same points of the traffic light included in the pair of stereo images are associated with each other, and the displacement amount (parallax) between the associated points is obtained, thereby calculating the distance between the self-vehicle and the traffic light. That is, when the pair of image data items are superposed, the traffic light appears to be displaced in the horizontal direction due to the parallax. The position where the images best overlap is obtained by shifting one of the images by one pixel at a time, based on the correlation between the pixel values. Assuming that the number of shifted pixels is n, the focal length of the lens is f, the distance between the optical axes is m, and the pixel pitch is d, the distance L between the self-vehicle and the photograph object is calculated by a relational expression of L=(f·m)/(n·d). In this expression, (n·d) represents the parallax.
  • The traffic light detecting unit 84 calculates the eye vector joining the detected traffic light and the self-vehicle. Assuming that the direction in which the camera 11 is fixed (the front direction of the camera 11) corresponds to zero, the direction θ of the eye vector is obtained from the distance L and the position at which the traffic light is imaged on the photoelectric conversion element. FIG. 4A illustrates an example of the eye vector.
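  • The distance relation L = (f·m)/(n·d) and the derivation of θ from the imaging position can be written compactly as follows. The pinhole-model arctangent for θ and the numerical values in the example are assumptions for illustration only.

```python
import math

def stereo_distance(n, f, m, d):
    """Distance L from the parallax: L = (f * m) / (n * d)."""
    return (f * m) / (n * d)

def eye_vector_direction(u_offset, f, d):
    """Direction θ of the eye vector, 0 when the object lies on the camera axis.

    u_offset -- horizontal offset [pixels] of the traffic light from the image
                centre on the photoelectric conversion element (assumed convention)
    """
    return math.atan2(u_offset * d, f)

# Example with illustrative values: 35 px parallax, f = 8 mm, baseline m = 0.3 m,
# pixel pitch d = 6 um, object imaged 120 px left of the image centre.
L = stereo_distance(n=35, f=0.008, m=0.3, d=6e-6)
theta = eye_vector_direction(u_offset=-120, f=0.008, d=6e-6)
print(L, math.degrees(theta))
```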
  • The map DB 5 stores the coordinates of nodes, information indicating whether there are intersections, and the types of intersections. In addition, the map DB 5 may store information indicating whether a traffic light is placed and coordinates expressing where the traffic light is placed (hereinafter, “traffic light coordinates”). When traffic light coordinates are stored, the absolute position of the traffic light is already determined, and therefore the traffic light coordinates of the detected traffic light can be acquired.
  • However, when plural traffic lights are photographed in one image data item, or when traffic light coordinates are not stored in the map DB 5, it is necessary to identify the position of the traffic light from the image data in which the traffic light has been detected.
  • The traffic light position identifying unit 83 according to the present embodiment identifies the position of the traffic light with the use of one of i) traffic light coordinates, ii) a least squares method, and iii) a least squares method to which a maximum grade method is applied.
  • FIG. 4B illustrates an example of identifying the position of the traffic light by the least squares method. Every time the traffic light is detected, distances L1, L2, . . . Ln are obtained, and similarly, eye vector directions θ1, θ2, . . . θn are obtained. Thus, a set of points (a, b) can be obtained from the sets (L, θ), each including a distance from the self-vehicle and a direction of the eye vector. The positions of traffic lights in the height direction are substantially equal, and therefore (a, b) can be treated as coordinates in a plane parallel to the road.
  • For example, if a linear model is defined as a_i ≡ k0 + k1·L_i + k2·θ_i and b_i ≡ m0 + m1·L_i + m2·θ_i, the square errors ε²_k(k0, k1, k2) and ε²_m(m0, m1, m2) are as follows, where the sample size is N and i runs from 1 through N.

  • ε²_k(k0, k1, k2) = (1/N)·Σ{a_i − (k0 + k1·L_i + k2·θ_i)}²

  • ε²_m(m0, m1, m2) = (1/N)·Σ{b_i − (m0 + m1·L_i + m2·θ_i)}²
  • ε²_k is partially differentiated with respect to k0, k1, and k2, and (k0, k1, k2) can be obtained from ∂ε²_k/∂k0 = 0, ∂ε²_k/∂k1 = 0, and ∂ε²_k/∂k2 = 0, where ∂ denotes the partial differentiation. The parameters (m0, m1, m2) for b_i can be obtained in a similar manner.
  • The above linear model is one example. The relationship between (a, b) and (L, θ) can be nonlinear; one example is a_i ≡ f(L) + g(θ).
  • FIG. 3B illustrates the position of the traffic light 21 identified by the least squares method. The position of the traffic light 21 can be identified based on the distances L from the self-vehicle to the traffic light and the eye vector directions θ detected until the vehicle reaches the estimated position 24 at time t.
  • In order to obtain reliable computation results by the least squares method, the sample size N needs to be four or more. Thus, the traffic light position identifying unit 83 does not identify the position of the traffic light when the sample size N is less than four, i.e., when the number of times that the traffic light has been detected and the eye vector calculated is less than four. Accordingly, it is possible to prevent the accuracy in locating the position of the self-vehicle from declining when the Kalman filter is applied. In this case, the positioning device 9 outputs the estimated position 24 and the estimated direction.
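  • A minimal numerical sketch of this least-squares fit is given below. How the per-detection candidate points (a_i, b_i) are formed is not spelled out in this passage, so here they are simply taken as given inputs, and the use of numpy's lstsq solver is an illustrative choice rather than the patent's procedure.

```python
import numpy as np

def fit_linear_model(L, theta, a, b):
    """Fit a_i = k0 + k1*L_i + k2*theta_i and b_i = m0 + m1*L_i + m2*theta_i.

    Returns (k0, k1, k2) and (m0, m1, m2) minimizing the mean squared errors
    defined above. A sample size N of four or more is required, as in the text.
    """
    L, theta, a, b = map(np.asarray, (L, theta, a, b))
    if L.size < 4:
        raise ValueError("sample size N must be four or more")
    X = np.column_stack([np.ones_like(L), L, theta])   # design matrix [1, L, theta]
    k, *_ = np.linalg.lstsq(X, a, rcond=None)
    m, *_ = np.linalg.lstsq(X, b, rcond=None)
    return k, m
```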
  • The concept of the least squares method is to set "the total sum of the square errors acquired each time the distance between the position of the vehicle and the position of the traffic light is measured" as the evaluation function, and to obtain the position of the traffic light that minimizes this evaluation function. A description is given below of a method of obtaining the position of the traffic light with which the evaluation function is minimized based on the slope of the evaluation function, by applying the maximum grade method (i.e., steepest descent).
  • FIG. 5A illustrates the relationship between the evaluation function and the positions of the traffic lights. The value of the evaluation function fluctuates according to the calculated position of the traffic light, and there is a minimum value of the evaluation function. The maximum grade method starts with an appropriate initial value, and the parameter value is gradually changed in the direction opposite to that of the differential value to approach an optimum parameter.
  • When the least squares method is performed, the evaluation function corresponds to the squared error, and therefore the partial derivative with respect to each parameter is calculated. For example, the following formulae are used as the updating formulae for the parameters k0, k1, k2.

  • k0^(j+1) = k0^(j) + 2·(1/N)·Σ{a_i − (k0 + k1·L_i + k2·θ_i)}

  • k1^(j+1) = k1^(j) + 2·(1/N)·Σ{a_i − (k0 + k1·L_i + k2·θ_i)}·L_i

  • k2^(j+1) = k2^(j) + 2·(1/N)·Σ{a_i − (k0 + k1·L_i + k2·θ_i)}·θ_i
  • The index j starts with 0; when the evaluation function for the updated parameters becomes minimum, the calculation is stopped. Whether the evaluation function has reached a minimum can be determined by whether the differential value (slope) has become substantially zero. In this manner, the parameters k0, k1, k2 can be obtained. In FIG. 5A, the position of the traffic light at which the evaluation function becomes minimum is indicated as global min.
  • In the maximum grade method, k0^(0), etc. (the values when j = 0), correspond to the initial values of the parameters, and the accuracy of the obtained minimum depends on these initial values. FIG. 5B illustrates the relationship between the evaluation function and the positions of the traffic light. In this example, there are plural minima. As shown in FIG. 5B, global min indicates the position of the traffic light corresponding to the minimum on the right side; depending on the initial value, the calculation may instead converge to the local minimum (local min) on the left side. Furthermore, when the sample size N is small, the value of the evaluation function is likely to diverge.
  • However, it can be estimated that the traffic light is located at the intersection. Therefore, a value obtained from the node position of the intersection is set as the initial value for the maximum grade method. This is the same as setting the node position as the initial value. By performing such a process, the position of the traffic light corresponding to global min can be detected, even when the sample size N is small.
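  • The update formulae above can be turned into a small iterative routine as sketched below. The explicit step size eta (the formulae as printed correspond to eta = 1), the convergence tolerance, and the iteration cap are assumptions added for numerical stability; in practice the initial values would be chosen from the intersection node position as described above.

```python
import numpy as np

def maximum_grade_fit(L, theta, a, k_init, eta=1.0, tol=1e-6, max_iter=10_000):
    """Steepest-descent (maximum grade) minimization of the squared error
    for the model a_i = k0 + k1*L_i + k2*theta_i."""
    L, theta, a = map(np.asarray, (L, theta, a))
    N = L.size
    X = np.column_stack([np.ones_like(L), L, theta])   # rows are (1, L_i, theta_i)
    k = np.asarray(k_init, dtype=float)                # initial value (e.g. derived from the node)
    for _ in range(max_iter):
        residual = a - X @ k                           # a_i - (k0 + k1*L_i + k2*theta_i)
        step = 2.0 * (X.T @ residual) / N              # matches the updating formulae above
        k = k + eta * step
        # Stop when the slope of the evaluation function is substantially zero.
        if np.linalg.norm(step) < tol:
            break
    return k
```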
  • In the above manner, the position of the traffic light 21 shown in FIG. 3B can be identified.
  • Returning to FIG. 2A, the planimetric feature reference positioning unit 85 locates the position of the self-vehicle (hereinafter, “planimetric feature estimated position” and “planimetric feature estimated direction”) based on the position of the traffic light 21 acquired with the use of one of i) traffic light coordinates, ii) a least squares method, and iii) a least squares method to which a maximum grade method is applied.
  • FIG. 3C indicates a planimetric feature estimated position 26 of the self-vehicle located based on the distance L from the position of the traffic light and the eye vector direction θ. The planimetric feature reference positioning unit 85 calculates the planimetric feature estimated position 26 and the planimetric feature estimated direction based on the distance L and the eye vector direction θ at the time t.
  • When the i) traffic light coordinates are used, the error variance of the planimetric feature estimated position 26 can be obtained from an error ΔL of the distance L and an error Δθ of the eye vector θ. When the least squares method of ii) or iii) is used to identify the position of the traffic light, in addition to the process of i), it is possible to obtain the error variance from errors found in the course of calculating the parameters. In FIG. 3C, the error variance of the planimetric feature estimated position 26 is indicated by an oval with dashed lines.
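  • The error variance of the planimetric feature estimated position can be pictured with a simple first-order propagation of ΔL and Δθ, as in the sketch below. The polar-to-Cartesian measurement model and the assumption that ΔL and Δθ are independent are illustrative choices, not details given in the patent.

```python
import numpy as np

def feature_position_variance(L, theta, var_L, var_theta):
    """First-order propagation of the errors dL and dtheta to a position
    (x, y) = (L*cos(theta), L*sin(theta)) measured relative to the traffic light."""
    c, s = np.cos(theta), np.sin(theta)
    J = np.array([[c, -L * s],
                  [s,  L * c]])              # Jacobian of (x, y) w.r.t. (L, theta)
    P = np.diag([var_L, var_theta])          # input error variances
    return J @ P @ J.T                       # 2x2 position error covariance

print(feature_position_variance(L=25.0, theta=np.radians(5.0),
                                var_L=0.3**2, var_theta=np.radians(1.0)**2))
```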
  • The position estimating unit 86 uses a Kalman filter for coupling the estimated position 24 and the planimetric feature estimated position 26, and outputs the final estimated position and direction of the self-vehicle having the highest probability.
  • FIG. 3C indicates a final estimated position 27 that is estimated based on the estimated position 24 and the planimetric feature estimated position 26. In FIG. 3C, each error variance is drawn as a convex distribution at the estimated position 24 and at the planimetric feature estimated position 26; in reality, each distribution extends in a three-dimensional manner. FIG. 2B illustrates how a maximum likelihood value Y is estimated from these variances with the Kalman filter.
  • In the Kalman filter, when the status of the system is estimated separately by each source, the status having the highest probability (the status where the product of the probability densities is maximum) is estimated from the probability density distributions of the statuses. Accordingly, by coupling the two sets of positioning information with the use of the Kalman filter, it is possible to estimate the final estimated position 27 where the self-vehicle is most probably located.
  • In the Kalman filter, when the estimated position 24 is a Z vector and the planimetric feature estimated position 26 is an X vector, Z and X are assumed to have the following relationship with the use of a known observation equation.

  • Z−HX=0
  • When the error variance of Z is R and the error variance of X is M, the error variance of the maximum likelihood value Y corresponding to the final estimated position 27 is analytically obtained as A = (M^−1 + H^t·R^−1·H)^−1. Furthermore, with the Kalman filter, the maximum likelihood value Y is obtained by the following formula:

  • Y(i) = X(i−1) + K(i)·{Z(i) − H(i)·X(i−1)}
  • where the subscript i represents the number of times the position of the self-vehicle has been observed, the superscript t represents the transpose of a matrix, and the superscript −1 represents the inverse of a matrix. K(i) is the Kalman gain matrix, which can be expressed as follows.
  • K(i) = A(i)·H(i)^t·R(i)^−1

  • A(i) = (M^−1 + H^t·R^−1·H)^−1 = M(i) − M(i)·H(i)^t·{H(i)·M(i)·H(i)^t + R(i)}^−1·H(i)·M(i)
  • The error variance of the estimated position 24 and the error variance of the planimetric feature estimated position 26 are already obtained, and therefore the position estimating unit 86 outputs the final estimated position 27 based on these formulae. The direction of the self-vehicle can be obtained in the same manner.
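  • The coupling formulae above can be written directly in a few lines of numpy, as in the sketch below. The two-dimensional state and the identity observation matrix H are illustrative assumptions; the patent leaves H generic.

```python
import numpy as np

def fuse(x, M, z, R, H=None):
    """Couple the planimetric feature estimated position X (variance M) with the
    dead-reckoned estimated position Z (variance R) per the formulae above."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    M, R = np.asarray(M, float), np.asarray(R, float)
    H = np.eye(x.size) if H is None else np.asarray(H, float)
    # A = (M^-1 + H^t R^-1 H)^-1 : error variance of the maximum likelihood value Y.
    A = np.linalg.inv(np.linalg.inv(M) + H.T @ np.linalg.inv(R) @ H)
    K = A @ H.T @ np.linalg.inv(R)           # Kalman gain matrix
    y = x + K @ (z - H @ x)                  # Y = X + K (Z - H X)
    return y, A

# Example: planimetric feature estimated position 26 vs. estimated position 24.
y, A = fuse(x=[10.0, 52.0], M=np.diag([0.5, 0.5]),
            z=[11.5, 50.0], R=np.diag([2.0, 4.0]))
print(y, A)
```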
  • FIG. 6 is a flowchart of procedures performed by the positioning device 9 for estimating the position of the self-vehicle with a positioning operation based on the autonomous navigation method and the position of the traffic light.
  • The positioning device 9 determines whether the GPS waves have been blocked at each positioning interval of the GPS positioning unit 81, for example (step S1). When the GPS waves are not blocked (No in step S1), the position of the self-vehicle is located by using the GPS waves (step S2).
  • When the GPS waves are blocked (Yes in step S1), the GPS positioning unit 81 stores the initial position and the initial direction in the storage device of the navigation ECU 8, and the INS positioning unit 82 applies traveling distances and traveling directions to the initial position 23 and the initial direction based on the vehicle speed and the rudder angle, to obtain the estimated position 24 and the estimated direction by the autonomous navigation method (step S3). Based on the errors, etc., of the vehicle speed sensor 2 and the rudder angle sensor 4, the INS positioning unit 82 calculates the error variances of the accumulated estimated position 24 and estimated direction.
  • The traffic light detecting unit 84 repeatedly detects the traffic light 21 in parallel with the positioning operation performed by the autonomous navigation method (step S4). When the traffic light is detected, the traffic light detecting unit 84 extracts distance information from a pair of image data items. Furthermore, the traffic light detecting unit 84 calculates the eye vector joining the detected traffic light and the self-vehicle (step S5).
  • Next, the traffic light position identifying unit 83 refers to the distances and the eye vectors between the self-vehicle and the traffic light which have been obtained for the past N times (step S6).
  • As described above, when N is small, the accuracy of the position of the traffic light identified by the least squares method declines. Thus, the traffic light position identifying unit 83 determines whether N is four or more (step S7). When the distance and eye vector have been obtained fewer than four times (No in step S7), the positioning device 9 outputs the positioning result obtained by the autonomous navigation method.
  • When the distance and eye vector have been obtained four or more times (Yes in step S7), the traffic light position identifying unit 83 extracts, from the map DB 5, the position information of the node of the intersection located in the traveling direction (step S8).
  • Next, the traffic light position identifying unit 83 calculates the position of the traffic light by the least squares method (step S9). When the least squares method is applied, the maximum grade method is performed with the position information of the node as the initial value. Accordingly, the position of the traffic light can be identified.
  • Next, the planimetric feature reference positioning unit 85 calculates the planimetric feature estimated position 26 and the planimetric feature estimated direction from the distance L and the eye vector direction θ, using the identified position of the traffic light as the origin (step S10).
  • Next, the position estimating unit 86 uses a Kalman filter for coupling the estimated position 24 and the planimetric feature estimated position 26, and outputs the final estimated position 27 (step S11).
  • FIG. 7 illustrates an example of correcting the estimated position 24 located by the autonomous navigation method to obtain the final estimated position 27, showing how the position estimated by the autonomous navigation method is corrected every time the final estimated position 27 is output. In the present embodiment, the position of the self-vehicle is located by using as a reference the position of the traffic light located in the traveling direction, and the position located by the autonomous navigation method is corrected accordingly. Therefore, the position particularly in the traveling direction can be accurately corrected.
  • FIG. 8 illustrates positioning results obtained by the positioning device 9 in a state where the GPS waves are blocked on an ordinary road. FIG. 8A shows image data photographed by the camera 11; in FIG. 8A, the traffic light 21 is detected at the top part of the image data.
  • FIG. 8B is a diagram in which the estimated positions 24 and the final estimated positions 27 obtained by the autonomous navigation method are plotted on a photograph showing the plan view of an actual road. The self-vehicle traveled from the bottom part toward the top part of the photograph of the plan view. As shown in FIG. 8B, as the vehicle travels, the estimated position 24 becomes more and more displaced from the actual road. However, by correcting the displacement with the planimetric feature estimated positions 26, the displacement from the road can be considerably reduced.
  • In the present embodiment, after the GPS waves are blocked, positions of landmarks such as traffic lights are detected in the course of correcting the position of the self-vehicle located by the autonomous navigation method. This is because the map DB 5 in a typical navigation system does not store position information of road infrastructure items (traffic lights, signs, etc.), as described above.
  • In order to provide driving assistance (vehicle control, warnings to the driver) at intersections, etc., at appropriate timings, position information of road infrastructure items is indispensable. However, considerable workload and cost would be required to survey the road infrastructure items nationwide and turn the information into a database. Furthermore, when a new road infrastructure item is installed, it cannot be immediately reflected in the database.
  • Thus, the positioning device 9 registers, in the map DB 5, positions of traffic lights, etc., detected in the course of correcting the position of the self-vehicle, and positions of traffic lights, etc., detected when the GPS waves are not blocked. Accordingly, a database of position information of road infrastructure items can be created.
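  • This registration step can be pictured as a keyed store associating each identified planimetric feature position with a node or link, as sketched below. The dictionary-based structure and the names used are assumptions for illustration, not the patent's data model.

```python
from dataclasses import dataclass, field

@dataclass
class MapDB:
    """Toy stand-in for the map DB 5: node coordinates plus registered features."""
    nodes: dict = field(default_factory=dict)      # node_id -> (x, y)
    features: dict = field(default_factory=dict)   # node_id -> list of (kind, x, y)

    def register_planimetric_feature(self, node_id, kind, x, y):
        # Register the identified feature position in association with a node.
        self.features.setdefault(node_id, []).append((kind, x, y))

db = MapDB(nodes={"N22": (135.5, 34.7)})
db.register_planimetric_feature("N22", "traffic_light", 135.5002, 34.7001)
print(db.features["N22"])
```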
  • FIG. 9 shows an example of a map in which positions of traffic lights are registered in a road network obtained from map data. The white circles indicate the registered traffic lights.
  • The positions of road infrastructure items can be detected by a camera installed in the vehicle quickly and at low cost. Thus, even when a new road infrastructure item is installed, it can be reflected in the database as soon as a vehicle travels past it, keeping the database up to date.
  • As described above, in the positioning device 9 according to the present embodiment, the position of a planimetric feature such as a traffic light is identified, and the position of the self-vehicle is located by using the identified position of the planimetric feature as a reference. The position of the self-vehicle located in this manner and the position located by the autonomous navigation method are applied to the Kalman filter. Accordingly, the positioning results obtained by the autonomous navigation method can be accurately corrected even if the GPS waves are blocked; in particular, the position in the traveling direction can be accurately corrected. Furthermore, by registering the identified positions of planimetric features in a map database, the map database in which position information of road infrastructure items is registered can be updated during usage.
  • The present invention is not limited to the specifically disclosed embodiment, and variations and modifications may be made without departing from the scope of the present invention.
  • The present application is based on Japanese Priority Patent Application No. 2006-171755, filed on Jun. 21, 2006, the entire contents of which are hereby incorporated by reference.

Claims (5)

1. A positioning device comprising:
a map data storing unit configured to store map data;
an autonomous sensor configured to detect behavior information of a moving object;
an inertial positioning unit configured to detect an estimated position of the moving object by applying the behavior information detected by the autonomous sensor to positioning results obtained by an electronic navigation positioning unit such as a GPS;
a planimetric feature detecting unit configured to detect a planimetric feature located around a road and calculate planimetric feature position information of the planimetric feature;
a planimetric feature position identifying unit configured to identify a position of the planimetric feature in the map data in the event of acquiring four or more items of the planimetric feature position information that has been repeatedly calculated by the planimetric feature detecting unit;
a planimetric feature reference positioning unit configured to estimate a planimetric feature estimated position of the moving object by using the position of the planimetric feature as a reference; and
a position estimating unit configured to estimate a position of the moving object by applying the estimated position and the planimetric feature estimated position to a Kalman filter.
2. The positioning device according to claim 1, wherein:
the planimetric feature detecting unit comprises an eye vector detecting unit configured to detect the planimetric feature from image data obtained by a photographing unit, and to acquire an eye vector joining the moving object and the planimetric feature thus detected; and
the planimetric feature position identifying unit comprises a planimetric feature position calculating unit configured to calculate the planimetric feature position information of the planimetric feature by a least squares method with the use of lengths and directions of plural of the eye vectors.
3. The positioning device according to claim 2, wherein:
the planimetric feature position calculating unit extracts, from the map data storing unit, intersection position information corresponding to an intersection which is located within a predetermined distance from the planimetric feature, and sets the intersection position information as an initial value used for calculating the planimetric feature position information by the least squares method.
4. The positioning device according to claim 1, further comprising:
a map data registering unit configured to register the planimetric feature position information of the planimetric feature acquired by the planimetric feature position identifying unit, in the map data storing unit in association with the planimetric feature and a link or a node.
5. The positioning device according to claim 2, wherein:
in the event that a number of the eye vectors detected by the eye vector detecting unit is less than or equal to three, the planimetric feature detecting unit does not calculate the planimetric feature position information by the least squares method, and the position estimating unit outputs the estimated position.
US12/305,397 2006-06-21 2007-06-08 Positioning device Active 2030-06-02 US8725412B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006171755A JP4600357B2 (en) 2006-06-21 2006-06-21 Positioning device
JP2006171755 2006-06-21
JP2006-171755 2006-06-21
PCT/JP2007/061607 WO2007148546A1 (en) 2006-06-21 2007-06-08 Positioning device

Publications (2)

Publication Number Publication Date
US20100004856A1 true US20100004856A1 (en) 2010-01-07
US8725412B2 US8725412B2 (en) 2014-05-13

Family

ID=38833288

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/305,397 Active 2030-06-02 US8725412B2 (en) 2006-06-21 2007-06-08 Positioning device

Country Status (5)

Country Link
US (1) US8725412B2 (en)
EP (1) EP2034271B1 (en)
JP (1) JP4600357B2 (en)
CN (1) CN101473195B (en)
WO (1) WO2007148546A1 (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120257056A1 (en) * 2011-03-03 2012-10-11 Honda Elesys Co., Ltd. Image processing apparatus, image processing method, and image processing program
US20120310516A1 (en) * 2011-06-01 2012-12-06 GM Global Technology Operations LLC System and method for sensor based environmental model construction
US8406987B2 (en) * 2009-12-08 2013-03-26 At&T Intellectual Property I, L.P. Cellular-based live traffic service
US20130162824A1 (en) * 2011-12-22 2013-06-27 Electronics And Telecommunications Research Institute Apparatus and method for recognizing current position of vehicle using internal network of the vehicle and image sensor
US20140067262A1 (en) * 2012-09-06 2014-03-06 Toshiba Solutions Corporation Position detection device, position detection method, and computer program product
US20140088863A1 (en) * 2011-05-20 2014-03-27 Yoshitaka Hara Locus correcting method, locus correcting apparatus, and mobile object equipment
US8712624B1 (en) * 2012-04-06 2014-04-29 Google Inc. Positioning vehicles to improve quality of observations at intersections
US8793046B2 (en) 2012-06-01 2014-07-29 Google Inc. Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US8825371B2 (en) 2012-12-19 2014-09-02 Toyota Motor Engineering & Manufacturing North America, Inc. Navigation of on-road vehicle based on vertical elements
DE102013217060A1 (en) * 2013-08-27 2015-03-05 Bayerische Motoren Werke Aktiengesellschaft Accurate positioning of a vehicle
US20150241226A1 (en) * 2014-02-24 2015-08-27 Ford Global Technologies, Llc Autonomous driving sensing system and method
US9145140B2 (en) 2012-03-26 2015-09-29 Google Inc. Robust method for detecting traffic signals and their associated states
US20150369608A1 (en) * 2012-12-20 2015-12-24 Continental Teves Ag & Co. Ohg Method for determining a reference position as the starting position for an inertial navigation system
US20160283789A1 (en) * 2015-03-25 2016-09-29 Motorola Mobility Llc Power-saving illumination for iris authentication
US9520064B2 (en) 2008-07-10 2016-12-13 Mitsubishi Electric Corporation Train-of-vehicle travel support device, control system and processor unit
US20170010108A1 (en) * 2015-02-10 2017-01-12 Mobileye Vision Technologies Ltd. Adaptive navigation based on user intervention
US9648107B1 (en) * 2011-04-22 2017-05-09 Angel A. Penilla Methods and cloud systems for using connected object state data for informing and alerting connected vehicle drivers of state changes
US9721171B2 (en) * 2012-11-20 2017-08-01 Robert Bosch Gmbh Method and device for detecting variable message traffic signs
US20190064830A1 (en) * 2017-08-25 2019-02-28 Toyota Jidosha Kabushiki Kaisha Host vehicle position confidence degree calculation device
CN109520495A (en) * 2017-09-18 2019-03-26 财团法人工业技术研究院 Navigation positioning device and navigation positioning method using same
US10262434B2 (en) * 2016-12-15 2019-04-16 Hyundai Motor Company Vehicle localization apparatus and method
CN111480131A (en) * 2018-08-23 2020-07-31 日本精工株式会社 Bicycle device, travel control method for bicycle device, and travel control program
US10935978B2 (en) * 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
US20210116251A1 (en) * 2017-12-07 2021-04-22 International Business Machines Corporation Location calibration based on movement path and map objects
US11105637B2 (en) * 2016-09-20 2021-08-31 Trimble Inc. Vehicle navigation by dead reckoning and GNSS-aided map-matching
US11181737B2 (en) 2016-08-05 2021-11-23 Panasonic Intellectual Property Management Co., Ltd. Head-up display device for displaying display items having movement attribute or fixed attribute, display control method, and control program
US11255676B2 (en) * 2011-10-24 2022-02-22 Continental Teves Ag & Co. Ogh Sensor system for independently evaluating the integrity of the data of the sensor system
EP3842735A4 (en) * 2018-08-23 2022-06-15 Nippon Telegraph And Telephone Corporation Position coordinates estimation device, position coordinates estimation method, and program
US11550330B2 (en) * 2017-07-12 2023-01-10 Arriver Software Ab Driver assistance system and method
US11710153B2 (en) 2016-11-21 2023-07-25 Nio Technology (Anhui) Co., Ltd. Autonomy first route optimization for autonomous vehicles
US11726474B2 (en) 2017-10-17 2023-08-15 Nio Technology (Anhui) Co., Ltd. Vehicle path-planner monitor and controller
US11874117B2 (en) 2019-04-04 2024-01-16 Mitsubishi Electric Corporation Vehicle positioning device

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2010136929A (en) * 2008-02-04 2012-03-20 Теле Атлас Норт Америка Инк. (Us) METHOD FOR HARMONIZING A CARD WITH DETECTED SENSOR OBJECTS
JP2009192406A (en) * 2008-02-15 2009-08-27 Xanavi Informatics Corp Navigation device
JP5386850B2 (en) * 2008-04-25 2014-01-15 トヨタ自動車株式会社 Object information acquisition device
JP5233606B2 (en) * 2008-11-19 2013-07-10 富士通株式会社 Absolute movement path calculating device and method, and program
KR101573289B1 (en) 2008-11-27 2015-12-01 삼성전자주식회사 Apparatus and method for recongnizing position using camera
JP5169804B2 (en) * 2008-12-25 2013-03-27 株式会社エクォス・リサーチ Control device
JP2011080845A (en) * 2009-10-06 2011-04-21 Topcon Corp Method and apparatus for creating three-dimensional data
US8559673B2 (en) 2010-01-22 2013-10-15 Google Inc. Traffic signal mapping and detection
US8823585B2 (en) * 2010-02-12 2014-09-02 Broadcom Corporation Sensor-assisted location-aware mobile device
KR101078769B1 (en) * 2010-11-08 2011-11-02 최승열 Location-based av flashlight and method of displaying map related moving image thereof
JP5602070B2 (en) * 2011-03-15 2014-10-08 三菱電機株式会社 POSITIONING DEVICE, POSITIONING METHOD OF POSITIONING DEVICE, AND POSITIONING PROGRAM
CN102426799B (en) * 2011-11-11 2014-01-22 中国联合网络通信集团有限公司 Road condition information prompting method and system, information control platform and vehicle-mounted front end device
US9008694B2 (en) 2012-06-29 2015-04-14 Broadcom Corporation Indoor/outdoor differentiation using radio frequency (RF) transmitters
US9482739B2 (en) 2012-06-29 2016-11-01 Broadcom Corporation Indoor/outdoor transition determination
US9798011B2 (en) * 2012-08-31 2017-10-24 Apple Inc. Fast GPS recovery using map vector data
US9013617B2 (en) * 2012-10-12 2015-04-21 Qualcomm Incorporated Gyroscope conditioning and gyro-camera alignment
CN102928860B (en) * 2012-10-18 2015-01-21 无锡清华信息科学与技术国家实验室物联网技术中心 Method for improving GPS (Global Positioning System) positioning precision on the basis of local positioning information
US9196040B2 (en) * 2013-03-12 2015-11-24 Qualcomm Incorporated Method and apparatus for movement estimation
KR101629691B1 (en) * 2013-05-24 2016-06-13 회명정보통신(주) Indoor positioning system using inertial sensor
CN107533801A (en) 2013-11-01 2018-01-02 国际智能技术公司 Use the ground mapping technology of mapping vehicle
US20150149085A1 (en) * 2013-11-27 2015-05-28 Invensense, Inc. Method and system for automatically generating location signatures for positioning using inertial sensors
EP3808634A1 (en) 2013-12-04 2021-04-21 Mobileye Vision Technologies Ltd. Navigating a vehicle to pass another vehicle
GB201407643D0 (en) 2014-04-30 2014-06-11 Tomtom Global Content Bv Improved positioning relatie to a digital map for assisted and automated driving operations
CN104007460B (en) * 2014-05-30 2017-02-08 北京中电华远科技有限公司 Individual fireman positioning and navigation device
US9707960B2 (en) 2014-07-31 2017-07-18 Waymo Llc Traffic signal response for autonomous vehicles
JP6441616B2 (en) * 2014-08-29 2018-12-19 株式会社ゼンリン Positioning device, driving support device, and control program
CN104992567A (en) * 2015-07-03 2015-10-21 赵洪海 Road signal lamp synchronous intelligent prompting device
CN107850445B (en) 2015-08-03 2021-08-27 通腾全球信息公司 Method and system for generating and using positioning reference data
JP2017072422A (en) * 2015-10-05 2017-04-13 パイオニア株式会社 Information processing device, control method, program, and storage medium
JP6323439B2 (en) * 2015-12-17 2018-05-16 カシオ計算機株式会社 Autonomous mobile device, autonomous mobile method and program
CN105489035B (en) * 2015-12-29 2018-03-30 大连楼兰科技股份有限公司 Apply the method that traffic lights are detected in active driving technology
JP6751280B2 (en) * 2016-02-22 2020-09-02 Necプラットフォームズ株式会社 Position estimating device, position detecting method and program
CN107764254A (en) * 2016-08-23 2018-03-06 上海闻通信息科技有限公司 A kind of plain navigation system and method based on monocular cam
EP3339807B1 (en) * 2016-12-20 2024-03-13 HERE Global B.V. An apparatus and associated methods for determining the location of a vehicle
CN107037467B (en) * 2017-03-24 2021-06-29 奇瑞汽车股份有限公司 Positioning system and method and intelligent automobile
WO2018212301A1 (en) * 2017-05-19 2018-11-22 パイオニア株式会社 Self-position estimation device, control method, program, and storage medium
JP6980010B2 (en) * 2017-05-19 2021-12-15 パイオニア株式会社 Self-position estimator, control method, program and storage medium
JP7257737B2 (en) * 2017-09-05 2023-04-14 ソニーグループ株式会社 Information processing device, self-position estimation method, and program
TWI657230B (en) * 2017-09-18 2019-04-21 財團法人工業技術研究院 Navigation and positioning device and method of navigation and positioning
JP7189691B2 (en) * 2018-07-02 2022-12-14 株式会社Subaru Vehicle cruise control system
WO2020223974A1 (en) * 2019-05-09 2020-11-12 珊口(深圳)智能科技有限公司 Method for updating map and mobile robot
JP7298882B2 (en) 2019-06-17 2023-06-27 国立大学法人金沢大学 Vehicle self-localization device and vehicle
JP7361348B2 (en) * 2019-06-24 2023-10-16 パナソニックIpマネジメント株式会社 Video control device, video control method and program
JP7343054B2 (en) * 2020-06-11 2023-09-12 日本電信電話株式会社 Location estimation method, location estimation device, and location estimation program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6246960B1 (en) * 1998-11-06 2001-06-12 Ching-Fang Lin Enhanced integrated positioning method and system thereof for vehicle
US20050137786A1 (en) * 1997-10-22 2005-06-23 Intelligent Technologies International Inc. Communication method and arrangement
US20060106533A1 (en) * 2004-11-12 2006-05-18 Mitsubishi Denki Kabushiki Kaisha System for autonomous vehicle navigation with carrier phase DGPS and laser-scanner augmentation
US20060139619A1 (en) * 2003-10-31 2006-06-29 Fujitsu Limited Distance calculation device and calculation program
US20060233424A1 (en) * 2005-01-28 2006-10-19 Aisin Aw Co., Ltd. Vehicle position recognizing device and vehicle position recognizing method
US7177737B2 (en) * 2002-12-17 2007-02-13 Evolution Robotics, Inc. Systems and methods for correction of drift via global localization with a visual landmark
US20090005961A1 (en) * 2004-06-03 2009-01-01 Making Virtual Solid, L.L.C. En-Route Navigation Display Method and Apparatus Using Head-Up Display

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63302317A (en) * 1987-06-03 1988-12-09 Mitsubishi Heavy Ind Ltd Positional speed measuring apparatus of moving object
JPH01306560A (en) * 1988-06-01 1989-12-11 Shin Meiwa Ind Co Ltd Method for controlling vapor-deposited film thickness
JP3381312B2 (en) * 1993-07-23 2003-02-24 株式会社デンソー Navigation device
JPH07239236A (en) 1994-02-28 1995-09-12 Hitachi Ltd Method and apparatus for measurement of quantity of state of moving body and calculation device of attitude angle of moving body
JPH09243389A (en) 1996-03-08 1997-09-19 Alpine Electron Inc On-vehicle navigation system
JP3848431B2 (en) * 1997-04-28 2006-11-22 本田技研工業株式会社 VEHICLE POSITION ESTIMATION APPARATUS, VEHICLE POSITION ESTIMATION METHOD, TRAVEL lane maintenance apparatus, and TR
JP2001336941A (en) 2000-05-25 2001-12-07 Sony Corp Car navigation device
JP2002213979A (en) 2001-01-12 2002-07-31 Clarion Co Ltd Gps receiver with dr function capable of correcting measurement position and azimuth
JP3958133B2 (en) * 2002-07-12 2007-08-15 アルパイン株式会社 Vehicle position measuring apparatus and method
CN100356139C (en) * 2005-01-21 2007-12-19 清华大学 Miniature assembled gesture measuring system for mini-satellite

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050137786A1 (en) * 1997-10-22 2005-06-23 Intelligent Technologies International Inc. Communication method and arrangement
US6246960B1 (en) * 1998-11-06 2001-06-12 Ching-Fang Lin Enhanced integrated positioning method and system thereof for vehicle
US7177737B2 (en) * 2002-12-17 2007-02-13 Evolution Robotics, Inc. Systems and methods for correction of drift via global localization with a visual landmark
US20060139619A1 (en) * 2003-10-31 2006-06-29 Fujitsu Limited Distance calculation device and calculation program
US20090005961A1 (en) * 2004-06-03 2009-01-01 Making Virtual Solid, L.L.C. En-Route Navigation Display Method and Apparatus Using Head-Up Display
US20060106533A1 (en) * 2004-11-12 2006-05-18 Mitsubishi Denki Kabushiki Kaisha System for autonomous vehicle navigation with carrier phase DGPS and laser-scanner augmentation
US20060233424A1 (en) * 2005-01-28 2006-10-19 Aisin Aw Co., Ltd. Vehicle position recognizing device and vehicle position recognizing method

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9520064B2 (en) 2008-07-10 2016-12-13 Mitsubishi Electric Corporation Train-of-vehicle travel support device, control system and processor unit
US8718909B2 (en) * 2009-12-08 2014-05-06 At&T Intellectual Property I, L.P. Cellular-based live traffic service
US20130179066A1 (en) * 2009-12-08 2013-07-11 At&T Intellectual Property I, L.P. Cellular-based live traffic service
US8406987B2 (en) * 2009-12-08 2013-03-26 At&T Intellectual Property I, L.P. Cellular-based live traffic service
US8788186B2 (en) * 2009-12-08 2014-07-22 At&T Intellectual Property I, L.P. Cellular-based live traffic service
US20120257056A1 (en) * 2011-03-03 2012-10-11 Honda Elesys Co., Ltd. Image processing apparatus, image processing method, and image processing program
US9648107B1 (en) * 2011-04-22 2017-05-09 Angel A. Penilla Methods and cloud systems for using connected object state data for informing and alerting connected vehicle drivers of state changes
US20140088863A1 (en) * 2011-05-20 2014-03-27 Yoshitaka Hara Locus correcting method, locus correcting apparatus, and mobile object equipment
US9182235B2 (en) * 2011-05-20 2015-11-10 Hitachi, Ltd. Locus correcting method, locus correcting apparatus, and mobile object equipment
US9140792B2 (en) * 2011-06-01 2015-09-22 GM Global Technology Operations LLC System and method for sensor based environmental model construction
US20120310516A1 (en) * 2011-06-01 2012-12-06 GM Global Technology Operations LLC System and method for sensor based environmental model construction
US11255676B2 (en) * 2011-10-24 2022-02-22 Continental Teves Ag & Co. Ogh Sensor system for independently evaluating the integrity of the data of the sensor system
US20130162824A1 (en) * 2011-12-22 2013-06-27 Electronics And Telecommunications Research Institute Apparatus and method for recognizing current position of vehicle using internal network of the vehicle and image sensor
US9208389B2 (en) * 2011-12-22 2015-12-08 Electronics And Telecommunications Research Institute Apparatus and method for recognizing current position of vehicle using internal network of the vehicle and image sensor
US9145140B2 (en) 2012-03-26 2015-09-29 Google Inc. Robust method for detecting traffic signals and their associated states
US9796386B2 (en) 2012-03-26 2017-10-24 Waymo Llc Robust method for detecting traffic signals and their associated states
US10906548B2 (en) 2012-03-26 2021-02-02 Waymo Llc Robust method for detecting traffic signals and their associated states
US11731629B2 (en) 2012-03-26 2023-08-22 Waymo Llc Robust method for detecting traffic signals and their associated states
US9910439B1 (en) * 2012-04-06 2018-03-06 Waymo Llc Positioning vehicles to improve quality of observations at intersections
US10782695B1 (en) * 2012-04-06 2020-09-22 Waymo Llc Positioning vehicles to improve quality of observations at intersections
US8712624B1 (en) * 2012-04-06 2014-04-29 Google Inc. Positioning vehicles to improve quality of observations at intersections
US11300962B1 (en) 2012-04-06 2022-04-12 Waymo Llc Positioning vehicles to improve quality of observations at intersections
US9280156B1 (en) 2012-04-06 2016-03-08 Google Inc. Positioning vehicles to improve quality of observations at intersections
US9327734B2 (en) 2012-06-01 2016-05-03 Google Inc. Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US10331133B2 (en) 2012-06-01 2019-06-25 Waymo Llc Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US8793046B2 (en) 2012-06-01 2014-07-29 Google Inc. Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US10831196B2 (en) 2012-06-01 2020-11-10 Waymo Llc Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US9804601B2 (en) 2012-06-01 2017-10-31 Waymo Llc Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US11845472B2 (en) 2012-06-01 2023-12-19 Waymo Llc Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US11474520B2 (en) 2012-06-01 2022-10-18 Waymo Llc Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US20140067262A1 (en) * 2012-09-06 2014-03-06 Toshiba Solutions Corporation Position detection device, position detection method, and computer program product
US9026363B2 (en) * 2012-09-06 2015-05-05 Kabushiki Kaisha Toshiba Position detection device, position detection method, and computer program product
US9721171B2 (en) * 2012-11-20 2017-08-01 Robert Bosch Gmbh Method and device for detecting variable message traffic signs
US9062977B2 (en) 2012-12-19 2015-06-23 Toyota Motor Engineering & Manufacturing North America, Inc. Navigation of on-road vehicle based on object reference data that is updated
US8825371B2 (en) 2012-12-19 2014-09-02 Toyota Motor Engineering & Manufacturing North America, Inc. Navigation of on-road vehicle based on vertical elements
US9658069B2 (en) * 2012-12-20 2017-05-23 Continental Teves Ag & Co. Ohg Method for determining a reference position as the starting position for an inertial navigation system
US20150369608A1 (en) * 2012-12-20 2015-12-24 Continental Teves Ag & Co. Ohg Method for determining a reference position as the starting position for an inertial navigation system
DE102013217060A1 (en) * 2013-08-27 2015-03-05 Bayerische Motoren Werke Aktiengesellschaft Accurate positioning of a vehicle
DE102013217060B4 (en) 2013-08-27 2023-08-03 Bayerische Motoren Werke Aktiengesellschaft Accurate positioning of a vehicle
US20150241226A1 (en) * 2014-02-24 2015-08-27 Ford Global Technologies, Llc Autonomous driving sensing system and method
US10422649B2 (en) * 2014-02-24 2019-09-24 Ford Global Technologies, Llc Autonomous driving sensing system and method
US20170010108A1 (en) * 2015-02-10 2017-01-12 Mobileye Vision Technologies Ltd. Adaptive navigation based on user intervention
US11397433B2 (en) * 2015-02-10 2022-07-26 Mobileye Vision Technologies Ltd. Adaptive navigation based on user intervention
US9939813B2 (en) * 2015-02-10 2018-04-10 Mobileye Vision Technologies Ltd. Systems and methods for refining landmark positions
US20160283789A1 (en) * 2015-03-25 2016-09-29 Motorola Mobility Llc Power-saving illumination for iris authentication
US11181737B2 (en) 2016-08-05 2021-11-23 Panasonic Intellectual Property Management Co., Ltd. Head-up display device for displaying display items having movement attribute or fixed attribute, display control method, and control program
US11105637B2 (en) * 2016-09-20 2021-08-31 Trimble Inc. Vehicle navigation by dead reckoning and GNSS-aided map-matching
US11710153B2 (en) 2016-11-21 2023-07-25 Nio Technology (Anhui) Co., Ltd. Autonomy first route optimization for autonomous vehicles
US10937190B2 (en) 2016-12-15 2021-03-02 Hyundai Motor Company Vehicle localization apparatus and method
US10262434B2 (en) * 2016-12-15 2019-04-16 Hyundai Motor Company Vehicle localization apparatus and method
US11550330B2 (en) * 2017-07-12 2023-01-10 Arriver Software Ab Driver assistance system and method
US20190064830A1 (en) * 2017-08-25 2019-02-28 Toyota Jidosha Kabushiki Kaisha Host vehicle position confidence degree calculation device
US10845814B2 (en) * 2017-08-25 2020-11-24 Toyota Jidosha Kabushiki Kaisha Host vehicle position confidence degree calculation device
CN109520495A (en) * 2017-09-18 2019-03-26 财团法人工业技术研究院 Navigation positioning device and navigation positioning method using same
US11726474B2 (en) 2017-10-17 2023-08-15 Nio Technology (Anhui) Co., Ltd. Vehicle path-planner monitor and controller
US10935978B2 (en) * 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
US20210116251A1 (en) * 2017-12-07 2021-04-22 International Business Machines Corporation Location calibration based on movement path and map objects
US11898852B2 (en) * 2017-12-07 2024-02-13 International Business Machines Corporation Location calibration based on movement path and map objects
EP3842735A4 (en) * 2018-08-23 2022-06-15 Nippon Telegraph And Telephone Corporation Position coordinates estimation device, position coordinates estimation method, and program
CN111480131A (en) * 2018-08-23 2020-07-31 日本精工株式会社 Bicycle device, travel control method for bicycle device, and travel control program
US11874117B2 (en) 2019-04-04 2024-01-16 Mitsubishi Electric Corporation Vehicle positioning device

Also Published As

Publication number Publication date
CN101473195B (en) 2013-01-09
EP2034271A1 (en) 2009-03-11
EP2034271A4 (en) 2010-08-11
US8725412B2 (en) 2014-05-13
CN101473195A (en) 2009-07-01
JP2008002906A (en) 2008-01-10
EP2034271B1 (en) 2014-04-16
JP4600357B2 (en) 2010-12-15
WO2007148546A1 (en) 2007-12-27

Similar Documents

Publication Publication Date Title
US8725412B2 (en) Positioning device
US10860871B2 (en) Integrated sensor calibration in natural scenes
Rose et al. An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
EP1991973B1 (en) Image processing system and method
US10127461B2 (en) Visual odometry for low illumination conditions using fixed light sources
US8260036B2 (en) Object detection using cooperative sensors and video triangulation
JP4897542B2 (en) Self-positioning device, self-positioning method, and self-positioning program
JP3986360B2 (en) Camera calibration device
CN108692719B (en) Object detection device
JP2012127896A (en) Mobile object position measurement device
CN112074885A (en) Lane sign positioning
KR101704405B1 (en) System and method for lane recognition
EP3842751B1 (en) System and method of generating high-definition map based on camera
JP2012185011A (en) Mobile position measuring apparatus
JP4596566B2 (en) Self-vehicle information recognition device and self-vehicle information recognition method
KR20140065627A (en) Method and apparatus for providing camera calibration for vehicles
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
CN111351502A (en) Method, apparatus and computer program product for generating an overhead view of an environment from a perspective view
JP2009140192A (en) Road white line detection method, road white line detection program and road white line detection apparatus
US20230334696A1 (en) Camera orientation estimation
CN115702450A (en) Vehicle position estimation device and travel position estimation method
JPWO2004048895A1 (en) MOBILE NAVIGATION INFORMATION DISPLAY METHOD AND MOBILE NAVIGATION INFORMATION DISPLAY DEVICE
Hu et al. Fusion of vision, GPS and 3D gyro data in solving camera registration problem for direct visual navigation
WO2020113425A1 (en) Systems and methods for constructing high-definition map
EP3795952A1 (en) Estimation device, estimation method, and computer program product

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOBORI, NORIMASA;KAGAWA, KAZUNORI;REEL/FRAME:022026/0916

Effective date: 20081114

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8