WO2006087542A1 - Vehicle location

Info

Publication number
WO2006087542A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
velocity
equipment
image sequence
Application number
PCT/GB2006/000527
Other languages
French (fr)
Inventor
Savan Bhagvanji Chhaniyara
Yahya Hashem Zweiri
Kaspar Alexander Althoefer
Lakmal Dasarath Seneviratne
Original Assignee
Kings College London
Application filed by Kings College London
Publication of WO2006087542A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay


Abstract

A method for determining vehicle location, particularly the location of unmanned ground vehicles such as mobile robots, by determining the slip of the vehicle using a camera to take images of the surface as the vehicle moves.

Description

Vehicle Location
The present invention relates to a method for improving vehicle location, particularly the location of unmanned ground vehicles such as mobile robots, and for accurately determining the speed of vehicles.
To facilitate mobile robot navigation, it is advantageous to provide a mobile robot with information about the precise position and orientation of the mobile robot relative to a fixed coordinate system. Such information may help a mobile robot in planning and following a route, in docking manoeuvres, and in coordinating with other systems, including other mobile robots.
Commonly used methods for providing a robot with such information include dead reckoning, inertial navigation, laser positioning, and satellite-based positioning. Each of these techniques has disadvantages. With dead reckoning and inertial navigation, the autonomous vehicle keeps track of its various movements and calculates its position and orientation accordingly. However, inaccuracies in tracking or calculation are cumulative, so that a series of small errors can lead to substantial mistakes, especially in docking.
Satellite-based positioning requires a clear communications path to a satellite and also does not provide location information with the precision required for docking manoeuvres. Other methods for providing mobile robots with precise position and orientation information involve placing specially designed objects at known positions in the working environment, to be used as landmarks. These landmarks are sensed by a vision system carried by the mobile robot. The visual information about the landmarks is processed by the robot, enabling it to detect and recognize the various landmarks, and to determine its own position with respect to them. US 5999866 describes a method of determining the location of a vehicle by capturing a visual image of the vehicle's present environment and comparing this with a stored map to determine the vehicle location.
Current landmark methods suffer from several disadvantages. Especially if the working environment is cluttered or unevenly lit, or if the landmarks are partially occluded, errors may occur in detecting or recognizing the landmarks, resulting in errors in the position determined for the mobile robot. In addition, the processing needed in many current landmark systems to extract information from visual images requires substantial computational resources, which makes such systems difficult and expensive to implement in applications requiring the mobile robot to determine its position in real time. Landmark systems cannot be used in unfamiliar or unknown territory.
Inertial systems are among the most widely used commercialised, totally self-contained navigational systems. Early gyroscopes were mechanically complex structures with numerous moving parts, but rapid developments in solid-state electronics and mechatronics have led to electro-mechanical and electro-optical sensors with no moving parts. The method uses gyroscopes combined with accelerometers to measure rate of rotation and acceleration, and the measurements are integrated once (or twice) to yield position.
Such systems are used for cruise missile control, military and civil aviation, and space technology, where accurate estimation and control are required. For these areas of operation the components used, and the system as a whole, must be very precise and reliable. Consequently, the costs of such systems are still very high, while the cheaper sensors available suffer from poor measurement accuracy; moreover, the units are not yet small enough to be used in mobile robots, automotive applications or consumer electronics. Inertial sensor data drift with time because rate data must be integrated to yield position, so any small constant error increases without bound after integration; the least count of error is generally in centimetres. This is undesirable for mobile robot application areas such as autonomous landmine disabling, radioactive material handling in nuclear plants, scientific research, and operations like automated drilling and assembly work in industry.
In motion, autonomous vehicles can suffer from slippage, and slippage is one of the big stumbling blocks in achieving accurate localisation of autonomous vehicles. The behaviour of this phenomenon is very unpredictable and non-linear, and it changes with terrain conditions. From the control engineering and artificial intelligence points of view, this physical property has not been taken into account seriously and is treated as a disturbance. There is a specific need to develop self-intelligence in autonomous robots to fill this gap, and a need for a system that scans the ground and is able to locate the relative position accurately.
An example of this problem was NASA's Mars Exploration Rover mission, which showed how important a role slip plays in autonomous vehicle navigation and why there is an immediate need to develop a system to cope with it. The Rover had a hard time gaining traction in the sandy ground around a crater; its wheels spun in place before they actually gained traction. The Mars rover uses odometry to click off the distance its wheels travel in order to measure and register how far the vehicle has moved, but odometry cannot indicate whether a target has actually been reached or how much slippage is occurring.
Apart from its importance in autonomous vehicles, measurement of slippage is important for vehicle safety and operator aids in manned wheeled vehicles and can be used to enhance the driving performance of the vehicle; for example, if the slip parameters for each wheel are known, the torque applied to that wheel can be controlled, which can improve braking and prevent wheel spin, e.g. in icy conditions.
One method for overcoming this problem is to attach an extra castor/trailing wheel to the robot's chassis. Generally, ball-type (spherical) castors or several wheeled castors with built-in encoders are attached to the robot. This can successfully find the longitudinal slip, lateral slip and slip angle for a high-speed robot.
Rotary encoders are attached to the axes of the individual measuring wheels and castor axes to measure longitudinal and lateral slip. To find the longitudinal slip, two-sided measuring wheels with encoders are used; any difference between a measuring wheel and the corresponding driving wheel represents slippage.
Such a system may be advantageous indoors and on a limited range of ground types, but it is totally impractical for all-terrain unmanned ground vehicles for surveillance, planetary exploration and military purposes, owing to the unpredictable nature of the terrain; moreover, the extra castor wheels limit the mobility of the system and increase complexity.
Methods for determining vehicle slip are described in US 5147010, which determines the slip ratio by measuring the main drive speed and the actual ground speed and calculating the slip, and in GB 1346678, which determines slip by measuring the vehicle velocity and the rotational velocity of a driven wheel.
We have now devised improved equipment and a system for determining the location of a mobile vehicle which can deal with the problem of slippage.
According to the invention there is provided equipment for determining the slip of a vehicle moving over a surface and in contact with the surface which equipment comprises (i) a means for measuring the actual velocity of the vehicle over the surface, (ii) means for measuring the velocity as determined by the component(s) of the vehicle in contact with the surface and (iii) means for comparing these values.
The invention also provides a method for measuring the slip of a vehicle moving over a surface and in contact with the surface which method comprises measuring the actual velocity of the vehicle over the surface, measuring the velocity as determined by the component(s) of the vehicle in contact with the surface and comparing these values.
The invention also provides a method and equipment for determining the location of a vehicle from the measurement of slip.
The invention also provides equipment for determining the velocity of a vehicle comprising an image forming means able to be attached to the vehicle, which image forming means is able to form an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, and computational means for calculating the velocity of the vehicle from the sequence of spatiotemporal images.
The invention also provides a method for determining the velocity of a vehicle comprising forming a spatiotemporal image sequence of the surface on which the vehicle is travelling by recording a time-varying image sequence and calculating the velocity of the vehicle from the sequence of spatiotemporal images.
The equipment for determining the location of a vehicle preferably comprises an image forming means able to be attached to the vehicle, which image forming means is able to form an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, means to process the image sequence to obtain a flow parameter, and means to calculate the velocity of the vehicle to determine the location of the vehicle.
The method for determining the location of a vehicle preferably comprises forming an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, processing the image sequence to obtain a flow parameter and calculating the velocity of the vehicle from the flow parameter to determine the location of the vehicle. The term velocity is used in its mathematical sense of speed and direction, i.e. it is a vector, not a scalar.
The method enables the slip to be calculated by comparing the velocity (direction and speed) of the vehicle as calculated from the vehicle propulsion means, e.g. driven wheels or tracks, with the velocity of the vehicle as calculated by the method of the invention.
The equipment can be in the form of a sensor which can be attached to a vehicle. The image sequence is formed by recording a time-varying image sequence from an image recording device such as a fixed or moving camera; a great deal of information can be extracted from such a recording. An image sequence (or video) is a series of 2-D images that are sequentially ordered in time. This enables the motion estimation to be calculated.
Selection of the camera depends upon the maximum sliding velocity, the resolution and compatibility with the image grabber. Capturing sufficiently overlapping images requires a higher frame-rate camera. Presently available cameras provide from 10 to 10,000 frames/sec, which is adequate.
By "motion estimation", we mean the estimation of the displacement (or velocity) of image structures from one frame to another in a time sequence of 2-D images.
Any method can be used for extracting motion information from the image sequence; three main techniques are image difference, moving edge detection, and motion flow and optical field.
In image difference, a moving object is first detected using the difference between images; the new position of the object is then calculated mathematically with the help of pixel moments over time. In the second step of the calculation, different techniques are employed, such as fixed windowing, polygon matching and edge detection. In moving edge detection, the first step is to detect an edge, using an algorithm that finds edges based on colour, shape or brightness patterns or on criteria set by the user; the same edge is then traced in the next image some time interval later. This technique is not usually used standalone and is generally combined with others for better results.
Motion flow and optical field is the preferred method; here the motion field assigns a velocity vector to every point in an image. If a point in the environment moves with velocity $v_0$, this induces a velocity $v_i$ in the image plane, and it is possible to determine mathematically the relationship between $v_0$ and $v_i$.
The optical flow field is the velocity field that represents the three-dimensional motion of object points across a two-dimensional image. The motion of each point is described using an optic flow vector. Mathematically, this is the rate of change of the unit vector along the line of sight to the surface point. We can think of this as the motion of the intersection of the line of sight with the view sphere (which moves with the eye or camera).
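As an illustration of this step, the following minimal sketch (an assumption-laden stand-in, not the patent's implementation; the Example below uses a phase-based MATLAB algorithm) extracts an average flow vector from a pair of frames using OpenCV's Farneback dense optical flow. The function name mean_flow and the parameter values are ours.

```python
import cv2
import numpy as np

def mean_flow(prev_gray: np.ndarray, next_gray: np.ndarray):
    """Return the average (vx, vy) image motion in pixels/frame between two frames."""
    # Dense flow: one displacement vector per pixel, following the idea that the
    # motion field assigns a velocity vector to every point in the image.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    vx = float(np.mean(flow[..., 0]))  # mean horizontal flow, pixels/frame
    vy = float(np.mean(flow[..., 1]))  # mean vertical flow, pixels/frame
    return vx, vy
```

Averaging over the whole frame is reasonable here because the camera looks straight down at a roughly planar surface, so the flow field is close to uniform.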
Features of the optical flow method are that like features match: a feature in one image must correspond to the matching feature in the other, using constraints such as geometrical (shape) comparison, comparison of grey levels, etc. The assumption is that the appearance of a scene feature does not change significantly between frames.
In this method the image moves consistently; the flow is often locally uniform and discontinuities may occur at scene boundaries. Local flow vectors that are similar reinforce one another. In active vision, the flow will not be uniform close to the fixation point, but will still approximate locally to a simple smooth pattern. The assumption is that the scene is made up of extended reasonably smooth surfaces which move rigidly or at least distort smoothly when moving. The optic flow vectors are small; the amount of motion between frames is small compared to the size of the image and is most likely 10 pixels/frame or less.
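As an illustrative calculation (our numbers, not the patent's): if each pixel covers $s$ mm of terrain and the camera delivers $f$ frames/sec, the small-motion assumption of at most 10 pixels/frame bounds the measurable ground speed at

$$ v_{\max} \approx 10\,s\,f \quad \text{mm/s}, $$

so with, say, $s = 0.4$ mm/pixel and $f = 15$ frames/sec (the frame rate of the camera used in the Example below), $v_{\max} \approx 60$ mm/s.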
There are many different techniques used to determine optical flow and there are many algorithms. Several of these are described and evaluated in J. L. Barron, D. J. Fleet and S. S. Beauchemin, "Performance of Optical Flow Techniques", Int. J. Comput. Vis., vol. 12, no. 1, pp. 43-77, 1994, and in J. L. Barron, D. J. Fleet, S. S. Beauchemin and T. Burkitt, "Performance of Optical Flow Techniques", Dept. Comput. Sci., Univ. Western Ontario, London, Ontario, Canada, and Dept. Comput. Sci., Queens Univ., Kingston, Ontario, Canada, Tech. Rep. TR299, RPL-TR-9107, July 1992, revised July 1993.
The software which can be used in the invention is image acquisition software, image processing software and post processing software.
The image acquisition software is generally a type of camera driver. It enables user settings for focus, colour, brightness, shutter speed and acquisition frame settings and enables time based images to be captured and stored.
The image conversion software is used in one of the pre-processing steps. In this step the captured image sequence is manipulated to meet the criteria of the optical flow algorithm. This includes converting images between formats, e.g. from BMP to RAS (Sun Raster Image Format). Filters can also be used for resizing or clipping and for image enhancement.
The optical flow algorithm is performed at this stage on the image sequence and the results are stored for the post-processing stage. The optical flow algorithm consists of sequentially reading each image, performing the processing, closing the image from memory and displaying the velocity gradients on an image.
Post processing converts the results from the optical flow algorithms into user-defined units used for analysis of the data. The output velocities of the optical flow algorithms are in pixel/frame units. Depending upon the camera calibration, the results are converted into SI units and this velocity information is then used for detecting the slip parameters.
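A minimal sketch of this conversion, assuming a calibration factor in mm/pixel obtained as in Stage 1 of the Example below (the function and parameter names are ours):

```python
def flow_to_si(v_px_per_frame: float, mm_per_px: float, fps: float) -> float:
    """Convert an optical flow velocity from pixels/frame to metres/second."""
    # pixels/frame * mm/pixel = mm/frame; * frames/sec = mm/s; / 1000 = m/s
    return v_px_per_frame * mm_per_px * fps / 1000.0
```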
The invention can provide a system which integrates optical flow measurements with a slip parameter estimation technique, for example a sliding mode observer, Kalman filter, extended Kalman filter or direct mathematical inversion, in order to determine the slip parameters; we have found that the sliding mode observer is the preferred method for slip estimation.
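The patent does not spell out the observer design, so the following is a minimal first-order sliding mode observer sketch under simple assumptions (our formulation and gains). It smooths position samples obtained by integrating the optical flow velocity and yields a velocity estimate, from which slip follows by comparison with the encoder-derived wheel speed.

```python
import numpy as np

def smo_velocity(x_meas: np.ndarray, dt: float,
                 lam: float = 5.0, k: float = 50.0) -> np.ndarray:
    """First-order sliding mode observer: estimate velocity from position samples.

    Dynamics: x_hat' = v_hat + lam*sign(e), v_hat' = k*sign(e), with e = x - x_hat.
    """
    x_hat, v_hat = float(x_meas[0]), 0.0
    v_est = np.zeros(len(x_meas))
    for n, x in enumerate(x_meas):
        e = x - x_hat                              # sliding variable
        x_hat += (v_hat + lam * np.sign(e)) * dt   # drive x_hat onto the measurement
        v_hat += k * np.sign(e) * dt               # equivalent output injection
        v_est[n] = v_hat
    return v_est

# Slip then follows from equation (1.1) below:
# S = (r*omega - v_est) / (r*omega), with omega taken from the drive encoders.
```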
From the velocity, the positional data can be obtained via integration, and it is then possible to plot the current trajectory and the slip that occurs. These data are then fed back to a trajectory controller for correction, to enable the vehicle to follow a tight trajectory.
The invention is particularly useful with unmanned ground vehicles (UGVs) and mobile robots, and enables their location to be determined by measuring the velocity of the vehicle or robot from its starting point.
As set out below, the invention can predict side slip: by taking the velocity of the vehicle or robot as indicated by the driving means in contact with the surface over which it is moving, and comparing this with the velocity calculated by the method of the invention, it is possible to calculate the slippage.
As well as being used in unmanned vehicles, the invention can be used with driven vehicles such as commercial vehicles to provide increased safety and improved navigability. The invention enables slip in the x and y directions, as well as the angular component of slip, to be determined; this was not possible in earlier anti-skid systems. By computing the difference between the speed measured at the wheels and the vehicle speed over the ground, as done by the method and equipment of the present invention, these three slip components can be accurately determined. This is very important for any advanced traction control system in a vehicle. With the slip components accurately computed, the actuation signal for each wheel (i.e. the power requirement for each wheel) as well as the steering angle can be calculated to achieve the most appropriate vehicle trajectory for a given situation, e.g. to drive the vehicle around a bend when one or more wheels suddenly experience high slip due to changes in the road surface (icy or wet roads).
Such an advanced system of vehicle control would take over control from the driver and attempt to steer the vehicle out of a dangerous situation as safely as possible.
As well as its use for measuring the location and velocity of vehicles, the present invention could provide estimates of location when a GPS system fails to provide position information (e.g. in a tunnel or in narrow city streets). First, the proposed system can provide better results when small accelerations occur and/or when the vehicle is moving at a constant speed (in such cases accelerometer signals are very noisy and unreliable). Secondly, error accumulation is not so high, since only one integration from the speed measurements (the output of the system of the present invention) is needed, in contrast to the two integrations which are needed if accelerometers (inertial sensors) are used.
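To make the error-growth comparison concrete (an illustrative derivation, not part of the original text): a constant sensor bias $b$ passed through a single integration gives a position error that grows linearly in time, whereas the double integration required for accelerometer data grows quadratically:

$$ e_{\mathrm{vel}}(t) = \int_0^t b\,d\tau = b\,t, \qquad e_{\mathrm{acc}}(t) = \int_0^t\!\int_0^\tau b\,d\sigma\,d\tau = \tfrac{1}{2}\,b\,t^2 . $$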
An explanation of how the system of the present invention works is set out below.
1. Basic slip models
When driving torque is applied to a wheel, a tractive force is developed at the contact area between the wheel and the ground. During this process the wheel's tread in front of and within the contact area is under compressive forces, which also produce shear deformation of the sidewalls of the wheel. The distance travelled under the action of driving torque will generally be less than in free rolling, mainly because the tread elements are compressed before entering the contact region. This phenomenon is referred to as longitudinal slip. The linear tyre model does not consider lateral tyre forces, owing to the complex interactions between lateral and longitudinal tyre forces.
Mathematically, slip can be described as:

$$ S = \frac{r\dot{\theta} - \dot{x}}{r\dot{\theta}} \qquad (1.1) $$

where
$\dot{\theta}$: wheel angular velocity
$r$: wheel radius
$\dot{x}$: vehicle velocity
$\alpha$: slip angle
$S$: slip
$V$: side slip velocity
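A direct instantiation of equation (1.1), together with the slip-angle relation (1.2) given in the next section, as a hedged sketch (the function names are ours):

```python
import math

def longitudinal_slip(r: float, omega: float, v: float) -> float:
    """Equation (1.1): S = (r*omega - v) / (r*omega); requires r*omega != 0."""
    return (r * omega - v) / (r * omega)

def slip_angle(v_side: float, v_forward: float) -> float:
    """Equation (1.2): alpha = arctan(V / x_dot), in radians."""
    return math.atan2(v_side, v_forward)
```

With r*omega taken from the drive encoders and v from the optical flow measurement, a positive S indicates that the wheel is spinning faster than the ground speed warrants.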
2. Combined slip model
Combined slip mostly comes into the picture when the wheel is cornering and side forces also come into effect. These develop a lateral force at the contact area between the wheel and the ground, so that the wheel moves along a path at an angle α to the wheel plane; this angle α is usually referred to as the slip angle. Combined slip demonstrates strongly non-linear behaviour; it depends on a number of parameters such as soil type, tyre-soil adhesion, side forces, pressure on the wheel and many more. It is very complicated to derive an accurate mathematical model of the combined slip, but basically it can be put as below.
$$ S_v = \tan\alpha \qquad (1.2) $$

$$ \alpha = \tan^{-1}\!\left(\frac{V}{\dot{x}}\right) $$

where
$\dot{\theta}$: wheel angular velocity
$\dot{x}$: vehicle velocity
$\alpha$: slip angle
$S$: slip
$V$: side slip velocity

3. Prediction of slip
The prediction of slip requires two important parameters, angular velocity and vehicle velocity, whether for wheeled robots or for tracked vehicles. Angular velocity can be obtained from encoders mounted on the driving sprockets.
The velocity of the UGV is obtained by the proposed system, and the slip can then be found using equations (1.1) and (1.2), or the velocity can be incorporated in a slip model based on the kinematics and dynamics of the tracked vehicle as set out in equations (1.3)-(1.4).
$$ \dot{x} = \frac{r}{2}\left[\omega_o(1-i_o) + \omega_i(1-i_i)\right]\left[\cos\varphi(t) + \sin\varphi(t)\tan\alpha(t)\right] $$

$$ \dot{y} = \frac{r}{2}\left[\omega_o(1-i_o) + \omega_i(1-i_i)\right]\left[\sin\varphi(t) - \cos\varphi(t)\tan\alpha(t)\right] \qquad (1.3) $$

$$ \dot{\varphi} = \frac{\Delta v}{B} = \frac{V_o(1-i_o) - V_i(1-i_i)}{B} \qquad (1.4) $$

$$ \alpha = \arctan\!\left(\frac{\dot{y}}{\dot{x}}\right) - \varphi $$
where
$S$: slip for the tyre model
$r$: radius of wheel
$\dot{\theta}$: angular velocity of driving sprocket
$V$: side slip velocity
$\dot{x}$, $\dot{y}$: vehicle velocity components
$\theta_1$: front castor angle orientation
$\theta_2$: back castor angle orientation
$A$: contact area
$i$: slip
$i_i$: slip of the inside track of a tracked vehicle
$i_o$: slip of the outside track of a tracked vehicle
$K$: shear deformation modulus
$b$: width of a single track
$l$: length of a tracked vehicle
$I_z$: mass moment of inertia of the vehicle about the Z axis
$c$: cohesion
$\phi$: angle of internal shearing resistance
$B$: the tread of a tracked vehicle, i.e. the spacing between the centrelines of the two tracks
$F$: total tractive effort of a track
$\varphi$: angular displacement of a tracked vehicle about the Z axis, measured clockwise
$V_o$: speed of the outside track of a tracked vehicle
$V_i$: speed of the inside track of a tracked vehicle
$V_x$: the velocity component of the vehicle along the $X_e$ axis
$V_c$: the velocity of the vehicle
$x$: linear displacement of a tracked vehicle along the X axis with regard to an arbitrary frame
$y$: linear displacement of a tracked vehicle along the Y axis with regard to an arbitrary frame
$\alpha$: slip angle
$\mu_t$: coefficient of traction
$\mu_l$: coefficient of lateral resistance
$R$: turning radius
$O$: instantaneous centre of rotation
$W$: weight of the tracked vehicle
$F_o$, $F_i$: thrust on the outer and inner track respectively
$x_e$: linear displacement of a tracked vehicle along the X axis with regard to a frame attached to the vehicle
$y_e$: linear displacement of a tracked vehicle along the Y axis with regard to a frame attached to the vehicle
$\omega_o$: angular velocity of the outer track
$\omega_i$: angular velocity of the inner track
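As a hedged illustration of how equations (1.3) and (1.4) might be integrated forward in time to dead-reckon the pose of a tracked vehicle (simple Euler integration; the function name and constant-input assumption are ours):

```python
import math

def integrate_pose(omega_o: float, omega_i: float, i_o: float, i_i: float,
                   alpha: float, r: float, B: float, dt: float, steps: int,
                   x: float = 0.0, y: float = 0.0, phi: float = 0.0):
    """Euler-integrate the tracked-vehicle kinematics of equations (1.3)-(1.4)."""
    for _ in range(steps):
        v = 0.5 * r * (omega_o * (1 - i_o) + omega_i * (1 - i_i))  # mean track speed
        x_dot = v * (math.cos(phi) + math.sin(phi) * math.tan(alpha))
        y_dot = v * (math.sin(phi) - math.cos(phi) * math.tan(alpha))
        # Equation (1.4), with track speeds V = r * omega:
        phi_dot = r * (omega_o * (1 - i_o) - omega_i * (1 - i_i)) / B
        x, y, phi = x + x_dot * dt, y + y_dot * dt, phi + phi_dot * dt
    return x, y, phi
```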
Thus this system measures slip independently, and it can also predict the slip using the above model. It is a feature of the invention that the system is totally self-contained; accuracy is very high, as the resolution is limited only by the optics; capital cost is limited to that of a camera, lighting, offline storage and a processor; installation cost is limited to the labour and time required for initial setting up; and it is a totally non-contact sensor/method for measuring slip which does not restrict the mobility of the UGV in any type of terrain.
The invention can be used with a trajectory controller which controls the trajectory of a remote vehicle by controlling the velocity of the vehicle. In this application, information about the location of the vehicle is fed to the trajectory controller and compared with the desired trajectory, and the velocity of the vehicle is adjusted by the trajectory controller to correct any deviation from the desired trajectory.
This gives a superior result compared with trajectory controllers which rely on the speed of rotation of wheels or tracks and measurements of distance travelled which, as explained above, are subject to errors due to slippage.
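A minimal sketch of such a feedback loop, using a proportional correction on heading and speed (the gains, names and control law are illustrative assumptions, not the patent's controller):

```python
import math

def trajectory_correction(x: float, y: float, phi: float,
                          x_ref: float, y_ref: float, v_nominal: float,
                          k_v: float = 0.5, k_phi: float = 1.5):
    """Return corrected (speed, turn-rate) commands from the slip-aware pose estimate."""
    ex, ey = x_ref - x, y_ref - y                    # position error
    heading_ref = math.atan2(ey, ex)                 # bearing to the reference point
    e_phi = math.atan2(math.sin(heading_ref - phi),
                       math.cos(heading_ref - phi))  # heading error wrapped to [-pi, pi]
    v_cmd = v_nominal + k_v * math.hypot(ex, ey)     # speed up if behind the trajectory
    w_cmd = k_phi * e_phi                            # turn-rate to steer back on course
    return v_cmd, w_cmd
```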
An illustration of a vehicle incorporating the present invention is shown in fig. I5 in which :- Fig. 1 shows a tracked vehicle and
Fig. 2 shows a basic flow diagram of the system
Referring to fig. 1, a tracked UGV (1) moving over terrain (2) has a camera (3) mounted on it. The camera (3) takes pictures of the terrain perpendicularly below it at (4). As the vehicle moves over the terrain, the images obtained by the camera are processed according to the invention.
Referring to fig. 2, the images from camera (3) are processed and a spatiotemporal image sequence obtained. This is processed by the optical flow algorithm to obtain the flow parameter, which gives the velocity and position; this parameter is passed to the central control algorithm along with the reading from the encoder on the UGV, which gives the readings from the drive chain of the UGV. The central control algorithm calculates the slip, and a correction based on the kinematic and dynamic model of the UGV can be applied. This can be passed on to the trajectory controller to give a corrected trajectory.
In the Example, a system at test scale was evaluated.
Example
One-dimensional motion was investigated for experimental purposes. A camera was mounted on a carriage which could move in only one direction; the carriage represents the moving unmanned robot. Different terrains were represented by sand, small stony terrain, concrete and carpet.
Stage 1 Camera Calibration
The camera was calibrated to establish the relationship between the image coverage of the terrain in the x and y directions and the image resolution in pixels. This step thus provides the relationship between image pixels and the trajectory parameters at the mm scale.
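A hedged sketch of such a calibration (the target and names are assumed for illustration): image an object of known size and derive a mm-per-pixel scale factor.

```python
def mm_per_pixel(known_length_mm: float, measured_length_px: float) -> float:
    """Scale factor relating image pixels to ground distance, from a known target."""
    return known_length_mm / measured_length_px

# For example, a 100 mm rule spanning 250 pixels gives 0.4 mm/pixel, so a flow of
# 1 pixel/frame at 15 frames/sec corresponds to 6 mm/s over the ground.
```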
Stage 2 Image Acquisition
Spatiotemporal image sequences were taken by the camera and stored on a hard disk; the hardware used was a 3Com HomeConnect PC webcam and the software was Capture Studio Professional 4.1.
The carriage was moved in one direction with uniform motion and images were taken at a predefined rate; the maximum frame rate of the camera was 15 frames/sec. The motion of the camera was measured against a fixed rule and the distance moved during each frame was registered manually. The BMP image format was used for image capture.
Stage 3 Image Conversion
The image sequence was converted into a suitable format before processing; ACDSee 4.0 software was used for the image conversion and enhancement. The acquired image sequence was in BMP form and was converted to Sun Raster Image Format (RAS).
Stage 4 Image Processing: Optical Flow Analysis
After image acquisition, an optical flow algorithm is used to find the velocity vectors in the direction of motion of the carriage. A MATLAB implementation of the phase-based approach to estimating the optical flow field using spatial filtering was used; this software is freely available on the MATLAB Central File Exchange. The algorithm proceeds in 3 steps:
1. Spatial filtering
2. Phase gradient estimation
3. IOC (intersection of constraints) using recurrent networks
The flow chart of the optical flow algorithm is shown in Table 1.

Table 1
[Flow chart of the optical flow algorithm; not reproduced]
The output of the algorithm consists of the velocity of each moving pixel in the x and y directions.
Stage 5 Post Processing
In this step an average velocity component is calculated from the results available from Stage 4 above and converted into trajectory parameters. For establishing the trajectory of the UGV, instantaneous velocity data were plotted against time and a curve fitting model established. Integrating the curve fitting model gives the position at different time intervals. These data can be processed for further analysis or used as feedback for a robot trajectory controller.
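A hedged sketch of this post-processing step (polynomial curve fitting followed by numerical integration; the function name and polynomial degree are ours):

```python
import numpy as np

def trajectory_from_velocity(t: np.ndarray, v: np.ndarray, deg: int = 3):
    """Fit v(t) with a polynomial and integrate it to recover position over time."""
    coeffs = np.polyfit(t, v, deg)     # the curve fitting model for velocity
    v_fit = np.polyval(coeffs, t)
    # Trapezoidal integration of the fitted velocity; position is zero at t[0]:
    pos = np.concatenate(([0.0], np.cumsum(0.5 * (v_fit[1:] + v_fit[:-1]) * np.diff(t))))
    return v_fit, pos
```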
Results
Experiments were performed on the manually archived image sequences for the different terrains and the results were correlated with the actual measurements.
Results of the Experiments are shown below.
Parameters
Terrain sample: sand
Frame rate: 30 frames/sec
Actual distance moved: 100 mm
Experiment 1

[Frame-by-frame results table; not reproduced]

Average velocity: 2.428069337 pixels/frame
Actual distance moved: 100 mm
Total distance: 249.7101509 pixels (95.20199503 mm)
Relative error: 4.798004966 %
Experiment 2
[Frame-by-frame results table; not reproduced]

Total distance: 244.2304 pixels (93.1128 mm)
Relative error: 6.89 %
Experiment 3

[Frame-by-frame results table; not reproduced]

Total distance: 276.6976 pixels (105.4910 mm)
Relative error: -5.49 %

Summary of Results
[Summary of results table; not reproduced]
The results are shown graphically in figures 3 to 5 of the drawings.
As can be seen the results obtained by the method of the invention correlate accurately with the actual movement.

Claims

1. Equipment for determining the slip of a vehicle moving over a surface and in contact with the surface which equipment comprises (i) a means for measuring the actual velocity of the vehicle over the surface comprising an image forming means able to be attached to the vehicle, which image forming means is able to form an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence; (ii) processing means able to process the image sequence to obtain a flow parameter and thereby measure the actual velocity of the vehicle; (iii) means for measuring a velocity of the vehicle as determined from component(s) of the vehicle in contact with the surface; (iv) means to calculate the slip of the vehicle from the flow parameter and the velocity as measured by the component(s) of the vehicle in contact with the surface and (v) means for comparing these values.
2. Equipment as claimed in claim 1 in which the image forming means comprises a camera.
3. Equipment as claimed in claim 2 in which the camera provides at least 10 frames/sec.
4. Equipment as claimed in any one of claims 1 to 3 in which the processing means comprises a processor which is able to process the image sequence by a motion flow and optical field algorithm.
5. Equipment as claimed in claim 3 or 4 in which the processing means is able to display the velocity gradients of the vehicle on an image.
6. Equipment as claimed in any one of claims 1 to 5 in which the spatiotemporal image sequence is processed by a process selected from image difference, moving edge detection and motion flow and optical field.
7. Equipment as claimed in any one of claims 1 to 5 in which the image of the surface is processed by image acquisition software, image processing software and post processing software.
8. Equipment as claimed in claim 7 in which the image acquisition software is a camera driver which enables user settings for focus, colour, brightness, shutter speed and acquisition frame settings and enables time based images to be captured and stored.
9. Equipment as claimed in claims 7 or 8 in which the image conversion software manipulates the captured image sequence as per the optical flow algorithm criteria and converts images to a different format.
10. Equipment as claimed in claim 9 in which the image is converted from BMP to RAS (Sun Raster Image Format).
11. Equipment as claimed in claims 9 or 10 in which the optical flow algorithm consists of sequential reading, performing the processing and then closing the image from memory and displaying the velocity gradients on an image.
12. Equipment as claimed in any one of the preceding claims in which the means to measure the velocity of the vehicle as determined by the component(s) of the vehicle in contact with the surface are encoders connected to the drive train of the vehicle.
13. Equipment as claimed in any one of claims 1 to 12 which is in the form of a sensor able to be attached to a vehicle.
14. Equipment as claimed in any one of claims 1 to 13 in which the vehicle is an unmanned ground vehicle or a robot.
15. Equipment for determining the velocity of a vehicle comprising an image forming means able to be attached to the vehicle, which image forming means is able to form an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence and computational means for calculating the velocity of the vehicle from the sequence of spatiotemporal images.
16. A method for measuring the slip of a vehicle moving over a surface and in contact with the surface which method comprises measuring the actual velocity of the vehicle over the surface by forming an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, measuring the velocity as determined by the component(s) of the vehicle in contact with the surface and comparing these values.
17. A method as claimed in claim 16 in which the actual velocity of the vehicle is measured by forming an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, processing the image sequence to obtain a flow parameter and calculating the velocity of the vehicle from the flow parameter.
18. A method as claimed in claim 17 in which the image of the surface is formed by a fixed or moving camera.
19. A method as claimed in claim 18 in which the camera provides at least 10 frames/sec.
20. A method as claimed in any one of claims 16 to 19 in which the spatiotemporal image sequence is processed by a process selected from image difference, moving edge detection and motion flow and optical field.
21. A method as claimed in any one of claims 16 to 20 in which the image of the surface is processed by image acquisition software, image processing software and post processing software.
22. A method as claimed in claim 21 in which the image acquisition software is a camera driver which provides user settings for focus, colour, brightness, shutter speed and acquisition frame parameters, and enables time-based images to be captured and stored.
23. A method as claimed in claim 21 or 22 in which the image processing software manipulates the captured image sequence in accordance with the requirements of the optical flow algorithm and converts the images to a different format.
24. A method as claimed in claim 23 in which the image is converted from BMP to RAS (Sun Raster Image Format).
25. A method as claimed in claim 23 or 24 in which the optical flow algorithm sequentially reads each image, processes it, closes it from memory and displays the velocity gradients on an image.
26. A method as claimed in any one of claims 16 to 25 in which the velocity as determined by the component(s) of the vehicle in contact with the surface is obtained from encoders connected to the drive train of the vehicle.
27. A method for determining the velocity of a vehicle which comprises forming an image of the surface on which the vehicle is moving as a spatiotemporal image sequence of the surface by recording a time-varying image sequence, processing the image sequence to obtain a flow parameter and calculating the velocity of the vehicle from the flow parameter.
28. A method as claimed in claim 27 in which the image of the surface is formed by a fixed or moving camera.
29. A method as claimed in claim 27 or 28 in which the camera provides at least 10 frames/sec.
30. A method as claimed in any one of claims 27 to 29 in which the spatiotemporal image sequence is processed by a process selected from image difference, moving edge detection and motion flow and optical field.
31. A method as claimed in any one of claims 27 to 30 in which the image of the surface is processed by image acquisition software, image processing software and post processing software.
32. A method as claimed in claim 31 in which the image acquisition software is a camera driver which provides user settings for focus, colour, brightness, shutter speed and acquisition frame parameters, and enables time-based images to be captured and stored.
33. A method as claimed in claim 31 or 32 in which the image processing software manipulates the captured image sequence in accordance with the requirements of the optical flow algorithm and converts the images to a different format.
34. A method as claimed in claim 33 in which the image is converted from BMP to RAS (Sun Raster Image Format).
35. A method as claimed in claim 33 or 34 in which the optical flow algorithm sequentially reads each image, processes it, closes it from memory and displays the velocity gradients on an image.
36. A method of controlling the trajectory of a vehicle which comprises determining the location of the vehicle by the method of any one of claims 27 to 35 and feeding the information about its location to a trajectory controller to enable the trajectory controller to adjust the velocity of the vehicle to achieve the required trajectory.
37. A method for determining the location of a vehicle comprising measuring the velocity of the vehicle as claimed in any one of claims 27 to 36 and using the velocity to determine the location of the vehicle.
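
Claims 4, 11, 17 and 27 recite processing the spatiotemporal image sequence with a motion flow and optical field algorithm to obtain a flow parameter from which the vehicle velocity is calculated. The sketches following the claims illustrate, under clearly stated assumptions, how such operations might be realised; none of them is taken from the specification. The first is a minimal velocity computation using OpenCV's dense Farneback optical flow as a stand-in for the claimed algorithm; the ground-plane scale M_PER_PIXEL and the frame rate FPS are assumed values, not figures from the application.

import cv2
import numpy as np

M_PER_PIXEL = 0.0005   # assumed ground-plane scale (metres per pixel)
FPS = 10.0             # claims 3, 19 and 29 require at least 10 frames/sec

def velocity_from_frames(prev_gray, next_gray):
    """Estimate vehicle speed (m/s) and velocity components from two
    consecutive grayscale frames of the surface using dense optical flow."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # Mean pixel displacement per frame; the surface appears to move
    # opposite to the vehicle, so negate to obtain the vehicle velocity.
    mean_flow = flow.reshape(-1, 2).mean(axis=0)
    vx, vy = -mean_flow * M_PER_PIXEL * FPS
    return float(np.hypot(vx, vy)), (float(vx), float(vy))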
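
Claims 16 and 26 obtain slip by comparing the image-derived actual velocity with the velocity implied by the components in contact with the surface, the latter read from drive-train encoders (claims 12 and 26). A hedged sketch of that comparison using the conventional longitudinal slip ratio; the wheel radius and the helper names are illustrative assumptions.

def encoder_velocity(wheel_rate_rad_s, wheel_radius_m=0.05):
    """Velocity implied by the drive train: encoder angular rate (rad/s)
    times wheel radius (m); the radius is an assumed value."""
    return wheel_rate_rad_s * wheel_radius_m

def slip_ratio(v_encoder, v_actual, eps=1e-6):
    """Longitudinal slip: positive when the driven wheels turn faster
    than the vehicle actually travels over the surface."""
    return (v_encoder - v_actual) / max(abs(v_encoder), eps)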
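
Claims 10, 24 and 34 convert each captured image from BMP to RAS (Sun Raster Image Format). A minimal sketch of that conversion, writing the 32-byte big-endian Sun Raster header followed by BGR pixel rows padded to 16-bit boundaries (RT_STANDARD); reading via Pillow and the helper name are assumptions, since the specification does not name a conversion tool.

import struct
from PIL import Image

SUN_MAGIC = 0x59A66A95  # Sun Raster magic number

def bmp_to_ras(bmp_path, ras_path):
    """Convert a BMP file to a 24-bit RT_STANDARD Sun Raster file."""
    img = Image.open(bmp_path).convert("RGB")
    w, h = img.size
    pad = (w * 3) % 2              # each row is padded to an even byte count
    px = img.load()
    rows = bytearray()
    for y in range(h):
        for x in range(w):
            r, g, b = px[x, y]
            rows += bytes((b, g, r))   # RT_STANDARD stores BGR order
        rows += b"\x00" * pad
    # Header fields: magic, width, height, depth, data length,
    # type (1 = RT_STANDARD), colour-map type (none), colour-map length.
    header = struct.pack(">8I", SUN_MAGIC, w, h, 24, len(rows), 1, 0, 0)
    with open(ras_path, "wb") as f:
        f.write(header + bytes(rows))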
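
Claims 36 and 37 use the image-derived velocity to track the vehicle's location and to correct its trajectory. A minimal dead-reckoning and proportional-correction sketch; the time step follows from the claimed minimum frame rate, while the controller gain and interface are illustrative assumptions.

def update_location(x, y, vx, vy, dt=0.1):
    """Dead-reckon the location by integrating the image-derived velocity
    over one frame interval (0.1 s at the claimed 10 frames/sec)."""
    return x + vx * dt, y + vy * dt

def velocity_correction(required, measured, gain=0.5):
    """Proportional adjustment of the velocity command toward the
    required trajectory; the gain is an assumed value."""
    return gain * (required - measured)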
PCT/GB2006/000527 2005-02-18 2006-02-15 Vehicle location WO2006087542A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0503392.3A GB0503392D0 (en) 2005-02-18 2005-02-18 Vehicle location
GB0503392.3 2005-02-18

Publications (1)

Publication Number Publication Date
WO2006087542A1 true WO2006087542A1 (en) 2006-08-24

Family

ID=34385710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/000527 WO2006087542A1 (en) 2005-02-18 2006-02-15 Vehicle location

Country Status (2)

Country Link
GB (1) GB0503392D0 (en)
WO (1) WO2006087542A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6453223B1 (en) * 1996-11-05 2002-09-17 Carnegie Mellon University Infrastructure independent position determining system
EP0945319A1 (en) * 1998-03-25 1999-09-29 Lucent Technologies Inc. Process for determining dynamic properties of motor vehicles
US20040138799A1 (en) * 2001-05-14 2004-07-15 Sandvik Tamrock Oy Method and apparatus for determining position of mining machine
US20040016077A1 (en) * 2002-07-26 2004-01-29 Samsung Gwangju Electronics Co., Ltd. Robot cleaner, robot cleaning system and method of controlling same
US20040088080A1 (en) * 2002-10-31 2004-05-06 Jeong-Gon Song Robot cleaner, robot cleaning system and method for controlling the same

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9380742B2 (en) 2012-01-13 2016-07-05 Robert Bosch Gmbh Autonomous implement
CN104067190A (en) * 2012-01-13 2014-09-24 罗伯特·博世有限公司 Autonomous implement
WO2013104455A1 (en) * 2012-01-13 2013-07-18 Robert Bosch Gmbh Autonomous implement
WO2014079632A1 (en) * 2012-11-26 2014-05-30 Robert Bosch Gmbh Autonomous transportation device
US10149430B2 (en) 2013-02-20 2018-12-11 Husqvarna Ab Robotic work tool configured for improved turning in a slope, a robotic work tool system, and a method for use in the robot work tool
EP2959349A1 (en) * 2013-02-20 2015-12-30 Husqvarna AB A robotic work tool configured for improved turning in a slope, a robotic work tool system, and a method for use in the robot work tool
EP2959349A4 (en) * 2013-02-20 2017-03-29 Husqvarna AB A robotic work tool configured for improved turning in a slope, a robotic work tool system, and a method for use in the robot work tool
US10077176B2 (en) 2013-12-12 2018-09-18 Grenzebach Maschinenbau Gmbh Driver-free transport vehicle for the transportation of heavy loads on carriages and method for operating the transport vehicle
WO2015085985A3 (en) * 2013-12-12 2015-10-22 Grenzebach Maschinenbau Gmbh Driver-free transport vehicle for the transportation of heavy loads on carriages and method for operating the transport vehicle
CN105242675A (en) * 2014-06-17 2016-01-13 苏州宝时得电动工具有限公司 Automatic walking equipment
US10731994B2 (en) * 2015-09-30 2020-08-04 Sony Corporation Information processing device and information processing method
CN111813131A (en) * 2020-09-01 2020-10-23 中国人民解放军国防科技大学 Guide point marking method and device for visual navigation and computer equipment
CN111813131B (en) * 2020-09-01 2020-11-24 中国人民解放军国防科技大学 Guide point marking method and device for visual navigation and computer equipment
CN114440874A (en) * 2021-12-31 2022-05-06 深圳市云鼠科技开发有限公司 Fusion positioning method and device based on optical flow and grating
CN114440874B (en) * 2021-12-31 2022-11-01 深圳市云鼠科技开发有限公司 Fusion positioning method and device based on optical flow and grating

Also Published As

Publication number Publication date
GB0503392D0 (en) 2005-03-23

Similar Documents

Publication Publication Date Title
KR102508843B1 (en) Method and device for the estimation of car egomotion from surround view images
Kelly et al. Combined visual and inertial navigation for an unmanned aerial vehicle
US11867510B2 (en) Integrated vision-based and inertial sensor systems for use in vehicle navigation
Helmick et al. Path following using visual odometry for a mars rover in high-slip environments
WO2006087542A1 (en) Vehicle location
US8494225B2 Navigation method and apparatus
Karamat et al. Novel EKF-based vision/inertial system integration for improved navigation
EP2856273B1 (en) Pose estimation
CN105931275A (en) Monocular and IMU fused stable motion tracking method and device based on mobile terminal
CN105953796A (en) Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN113223161B (en) Robust panoramic SLAM system and method based on IMU and wheel speed meter tight coupling
CN107644441A (en) Multi-foot robot complex road condition based on three-dimensional imaging is separated into point methods of stopping over
Balaram Kinematic state estimation for a Mars rover
Balaram Kinematic observers for articulated rovers
Reina et al. Vision-based estimation of slip angle for mobile robots and planetary rovers
Reina et al. Odometry correction using visual slip angle estimation for planetary exploration rovers
Sorensen et al. On-line optical flow feedback for mobile robot localization/navigation
Liu et al. Implementation and analysis of tightly integrated INS/stereo VO for land vehicle navigation
Mikov et al. Vehicle dead-reckoning autonomous algorithm based on turn velocity updates in kalman filter
Huntsberger et al. Sensory fusion for planetary surface robotic navigation, rendezvous, and manipulation operations
Lee et al. Development of a vehicle body velocity sensor using Modulated Motion Blur
Seneviratne et al. The modelling and estimation of driving forces for unmanned ground vehicles in outdoor terrain
Kundra et al. Improving orientation estimation in mobiles with built-in camera
KR100575108B1 (en) Method of Docking Multiple Spacecrafts Using Vision Sensor
Albilani et al. Localization of Autonomous Vehicle with low cost sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 06709764
Country of ref document: EP
Kind code of ref document: A1
WWW Wipo information: withdrawn in national office
Ref document number: 6709764
Country of ref document: EP