WO1987007009A1 - Position locating system for a vehicle - Google Patents

Position locating system for a vehicle

Info

Publication number
WO1987007009A1
WO1987007009A1 PCT/US1987/001146
Authority
WO
WIPO (PCT)
Prior art keywords
angle
beacon
sensor
locating system
position locating
Prior art date
Application number
PCT/US1987/001146
Other languages
French (fr)
Inventor
Faycal Benayad-Cherif
James F. Maddox
Robert Warren George II
Original Assignee
Denning Mobile Robotics, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denning Mobile Robotics, Inc. filed Critical Denning Mobile Robotics, Inc.
Publication of WO1987007009A1 publication Critical patent/WO1987007009A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/783Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Abstract

A position locating system for a vehicle such as a mobile robot which includes a multisector sensor (60) for sensing the coded signal emitted by a beacon. There are elements (406, 414, 418) responsive to the sensor for generating a code data signal representative of the coded signal, and elements (382, 386) responsive to the sensor sectors for generating an angle data signal representative of the angle from the sensor to the beacon in at least one of the azimuthal and altitude dimensions.

Description

POSITION LOCATING SYSTEM FOR A VEHICLE
FIELD OF INVENTION This invention relates to a position locating system for a vehicle or mobile robot, and more particularly to such a system which determines the angles from the vehicle to a remote beacon.
CROSS-REFERENCES The following applications, filed concurrently herewith, are incorporated herein by reference:
Inventors, Title, Attorney's Docket No.
Maddox et al., Intrusion Detection System, DMR-101J
Muller et al., Ultrasonic Ranging System, DMR-102J
Pavlak et al., Power-Up Sequencing Apparatus, DMR-104J
Maddox et al., Beacon Proximity Detection System for Vehicle, DMR-105J
Kadonoff et al., Orientation Adjustment System and Robot Using Same, DMR-106J
Kadonoff et al., Obstacle Avoidance System, DMR-107J
Kadonoff et al., Beacon Navigation System and Method for Guiding a Vehicle, DMR-108J
George II et al., Recharge Docking System for Mobile Robot, DMR-110J
BACKGROUND OF INVENTION Mobile robots are being used more and more to perform tedious and dangerous tasks and to provide security. A serious problem is presented by autonomous mobile robots which patrol a secured space in some pattern or randomly. To keep from becoming lost, stymied and ineffective, the robot must know where it is. In the past, robots have been made to home in on a signal to return to base. But this does not help determine position; it only vectors the robot home. If a fire or intrusion occurred, the robot might detect it but would be unable to relay its location. Most robots cannot safely travel more than a few feet from the homing signal. Closer control can be achieved by connecting the robot by cable directly to a central computer, but the disadvantages of trailing a cable about make this approach unappealing. Typically the locator signals are infrared and suffer from high background noise due to ambient light such as daylight, incandescent and fluorescent lighting. Because of this, some systems must be operated in the dark or in very low light conditions.
SUMMARY OF INVENTION
It is therefore an object of this invention to provide an improved position locating system for a vehicle such as a mobile robot.
It is a further object of this invention to provide such an improved locating system which defines the location of the vehicle with respect to a specific one of a known number of beacons.
It is a further object of this invention to provide such an improved locating system which is able to operate in ambient light conditions.
It is a further object of this invention to provide such an improved locating system which has increased sensitivity and can operate for greater distances from a beacon.
The invention results from the realization that a truly effective position locating system for a vehicle such as a mobile robot may be achieved by using a number of remote, uniquely identifiable beacons to provide a signal whose altitude and azimuthal angle relative to the robot can be readily ascertained and used to calculate location of the robot by direction and/or distance between the beacon and the robot.
This invention also features a position locating system for a vehicle such as a mobile robot. A multisector sensor is carried by the vehicle or robot for sensing the coded signal emitted by a beacon. There are means responsive to the sensor for generating a code data signal representative of the coded signal and means responsive to the sensor sectors for generating an angle data signal representative of the angle from the sensor to the beacon in at least one of the azimuthal and altitude dimensions.
In a preferred embodiment there are a plurality of remote beacons each beacon producing a coded signal which uniquely identifies that beacon. The angle may be the altitude angle, and there are means responsive to the altitude angle for calculating the distance from the sensor to the beacon, or the angle may be the azimuthal angle and there may be means responsive to the azimuthal angle for calculating the direction from the beacon to the sensor.
Each said beacon may be self-contained and independent of the other beacons. The coded signal may be an optical signal such as an infrared signal. The means for generating the coded data signal may include summing means for combining the outputs of one or more of the sectors of the multisector sensor. The sensor may be a lateral effect photodiode and the infrared signal provided to it from the beacon may be in the range of 904 nanometers. The means for generating may include a differentiator circuit for detecting signal transitions. The means for generating may still further include pulse generating means responsive to the differentiator circuit for reconstructing the coded signal from the code data signal. The means for generating an angle data signal may include means for filtering out ambient light and means for converting the output from each sector of the sensor from an analog to a digital signal. The coded signal may be a modulated optical signal such as a modulated infrared signal.
DISCLOSURE OF PREFERRED EMBODIMENT
Other objects, features and advantages will occur from the following description of a preferred embodiment and the accompanying drawings, in which:
Fig. 1 is an axonometric view of a robot incorporating the position locating system according to this invention;
Fig. 2 is a simplified exploded view with parts removed of the robot of Fig. 1;
Fig. 3 is a block diagram of the electronic modules included in the robot of Figs. 1 and 2;
Fig. 4A is a plan view of the fields of view of the ultrasonic, infrared and microwave sensors of the robot of Fig. 1;
Fig. 4B is a side elevational view taken along line 4B-4B of Fig. 4A, showing the vertical profile of the fields of view;
Fig. 5 is a block diagram of the position locating system according to this invention including beacon sensors and the beacon electronic module;
Fig. 6 is an illustration of the optical burst output of the beacons of Fig. 5;
Fig. 7 is an enlarged detail of a single burst of Fig. 6;
Fig. 8 is a more detailed block diagram of a beacon shown in Fig. 5;
Fig. 9 is a more detailed block diagram of an eye shown in Fig. 5;
Fig. 10 is a more detailed block diagram of the beacon STD-bus interface of Fig. 5;
Fig. 11 is a flow chart of the software utilized in the microprocessor of Fig. 5; and
Fig. 12 is a schematic of the photodiode of Fig. 9. There is shown in Fig. 1 a vehicle, robot 10, according to this invention including a head section 12 and a base 14 movable on three wheels, only two of which, 16, 18, are visible. The wheels are mounted in three steerable trucks, only two of which, 20 and 22, are visible. There are twenty-four ultrasonic transducers 24, such as the electrostatic transducer of the Sell type available from Polaroid, equally spaced at fifteen degrees around the periphery of base 14. Above that on reduced neck 26 there are located six passive infrared motion detectors 28, 30, 32, 34, 36, 38, only two of which, 28 and 30, are shown. These detectors are equally spaced at sixty degrees apart and may be DR-321's available from Aritech. Just above that are two conductor bands 50 and 52 which are used to engage a charging arm for recharging the robot's batteries. Head section 12 is mounted to base 14 and rotates with respect to base 14 about a central vertical axis. Head section 12 carries an RF antenna 65 for sending and receiving communication signals to a base location or guard station. Head section 12 also includes an infrared sensor 60 for sensing radiation in the near infrared region, e.g. 904 nanometers, emitted from LED 62 of beacon 64, one or more of which are mounted on the walls in the space to be protected by robot 10 to assist in locating and directing robot 10 in the area in which it is to patrol. An ultrasonic transducer 66, similar to the transducers 24 used for maneuvering and avoidance, may be provided for ranging. There is also provided a passive infrared sensor 68 similar to sensors 28-38. A microwave transmission and reception antenna 70 and a TV camera 72, which may be turned on when an apparent intrusion has occurred, are also included in head 12.
Base 14, Fig. 2, includes a main chassis 80 which carries three batteries 82 such as Globe 12V, 80AH, Gel Cells, only one of which is shown. When fully charged they will operate the robot for twelve hours or more. Trucks 20 and 22, with wheels 16 and 18 respectively, are suspended from chassis 80. Each truck as indicated at truck 20 includes a right-angle drive 84 which receives input from vertical drive shaft 86 and provides output on horizontal drive shaft 88, which operates pulley 90, which in turn through belt 92 drives pulley 94 attached to the axle of wheel 16. Vertical drive shaft 86 and counterpart drive shafts 96 and 98 are driven by their respective sprockets or pulleys 100, 102, 104, which in turn are driven by endless belt 106 powered by the pulley 107 on output shaft 108 of drive motor 110 mounted beneath chassis 80. An encoder 111 mounted with motor 110 monitors the velocity of the robot. An idler wheel 112 is provided to maintain proper tension on belt 106. Three additional shafts, only one of which, 99, is shown, concentric with shafts 86, 96 and 98, respectively, are driven by a second set of pulleys or sprockets 120, 122, 124 engaged with drive belt 126 powered by sprocket 128 driven by steering motor 130 mounted beneath chassis 80. Idler pulley 131 is used to maintain tension on belt 126. An encoder 132 is associated with steering motor 130 to provide outputs indicative of the steering position. The steering motor shaft is connected through pulley 128 to extension shaft 134, the top of which is provided with a flange 136 with a plurality of mounting holes 138. Electronic chassis 140 is mounted by means of screws 142 on three shorter standoffs 144. Three holes 146 in electronic chassis 140 accommodate the pass-through of longer standoffs 148, which mount neck 26 by means of screws 150. Electronic chassis 140 contains all of the electronic circuit boards and components, such as indicated at items 152, that are contained in the base 14, including the beacon module described infra.
When electronic chassis 140 and neck 26 are mounted on their respective standoffs, extension shaft 134 and flange 136 and the associated structure are accommodated by the central hole 160 in electronic chassis 140 and the opening in neck 26 so that the head plate 170 may be mounted by means of screws 172 to threaded holes 138 in flange 136. In this way the entire head rotates in synchronism with the trucks and wheels as they are steered by steering motor 130. In addition to the primary microwave sensor 70 there are three additional microwave sensors 190, 330, 332, only one of which, 190, is visible, spaced at ninety degrees about head plate 170 and mounted in housings 192, 194, and 196. Housing 194, which faces directly to the back of the head as opposed to primary microwave sensor 70 which faces front, also contains a second passive infrared sensor 334, not visible, which is the same as passive infrared sensor 68. Cover 200 protects the electronics on head plate 170. All of the electrical interconnections between head 12 and base 14 are made through slip rings contained in slip ring unit 202 mounted about extension shaft 134 in base 14.
Head 12, Fig. 3, includes three electronic portions: beacon module 210, head ultrasonic module 212, and intrusion detection module 214. Beacon module 210 responds to the head IR sensor 60 to determine what angle the beacon 64 is at with respect to the robot. That angle is fed on bus 216 through the slip ring unit 202 to the main CPU 218. Head ultrasonic module 212 responds to ultrasonic transducer 66 to provide ranging information on bus 216 to CPU 218. Intrusion detection module 214 responds to the four microwave sensors 70, 190, 330, 332, and the two IR sensors 68, 334 to provide indications of as yet unconfirmed intrusion events. These events are processed by the alarm confirmation unit 220 in CPU 218 to determine whether a true confirmed intrusion has occurred. In the body section 14, there are included status module 222, mobile module 224, body ultrasonics module 226, and CPU 218. Status module 222 responds to the six infrared sensors 28-38 to provide an indication of an intrusion. Status module 222 may also monitor fire and smoke detectors, diagnostic sensors throughout the robot, as well as chemical and odor detectors and other similar sensors. Mobile module 224 operates and monitors the action of drive motor 110 and steering motor 130. The twenty-four ultrasonic transducers 24 provide an input to the body ultrasonic module 226, which guides the movement and obstacle avoidance procedures for the robot. Finally, body 14 contains CPU 218, which in addition to the alarm confirmation unit 220 also interconnects with a floppy disk controller, two-channel serial I/O boards, and a reset board which receives inputs from a pushbutton reset and CPU 218 and outputs ultrasonic resets, motor resets, status resets, beacon resets, I/O module resets and head ultrasonic resets. CPU 218 also receives inputs from RF antenna 65 through RF circuit 240.
A top plan view of the fields of view of the various sensors and transducers is shown in Fig. 4A. The twenty-four ultrasonic transducers 24 have a complete 360° field of view 300. The six infrared sensors 28, 30, 32, 34, 36, 38, on body 14 provide six triangular fields of view 302, 304, 306, 308, 310 and 312. The two infrared sensors 68 and 334 on head 12 provide the narrower fields of view 314 and 316, and the four microwave transducers 70, 190, 330, 332 provide the four fields of view 318, 320, 322 and 324. The vertical profile of these fields is depicted in Fig. 4B. The field of view of the microwave transducers extends approximately one hundred fifty feet. That of the infrared sensors in the head extends about thirty feet, that of the infrared sensors in the body about five feet, and that of the ultrasonics in the body about twenty-five feet.
The position locating system 350, Fig. 5, of this invention includes one or more beacon transmitters 64, 64a, 64b, each having an infrared source 62, 62a, 62b. Also included is an infrared sensor 60 sensitive to the infrared radiation emitted by source 62, and associated with sensor 60 is an eye circuit 352 whose output is provided on bus 354. Bus 354 interconnects with beacon STD-bus interface 356 in beacon module 210. Interface 356 communicates with microprocessor 358 over STD bus 360. Microprocessor 358 may be a Z80 and it communicates directly with CPU 218, which may be a 68000. Beacon transmitter 64 provides an optical burst 362 of coded signals every 15.6 milliseconds. Each burst, as shown in greater detail in Fig. 7, has a total burst time of 244 microseconds which defines an eight-bit word, each bit being 30.5 microseconds. The first bit is a start bit; the next seven bits are code bits and represent 128 different possible codes. Each code can uniquely identify a single beacon, so that with this simple arrangement one hundred twenty-eight different beacons can be uniquely identified. When the infrared source is on, the signal is high and is considered a logic one. When the infrared source, which may be a light-emitting diode or LED, is off, then the signal is low and is considered a logic zero. The signals shown in Figs. 6 and 7 are generated in beacon transmitter 64 by an oscillator 364, Fig. 8, which runs continuously at 32.768 kHz. Its output is delivered directly to a register in code generator 366. Its output is also delivered to a counter 368, modulo 512, which divides the 32.768 kHz signal to provide the time periods shown in Figs. 6 and 7. That is, every 512 pulses, or every 15.6 ms, a burst of eight bits occurs. The eight bits are set to one or zero to produce the unique code for a particular beacon by the setting of the code select keys 370. When one of the keys 370 is toggled to ground, the associated stage of the register in 366 is grounded, thereby placing a logic one in that bit position. Switches that are toggled to a high level voltage produce a logic zero in the associated stage. The patterns of ones and zeros modulate the infrared radiation produced by LED 62 so that a coded signal is provided which uniquely defines the particular beacon.
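To make the burst timing concrete, the following is a minimal Python sketch, not part of the patent, of the eight-bit frame described above. The 32.768 kHz bit clock, the modulo-512 burst period, and the start-bit-plus-seven-code-bits layout come from the description; the function name, the MSB-first bit order, and the example code value are assumptions.

```python
# Illustrative model of the beacon burst format described above: one start bit
# followed by a 7-bit beacon code, ~30.5 us per bit, one burst every ~15.6 ms.

BIT_PERIOD_US = 1e6 / 32768              # one cycle of the 32.768 kHz oscillator, ~30.5 us
BURST_PERIOD_US = 512 * BIT_PERIOD_US    # modulo-512 counter -> ~15.6 ms between bursts

def beacon_burst_bits(beacon_code: int) -> list[int]:
    """Return the eight bits of one optical burst for a 7-bit beacon code (0-127)."""
    if not 0 <= beacon_code <= 127:
        raise ValueError("beacon code must fit in seven bits")
    code_bits = [(beacon_code >> i) & 1 for i in range(6, -1, -1)]  # MSB first (assumed order)
    return [1] + code_bits                                          # start bit, then code bits

if __name__ == "__main__":
    bits = beacon_burst_bits(0x5A)  # hypothetical beacon code
    print(bits,
          f"burst length = {len(bits) * BIT_PERIOD_US:.1f} us",
          f"burst period = {BURST_PERIOD_US / 1000:.1f} ms")
```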
Sensor 60 in eye circuit 352, Fig. 9, is a multisector sensor such as a dual-axis lateral effect photodiode. It provides four separate outputs, each indicative of the infrared radiation incident on its particular sector. By analyzing the relative values of the radiation falling on the different sectors, a determination can be made as to the angle of the sensor to the emitting beacon. Each of the four sector outputs from photodiode 60 is fed to a different channel 372, 374, 376, 378. Each channel includes an amplifier 380, high-pass filters 382, voltage amplifiers 384, and sample and hold circuits 386. High-pass filters 382 pass the coded signal from beacon 64 but block 60-cycle and 120-cycle signals introduced by ambient light conditions. Periodically, on command from microprocessor 358, a signal on sample and hold line 388 causes sample and hold circuits 386 to sample and hold the signal in each channel. Those signals are then multiplexed by analog multiplexer 392 as directed by a command from microprocessor 358 on line 390. The signal from each channel is fed directly to the gain control of amplifier 394. Finally, the output from each channel is fed to A/D converter 398, where it stops unless a control signal on line 400 from microprocessor 358 requests the angle data signal on line 402. Microprocessor 358 also provides a select and enable signal on line 404 to A/D converter 398 to indicate the particular eye circuit 352, 352a, 352b or 352c which is currently being interrogated.
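The patent describes this per-channel processing purely as analog hardware; the sketch below is only a rough software analogue under assumed names and sampling, showing the same two roles: removing the slowly varying ambient component (the job of high-pass filters 382) and latching the burst amplitude of each sector (the job of sample and hold circuits 386).

```python
from statistics import mean

def channel_burst_amplitude(samples: list[float], baseline_window: int = 32) -> float:
    """Rough software analogue of one eye channel: estimate the slowly varying
    ambient level from the leading samples, then latch the peak of the burst."""
    baseline = mean(samples[:baseline_window])            # ambient-light estimate
    return max(s - baseline for s in samples[baseline_window:])

def read_eye(sector_samples: dict[str, list[float]]) -> dict[str, float]:
    """Return one held amplitude per sector (keys such as 'A', 'B', 'C', 'D')."""
    return {name: channel_burst_amplitude(s) for name, s in sector_samples.items()}
```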
Simultaneously with this, one or more of the outputs from photodiode 60, after passing through amplifiers 380, are combined in an AC summer 406 in order to maximize the signal which will be used to detect the identifying code. From summer circuit 406 the signal is passed to clipper circuit 408, which limits the output independent of the input amplitude. At this point the signal is constituted by one or more coded pulses riding on an envelope of sixty or one hundred twenty cycle noise. Differentiator circuit 414 is therefore used to detect only the transitions of the pulses; thus, for every positive-going transition a positive spike appears at the output of differentiator 414 and for every negative-going transition a negative spike occurs at the output of differentiator 414. The positive-going spikes pass through amplifier 416 and set flip-flop 418 to define the beginning of a pulse. Negative-going spikes passing through amplifier 420 reset flip-flop 418 and define the end of the pulse. In this way the pulses of the received coded signal are reconstituted one at a time to construct the code data signal on line 422.
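A software analogue can illustrate this transition-based reconstruction; the threshold value and function name below are assumptions, and only the set-on-rising-edge, reset-on-falling-edge behavior of differentiator 414 and flip-flop 418 is taken from the description.

```python
def reconstruct_pulses(samples: list[float], threshold: float = 0.5) -> list[int]:
    """Software analogue of differentiator 414 and flip-flop 418: a positive-going
    transition sets the output high, a negative-going transition resets it low,
    so only edges (not the absolute level) define the reconstructed pulse train."""
    out, state, prev = [], 0, samples[0]
    for s in samples[1:]:
        diff = s - prev                  # discrete differentiation
        if diff > threshold:             # positive spike -> start of a pulse
            state = 1
        elif diff < -threshold:          # negative spike -> end of the pulse
            state = 0
        out.append(state)
        prev = s
    return out
```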
The angle data signal on line 402, Fig. 10, is fed directly through MUX 424 in beacon STD-bus interface 356 to STD-bus 360. The code data signal is fed from MUX 424 to code verifier circuit 426. After it is verified it is submitted to a converter 428 where it is changed from a serial signal to a parallel signal and then provided to STD-bus 360. Code verifier circuit 426 may utilize any of a number of techniques for verifying the authenticity of an incoming code. For example, the incoming signal may be sampled at fixed times following a start pulse when pulse transitions would normally be expected in a valid signal. If the transitions occur within narrow windows at the expected times, they are treated as valid code; otherwise they are rejected. The code status is provided on line 430 to STD-bus 360.
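The windowed-transition check described for code verifier circuit 426 can be sketched as follows; the 30.5 microsecond bit period comes from the description, while the window width and the edge-timestamp representation are assumptions.

```python
def verify_code(edges_us: list[float], bit_period_us: float = 30.5,
                window_us: float = 3.0) -> bool:
    """Accept a burst only if every signal transition (timestamps in microseconds,
    measured from the start-pulse edge) falls within a narrow window around an
    expected bit boundary, as code verifier 426 is described as doing."""
    for t in edges_us:
        nearest_boundary = round(t / bit_period_us) * bit_period_us
        if abs(t - nearest_boundary) > window_us:
            return False                 # transition outside its window: reject
    return True

# Transitions at clean multiples of the bit period pass; a stray mid-bit edge fails.
assert verify_code([30.5, 61.0, 122.0])
assert not verify_code([30.5, 45.0, 122.0])
```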
Under software control, Fig. 11, operation may begin with a signal from CPU 218 in step 440 with the command "Get Eye Data". When microprocessor 358 receives that signal it selects a particular eye in step 442. The A/D converter is then commanded to start the conversion in step 446 and the code data is obtained on line 422 in step 448. In step 450, if the code data is bad the cycle starts again with the beginning of a new conversion in step 446. If the code is good then the angle information is used: step 452 provides the angle data, and step 454 provides the azimuth angle, the altitude angle and the code to microprocessor 358. Here the angle data is converted to the azimuth angle and the altitude angle, combined with the code, and directed to CPU 218. The azimuth angle needs no further processing. The altitude angle and code are delivered to CPU 218, which then retrieves the height H of the identified beacon in step 456. That height is used to calculate the distance D to the beacon by triangulation, e.g., dividing the height by the tangent of the altitude angle in step 458. Then the distance and direction of the robot relative to the beacon are output in step 460.
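The triangulation of step 458 reduces to a one-line formula, distance = height / tan(altitude angle). The sketch below illustrates it; the table of beacon heights is a hypothetical stand-in for the lookup of step 456, and the units and names are assumptions.

```python
import math

# Hypothetical table of beacon mounting heights, indexed by the 7-bit beacon code.
BEACON_HEIGHTS_FT = {0x12: 8.0, 0x5A: 10.0}

def beacon_distance_ft(code: int, altitude_angle_rad: float) -> float:
    """Step 458: distance to the identified beacon from its known height and
    the measured altitude angle, D = H / tan(altitude)."""
    height = BEACON_HEIGHTS_FT[code]
    return height / math.tan(altitude_angle_rad)

if __name__ == "__main__":
    print(f"{beacon_distance_ft(0x5A, math.radians(30)):.1f} ft")  # ~17.3 ft
```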
The calculation in step 454, Fig. 11, of the azimuth angle and the altitude angle from the angle data signal is accomplished by determining the X position and the Y position from the dual axis lateral effect photodiode of Fig. 9 shown in more detail in Fig. 12. The X position is calculated according to the expression:
X position = (A - C) / (A + C)
and the Y position by the expression:
Y position = (B - D) / (B + D)
The division by A + C and B + D respectively normalizes the signal to reduce its dependence on the incident light level. The angles are then determined by the expression:
Altitude Angle = arctan [ ((B - D) / (B + D)) × K ]
Azimuthal Angle = arctan [ ((A - C) / (A + C)) × K ]
where K is a constant dependent on the size of the detector and the focal length of the light-gathering lens, if one is used:
K = D / Fl
where D is the diameter of the detector and Fl is the focal length of the lens. Although specific features of the invention are shown in some drawings and not others, this is for convenience only as each feature may be combined with any or all of the other features in accordance with the invention.
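Putting the expressions above together, here is a minimal Python sketch of the angle computation; the sector values and the constant K are simply passed in, and nothing beyond the normalization and arctangent formulas comes from the patent.

```python
import math

def sensor_angles(a: float, b: float, c: float, d: float, k: float) -> tuple[float, float]:
    """Azimuthal and altitude angles (radians) from the four sector outputs of the
    dual-axis lateral effect photodiode, using the normalized X and Y positions."""
    x = (a - c) / (a + c)                 # normalized X position
    y = (b - d) / (b + d)                 # normalized Y position
    azimuth = math.atan(x * k)
    altitude = math.atan(y * k)
    return azimuth, altitude

# Example with an assumed K = detector diameter / lens focal length = 0.5:
az, alt = sensor_angles(a=1.2, b=0.9, c=0.8, d=1.1, k=0.5)
print(math.degrees(az), math.degrees(alt))
```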
Other embodiments will occur to those skilled in the art and are within the following claims: What is claimed is:

Claims

1. A position locating system for a vehicle comprising: a plurality of remote beacons, each beacon producing a coded signal which uniquely identifies that beacon; a multisector sensor on a vehicle for sensing the signal emitted by a beacon; means, responsive to said sensor, for generating a code data signal representative of said coded signal; and means responsive to said sensor sectors for generating an angle data signal representative of the angle from said sensor to said beacon in at least one of the azimuthal and altitude dimensions.
2. The position locating system of claim 1 in which said angle is the altitude angle and further including means responsive to said altitude angle for calculating the distance from the sensor to the beacon.
3. The position locating system of claim 1 in which said angle is the azimuthal angle and further including means responsive to said azimuthal angle for calculating the direction from the sensor to the beacon.
4. The position locating system of claim 1 in which each said beacon is self-contained and independent of the other beacons.
5. The position locating system of claim 1 in which said coded signal is an optical signal.
6. The position locating system of claim 1 in which said coded signal is an infrared signal.
7. The position locating system of claim 1 in which said means for generating a code data signal includes summing means for combining the outputs of all said sectors of said multisector sensor.
8. The position locating system of claim 1 in which said sensor is a lateral effect photodiode.
9. The position locating system of claim 1 in which said infrared signal is approximately 904 nanometers.
10. The position locating system of claim 1 in which said means for generating said code data signal includes a differentiator circuit for detecting signal transitions.
11. The position locating system of claim 10 in which said means for generating said code data signal includes pulse generating means responsive to said differentiator circuit for reconstructing said coded signal from said code data signal.
12. The position locating system of claim 5 in which said means for generating an angle data signal includes means for filtering out ambient light.
13. The position locating system of claim 1 in which said means for generating an angle data signal includes means for converting the output from each sector of said sensor from an analog to a digital signal.
14. The position locating system of claim 1 in which said coded signal is a modulated optical signal.
15. A position locating system for a vehicle comprising: a multisector sensor on a vehicle for sensing the signal emitted by a beacon; means, responsive to said sensor, for generating a code data signal representative of said signal; and means responsive to said sensor sectors for generating an angle data signal representative of the angle from said sensor to said beacon in at least one of the azimuthal and altitude dimensions.
16. The position locating system of claim 15 further including a plurality of remote beacons, each beacon producing a coded signal which uniquely identifies that beacon.
17. The position locating system of claim 15 in which said angle is the altitude angle and further including means responsive to said altitude angle for calculating the distance from the sensor to the beacon.
18. The position locating system of claim 15 in which said angle is the azimuthal angle and further including means responsive to said azimuthal angle for calculating the direction from the sensor to the beacon.
PCT/US1987/001146 1986-05-16 1987-05-14 Position locating system for a vehicle WO1987007009A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US06/864,031 US4815840A (en) 1986-05-16 1986-05-16 Position locating system for a vehicle
US864,031 1992-04-06

Publications (1)

Publication Number Publication Date
WO1987007009A1 true WO1987007009A1 (en) 1987-11-19

Family

ID=25342361

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1987/001146 WO1987007009A1 (en) 1986-05-16 1987-05-14 Position locating system for a vehicle

Country Status (3)

Country Link
US (1) US4815840A (en)
AU (1) AU7581487A (en)
WO (1) WO1987007009A1 (en)

Families Citing this family (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4996468A (en) * 1987-09-28 1991-02-26 Tennant Company Automated guided vehicle
GB2216358A (en) * 1988-02-26 1989-10-04 Siemens Ag Tracking moving object
US5014206A (en) * 1988-08-22 1991-05-07 Facilitech International Incorporated Tracking system
FR2648071B1 (en) * 1989-06-07 1995-05-19 Onet SELF-CONTAINED METHOD AND APPARATUS FOR AUTOMATIC FLOOR CLEANING BY EXECUTING PROGRAMMED MISSIONS
US4967064A (en) * 1989-06-30 1990-10-30 Tennant Company Method and apparatus for a target determining apparatus having increased range
US5052799A (en) * 1989-07-17 1991-10-01 Thurman Sasser Object orienting systems and systems and processes relating thereto
JPH04227507A (en) * 1990-07-02 1992-08-17 Nec Corp Method for forming and keeping map for moving robot
US5202742A (en) * 1990-10-03 1993-04-13 Aisin Seiki Kabushiki Kaisha Laser radar for a vehicle lateral guidance system
EP0479271B1 (en) 1990-10-03 1998-09-09 Aisin Seiki Kabushiki Kaisha Automatic lateral guidance control system
US5390118A (en) 1990-10-03 1995-02-14 Aisin Seiki Kabushiki Kaisha Automatic lateral guidance control system
US5318254A (en) * 1991-06-28 1994-06-07 Conceptual Solutions, Inc. Aircraft maintenance robot
US5245177A (en) * 1991-10-24 1993-09-14 Schiller Norman H Electro-optical system for detecting the presence of an object within a predetermined detection system
JP2589901B2 (en) * 1991-11-26 1997-03-12 インターナショナル・ビジネス・マシーンズ・コーポレイション Mobile machines with active sensors
US5355222A (en) * 1992-05-15 1994-10-11 Precision Tracking Fm, Inc. Optical receiver for area location system
JP3026276B2 (en) * 1992-05-22 2000-03-27 本田技研工業株式会社 Emergency stop control device for legged mobile robot
US5279672A (en) * 1992-06-29 1994-01-18 Windsor Industries, Inc. Automatic controlled cleaning machine
SE501477C2 (en) * 1993-07-22 1995-02-27 Apogeum Ab Procedure and device for controlling AGV for choice of road when switching between separate, non-fixed work areas
KR0161031B1 (en) * 1993-09-09 1998-12-15 김광호 Position error correction device of robot
DE19611209C2 (en) * 1996-03-21 1999-05-12 Industrieanlagen Betriebsges Method for determining the position of objects, namely mobile exercise participants in a combat exercise
KR980010094A (en) * 1996-07-08 1998-04-30 김광호 Gas pressure state information detection and display device
USD420972S (en) * 1998-08-14 2000-02-22 The Stanley Works Rotating laser
US6834276B1 (en) * 1999-02-25 2004-12-21 Integrated Data Control, Inc. Database system and method for data acquisition and perusal
US7533435B2 (en) 2003-05-14 2009-05-19 Karcher North America, Inc. Floor treatment apparatus
KR100520078B1 (en) * 2003-08-04 2005-10-12 삼성전자주식회사 robot system and beacon
US9534899B2 (en) 2005-03-25 2017-01-03 Irobot Corporation Re-localization of a robot for slam
US9002511B1 (en) 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
US20070150103A1 (en) * 2005-12-08 2007-06-28 Electronics And Telecommunications Research Institute Positioning method and system for indoor moving robot
US20090082879A1 (en) 2007-09-20 2009-03-26 Evolution Robotics Transferable intelligent control device
DE102009016082A1 (en) * 2008-04-28 2009-10-29 Stefan Leske Device for safely transferring personnel or material from a ship-shaped object to a relatively moved object and ship with the device
US8966693B2 (en) 2009-08-05 2015-03-03 Karcher N. America, Inc. Method and apparatus for extended use of cleaning fluid in a floor cleaning machine
US9310806B2 (en) * 2010-01-06 2016-04-12 Irobot Corporation System for localization and obstacle detection using a common receiver
US9288270B1 (en) 2011-04-22 2016-03-15 Angel A. Penilla Systems for learning user preferences and generating recommendations to make settings at connected vehicles and interfacing with cloud systems
US10572123B2 (en) 2011-04-22 2020-02-25 Emerging Automotive, Llc Vehicle passenger controls via mobile devices
US10289288B2 (en) 2011-04-22 2019-05-14 Emerging Automotive, Llc Vehicle systems for providing access to vehicle controls, functions, environment and applications to guests/passengers via mobile devices
US9536197B1 (en) 2011-04-22 2017-01-03 Angel A. Penilla Methods and systems for processing data streams from data producing objects of vehicle and home entities and generating recommendations and settings
US9215274B2 (en) 2011-04-22 2015-12-15 Angel A. Penilla Methods and systems for generating recommendations to make settings at vehicles via cloud systems
US9365188B1 (en) 2011-04-22 2016-06-14 Angel A. Penilla Methods and systems for using cloud services to assign e-keys to access vehicles
US11270699B2 (en) 2011-04-22 2022-03-08 Emerging Automotive, Llc Methods and vehicles for capturing emotion of a human driver and customizing vehicle response
US9230440B1 (en) 2011-04-22 2016-01-05 Angel A. Penilla Methods and systems for locating public parking and receiving security ratings for parking locations and generating notifications to vehicle user accounts regarding alerts and cloud access to security information
US9818088B2 (en) 2011-04-22 2017-11-14 Emerging Automotive, Llc Vehicles and cloud systems for providing recommendations to vehicle users to handle alerts associated with the vehicle
US9371007B1 (en) 2011-04-22 2016-06-21 Angel A. Penilla Methods and systems for automatic electric vehicle identification and charging via wireless charging pads
US9229905B1 (en) 2011-04-22 2016-01-05 Angel A. Penilla Methods and systems for defining vehicle user profiles and managing user profiles via cloud systems and applying learned settings to user profiles
US9104537B1 (en) 2011-04-22 2015-08-11 Angel A. Penilla Methods and systems for generating setting recommendation to user accounts for registered vehicles via cloud systems and remotely applying settings
US10217160B2 (en) * 2012-04-22 2019-02-26 Emerging Automotive, Llc Methods and systems for processing charge availability and route paths for obtaining charge for electric vehicles
US9581997B1 (en) 2011-04-22 2017-02-28 Angel A. Penilla Method and system for cloud-based communication for automatic driverless movement
US9139091B1 (en) 2011-04-22 2015-09-22 Angel A. Penilla Methods and systems for setting and/or assigning advisor accounts to entities for specific vehicle aspects and cloud management of advisor accounts
US9180783B1 (en) 2011-04-22 2015-11-10 Penilla Angel A Methods and systems for electric vehicle (EV) charge location color-coded charge state indicators, cloud applications and user notifications
US11294551B2 (en) 2011-04-22 2022-04-05 Emerging Automotive, Llc Vehicle passenger controls via mobile devices
US9493130B2 (en) 2011-04-22 2016-11-15 Angel A. Penilla Methods and systems for communicating content to connected vehicle users based detected tone/mood in voice input
US9171268B1 (en) 2011-04-22 2015-10-27 Angel A. Penilla Methods and systems for setting and transferring user profiles to vehicles and temporary sharing of user profiles to shared-use vehicles
US9809196B1 (en) 2011-04-22 2017-11-07 Emerging Automotive, Llc Methods and systems for vehicle security and remote access and safety control interfaces and notifications
US11370313B2 (en) 2011-04-25 2022-06-28 Emerging Automotive, Llc Methods and systems for electric vehicle (EV) charge units and systems for processing connections to charge units
US9648107B1 (en) 2011-04-22 2017-05-09 Angel A. Penilla Methods and cloud systems for using connected object state data for informing and alerting connected vehicle drivers of state changes
US9123035B2 (en) 2011-04-22 2015-09-01 Angel A. Penilla Electric vehicle (EV) range extending charge systems, distributed networks of charge kiosks, and charge locating mobile apps
US10286919B2 (en) 2011-04-22 2019-05-14 Emerging Automotive, Llc Valet mode for restricted operation of a vehicle and cloud access of a history of use made during valet mode use
US11132650B2 (en) 2011-04-22 2021-09-28 Emerging Automotive, Llc Communication APIs for remote monitoring and control of vehicle systems
US9285944B1 (en) 2011-04-22 2016-03-15 Angel A. Penilla Methods and systems for defining custom vehicle user interface configurations and cloud services for managing applications for the user interface and learned setting functions
US9697503B1 (en) 2011-04-22 2017-07-04 Angel A. Penilla Methods and systems for providing recommendations to vehicle users to handle alerts associated with the vehicle and a bidding market place for handling alerts/service of the vehicle
US11203355B2 (en) 2011-04-22 2021-12-21 Emerging Automotive, Llc Vehicle mode for restricted operation and cloud data monitoring
US9348492B1 (en) 2011-04-22 2016-05-24 Angel A. Penilla Methods and systems for providing access to specific vehicle controls, functions, environment and applications to guests/passengers via personal mobile devices
US9346365B1 (en) 2011-04-22 2016-05-24 Angel A. Penilla Methods and systems for electric vehicle (EV) charging, charging unit (CU) interfaces, auxiliary batteries, and remote access and user notifications
US9189900B1 (en) 2011-04-22 2015-11-17 Angel A. Penilla Methods and systems for assigning e-keys to users to access and drive vehicles
US10824330B2 (en) 2011-04-22 2020-11-03 Emerging Automotive, Llc Methods and systems for vehicle display data integration with mobile device data
US9963145B2 (en) 2012-04-22 2018-05-08 Emerging Automotive, Llc Connected vehicle communication with processing alerts related to traffic lights and cloud systems
US8798840B2 (en) 2011-09-30 2014-08-05 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9223312B2 (en) 2012-06-08 2015-12-29 Irobot Corporation Carpet drift estimation using differential sensors or visual measurements
US9134181B2 (en) * 2013-04-18 2015-09-15 Volution Inc. Flame detector
CN105522557A (en) * 2016-01-19 2016-04-27 中国人民解放军国防科学技术大学 Intelligent security service robot platform
USD907868S1 (en) 2019-01-24 2021-01-12 Karcher North America, Inc. Floor cleaner

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4119900A (en) * 1973-12-21 1978-10-10 Ito Patent-Ag Method and system for the automatic orientation and control of a robot
US4274503A (en) * 1979-09-24 1981-06-23 Charles Mackintosh Power operated wheelchair
US4500970A (en) * 1982-01-15 1985-02-19 Richard A. Boulais Robot vehicle guidance system including checkpoint realignment system
US4463821A (en) * 1982-03-01 1984-08-07 Robot Crabtor International Drivable, steerable platform for lawnmower and the like
US4573548A (en) * 1983-07-23 1986-03-04 Cybermation, Inc. Mobile base for robots and the like
JPS60249076A (en) * 1984-05-25 1985-12-09 Casio Comput Co Ltd Detection of obstruction
US4638445A (en) * 1984-06-08 1987-01-20 Mattaboni Paul J Autonomous mobile robot
JPH07115121B2 (en) * 1992-08-21 1995-12-13 島村工業株式会社 Coating agent coating device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4081669A (en) * 1973-10-26 1978-03-28 Klingman Iii Edwin E Recognition system for class II robots
US4264161A (en) * 1977-10-12 1981-04-28 Canon Kabushiki Kaisha Motion detecting device in exposure control system for optical instruments
US4218616A (en) * 1978-03-27 1980-08-19 Litton Systems Inc. Automated warehouse vehicle position determining system
US4328545A (en) * 1978-08-01 1982-05-04 Imperial Chemical Industries Limited Driverless vehicle autoguide by light signals and two directional detectors
US4445029A (en) * 1980-06-16 1984-04-24 Seiko Koki Kabushiki Kaisha Distance detector using a photopotentiometer and a continuous detecting system
US4441810A (en) * 1980-07-15 1984-04-10 Konishiroku Photo Industry Co., Ltd. Range finder
US4630208A (en) * 1983-01-06 1986-12-16 Guy Le Pechon Identification of beacons

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2637691A1 (en) * 1988-10-11 1990-04-13 Micromaine Asi Device for locating moving objects
US5227973A (en) * 1991-02-26 1993-07-13 Siemens Corporate Research, Inc. Control arbitration system for a mobile robot vehicle
US5451135A (en) * 1993-04-02 1995-09-19 Carnegie Mellon University Collapsible mobile vehicle
US5363935A (en) * 1993-05-14 1994-11-15 Carnegie Mellon University Reconfigurable mobile vehicle with magnetic tracks
US5435405A (en) * 1993-05-14 1995-07-25 Carnegie Mellon University Reconfigurable mobile vehicle with magnetic tracks
FR2722443A1 (en) * 1994-07-13 1996-01-19 Centre Ind SURFACE TREATMENT DEVICE
WO1996002365A1 (en) * 1994-07-13 1996-02-01 Centre Industrie Surface treatment device
EP3446952A4 (en) * 2016-04-21 2020-01-08 Tianqi Sun General-purpose six-legged walking robot, and main structure thereof

Also Published As

Publication number Publication date
US4815840A (en) 1989-03-28
AU7581487A (en) 1987-12-01

Similar Documents

Publication Publication Date Title
US4815840A (en) Position locating system for a vehicle
US4772875A (en) Intrusion detection system
US4815008A (en) Orientation adjustment system and robot using same
US4710020A (en) Beacon proximity detection system for a vehicle
US4751658A (en) Obstacle avoidance system
US4829442A (en) Beacon navigation system and method for guiding a vehicle
US4701893A (en) Ultrasonic ranging system
US4821192A (en) Node map system and method for vehicle
US4780719A (en) Method of, and apparatus for, area and air space surveillance
US5179421A (en) Remote tracking system particularly for moving picture cameras and method
EP0259445B1 (en) Object detection method and apparatus employing electro-optics
US4736116A (en) Power-up sequencing apparatus
WO1991019165A1 (en) Remote tracking system particularly for moving picture cameras and method
EP0226322A3 (en) Proximity detector
EP0271494A1 (en) Navigation system.
US5500525A (en) Area surveying apparatus and communication system
CA2030793A1 (en) Collision avoidance system for vehicles
JPH08166822A (en) User tracking type moving robot device and sensing method
GB2053516A (en) Programmable Vehicle
JP2005214913A (en) Crime prevention sensor system with optical axis regulating function
CN212301861U (en) Multi-line laser radar device
EP0671715A1 (en) Perimeter securing apparatus
JPS63172985A (en) Detecting method for lost article and thing left behind
CN212749255U (en) Small-size laser obstacle-avoiding radar for unmanned aerial vehicle
JPS61253482A (en) Direction detector

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE