WO1987002797A1 - Dead reckoning optoelectronic intelligent docking system - Google Patents

Dead reckoning optoelectronic intelligent docking system Download PDF

Info

Publication number
WO1987002797A1
WO1987002797A1 PCT/US1986/002249 US8602249W WO8702797A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
effective
range
image
light
Prior art date
Application number
PCT/US1986/002249
Other languages
French (fr)
Inventor
Steven M. Ward
Original Assignee
Energy Optics, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Energy Optics, Inc. filed Critical Energy Optics, Inc.
Publication of WO1987002797A1 publication Critical patent/WO1987002797A1/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64GCOSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00Cosmonautic vehicles
    • B64G1/22Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/64Systems for coupling or separating cosmonautic vehicles or parts thereof, e.g. docking arrangements
    • B64G1/646Docking or rendezvous systems

Definitions

  • the present invention relates generally to a modular docking system for use in host space vehicles which must perform satellite capture maneuvers and the like, where automated docking is required.
  • the present invention is a microprocessor based, optoelectronic sensor system interacting with target mounted, passive optical aids for the purpose of providing target range, position and attitude information necessary for the host guidance system to execute a hard docking.
  • TV systems, which must rely on solar target illumination, are fraught with difficulties in dealing with high contrast shadows, low contrast target geometries and/or operation in the earth's shadow.
  • TV systems must process millions of bytes of video pixel data to deduce the necessary target information, resulting in a significant burden in computer size, memory capacity and processing time.
  • conventional TV optics must be adjusted for exposure and focus, requiring moving mechanical devices which limit long term operational reliability.
  • laser scanning systems generally involve large, high power laser sources and require moving mechanisms (galvanometer mirrors and the like) to steer the narrow laser beams over the sensor field.
  • Such systems are generally fragile, have substantial mass, consume large amounts of power and place a severe limitation on the overall system reliability and operating lifetime.
  • the present invention is a microcomputer based system for use on spacecraft which must automatically maneuver to hard docking with targets equipped with passive optical aids, with a reduced range capability for docking non-cooperative targets.
  • the DROID system performs target acquisition, tracking, homing, relative attitude adjustment, relative roll correction, hard docking and self diagnostics.
  • the system operates over target distances from about 1000 meters to about 20 centimeters, generating guidance control commands required to steer the host vehicle through the maneuver.
  • Salient features of the invention include autonomous operation, small size, low mass, all solid state components, target illumination at a single laser line and, most importantly, no moving parts.
  • the DROID system combines three, highly integrated sensor subsystems with a Master Central Processing Unit (CPU) to generate a target database and compute host vehicle guidance commands.
  • Each sensor is a distinct form of laser radar (or more commonly termed LIDAR) configured to provide range dependent measurement resolution required for conducting a safe hard docking maneuver.
  • the Pulsed Array LIDAR (PAL) subsystem is configured for long range operation from about 1000 meters to about 30 meters.
  • the Continuous Array LIDAR (CAL) subsystem provides more precise target sensing at ranges from about 50 meters to about 3 meters.
  • the Charge Coupled Device (CCD) Television LIDAR (CTL) subsystem is configured for extremely precise target sensing at ranges from 5 meters to about 20 centimeters.
  • the Master CPU sequences the operation of each sensor subsystem, accepting target data and computing the guidance commands necessary to maneuver the host vehicle.
  • Passive, retroreflective optical aids strategically located near the docking port of the target represent an integral factor in long range sensing.
  • the retroreflective nature of each device allows the use of very low power, wide beam semiconductor lasers for target illumination, minimizing the power burden of the system and enhancing system reliability.
  • the primary purpose of the present invention is to sense the target at relatively close ranges (under a few thousand meters) and to precisely control the terminal phase of the hard docking maneuver.
  • the Master CPU executes a spiral search maneuver and activates the PAL subsystem in an attempt to acquire the target at typical closing velocities of about 50 meters per second.
  • When the target is acquired, PAL passes target position and range data to the CPU, which generates pitch, yaw and +Z commands to center the target on the host Z axis (an imaginary line emanating from the docking probe) and to close the target. As the target range closes, -Z commands gradually slow the closing velocity to about 1.0 meter per second.
  • At a range of 50 meters, the CAL subsystem is activated and tested for proper operation. At a range of 30 meters, operation is handed over to CAL, which senses target position, range and attitude by precisely ranging each of three individual retroreflectors mounted in a triangular pattern on the target surface.
  • the CPU processes target data and maneuvers the host onto the target Z axis (an imaginary line emanating from the center of the docking port), orients the host at the appropriate attitude and spins the host to match the roll rate of the target. Again -Z commands serve to reduce closing velocity to about 10 centimeters per second.
  • At a range of 5 meters, the CTL subsystem is activated and tested for proper operation. Target sensing is handed over to the CTL sensor at a range of 3 meters to control the final, hard docking operation.
  • the CTL acquires a retroreflective docking image plate located near the docking port and generates data indicating target range, position and attitude.
  • the Master CPU processes CTL data to precisely maneuver the docking probe into the docking port, sense the contact, lock the probe into place and terminate the operation.
  • at any time in the mission that an active DROID system sensor fails to track the target, the CPU outputs a "bailout" command to abort the mission and to maneuver in a manner to avoid collision.
  • the primary object of the present invention is to provide a fully automatic space docking system which senses a target equipped with passive optical aids and guides a host vehicle to a desired docking port, overcoming the disadvantages, shortcomings and difficulties of prior art systems.
  • the DROID system is applicable to any other space vehicle such as the space shuttle, the orbital transfer vehicle, Mars probe and the space station, supporting all kinds of proximity operations.
  • Fig. 1 is a perspective view of the DROID System
  • Fig. 2 is a schematic illustration of the retroreflective target aids
  • Fig. 3 is a block diagram of the DROID System
  • Fig. 4 is a block diagram of the PAL subsystem
  • Fig. 5 is an illustration of the PAL and CAL emitted beam patterns and detector fields
  • Fig. 6 is a block diagram of the CAL subsystem
  • Fig. 7 is a block diagram of the CTL subsystem
  • Fig. 8 is a perspective frontal view of the docking plate.
  • Fig. 9 is a perspective view of docking plate perimeter image shapes.
  • the DROID system 10 includes an intelligent sensor package 11 strategically positioned relative to a docking probe 12, both of which are mounted on the host or chaser space vehicle 13.
  • the intelligent sensor 11 passes guidance commands by means of a data bus 19 to a host GNC computer 14.
  • the computer 14 directly controls thruster mechanisms 15 by means of control cables 20 to maneuver the spacecraft 13.
  • the DROID system 10 also includes retroreflectors 16, a retroreflective docking plate 17 and a docking port 18 located on the target vehicle 21.
  • the DROID system 10 illuminates the target vehicle 21 by coherent laser light 81 in the near IR spectrum, senses retroreflected source radiation 91 and processes the returns 91 in a manner to determine the relative angular direction, range and attitude of the target vehicle 21.
  • the target vehicle 21 orientation parameters are processed to generate host vehicle 13 guidance commands necessary to insert the docking probe 12 into the docking port 18 for a safe, hard docking.
  • the intelligent sensor 11 operates through a window 22 which is effective to pass only a narrow band of radiation 91 about the operating wavelength of the system 10.
  • retroreflective optical aids 30 are more clearly illustrated in Fig. 2.
  • the term retroreflector as used herein defines an optical reflector which returns incident light to its source, regardless of the angle of incidence. Of course, a perfect retroreflector would return the diverging laser beams directly back into the laser optics. However, typical retroreflectors have a finite divergence depending on manufacturing tolerances and the source angle subtended. Retroreflectors 16 of the type described herein exhibit full angle divergence on the order of 0.09 milliradians at long ranges, increasing to about 1.0 milliradian at a range of 50 meters (due to source angle subtense).
  • Retroreflectors 16 are arranged in the pattern of an isosceles triangle with two 1.0 meter sides and one 0.8 meter side. This arrangement allows the target vehicle 21 roll angle (not shown) to be determined without ambiguity by imaging the reflector pattern.
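  • As an illustration of this geometry, the following minimal C sketch (not from the patent; the coordinates and helper names are hypothetical) identifies the unique 0.8 meter side of the imaged triangle, takes the opposite reflector as the apex, and reads the roll angle from the orientation of the line joining the short side's midpoint to that apex:

    #include <math.h>

    typedef struct { double x, y; } pt;     /* reflector image coordinates */

    static double dist(pt a, pt b) { return hypot(a.x - b.x, a.y - b.y); }

    /* Roll angle in radians, measured from the +Y image axis to the line
     * from the midpoint of the short (0.8 m) side to the apex reflector.
     * The shortest side is unique, so the assignment is unambiguous. */
    static double roll_from_reflectors(pt p0, pt p1, pt p2)
    {
        double d01 = dist(p0, p1), d12 = dist(p1, p2), d20 = dist(p2, p0);
        pt a, b, apex;
        if (d01 <= d12 && d01 <= d20)      { a = p0; b = p1; apex = p2; }
        else if (d12 <= d01 && d12 <= d20) { a = p1; b = p2; apex = p0; }
        else                               { a = p2; b = p0; apex = p1; }
        pt mid = { (a.x + b.x) / 2.0, (a.y + b.y) / 2.0 };
        return atan2(apex.x - mid.x, apex.y - mid.y);
    }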
  • the retroreflectors 16 are used by the longer range DROID sensor subsystems 40, 50 to enhance the target signature.
  • a second optical aid 17, the docking image plate, which is used by the short range DROID sensor subsystem 60, is located near the docking port 18.
  • the docking port 18 is positioned at the target centroid, the geometrical center of the docking surface.
  • the image plate 17 comprises a square, retroreflective, white surface 33 subtending a non-retroreflective, black crosshair pattern 34.
  • the retroreflective bead surface 33 is similar to common bicycle reflective tape with a relatively wide, 10 degree, full angle divergence. Interactions between the DROID sensors 40, 50, 60 and the optical aids 30 are described in following paragraphs.
  • optical aids 30 of the type described herein serve only to enhance the target vehicle 21 signature, providing significant increases in operational range. At reduced ranges, the DROID system 10 can automatically dock a target vehicle 21 without the help of optical aids 30.
  • the sensor 11 comprises a Pulsed Array LIDAR (PAL) subsystem 40, a CW Array LIDAR (CAL) subsystem 50, a CCD TV LIDAR (CTL) subsystem 60, a Master CPU 70 and a power supply 80.
  • PAL Pulsed Array LIDAR
  • CAL CW Array LIDAR
  • CTL CCD TV LIDAR
  • the DC power supply 80 provides for a maximum power requirement of thirty watts.
  • the optical systems of all three sensors are co-aligned, operating over a field of 20° azimuth by 20° elevation.
  • the design of each LIDAR subsystem is optimized over specific operating range brackets with improving target vehicle 21 sensing resolution and decreasing sensitivity as the range closes.
  • the CPU 70 first activates the PAL subsystem 40, which employs a pulsed laser diode array 41 to illuminate the retroreflectors 16, a PIN diode detector array 42 for sensing target returns 91 and a time-of-flight pulse processor 44 for target ranging.
  • PAL provides for long range target vehicle 21 acquisition, tracking and ranging (resolution < +/- 3.0 meters) over a range bracket from 1000 meters to 30 meters.
  • the CAL subsystem 50 is activated as the target range closes, employing a CW laser diode array 51 to illuminate the retroreflectors 16, a PIN diode detector array 42 to sense target returns 91 and a tone range processor 54 for target ranging.
  • CAL provides for target vehicle 21 tracking, ranging (resolution < +/- 3.0 centimeters) and attitude sensing (resolution < +/- 3.0 degrees) over a range bracket of 50 meters to 3 meters.
  • the CTL subsystem 60 is activated in the final phase of the docking maneuver, employing both previously mentioned laser arrays 41, 51 to illuminate the docking plate 17, and a CCD solid state TV camera 61 and a video processor 64 for target tracking, ranging (resolution < +/- 1.0 centimeter) and attitude sensing (resolution < +/- 2.0 degrees) over a range bracket from 5 meters to 20 centimeters.
  • the CTL sensor 60 employs four narrow beam diode lasers 62, strategically oriented to designate bright spots (not shown) on the docking plate 17 to be processed for range and attitude sensing in close proximity.
  • the three sensor subsystems 40, 50 and 60 are controlled by a Master CPU microprocessor 70 which communicates with each sensor by means of a two way data bus 71.
  • the three sensors 40, 50 and 60 (only one of which is active at a time) sense the target aids 30 and pass target data (not shown) to the Master CPU 70.
  • the CPU 70 processes target data and performs the necessary computations to generate host vehicle 13 guidance commands 5.
  • Guidance commands 5 are transferred to the host GNC computer 14 by means of a two way data bus 19.
  • PAL 40 comprises a twenty element GaAs diode laser array 41, a twenty element silicon PIN diode detector array 42, a single element wide field-of-view (FOV) receiver 45, a time-of-flight (TOF) pulse processor 44 and a single-chip microcomputer controller 43.
  • the transmitter array 41 is configured so that each laser element 46 is independently triggerable by the microcontroller 43.
  • the microcontroller 43 fires a laser diode 46 by writing a control word on output port 47.
  • the control word comprises a digital "one" bit corresponding to the laser selected and digital "zeros" on all nineteen other lines.
  • the diode array 41 is arranged with a cylindrical lens system 49 in a manner that generates vertically oriented, elongated beam 81 cross-sections which are 1° wide and 20° high.
  • the optical system 49 directs the twenty juxtaposed and non-overlapping beams 81 of each laser diode 46 in a manner providing a contiguous field coverage of 20° by 20°. Transmitting each selected beam 81 (one at a time) and looking for target returns 91 after each such transmission allows the microcontroller 43 to locate the azimuth angle of the target vehicle 21 in the DROID system 10 optical field 90 with a resolution of +/- 1°.
  • the PAL detector array 42 is a group of twenty PIN photodiodes 83, each with an aspect ratio of 1 X 20 (including interdiode gaps), with the minor axes stacked on a single substrate (not shown). The resulting total aspect ratio is then 20 X 20 representing a square active area.
  • Orienting the diodes 83 with the major axes horizontal and adding a double convex objective lens 84 provides twenty juxtaposed and non-overlapping fields-of-view which are individually 20° wide and 1° high. Combined, the fields provide a 20° x 20° contiguous field-of-view.
  • Co-aligning the optical systems 49, 84 of the transmitter array 41 and the detector array 42 provides a 20° horizontal by 20° vertical sensor coverage and an orthogonal relation between the major axes of transmitter beam 81 cross-sections and receiver FOVs 85.
  • identifying the specific transmitter beam 81 and the receiver FOV 85 corresponding to a target return 91 allows the PAL microcontroller 43 to locate the azimuth and elevation angles of the target in the sensor field 90 to a resolution of +/- 1°.
  • the wide field receiver 45 is a single element, large area photodiode 88 arranged with an objective lens 89 and an IR filter 22 in a manner providing a conical FOV (not shown) of roughly 30° full angle, co-aligned with the optical systems 49, 84 previously described.
  • Pulsed IR radiation (905 nanometers wavelength) 91 passing the filter 22 generates a signal pulse (not shown) at the photodiode 88 which is amplified by a wideband video amplifier 93 to a level triggering a latch circuit 87.
  • PAL microcomputer firmware fires the first laser diode 46 and tests the latch 87 output state. If no return 91 is sensed, the next diode in the array 41 is triggered, and so on. Assuming a maximum target closure rate of 50 meters per second, the entire field 90 is scanned at a rate of 500 Hz, or ten field 90 scans per meter of closure.
  • the PAL subsystem 40 continues to scan the field 90 until a target return 91 is sensed on the latch 87 input line 94.
  • the PAL microcomputer 43 stores a return count of one and clears the latch 87 by toggling line 95.
  • the PAL microcomputer 43 continues the target acquisition scan until "n" consecutive returns 91 are detected, verifying that the target vehicle 21 has actually been acquired and terminating the acquisition routine.
  • the PAL microcomputer 43 sequences to the target tracking mode by activating the PIN receiver array 42.
  • Each element of the detector array 42 includes a signal amplifier 96 and a latch circuit 97 identical to that previously described for the wide field receiver 45.
  • the latch array 97 is tied to the PAL microcomputer 43 by means of a twenty line input port 98, which can be read (in parallel) by a single instruction. All latch states are cleared simultaneously when the PAL microcomputer toggles line 95.
  • the PAL microcomputer 43 alternates between firing a laser beam 81 and reading the receiver port 98 for target returns 91. Any 905 nanometer target return 91 passes through the IR filter 22 and is focused by lens 84 onto the PIN diode array 42.
  • Pulsed energy levels exceeding receiver sensitivity result in latched digital levels at the PAL microcomputer 43 input port 98. If all input lines of the port 98 are zeros, the microprocessor 43 assumes that no target vehicle 21 is present in the transmitted beam 81 and sequences the operation to the next beam. When a transmitted beam 81 intercepts the target vehicle 21, one or more of the receiver latches 97 will go high to a digital "one" state. By monitoring the number of the beam output line 47 transmitted and identifying the number of the active receiver line 98, the PAL microcomputer 43 locates the target vehicle 21 in one of the 400, 1° by 1° field elements illustrated in Fig. 5.
  • the PAL microcomputer 43 outputs relative target azimuth and elevation angles to the Master CPU 70 by means of data bus 71.
  • the Master CPU subsequently computes and outputs the pitch, yaw and +Z commands to the host computer 14 (via data bus 19) required to point the host Z axis 100 directly at the target vehicle 21 and to close the target on a direct line.
  • As the PAL microcomputer 43 tracks the target 21, it also measures target range by pulse processing, which is well known in the prior art.
  • the TOF range processor 44 comprises a 50 megahertz oscillator 99 and high speed counter 101 effective to resolve 20 nanosecond time elements.
  • the TOF processor 44 is configured to measure the propagation time of each laser pulse 81 to the target vehicle 21 and back. The round trip propagation time is related to target range by the speed of light.
  • When the PAL microcomputer 43 outputs a control word on port 47 to fire a laser diode 46, it simultaneously outputs a "one" on line 102, which clears the counter 101 and enables the 50 MHz clock 99 input 104.
  • the counter continues to register clock strobes 104 until the leading edge of the target return 91 sets latch 87, thereby disabling the clock input 104 to the counter 101.
  • the PAL microcomputer 43 reads the counter 101 by means of a parallel input port 103.
  • the range count is then transferred to the Master CPU 70 by means of data bus 71.
  • This technique provides a range measurement resolution of about +/- 3.0 meters, which is sufficient for the long range maneuver.
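  • As a worked check of this figure (illustrative C, not part of the patent): each count of the 50 MHz clock 99 spans 20 nanoseconds of round trip time, and since range = c x time / 2, one count corresponds to about 3.0 meters, the resolution quoted above.

    #include <stdint.h>

    #define TICK_S     20e-9       /* one period of the 50 MHz clock 99 */
    #define C_M_PER_S  2.998e8     /* speed of light, meters per second */

    /* Convert the TOF counter 101 value into one-way target range. */
    static double range_from_count(uint16_t count)
    {
        return C_M_PER_S * ((double)count * TICK_S) / 2.0;  /* ~3.0 m/count */
    }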
  • the PAL microcomputer 43 continues to operate in this mode while the CPU 70 slows the rate of closure (outputting -Z commands) to about 1.0 meter per second at a range of 50 meters.
  • the CAL subsystem 50 is activated by the CPU 70 for diagnostic testing. If the CAL subsystem 50 fails to operate properly, the CPU 70 outputs a "bailout" command to the host GNC computer 14 and terminates the mission. If, however, the CAL subsystem 50 passes such diagnostic testing, it takes over the target sensing operation at a range of 30 meters, and the PAL subsystem 40 is deactivated.
  • Since the PAL subsystem 40 is controlled by a microcomputer 43, a key component lies in the microcomputer firmware that executes the above described operation.
  • the operational firmware is described in the following "pseudo code" listing, where each line represents a machine language program designed to execute the indicated function:
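  • The pseudo code listing itself does not survive in this copy of the text. As a stand-in reading aid only, the following C sketch (the port helpers and names are hypothetical, not the patented firmware) outlines the scan-and-locate cycle described above, in which the fired beam number gives target azimuth and the latched receiver line number gives elevation:

    #include <stdint.h>

    #define N_BEAMS 20

    extern void     write_port47(uint32_t word); /* one "1" bit fires a laser 46 */
    extern uint32_t read_port98(void);           /* twenty receiver latches 97   */
    extern void     toggle_line95(void);         /* clears all latches           */
    extern uint16_t read_counter101(void);       /* TOF range count              */
    extern void     report(int az, int el, uint16_t range_count);

    /* One 500 Hz field scan: fire each beam in turn, read the latch port in
     * parallel, and locate the target in one of the 400 one-degree elements. */
    void pal_scan_field(void)
    {
        for (int beam = 0; beam < N_BEAMS; beam++) {
            write_port47(1u << beam);            /* fire selected laser diode */
            uint32_t latches = read_port98();
            if (latches == 0)
                continue;                        /* no return in this beam    */
            for (int line = 0; line < N_BEAMS; line++)
                if (latches & (1u << line))
                    report(beam, line, read_counter101());
            toggle_line95();                     /* clear latches, next beam  */
        }
    }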
  • CAL 50 includes a CW diode laser array 51, the same PIN diode detector array 42 previously described, a tone range processor 54, the same wide field receiver 45 previously described and a CAL microcomputer 53.
  • the CW laser diode array 51 is optically equivalent to the transmitter array 41 of the PAL system 40, in that twenty vertically oriented beams 81 are propagated over the 20° by 20° field 90, as seen in Fig. 5.
  • the laser diodes 112 employed in CAL 50 are of the continuously emitting type rather than the pulsed type described for the PAL sensor 40. Continuous emission is required for the high precision tone range measurement technique employed in the CAL sensor 50.
  • CW diodes 112 are characterized by relatively low output power and, therefore, short operating range.
  • CAL diodes 112, when active, are driven with a CW current of about 0.1 ampere resulting in an output power of about 0.010 watt.
  • the diodes 112 are amplitude modulated by varying the drive current in any desired waveform.
  • a 3.0 megahertz oscillator 113 provides a modulation tone (not shown), which is imposed on the drive current of each diode 112 by modulation circuitry 114.
  • the CW diodes 112 are selected (one at a time) for transmission by the CAL microcomputer 53 by writing a digital word to output port 115.
  • Each diode 112 radiates continuously for the entire period that it is selected.
  • a 50 Hz field scan rate results in a minimum 1.0 millisecond ON time for each diode 112 when selected.
  • the receiver 42 operates exactly as described for the PAL sensor 40, with the exception that the lower signal strength of the CW diodes 112 reduces the maximum operating range to about 64 meters.
  • the orthogonal relation between the vertically elongated transmit beams 81 and the horizontally elongated detector FOVs 85 again allows the CAL microcomputer 53 to resolve the target relative angular direction +/- 1° in both azimuth and elevation.
  • the CAL sensor 50 must resolve multiple targets, because the subtense of the three retroreflector 16 pattern exceeds the 1° sensor field elements (see Fig. 5). This is accomplished in a straightforward manner by the procedure previously described.
  • the CAL microcomputer 53 tests the receiver bus 98 for target returns 91. If no returns 91 are detected, operation is sequenced to the next beam. If one or more receiver latches 97 are high, then the active beam number and the active receiver line numbers are stored for later transfer to the Master CPU 70, the receive latches 97 are cleared by toggling line 95 and operation is sequenced to the next beam. Scanning the field 90 in this manner provides the CPU 70 with continuously updated azimuth and elevation coordinates for each of the target retroreflectors 16. The coordinates on sequential scans are processed to compute the angular direction of the docking port 18, the target roll angle and the roll rate.
  • As the CAL sensor 50 tracks the target vehicle 21, it must also precisely range each of the three retroreflectors 16 in order to determine the target attitude (or the orientation of the target Z axis).
  • the goal of the CAL sensor 50 is to provide the information necessary to maneuver the host vehicle 13 to a position and orientation aligning the docking probe 12 axis 100 with that of the target docking port 18 for the final approach. Tone ranging techniques, which are known in the prior art, are employed to measure the range from the sensor 50 to each reflector 16 to a resolution of about +/- 3.0 centimeters.
  • the tone range processor 54 compares the phase of the 3.0 MHz modulation transmit tone 113 to the 3.0 MHz tone received from each reflector 16 to a resolution of one part in 3,333 to determine the propagation delay and, thereby, the range.
  • When the CAL microcomputer 53 detects an active latch return line 98, it allows the output beam to remain on the target vehicle 21 for sufficient time for the tone range processor 54 to accurately measure the phase relation (not shown).
  • Heterodyne techniques are employed to beat the 3.0 MHz signals against a 2.999 MHz reference 117, so that lower frequency (1.0 kHz) signals 119, 140 can be processed in a phase comparator circuit 118.
  • the delay period between the zero crossing of the transmit signal 119 and that of the receive signal 140 corresponds to the reflector 16 range.
  • a digital value representing the phase delay is read by the CAL microcomputer 53 from the comparator 118 by means of a data bus 116.
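  • In outline, the phase-to-range arithmetic is as follows (illustrative C under the figures given above; the names are not from the patent). Heterodyning preserves the phase difference, so the 1.0 kHz beat signals 119, 140 carry the same phase offset as the 3.0 MHz tones, and that phase fraction maps onto an unambiguous one-way interval of c / (2 x 3.0 MHz), about 50 meters; one part in 3,333 of that interval is about 1.5 centimeters, consistent with the +/- 3.0 centimeter resolution quoted.

    #define F_MOD_HZ   3.0e6       /* modulation tone 113 */
    #define C_M_PER_S  2.998e8     /* speed of light      */

    /* Map the measured phase fraction (0.0 .. 1.0 of one beat cycle) onto
     * one-way range within the ~50 m unambiguous interval. */
    static double range_from_phase(double phase_fraction)
    {
        double unambiguous_m = C_M_PER_S / (2.0 * F_MOD_HZ);
        return phase_fraction * unambiguous_m;
    }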
  • the CPU 70 processes sequential scans to accurately measure the closure rate and the roll rate of the target. Periodic -Z commands are generated to slow the rate of closure, and roll commands are generated to spin the host 13 in a manner to zero the relative roll rate between the target vehicle 21 and the chaser 13, so that the final hard docking maneuver can be performed.
  • At a range of 5 meters, the Master CPU 70 activates the CTL subsystem 60 for diagnostic testing. If the CTL 60 fails to operate properly, a "bailout" command is output to the host GNC computer 14 and the mission is terminated.
  • the CTL 60, which operates over target ranges from 3 meters to 20 centimeters, is described in greater detail with reference to Fig. 7, Fig. 8 and Fig. 9.
  • CTL 60 comprises a CCD solid state TV camera 61, both the pulsed 41 and CW 51 laser arrays previously described, four narrow beam diode lasers 62, a video processor 64 and a microcomputer controller 63.
  • the two laser arrays 41, 51 are enabled with all elements 46, 112 active simultaneously to illuminate the target vehicle 21.
  • the narrow beam lasers 62, which designate bright spots 135 on the docking plate 17, are used in the last meter of closure to indicate target range and attitude.
  • a key element of the CTL sensor 60 is the CCD camera 61 which comprises a 491 X 491 array of active elements 120 which are highly sensitive to the 905 nanometer source radiation 91.
  • the camera 61 is a high resolution sensor capable of imaging the docking plate 17 and generating a stream of digital data on data bus 121, one byte for each of the 241,081 pixels in the array 120, thirty times per second.
  • the data is continuously fed to a video processor 64, which analyzes selected data to determine target angular direction, range and attitude.
  • Target data is periodically read by the CTL microcomputer 63 and transferred to the CPU 70, so that guidance commands 5 can be generated to accurately maneuver the host probe 12 into the target port 18 for hard docking.
  • the CTL subsystem 60 operates in two range dependent modes. From a range of 3 meters to a range of one meter, the system 60 processes the shape of the docking plate 17 in the CCD image (see Fig. 9).
  • the CTL microcomputer 63 enables both laser diode arrays 41, 51 with all elements active and each diode in the pulsed array 41 operating at 10,000 pulses per second. Therefore, the docking plate 17 is illuminated with 300 milliwatts of continuous, laser light in a 20° by 20° beam.
  • the resulting image 130 (see Fig. 8) is that of a white square surface 122 on a black background subtending a black crosshair pattern 126.
  • the retroreflective bead material covering the majority of the docking plate 17 area 122 returns the source radiation 91 toward the CCD camera 61 in roughly a 10° diverging beam.
  • the irradiance at the camera input 123, at the maximum range of 3 meters, is about 20 dB above the camera's 61 minimum discernable signal with an 11 millimeter, F1.4 lens 123 and a 20° field-of-view.
  • the fact that the irradiance is excessive allows the F-stop 124 of the camera 61 to be closed down to 1/32, while maintaining good image 130 contrast.
  • the result is that of a pinhole camera with a very long depth of focus.
  • the combination of intense laser illumination 81 and the retroreflective docking plate surface 33 provides the signal excess that makes this pinhole camera configuration possible.
  • the camera optics 123 look through the same infrared filter window 22 previously described.
  • the filter 22 passes only a narrow band of wavelengths about the source line (905 nanometers), thereby significantly attenuating broadband sources of interference (not shown), such as the sun.
  • the video processor 64 employs optical contrast video tracking techniques, known in the prior art, to acquire the docking plate image 130 and to locate the plate centroid 125 (the junction of the two black crosshair pointers 126, seen in Fig. 8).
  • the camera 61 images a 20° by 20° scene on the 491 X 491 array of detector elements 120. Therefore, each picture element (pixel) 120 covers a 0.7 milliradian field.
  • a pixel coordinate system, where 0 ≤ X ≤ 491 and 0 ≤ Y ≤ 491, is used to indicate the relative angular direction in azimuth and elevation, respectively, of any discrete pixel in the scene.
  • the video processor 64 locates the crosshair centroid 125 and outputs the image pixel coordinates (X,Y) to the CTL microcomputer 63.
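  • Converting those pixel coordinates to relative angles is a single scale factor, as the following hypothetical C helper (not from the patent) illustrates: a 20° field imaged on 491 pixels gives 0.349 radian / 491, or about 0.7 milliradian per pixel, as stated above.

    #define FIELD_RAD  0.349       /* 20 degree field in radians */
    #define N_PIXELS   491.0       /* CCD elements per axis      */

    /* Angular offset of a pixel from the image center, in milliradians. */
    static double pixel_to_mrad(double pixel)
    {
        return (pixel - N_PIXELS / 2.0) * (FIELD_RAD / N_PIXELS) * 1000.0;
    }

    /* Example: a centroid at X = 300 lies (300 - 245.5) x 0.711, or about
     * 38.7 milliradians, to one side of the camera axis. */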
  • the CTL 60 then transfers the target coordinate data to the CPU 70, so that pitch and yaw commands can be generated to keep the CCD camera 61 pointed directly at the plate centroid 125. This process is periodically repeated, because other video processing schemes utilized in the CTL subsystem rely on the plate image 130 remaining in the center of the picture.
  • the video processor 64 analyzes each horizontal line (not shown) in the scene, locates the thicker crosshair sector 127 and computes the relative target roll angle and roll rate. In a like manner, roll data is transferred to the CPU 70, so that it can output host 13 roll commands necessary to orient the image 130 as seen in Fig. 8 and to zero the roll rate.
  • the CTL microcomputer 63 next analyzes the shape (see Fig. 9) of the docking plate image 130 to determine the attitude of the target vehicle 21. If the target attitude is perfectly matched with that of the chaser 13, then the perimeter 128 of the plate image 130 is perfectly square. If, on the other hand, the target is pitched up in the +Y direction, for example, then the image 130 is that of a trapezoid 129 with the parallel lines 141, 142 horizontal and the shorter side 141 on the top of the image 130. The relative length of the top 141 and bottom 142 lines, the ratio of the height to the average width and the angles of the sides off the vertical all are mathematically related to the magnitude of the pitch angle.
  • each of these parameters can be measured by the video processor 64 by locating the black to white transitions in the image 130 and counting image pixels to define the perimeter lines 128.
  • the video processor 64 generates pixel count data corresponding to each parameter, which are passed on to the CPU 70 via the CTL microcomputer 63 and data buses 131, 71.
  • the CPU 70 performs the necessary computations to determine target attitude and to generate the guidance commands 5 required to move the host 13 directly onto the target Z axis, thereby aligning the probe 12 with the port 18.
  • An alternate technique, which is less math intensive, is illustrated in Fig. 9.
  • the video processor 64 simply determines the directions of convergence (either +X or -X and +Y or -Y) and passes this information through the CTL microcomputer 63 to the CPU 70.
  • the CPU 70 then maneuvers the host 13 in the indicated +/-X and +/-Y directions until the CTL 60 indicates that the image shape 130 is square and the attitude is corrected.
  • target range is measured in the video processor 64 by counting the number of pixels across the plate image 130.
  • the count which is inversely related to target range, is periodically transferred to the CPU 70 which computes range and range rate.
  • Target vehicle 21 tracking resolution (the processor's 64 ability to locate the centroid pixel 125), is defined by the field coverage of each pixel, i.e. +/- 0.7 milliradians in azimuth and elevation.
  • Roll measurement resolution is a function of the image 130 size. At a range of one meter, the roll angle can be measured with a resolution of +/-0.1 degree. Attitude measurement resolution depends on the method selected, the magnitude of the attitude offset angles and the target range. Assuming the method which processes the relative lengths of opposite sides to determine attitude, the resolution is about +/- 0.5 degrees at a range of one meter. Range measurement resolution improves as the range closes. At a range of one meter, the measurement resolution is about +/- 2.0 millimeters.
  • the microcomputer firmware is a key component of the system.
  • the CTL microcomputer 63 firmware is described by means of the following "pseudo code" listing, where each line represents a machine language program which executes the indicated function:
SHAPE PROCESSING
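  • The shape processing listing likewise does not survive in this copy. The following C sketch (the edge-measuring helpers and step signs are hypothetical; the patent specifies only that the processor reports the directions of convergence) captures the alternate, less math intensive technique described above, stepping the host until the plate image 130 is square:

    extern int top_width_px(void);     /* pixel length of top line 141    */
    extern int bottom_width_px(void);  /* pixel length of bottom line 142 */
    extern int left_height_px(void);
    extern int right_height_px(void);

    typedef struct { int step_x; int step_y; } shape_step;

    /* Compare opposite perimeter lines of image 130 and report which
     * +/-X and +/-Y steps would square it; both zero when the target
     * attitude matches that of the chaser. */
    static shape_step shape_process(void)
    {
        shape_step s = { 0, 0 };
        int dy = top_width_px() - bottom_width_px();
        int dx = right_height_px() - left_height_px();
        if (dy != 0) s.step_y = (dy < 0) ? +1 : -1;
        if (dx != 0) s.step_x = (dx < 0) ? +1 : -1;
        return s;
    }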
LASER SPOT PROCESSING
  • At ranges below one meter, the docking plate image 130 size exceeds the FOV of the CCD camera 61.
  • a laser spot image 135 processing method, employed in the final stage of hard docking, is described with reference to Fig. 7 and Fig. 8.
  • Four narrow beam, GaAs diode lasers 62 are arranged about the CCD camera 61; one on the +Y axis, one on the -Y axis, one on the +X axis and one on the -X axis. Each laser 62 is physically offset by 10 centimeters (from the camera Z axis) and pointed toward the camera axis at a convergence angle of 16 degrees.
  • As the range closes, each spot 135 moves through the image along its associated axis toward the pattern centroid 125.
  • the spots 135 overlap at the centroid 125. As the range continues to close, the spots 135 cross through the centroid 125 and move to the opposite sides of the image 130. At the minimum range of 20 centimeters, the hard docking range, the spots 135 fall on the outer perimeter 128 of the CCD camera 61 FOV 90.
  • the general technique employed by the video processor 64 in this mode is to count the number of pixels from the centroid 125 (along each axis) to each laser spot 135.
  • the four sets of pixel counts are mathematically related to the range from the camera 61 to each spot 135 designated on the docking plate 17. Knowing the range to four points on the target surface, provides a measure of target range and attitude.
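  • A worked check of this geometry (illustrative C; the function name is not from the patent): with a 10 centimeter offset and a 16 degree convergence angle, a spot's lateral position on the plate is the offset minus range x tan(16°), which passes through zero at 0.10 / tan(16°), about 0.35 meters, the convergence range cited below.

    #include <math.h>

    #define OFFSET_M  0.10                                /* laser 62 offset   */
    #define CONV_RAD  (16.0 * 3.14159265358979 / 180.0)   /* convergence angle */

    /* Lateral position of a laser spot 135 on the docking plate, relative to
     * the camera axis; positive values lie on the laser's own side. */
    static double spot_offset_m(double range_m)
    {
        return OFFSET_M - range_m * tan(CONV_RAD);
    }

    /* spot_offset_m(1.00) = -0.19 m  (far side of the centroid)
     * spot_offset_m(0.35) =  ~0.0    (spots overlap at the centroid)
     * spot_offset_m(0.20) = +0.04 m  (crossed to the opposite side)  */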
  • Image plate centroid 125 tracking and roll angle adjustments are executed as previously described.
  • the geometry of the crosshair pattern 126 provides a centroid image 125 (the junction of the two crosshair pointers in Fig. 8) with zero size. Therefore, the centroid 125 pixel location can be sensed by the video processor 64 even at zero range.
  • the video processor periodically outputs the pixel coordinates (X,Y) of the centroid 125, the relative roll angle (r), and the four sets of laser spot pixel counts to the CTL microcomputer 63 via data bus 131.
  • the CTL microcomputer 63 passes the target data to the MASTER CPU 70, so that guidance commands 5 can be generated to maintain alignment of the docking probe 12 with the docking port 18.
  • the final target range and range rate are computed by averaging the four spot ranges. At a range of 35 centimeters, when the laser spots converge, the probe 12 enters the port 18.
  • the probe end (not shown) makes contact with the target port 18, and a hard docking indicator switch 145 is closed.
  • the switch 145 closure is sensed by the CTL microcomputer 63 via line 137. This information is passed to the CPU 70 which deactivates all systems and terminates the mission.
  • Measurement resolution in the spot 135 processing method is a function of the magnitude of spot movement in the image as the range and attitude change. Both attitude and range measurement resolution are nonlinear functions of target range and attitude. However, at a range of 50 centimeters, attitude resolution is about +/- 1.2 degrees and range resolution is about +/- 3.0 millimeters. At a range of 25 centimeters, attitude resolution is +/- 0.2 degree and range resolution is about +/- 0.3 millimeter.
  • the DROID Master CPU 70 is a more powerful sixteen bit microprocessor with sufficient mathematics capability to process the target data supplied by the PAL microcomputer 43, the CAL microcomputer 53 and the CTL microcomputer 63 to generate guidance commands 5 required to maneuver a space vehicle 13 through the last few hundred meters of a hard docking mission. Since the DROID system 10 is intended for use in various types of vehicles 13, the Master CPU 70 memory is that of a plug in programmable read only memory (PROM) (not shown). Different space vehicles 13 can be expected to have different guidance and response characteristics. Therefore, the Master CPU 70 PROM can be customized for each vehicle 13 to adapt the guidance commands 5 generated as necessary.
  • PROM programmable read only memory
  • Space vehicles 13 which must execute automated docking missions can be expected to have groups of control thrusters 15 manipulated by a GNC computer 14 effective to maneuver the craft in pitch, yaw, roll, X, Y and Z.
  • thruster 15 burn can be controlled in a manner to adjust the rate of the maneuver. Therefore, the Master CPU 70 analyzes the target data and generates the following types of commands:
  • the Master CPU 70 establishes two-way communication with the GNC computer 14 by means of data bus 19. In addition to the guidance commands 5 which must be transferred, the Master CPU 70 receives an enable command 6 from the GNC 14, which activates the DROID system 10, and outputs a hard dock command 9, which terminates the mission. As previously described, the CPU also outputs a "bailout" command 7 any time a sensor subsystem fails and the docking mission is in jeopardy. The "bailout" command 7 initiates a preprogrammed -Z maneuver designed to avoid a catastrophic collision with the target vehicle 21.
  • the DROID system 10 can be labeled an intelligent sensor since the Master CPU 70 acts to reconfigure the system on the basis of the sensed environment and changing mission requirements. In this process, the CPU 70 performs two important tasks. One is to monitor the operating condition of each sensor and target data to determine when it may be unsafe to proceed. The other is to process target data to generate guidance commands 5.
  • the Master CPU 70 firmware, which is an integral component of the system 10, is described in the following "pseudo code" listing, where each line represents a machine language or higher level program to execute the indicated function. Associated guidance commands 5 are also given:
  • NULL TARGET ANGLES P,Y
  • NULL Z AXIS OFFSET X,Y
  • NULL TARGET ANGLES P,Y
  • NULL ROLL ANGLE r
  • GET SPOT DATA
    XCOUNT = (+XCOUNT) - (-XCOUNT)
    IF XCOUNT > 0, STEP RIGHT (+X)
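  • Assembled from the range brackets described above, the sensor handover logic amounts to a small state machine. The following C sketch is a hedged consolidation for the reader (the sensor interfaces are hypothetical), not the PROM firmware itself:

    typedef enum { PAL_MODE, CAL_MODE, CTL_MODE, DOCKED, BAILOUT } droid_mode;

    extern int    diagnostics_pass(droid_mode m);  /* subsystem self test    */
    extern double target_range_m(void);            /* from the active sensor */
    extern int    hard_dock_switch(void);          /* indicator switch 145   */

    static droid_mode next_mode(droid_mode mode)
    {
        double r = target_range_m();
        switch (mode) {
        case PAL_MODE:                             /* 1000 m to 30 m bracket */
            if (r <= 50.0 && !diagnostics_pass(CAL_MODE))
                return BAILOUT;                    /* abort, avoid collision */
            return (r <= 30.0) ? CAL_MODE : PAL_MODE;
        case CAL_MODE:                             /* 50 m to 3 m bracket    */
            if (r <= 5.0 && !diagnostics_pass(CTL_MODE))
                return BAILOUT;
            return (r <= 3.0) ? CTL_MODE : CAL_MODE;
        case CTL_MODE:                             /* 3 m to 20 cm bracket   */
            return hard_dock_switch() ? DOCKED : CTL_MODE;
        default:
            return mode;
        }
    }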
  • the orthogonal transceiver of the preferred embodiment carries the advantages of a low component count and high speed of operation.
  • a key industrial application of the present invention resides in the field of robotics.
  • Robot vision currently requires dual television cameras, moving pedestals and a great deal of video processing electronics to emulate depth perception.
  • the ability of the present invention to indicate target direction, range and attitude without the need for moving parts, provides a reliable approach to depth perception for robots.
  • Activating the sensors of the present invention in alternating, sequential fashion allows a robot to locate a target at long range, to maneuver to the target while avoiding close range obstacles and to sense the relative attitude of the target at very close ranges. Such capabilities allow a robot to perform many kinds of proximity manipulations.

Abstract

An intelligent, optoelectronic docking system for use in manned or unmanned spacecraft for automatically docking with a target spacecraft. The system is a multifaceted, active sensor using a controlling microprocessor to integrate the operation of independently triggerable laser sources for target illumination and optical receiver arrays for target detection. Returning signal waveforms are processed to sense the direction, range, and attitude of a target docking surface which may be equipped with passive optical aids to enhance the reflective signature. The docking system reconfigures its active sensor elements as the target range closes by sequentially employing three laser transceiver arrangements. As a result, the system effectively tracks a target from a range of several hundred meters down to a range of only a few centimeters. The system microprocessor operates on data received from each sensor arrangement to generate the information necessary to guide a host spacecraft safely through hard docking maneuvers. Wide field coverage is achieved without moving parts by electronic selection of independently directed laser beams.

Description

DEAD RECKONING OPTOELECTRONIC INTELLIGENT DOCKING SYSTEM
BACKGROUND OF THE INVENTION FIELD OF THE INVENTION The present invention relates generally to a modular docking system for use in host space vehicles which must perform satellite capture maneuvers and the like, where automated docking is required. In particular, the present invention is a microprocessor based, optoelectronic sensor system interacting with target mounted, passive optical aids for the purpose of providing target range, position and attitude information necessary for the host guidance system to execute a hard docking.
DESCRIPTION OF THE PRIOR ART The docking of two spacecraft has in the past been a primarily manual operation. Such maneuvers are executed by a highly trained astronaut, visually acquiring and tracking a target, while manipulating control mechanisms to fly the host vehicle to a desired docking point. Recent events in the U.S. space program have fostered new impetus toward the development of effective procedures and systems for automating rendezvous, station keeping, tethering, berthing and docking of spacecraft. Three particular space vehicles which are currently under some phase of development, (1) Space Station, (2) Orbital Maneuvering Vehicle and (3) Mars Probe, require new technology for automatic docking. In some cases, unmanned docking can be accomplished by teleoperations or remote control. Such techniques rely on downlink telemetry of TV images which are monitored by a controller on the ground, transmitting guidance commands back to the vehicle on the telemetry uplink. However, the large number of future docking missions and the vastness of space eliminate the use of teleoperations in many scenarios. Although Radio Frequency (RF) technology has long been developed for acquiring and tracking targets for various purposes, the magnitude of RF wavelengths precludes operation at very close ranges, and RF techniques cannot be adapted for the precision measurement capability required for automated docking. It is well understood in the art that effective docking technology must operate in the optical, IR or millimeter wave spectra.
A large number of commercial and military optoelectronic systems such as stereoscopic robot vision, optical contrast video trackers, laser guided seekers, forward looking infrared imaging devices, laser radar and laser interferometers are currently available and appear applicable to automated docking. However, all known systems are burdened with practical disadvantages such as large size, fragile components, large power dissipation and a general requirement for moving mechanisms for beam steering, field scanning, focusing, cooling and aperture control which preclude application in an autonomous, space environment.
Recent developments in automated docking technology have taken two distinct directions. One approach represents an extension of TV teleoperations, relying on automated computer processing of video signals generated by TV cameras to sense position, range and attitude of a target. The second approach lies in the use of laser radar, scanning a narrow laser beam over a wide field and processing target reflected returns for position, range and attitude. With the necessary target data base, a guidance/navigation/control computer generates guidance commands to maneuver the host vehicle to the target docking port.
Although both such methods provide a target sensing and measurement capability, specific limitations preclude practical implementation in space. TV systems, which must rely on solar target illumination, are fraught with difficulties in dealing with high contrast shadows, low contrast target geometries and/or operation in the earth's shadow. In addition, TV systems must process millions of bytes of video pixel data to deduce the necessary target information, resulting in a significant burden in computer size, memory capacity and processing time. Further, conventional TV optics must be adjusted for exposure and focus, requiring moving mechanical devices which limit long term operational reliability. On the other hand, laser scanning systems generally involve large, high power laser sources and require moving mechanisms (galvanometer mirrors and the like) to steer the narrow laser beams over the sensor field. Such systems are generally fragile, have substantial mass, consume large amounts of power and place a severe limitation on the overall system reliability and operating lifetime.
SUMMARY OF PRESENT INVENTION
The present invention, generally described as a Dead Reckoning Optoelectronic Intelligent Docking system, hereinafter referred to as the "DROID" system, is a microcomputer based system for use on spacecraft which must automatically maneuver to hard docking with targets equipped with passive optical aids, with a reduced range capability for docking non-cooperative targets. The DROID system performs target acquisition, tracking, homing, relative attitude adjustment, relative roll correction, hard docking and self diagnostics. The system operates over target distances from about 1000 meters to about 20 centimeters, generating guidance control commands required to steer the host vehicle through the maneuver. Salient features of the invention include autonomous operation, small size, low mass, all solid state components, target illumination at a single laser line and, most importantly, no moving parts.
The DROID system combines three, highly integrated sensor subsystems with a Master Central Processing Unit (CPU) to generate a target database and compute host vehicle guidance commands. Each sensor is a distinct form of laser radar (or more commonly termed LIDAR) configured to provide range dependent measurement resolution required for conducting a safe hard docking maneuver. The Pulsed Array LIDAR (PAL) subsystem is configured for long range operation from about 1000 meters to about 30 meters. The Continuous Array LIDAR (CAL) subsystem provides more precise target sensing at ranges from about 50 meters to about 3 meters. The Charge Coupled Device (CCD) Television LIDAR (CTL) subsystem is configured for extremely precise target sensing at ranges from 5 meters to about 20 centimeters. The Master CPU sequences the operation of each sensor subsystem, accepting target data and computing the guidance commands necessary to maneuver the host vehicle.
Passive, retroreflective optical aids strategically located near the docking port of the target represent an integral factor in long range sensing. The retroreflective nature of each device allows the use of very low power, wide beam semiconductor lasers for target illumination, minimizing the power burden of the system and enhancing system reliability.
In a typical rendezvous and docking mission, initial target rendezvous is accomplished by various other navigation techniques including the Global Positioning System and conventional Radar. Therefore, the primary purpose of the present invention is to sense the target at relatively close ranges (under a few thousand meters) and to precisely control the terminal phase of the hard docking maneuver. For example, when the DROID system is enabled by the host guidance computer, the Master CPU executes a spiral search maneuver and activates the PAL subsystem in an attempt to acquire the target at typical closing velocities of about 50 meters per second. When the target is acquired, PAL passes target position and range data to the CPU, which generates pitch, yaw and +Z commands to center the target on the host Z axis (an imaginary line emanating from the docking probe) and to close the target. As the target range closes, -Z commands gradually slow the closing velocity to about 1.0 meter per second. At a range of 50 meters, the CAL subsystem is activated and tested for proper operation. At 30 meters range, operation is handed over to CAL, which senses target position, range and attitude by precisely ranging each of three individual retroreflectors mounted in a triangular pattern on the target surface. The CPU processes target data and maneuvers the host onto the target Z axis (an imaginary line emanating from the center of the docking port), orients the host at the appropriate attitude and spins the host to match the roll rate of the target. Again -Z commands serve to reduce closing velocity to about 10 centimeters per second. At a range of 5 meters, the CTL subsystem is activated and tested for proper operation. Target sensing is handed over to the CTL sensor at a range of 3 meters to control the final, hard docking operation. The CTL acquires a retroreflective docking image plate located near the docking port and generates data indicating target range, position and attitude. The Master CPU processes CTL data to precisely maneuver the docking probe into the docking port, sense the contact, lock the probe into place and terminate the operation. At any time in the mission that an active DROID system sensor fails to track the target, the CPU outputs a "bailout" command to abort the mission and to maneuver in a manner to avoid collision.
It is, therefore, the primary object of the present invention to provide a fully automatic space docking system which senses a target equipped with passive optical aids and guides a host vehicle to a desired docking port, and which overcomes the disadvantages, shortcomings and difficulties of prior art systems.
It is a more specific object of the present invention to provide a hard docking system operating independently of solar illumination and providing wide field coverage with no moving parts for beam steering, camera focusing or any other function.
It is a further object of the present invention to provide a fully automatic space docking system which senses a non-cooperative target and guides a host vehicle to the target for docking.
It is a still further object of the present invention to provide a fully integrated optoelectronic sensor system operating properly in any ambient lighting conditions at a single wavelength (laser line).
It is a still further object of the present invention to provide a small, low mass docking sensor which comprises all solid state, electronic and optoelectronic components with an operating lifetime of at least 10 years.
It is a still further object of the present invention to provide an intelligent sensor which reconfigures itself through the mission, resulting in improved target measurement resolution as the range closes. It is a still further object of the present invention to provide a docking system capable of sensing target position, range and attitude at very short distances under one meter.
It is a still further object of the present invention to provide a docking system operating at a very low power burden, under 30 watts.
It should be noted that although the invention is described as an Orbital Maneuvering Vehicle subsystem, guiding the vehicle through an automated satellite capture mission, the DROID system is applicable to any other space vehicle such as the space shuttle, the orbital transfer vehicle, the Mars probe and the space station, supporting all kinds of proximity operations.
For a better understanding of the present invention, together with other and further objects thereof, reference is had to the following description taken in connection with the accompanying drawings, and its scope will be pointed out in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a perspective view of the DROID System;
Fig. 2 is a schematic illustration of the retroreflective target aids; Fig. 3 is a block diagram of the DROID System;
Fig. 4 is a block diagram of the PAL subsystem;
Fig. 5 is an illustration of the PAL and CAL emitted beam patterns and detector fields;
Fig. 6 is a block diagram of the CAL subsystem; Fig. 7 is a block diagram of the CTL subsystem;
Fig. 8 is a perspective frontal view of the docking plate; and
Fig. 9 is a perspective view of docking plate perimeter image shapes.
DETAILED DESCRIPTION OF THE INVENTION An illustration of the DROID system 10 is shown in Fig. 1. The system includes an intelligent sensor package 11 strategically positioned relative to a docking probe 12, both of which are mounted on the host or chaser space vehicle 13. The intelligent sensor 11 passes guidance commands by means of a data bus 19 to a host GNC computer 14. The computer 14 directly controls thruster mechanisms 15 by means of control cables 20 to maneuver the spacecraft 13. The DROID system 10 also includes retroreflectors 16, a retroreflective docking plate 17 and a docking port 18 located on the target vehicle 21. The DROID system 10 illuminates the target vehicle 21 by coherent laser light 81 in the near IR spectrum, senses retroreflected source radiation 91 and processes the returns 91 in a manner to determine the relative angular direction, range and attitude of the target vehicle 21. The target vehicle 21 orientation parameters are processed to generate host vehicle 13 guidance commands necessary to insert the docking probe 12 into the docking port 18 for a safe, hard docking. The intelligent sensor 11 operates through a window 22 which is effective to pass only a narrow band of radiation 91 about the operating wavelength of the system 10.
The retroreflective optical aids 30 are more clearly illustrated in Fig. 2. The term retroreflector as used herein defines an optical reflector which returns incident light to its source, regardless of the angle of incidence. Of course, a perfect retroreflector would return the diverging laser beams directly back into the laser optics. However, typical retroreflectors have a finite divergence depending on manufacturing tolerances and the source angle subtended. Retroreflectors 16 of the type described herein exhibit full angle divergence on the order of 0.09 milliradians at long ranges, increasing to about 1.0 milliradian at a range of 50 meters (due to source angle subtense). Retroreflectors 16 are arranged in the pattern of an isosceles triangle with two 1.0 meter sides and one 0.8 meter side. This arrangement allows the target vehicle 21 roll angle (not shown) to be determined without ambiguity by imaging the reflector pattern. The retroreflectors 16 are used by the longer range DROID sensor subsystems 40, 50 to enhance the target signature.
A second optical aid 17, the docking image plate, which is used by the short range DROID sensor subsystem 60, is located near the docking port 18. The docking port 18 is positioned at the target centroid, the geometrical center of the docking surface. The image plate 17 comprises a square, retroreflective, white surface 33 subtending a non-retroreflective, black crosshair pattern 34. The retroreflective bead surface 33 is similar to common bicycle reflective tape with a relatively wide, 10 degree, full angle divergence. Interactions between the DROID sensors 40, 50, 60 and the optical aids 30 are described in following paragraphs.
It should be pointed out that optical aids 30 of the type described herein serve only to enhance the target vehicle 21 signature, providing significant increases in operational range. At reduced ranges, the DROID system 10 can automatically dock with a target vehicle 21 without the help of optical aids 30.
A block diagram of the intelligent DROID sensor 11 is shown in Fig. 3. The sensor 11 comprises a Pulsed Array LIDAR (PAL) subsystem 40, a CW Array LIDAR (CAL) subsystem 50, a CCD TV LIDAR (CTL) subsystem 60, a Master CPU 70 and a power supply 80. The DC power supply 80 provides for a maximum power requirement of thirty watts. The optical systems of all three sensors are co-aligned, operating over a field of 20° azimuth by 20° elevation. The design of each LIDAR subsystem is optimized over a specific operating range bracket, with target vehicle 21 sensing resolution improving and sensitivity decreasing as the range closes. The CPU 70 first activates the PAL subsystem 40, which employs a pulsed laser diode array 41 to illuminate the retroreflectors 16, a PIN diode detector array 42 for sensing target returns 91 and a time-of-flight pulse processor 44 for target ranging. PAL provides for long range target vehicle 21 acquisition, tracking and ranging (resolution < +/- 3.0 meters) over a range bracket from 1000 meters to 30 meters. As the target range closes, the CAL subsystem 50 is activated, employing a CW laser diode array 51 to illuminate the retroreflectors 16, a PIN diode detector array 42 to sense target returns 91 and a tone range processor 54 for target ranging. CAL provides for target vehicle 21 tracking, ranging (resolution < +/- 3.0 centimeters) and attitude sensing (resolution < +/- 3.0 degrees) over a range bracket of 50 meters to 3 meters. In the final phase of the docking maneuver, the CTL subsystem 60 is activated, employing both previously mentioned laser arrays 41, 51 to illuminate the docking plate 17 and a CCD solid state TV camera 61 and a video processor 64 for target tracking, ranging (resolution < +/- 1.0 centimeter) and attitude sensing (resolution < +/- 2.0 degrees) over a range bracket from 5 meters to 20 centimeters. In addition, the CTL sensor 60 employs four narrow beam diode lasers 62, strategically oriented to designate bright spots (not shown) on the docking plate 17 to be processed for range and attitude sensing in close proximity. The three sensor subsystems 40, 50 and 60 are controlled by a Master CPU microprocessor 70 which communicates with each sensor by means of a two way data bus 71. The three sensors 40, 50 and 60 (only one of which is active at a time) sense the target aids 30 and pass target data (not shown) to the Master CPU 70. The CPU 70 processes the target data and performs the necessary computations to generate host vehicle 13 guidance commands 5. Guidance commands 5 are transferred to the host GNC computer 14 by means of a two way data bus 19.
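By way of illustration only, the range-bracketed sensor selection just described can be sketched in C. The overlap between adjacent brackets (50 to 30 meters for PAL/CAL and 5 to 3 meters for CAL/CTL) is what allows the incoming sensor to be diagnostic tested before handover. All names below are illustrative renderings of the description, not the actual firmware:

    /* Illustrative sketch of the range-gated sensor handover. */
    typedef enum { PAL_ACTIVE, CAL_ACTIVE, CTL_ACTIVE } sensor_t;

    sensor_t select_sensor(sensor_t current, double range_m)
    {
        switch (current) {
        case PAL_ACTIVE:                        /* 1000 m down to 30 m  */
            return (range_m <= 30.0) ? CAL_ACTIVE : PAL_ACTIVE;
        case CAL_ACTIVE:                        /* 50 m down to 3 m     */
            return (range_m <= 3.0) ? CTL_ACTIVE : CAL_ACTIVE;
        default:                                /* CTL: 5 m to hard dock */
            return CTL_ACTIVE;
        }
    }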
PULSED ARRAY LIDAR
The PAL sensor subsystem 40, which operates over target ranges from 1000 meters down to 30 meters, is described in greater detail with reference to Fig. 4 and Fig. 5. PAL 40 comprises a twenty element GaAs diode laser array 41, a twenty element silicon PIN diode detector array 42, a single element wide field-of-view (FOV) receiver 45, a time-of-flight (TOF) pulse processor 44 and a single-chip microcomputer controller 43. The transmitter array 41 is configured so that each laser element 46 is independently triggerable by the microcontroller 43. The microcontroller 43 fires a laser diode 46 by writing a control word on output port 47. The control word comprises a digital "one" bit corresponding to the laser selected and digital "zeros" on all nineteen other lines. The digital "one" activates a pulse forming network 48 which drives a twenty ampere current pulse (100 nanoseconds in duration) via line 82 through the selected diode 46, resulting in a 905 nanometer laser pulse 81 with peak power on the order of five watts. The diode array 41 is arranged with a cylindrical lens system 49 in a manner that generates vertically oriented, elongated beam 81 cross-sections which are 1° wide and 20° high. The optical system 49 directs the twenty juxtaposed and non-overlapping beams 81 of each laser diode 46 in a manner providing a contiguous field coverage of 20° by 20°. Transmitting each selected beam 81 (one at a time) and looking for target returns 91 after each such transmission allows the microcontroller 43 to locate the azimuth angle of the target vehicle 21 in the DROID system 10 optical field 90 with a resolution of +/- 1°.
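The one-hot trigger word lends itself to a very small firmware routine. A minimal sketch in C, assuming a hypothetical memory mapped port address (the actual port 47 interface is not specified at this level):

    #include <stdint.h>

    #define LASER_PORT (*(volatile uint32_t *)0x4000) /* assumed address */

    /* Fire laser diode b (0..19): one "1" bit, nineteen "0"s. The
       pulse forming network shapes the 100 nanosecond current pulse
       itself, so the select bit is simply written and then cleared. */
    void fire_beam(unsigned b)
    {
        LASER_PORT = (uint32_t)1u << b;
        LASER_PORT = 0;
    }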
The PAL detector array 42 is a group of twenty PIN photodiodes 83, each with an aspect ratio of 1 X 20 (including interdiode gaps), with the minor axes stacked on a single substrate (not shown). The resulting total aspect ratio is then 20 X 20, representing a square active area. Orienting the diodes 83 with the major axes horizontal and adding a double convex objective lens 84 provides twenty juxtaposed and non-overlapping fields-of-view which are individually 20° wide and 1° high. Combined, the fields provide a 20° x 20° contiguous field-of-view. Co-aligning the optical systems 49, 84 of the transmitter array 41 and the detector array 42 provides a 20° horizontal by 20° vertical sensor coverage and an orthogonal relation between the major axes of transmitter beam 81 cross-sections and receiver FOVs 85. Now referring again to Fig. 5, identifying the specific transmitter beam 81 and the receiver FOV 85 corresponding to a target return 91 allows the PAL microcontroller 43 to locate the azimuth and elevation angles of the target in the sensor field 90 to a resolution of +/- 1°.
Once the PAL microcomputer 43 is activated by the Master CPU 70, it initially attempts to acquire a target by sequentially firing each laser beam 81 and testing the single element receiver latch 87 for a target return 91. The wide field receiver 45 is a single element, large area photodiode 88 arranged with an objective lens 89 and an IR filter 22 in a manner providing a conical FOV (not shown) of roughly 30° full angle, co-aligned with the optical systems 49, 84 previously described. Pulsed IR radiation (905 nanometers wavelength) 91 passing the filter 22 generates a signal pulse (not shown) at the photodiode 88 which is amplified by a wideband video amplifier 93 to a level triggering a latch circuit 87. The latch 87
holds a digital "one" on an input line 94 to the PAL microcomputer 43 until the microcomputer 43 acknowledges the input by toggling output line 95, thereby clearing the latch. PAL microcomputer firmware fires the first laser diode 46 and tests the latch 87 output state. If no return 91 is sensed, the next diode in the array 41 is triggered and so on. Assuming a maximum target closure rate of 50 meters per second, the entire field 90 is scanned at a rate of 500 Hz, or ten field 90 scans per meter of closure.
Referring again to Fig. 4, the PAL subsystem 40 continues to scan the field 90 until a target return 91 is sensed on the latch 87 input line 94. The PAL microcomputer 43 stores a return count of one and clears the latch 87 by toggling line 95. In order to avoid the possibility of a false alarm, the PAL microcomputer 43 continues the target acquisition scan until "n" consecutive returns 91 are detected, verifying that the target vehicle 21 has actually been acquired and terminating the acquisition routine.
Once target acquisition is confirmed, the PAL microcomputer 43 sequences to the target tracking mode by activating the PIN receiver array 42. Each element of the detector array 42 includes a signal amplifier 96 and a latch circuit 97 identical to that previously described for the wide field receiver 45. The latch array 97 is tied to the PAL microcomputer 43 by means of a twenty line input port 98, which can be read (in parallel) by a single instruction. All latch states are cleared simultaneously when the PAL microcomputer toggles line 95. In the tracking mode, the PAL microcomputer 43 alternates between firing a laser beam 81 and reading the receiver port 98 for target returns 91. Any 905 nanometer target return 91 passes through the IR filter 22 and is focused by lens 84 onto the PIN diode array 42. Pulsed energy levels exceeding receiver sensitivity result in latched digital levels at the PAL microcomputer 43 input port 98. If all input lines of the port 98 are zeros, the microprocessor 43 assumes that no target vehicle 21 is present in the transmitted beam 81 and sequences the operation to the next beam. When a transmitted beam 81 intercepts the target vehicle 21, one or more of the receiver latches 97 will go high to a digital "one" state. By monitoring the number of the beam transmitted on output port 47 and identifying the number of the active receiver line on port 98, the PAL microcomputer 43 locates the target vehicle 21 in one of the 400, 1° by 1° field elements illustrated in Fig. 5. With the target vehicle 21 located in the sensor field 90, the PAL microcomputer 43 outputs relative target azimuth and elevation angles to the Master CPU 70 by means of data bus 71. The Master CPU subsequently computes and outputs the pitch, yaw and +Z commands to the host computer 14 (via data bus 19) required to point the host Z axis 100 directly at the target vehicle 21 and to close on the target along a direct line.
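The conversion from (beam number, latch number) to a field element is simple indexing. A hedged C sketch of one tracking-mode scan step, with fire_beam(), read_latches() and clear_latches() standing in for the port operations described above:

    #include <stdint.h>

    extern void fire_beam(unsigned b);      /* one-hot write to port 47  */
    extern uint32_t read_latches(void);     /* parallel read of port 98  */
    extern void clear_latches(void);        /* toggle line 95            */

    /* Fires vertical beam b and, if any horizontal FOV latch is set,
       reports the 1 degree by 1 degree cell; returns 0 on no target. */
    int scan_cell(unsigned b, int *az_deg, int *el_deg)
    {
        fire_beam(b);
        uint32_t r = read_latches();
        clear_latches();
        for (unsigned e = 0; e < 20; e++) {
            if (r & (1u << e)) {
                *az_deg = (int)b - 10;  /* cell edge; field spans -10..+10 deg */
                *el_deg = (int)e - 10;
                return 1;
            }
        }
        return 0;
    }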
While the PAL microcomputer 43 tracks the target 21, it also measures target range by pulse processing, which is well known in the prior art. The TOF range processor 44 comprises a 50 megahertz oscillator 99 and a high speed counter 101 effective to resolve 20 nanosecond time elements. The TOF processor 44 is configured to measure the propagation time of each laser pulse 81 to the target vehicle 21 and back. The round trip propagation time is related to target range by the speed of light. When the PAL microcomputer 43 outputs a control word on port 47 to fire a laser diode 46, it simultaneously outputs a "one" on line 102 which clears the counter 101 and enables the 50 MHz clock 99 input 104. The counter continues to register clock strobes 104 until the leading edge of the target return 91 sets latch 87, thereby disabling the clock input 104 to the counter 101. After a target return 91 is detected, the PAL microcomputer 43 reads the counter 101 by means of a parallel input port 103. The range count is then transferred to the Master CPU 70 by means of data bus 71. The Master CPU 70 determines target range by dividing the count of 20 nanosecond time elements by two and multiplying the result by six (20 x 10^-9 sec x 3 x 10^8 m/sec = 6 meters). Further, the CPU stores the range on each subsequent scan in order to compute the range closure rate by dividing the change in range by the time between samples (dR/dt). This technique provides a range measurement resolution of about +/- 3.0 meters, which is sufficient for the long range maneuver. The PAL microcomputer 43 continues to operate in this mode while the CPU 70 slows the rate of closure (outputting -Z commands) to about 1.0 meter per second at a range of 50 meters. At this range, the CAL subsystem 50 is activated by the CPU 70 for diagnostic testing. If the CAL subsystem 50 fails to operate properly, the CPU 70 outputs a "bailout" command to the host GNC computer 14 and terminates the mission. If, however, the CAL subsystem 50 passes such diagnostic testing, it takes over the target sensing operation at a range of 30 meters, and the PAL subsystem 40 is deactivated. Since the PAL subsystem 40 is controlled by a microcomputer 43, a key component lies in the microcomputer firmware that executes the above described operation. The operational firmware is described in the following "pseudo code" listing, where each line represents a machine language program designed to execute the indicated function:
TARGET ACQUISITION
    INITIALIZE SCAN
    SET B = 1
    SET COUNT = 0
40  FIRE BEAM #B
    CLEAR WIDE FIELD (WF) RECEIVER LATCH
    DELAY LOOP
    READ WF RECEIVER LATCH
    TEST FOR TARGET RETURN
    IF TARGET GOTO 130
    ELSE B = B + 1
    IF B = 21 THEN B = 1
    GOTO 40
130 COUNT = COUNT + 1
    IF COUNT = N GOTO 200
    ELSE B = B + 1
    IF B = 21 THEN B = 1
    GOTO 40
TARGET TRACK AND RANGE
200 INITIALIZE SCAN
    SET B = 1
220 FIRE BEAM #B
    OUTPUT ENABLE RANGE COUNT
    CLEAR ARRAY (AR) RECEIVER LATCHES
    DELAY LOOP
    READ AR RECEIVER BUS (R)
    READ RANGE COUNT (C)
    TEST R > 0
    IF SO - TARGET: X=#B, Y=#R, R=C
    GOTO 320
    ELSE X=#B, Y=0, R=0
320 STORE (X,Y,R)
    B = B + 1
    IF B < 21 THEN 220
    ELSE INDICATE DATA READY
    OUTPUT DATA TO CPU (X,Y,R)
    GOTO 200.
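The range arithmetic used with this listing reduces to a pair of one-line computations. A sketch in C, assuming the count has been read from port 103 (names illustrative):

    /* Each 20 nanosecond count is 6 m of round trip path, i.e. 3 m of
       one-way range: (count / 2) * 6 = count * 3. */
    double pal_range_m(unsigned count)
    {
        return (double)count * 3.0;
    }

    /* Closure rate dR/dt from two successive scans; positive while
       the target range is decreasing. */
    double pal_closure_mps(double r_prev, double r_now, double dt_s)
    {
        return (r_prev - r_now) / dt_s;
    }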
CW ARRAY LIDAR
The CAL sensor subsystem 50 is described with reference to Fig. 5 and Fig. 6. CAL 50 includes a CW diode laser array 51, the same PIN diode detector array 42 previously described, a tone range processor 54, the same wide field receiver 45 previously described and a CAL microcomputer 53. In order to minimize the size and weight of the total DROID system 10, subsystem hardware is shared where practical. The CW laser diode array 51 is optically equivalent to the transmitter array 41 of the PAL system 40, in that twenty vertically oriented beams 81 are propagated over the 20° by 20° field 90 as seen in Fig. 5. The difference lies in the fact that the laser diodes 112 employed in CAL 50 are of the continuously emitting type rather than the pulsed type described for the PAL sensor 40. Continuous emission is required for the high precision tone range measurement technique employed in the CAL sensor 50. CW diodes 112 are characterized by relatively low output power and, therefore, short operating range. CAL diodes 112, when active, are driven with a CW current of about 0.1 ampere, resulting in an output power of about 0.010 watt. The diodes 112 are amplitude modulated by varying the drive current in any desired waveform. A 3.0 megahertz oscillator 113 provides a modulation tone (not shown), which is imposed on the drive current of each diode 112 by modulation circuitry 114. As in the PAL sensor 40, the CW diodes 112 are selected (one at a time) for transmission by the CAL microcomputer 53 by writing a digital word to output port 115. Each diode 112 radiates continuously for the entire period that it is selected. A 50 Hz field scan rate results in a minimum 1.0 millisecond ON time for each diode 112 when selected.
The receiver 42 operates exactly as described for the PAL sensor 40, with the exception that the lower signal strength of the CW diodes 112 reduces the maximum operating range to about 64 meters. The orthogonal relation between the vertically elongated transmit beams 81 and the horizontally elongated detector FOVs 85 again allows the CAL microcomputer 53 to resolve the target relative angular direction to +/- 1° in both azimuth and elevation. However, at ranges under about 15 meters, the CAL sensor 50 must resolve multiple targets, because the subtense of the three retroreflector 16 pattern exceeds the 1° sensor field elements (see Fig. 5). This is accomplished in a straightforward manner by the procedure previously described. Each time a beam 81 is transmitted, the CAL microcomputer 53 tests the receiver bus 98 for target returns 91. If no returns 91 are detected, operation is sequenced to the next beam. If one or more receiver latches 97 are high, then the active beam number and the active receiver line numbers are stored for later transfer to the Master CPU 70, the receiver latches 97 are cleared by toggling line 95 and operation is sequenced to the next beam. Scanning the field 90 in this manner provides the CPU 70 with continuously updated azimuth and elevation coordinates for each of the target retroreflectors 16. The coordinates on sequential scans are processed to compute the angular direction of the docking port 18, the target roll angle and the roll rate. While the CAL sensor 50 tracks the target vehicle 21, it must also precisely range each of the three retroreflectors 16 in order to determine the target attitude (or the orientation of the target Z axis). The goal of the CAL sensor 50 is to provide the information necessary to maneuver the host vehicle 13 to a position and orientation aligning the docking probe 12 axis 100 with that of the target docking port 18 for the final approach. Tone ranging techniques which are known in the prior art are employed to measure the range from the sensor 50 to each reflector 16 to a resolution of about +/- 3.0 centimeters. The tone range processor 54 compares the phase of the 3.0 MHz modulation transmit tone 113 to the 3.0 MHz tone received from each reflector 16 to a resolution of one part in 3,333 to determine the propagation delay and, thereby, the range. When the CAL microcomputer 53 detects an active latch return line 98, it allows the output beam to remain on the target vehicle 21 for sufficient time for the tone range processor 54 to accurately measure the phase relation (not shown). Heterodyne techniques are employed to beat the 3.0 MHz signals against a 2.999 MHz reference 117, so that lower frequency (1.0 kHz) signals 119, 140 can be processed in a phase comparator circuit 118. The delay period between the zero crossing of the transmit signal 119 and that of the receive signal 140 corresponds to the reflector 16 range. A digital value representing the phase delay is read by the CAL microcomputer 53 from the comparator 118 by means of a data bus 116. At the end
of each field 90 scan, three sets of target coordinates and phase delay values (one set for each retroreflector 16) are transferred to the CPU 70 via data bus 71. The CPU 70 processes the data to determine the attitude of the target and transfers via bus 19 to the host GNC computer 14 the guidance commands 5 required to maneuver the
host 13 onto the target Z axis 100 and to point the docking probe 12 directly at the docking port 18.
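The phase-to-range conversion can be made concrete in a short sketch. The heterodyne stage preserves the fractional phase of the 3.0 MHz tone on the 1.0 kHz beat, where it appears as a zero crossing delay; a 3.0 MHz tone gives an unambiguous one-way interval of 50 meters, and one part in 3,333 of that interval is about 1.5 centimeters, consistent with the +/- 3.0 centimeter figure quoted above. Constants and names below are illustrative:

    #include <math.h>

    #define C_MPS   3.0e8    /* speed of light          */
    #define F_TONE  3.0e6    /* modulation tone, Hz     */
    #define F_BEAT  1.0e3    /* heterodyne beat, Hz     */

    /* One-way range from the measured zero crossing delay of the
       1.0 kHz beat signals; valid for ranges under 50 m. */
    double cal_range_m(double beat_delay_s)
    {
        double phase_frac   = fmod(beat_delay_s * F_BEAT, 1.0);
        double round_trip_s = phase_frac / F_TONE;
        return 0.5 * round_trip_s * C_MPS;
    }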
At a range of about 5 meters, the CPU 70 processes sequential scans to accurately measure closure rate and the roll rate of the target. Periodic -Z commands are generated to slow the rate of
closure to about 0.1 meter per second. In addition, roll commands are generated to spin the host 13 in a manner to zero the relative roll rate between the target vehicle 21 and the chaser 13, so that the final hard docking maneuver can be performed.
Once the host 13 is properly oriented and the range is under 5 meters, the Master CPU 70 activates the CTL subsystem 60 for diagnostic testing. If the CTL 60 fails to operate properly, a
"bailout" command is transferred via data bus 19 to the GNC computer 14, and the mission is terminated. If the CTL 60 passes diagnostic testing, it takes over the target sensing operation at a range of 3 meters, and the CAL subsystem 50 is deactivated. Since the CAL subsystem 50 is controlled by a microcomputer 53, a key component is the microcomputer firmware that executes the operation described above. The operational firmware is given below in a "pseudo code" listing, where each line represents a machine language program to execute the indicated function:
MULTIPLE RETRO TRACK AND RANGE
10  INITIALIZE SCAN
    SET B = 1
30  ENABLE BEAM #B
    CLEAR ARRAY (AR) RECEIVER LATCHES
    READ RECEIVER BUS (R)
    TEST #R > 0
    IF #R = 0 THEN X = #B, Y = 0
    IF MULTIPLE RETURNS THEN X = #B, Y = 255
    IF ONE RETURN THEN X = #B, Y = #R
    IF Y = 0 OR Y = 255 THEN R = 0 GOTO 120
    GOTO 200
120 DISABLE BEAM #B
    STORE (X,Y,R)
    B = B + 1
    IF B < 21 THEN 30
    ELSE INDICATE DATA READY
    OUTPUT DATA TO CPU (X,Y,R)
    GOTO 10
TONE RANGE PROCESSING
200 ENABLE PHASE PROCESSOR
    DELAY LOOP
    READ PHASE COUNT (R)
    GOTO 120.
CCD TV LIDAR
The CTL sensor subsystem 60, which operates over target ranges from 3 meters to 20 centimeters, is described in greater detail with reference to Fig. 7, Fig. 8 and Fig. 9. CTL 60 comprises a CCD solid state TV camera 61, both the pulsed 41 and CW 51 laser arrays previously described, four narrow beam diode lasers 62, a video processor 64 and a microcomputer controller 63. The two laser arrays 41, 51 are enabled with all elements 46, 112 active simultaneously to illuminate the target vehicle 21. The narrow beam lasers 62, which designate bright spots 135 on the docking plate 17, are used in the last meter of closure to indicate target range and attitude.
A key element of the CTL sensor 60 is the CCD camera 61, which comprises a 491 X 491 array of active elements 120 highly sensitive to the 905 nanometer source radiation 91. The camera 61 is a high resolution sensor capable of imaging the docking plate 17 and generating a stream of digital data on data bus 121, one byte for each of the 241,081 pixels in the array 120, thirty times per second. The data is continuously fed to a video processor 64, which analyzes selected data to determine target angular direction, range and attitude. Target data is periodically read by the CTL microcomputer 63 and transferred to the CPU 70, so that guidance commands 5 can be generated to accurately maneuver the host probe 12 into the target port 18 for hard docking.
The CTL subsystem 60 operates in two range dependent modes. From a range of 3 meters to a range of one meter, the system 60 processes the shape of the docking plate 17 in the CCD image (see Fig. 9). When the CTL 60 is initially activated, the CTL microcomputer 63 enables both laser diode arrays 41, 51 with all elements active and each diode in the pulsed array 41 operating at 10,000 pulses per second. Therefore, the docking plate 17 is illuminated with 300 milliwatts of continuous laser light in a 20° by 20° beam. The resulting image 130 (see Fig. 8) is that of a white square surface 122 on a black background subtending a black crosshair pattern 126.
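This figure is consistent with the component powers already given: the twenty pulsed diodes average 20 x 5 watts x (100 nanoseconds x 10,000 pulses per second) = 100 milliwatts, and the twenty CW diodes contribute 20 x 0.010 watt = 200 milliwatts, for a combined 300 milliwatts.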
The retroreflective bead material covering the majority of the docking plate 17 area 122 returns the source radiation 91 toward the CCD camera 61 in roughly a 10° diverging beam. As a result, the irradiance at the camera input 123, at the maximum range of 3 meters, is about 20 db above the camera's 61 minimum discernable signal with an 11 millimeter, F1.4 lens 123 and a 20° field-of-view. The fact that the irradiance is excessive allows the F-stop 124 of the camera 61 to be closed down to 1/32, while maintaining good image 130 contrast. The result is that of a pinhole camera with a very long depth of focus. The combination of intense laser illumination 81, the retroreflective docking plate surface 122 and the small camera aperture 124 represents a key feature of the present invention, since no moving mechanisms are required to maintain image 130 exposure or focus over the operating distances from 3 meters to 20 centimeters. The camera optics 123 look through the same infrared filter window 22 previously described. The filter 22 passes only a narrow band of wavelengths about the source line (905 nanometers), thereby significantly attenuating broadband sources of interference (not shown), such as the sun.
SHAPE MODE
In the first operating mode, the video processor 64 employs optical contrast video tracking techniques, known in the prior art, to acquire the docking plate image 130 and to locate the plate centroid 125 (the junction of the two black crosshair pointers 126 seen in Fig. 8). The camera 61 images a 20° by 20° scene on the 491 X 491 array of detector elements 120. Therefore, each picture element (pixel) 120 covers a 0.7 milliradian field. A pixel coordinate system, where 0 < X < 491 and 0 < Y < 491, is used to indicate the relative angular direction in azimuth and elevation, respectively, of any discrete pixel in the scene. The video processor 64 locates the crosshair centroid 125 and outputs the image pixel coordinates (X,Y) to the CTL microcomputer 63. The CTL 60 then transfers the target coordinate data to the CPU 70, so that pitch and yaw commands can be generated to keep the CCD camera 61 pointed directly at the plate centroid 125. This process is periodically repeated, because other video processing schemes utilized in the CTL subsystem rely on the plate image 130 remaining in the center of the picture.
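The pixel-to-angle conversion implied here is a single scale factor: a 20 degree (0.349 radian) field over 491 pixels gives the stated 0.7 milliradian per pixel. A minimal sketch, measuring offsets from the image center:

    #define PIXELS     491
    #define FIELD_RAD  0.349                  /* 20 degrees        */
    #define RAD_PER_PX (FIELD_RAD / PIXELS)   /* about 0.00071 rad */

    /* Angular offset of pixel (x, y) from the camera boresight. */
    void pixel_to_angles(int x, int y, double *az_rad, double *el_rad)
    {
        *az_rad = (x - PIXELS / 2) * RAD_PER_PX;
        *el_rad = (y - PIXELS / 2) * RAD_PER_PX;
    }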
With the docking plate 17 centered in the image 130, the video processor 64 analyzes each horizontal line (not shown) in the scene, locates the thicker crosshair sector 127 and computes the relative target roll angle and roll rate. In a like manner, roll data is transferred to the CPU 70, so that it can output host 13 roll commands necessary to orient the image 130 as seen in Fig. 8 and to zero the roll rate.
With both the camera 61 pointing angle and the host 13 roll angle properly oriented, the CTL microcomputer 63 next analyzes the shape (see Fig. 9) of the docking plate image 130 to determine the attitude of the target vehicle 21. If the target attitude is perfectly matched with that of the chaser 13, then the perimeter 128 of the plate image 130 is perfectly square. If, on the other hand, the target is pitched up in the +Y direction, for example, then the image 130 is that of a trapezoid 129 with the parallel lines 141, 142 horizontal and the shorter side 141 on the top of the image 130. The relative length of the top 141 and bottom 142 lines, the ratio of the height to the average width and the angles of the sides off the vertical are all mathematically related to the magnitude of the pitch angle. In addition, each of these parameters can be measured by the video processor 64 by locating the black to white transitions in the image 130 and counting image pixels to define the perimeter lines 128. The video processor 64 generates pixel count data corresponding to each parameter, which are passed on to the CPU 70 via the CTL microcomputer 63 and data buses 131, 71. The CPU 70 performs the necessary computations to determine target attitude and to generate the guidance commands 5 required to move the host 13 directly onto the target Z axis, thereby aligning the probe 12 with the port 18.
An alternate technique which is less math intensive is illustrated in Fig. 9. When the docking plate image 130 is initially acquired, it is likely that the target attitude is off in both pitch and yaw. The result is a complex four sided image 132, with opposite sides non-parallel but converging along the X and Y axes of the image. In this method, the video processor 64 simply determines the directions of convergence (either +X or -X and +Y or -Y) and passes this information through the CTL microcomputer 63 to the CPU 70. The CPU 70 then maneuvers the host 13 in the indicated +/-X and +/-Y directions until the CTL 60 indicates that the image shape 130 is square and the attitude is corrected.
With the relative attitude adjusted, target range is measured in the video processor 64 by counting the number of pixels across the plate image 130. The count, which is inversely related to target range, is periodically transferred to the CPU 70 which computes range and range rate.
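Under a small-angle approximation the relation is direct: the plate subtends (width in pixels) x 0.7 milliradians, so range is the physical plate width divided by that subtense. A sketch, with the physical plate width left as a parameter since it is not specified in this section:

    #define RAD_PER_PX 0.00071   /* per-pixel subtense from above */

    /* Range from the pixel width of the plate image; plate_w_m is the
       physical plate width, an input assumed known to the CPU. */
    double ctl_shape_range_m(int width_px, double plate_w_m)
    {
        return plate_w_m / ((double)width_px * RAD_PER_PX);
    }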
Target vehicle 21 tracking resolution (the processor's 64 ability to locate the centroid pixel 125) is defined by the field coverage of each pixel, i.e. +/- 0.7 milliradians in azimuth and elevation. Roll measurement resolution is a function of the image 130 size. At a range of one meter, the roll angle can be measured with a resolution of +/- 0.1 degree. Attitude measurement resolution depends on the method selected, the magnitude of the attitude offset angles and the target range. Assuming the method which processes the relative lengths of opposite sides to determine attitude, the resolution is about +/- 0.5 degrees at a range of one meter. Range measurement resolution improves as the range closes. At a range of one meter, the measurement resolution is about +/- 2.0 millimeters. Since the CTL 60 is controlled by a microcomputer 63, the microcomputer firmware is a key component of the system. The CTL microcomputer 63 firmware is described by means of the following "pseudo code" listing, where each line represents a machine language program which executes the indicated function:
SHAPE PROCESSING
    ACTIVATE ILLUMINATOR
    INITIALIZE VIDEO PROCESSOR
30  GET CROSSHAIR COORDINATES (X,Y)
    OUTPUT (X,Y) TO CPU
50  GET TOP LINE ROLL ANGLE (r)
    IF r = 0 THEN 90
    OUTPUT ROLL STEP TO CPU
    GOTO 50
90  GET X CONVERGENCE SIGN +/-
    IF 0 THEN 130
    OUTPUT +/- X STEP TO CPU
    GOTO 90
130 GET Y CONVERGENCE SIGN +/-
    IF 0 THEN 160
    OUTPUT +/- Y STEP TO CPU
    GOTO 130
160 GET RANGE/SIZE PIXEL COUNT (R)
    OUTPUT R TO CPU
    IF RANGE = 1 METER THEN 200
    ELSE 30.
LASER SPOT PROCESSING
At ranges below one meter, the docking plate image 130 size exceeds the FOV of the CCD camera 61. A laser spot image 135 processing method, employed in the final stage of hard docking, is described with reference to Fig. 7 and Fig. 8. Four narrow beam, GaAs diode lasers 62 are arranged about the CCD camera 61; one on the +Y axis, one on the -Y axis, one on the +X axis and one on the -X axis. Each laser 62 is physically offset by 10 centimeters (from the camera Z axis) and pointed toward the camera axis at a convergence angle of 16 degrees. Therefore, at a range of 35 centimeters, all four laser beams 136 intersect the camera Z axis (not shown). This geometry results in four bright spots 135 imaged on the axes of the docking plate crosshairs 126 (see Fig. 8). The laser spot 135 intensity is roughly 30 times the intensity of the broadbeam laser 81 illuminated image plate 17, whether the spot designates the retroreflective material 122 or the non-retroreflective surface 126. The resulting video signal contrast allows the video processor 64 to locate the bright spots 135 in the image 130 by simple threshold detection. Due to the 16 degree convergence angles, as the target range closes, each spot 135 moves through the image along its associated axis toward the pattern centroid 125. At a range of 35 centimeters, all four spots 135 overlap at the centroid 125. As the range continues to close, the spots 135 cross through the centroid 125 and move to the opposite sides of the image 130. At the minimum range of 20 centimeters, the hard docking range, the spots 135 fall on the outer perimeter 128 of the CCD camera 61 FOV 90.
The general technique employed by the video processor 64 in this mode is to count the number of pixels from the centroid 125 (along each axis) to each laser spot 135. The four sets of pixel counts are mathematically related to the range from the camera 61 to each spot 135 designated on the docking plate 17. Knowing the range to four points on the target surface provides a measure of target range and attitude.
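The underlying geometry can be written out explicitly. Each designator sits d = 10 centimeters off the camera axis and converges at 16 degrees, so at target range R its spot lies at lateral offset d - R tan(16°) and appears at angle atan((d - R tan(16°))/R) off the boresight, crossing zero at d/tan(16°), about 35 centimeters, exactly as described. Inverting this relation gives range directly from a spot's measured angular (pixel) position. A sketch with illustrative names:

    #include <math.h>

    #define D_OFFSET_M 0.10      /* laser offset from camera axis */
    #define CONV_RAD   0.2793    /* 16 degree convergence angle   */

    /* Range to the plate from one spot's angular offset off the
       boresight (signed; negative after the spot crosses the
       centroid). Follows from tan(angle) = d/R - tan(conv). */
    double spot_range_m(double spot_angle_rad)
    {
        return D_OFFSET_M / (tan(spot_angle_rad) + tan(CONV_RAD));
    }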
Image plate centroid 125 tracking and roll angle adjustments are executed as previously described. The geometry of the crosshair pattern 126 provides a centroid image 125 (the junction of the two crosshair pointers in Fig. 8) with zero size. Therefore, the centroid 125 pixel location can be sensed by the video processor 64 even at zero range.
As the target range closes through the final meter, the video processor periodically outputs the pixel coordinates (X,Y) of the centroid 125, the relative roll angle (r) and the four sets of laser spot pixel counts to the CTL microcomputer 63 via data bus 131. The CTL microcomputer 63 passes the target data to the Master CPU 70, so that guidance commands 5 can be generated to maintain alignment of the docking probe 12 with the docking port 18. In addition, the final target range and range rate are computed by averaging the four spot ranges. At a range of 35 centimeters, when the laser spots converge, the probe 12 enters the port 18. At a range of 20 centimeters the probe end (not shown) makes contact with the target port 18, and a hard docking indicator switch 145 is closed. The switch 145 closure is sensed by the CTL microcomputer 63 via line 137. This information is passed to the CPU 70, which deactivates all systems and terminates the mission.
Measurement resolution in the spot 135 processing method is a function of the magnitude of spot movement in the image as the range and attitude change. Both attitude and range measurement resolution are nonlinear functions of target range and attitude. However, at a range of 50 centimeters, attitude resolution is about +/- 1.2 degrees and range resolution is about +/- 3.0 millimeters. At a range of 25 centimeters, attitude resolution is +/- 0.2 degree and range resolution is about +/- 0.3 millimeter.
The CTL microcomputer 63 firmware for laser spot 135 processing, which is an integral component of the subsystem 60, is described in the following "pseudo code" listing, wherein each line represents a machine language program which executes the indicated function:
LASER SPOT PROCESSING
200 GET CENTROID COORDINATES (X,Y)
    OUTPUT (X,Y) TO CPU
220 GET ROLL ANGLE (r)
    IF r = 0 THEN 260
    OUTPUT ROLL STEP TO CPU
    GOTO 220
260 GET LASER SPOT COUNT (C)
    IF C < 4 THEN 200
    GET +X PIXEL COUNT (+PX)
    GET -X PIXEL COUNT (-PX)
    GET +Y PIXEL COUNT (+PY)
    GET -Y PIXEL COUNT (-PY)
    OUTPUT (+PX,-PX,+PY,-PY) TO CPU
    IF RANGE > 25 CENTIMETERS THEN 200
330 TEST DOCK INDICATOR
    IF HARD DOCK THEN 360
    GOTO 330
360 OUTPUT DOCK INDICATOR TO CPU
    END.
MASTER CPU
The DROID Master CPU 70 is a more powerful sixteen bit microprocessor with sufficient mathematics capability to process the target data supplied by the PAL microcomputer 43, the CAL microcomputer 53 and the CTL microcomputer 63 to generate the guidance commands 5 required to maneuver a space vehicle 13 through the last few hundred meters of a hard docking mission. Since the DROID system 10 is intended for use in various types of vehicles 13, the Master CPU 70 memory is that of a plug in programmable read only memory (PROM) (not shown). Different space vehicles 13 can be expected to have different guidance and response characteristics. Therefore, the Master CPU 70 PROM can be customized for each vehicle 13 to adapt the guidance commands 5 generated as necessary. Space vehicles 13 which must execute automated docking missions can be expected to have groups of control thrusters 15 manipulated by a GNC computer 14 effective to maneuver the craft in pitch, yaw, roll, X, Y and Z. In addition, thruster 15 burn can be controlled in a manner to control the rate of the maneuver. Therefore, the Master CPU 70 analyzes the target data and generates the following types of commands:
(1) +/- PITCH
(2) d(PITCH)/dt
(3) +/- YAW
(4) d(YAW)/dt
(5) +/- ROLL
(6) d(ROLL)/dt
(7) +/- X
(8) dX/dt
(9) +/- Y
(10) dY/dt
(11) +/- Z
(12) dZ/dt.
The Master CPU 70 establishes two-way communication with the GNC computer 14 by means of data bus 19. In addition to the guidance commands 5 which must be transferred, the Master CPU 70 receives an enable command 6 from the GNC 14 which activates the DROID system 10 and outputs a hard dock command 9 which terminates the mission. As previously described, the CPU also outputs a "bailout" command 7 any time a sensor subsystem fails and the docking mission is in jeopardy. The "bailout" command 7 initiates a preprogrammed -Z maneuver designed to avoid a catastrophic collision with the target vehicle 21.
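The command interface can be pictured as a small record passed over bus 19. A sketch covering the twelve command types above plus the discrete enable, bailout and hard dock signals; field names are illustrative, not the actual bus format:

    typedef enum {
        CMD_PITCH, CMD_PITCH_RATE, CMD_YAW,  CMD_YAW_RATE,
        CMD_ROLL,  CMD_ROLL_RATE,  CMD_X,    CMD_X_RATE,
        CMD_Y,     CMD_Y_RATE,     CMD_Z,    CMD_Z_RATE,
        CMD_ENABLE, CMD_BAILOUT, CMD_HARD_DOCK
    } cmd_type_t;

    typedef struct {
        cmd_type_t type;
        double     value;   /* signed step, or rate for the d()/dt types */
    } guidance_cmd_t;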
The DROID system 10 can be labeled an intelligent sensor, since the Master CPU 70 acts to reconfigure the system on the basis of the sensed environment and changing mission requirements. In this process, the CPU 70 performs two important tasks. One is to monitor the operating condition of each sensor and the target data to determine when it may be unsafe to proceed. The other is to process target data to generate guidance commands 5. The Master CPU 70 firmware, which is an integral component of the system 10, is described in the following "pseudo code" listing, where each line represents a machine language or higher level program to execute the indicated function. Associated guidance commands 5 are also given:
INITIALIZATION
10  VERIFY ENABLE COMMAND
    INITIALIZE DATA BASE
    EXECUTE SELF DIAGNOSTICS
    EXECUTE PAL DIAGNOSTICS
    EXECUTE CAL DIAGNOSTICS
    EXECUTE CTL DIAGNOSTICS
    ON FAILURE GOTO 2000
    ELSE GOTO 80
TARGET ACQUISITION
80  ACTIVATE PAL SEARCH
    EXECUTE SPIRAL MANEUVER (P,Y)
    TEST TARGET DATA
    IF NO TARGET THEN 80
    ELSE GOTO 130
PAL TARGET TRACK
130 ACTIVATE PAL TRACK
    TEST TARGET DATA
    NULL TARGET ANGLES (P,Y)
    CLOSE TARGET (+Z)
    COMPUTE RANGE, RRATE
    SLOW CLOSURE TO 30 M/S (-Z)
190 TEST TARGET DATA
    NULL TARGET ANGLES (P,Y)
    COMPUTE RANGE, RRATE
    IF RANGE > 200M THEN 190
    SLOW CLOSURE TO 10 M/S (-Z)
230 TEST TARGET DATA
    NULL TARGET ANGLES (P,Y)
    COMPUTE RANGE, RRATE
    IF RANGE > 60M THEN 230
    SLOW CLOSURE TO 1 M/S (-Z)
    NULL TARGET ANGLES (P,Y)
    ACTIVATE CAL
    EXECUTE CAL DIAGNOSTICS
    ON FAILURE GOTO 2000
320 TEST TARGET DATA
    COMPUTE RANGE, RRATE
    IF RANGE > 35M THEN 320
    ENTER PAL/CAL HANDOVER
CAL TARGET TRACK
    DEACTIVATE PAL
370 TEST TARGET DATA
    COMPUTE CENTROID POSITION
    NULL TARGET ANGLES (P,Y)
    COMPUTE RANGE, RRATE
    IF RANGE > 20M THEN 370
420 TEST TARGET DATA
    COMPUTE RANGE, RRATE
    COMPUTE CENTROID POSITION
    COMPUTE ATTITUDE
    COMPUTE ROLL
    NULL Z AXIS OFFSET (X,Y)
    NULL TARGET ANGLES (P,Y)
    NULL ROLL ANGLE (r)
    IF RANGE > 5M THEN 420
    SLOW CLOSURE TO 0.1 M/S (-Z)
    TEST TARGET DATA
    COMPUTE RANGE, RRATE
    COMPUTE CENTROID POSITION
    COMPUTE ATTITUDE
    COMPUTE ROLL
    NULL TARGET ANGLES (P,Y)
    NULL Z AXIS OFFSET (X,Y)
    NULL ROLL ANGLE (r)
    ACTIVATE CTL
    EXECUTE CTL DIAGNOSTICS
    ON FAILURE GOTO 2000
590 TEST TARGET DATA
    COMPUTE RANGE, RRATE
    IF RANGE > 3M THEN 590
    ENTER CAL/CTL HANDOVER
CTL SHAPE TRACK
    DEACTIVATE CAL
640 GET CROSSHAIR COORDINATES
    NULL TARGET ANGLES (P,Y)
    GET SHAPE DATA
    COMPUTE ROLL ANGLE
    IF ROLL = 0 THEN 710
    NULL ROLL ANGLE (r)
    GOTO 640
710 GET CROSSHAIR COORDINATES
    NULL TARGET ANGLES (P,Y)
    GET SHAPE DATA
    TEST X CONVERGENCE (+,-,0)
    IF CONV = 0 THEN 780
    NULL X CONVERGENCE (X)
    GOTO 710
780 GET CROSSHAIR COORDINATES
    NULL TARGET ANGLES (P,Y)
    GET SHAPE DATA
    TEST Y CONVERGENCE (+,-,0)
    IF CONV = 0 THEN 850
    NULL Y CONVERGENCE (Y)
    GOTO 780
850 GET SHAPE DATA
    COMPUTE RANGE, RRATE
    IF RANGE > 1M THEN 640
CTL SPOT TRACK
880 GET TARGET DATA
    NULL TARGET ANGLES (P,Y)
    NULL ROLL ANGLE (r)
    GET SPOT DATA
    XCOUNT = (+XCOUNT)-(-XCOUNT)
    IF XCOUNT > 0 STEP RIGHT (+X)
    IF XCOUNT < 0 STEP LEFT (-X)
    YCOUNT = (+YCOUNT)-(-YCOUNT)
    IF YCOUNT > 0 STEP UP (+Y)
    IF YCOUNT < 0 STEP DOWN (-Y)
    GET SPOT DATA
    IF XCOUNT = YCOUNT = 0 THEN 1010
1000 GOTO 880
1010 GET SPOT DATA
    COMPUTE RANGE, RRATE
    TEST SPOT CONVERGENCE
    IF RANGE > 35 CM THEN 1010
    ACTIVATE HARD DOCK MECHANISMS
1060 TEST DOCK INDICATOR
    IF HARD DOCK THEN 3000
    GOTO 1060
BAILOUT
2000 OUTPUT BAILOUT COMMAND
    EXECUTE MANEUVER (P,Y,-Z)
HARD DOCK
3000 OUTPUT HARD DOCK COMMAND
    TERMINATE SYSTEMS.
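The mission sequencing of the listing above can be compressed into a single range-gated state machine. A hedged C sketch, with names illustrative and thresholds taken from the listing:

    typedef enum { INIT, ACQUIRE, PAL_TRACK, CAL_TRACK,
                   CTL_SHAPE, CTL_SPOT, BAILOUT, HARD_DOCK } phase_t;

    phase_t next_phase(phase_t p, double range_m,
                       int diag_ok, int acquired, int docked)
    {
        if (!diag_ok) return BAILOUT;            /* line 2000 path */
        switch (p) {
        case INIT:      return ACQUIRE;
        case ACQUIRE:   return acquired ? PAL_TRACK : ACQUIRE;
        case PAL_TRACK: return (range_m <= 35.0) ? CAL_TRACK : p;
        case CAL_TRACK: return (range_m <= 3.0)  ? CTL_SHAPE : p;
        case CTL_SHAPE: return (range_m <= 1.0)  ? CTL_SPOT  : p;
        case CTL_SPOT:  return docked ? HARD_DOCK : p;
        default:        return p;
        }
    }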
TRANSCEIVER CONFIGURATIONS
With regard to the orthogonal transceiver configuration described for the PAL and CAL sensors, it should be noted that equivalent resolution can be obtained by various other configurations. Some of these (not shown) include two transmitter arrays of the type described oriented orthogonally to each other combined with a single element, wide field receiver; a two dimensional transmitter array combined with a single element, wide field receiver; a single element, broad beam transmitter combined with a two dimensional receiver array; and various other configurations. The orthogonal transceiver of the preferred embodiment carries the advantages of a low component count and high speed of operation. Although the present invention is particularly significant for the development of space, and it has been described primarily in the context of space applications, it is obvious to those skilled in the art that the ability to sense the direction, range and attitude of a target renders it useful in various military applications as well as in the robotics and security industries. Operations such as the airborne refueling of aircraft are manual by nature, and nighttime operations can be significantly aided by the sensors of the present invention. Properly arranging the retroreflector pattern on the fuel receiving basket allows the entire operation to be automated. In the security field, more powerful laser sources render the system suitable for covert operation as an area surveillance radar, solving the problem of undesired, omnidirectional emission associated with conventional radar. A key industrial application of the present invention resides in the field of robotics. Robot vision currently requires dual television cameras, moving pedestals and a great deal of video processing electronics to emulate depth perception. The ability of the present invention to indicate target direction, range and attitude without the need for moving parts provides a reliable approach to depth perception for robots. Activating the sensors of the present invention in alternating, sequential fashion allows a robot to locate a target at long range, to maneuver to the target while avoiding close range obstacles and to sense the relative attitude of the target at very close ranges. Such capabilities allow a robot to perform many kinds of proximity manipulations. While there have been described what are at present considered to be the preferred embodiments of this invention, it will be obvious to those skilled in the art that various changes and modifications may be made therein without departing from the invention, and it is aimed, therefore, in the appended claims to cover all such changes and modifications as fall within the true spirit and scope of the invention.


CLAIMS:
1. An optoelectronic system for docking a host spacecraft to a target spacecraft having a docking port comprising: first optical light sensor and illuminating means secured on the host spacecraft effective to establish a source of light radiation waveforms; second optical light returning means secured on the target spacecraft and effective to return incident light radiation in the direction of said source of light radiation and configured in a predetermined geometry with respect to the docking port of the target spacecraft; said first optical means also being adapted to illuminate said second optical means within an optically effective field of view having a central axis and to detect and process returned light illumination waveforms to indicate the direction, range and attitude of the target spacecraft; and data processing means associated with said first optical means effective to analyze target information to produce guidance commands for maneuvering the host spacecraft to dock with the target spacecraft.
2. A system according to Claim 1, wherein said first and second optical means are solely composed of essentially nonmoveable components.
3. A system according to Claim 1, wherein said first optical means comprises first, second and third optoelectronic sensors and said second optical means comprises a pattern of three retroreflectors separated and arranged in an isosceles triangle about the docking port, and a retroreflective image plate on and juxtaposed to said port and having a preselected perimetric shape subtending a contrasting crosshair pattern of nonretroreflective material effective to identify the image plate centroid.
4. A system according to Claim 3, wherein said first optoelectronic sensor is adapted to operably scan the target over ranges from approximately one thousand meters to approximately thirty meters and is effective to illuminate said second optical means for detecting the returned light and sensing the angular direction and range of the target spacecraft and to transfer the information to said data processing means.
5. A system according to Claim 4, wherein said first sensor is composed of a first and a second set of elements arranged in an array, each element of the first set comprising a pulsed laser transmitter and each element of the second set comprising a light radiation receiver being sensitive solely to a predetermined narrow band of light radiation.
6. A system according to Claim 5, wherein said first sensor includes a single element light radiation receiver effective to view the entire optically effective field of view of the first optical means; a first range processing circuit effective to measure the round trip propagation time of the light emanating from said first optical means and returned from said second optical means; and a first microcomputer effective to activate said transmitter elements, to monitor said receiver elements and to process data from the sensor elements to determine target direction and range.
7. A system according to Claim 6, wherein each said transmitter comprises a combination of a laser pulse forming network and a pulsed laser diode, and wherein each transmitter is responsive to be selectively triggered by said first microcomputer to establish laser beam emanation for the duration of said pulse.
8. A system according to Claim 5, wherein said first set of elements are effective to propagate independent beams of laser light through a system of lenses to effect a cooperative beam pattern, wherein substantially one degree horizontal by twenty degrees vertical beam cross-sections of each element are effective to illuminate mutually exclusive but juxtaposed fields in space, and wherein the combination of all said beam cross-sections is arranged to cover the entire optically effective field of view of the first optical means.
9. A system according to Claim 6, wherein each said receiver is composed of a combination of light detector, signal amplifier and digital latching circuit with the binary state of each said latch effective to represent the presence or absence of a target return, and wherein the state of each said latch being selectively readable and clearable by the first microcomputer.
10. A system according to Claim 5, wherein said second set of elements are arranged to view space through a system of lenses to effect a cooperative pattern of viewing fields, wherein substantially twenty degree horizontal by one degree vertical field cross-sections of each element are effective to view mutually exclusive but juxtaposed fields in space, and wherein the combination of all fields is arranged to cover the entire optically effective field of the first optical means.
11. A system according to Claim 6, wherein said single element receiver comprises a light detector and lens system arranged to sense the entire optically effective field of the first optical means, a signal amplifier and a digital latching circuit with the binary state of said latch effective to represent the presence or absence of a target return, and wherein the latch output state is coupled both to the first microcomputer and the first range processing circuit.
12. A system according to Claim 6, wherein said first range processing circuit comprises a high frequency clock and a digital counter circuit effective to count clock cycles occurring between laser pulse transmission and laser pulse reception and to transfer said count data to said first microcomputer.
13. A system according to Claim 12, wherein said first microcomputer is effective to selectively and sequentially trigger each transmitter element, execute a delay, selectively read and clear the binary state of each said receiver element and store said count data and data identifying the transmitter and receiver elements associated with each target return.
14. A system according to Claim 13, wherein said first microcomputer is effective to transfer target data to said data processing means.
15. A system according to Claim 3, wherein said second optoelectronic sensor is adapted to operably scan the target over ranges of approximately fifty meters to approximately three meters, to illuminate said retroreflectors of second optical means, to detect the returned light radiation, to sense the direction and range of each of said retroreflectors and to transfer target information to said data processing means.
16. A system according to Claim 15, wherein said second sensor is composed of a first and second set of elements arranged in an array, each element of the first set comprising a continuous laser transmitter having the amplitude of its output light modulated in the waveform of a sine function; and each element of the second set comprising a light radiation receiver being sensitive solely to a narrow band of radiation of the wavelength of the transmitted light.
17. A system according to Claim 16, wherein said second sensor includes a single element light radiation sensor; a second range processing circuit effective to measure the round trip propagation time of said light emanating from said first optical means and returned by said second optical means; and a second microcomputer effective to activate said transmitter elements, monitor said receiver elements, and process data from the sensor elements to determine target direction, range and attitude.
18. A system according to Claim 17, wherein each said transmitter comprises a combination of an intensity modulation circuit and a continuously emitting diode laser, and wherein each said transmitter being selectively enabled by said second microcomputer to establish continuously modulated laser beam emanation for the duration of the enabling period.
19. A system according to Claim 16, wherein said first set of elements are effective to propagate independent beams of laser light through a system of lenses to effect a cooperative beam pattern, wherein substantially one degree horizontal by twenty degrees vertical beam cross-sections of each element are effective to illuminate mutually exclusive but juxtaposed fields in space, and wherein the combination of all said beam cross-sections is arranged to cover the entire optically effective field of view of the first optical means.
20. A system according to Claim 17, wherein each said receiver forms a combination of light detector, signal amplifier and digital latch circuit with the binary state of each said latch effective to represent the presence or absence of a target return, and wherein the state of each said latch being selectively readable and clearable by the second microcomputer.
21. A system according to Claim 16, wherein said second set of elements are arranged to view space through a system of lenses to effect a cooperative pattern of viewing fields, wherein substantially twenty degree horizontal by one degree vertical field cross-sections of each element are effective to view mutually exclusive but juxtaposed fields in space, and wherein the combination of all fields is arranged to cover the entire optically effective field of view of the first optical means.
22. A system according to Claim 17, wherein said single element receiver comprises a light detector and lens system arranged to sense the entire optically effective field of the first optical means and a signal amplifier, and wherein the output of said amplifier is coupled to said second range processing circuit.
23. A system according to Claim 17, wherein said second range processing circuit comprises a heterodyne phase comparator effective to measure the propagation phase delay between the transmitted modulation wave and the received modulation wave and to transfer the phase delay data to said second microcomputer.
24. A system according to Claim 23, wherein said second microcomputer is effective for: selectively enabling each said transmitter element, executing a delay, selectively reading and clearing the binary state of each said receiver element and storing said delay data and data identifying the transmitter and receiver elements associated with each target return.
25. A system according to Claim 24, wherein said second microcomputer is effective to transfer target data to said data processing means.
26. A system according to Claim 3, wherein said third optoelectronic sensor is adapted to operably scan the target over ranges of approximately five meters to essentially zero, to illuminate said retroreflective image plate of the second optical means, to image the returned light radiation, to sense the direction, attitude and range of said retroreflective image plate and to transfer target information to said data processing means.
27. A system according to Claim 26, wherein said third sensor comprises a continuously emitting diode laser array effective to illuminate said image plate; an arrangement of multiple narrow beam diode lasers having the same wavelength of said illumination lasers effective to designate predetermined spots on the image plate of the target spacecraft; a solid state imaging device having a field of view essentially equivalent to the optically effective field of view of said first optical means and a central optical axis and being sensitive solely to a narrow band of radiation of the wavelength of said illuminating lasers effective to generate two dimensional, time sequential image frames of said image plate scene; video processing means effective to generate pre-selected data for identifying the position, size and shape of said image plate perimeter and crosshair pattern; and a third microcomputer effective to process said data from said video processor to determine target spacecraft direction, range and attitude.
28. A system according to Claim 27, wherein all elements of said diode laser array are effective to emit continuously through an optical lens system to generate an illumination field equivalent to the optically effective field of view of the first optical means.
29. A system according to Claim 28, wherein said narrow beam lasers are composed of three or more continuously emitting lasers arranged offset from and symmetrically about said imaging device with each said beam being directed through the field of view of the imaging device toward a common point on said central axis effective to designate spots on the image plate, wherein the position of each said spot in said image has a known relation to the range from said imaging device to the spot designated on the target.
30. A system according to Claim 27, wherein said solid state imaging device comprises a charge coupled television camera having a two dimensional array of light sensitive pixel elements and a pinhole optical system effective to create time sequential frames of focused and properly exposed images of said image plate and to output a digital representation of each image pixel of each said frame to said video processing means.
31. A system according to Claim 27, wherein said video processing means comprises a pre-programmed digital system effective to identify the perimetric image of said image plate and said crosshair pattern, to output data relating the shape and orientation of said images, to output the relative image location of the image plate centroid and to output the relative image location of each of said bright spots with all of said outputs transferred to said third microcomputer.
32. A system according to Claim 27, wherein said third microcomputer is effective to process data from said video processing means to generate target information including direction, range and attitude and to transfer the information to said data processing system.
33. A system according to Claim 3, wherein said data processing system is effective to test the operational state of each of said sensors, to activate said first sensor to acquire said target spacecraft and to sequentially activate said second and third sensors, depending on target range, to track said target during the docking operation.
34. A system according to Claim 3, wherein said data processing system is effective to process information from said first, second and third sensors for generating guidance commands to maneuver said host spacecraft and for maintaining the central axis of said optically effective field of view of said first optical means directed at said docking port for the duration of the docking operation.
35. A system according to Claim 3, wherein said data processing system is effective to process information from said second and third sensors to generate guidance commands to maneuver said host spacecraft and to maintain the appropriate attitude relative to the attitude of said docking port, to dock with said target spacecraft.
36. An optoelectronic system for docking or engaging a host spacecraft with a spaceborne target body comprising: optoelectronically active system means secured on said host spacecraft effective to illuminate the target body and to detect and process returned illumination light waves to establish the direction, range and attitude of the target body; and data processing system means associated with said active system means effective to analyze said target body information to produce guidance commands to maneuver said host spacecraft to engage or dock with said body.
37. A system according to Claim 36, wherein said active system means is comprised of essentially nonmoveable parts, and is effective to operably scan said target body through an optically effective field of view of twenty degrees by twenty degrees, said field having a central optical axis.
38. A system according to Claim 37, wherein said active system means comprises first, second and third optoelectronic sensor means.
39. A system according to Claim 38, wherein said first optoelectronic sensor means is effective to operably scan the target body over ranges from approximately three hundred meters to approximately thirty meters, illuminate the target body, detect said returned illumination, sense the angular direction and range of the body and to transfer the acquired data to said data processing means.
40. A system according to Claim 39, wherein said first sensor means comprises a two dimensional array of laser transmitters combined with a beam forming optical means for target illumination; a single element receiver being sensitive only to a narrow band of light radiation of the wavelength of said transmitters for sensing said target light radiation returns; a first target range processing circuit effective to measure the round trip propagation time of said illumination; and a first microcomputer effective to selectively activate said transmitters and process data from said receiver and said range processor circuit to determine target direction and range.
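The first range processing circuit of Claim 40 measures round-trip propagation time, which maps to range by the usual pulsed time-of-flight relation R = c·t/2. A one-line sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Convert a measured round-trip propagation time to target range."""
    return C * round_trip_s / 2.0
```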
41. A system according to Claim 40, wherein each said transmitter comprises a combination of a laser pulse forming network and a pulsed diode laser.
42. A system according to Claim 40, wherein said beam forming optical means is effective to generate a two dimensional matrix of said transmitter light beams with each beam cross-section arranged juxtaposed but nonoverlapping with each adjacent beam cross-section, and wherein the combination of all beams is effective to cover the entire optically effective field of view of said active system.
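Because the beam cross-sections of Claim 42 tile the field of view without overlap, the index of the transmitter whose return trips the receiver latch directly encodes target direction. A sketch of that index-to-angle mapping, assuming for illustration a 16 x 16 matrix over the twenty degree square field (the claims do not fix the matrix size):

```python
def beam_direction(row: int, col: int, n: int = 16,
                   field_deg: float = 20.0):
    """Map a transmitter's (row, col) position in an n x n matrix of
    juxtaposed, non-overlapping beams to the azimuth/elevation (deg)
    of its beam centre within the square sensor field."""
    step = field_deg / n                        # angular width of one beam
    az = (col + 0.5) * step - field_deg / 2.0   # beam centre, azimuth
    el = (row + 0.5) * step - field_deg / 2.0   # beam centre, elevation
    return az, el
```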
43. A system according to Claim 40, wherein said single element receiver means comprises a combination of objective lens, light detector, signal amplifier and digital latch circuit, and wherein the binary state of said latch is effective to represent the presence or absence of a target return and is readable and clearable by said first microcomputer and readable by said range processing circuit.
44. A system according to Claim 43, wherein said objective lens is arranged to collect incident light over the entire optically effective field of view of said active system.
45. A system according to Claim 38, wherein said second optoelectronic sensor means is effective to: operably scan the target body within an approximate range of one to fifteen meters; illuminate the target; detect the returning light; sense the direction, range, and attitude of the target body and to transfer the acquired information to said data processing means.
46. A system according to Claim 45, wherein said second sensor means comprises a two dimensional array of laser transmitters and a beam forming optical means for target illumination; a single element receiver being sensitive only to a narrow band of light radiation of the wavelength of said transmitters for sensing said target light radiation returns; a second target range processing circuit effective to measure the round trip propagation time of said illumination; and a second microcomputer effective to selectively activate said transmitters and process data from said receiver and said second range processing circuit to determine target direction, range and attitude.
47. A system according to Claim 46, wherein each said transmitter comprises a combination of a laser intensity modulation circuit and a continuously emitting diode laser.
48. A system according to Claim 47, wherein the intensity of said laser beams is modulated in the waveform of a sine function.
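Sine modulation as in Claim 48 permits continuous-wave phase-shift ranging: the second range processing circuit can recover range from the phase lag between the transmitted and detected intensity waveforms. A sketch of the conversion, valid only within the ambiguity interval of half a modulation wavelength:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Range implied by the phase lag of the sine-modulated return.

    The round trip imposes phase = 2*pi*f*(2R/C), so R = C*phase/(4*pi*f).
    The result is unambiguous only for R < C / (2 * mod_freq_hz).
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```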
49. A system according to Claim 46, wherein said beam forming optical means is effective to generate a two dimensional matrix of said transmitter light beams with each beam cross-section arranged juxtaposed but nonoverlapping with each adjacent beam cross-section and, wherein the combination of all beams is effective to cover the entire optically effective field of view of said active system.
50. A system according to Claim 46, wherein said single element receiver means comprises a combination of objective lens, light detector, signal amplifier and digital latch circuit, wherein the binary state of said latch is effective to represent the presence or absence of a target return and is readable and clearable by said second microcomputer, and the output of said amplifier is coupled to said second range processing circuit.
51. A system according to Claim 50, wherein said objective lens is arranged to collect incident light over the entire optically effective field of view of said active system.
52. A system according to Claim 38, wherein said third optoelectronic sensor means is effective to: operably scan the target body over ranges from approximately five meters to zero; illuminate the target body; image said returned illumination; sense the angular direction, range and attitude of the body and to transfer the acquired data to said data processing means.
53. A system according to Claim 52, wherein said third sensor means comprises a two dimensional array of continuously emitting laser diodes for illuminating the target body; an arrangement of narrow beam lasers for designating spots on said body; a two dimensional imaging device being sensitive solely to a narrow band of light radiation of the wavelength of said lasers and effective to generate a video image of the illuminated body; video processing means effective to generate data identifying contrast features in the image; and a third microcomputer effective to process information from said video processing means to determine the direction, range and attitude of the target body and to transfer target information to said data processing means.
54. A system according to Claim 53, wherein said two dimensional array of laser diodes is operated continuously with all diode elements radiating simultaneously, and wherein the combined beam pattern is effective to cover the entire optically effective field of view of said active system.
55. A system according to Claim 53, wherein said imaging device comprises a charge coupled television camera having a two dimensional array of light sensitive pixel elements combined with a system of optics with a central optical axis and a field of view essentially identical to the optically effective field of view of said active system.
56. A system according to Claim 55, wherein said system of optics comprises a stationary pinhole effective to maintain the focus and exposure of said target image and wherein said camera is effective to generate time sequential video frames of said image and to output a digital representation of each image pixel of each said frame to said video processing means.
57. A system according to Claim 55, wherein three or more narrow beam diode lasers, of the same wavelength as said lasers of said two dimensional array, are arranged offset from and symmetrically about said imaging device with each beam directed through a common point on the central optical axis of said imaging device effective to designate spots on the target body.
58. A system according to Claim 57, wherein the relative position of each spot in the video image of said target body has a known relation to the range from said imaging device to each designated spot.
59. A system according to Claim 53, wherein said video processing means comprises a preprogrammed system effective to identify contrast features of said target body image and to output data relating to the shape and orientation of said features, the image location of a selected contrast reference point and the relative image location of each said spot to said third microcomputer.
60. A system according to Claim 59, wherein said third microcomputer is effective to process target information from said video processing means to determine target body direction, range and attitude information and to transfer all such information to said data processing system.
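One way the third microcomputer of Claim 60 could derive attitude is to convert the designated spots' image positions and ranges into three-dimensional points in the sensor frame and take the normal of their best-fit plane. A sketch under that assumption, with the spot coordinates taken as already recovered:

```python
import numpy as np

def surface_normal(spot_points: np.ndarray) -> np.ndarray:
    """Unit normal of the target surface from three or more designated
    spot positions, given as an array with one (x, y, z) row per spot
    in the sensor frame; the best-fit plane normal encodes attitude."""
    centred = spot_points - spot_points.mean(axis=0)
    # The smallest singular vector of the centred points spans the normal.
    _, _, vt = np.linalg.svd(centred)
    n = vt[-1]
    return n / np.linalg.norm(n)
```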
61. A system according to Claim 38, wherein said data processing means is effective to test the operational state of each of said first, second and third sensor means, to activate said first sensor means to acquire the target body, and to sequentially activate said second and third sensor means, depending on range, to actively engage or dock with said target body.
62. A system according to Claim 61, wherein said data processing means processes target information from said first, second and third microcomputers to generate guidance commands effective to maneuver the host spacecraft to engage or dock with the target body; to maintain the central axis of the optically effective field of view of the active system directed at said target body and to maintain the appropriate attitude relative to the attitude of said target body for the duration of said maneuver.
63. An optoelectronic system for acquiring and tracking one or more targets comprising: an arrangement of one or more optical reflectors securely attached to each of the targets and effective to return incident light radiation in the direction of a source of said radiation; multi-element, solid state scanning means effective to sequentially transmit independently triggered and directed beams of light and to detect and process light waveforms returned from said reflectors to establish target orientation data; and a data processing means effective to process said target orientation data to identify the direction and range of each of said reflectors from the location of said scanning means.
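A sketch of one acquisition raster consistent with Claim 63's sequentially transmitted, independently triggered beams and a latched single-element receiver; fire_beam, latch_read and latch_clear are hypothetical stand-ins for hardware interfaces the claims leave unspecified:

```python
def scan_for_reflectors(fire_beam, latch_read, latch_clear, n: int = 16):
    """One acquisition raster: fire each element of an n x n transmitter
    matrix in turn and record which beam directions produced a return."""
    hits = []
    for i in range(n):
        for j in range(n):
            latch_clear()            # arm the single-element receiver
            fire_beam(i, j)          # pulse one narrow, fixed-direction beam
            if latch_read():         # reflector return detected in this beam
                hits.append((i, j))  # beam indices encode target direction
    return hits
```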
64. An optoelectronic system for acquiring and tracking one or more reflective targets comprising: multi-element, solid state scanning means effective to sequentially transmit independently triggered and directed illumination beams, to detect and process illumination waveforms reflected from said target and to generate target orientation data; and data processing means effective to process said target orientation data to identify the direction, range and attitude of said target from the location of said scanning means.
65. A system according to either Claim 63 or Claim 64, wherein said optoelectronic system means is comprised of essentially nonmoveable parts.
66. An optoelectronic system for acquiring and tracking a target and for sensing the attitude and range of said target comprising: an image plate with an optically identifiable centroid, secured on the target, comprising a perimetric shape of retroreflective material and a selected crosshair pattern of nonretroreflective material effective to identify the location of the plate centroid; continuous laser light source means effective to illuminate said target; an imaging device comprising a two dimensional array of light sensitive image pixels effective to provide a digital representation of the light intensity at each said pixel and being solely responsive to the wavelength of said laser source, and an arrangement of optical lenses having a field of view with a central axis effective to image light reflected from said image plate and to maintain focus and exposure of the target image; three or more narrow beam lasers located offset from and arranged symmetrically about said imaging device with each said beam directed through a common point on said central axis of said imaging device effective to designate spots on said image plate, wherein the location of each spot in the image has a known relation to the range from said imaging device to each said spot; video processing means effective to process said digital data from said imaging device for identifying the shape and the orientation of said plate image and the relative image location of said plate centroid and the relative image locations of said spots; and data processing means effective to process data from said video processing means for identifying the direction, range and attitude of said image plate.
67. An optoelectronic system for acquiring and tracking a target and for sensing the attitude of said target comprising: continuous laser light source means effective to illuminate said target; an imaging device comprising a two dimensional array of light sensitive image pixels effective to provide a digital representation of the light intensity at each pixel and being solely responsive to the wavelength of said laser source, and an arrangement of optical lenses having a field of view with a central axis effective to image light reflected from said target and to maintain focus and exposure of the target image; three or more narrow beam lasers located offset from and arranged symmetrically about said imaging device, with each said beam directed through a common point on said central axis of said imaging device effective to designate spots on said target, wherein the location of each spot in the image has a known relation to the range from said imaging device to each said spot; video processing means effective to process said digital data from said imaging device for identifying image contrast features and for determining the shape and the orientation of said features and the relative location of a selected contrast reference point in said image and the relative location of each said spot in the image; and data processing means effective to process said outputs of said video processing means to indicate the direction, range and attitude of said target.
68. A system according to either Claim 66 or Claim 67, wherein said optoelectronic system means is comprised of essentially nonmoveable parts.
PCT/US1986/002249 1985-10-31 1986-10-28 Dead reckoning optoelectronic intelligent docking system WO1987002797A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US06/793,292 US4834531A (en) 1985-10-31 1985-10-31 Dead reckoning optoelectronic intelligent docking system
US793,292 1985-10-31

Publications (1)

Publication Number Publication Date
WO1987002797A1 true WO1987002797A1 (en) 1987-05-07

Family

ID=25159585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1986/002249 WO1987002797A1 (en) 1985-10-31 1986-10-28 Dead reckoning optoelectronic intelligent docking system

Country Status (3)

Country Link
US (1) US4834531A (en)
EP (1) EP0256023A1 (en)
WO (1) WO1987002797A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0388618A2 * 1989-03-23 1990-09-26 EDELHOFF POLYTECHNIK GMBH & CO. System for determining the position of an object in space using a video-optical sensor
EP0390051A2 (en) * 1989-03-31 1990-10-03 Honeywell Inc. Method and apparatus for computing the self-motion of moving imaging devices
EP0467671A2 (en) * 1990-07-18 1992-01-22 National Space Development Agency Of Japan Retry/recovery method in rendezvous manoeuvre
FR2672690A1 (en) * 1991-02-12 1992-08-14 Matra Espace OPTICAL DEVICE FOR DETERMINING THE RELATIVE POSITION OF TWO VEHICLES AND ALIGNMENT SYSTEM COMPRISING THE SAME.
GB2258112A (en) * 1991-07-25 1993-01-27 Marconi Gec Ltd Rendezvous apparatus
FR2688613A1 (en) * 1992-03-16 1993-09-17 Aerospatiale METHOD AND DEVICE FOR DETERMINING THE RELATIVE POSITION AND TRACK OF TWO SPACE VEHICLES.
EP0573990A2 (en) * 1992-06-09 1993-12-15 Olympus Optical Co., Ltd. Optical device for detecting inclination and variations in inclination for a photographic camera
FR2704654A1 (en) * 1993-04-19 1994-11-04 Nec Corp Apparatus and method for measuring relative azimuth
EP0624806A1 (en) * 1993-04-09 1994-11-17 Trw Inc. Spacecraft docking sensor system
EP0803436A1 (en) * 1996-04-22 1997-10-29 Mitsubishi Denki Kabushiki Kaisha Rendezvous spacecraft collision avoidance device
FR2902894A1 (en) * 2006-06-27 2007-12-28 Alcatel Sa Lateral and longitudinal satellite metrology system for e.g. satellite formation flight system, has reference satellite including semi-reflecting device interposed between source and lens for deviating light reflected by secondary satellite
DE102011001387A1 (en) * 2011-03-18 2012-09-20 First Sensor AG Sampling method for sampling field by optical sampling system or scanning system, involves transmitting test light signals by transmitters of transmitter arrangement of optical sampling system
RU2522840C1 (en) * 2012-12-20 2014-07-20 Федеральное государственное автономное научное учреждение "Центральный научно-исследовательский и опытно-конструкторский институт робототехники и технической кибернетики" (ЦНИИ РТК) Electronic space scanning method

Families Citing this family (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5200792A (en) * 1989-08-31 1993-04-06 Nec Corporation Device for obtaining distance information from an object by instantaneously illuminating the object by a light beam
FR2651878B1 (en) * 1989-09-14 1991-12-13 Aerospatiale METHOD AND SYSTEM FOR REMOTE CONTROL OF THE ASSEMBLY OF A FIRST OBJECT WITH A SECOND OBJECT.
SE8904235D0 * 1989-12-15 1989-12-15 Saab Space Ab Device for facilitating the approach and docking of two objects, in particular spacecraft
FR2656700B1 (en) * 1989-12-28 1992-08-07 Aerospatiale METHOD FOR RESTORING THE MOVEMENT OF A MOBILE BY OBSERVING A SYMBOL FORMED THEREON AND DEVICES FOR IMPLEMENTING THE METHOD.
US5109345A (en) * 1990-02-20 1992-04-28 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Closed-loop autonomous docking system
US5112172A (en) * 1990-09-12 1992-05-12 Knorr Brake Holding Corporation Sliding pull up stanchion and method
US5215423A (en) * 1990-09-21 1993-06-01 Edelhoff Polytechnik Gmbh & Co. System for determining the spatial position of an object by means of a video optical sensor
US5390118A (en) 1990-10-03 1995-02-14 Aisin Seiki Kabushiki Kaisha Automatic lateral guidance control system
DE69130147T2 (en) 1990-10-03 1999-04-01 Aisin Seiki Automatic control system for lateral guidance
US5150026A (en) * 1990-11-19 1992-09-22 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Obstacle avoidance for redundant robots using configuration control
US5326052A (en) * 1991-10-02 1994-07-05 Enig Associates, Inc. Controllable hose-and-drogue in-flight refueling system
JP2669223B2 (en) * 1991-10-14 1997-10-27 三菱電機株式会社 Optical sensor device for rendezvous docking
KR0152096B1 (en) * 1992-10-26 1998-10-15 윤종용 Obstacle detecting device of movable watching robot
US5493392A (en) * 1992-12-15 1996-02-20 Mcdonnell Douglas Corporation Digital image system for determining relative position and motion of in-flight vehicles
US5394233A (en) * 1993-04-06 1995-02-28 Wang; Charles P. Apparatus for measuring high frequency vibration, motion, or displacement
US5465142A (en) * 1993-04-30 1995-11-07 Northrop Grumman Corporation Obstacle avoidance system for helicopters and other aircraft
US7370834B2 (en) * 1993-11-12 2008-05-13 The Baron Company, Ltd. Apparatus and methods for in-space satellite operations
US6843446B2 (en) * 1993-11-12 2005-01-18 David D. Scott Apparatus and methods for in-space satellite operations
US5803407A (en) * 1993-11-12 1998-09-08 Scott; David R. Apparatus and methods for in-space satellite operations
US6017000A (en) * 1998-08-02 2000-01-25 Scott; David R. Apparatus and methods for in-space satellite operations
US5490075A (en) * 1994-08-01 1996-02-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Global positioning system synchronized active light autonomous docking system
SE503396C2 (en) 1994-09-14 1996-06-03 Fmt Int Trade Method and apparatus for connecting a passenger bridge to a door on a vehicle
US5710621A (en) * 1995-04-25 1998-01-20 Omron Corporation Heterodyne measurement device and method
US7527288B2 (en) * 1995-06-07 2009-05-05 Automotive Technologies International, Inc. Vehicle with crash sensor coupled to data bus
EP0852732A1 (en) * 1995-09-21 1998-07-15 Omniplanar, Inc. Method and apparatus for determining position and orientation
US6009188A (en) * 1996-02-16 1999-12-28 Microsoft Corporation Method and system for digital plenoptic imaging
US5832139A (en) * 1996-07-31 1998-11-03 Omniplanar, Inc. Method and apparatus for determining degrees of freedom of a camera
US5982481A (en) * 1996-10-01 1999-11-09 Mcdonnell Douglas Corporation Alignment system and method for dish concentrators
US5883706A (en) * 1996-12-05 1999-03-16 Northrop Grumman Corporation Multiplexer for laser ranging devices and the like
US6064750A (en) * 1997-01-10 2000-05-16 Hunter Engineering Company Apparatus and method for determining vehicle wheel alignment measurements from three dimensional wheel positions and orientations
DE19722829A1 (en) * 1997-05-30 1998-12-10 Daimler Benz Ag Vehicle with a scanning system
US6176451B1 (en) 1998-09-21 2001-01-23 Lockheed Martin Corporation Utilizing high altitude long endurance unmanned airborne vehicle technology for airborne space lift range support
US6254035B1 (en) * 1998-12-10 2001-07-03 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Synchronized docking system
CH709211B1 (en) * 1999-10-06 2015-08-14 Leica Geosystems Ag A method for determining the spatial position of a target tracking mirror and mirror arrangement for implementing the method.
SE515352C2 (en) 1999-11-09 2001-07-16 Fmt Int Trade Ab Coupling device for a passenger bridge
DE10011890C2 (en) * 2000-03-03 2003-04-24 Jena Optronik Gmbh Method for determining the state variables of a moving rigid body in space
US6604711B1 (en) * 2000-11-20 2003-08-12 Sargent Fletcher, Inc. Autonomous system for the aerial refueling or decontamination of unmanned airborne vehicles
US6594007B2 (en) * 2001-02-01 2003-07-15 Snap-On Technologies, Inc. Method and apparatus for mapping system calibration
US6510401B2 (en) 2001-05-11 2003-01-21 The United States Of America As Represented By The Director Of The National Security Agency Method of authenticating beacon
US6597443B2 (en) * 2001-06-27 2003-07-22 Duane Boman Spatial tracking system
CA2373669A1 * 2002-02-27 2003-08-27 Indal Technologies Inc. Imaging system for a passenger bridge or the like for docking automatically with an aircraft
US6658329B1 (en) * 2002-05-02 2003-12-02 The United States Of America As Represented By The United States National Aeronautics And Space Administration Video guidance sensor system with laser rangefinder
US7093314B2 (en) * 2002-05-07 2006-08-22 Dew Engineering And Development Limited Beacon docking system with visual guidance display
KR100556612B1 (en) * 2002-06-29 2006-03-06 삼성전자주식회사 Apparatus and method of localization using laser
US8175799B1 (en) * 2002-10-15 2012-05-08 Douglas Edward Woehler Location system
US7171028B2 (en) 2002-11-22 2007-01-30 The Boeing Company Method and apparatus for covertly determining the rate of relative motion between two objects
US6910660B2 (en) * 2003-01-31 2005-06-28 The Boeing Company Laser guidance system
US20050057670A1 (en) * 2003-04-14 2005-03-17 Tull Damon L. Method and device for extracting and utilizing additional scene and image formation data for digital image and video processing
US7530315B2 (en) 2003-05-08 2009-05-12 Lone Star Ip Holdings, Lp Weapon and weapon system employing the same
US20050240341A1 (en) * 2003-11-03 2005-10-27 Fielhauer Karl B Low-power photonic telemetry system and method for spacecraft monitoring
US7603804B2 (en) * 2003-11-04 2009-10-20 Leupold & Stevens, Inc. Ballistic reticle for projectile weapon aiming systems and method of aiming
US7433021B2 (en) * 2004-08-10 2008-10-07 Joseph Saltsman Stereoscopic targeting, tracking and navigation device, system and method
US7239377B2 (en) 2004-10-13 2007-07-03 Bushnell Performance Optics Method, device, and computer program for determining a range to a target
KR100766434B1 * 2005-07-22 2007-10-15 엘지전자 주식회사 Robot having function of recognizing image and leading method therefor
US20070040065A1 (en) * 2005-08-19 2007-02-22 Von Thal German Flexible refueling boom extendable tube
US7690304B2 (en) 2005-09-30 2010-04-06 Lone Star Ip Holdings, Lp Small smart weapon and weapon system employing the same
US7895946B2 (en) 2005-09-30 2011-03-01 Lone Star Ip Holdings, Lp Small smart weapon and weapon system employing the same
CN101512282B (en) 2005-11-01 2014-04-16 路波史蒂芬公司 Ballistic ranging methods and portable systems for inclined shooting
US20070129879A1 (en) * 2005-12-07 2007-06-07 Honeywell International Inc. Precision approach guidance using global navigation satellite system (GNSS) and ultra-wideband (UWB) technology
US7658031B2 (en) * 2005-12-21 2010-02-09 Bushnell, Inc. Handheld rangefinder operable to determine hold over ballistic information
US20070214584A1 (en) * 2006-03-14 2007-09-20 Dew Engineering And Development Limited System and method for aligning passenger boarding bridges
US20080017426A1 (en) * 2006-03-23 2008-01-24 Walters Raul J Modular vehicle system and method
USRE46672E1 (en) * 2006-07-13 2018-01-16 Velodyne Lidar, Inc. High definition LiDAR system
CN101688774A (en) 2006-07-13 2010-03-31 威力登音响公司 High definition lidar system
US8541724B2 (en) * 2006-09-29 2013-09-24 Lone Star Ip Holdings, Lp Small smart weapon and weapon system employing the same
US8117955B2 (en) 2006-10-26 2012-02-21 Lone Star Ip Holdings, Lp Weapon interface system and delivery platform employing the same
US8224189B1 (en) 2007-02-02 2012-07-17 Sunlight Photonics Inc. Retro-directive target for free-space optical communication and method of producing the same
US8132759B2 (en) * 2007-03-21 2012-03-13 The Boeing Company System and method for facilitating aerial refueling
US7959110B2 (en) * 2007-04-11 2011-06-14 The Boeing Company Methods and apparatus for resisting torsional loads in aerial refueling booms
US20100133388A1 (en) * 2007-07-11 2010-06-03 Oleg Fedorovich Demchenko Aircraft with multi-purpose integrated electronic complex
US8081298B1 (en) 2008-07-24 2011-12-20 Bushnell, Inc. Handheld rangefinder operable to determine hold-over ballistic information
US8675181B2 (en) * 2009-06-02 2014-03-18 Velodyne Acoustics, Inc. Color LiDAR scanner
WO2011093757A1 (en) * 2010-01-29 2011-08-04 Saab Ab A system and method for tracking and guiding at least one object
ITTO20110323A1 * 2011-04-08 2012-10-09 Thales Alenia Space Italia S.p.A. Con Unico Socio OPTICAL METROLOGICAL SYSTEM, LARGE AND PRECISION PROJECTIVE
ITTO20110325A1 2011-04-08 2012-10-09 Thales Alenia Space Italia S.p.A. Con Unico Socio METROLOGICAL OPTICAL PROJECTIVE SYSTEM FOR THE DETERMINATION OF TRIM AND POSITION
US9068803B2 (en) 2011-04-19 2015-06-30 Lone Star Ip Holdings, Lp Weapon and weapon system employing the same
US20130027548A1 (en) * 2011-07-28 2013-01-31 Apple Inc. Depth perception device and system
DE102012101460A1 (en) * 2012-02-23 2013-08-29 Conti Temic Microelectronic Gmbh Detecting device i.e. lidar sensor, for detecting environment object for vehicle, has transmission areas non-overlapped with each other and overlapped with receiving area, and transmitter transmitting radiation to transmission areas
US9746560B2 (en) 2013-02-12 2017-08-29 Faro Technologies, Inc. Combination scanner and tracker device having a focusing mechanism
RU2750349C2 (en) 2014-08-26 2021-06-28 Астроскейл Израэл Лтд. Satellite docking system and method
US10627490B2 (en) 2016-01-31 2020-04-21 Velodyne Lidar, Inc. Multiple pulse, LIDAR based 3-D imaging
WO2017164989A1 (en) 2016-03-19 2017-09-28 Velodyne Lidar, Inc. Integrated illumination and detection for lidar based 3-d imaging
WO2017210418A1 (en) 2016-06-01 2017-12-07 Velodyne Lidar, Inc. Multiple pixel scanning lidar
TW201743074A (en) * 2016-06-01 2017-12-16 原相科技股份有限公司 Measurement device and operation method thereof
US20180135979A1 (en) * 2016-11-15 2018-05-17 Incontro Sports Inc. Switching cap and mode-switchable rangefinder comprising the same
US10107910B2 (en) * 2016-12-12 2018-10-23 Goodrich Corporation Object detection system
US20190367192A1 (en) * 2017-02-15 2019-12-05 Astroscale Japan Inc. Capturing system, aerospace vehicle, and plate-like body
EP3363744B1 (en) * 2017-02-15 2022-07-27 Astroscale Japan Inc. Capturing system, aerospace vehicle, and plate-like body
US10625882B2 (en) 2017-03-06 2020-04-21 Effective Space Solutions Ltd. Service satellite for providing in-orbit services using variable thruster control
EP3593166A4 (en) 2017-03-31 2020-12-09 Velodyne Lidar, Inc. Integrated lidar illumination power control
JP2020519881A (en) 2017-05-08 2020-07-02 ベロダイン ライダー, インク. LIDAR data collection and control
US10649072B2 (en) * 2017-05-10 2020-05-12 Massachusetts Institute Of Technology LiDAR device based on scanning mirrors array and multi-frequency laser modulation
JP2018205288A (en) * 2017-05-31 2018-12-27 ソニーセミコンダクタソリューションズ株式会社 Distance measurement device, distance measurement method, and program
CN109383799A (en) * 2017-08-07 2019-02-26 菜鸟智能物流控股有限公司 Aircraft charging method and related device
US10145993B1 (en) * 2017-10-06 2018-12-04 Lighthouse & Beacon, Inc. Retroreflectors providing information encoded in reflected non-visible laser while retaining visible light safety properties
US11294041B2 (en) 2017-12-08 2022-04-05 Velodyne Lidar Usa, Inc. Systems and methods for improving detection of a return signal in a light ranging and detection system
KR102203439B1 (en) * 2018-01-17 2021-01-14 엘지전자 주식회사 a Moving robot and Controlling method for the moving robot
RU2676999C1 (en) * 2018-02-21 2019-01-14 Михаил Викторович Яковлев Method for determining directions on space object
US10712434B2 (en) 2018-09-18 2020-07-14 Velodyne Lidar, Inc. Multi-channel LIDAR illumination driver
CN109240307B (en) * 2018-10-12 2021-07-27 苏州优智达机器人有限公司 Accurate positioning system of robot
US11082010B2 (en) 2018-11-06 2021-08-03 Velodyne Lidar Usa, Inc. Systems and methods for TIA base current detection and compensation
US11885958B2 (en) 2019-01-07 2024-01-30 Velodyne Lidar Usa, Inc. Systems and methods for a dual axis resonant scanning mirror
US10613203B1 (en) 2019-07-01 2020-04-07 Velodyne Lidar, Inc. Interference mitigation for light detection and ranging

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2186658A1 (en) * 1972-03-13 1974-01-11 Thomson Csf
US3917196A (en) * 1974-02-11 1975-11-04 Boeing Co Apparatus suitable for use in orienting aircraft flight for refueling or other purposes
FR2433760A1 (en) * 1978-08-17 1980-03-14 Thomson Csf Detector for position of pilot's helmet - uses opto-electronic system giving line of sight for arming system
EP0122890A1 (en) * 1983-04-19 1984-10-24 AERITALIA - Società Aerospaziale Italiana - p.A. Docking system for space modules

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3224709A (en) * 1962-09-13 1965-12-21 Martin Marietta Corp Method and apparatus for docking vehicles
US3285533A (en) * 1963-06-10 1966-11-15 Industrial Nucleonics Corp Rendezvous and docking of space ships
US3666367A (en) * 1970-06-01 1972-05-30 Hughes Aircraft Co Digital range measuring system
US3796492A (en) * 1971-06-01 1974-03-12 Autech Corp Laser dimension comparator
US3781111A (en) * 1972-03-16 1973-12-25 Nasa Short range laser obstacle detector
US3897150A (en) * 1972-04-03 1975-07-29 Hughes Aircraft Co Scanned laser imaging and ranging system
US4026654A (en) * 1972-10-09 1977-05-31 Engins Matra System for detecting the presence of a possibly moving object
US4003659A (en) * 1974-11-15 1977-01-18 The United States Of America As Represented By The Secretary Of The Army Single plane corner reflector guidance system
US4167329A (en) * 1977-12-12 1979-09-11 Raytheon Company Focussed doppler radar
CA1103803A (en) * 1978-03-01 1981-06-23 National Research Council Of Canada Method and apparatus of determining the center of area or centroid of a geometrical area of unspecified shape lying in a larger x-y scan field
DE2824311C2 (en) * 1978-06-02 1983-03-03 Erwin Sick Gmbh Optik-Elektronik, 7808 Waldkirch Adjustment arrangement for aligning a group of cyclically switched light transmitters or light receivers
JPS5927842B2 (en) * 1978-08-24 1984-07-09 工業技術院長 Target mark displacement measurement method
US4295740A (en) * 1978-09-05 1981-10-20 Westinghouse Electric Corp. Photoelectric docking device
US4260187A (en) * 1979-03-23 1981-04-07 Nasa Terminal guidance sensor system
US4373804A (en) * 1979-04-30 1983-02-15 Diffracto Ltd. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
JPS58201015A (en) * 1982-05-20 1983-11-22 Canon Inc Distance measuring device
CH641308B (en) * 1982-07-13 Wild Heerbrugg Ag DEVICE FOR MEASURING THE RUN TIME OF PULSE SIGNALS.

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2186658A1 (en) * 1972-03-13 1974-01-11 Thomson Csf
US3917196A (en) * 1974-02-11 1975-11-04 Boeing Co Apparatus suitable for use in orienting aircraft flight for refueling or other purposes
FR2433760A1 (en) * 1978-08-17 1980-03-14 Thomson Csf Detector for position of pilot's helmet - uses opto-electronic system giving line of sight for arming system
EP0122890A1 (en) * 1983-04-19 1984-10-24 AERITALIA - Società Aerospaziale Italiana - p.A. Docking system for space modules

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0388618A3 * 1989-03-23 1991-12-11 EDELHOFF POLYTECHNIK GMBH & CO. System for determining the position of an object in space using a video-optical sensor
EP0388618A2 * 1989-03-23 1990-09-26 EDELHOFF POLYTECHNIK GMBH & CO. System for determining the position of an object in space using a video-optical sensor
DE3909762C2 (en) * 1989-03-23 2003-02-06 Msts Logistik Gmbh & Co Garbage collection system with a garbage collection vehicle and garbage containers
EP0390051A2 (en) * 1989-03-31 1990-10-03 Honeywell Inc. Method and apparatus for computing the self-motion of moving imaging devices
EP0390051A3 (en) * 1989-03-31 1991-07-31 Honeywell Inc. Method and apparatus for computing the self-motion of moving imaging devices
EP0467671A3 (en) * 1990-07-18 1993-02-10 National Space Development Agency Of Japan Retry/recovery method in rendezvous manoeuvre
EP0467671A2 (en) * 1990-07-18 1992-01-22 National Space Development Agency Of Japan Retry/recovery method in rendezvous manoeuvre
WO1992014168A1 (en) * 1991-02-12 1992-08-20 Matra Marconi Space France Optical device for determining the relative position of two vehicles and application to an alignment system
FR2672690A1 (en) * 1991-02-12 1992-08-14 Matra Espace OPTICAL DEVICE FOR DETERMINING THE RELATIVE POSITION OF TWO VEHICLES AND ALIGNMENT SYSTEM COMPRISING THE SAME.
US5302816A (en) * 1991-02-12 1994-04-12 Matra Marconi Space France Optical device for determining the relative position of two vehicles, and an alignment system comprising an application thereof
GB2258112A (en) * 1991-07-25 1993-01-27 Marconi Gec Ltd Rendezvous apparatus
GB2258112B (en) * 1991-07-25 1995-06-14 Marconi Gec Ltd Rendezvous apparatus
FR2688613A1 (en) * 1992-03-16 1993-09-17 Aerospatiale METHOD AND DEVICE FOR DETERMINING THE RELATIVE POSITION AND TRACK OF TWO SPACE VEHICLES.
EP0561660A1 (en) * 1992-03-16 1993-09-22 AEROSPATIALE Société Nationale Industrielle Method and apparatus for determining the relative position and trajectory of two spacecraft
EP0573990A3 (en) * 1992-06-09 1994-03-09 Olympus Optical Co
US5369462A (en) * 1992-06-09 1994-11-29 Olympus Optical Co., Ltd. Inclination detecting apparatus and camera for detecting hand shake using the same
EP0573990A2 (en) * 1992-06-09 1993-12-15 Olympus Optical Co., Ltd. Optical device for detecting inclination and variations in inclination for a photographic camera
EP0624806A1 (en) * 1993-04-09 1994-11-17 Trw Inc. Spacecraft docking sensor system
FR2704654A1 (en) * 1993-04-19 1994-11-04 Nec Corp Apparatus and method for measuring relative azimuth
EP0803436A1 (en) * 1996-04-22 1997-10-29 Mitsubishi Denki Kabushiki Kaisha Rendezvous spacecraft collision avoidance device
US5868358A (en) * 1996-04-22 1999-02-09 Mitsubishi Denki Kabushiki Kaisha Rendezvous spacecraft collision avoidance device
FR2902894A1 (en) * 2006-06-27 2007-12-28 Alcatel Sa Lateral and longitudinal satellite metrology system for e.g. satellite formation flight system, has reference satellite including semi-reflecting device interposed between source and lens for deviating light reflected by secondary satellite
EP1873556A1 (en) * 2006-06-27 2008-01-02 Thales Lateral and longitudinal metrology system
US7561262B2 (en) 2006-06-27 2009-07-14 Thales Lateral and longitudinal metrology system
DE102011001387A1 (en) * 2011-03-18 2012-09-20 First Sensor AG Sampling method for sampling field by optical sampling system or scanning system, involves transmitting test light signals by transmitters of transmitter arrangement of optical sampling system
RU2522840C1 (en) * 2012-12-20 2014-07-20 Федеральное государственное автономное научное учреждение "Центральный научно-исследовательский и опытно-конструкторский институт робототехники и технической кибернетики" (ЦНИИ РТК) Electronic space scanning method

Also Published As

Publication number Publication date
US4834531A (en) 1989-05-30
EP0256023A1 (en) 1988-02-24

Similar Documents

Publication Publication Date Title
US4834531A (en) Dead reckoning optoelectronic intelligent docking system
US3781111A (en) Short range laser obstacle detector
AU618402B2 (en) Surveillance sensor
EP3187895B1 (en) Variable resolution light radar system
US4383663A (en) Active optical terminal homing
US5966227A (en) Active cooperative tuned identification friend or foe (ACTIFF)
US5936229A (en) Tracking means for distant ballistic missile targets comprising means for tracking largest radius of curvature
US6504610B1 (en) Method and system for positioning an autonomous mobile unit for docking
US7336345B2 (en) LADAR system with SAL follower
US5142400A (en) Method and apparatus for automatic acquisition and alignment of an optical beam communication link
US5870180A (en) Time measurement device and method useful in a laser range camera
US5274379A (en) Optical identification friend-or-foe
US6262800B1 (en) Dual mode semi-active laser/laser radar seeker
US10649087B2 (en) Object detection system for mobile platforms
EP0899586A2 (en) Target-tracking laser designator
EP0603003A1 (en) An integrated LADAR/FLIR sensor
US4926050A (en) Scanning laser based system and method for measurement of distance to a target
US5652588A (en) Surveillance system including a radar device and electro-optical sensor stations
US5780838A (en) Laser crossbody tracking system and method
US20070222968A1 (en) Laser-based system with LADAR and SAL capabilities
US7548305B1 (en) Shallow angle shape sensor
US7196301B2 (en) System for detecting incoming light from a remote laser source
CN110487120A (en) A kind of the laser system of defense and method of long distance illumination
RU2639321C1 (en) Optical-electronic object detecting system
US5664741A (en) Nutated beamrider guidance using laser designators

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE