WO2016168570A1 - Robotic system and method for operating a robot - Google Patents

Robotic system and method for operating a robot

Info

Publication number
WO2016168570A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
input device
user
control system
motions
Prior art date
Application number
PCT/US2016/027707
Other languages
French (fr)
Inventor
Biao ZHANG
Gregory F. Rossano
Jianjun WANG
Thomas A. Fuhlbrigge
Original Assignee
Abb Technology Ag
Priority date
Filing date
Publication date
Application filed by Abb Technology Ag filed Critical Abb Technology Ag
Publication of WO2016168570A1 publication Critical patent/WO2016168570A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1689 Teleoperation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/33 Director till display
    • G05B 2219/33007 Automatically control, manually limited, operator can override control
    • G05B 2219/39 Robotics, robotics to robotics hand
    • G05B 2219/39349 RCC remote center compliance device inserted between wrist and gripper
    • G05B 2219/40 Robotics, robotics mapping to robotics vision
    • G05B 2219/40032 Peg and hole insertion, mating and joining, remote center compliance
    • G05B 2219/40153 Teleassistance, operator assists, controls autonomous robot
    • G05B 2219/40157 Planning, event based planning, operator changes plans during execution
    • G05B 2219/40184 Compliant teleoperation, operator controls motion, system controls contact, force
    • G05B 2219/40191 Autonomous manipulation, computer assists operator during manipulation
    • G05B 2219/40195 Tele-operation, computer assisted manual operation

Abstract

The present disclosure provides a robotic system. The robotic system includes a robot, an operator input device, and a control system communicatively coupled to the robot and to the input device. The control system stores executable program instructions, including a predefined automatic motion sequence of the robot. The input device is operative to direct the robot via the control system to perform user-guided motions of the robot in response to human manipulation of the input device. The control system is configured to execute program instructions to execute the predefined automatic motion sequence and to overlay the predefined automatic motion sequence with the user-guided motions.

Description

ROBOTIC SYSTEM AND METHOD FOR OPERATING A ROBOT
TECHNICAL FIELD
The present application generally relates to robotics and more particularly, not exclusively, to a robotic system and a method for operating a robot.
BACKGROUND
Robotic systems are used for many different applications. Some existing systems have various shortcomings relative to certain applications. For example, it may be difficult for a human tele-operator to control a robot to perform some operations in a desirable manner. Accordingly, there remains a need for further contributions in this area of technology.
SUMMARY
One embodiment of the present invention is a robotic system. The robotic system includes a robot, an operator input device, and a control system
communicatively coupled to the robot and to the input device. The control system stores executable program instructions, including a predefined automatic motion sequence of the robot. The input device is operative to direct the robot via the control system to perform user-guided motions of the robot in response to human manipulation of the input device. The control system is configured to execute program instructions to execute the predefined automatic motion sequence and to overlay the predefined automatic motion sequence with the user-guided motions. Further embodiments, forms, features, aspects, benefits, and advantages of the present application shall become apparent from the description and figures provided herewith.
BRIEF DESCRIPTION OF THE FIGURES
FIG. 1 schematically illustrates some aspects of a non-limiting example of a robotic system in accordance with an embodiment of the present invention.
FIG. 2 illustrates some aspects of a non-limiting example of a robot in a work cell in accordance with an embodiment of the present invention.
FIG. 3 illustrates some aspects of a non-limiting example of a peg-in-hole assembly employed in describing some aspects of a non-limiting example of an embodiment of the present invention.
FIG. 4 illustrates some aspects of a non-limiting example of a gear system assembly employed in describing some aspects of a non-limiting example of an embodiment of the present invention.
FIG. 5 schematically illustrates some aspects of a non-limiting example of an industrial robotic system in accordance with an embodiment of the present invention.
FIG. 6 illustrates some aspects of a non-limiting example of a tele-operated work cell in accordance with an embodiment of the present invention.
FIG. 7 illustrates some aspects of a non-limiting example of two strategies for assembling a subpart to a main part that may be employed in conjunction with some embodiments of the present invention.
FIG. 8 illustrates some aspects of a non-limiting example of a peg-in-hole type assembly with a reference coordinate system and described with regard to some embodiments of the present invention.
FIG. 9 illustrates some aspects of a non-limiting example of an assembly of a peg into a small hole that is positioned in a stepped hole that is described with regard to some embodiments of the present invention.
FIG. 10 illustrates some aspects of a non-limiting example of a spiral search assembly strategy program that may be employed in accordance with an embodiment of the present invention.
FIG. 11 illustrates some aspects of a non-limiting example of a control loop in accordance with an embodiment of the present invention.
FIG. 12 illustrates some aspects of a non-limiting example of an assembly of a multi-stage gear system into mesh with another multi-stage gear system that is described with regard to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the invention relates.
Referring to the drawings, and in particular FIGS. 1 and 2, some aspects of a non-limiting example of a robotic system 100 in accordance with an embodiment of the present invention are schematically illustrated. Robotic system 100 includes a robot 102, a control system 104 and a user or operator input device 106. In one form, robot 102 is a multi-axis industrial robot disposed in a work cell 108, and is tele-operated remotely by a user or operator using input device 106, e.g., to perform assembly work. For example, robot 102 may be tele-operated using input device 106 from any location outside of work cell 108 to assemble a first component, generically represented as component 110, to a second component, generically represented as component 112, e.g., disposed on a work table or fixture 114. For instance, robot 102 may be tele-operated to insert component 110 into an opening 116 in component 112. In other embodiments, robot 102, input device 106 and the user may be co-located. In various embodiments, robot 102 may take any form suited to the application for which it is employed, and may be any type of robot constructed to perform any type of manufacturing work or other operation.
In one form, robot 102 includes a pedestal 118; a shoulder 120 coupled to and rotatable about pedestal 118; an upper arm 122 coupled to shoulder 120; an elbow 124 coupled to upper arm 122; a rotatable arm 126 coupled to and extending from elbow 124 and culminating in a wrist 128; and a rotatable end effector 130 extending from elbow 124. In other embodiments, robot 102 may have a greater or lesser number of appendages and/or degrees of freedom. In one form, end effector 130 is configured to grip and manipulate component 110 for assembly with component 112. In other embodiments, end effector 130 may take other forms, and may be configured for performing other operations, e.g., any operations related to or unrelated to component 110. Robot 102 is constructed to translate and rotate component 110, in and about the X, Y and Z axes, e.g., as illustrated in FIG. 2, as well as other desired axes.
In some embodiments, robot 102 has associated therewith a haptic feedback sensor 132. In one form, sensor 132 is operative to detect interactions between component 110 and anything else in its environs, e.g., physical interactions between component 110 (e.g., while held by robot 102) and component 112, or between component 110 and anything within the reach of robot 102. In one form, haptic feedback sensor 132 is a force sensor. In other embodiments, haptic feedback sensor 132 may take other forms. Force sensor 132 is communicatively coupled to control system 104. In one form, sensor 132 is mounted on robot 102, e.g., on end effector 130. In other embodiments, sensor 132 may be mounted at any suitable location or on any suitable feature of robot 102 or component 110. Sensor 132 provides an output for creating haptic feedback to the user or operator of robot 102, e.g., via input device 106.
Control system 104 is communicatively coupled to robot 102 (including sensor 132) and to input device 106. In one form, control system 104 includes a dedicated robot controller 134 and a data processing unit or controller 136. Controller 134 and input device 106 are communicatively coupled to controller 136. In one form, robot controller 134 operates robot 102 based on data provided by controller 136, which receives control input from another system or device, e.g., input device 106. In some embodiments, robotic system 100 includes a display device 138 communicatively coupled to controller 136. In one form, display device 138 is also an input device, e.g., a touch screen display. Display device 138 displays, for example, robot 102 motion data, and may be used to input robot 102 motion commands and to adjust parameters associated with predefined automatic motion sequences, e.g., as described herein.
In some embodiments, robotic system 100 may include one or more sensors 140, e.g., for use in operating robot 102 locally or remotely, for enhancing safety, and/or for other purposes. Sensors 140 may take any suitable form, e.g., including vision sensors such as cameras, acoustic sensors, infrared sensors or one or more other types of proximity sensors, microphones, position sensors, translational and rotational speed sensors, force sensors and/or any other types of sensors. Sensors 140 are communicatively coupled to control system 104. In some embodiments, control system 104 may include a controller 142 communicatively coupled to one or more sensors 140 for processing the output of one or more sensors 140. Controller 142 may be
communicatively coupled to controller 134 and/or controller 136. In some embodiments, one or more of the sensors 140 may be communicatively coupled to controller 134 and/or controller 136 without being communicatively coupled to
intermediate controller 142. Some embodiments may employ one or more sensors 140 without also employing an intermediate controller such as controller 142.
In one form, controllers 134, 136 and 142 are microprocessor-based and the program instructions executed thereby are in the form of software stored in a memory (not shown). However, it is alternatively contemplated that any or all of the controllers and program instructions may be in the form of any combination of software, firmware and hardware, including state machines, and may reflect the output of discrete devices and/or integrated circuits, which may be co-located at a particular location or distributed across more than one location, including any digital and/or analog devices configured to achieve the same or similar results as a processor-based controller executing software and/or firmware and/or hardware based instructions.
Input device 106 is communicatively coupled to control system 104, and with robot 102 via control system 104. In one form, input device 106 is communicatively coupled to controller 136. In one form, input device 106 is a joystick. In other embodiments, input device 106 may take other forms, e.g., a stylus. Input device 106 is constructed to allow the user direct control over the motions or movements of robot 102, e.g., via controllers 136 and 134. Input device 106 is thus constructed to direct robot 102 via control system 104 to perform user-guided motions of robot 102 in response to human manipulation of input device 106. The directed operation of robot 102 by human manipulation of input device 106 is referred to herein as tele-operation of robot 102. In some embodiments, input device 106 is constructed to provide remote control over robot 102, which in various embodiments may include control of the position, direction and/or speed of rotational and translational motion of robot 102, e.g., ultimately end effector 130. In one form, input device 106 is a haptic input device configured to provide haptic feedback to the user or operator, for example, a haptic joystick or haptic stylus. Haptic feedback may be in the form of, for example, force, vibration and/or pulses delivered to the hand of the user from input device 106 and/or other forms of haptic feedback. The haptic feedback may allow the user to "feel" what robot 102 "feels." In other embodiments, input device 106 may not be a haptic device, that is, may not provide haptic feedback. In one form, as a haptic feedback device, input device 106 provides haptic feedback based on the output of haptic feedback sensor 132. In other embodiments, input device 106 may provide haptic feedback based on the output of one or more other sensors, e.g., including one or more sensors 140, in addition to or in place of sensor 132.
Some robotic procedures, for example and without limitation, assembly procedures, may be difficult to perform solely under manual tele-operated control, e.g., user-guided motion control of the robot in response to human manipulation of an input device, such as input device 106. Some such procedures may be more readily performed if user-guided robotic motions are combined with, e.g., superimposed or overlaid with, automatic motion sequences of the robot. In various embodiments of the present invention, control system 104 is configured to execute program instructions whereby a user or operator may select, initiate and control a predefined automatic motion sequence, e.g., an assembly motion, of robot 102 while the user is tele-operating robot 102 using input device 106. Thus, the motion of robot 102 may be based on a combination of the user's tele-operated input that is superimposed with or overlaid with the selected automatic motion sequence. Accordingly, control system 104 stores executable program instructions that include predefined automatic motion sequences of robot 102, and is configured to execute the program instructions to carry out the predefined automatic motion sequences. The automatic motion sequences may be defined at any convenient point in time, e.g., prior to performing an operation such as an assembly operation with robot 102. The predefined automatic motion sequences may be stored in any suitable controller of control system 104, e.g., controller 136 and/or controller 134, or in any memory, network or other device or system accessible by control system 104.
The predefined automatic motion sequences are motions of robot 102 that are directed by control system 104, and may be predefined or programmed into or for access by control system 104 in any suitable manner. Control system 104 is configured to execute program instructions to execute the predefined automatic motion sequence in combination with, e.g., superimposed with or overlaid with, the user-guided motions performed in response to human manipulation of input device 106. In one form, the predefined automatic motion sequences are performed independently of the user-guided motions. That is, the predefined automatic motion sequences are performed independent of the user's manipulation of input device 106 to tele-operate robot 102. In one form, the predefined automatic motion sequences are sequences of individual robotic assembly motions directed to assembling a first component, e.g., component 110, to a second component, e.g., component 112; wherein the user-guided robotic motions are also robotic motions directed toward assembling the first component to the second component. In other embodiments, the predefined automatic motion sequences may be directed to performing any other manufacturing work or other operations in addition to or in place of assembly operations. In some embodiments, the automatic motion sequences include at least a first motion and a second motion different from the first motion. For example and without limitation, an automatic tapping sequence may include a Z-axis translation in one direction followed by a Z-axis translation in the opposite direction. An automatic motion sequence may be as simple as a single translation from one location in space to another, an application of a force, a single rotation, or any combination of translations and/or rotations and/or applications of force and/or torque, including time varying force and/or torque, e.g., applied at end effector 130 or component 110.
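As a concrete illustration of the overlay described above, the following minimal Python sketch (not taken from the patent; OverlayController, tapping_offset, and all numeric values are hypothetical) integrates the operator's joystick increments into a user-guided reference position and adds a predefined tapping offset along Z to produce the commanded position:

```python
def tapping_offset(t, amplitude=0.002, period=0.5):
    """Tapping: a Z translation in one direction followed by the opposite
    direction, repeated each period (triangle wave, metres)."""
    phase = (t % period) / period
    return amplitude * (2.0 * phase if phase < 0.5 else 2.0 * (1.0 - phase))

class OverlayController:
    """Superimposes a predefined automatic offset onto user-guided motion.

    The user-guided reference position is integrated from joystick
    increments; the position commanded to the robot is that reference plus
    the automatic offset (applied to Z only in this sketch).
    """
    def __init__(self, auto_offset):
        self.auto_offset = auto_offset
        self.user_ref = [0.0, 0.0, 0.0]            # x, y, z in metres

    def step(self, joystick_delta, t):
        for i in range(3):
            self.user_ref[i] += joystick_delta[i]
        x, y, z = self.user_ref
        return (x, y, z + self.auto_offset(t))

ctrl = OverlayController(tapping_offset)
for k in range(5):
    t = k * 0.05                                    # assumed 50 ms control period
    pose = ctrl.step((0.0002, 0.0, -0.0005), t)     # operator-guided increment
    print(f"t={t:.2f}s commanded position: {tuple(round(v, 5) for v in pose)}")
```

Any other automatic sequence that can be expressed as a time-varying offset could be passed to the same controller in place of the tapping function.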
In some embodiments, the executable program instructions stored in and executable by control system 104 include a plurality of different predefined automatic motion sequences. In such embodiments, control system 104 is configured to execute program instructions to receive a user selection of a desired one or more of the different predefined automatic motion sequences, and to execute the selected one or more of the different predefined automatic motion sequences and overlay or superimpose the selected one or more of the different predefined automatic motion sequences with the user-guided motions, e.g., on the fly. The selection may be made, for example, via display device 138, e.g., in the form of a touch screen display, a keyboard, input device 106 and/or another suitable input device. Thus, for example, in such embodiments, the user may select and implement one or more different predefined automatic motion sequences while tele-operating robot 102. In some embodiments, one or more predefined automatic motion sequences may also or alternatively be selected prior to tele-operation. In such embodiments, the predefined automatic motion sequence may be initiated prior to tele-operation, wherein tele-operation may begin and be performed during the execution of but after the initiation of the predefined automatic motion sequence. In some embodiments, one or more predefined automatic motion
sequences may be initiated prior to initiating tele-operation, and one or more predefined automatic motion sequences may also be initiated after tele-operation has begun. In still other embodiments, one or more predefined automatic motion sequences may also or alternatively be initiated based on the output of a sensor, e.g., force sensor 132 and/or one or more sensors 140. In some embodiments, one or more predefined automatic motion sequences may also or alternatively be initiated based upon control system 104 detecting or determining that a predetermined stage in a robotic procedure, e.g., an assembly process, has been reached.
In some embodiments, one or more of the predefined automatic motion sequences is defined by parameters that are adjustable by the user. In various embodiments, the parameters may be adjusted by the user prior to initiating tele-operation and/or on the fly, that is, while tele-operating robot 102. In some embodiments, the parameters may be adjusted or changed while the predefined automatic motion sequence is executing. Parameters for automatic assembly motions may be adjusted, for example, via display device 138, e.g., in the form of a touch screen display, a keyboard, input device 106 and/or another suitable input device. Parameters may include any robotic control parameters, for example, frequency and amplitude of hopping or other motion sequences, translational and rotational speed, spatial parameters such as an X-Y axis spiral search start radius and its change in radius with progression along the spiral, force, torque, and the like.
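As an illustration only of how such user-adjustable parameters might be held and range-checked (the patent does not define data structures; every name and limit below is an assumption), a minimal Python sketch:

```python
from dataclasses import dataclass

@dataclass
class HoppingParams:
    """Illustrative parameters for an automatic hopping sequence."""
    amplitude_m: float = 0.002      # peak Z offset
    frequency_hz: float = 2.0       # hops per second

@dataclass
class SpiralSearchParams:
    """Illustrative parameters for an X-Y spiral search sequence."""
    start_radius_m: float = 0.0005
    radius_growth_m_per_rev: float = 0.0005
    angular_speed_rad_s: float = 3.0

def adjust(params, **changes):
    """Apply user adjustments (e.g., from a touch screen) with simple limits."""
    for name, value in changes.items():
        if not hasattr(params, name):
            raise ValueError(f"unknown parameter: {name}")
        if value <= 0:
            raise ValueError(f"{name} must be positive")
        setattr(params, name, value)
    return params

hop = HoppingParams()
adjust(hop, amplitude_m=0.003, frequency_hz=3.0)   # e.g., a mid-task adjustment
print(hop)
print(SpiralSearchParams())
```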
The use of predefined automatic motion sequences in conjunction with tele-operation may assist in performing robotic operations. For example, predefined automatic motion sequences, such as tapping (back and forth motion along a desired line, e.g., the Z axis); hopping in the Z axis while rotating about the Z axis; settling in by rotating about the X axis and/or Y axis; sliding in one direction and then settling in, and others may be employed. In some embodiments, these strategies, i.e., the predefined automatic assembly sequences, may help compensate for minor variations and inaccuracies in the components to be assembled, close tolerances, etc. It may be difficult for a human to duplicate these strategies manually by tele-operation because the user typically is unable to tap, hop, rotate, etc. in a sufficiently repeatable and controlled manner, particularly when speed of assembly is sought. For instance, it is difficult for a human to maintain a consistent amplitude and frequency in hopping and rotating motions.
For example, with reference to FIG. 3, some aspects of a non-limiting example of a peg-in-hole assembly are employed in describing some aspects of a non-limiting example of an embodiment of the present invention. In the depiction of FIG. 3, a first component, peg 150, is sought to be inserted into a hole 152 in a block 154. In difficult assembly cases, e.g., where the clearance between peg 150 and hole 152 is low and any chamfers on peg 150 and hole 152 are small, manual insertion using tele-operated motion alone may be difficult, even with haptic feedback. For example, a slight misalignment between the components might result in jamming or binding. In such cases, adding a predefined automatic assembly motion in the form of a hopping motion in the Z axis might help the user to find and settle into alignment without jamming the components. The hopping motion may be superimposed or overlaid onto the user-guided assembly motions.
Accordingly, in this example, the user tele-operates robot 102 to generate user-guided motions of robot 102, and having selected a predefined automatic motion sequence, e.g., hopping, control system 104 executes the predefined automatic motion sequence and superimposes the predefined automatic motion sequence onto the user-guided motions while the user is tele-operating robot 102 with input device 106. The hopping motion may employ a Z offset that changes over time in a sinusoidal fashion. In various embodiments, the amplitude and frequency of the hopping may be predefined to a default value and/or may be set and/or adjusted by the user before or while tele-operating robot 102.
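A minimal sketch of the sinusoidal Z offset mentioned above, again assuming hypothetical names and values; the amplitude and frequency are the parameters a user might set or adjust:

```python
import math

def hopping_offset(t, amplitude=0.0015, frequency=2.0):
    """Hopping: a Z offset that changes over time in a sinusoidal fashion.

    amplitude (m) and frequency (Hz) are the user-adjustable parameters; they
    may be left at defaults or changed before or during tele-operation.
    """
    return amplitude * math.sin(2.0 * math.pi * frequency * t)

# Superimposed onto the operator's commanded Z while the operator guides X and Y:
for k in range(4):
    t = k * 0.1
    user_z = 0.050 - 0.001 * k                    # illustrative user-guided Z (m)
    print(f"t={t:.1f}s  commanded Z = {user_z + hopping_offset(t):.4f} m")
```

This offset could stand in for the tapping function in the earlier overlay sketch, since only the shape of the automatic motion changes.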
In some cases, the user may not have visual access to all or part of the components being assembled. For example, a tool at end effector 130 that holds peg 150 may block the user's view. As another example, peg 150 may be sought to be inserted into the smaller diameter hole of a stepped hole. In such cases, the user may select a predefined automatic motion sequence in the form of a spiral search pattern in the X-Y plane, and tele-operate robot 102 to control motion along the Z axis to control the insertion force. Alternatively, for example, the user may select a predefined automatic motion sequence to apply a desired insertion force, while the user tele-operates robot 102 in the X-Y plane.
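A minimal sketch of such a spiral search, assuming an Archimedean spiral and hypothetical parameter values; the automatic sequence supplies the X-Y set points while the operator's tele-operated command supplies Z:

```python
import math

def spiral_xy(t, start_radius=0.0005, radius_growth=0.0002, angular_speed=4.0):
    """Archimedean spiral search in the X-Y plane.

    The radius (m) grows with the swept angle, starting at start_radius.
    """
    theta = angular_speed * t
    r = start_radius + radius_growth * theta
    return r * math.cos(theta), r * math.sin(theta)

def commanded_position(user_z, t):
    """The automatic sequence supplies X-Y; the operator tele-operates Z."""
    x, y = spiral_xy(t)
    return x, y, user_z

for k in range(5):
    t = k * 0.2
    x, y, z = commanded_position(user_z=0.010 - 0.001 * k, t=t)
    print(f"t={t:.1f}s  X={x:+.5f}  Y={y:+.5f}  Z={z:.4f}")
```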
As another example, with reference to FIG. 4, some aspects of a non-limiting example of a gear system assembly are employed in describing some aspects of a non-limiting example of an embodiment of the present invention. In the depiction of FIG. 4, a first component, multi-stage gear assembly 180, is to be inserted into mesh with a corresponding multi-stage gear assembly 182 in a gearbox 184. The left-hand view of FIG. 4 illustrates the pre-assembled condition, and the right-hand view illustrates the assembled condition. If the gears are not aligned, one or more repeated sequences of an upward translation along the Z axis, followed by a small rotation about the Z axis and then a downward translation along the Z axis, may be employed to engage gear assembly 180 with gear assembly 182. However, this sequence of robotic motions is difficult to achieve via tele-operation, particularly when it is desired to perform the assembly operation quickly. In such assembly procedures, a predefined automatic motion sequence that provides hopping along the Z axis coupled with the rotation about the Z axis may be combined with, or superimposed or overlaid with, tele-operated user-guided motions of robot 102 controlled via input device 106. Parameters may include amplitude and frequency of hopping, as well as degree of rotation between each upward and downward translation. The parameters may be adjusted prior to tele-operation or on the fly. Alternatively, the user would control X and Y axis translation and rotation, whereas a predefined automatic motion sequence would control robot motion and/or force along the Z axis.
Referring now to FIG. 5, there is shown a system 10 that has at least one remote robot station 12, at least one operator station 14 and at least one communication link 16 between the robot station 12 and the operator station 14. The physical distance between the remote robot station 12 and the operator station 14 can vary from "next door" to each other to "another continent".
The robot station 12 includes at least one robot 12a. Robot 12a is, for example, a six degree of freedom industrial robot available from ABB. In system 10, the operator has direct remote control of the motion of robot 12a and attached processes.
Robot station 12 also includes a robot controller 12b that includes a data interface which accepts motion commands and provides actual motion data, and optionally one or more sensor devices 12c that observe the robot station 12 and attached processes, such as cameras, microphones, position sensors, proximity sensors and force sensors. The sensor devices 12c may either be smart sensors, that is the sensor device 12c includes data processing capability, or not smart sensors, that is, the sensor device 12c does not include data processing capability.
If the sensor devices 12c are smart sensors then the output of the sensor devices is connected directly to robot controller 12b. If the sensor devices 12c are not smart sensors, then their output can be connected either to a computation device 18 to process the sensor device output or to the communication link 16 described in more detail below so that the sensor device output is processed in data processing device 14c.
The robot station 12 can also include as an option one or more actuators and other devices (not shown in FIG. 5 but well known to those of ordinary skill in this art), that are mounted to the robot or next to the robot, such as grippers, fixtures, welding guns, spraying guns, spotlights and conveyors. The controller 12b has the program which when executed controls the motion of the robot 12a to perform work. As is well known, the robot may hold a tool, not shown, which is used to perform work on a stationary or moving workpiece, not shown, or may hold the workpiece which has work performed on it by an appropriate tool. The remote sensor devices 12c provide input signals to the controller 12b that the controller uses to control the robot 12a in performance of the work.
The operator station 14 has at least one tele-operation input device 14a such as joysticks or stylus-type devices which the operator uses to create continuous motion signals (position or speed signals). When force feedback is added to these devices they become haptic devices. This feedback causes a vibration in the joystick and the operator feels the force feedback in the stylus-type devices.
The signals from these input devices 14a are used by the controller 12b to operate the robot 12a. The device side also has at least one display device 14b and a data processing device 14c which is connected to both the input devices 14a and the display devices 14b.
The monitoring (display) device 14b shows actual data about the robot motion and attached processes, for example, camera images, acoustic feedback and sensor values. The data processing device 14c processes data in both directions. Device 14c may for example be an industrial PC or a PLC.
The operator station 14 may also include a safety enable device (not shown in FIG. 5) that is separate and distinct from input devices 14a and may for example be a three position switch. The safety enabling device enables and disables power to the robot 12a and attached processes. The communication link 16 connects the robot controller 12b and the data processing device 14c to each other. The communication link 16 comprises one or more communication links 16-1 to 16-N.
The communication link 16 between the operator station 14 and the robot station 12 may be realized with various technologies (e.g. fiber-optic/radio/cable on different types and layers of data protocols). A major portion or the entire infrastructure of the communication link may already exist and be used for other purposes than tele-operating robots. Typical examples are existing Ethernet installations with LAN and WLAN, Bluetooth, ZigBee and other wireless industrial links, point-to-point radio systems or laser-optical systems, and satellite communication links.
System 10 is operated to maintain a reliable "real-time" communication link 16 between device side 14 and the remotely located robot side 12. The system 10 changes parameters of the communication link 16 and the robot motion, depending on the currently available data rate and/or transmission time of the communication link 16.
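The patent does not spell out how these adjustments are computed; as one hedged illustration, the commanded robot speed could be scaled down as the measured round-trip time of link 16 grows or its data rate drops (all thresholds below are assumptions):

```python
def motion_scale(rtt_s, data_rate_bps, max_rtt_s=0.25, min_rate_bps=100_000):
    """Scale factor in [0, 1] applied to the commanded robot speed.

    Near-full speed on a fast, low-latency link; progressively slower as the
    round-trip time approaches max_rtt_s or the data rate falls toward
    min_rate_bps (the more restrictive factor wins).
    """
    latency_factor = max(0.0, 1.0 - rtt_s / max_rtt_s)
    rate_factor = min(1.0, data_rate_bps / (10 * min_rate_bps))
    return min(latency_factor, rate_factor)

print(motion_scale(rtt_s=0.02, data_rate_bps=5_000_000))   # good link  -> 0.92
print(motion_scale(rtt_s=0.20, data_rate_bps=500_000))     # poor link  -> 0.2
```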
Referring now to FIG. 6, there is shown in detail the arm 12d of robot 12a of FIG. 5 integrated with a haptic feedback device such as force sensor 12c mounted on robot 12a between the robot arm 12d and a gripped subpart 20 that the robot is adding to a main part 22.
Referring again to FIG. 5, an operator not shown in the figure drives the robot 12a by operating haptic joystick 14a. As the subpart 20 held by the robot 12a comes in contact with objects in the work cell (e.g. the main part 22), the forces generated by this contact are communicated to the operator by commanding the haptic joystick 14a to provide resistance in the direction of those forces. This allows the operator to "feel" what the robot 12a "feels" during the operation of assembling subpart 20 with main part 22.
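A minimal sketch of how the measured contact force might be mapped to a resistance command for haptic joystick 14a (the gain and saturation limit are illustrative assumptions, not values from the patent):

```python
def haptic_resistance(contact_force_n, gain=0.8, max_force_n=15.0):
    """Map each measured contact-force component to a joystick resistance.

    Resistance is commanded in the direction of each measured force component
    and clamped so the haptic device is never overdriven.
    """
    def clamp(v):
        return max(-max_force_n, min(max_force_n, v))
    return tuple(clamp(gain * f) for f in contact_force_n)

# Peg touching the edge of the hole: mostly a lateral X force plus some Z.
print(haptic_resistance((12.0, 0.5, -30.0)))   # -> (9.6, 0.4, -15.0)
```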
Many types of strategies may be used to perform an assembly of subpart 20 to main part 22. Simple examples, as shown in FIG. 7, include vertical insertion (left side of the figure) or a slide and pivot (right side of the figure) for a peg-and-hole type of assembly. The appropriateness of an assembly strategy in a particular scenario depends on many factors, such as the amount and type of compliance in the system, the size of chamfers on the parts 20 and 22, etc.
More complicated assembly strategies may possibly exist in autonomous robotic assembly, such as: tapping (straight up and down motion along the Z axis), hopping in Z while rotating about the Z axis, settling in by rotating in Rx and Ry, sliding in one direction and then settling in, and so forth. These strategies may possibly be effective in completely automatic robotic assembly because the strategies may help compensate for minor variations and inaccuracies in the assembly system. However, it may be difficult for a human to duplicate these strategies manually because the operator may not be able to tap, hop, rotate, etc. in a very repeatable and controlled way. For example, the operator may not be able to maintain a consistent amplitude and frequency in the hopping and rotating motions, which may make the use of these strategies much less effective. In some cases, attempts to reproduce such strategies manually in a tele-operation context may result in an unstable situation and/or damage to parts or equipment.
These assembly strategies may be predefined in a manner known to those of ordinary skill in this art and made available to the operator, e.g., in a library. The operator may then use a graphical user interface ("GUI") to select from the library the predefined assembly strategy (predefined automatic assembly motion sequence) that the operator believes is appropriate to use for the type of assembly that the robot 12a is to perform. U.S. Patent No. 9,008,836 describes one example of a technique to define assembly strategies (predefined automatic assembly motion sequences) that may be used by a robot to assemble parts. As described therein the assembly process may be categorized based on its nature which may be cylindrical, radial and multi-stage insertion. The operator may use the GUI to specify from the library the search pattern and search parameters. The parameters may be optimized and the optimized
parameter set may be verified, and when a predetermined criterion such as assembly cycle time and/or success rate is met, the optimization process stops. When the optimization stops, the verified parameters may be used to have the robot perform the categorized assembly process. If the parameters do not meet the predetermined criterion, another round of optimization using the same or other parameters may be performed.
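A rough Python sketch of this select-and-optimize flow, assuming the strategies live in a simple dictionary and each trial reports a cycle time and success flag; the random-perturbation optimizer is a naive stand-in, not the method of U.S. Patent No. 9,008,836:

```python
import random

STRATEGY_LIBRARY = {
    "spiral_search": {"start_radius_m": 0.0005, "radius_growth_m": 0.0002},
    "hopping":       {"amplitude_m": 0.002, "frequency_hz": 2.0},
}

def run_trial(strategy, params):
    """Stand-in for executing the assembly with the given parameters.

    A real system would command the robot and measure the outcome; here a
    cycle time and success flag are faked so the loop is runnable.
    """
    cycle_time = random.uniform(3.0, 8.0)
    return cycle_time, cycle_time < 6.0

def optimize(strategy, params, target_cycle_time=5.0, max_rounds=20):
    """Perturb the parameter set until the predetermined criterion is met."""
    best = dict(params)
    for _ in range(max_rounds):
        cycle_time, success = run_trial(strategy, best)
        if success and cycle_time <= target_cycle_time:
            return best, cycle_time                   # criterion met: stop
        # otherwise, try another round with perturbed parameters
        best = {k: v * random.uniform(0.8, 1.2) for k, v in best.items()}
    return best, None

params, cycle_time = optimize("spiral_search", STRATEGY_LIBRARY["spiral_search"])
print("verified parameters:", params, "cycle time:", cycle_time)
```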
Referring now to FIG. 8, there is shown with a reference coordinate system another example of a peg-in-hole type assembly that has very tight tolerances and small chamfers. Manual insertion of the peg 24 in hole 26 using tele-operated motion might be difficult, even with haptic feedback, because a slight misalignment in the parts 24, 26 could cause jamming or binding. Adding a small hopping motion in Z may help the operator to find and settle into the correct alignment without jamming the parts. Even though it may be difficult to "hop" with consistent frequency and amplitude, this type of motion may be implemented by overlaying the motion command given by the operator via the haptic joystick 14a with a predefined automatic motion sequence in the form of a Z offset that changes over time in a sinusoidal fashion. The amplitude and frequency of this hopping can be predetermined prior to the assembly task, or it can be set and adjusted (i.e. made faster, slower, larger in amplitude, or lower in amplitude) in the midst of the assembly task based on the progress of the assembly.
In the above described peg-in-hole type of assembly, the tool which holds the peg 24 may block the view of the operator. The tool is well known in the art and is not shown for ease of illustration. The operator may also have no visual access to the relative X-Y position of hole 26 if, for example as shown in FIG. 9, the operator has to insert the peg 24 into a small hole 25 that is in a stepped hole 26. For these types of assembly the operator can select and enable the spiral search assembly strategy program (predefined automatic assembly motion sequence), one example of which is shown in FIG. 10, in the X-Y plane and tele-operate the Z direction to control the insertion force directly. In this case, as shown in FIG. 11, the assembly strategy generates the velocity reference and the tele-operation generates the force reference, which can be combined in the control loop.
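A minimal sketch of that combination, assuming an admittance-style loop in which the strategy supplies an X-Y velocity reference and the tele-operator supplies a Z force reference (gains and sign conventions are illustrative assumptions; FIG. 11 itself is not reproduced here):

```python
def control_step(strategy_velocity_xy, teleop_force_z, measured_force_z,
                 k_force=0.0005):
    """One cycle of the combined control loop.

    X-Y: the spiral-search strategy's velocity reference is passed through.
    Z:   a simple force controller drives the measured insertion force toward
         the operator's force reference (admittance-style velocity output).
    """
    vx, vy = strategy_velocity_xy
    vz = k_force * (teleop_force_z - measured_force_z)
    return vx, vy, vz

# The operator requests 10 N of insertion force but only 4 N is measured, so
# the loop commands a Z velocity that increases the insertion force while the
# spiral search keeps scanning in X-Y.
print(control_step((0.002, 0.0005), teleop_force_z=10.0, measured_force_z=4.0))
```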
Alternatively, with no visual access to the relative X-Y position of hole 26, the operator can instead enable the selected predefined assembly strategy program (predefined automatic assembly motion sequence) in the Z direction to control the insertion force. The selected assembly strategy program can limit the maximum contact force to protect the parts 24, 26 while the operator tele-operates the X-Y position.
In another assembly example as shown in FIG. 12, a series of gear teeth 28 have to mesh with another series of gear teeth 30 in a multistage assembly. The gear teeth 28 and 30 are shown prior to assembly in the left view in FIG. 12 and in assembled positions in the right view in FIG. 12.
If the gears 28 and 30 do not align correctly, hopping may be used to remove the pressure between the two parts, followed by a rotation in Z after the parts 28 and 30 are disengaged to allow the top part 28 to reorient before it comes in contact with the bottom part 30. This synchronization of downward (negative Z), upward (positive Z) and rotation (+/- Rz) while in the upward position may be difficult to achieve manually, especially when trying to achieve it in a quick manner. However, this may be implemented as a predefined automatic motion sequence that can be triggered and overlaid on top of a manual tele-operated motion.
In one embodiment, a preprogrammed change in Z and Rz may be added to the motion command that comes from the operator's haptic joystick 14a. In another embodiment, the operator only maintains control of the X, Y, Rx and Ry degrees of freedom, while the robot control system directly controls Z and Rz to follow the predefined pattern for those directions. Also the operator can be allowed to adjust the amplitude of the change in Z and Rz and/or the frequency of the hopping motion. The adjustment could occur before the predefined motion was initiated or during the motion.
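A sketch of that degree-of-freedom split, with a hypothetical hop-and-rotate pattern for Z and Rz; all names and numbers are assumptions rather than the patent's implementation:

```python
import math

def hop_and_rotate(t, z_amplitude=0.003, rz_step_deg=5.0, period=1.0):
    """Predefined Z/Rz pattern: lift, rotate a small step while disengaged,
    then lower, repeating each period. Returns (z_offset_m, rz_deg)."""
    cycle, phase = divmod(t / period, 1.0)
    if phase < 0.4:                        # lifting
        z = z_amplitude * (phase / 0.4)
        rz = cycle * rz_step_deg
    elif phase < 0.6:                      # rotating while disengaged
        z = z_amplitude
        rz = (cycle + (phase - 0.4) / 0.2) * rz_step_deg
    else:                                  # lowering, rotation held
        z = z_amplitude * (1.0 - (phase - 0.6) / 0.4)
        rz = (cycle + 1.0) * rz_step_deg
    return z, rz

def commanded_pose(user_xy_rxry, t):
    """Operator keeps X, Y, Rx, Ry; the control system supplies Z and Rz."""
    x, y, rx, ry = user_xy_rxry
    z, rz = hop_and_rotate(t)
    return {"x": x, "y": y, "z": z, "rx": rx, "ry": ry, "rz": rz}

for k in range(5):
    t = k * 0.25
    print(t, commanded_pose((0.10, 0.05, 0.0, 0.0), t))
```

The same pattern could instead be added as an offset to the operator's full six-axis command, corresponding to the first embodiment described above.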
Other predefined assembly strategies (predefined automatic motion sequences) are possible, such as sliding, rocking, twisting, etc. Each type of strategy would have its own parameters that could be predefined or adjusted. There are also many ways to initiate a predefined assembly strategy, such as triggering it based on a sensor, user input, detection of reaching a specified stage, etc. As may be appreciated, the system described herein allows the operator of a tele-operated robot, e.g., used for parts assembly, to use assembly strategies (predefined automatic assembly motion sequences) that combine hopping, twisting, sliding, and/or other motions or motion sequences to improve the effectiveness and repeatability of robotic operations, e.g., assembly operations. As may also be appreciated, many of these strategies may be difficult for a human to execute by way of tele-operation, since they depend on consistently vibrating, rotating, etc., and it is difficult for a human to control the amplitude and frequency of such motions.
Allowing operators to select, initiate, and control the predefined assembly motions (predefined automatic assembly motion sequences) while tele-operating the robot using a haptic joystick lets the robot's actual motion be based on a combination of the operator's tele-operated input and the selected predefined assembly strategy, thereby improving the assembly motion over that of tele-operated operator jogging alone.
Embodiments of the present invention include a robotic system, comprising: a robot; an operator input device; a control system communicatively coupled to the robot and to the input device, the control system storing executable program instructions including a predefined automatic motion sequence of the robot, wherein the input device is operative to direct the robot via the control system to perform user-guided motions of the robot in response to human manipulation of the input device; and wherein the control system is configured to execute program instructions to execute the predefined automatic motion sequence and to overlay the predefined automatic motion sequence with the user-guided motions. In a refinement, the predefined automatic motion sequence is performed independently of the user-guided motions.
In another refinement, the predefined automatic motion sequence is a sequence of individual robotic assembly motions directed to assembling a first component to a second component; and the user-guided robotic motions are robotic motions directed toward assembling the first component to the second component.
In yet another refinement, the automatic motion sequence includes at least a first motion and a second motion different from the first motion.
In still another refinement, the executable program instructions include a plurality of different predefined automatic motion sequences; and the control system is configured to execute program instructions to receive a user selection of a desired one or more of the different predefined automatic motion sequences and to execute the selected one or more of the different predefined automatic motion sequences and to overlay the predefined automatic motion sequence with the user-guided motions.
In yet still another refinement, the predefined automatic motion sequence is defined by a parameter; and the control system is configured to execute program instructions to receive a user input to adjust the parameter during tele-operation of the robot using the input device.
In a further refinement, the robotic system further comprises one or more sensors communicatively coupled to the control system, wherein the control system is
configured to execute program instructions to initiate the predefined automatic motion sequence based on a user input, the output of one or more of the sensors, and/or a determination that a predetermined stage in a robotic procedure has been reached.
In a yet further refinement, the input device is a haptic input device.
In a still further refinement, the robotic system includes a haptic feedback sensor, wherein the input device is constructed to provide haptic feedback to the user based on an output of the haptic feedback sensor.
Embodiments of the present invention include a method for operating a robot, comprising: predefining an automatic motion sequence of the robot; coupling an input device to the robot, wherein the input device is operative to direct movement of the robot in response to human manipulation of the input device; tele-operating the robot using the input device to generate user-guided motions of the robot; and executing the predefined automatic motion sequence and superimposing the predefined automatic motion sequence onto the user-guided motions while tele-operating the robot with the input device.
In a refinement, the predefined automatic motion sequence is performed independently of the user-guided motions.
In another refinement, the predefined automatic motion sequence is a sequence of individual robotic assembly motions directed to assembling a first component to a second component; and wherein the user-guided robotic motions are robotic motions directed toward assembling the first component to the second component.
In yet another refinement, the automatic motion sequence includes at least a first motion and a second motion different from the first motion.
In still another refinement, the predefining includes predefining a plurality of different automatic motion sequences, the method further comprising selecting a desired one or more of the automatic motion sequences; and executing the selected one or more of the automatic motion sequences while tele-operating the robot with the input device.
In yet still another refinement, the automatic motion sequence is defined by a parameter, further comprising adjusting the parameter while tele-operating the robot.
In a further refinement, the input device is a haptic input device.
In a yet further refinement, the input device is a haptic joystick.
In a still further refinement, the robot includes a force sensor, the method further comprising providing haptic feedback to the user from the input device based on an output of the force sensor.
Embodiments of the present invention include a robotic system, comprising: a robot constructed to assemble a first component to a second component; a sensor coupled to the robot; a control system communicatively coupled to the robot and to the sensor, the control system storing executable program instructions defining a predefined automatic assembly motion of the robot; a haptic input device
communicatively coupled to the control system and operative to direct the robot via the control system to perform user-guided assembly motions of the robot in response to human manipulation of the input device, and operative to provide haptic feedback to the user based on an output of the sensor, wherein the control system is configured to execute program instructions to superimpose the predefined automatic assembly motion onto the user-guided assembly motions.
In a refinement, the predefined automatic assembly motion is defined by a parameter; and the control system is configured to execute program instructions to receive a user input to adjust the parameter during tele-operation of the robot using the haptic input device.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the spirit of the inventions are desired to be protected. It should be understood that while the use of words such as preferable, preferably, preferred or more preferred utilized in the description above indicate that the feature so described may be more desirable, it nonetheless may not be necessary and embodiments lacking the same may be contemplated as within the scope of the invention, the scope being defined by the claims that follow. In reading the claims, it is intended that when words such as "a," "an," "at least one," or "at least one portion" are used there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. When the language "at least a portion" and/or "a portion" is used the item can include a portion and/or the entire item unless specifically stated to the contrary.
Unless specified or limited otherwise, the terms "mounted," "connected," "supported," and "coupled" and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, "connected" and "coupled" are not restricted to physical or mechanical connections or couplings.

Claims

CLAIMS
What is claimed is:
1. A robotic system, comprising:
a robot;
an operator input device;
a control system communicatively coupled to the robot and to the input device, the control system storing executable program instructions including a predefined automatic motion sequence of the robot,
wherein the input device is operative to direct the robot via the control system to perform user-guided motions of the robot in response to human manipulation of the input device; and
wherein the control system is configured to execute program instructions to execute the predefined automatic motion sequence and to overlay the predefined automatic motion sequence with the user-guided motions.
2. The robotic system of claim 1, wherein the predefined automatic motion sequence is performed independently of the user-guided motions.
3. The robotic system of claim 1, wherein the predefined automatic motion sequence is a sequence of individual robotic assembly motions directed to assembling a first component to a second component; and wherein the user-guided robotic motions are robotic motions directed toward assembling the first component to the second component.
4. The robotic system of claim 1, wherein the automatic motion sequence includes at least a first motion and a second motion different from the first motion.
5. The robotic system of claim 1, wherein the executable program instructions include a plurality of different predefined automatic motion sequences; and wherein the control system is configured to execute program instructions to receive a user selection of a desired one or more of the different predefined automatic motion sequences and to execute the selected one or more of the different predefined automatic motion sequences and to overlay the one or more of the different predefined automatic motion sequences with the user-guided motions.
6. The robotic system of claim 1, wherein the predefined automatic motion sequence is defined by a parameter; and wherein the control system is configured to execute program instructions to receive a user input to adjust the parameter during tele-operation of the robot using the input device.
7. The robotic system of claim 1, further comprising one or more sensors communicatively coupled to the control system, wherein the control system is configured to execute program instructions to initiate the predefined automatic motion sequence based on a user input, the output of one or more of the sensors, and/or a determination that a predetermined stage in a robotic procedure has been reached.
8. The robotic system of claim 1, wherein the input device is a haptic input device.
9. The robotic system of claim 8, further comprising a haptic feedback sensor, wherein the input device is constructed to provide haptic feedback to the user based on an output of the haptic feedback sensor.
10. A method for operating a robot, comprising:
predefining an automatic motion sequence of the robot;
coupling an input device to the robot, wherein the input device is operative to direct movement of the robot in response to human manipulation of the input device;
tele-operating the robot using the input device to generate user-guided motions of the robot; and
executing the predefined automatic motion sequence and superimposing the predefined automatic motion sequence onto the user-guided motions while tele-operating the robot with the input device.
11. The method of claim 10, wherein the predefined automatic motion sequence is performed independently of the user-guided motions.
12. The method of claim 10, wherein the predefined automatic motion sequence is a sequence of individual robotic assembly motions directed to assembling a first component to a second component; and wherein the user-guided robotic motions are robotic motions directed toward assembling the first component to the second component.
13. The method of claim 10, wherein the automatic motion sequence includes at least a first motion and a second motion different from the first motion.
14. The method of claim 10, wherein the predefining includes predefining a plurality of different automatic motion sequences, further comprising selecting a desired one or more of the automatic motion sequences; and executing the selected one or more of the automatic motion sequences while tele-operating the robot with the input device.
15. The method of claim 10, wherein the automatic motion sequence is defined by a parameter, further comprising adjusting the parameter while tele-operating the robot.
16. The method of claim 10, wherein the input device is a haptic input device.
17. The method of claim 16, wherein the input device is a haptic joystick.
18. The method of claim 16, wherein the robot includes a force sensor, further comprising providing haptic feedback to the user from the input device based on an output of the force sensor.
19. A robotic system, comprising: a robot constructed to assemble a first component to a second component;
a sensor coupled to the robot;
a control system communicatively coupled to the robot and to the sensor, the control system storing executable program instructions defining a predefined automatic assembly motion of the robot;
a haptic input device communicatively coupled to the control system and operative to direct the robot via the control system to perform user-guided assembly motions of the robot in response to human manipulation of the input device, and operative to provide haptic feedback to the user based on an output of the sensor,
wherein the control system is configured to execute program instructions to superimpose the predefined automatic assembly motion onto the user-guided assembly motions.
20. The robotic system of claim 19, wherein the predefined automatic assembly motion is defined by a parameter; and wherein the control system is configured to execute program instructions to receive a user input to adjust the parameter during tele-operation of the robot using the haptic input device.
PCT/US2016/027707 2015-04-15 2016-04-15 Robotic system and method for operating a robot WO2016168570A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562147651P 2015-04-15 2015-04-15
US62/147,651 2015-04-15

Publications (1)

Publication Number Publication Date
WO2016168570A1 true WO2016168570A1 (en) 2016-10-20

Family

ID=55861213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/027707 WO2016168570A1 (en) 2015-04-15 2016-04-15 Robotic system and method for operating a robot

Country Status (1)

Country Link
WO (1) WO2016168570A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110430976A (en) * 2017-01-27 2019-11-08 隆萨有限公司 Dynamic control automation system
WO2020171201A1 (en) * 2019-02-22 2020-08-27 株式会社エスイーフォー Method for inserting shaft member into hole in segment, and control device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341459A (en) * 1991-05-09 1994-08-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Generalized compliant motion primitive
US20100211204A1 (en) * 2007-01-09 2010-08-19 Abb Inc. Method and system for robotic assembly parameter optimization
WO2014088997A1 (en) * 2012-12-03 2014-06-12 Abb Technology Ag Teleoperation of machines having at least one actuated mechanism and one machine controller comprising a program code including instructions for transferring control of the machine from said controller to a remote control station

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341459A (en) * 1991-05-09 1994-08-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Generalized compliant motion primitive
US20100211204A1 (en) * 2007-01-09 2010-08-19 Abb Inc. Method and system for robotic assembly parameter optimization
WO2014088997A1 (en) * 2012-12-03 2014-06-12 Abb Technology Ag Teleoperation of machines having at least one actuated mechanism and one machine controller comprising a program code including instructions for transferring control of the machine from said controller to a remote control station

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110430976A (en) * 2017-01-27 2019-11-08 隆萨有限公司 Dynamic control automation system
CN110430976B (en) * 2017-01-27 2024-02-06 隆萨有限公司 Dynamic control automation system
WO2020171201A1 (en) * 2019-02-22 2020-08-27 株式会社エスイーフォー Method for inserting shaft member into hole in segment, and control device

Similar Documents

Publication Publication Date Title
US10905508B2 (en) Remote control robot system
US7340323B2 (en) Industrial robot with controlled flexibility and simulated force for automated assembly
US9849595B2 (en) Contact force limiting with haptic feedback for a tele-operated robot
US20150127151A1 (en) Method For Programming Movement Sequences Of A Redundant Industrial Robot And Industrial Robot
Naceri et al. Towards a virtual reality interface for remote robotic teleoperation
JP7423726B2 (en) Robot system, method for controlling the robot system, method for manufacturing articles using the robot system, control device, operating device, method for controlling the operating device, imaging device, method for controlling the imaging device, control program, and recording medium
CN110914020B (en) Handling device with robot, method and computer program
CN108621151A (en) Icon formula programmable control method for Machinery Control System
EP2666064B1 (en) Method for teaching a robot movement
WO2016168570A1 (en) Robotic system and method for operating a robot
JP2017042828A (en) Wire electric discharge machine capable of machining various surfaces
US11104005B2 (en) Controller for end portion control of multi-degree-of-freedom robot, method for controlling multi-degree-of-freedom robot by using controller, and robot operated thereby
CN107710082B (en) Automatic configuration method for an external control system for controlling and/or regulating a robot system
Tirmizi et al. User-friendly programming of flexible assembly applications with collaborative robots
US8588981B2 (en) System of manipulators and method for controlling such a system
Schraft et al. Man-Machine-Interaction and co-operation for mobile and assisting robots
CN111699079B (en) Coordination system, operation device and method
KR20110077556A (en) Teaching system and method for robots
Herdocia et al. Unimodal asymmetric interface for teleoperation of mobile manipulators: A user study
Notheis et al. Skill-based telemanipulation by means of intelligent robots
KR20140140155A (en) Multi-joint robot and method for controlling multi-joint robot
Nicolaus et al. Development of an autonomous ball-picking robot
US20230346491A1 (en) Systems and methods for embodied robot control
Reinhart et al. Is force monitoring in cooperating industrial robots necessary?
IOANEŞ et al. Current trends regarding the intuitive programming of industrial robots

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16719647; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16719647; Country of ref document: EP; Kind code of ref document: A1)