WO2000060868A1 - Systems and methods for improved telepresence - Google Patents

Systems and methods for improved telepresence

Info

Publication number
WO2000060868A1
Authority
WO
WIPO (PCT)
Prior art keywords
telepresence
devices
operator
input
zone structure
Prior art date
Application number
PCT/US2000/008921
Other languages
French (fr)
Inventor
Matthew O. Anderson
W. David Willis
Robert A. Kinoshita
Original Assignee
Bechtel Bwxt Idaho, Llc
Priority date
Filing date
Publication date
Application filed by Bechtel Bwxt Idaho, Llc
Priority to AU40698/00A
Publication of WO2000060868A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194 Transmission of image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/246 Calibration of cameras

Abstract

The present invention provides a modular, flexible system for deploying multiple video perception technologies. The telepresence system of the present invention is capable of allowing an operator to control multiple mono and stereo video inputs in a hands-free manner. The raw data generated by the input devices (22) is processed into a common zone structure that corresponds to the commands of the user, and the commands represented by the zone structure are transmitted to the appropriate device. The modularized approach permits input devices to be easily interfaced with various telepresence devices. Additionally, new input devices and telepresence devices are easily added to the system and are frequently interchangeable. The present invention also provides a modular configuration (36) component that allows an operator to define a plurality of views each of which defines the telepresence devices to be controlled by a particular input device. The present invention provides a modular flexible system for providing telepresence for a wide range of applications. The modularization of the software components combined with the generalized zone concept allows the systems and methods of the present invention to be easily expanded to encompass new devices and new uses.

Description

SYSTEMS AND METHODS FOR IMPROVED TELEPRESENCE
CONTRACTUAL ORIGIN OF THE INVENTION
This invention was made with United States Government support under Contract No. DE-AC07-94ID13223, now Contract No. DE-AC07-99ID13727 awarded by the United States Department of Energy. The United States Government has certain rights in the invention.
RELATED APPLICATION
This application claims priority from United States provisional application S/N 60/127,826 filed April 5, 1999, which is hereby incorporated by reference.
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates generally to remotely controlled robotic systems incorporating telepresence. More particularly, the present invention relates to telepresence systems capable of providing continuous three-dimensional zooming capability.
Present State of the Art
Robotic systems are progressively being implemented as solutions to problems existing in a wide variety of situations and environments. Some of those environments, such as nuclear reactors, are hazardous to humans and the use of robotic systems prevents humans from being unnecessarily exposed to those hazardous conditions. Other environments and situations that may benefit from the use of robotic systems or devices include medical procedures, underwater activities, and security or surveillance systems. The ability to remotely control robots or robotic systems is becoming more difficult and complex as the robotic systems become more sophisticated and intricate. The complexity arises from the number of tasks that a robotic system may perform as well as the controls that are needed to cause the robotic system to perform those tasks. Frequently, operators of remote robotic systems have a need to easily and accurately view the operating environment of the robotic system as well as the objects that are being manipulated by the robotic system. In particular, the ability to display depth is greatly beneficial to remote operators, especially when sensitive objects are being manipulated and handled by the robotic system.
A potential solution to this problem is to permit the robotic system to be controlled by more than one remote operator. The number of controls assigned to each operator may be reduced, but other problems can arise which are related to the interaction of the operators. Frequently, the actions of the operators must be coordinated to produce a particular result. However, the operators are often separated from one another and are often controlling other devices that also require their attention and focus. As a result, the operators are unable to effectively communicate with one another and the performance of the robotic system is reduced. Other attempts to resolve this problem have incorporated video cameras either attached to the robotic system or placed within the operating environment of the robotic system to provide a telepresence. However, if the camera is a singular unit, the remote operator is unable to perceive depth. The lack of depth perception can lead to serious complications, especially in the case of nuclear reactors. For example, a robotic system may be used to seal hazardous materials in an appropriate container. In this case, the operator must be able to simultaneously view the hazardous material by maneuvering a camera, cause the robotic system to grasp the hazardous material, place the hazardous material in the container, and seal the hazardous material in the container. Performing these functions is difficult and slow for several reasons. First, the operator is using more than one device to control both the robotic system and the camera. Second, the camera may not provide stereo vision and the operator is unable to perceive depth. If the camera is capable of providing stereo vision, the camera is typically not capable of providing continuous stereo zooming functions. Cameras capable of providing continuous stereo zooming functions require additional controls that simply add to the existing controls. 
Furthermore, this additional complexity taxes the ability of the remote operator to efficiently operate the robotic system.
In addition, many robotic systems provide a wide variety of hardware devices for performing various tasks, and it is often difficult for an operator to switch control to different devices. What is needed are systems and methods that permit an operator to more easily control a robotic system having telepresence including stereo zooming capabilities as well as systems and methods for allowing an operator to easily reconfigure the hardware devices that are being controlled by the remote operator.
SUMMARY OF THE INVENTION
A telepresence system provides a remote operator the ability to view an operating environment. One embodiment of the present invention provides a hands-free, intuitive interface that allows an operator of a remote robotic or telepresence system to concentrate on the tasks at hand. The present invention minimizes the complexity of remote stereo vision controls and provides an operator with an accurate view of the operating environment, including depth perception. The interface of the telepresence system and the operator is simplified to provide a modular, reconfigurable system. In order to provide telepresence, it is often necessary to convert user commands into device motion. Many of the devices on a robotic or telepresence system, including robots, cameras, zoom lenses, slider bars, and the like must often be repositioned, focused or otherwise moved. The present invention defines a generalized zone structure that is translated to device movement. The zones correspond generally to the various axes or directions that a device may move. A slider bar, for example, may move along a single axis, while a pan and tilt device may move along multiple axes. The zones are defined such that direction and speed may be inferred from the value of the zones.
The commands are usually received from input devices and the present invention translates the raw data provided by the input devices into a zone structure that is understood by the potential telepresence devices. Telepresence devices only respond to the zones that affect them. Thus, a slider bar will only respond to data in a particular zone and will ignore the information that may be contained in other zones. Because the raw data of the input devices is converted to a zone structure, any input device is easily capable of controlling any telepresence device. In fact, it is possible for a single input device to control multiple telepresence devices.
The telepresence system is further modularized by providing the ability to define multiple views or states. Each view defines an input device and the telepresence devices that are to be controlled by that input device. Depending on the needs of the operator, the operator may issue, for example, a verbal command to change views. One advantage of this modularity is that an operator may use a single device to control a wide variety of telepresence devices. The modularity also allows additional input devices and telepresence devices to be easily and quickly adapted to the systems of the present invention. Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention. The features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of the invention as set forth hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In order that the manner in which the above-recited and other advantages and objects of the invention are obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Figure 1 is a block diagram generally illustrating an exemplary telepresence system;
Figure 2 is a more detailed block diagram of an exemplary telepresence system; and
Figure 3 is a block diagram illustrating the concept of generalized zones for controlling a telepresence system.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Telepresence systems generally refer to systems that allow one or more operators to visually perceive a remote operating environment. Frequently, the operators are unable to physically view the operating environment and therefore rely on the telepresence system to provide an accurate representation of the operating environment. An accurate representation of the operating environment allows the remote operators to more effectively carry out their objectives. For example, the ability to defuse an explosive device using a remotely controlled robot is greatly enhanced if the operator is able to accurately perceive both the explosive device and the environment of the explosive device.
As previously described, providing an operator with an accurate view of the operating environment requires an operator to interact with an excessive number of controls. The present invention alleviates the complexity of operating a sophisticated robotic system including telepresence devices in part by implementing control techniques that enable an operator to control certain aspects of the robotic and telepresence system in a non-conventional yet intuitive manner. For example, it is often desirable for a remote operator to adjust a camera view while manipulating a robotic arm or gripper and one embodiment of the present invention allows the operator to employ a headset to control the movement of the camera while allowing the operator's hands to use a joystick to control the robotic arm or gripper. In this manner, the complexity of the controls is effectively reduced because the operator is able to intuitively control the camera as the operator's head movements are translated into camera movement and the operator's hands are free to perform other tasks.
The present invention extends to both methods and systems for controlling telepresence and robotic systems. The embodiments of the present invention may comprise a special purpose or general purpose computer including various computer hardware. Embodiments within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media which can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of computer-readable media. Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
The following discussion is intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. Although not required, the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by computers in network environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps. Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices. Figure 1 is a block diagram illustrating an exemplary telepresence or robotic system illustrated generally as telepresence system 10. Telepresence control 20 communicates with telepresence devices 60 via communication link 40.
Communication link 40 is representative of systems and methods that permit communication to occur between telepresence control 20 and telepresence devices 60. Communication link 40 includes, but is not limited to, wireless communication including radio modems and the like as well as physical communication apparatus such as cables and the like. In the case of wireless communication, communication link 40 may comprise a receiver and transmitter at both telepresence control 20 and telepresence devices 60. Communication link 40 may also comprise any means for permitting communication between telepresence control 20 and telepresence devices 60. The telepresence control 20 is the portion of telepresence system 10 that receives input from an operator through one or more input devices. The input or commands supplied by the operator are transmitted over the communication link 40 to the telepresence devices 60. The telepresence devices 60 may include one or more hardware modules or devices that are capable of being controlled by the operator commands. The operator is also capable of responding to feedback supplied by the telepresence devices 60.
Figure 2 is a more detailed block diagram illustrating potential configurations of telepresence control 20 and telepresence devices 60. In one embodiment, the telepresence control 20 comprises input devices 22 and a computer 30. The input devices 22 are used to receive input, movement or commands from an operator that are then provided to computer 30. Computer 30 processes these commands and transmits them to the telepresence devices 60 via communication link 40, which may comprise a radio modem. The telepresence devices 60 then execute the operator commands.
Exemplary input devices include, but are not limited to, a headset 24, a joystick 26, a mouse 28 and a keyboard 30. Exemplary telepresence devices include, but are not limited to, stereo camera set 62, zoom camera 64, pan and tilt device (PTD) 66 and 68, slider bar 70, and robot 72. In the illustrated embodiment, the input devices 22 receive input from an operator that is effectively translated into motion by the telepresence devices 60. The input is often in the form of operator movement or motion. For example, the input to the headset 24 is the movement of the operator's head. In the case of a zoom camera, for example, the forward and backward movement of an operator's head may be interpreted as a command to cause a camera to zoom in or out. Alternatively, the forward and backward movement of an operator's head could also be interpreted as a command to physically move the camera either forward or backward. The actual implementation can be configured as needed.
However, it is understood that the present invention encompasses commands that are not related to the movement of the telepresence devices 60.
For example, telepresence devices 60 may comprise sensors for monitoring an environment. The commands provided by the operator may be interpreted as a command to begin recording data. Other user commands may include causing the stored data to be transmitted to a remote location. The illustrated embodiment of the present invention effectively isolates the input devices 22 from the telepresence devices 60 such that any input device 22 can be used to control any one or more of the telepresence devices 60.
This ability to control the motion or other aspect of a telepresence device through any input device 22 is achieved in this embodiment through the use of generalized zones that are described with reference to Figure 3. Figure 3 illustrates an exemplary set of zones 99 which are interpreted by a computer as commands to move a telepresence device in a particular direction and at a particular speed. The dead zone 100 is interpreted as no motion and is present essentially to ensure that inadvertent movements are not interpreted as a movement command. Thus, when an operator is using a headset, the operator's head does not need to be held perfectly still and slight head movements will not be interpreted as input commands. First left zone 104 is interpreted as left motion and second left zone 102 is interpreted as a command to move more rapidly to the left. Additional left zones may be implemented, but are not illustrated in Figure 3. In fact, the actual number of zones can vary and may be tailored to a specific operator. A similar analysis can be applied to forward zones 106, right zones 108 and reverse zones 110.
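The zone scheme of Figure 3 can be illustrated with a short quantizer. This is a minimal sketch rather than the patent's implementation: the dead-band and zone widths are invented values, and, as the text notes, the actual number of zones may be tailored to a specific operator.

```python
def axis_to_zone(raw, dead_band=0.1, zone_width=0.3):
    """Quantize a raw axis reading in [-1.0, 1.0] into a signed zone number.

    Zone 0 is the dead zone (no motion), so slight inadvertent movements
    are not interpreted as commands.  Zones +/-1 command slow motion in
    either direction, +/-2 faster motion, and so on.  Band widths here
    are illustrative assumptions.
    """
    if abs(raw) <= dead_band:
        return 0
    # Count zone_width steps beyond the dead band; the first step is zone 1.
    steps = int((abs(raw) - dead_band) // zone_width) + 1
    return steps if raw > 0 else -steps

print(axis_to_zone(0.05))   # 0  -> dead zone: ignored
print(axis_to_zone(-0.2))   # -1 -> first left zone: move left
print(axis_to_zone(-0.8))   # -3 -> farther left zone: move left faster
```

The sign carries the direction and the magnitude carries the speed, matching the zones-to-motion interpretation above.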
Figure 3 is intended to generally illustrate the concept of zones, but is not to be interpreted as limiting the number of zones that may be defined. For example, a headset can be used to interpret approximately six degrees of motion that correspond to movement in the x, y, z, pitch, roll, and yaw directions.
Moving the head left and right corresponds to movement in the x direction, moving the head forward and backward corresponds to movement in the y direction, while moving the head vertically corresponds to movement in the z direction. Turning the head left and right corresponds to the yaw direction, nodding the head up and down corresponds to movement in the pitch direction and tilting the head left and right corresponds to movement in the roll direction. However, even though a particular input device such as a headset may have multiple zones, it is possible that the telepresence device implementing the movement commands received from the headset may not be able to move in corresponding directions. With reference again to Figure 2, each input device 22 has directions of movement that correspond to the zones as described in Figure 3. The headset 24 has zones as described above, while the joystick 26, the mouse 28, and the keyboard 30 can also be associated with either the same or different zones. The number of zones is dependent on the input device. Thus the joystick 26 has some of the same zones as the headset 24, but the joystick 26 does not have all of the zones that correspond to the headset 24. Alternatively, the keyboard 30 may have more zones than the headset 24 because each of the keys can be associated with a different zone.
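The device-dependent axis sets described above might be sketched as follows. Only the headset's six axes (x, y, z, pitch, roll, yaw) come from the text; the joystick's axis set and the quantizer thresholds are assumptions for illustration.

```python
# Axes each input device reports.  Only the headset's six axes are taken
# from the description -- the joystick set here is an invented example.
DEVICE_AXES = {
    "headset 24":  ("x", "y", "z", "pitch", "roll", "yaw"),
    "joystick 26": ("x", "y"),
}

def quantize(raw):
    """Illustrative quantizer: 0.1 dead band, then one zone per 0.3 of travel."""
    if abs(raw) <= 0.1:
        return 0
    steps = int((abs(raw) - 0.1) // 0.3) + 1
    return steps if raw > 0 else -steps

def make_zone_structure(device, readings):
    """Build a zone structure with one signed zone number per axis the
    device supports; readings for axes the device lacks are dropped."""
    return {axis: quantize(readings.get(axis, 0.0))
            for axis in DEVICE_AXES[device]}

# The joystick has no yaw axis, so the yaw reading is simply dropped.
print(make_zone_structure("joystick 26", {"x": -0.5, "y": 0.0, "yaw": 0.9}))
# {'x': -2, 'y': 0}
```

Because every device's raw input reduces to the same kind of structure, any input device can feed any telepresence device, as the text explains.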
The input commands from the input devices are received by an input conversion module 34 operating at computer 30. The input conversion module 34 receives the raw input from the input devices 22 and converts the raw input into a zone structure that is maintained by the computer 30 for each input device 22. The zone structure may use integers, for example, to define movement in a particular direction. Positive integers correspond to movement in one direction while negative integers correspond to movement in the opposite direction. The magnitude of the integer is often related to the speed of movement. The zone structure thus enables any input device 22 to be compatible with one or more telepresence devices 60.
The zone structure is provided to the device modules 32, which process the zone structure and issue the appropriate movement or operator command across the communication link 40 to the appropriate telepresence device. The raw data provided by the input devices 22 is converted to the zone structure. In this manner, the use of the zone structure allows any input device to control any telepresence device, and input devices are interchangeable. Even though a particular input device 22 may have many different directions and zones associated with it, the device modules 32, or more specifically the telepresence devices 60, only respond to the directions that concern the telepresence device being controlled. For purposes of discussion, all potential directions of movement are referred to as axes. For example, slider bar 70 is a device that is capable of moving along a single axis. If the headset 24 is used to control the movement of the slider bar 70, then the device module 32 that controls the slider bar 70 will only respond to those portions of the zone structure that correspond to motion along that axis and the other portions of the zone structure will be ignored for that device. On the other hand, if the headset 24 is used to control the pan and tilt device (PTD) 66, which is capable of movement along multiple axes, then the device module 32 controlling the pan and tilt device 66 will respond to more portions of the zone structure.
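The per-device filtering described above could be sketched like this; the class shape and the axis names assigned to each device are illustrative assumptions, not the patent's actual device modules 32.

```python
class DeviceModule:
    """Sketch of a device module: it responds only to the portions of the
    zone structure that concern its telepresence device."""

    def __init__(self, name, axes):
        self.name = name
        self.axes = axes  # the axes this telepresence device can move along

    def handle(self, zone_structure):
        # Keep only non-zero zones on axes this device cares about;
        # all other portions of the zone structure are ignored.
        return {axis: zone for axis, zone in zone_structure.items()
                if axis in self.axes and zone != 0}

slider = DeviceModule("slider bar 70", {"x"})   # single-axis device
ptd = DeviceModule("PTD 66", {"pan", "tilt"})   # multi-axis device

zones = {"x": 2, "pan": -1, "tilt": 1, "roll": 3}
print(slider.handle(zones))  # {'x': 2}
print(ptd.handle(zones))     # {'pan': -1, 'tilt': 1}
```

The same zone structure drives both modules; each simply extracts the axes it can act on, which is what makes the input devices interchangeable.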
More particularly, the input conversion module 34 and the device modules 32 allow any of the input devices 22 to control any of the instruments or hardware components or devices comprising telepresence devices 60. In fact, it is possible for a single input device to control more than one of the telepresence devices 60. For example, if the headset 24 is selected as the input device and the operator desires to control the zoom camera 64, it is also necessary to control the PTD 68, the camera zoom, and the camera focus. The PTD 68 requires two degrees of freedom or axes: tilt and pan. When operators move their heads left and right, the PTD 68 will pan the zoom camera 64 left and right. When operators nod their heads up and down, the PTD 68 will tilt the zoom camera 64 up or down. When operators move their heads either forward or backward, the magnification provided by the zoom lens of the zoom camera 64 is altered accordingly. The focus of the zoom camera 64 may be adjusted when the headset 24 detects the operator's head being turned either left or right. In this manner, a single input device is able to control the movement of more than one telepresence device. The above example illustrates that the present invention has the ability to allow one or more input devices to control one or more telepresence devices. However, it is desirable to allow a particular input device to control a variety of telepresence devices. While it is possible for more than one input device to be active or used at a time, it is preferable that only one input device be active. It is understood that the telepresence devices being controlled are typically related to those devices that permit an operator to remotely view an operating environment and that the operator may simultaneously be controlling a robot or other device. Thus, it is preferable that only one input device be active for controlling the telepresence devices that allow the operators to view their actions in the operating environment.
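The headset-controls-zoom-camera example above can be sketched as a routing table. The axis names and the exact mapping of head motions to axes are assumptions inferred from the description, not taken verbatim from the patent.

```python
# Hypothetical routing of a headset zone structure to two telepresence
# devices, following the example in the text: moving the head left/right
# pans the PTD 68, nodding tilts it, leaning forward/back changes the
# zoom, and turning the head left/right adjusts the focus.
ROUTING = {
    "x":     ("PTD 68", "pan"),
    "pitch": ("PTD 68", "tilt"),
    "y":     ("zoom camera 64", "zoom"),
    "yaw":   ("zoom camera 64", "focus"),
}

def route_headset(zones):
    """Turn one headset zone structure into (device, action, zone) commands."""
    commands = []
    for axis, (device, action) in ROUTING.items():
        zone = zones.get(axis, 0)
        if zone != 0:  # dead zone: no command for this axis
            commands.append((device, action, zone))
    return commands

for cmd in route_headset({"x": 1, "pitch": -2, "y": 0, "yaw": 1}):
    print(cmd)
# ('PTD 68', 'pan', 1)
# ('PTD 68', 'tilt', -2)
# ('zoom camera 64', 'focus', 1)
```

A single headset thus drives the pan-tilt device and the zoom camera at once, leaving the operator's hands free.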
Because a single input device may not be capable of simultaneously controlling all of the telepresence devices 60, the configuration module 36 allows an operator to easily change the particular telepresence devices 60 that are being controlled by a particular input device. The configuration module 36 defines a plurality of views, and each view corresponds to a particular set of devices. Typically, each view defines one input device and the telepresence devices being controlled by that input device. After the views are defined, the operator may switch to a particular view by issuing a verbal command recognized by the computer 30, a keyboard command, or another command. When a certain view is active, the selected input device may be used to control the designated telepresence devices. It is understood that more than one view may be active, but only one view is typically utilized because the operator can usually interact with the visual representation of the operating environment provided by only one of the camera sets at a time. If the video provided by another camera set is desired, the operator simply selects another view, a process that is significantly simpler than continually repositioning a particular camera. The following table describes an exemplary configuration module 36 having a plurality of views. The entries in the table correspond to the input devices 22 and telepresence devices 60 illustrated in Figure 2.
[Table: exemplary views of the configuration module 36, reproduced in the original publication as images imgf000014_0001 and imgf000015_0001.]
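The view concept described above, in which each view names a single input device and the group of telepresence devices it controls, might be sketched as follows. The view names, input-device names, and device names are illustrative only and do not come from the patent's table.

```python
# Illustrative sketch of the "view" concept: each view binds one input
# device to a set of telepresence devices. All names are hypothetical.
VIEWS = {
    "overview":   {"input": "joystick", "devices": ["stereo_cameras_1", "ptd_1"]},
    "close_work": {"input": "headset",  "devices": ["zoom_camera", "ptd_2"]},
    "bench":      {"input": "keyboard", "devices": ["slider_bar"]},
}

class ViewManager:
    """Tracks which single view (input device + device group) is active."""

    def __init__(self, views):
        self.views = views
        self.active = None

    def select(self, name: str) -> dict:
        # Switching views re-routes the one active input device to a new
        # group of telepresence devices without moving any camera.
        if name not in self.views:
            raise KeyError(f"undefined view: {name}")
        self.active = name
        return self.views[name]
```

Selecting `"close_work"` activates headset control of the zoom camera and its pan-and-tilt device; selecting `"overview"` afterward simply re-routes control rather than repositioning any camera.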
Typically, the cameras that may be present as telepresence devices are used to display either a stereo or a static visual representation of the operating environment. By selecting different views, an operator is able to see different aspects of the operating environment without having to move a particular camera. A telepresence system typically has a plurality of camera sets. Some of the camera sets provide stereo vision, while others may only provide mono vision. The zoom camera 64 is preferably capable of providing two separate video signals that may be combined to produce stereo vision. Alternatively, the zoom camera 64 may provide mono vision.
A significant advantage of configuration module 36 is that it may be easily modified to change, add, or remove views. Because the telepresence system as described herein is easily adaptable to any input device, new or different telepresence devices are easily added and controlled. Further, additional input devices may also be added quickly by simply modifying the configuration module 36. Thus, adding a new input device or a telepresence device requires that the configuration module 36 be modified and that the telepresence system be restarted such that the defined views are activated. In addition to defining one or more views, the configuration module 36 may also be utilized to initialize the various input and telepresence devices.
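Claim 21 below suggests the configuration module may be a text file, which is consistent with the easy editing described above. The following is only a sketch of how such a file might be parsed, under an assumed one-view-per-line format that is not specified in the patent.

```python
# Hedged sketch: parsing a hypothetical text-file configuration module.
# The 'view: input -> device, device' line format is assumed here.
def parse_config(text: str) -> dict:
    """Parse lines of the form 'view_name: input_device -> dev1, dev2'."""
    views = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        name, rest = line.split(":", 1)
        input_dev, devs = rest.split("->", 1)
        views[name.strip()] = {
            "input": input_dev.strip(),
            "devices": [d.strip() for d in devs.split(",")],
        }
    return views

EXAMPLE = """
# exemplary configuration module (hypothetical format)
overview: joystick -> stereo_cameras_1, ptd_1
close_work: headset -> zoom_camera, ptd_2
"""
```

Adding or removing a view is then a one-line edit to the text file, after which the system is restarted so the new views take effect.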
All system commands are also voice activated. Thus, the zones associated with a particular input device may be calibrated or recalibrated, new views may be selected, cameras may easily be moved to a home position, and other actions may be similarly performed.
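A voice-activated command dispatcher of the kind just described might be sketched as follows; the recognized phrases and handler behaviors are hypothetical stand-ins, since the patent does not enumerate its command vocabulary.

```python
# Illustrative sketch: dispatching recognized voice phrases to handlers.
# Phrases and handlers are hypothetical, not taken from the patent.
def make_dispatcher(handlers: dict):
    def dispatch(phrase: str) -> str:
        # Normalize the recognized phrase before lookup.
        key = phrase.strip().lower()
        if key not in handlers:
            return "unrecognized command"
        return handlers[key]()
    return dispatch

dispatch = make_dispatcher({
    "camera home":     lambda: "moving cameras to home position",
    "calibrate":       lambda: "recalibrating input-device zones",
    "select overview": lambda: "switching to view: overview",
})
```

Speech recognition itself is assumed to happen upstream; the dispatcher only maps an already-recognized phrase to an action.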
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims

WE CLAIM:
1. A telepresence system for allowing an operator to interact with a remote operating environment, the system comprising: one or more input devices, wherein the one or more input devices produce raw data representative of operator commands; a computer for receiving the raw data, the computer processing the raw data into a zone structure, wherein the zone structure is representative of the operator commands and is compatible with one or more telepresence devices; and a communication link, wherein the operator commands in the zone structure are received by the one or more telepresence devices over the communication link such that the one or more input devices are configured to control the one or more telepresence devices, wherein the telepresence devices provide the operator with one or more visual representations of the operating environment.
2. A system as defined in claim 1, wherein the one or more input devices comprise one or more of: a headset, a keyboard, a mouse, and a joystick.
3. A system as defined in claim 1, wherein only one of the one or more input devices is permitted to produce raw data at a time.
4. A system as defined in claim 1, wherein one of the one or more input devices is capable of controlling a plurality of the one or more telepresence devices.
5. A system as defined in claim 1, wherein the communication link is a wireless communication link.
6. A system as defined in claim 1, wherein the one or more telepresence devices comprise one or more of a stereo camera set, a zoom camera, a pan and tilt device, a slider bar, and a robot.
7. A system as defined in claim 6, wherein the pan and tilt device is connected to the stereo camera set and is capable of orienting the stereo camera set.
8. A system as defined in claim 6, wherein the pan and tilt device is connected to the zoom camera and is capable of orienting the zoom camera.
9. In a system having input devices and telepresence devices, a method for controlling one or more identified telepresence devices with a selected input device, the method comprising the steps of: receiving raw data from the selected input device; converting the raw data into a zone structure, wherein the zone structure is representative of movement commands; processing the zone structure with a device module for each identified telepresence device to obtain the movement commands for each identified telepresence device; and transmitting the movement commands to the identified telepresence devices.
10. A method as defined in claim 9, wherein the selected input device is one of a headset, a keyboard, a mouse, or a joystick.
11. A method as defined in claim 9, wherein the zone structure is compatible with the telepresence devices.
12. A method as defined in claim 9, wherein the zone structure is capable of representing a plurality of speeds and directions.
13. A method as defined in claim 9, wherein the identified telepresence devices only respond to portions of the zone structure that correspond to the axes of the identified telepresence devices.
14. A method as defined in claim 9, wherein the raw data corresponds to actions of an operator.
15. A method as defined in claim 9, further comprising the step of executing the movement commands by the identified telepresence devices.
16. A computer readable medium having computer-executable instructions for performing the steps recited in claim 9.
17. In a system having input devices and telepresence devices, a method for configuring the system to provide one or more views, the method comprising the steps of: for each of the one or more views: selecting one or more telepresence devices; selecting a single input device, wherein each of the selected telepresence devices will be controlled by the single input device; storing the one or more views in a configuration module; and configuring the system in accordance with the one or more views defined in the configuration module.
18. A method as defined in claim 17, further comprising the step of executing one of the views stored in the configuration module.
19. A method as defined in claim 17, further comprising the step of switching to a different defined view.
20. A method as defined in claim 17, further comprising the step of editing the configuration module to add, delete or change a view.
21. A method as defined in claim 17, wherein the configuration module is a text file.
22. A method as defined in claim 17, further comprising the step of switching to a different view in response to a voice command from an operator.
23. A computer-readable medium having computer-executable instructions for performing the steps recited in claim 17.
24. A telepresence system for allowing an operator to interact with a remote operating environment, the telepresence system comprising: a plurality of input devices; a plurality of telepresence devices, wherein one or more of the telepresence devices is configured to be controlled by one of the plurality of input devices and one or more of the telepresence devices is configured to provide a visual representation of the operating environment; a computer comprising: an input conversion module, the input conversion receiving raw data from at least one of the plurality of input devices and converting the raw data to a zone structure; and a plurality of device modules corresponding to the plurality of telepresence devices, wherein the device modules receive the zone structure and convert the zone structure to movement commands for each respective telepresence device; and a communication link for transmitting the movement commands to the telepresence devices.
25. A system as defined in claim 24, wherein the telepresence devices comprise one or more stereo camera sets, each connected with a different pan and tilt device, and a zoom camera connected with another pan and tilt device.
26. A system as defined in claim 25, wherein the zoom camera is capable of providing stereo vision.
27. A system as defined in claim 24, wherein the raw data generated by the input devices corresponds to zones, each zone representative of movement in a particular direction and speed.
28. A system as defined in claim 27, wherein the zone structure integrates any of the input devices with one or more of the telepresence devices.
29. A system as defined in claim 24, wherein the computer further comprises a configuration module.
30. A system as defined in claim 29, wherein the configuration module comprises one or more views, wherein each view defines the one or more telepresence devices controlled by a single input device.
31. A system as defined in claim 30, wherein the operator may select a different view.
32. A system as defined in claim 29, wherein the one or more views stored in the configuration module permits a single input device to control different groups of telepresence devices.
33. A system as defined in claim 24, wherein the plurality of telepresence devices provide the operator with a visual representation of the operating environment.
34. A system as defined in claim 33, wherein the visual representation provides depth perception to the operator.
35. A system as defined in claim 24, wherein the communication link is a wireless communication link.
36. A system as defined in claim 24, wherein the plurality of input devices allow the operator to control the telepresence devices without the use of the operator's hands.
PCT/US2000/008921 1999-04-05 2000-04-04 Systems and methods for improved telepresence WO2000060868A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU40698/00A AU4069800A (en) 1999-04-05 2000-04-04 Systems and methods for improved telepresence

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12782699P 1999-04-05 1999-04-05
US60/127,826 1999-04-05

Publications (1)

Publication Number Publication Date
WO2000060868A1 true WO2000060868A1 (en) 2000-10-12

Family

ID=22432161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/008921 WO2000060868A1 (en) 1999-04-05 2000-04-04 Systems and methods for improved telepresence

Country Status (2)

Country Link
AU (1) AU4069800A (en)
WO (1) WO2000060868A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5703604A (en) * 1995-05-22 1997-12-30 Dodeca Llc Immersive dodecaherdral video viewing system
US5796426A (en) * 1994-05-27 1998-08-18 Warp, Ltd. Wide-angle image dewarping method and apparatus


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004023816A1 (en) 2002-09-03 2004-03-18 Audisoft Technologies Inc. Method and apparatus for telepresence
EP1552697B1 (en) * 2002-09-03 2011-07-20 Audisoft Technologies Inc. Method and apparatus for telepresence
WO2004062188A2 (en) * 2002-12-31 2004-07-22 Honeywell International Inc. Generic communication server engine
WO2004062188A3 (en) * 2002-12-31 2005-03-03 Honeywell Int Inc Generic communication server engine

Also Published As

Publication number Publication date
AU4069800A (en) 2000-10-23

Similar Documents

Publication Publication Date Title
US6958746B1 (en) Systems and methods for improved telepresence
JP2664205B2 (en) Manipulator control system
US4661032A (en) Bilateral master-slave manipulator control device
CN104440864B (en) A kind of master-slave mode remote operating industrial robot system and its control method
CN110825076B (en) Mobile robot formation navigation semi-autonomous control method based on sight line and force feedback
Yokokohji et al. Operation modes for cooperating with autonomous functions in intelligent teleoperation systems
US20150273689A1 (en) Robot control device, robot, robotic system, teaching method, and program
CN112828916A (en) Remote operation combined interaction device for redundant mechanical arm and remote operation system for redundant mechanical arm
Brooks et al. Superman: A system for supervisory manipulation and the study of human/computer interactions
Omarali et al. Position and velocity control for telemanipulation with interoperability protocol
WO2000060868A1 (en) Systems and methods for improved telepresence
Tran et al. Wireless data glove for gesture-based robotic control
JPS6257884A (en) Manipulator device
CN214025708U (en) Intuitive industrial robot demonstration system
Solvang et al. On industrial robots and cognitive info-communication
Pretlove Augmenting reality for telerobotics: unifying real and virtual worlds
Gou et al. Workspace mapping method based on edge drifting for the teleoperation system
KR20110077556A (en) Teaching system and method for robots
Stark Telerobotics for the evolving space station: Research needs and outstanding challenges
JPH0239802B2 (en) ROBOTSUTONOSEIGYOHOHO
McKay et al. Developing the VirtualWindoW into a general purpose telepresence interface
Kinoshita et al. VirtualwindoW: a reconfigurable modular stereo vision system
Yong et al. Robot task execution with telepresence using virtual reality technology
Oliveira et al. A Brief Overview of Teleoperation and Its Applications
KR101969727B1 (en) Apparatus for manipulating multi-joint robot and method thereof

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP