WO1999064208A1 - Robot et procede de commande de son attitude - Google Patents
- Publication number
- WO1999064208A1 (PCT/JP1999/003089)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- posture
- robot
- falling
- attitude
- sensor
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
Definitions
- Patent application title Robot device and attitude control method thereof
- The present invention relates to a robot apparatus having a function of autonomously returning its posture from an abnormal posture state, such as a falling state, to a normal posture state, and to a posture control method thereof.
- This type of robot device has a mechanical system in which actuators having predetermined degrees of freedom and sensors for detecting predetermined physical quantities are arranged at predetermined positions.
- With this mechanical system, the device can run by itself and perform predetermined operations.
- Each constituent unit, such as the body, legs, and head, is combined with the others in a predetermined correlated state, and the units are thereby assembled into a predetermined shape.
- Some multi-legged robots having two or more legs are in the form of animals such as cats and dogs.
- The multi-legged walking robot having such a configuration has, for example, four legs, and each leg has a predetermined number of joints.
- Methods for controlling the leg joints of this type of robot include recording and reproducing position and speed information obtained by teaching, and generating and executing position and speed information by computation using a motion model.
- However, both the teaching method and the motion-model method presuppose operation in the environment expected by the designer. In an unexpected environment, the robot could end up in a situation contrary to the designer's intention, and an abnormal posture could impair the function and structure of the device, resulting in failure of the device or damage to the operating environment.
- In view of the actual situation of conventional robot devices described above, an object of the present invention is to prevent failures of, and accidents involving, a robot device caused by use in an abnormal posture state such as a falling state.
- To this end, a robot apparatus according to the present invention is characterized by comprising: posture recognition means for recognizing the posture of the apparatus main body and outputting a recognition result; posture determination means for determining, based on the recognition result, that the apparatus main body is in a predetermined posture; and posture correction means for correcting the posture of the apparatus main body when the posture determination means determines the predetermined posture.
- Likewise, the posture control method of the robot device recognizes the posture of the device main body, determines based on the recognition result that the device main body has reached a predetermined posture, and, when the predetermined posture is determined, corrects the posture of the device main body.
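The recognition → determination → correction cycle described above can be pictured as a small control loop. The following is an illustrative sketch only; the function and class names are assumptions, not terms from the patent.

```python
# Hypothetical sketch of the recognize / determine / correct cycle.
# The names here are illustrative and not taken from the patent.
from dataclasses import dataclass

@dataclass
class Recognition:
    posture: str  # e.g. "standing", "head_side_down"

def control_cycle(read_sensors, classify, recovery_motions):
    """One pass of the posture-control loop.

    read_sensors     -- callable returning raw sensor data (recognition means)
    classify         -- callable mapping sensor data to a posture label
    recovery_motions -- dict mapping abnormal postures to correction routines
    """
    recognition = Recognition(posture=classify(read_sensors()))
    motion = recovery_motions.get(recognition.posture)
    if motion is not None:   # determination means: a predetermined posture
        motion()             # correction means: run the recovery routine
    return recognition.posture
```

In use, `classify` would be backed by the sensor processing described later in the document (acceleration templates, contact sensors, or camera images).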
- FIG. 1 is a perspective view schematically showing the structure of a multi-legged walking robot to which the present invention is applied.
- FIG. 2 is a perspective view schematically showing the installation state of various sensors, such as an acceleration sensor, used for detecting a falling state of the multi-legged walking robot.
- FIG. 3 is a block diagram schematically showing the configuration of the control system of the multi-legged walking robot.
- FIG. 4 is a perspective view schematically showing a basic posture of the multi-legged walking robot.
- FIG. 5 is a perspective view schematically showing a state where the left front leg is raised from the basic posture of the multi-legged walking robot.
- FIG. 6 is a perspective view schematically showing a state in which the posture of the multi-legged robot is collapsed.
- FIG. 7 is a perspective view schematically showing a state in which the posture of the multi-legged robot does not collapse.
- FIG. 8 is a flowchart illustrating an example of a method of editing the behavior pattern of the multi-legged robot.
- FIG. 9 is a flowchart illustrating an example of an algorithm of the fall determination by the control unit in the multi-legged walking robot.
- FIGS. 10A and 10B are diagrams schematically showing the deviation angle θ between the average acceleration Acc and the Y-Z plane, and the angle φ between the projection component of the average acceleration Acc onto the Y-Z plane and the Z axis, obtained in the above-described fall determination process.
- FIG. 11 is a diagram schematically showing the relationship between the falling direction and the angle during walking determined by the constraint condition based on the shape of the multi-legged robot.
- FIGS. 12A, 12B, 12C, and 12D are side views schematically showing various falling states during walking of the multi-legged walking robot.
- FIG. 13 is a side view schematically showing a process of a return operation of the multi-legged robot from a fall state to a normal posture state.
- FIG. 14 is a diagram schematically showing a contact detection state by a contact sensor in the standing posture of the multi-legged robot.
- FIG. 15 is a diagram schematically illustrating a contact detection state by the contact sensor in the sitting posture of the multi-legged walking robot.
- FIG. 16 is a diagram schematically showing a state in which image information is captured by a CCD camera in the standing posture of the multi-legged robot.
- FIG. 17A, FIG. 17B, FIG. 17C, and FIG. 17D are diagrams schematically showing image information captured by the CCD camera in the normal posture and the abnormal posture.
- FIG. 18 is a diagram for explaining the method of judging the state of the floor surface based on the image information captured by the CCD camera.
- FIG. 19 is a schematic perspective view of a tire-type robot device provided with a rotation detecting device as an abnormal posture detecting means.
- FIG. 20 is a schematic perspective view of a tire-type robot device provided with a floor surface detecting device as an abnormal posture detecting means.
- FIG. 21 is a diagram schematically showing the state transition of the return operation from the back-down state.
- The present invention is applied to, for example, a multi-legged walking robot 1 configured as shown in FIG. 1.
- This multi-legged walking robot 1 is an articulated robot and has the shape of a four-legged animal.
- The articulated robot 1 has a main body 2, a right forefoot 3, a left forefoot 4, a right hindfoot 5, a left hindfoot 6, a head 7, a torso 8, and a tail 9.
- The articulated robot 1 has brake mechanisms 30 at the joints 10, 11, 12, and 13 of the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6.
- By direct teaching, an operator can perform position teaching that determines the relative positional relationship of any of the moving parts (legs) among the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6.
- The main body 2 includes brackets 20, 21, 22, and 23 for the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6, respectively.
- The head 7 is set on the front part of the main body 2, and the torso 8 is located behind the head 7.
- The tail 9 protrudes upward from the torso 8.
- The right forefoot 3 has legs 3a and 3b, the bracket 20, joints 10 and 10a, a brake mechanism 30, and servomotors 3c, 3d, and 3e.
- The upper end of the leg 3a is connected to the bracket 20, and the leg 3a is rotatable in the direction of the arrow R1 about the central axis CL1.
- The leg 3a and the leg 3b are connected by the joint 10.
- The servomotor 3c is built into the main body 2, and when the servomotor 3c operates, the bracket 20 can rotate about the central axis CL2 in the direction of the arrow R2.
- The leg 3a can rotate in the direction of the arrow R1 about the central axis CL1.
- The leg 3b can rotate with respect to the leg 3a in the direction of the arrow R3 about the central axis CL3.
- The left forefoot 4 has legs 4a and 4b, the bracket 21, joints 11 and 11a, a brake mechanism 30, and servomotors 4c, 4d, and 4e.
- The leg 4a is connected to the bracket 21 so that it can rotate in the direction of the arrow R4 about the central axis CL4.
- The leg 4b is connected to the leg 4a by the joint 11.
- The servomotor 4c is built into the main body 2, and when the servomotor 4c operates, the bracket 21 rotates about the central axis CL5 in the direction of the arrow R5.
- When the servomotor 4d operates, the leg 4a rotates with respect to the bracket 21 in the direction of the arrow R4 about the central axis CL4.
- The leg 4b rotates in the direction of the arrow R6 about the central axis CL6.
- The right hindfoot 5 includes legs 5a and 5b, the bracket 22, joints 12 and 12a, a brake mechanism 30, and servomotors 5c, 5d, and 5e.
- The upper end of the leg 5a is connected to the bracket 22.
- The bracket 22 can rotate in the direction of the arrow R7 about the central axis CL7.
- The leg 5a can rotate in the direction of the arrow R8 about the central axis CL8.
- When the servomotor 5e is activated, the leg 5b can rotate about the central axis CL9 in the direction of the arrow R9.
- The left hindfoot 6 has legs 6a and 6b, the bracket 23, joints 13 and 13a, a brake mechanism 30, and servomotors 6c, 6d, and 6e.
- The bracket 23 can rotate in the direction of the arrow R10 about the central axis CL10.
- The leg 6a can rotate in the direction of the arrow R11 about the central axis CL11.
- The leg 6b can rotate about the central axis CL12 in the direction of the arrow R12.
- Each of the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6 thus has three degrees of freedom and can be driven by servomotors about a plurality of axes.
- The head 7 has servomotors 7a, 7b, and 7c.
- When the servomotor 7a operates, the head 7 can swing in the direction of the arrow R20 about the central axis CL20.
- When the servomotor 7b operates, the head 7 swings about the central axis CL21 in the direction of the arrow R21.
- When the servomotor 7c operates, the head 7 can swing in the direction of the arrow R22 about the central axis CL22. That is, the head 7 is configured with three degrees of freedom.
- The torso 8 has a servomotor 8a.
- When the servomotor 8a operates, the tail 9 swings about the central axis CL23 in the direction of the arrow R23.
- The articulated robot 1 has a three-axis (x, y, z) acceleration sensor 41 built into the main body 2, so that the acceleration and angular velocity acting on the main body 2 can be detected in any posture.
- The head 7 is provided with a CCD camera 43 and a microphone 44.
- Contact sensors 45 are arranged on the head, each leg tip, the abdomen, the throat, the buttocks, and the tail. As shown in FIG. 3, the detection output of each sensor can be obtained by a CPU (central processing unit) 102, provided in the control unit 100 of the articulated robot 1, via a bus 103.
- FIG. 3 shows an example of the connection relationship between the control unit 100 of this articulated robot 1 and the servomotors and position sensors that drive the joint axes of the right forefoot 3, the left forefoot 4, the right hindfoot 5, the left hindfoot 6, the head 7, and the tail 9.
- The control unit 100 has a memory 101 and a CPU (central processing unit) 102.
- The bus 103 of the CPU 102 is connected to the elements of the right forefoot 3, the left forefoot 4, the right hindfoot 5, the left hindfoot 6, the head 7, and the tail 9.
- The right forefoot 3 has servomotors 3c, 3d, and 3e, and position sensors 3P1, 3P2, and 3P3.
- The servomotors 3c, 3d, and 3e are connected to a driver 3D, and the position sensors 3P1, 3P2, and 3P3 are also connected to the driver 3D.
- Each such driver is connected to the bus 103.
- The servomotors of the left forefoot 4 are connected to a driver 4D, and the position sensors 4P1, 4P2, and 4P3 are also connected to the driver 4D.
- The servomotors 5c, 5d, and 5e of the right hindfoot 5 are connected to a driver 5D, and the position sensors 5P1, 5P2, and 5P3 are also connected to the driver 5D.
- The servomotors of the left hindfoot 6 are connected to a driver 6D, and the position sensors 6P1, 6P2, and 6P3 are also connected to the driver 6D.
- The servomotors 7a, 7b, and 7c of the head 7 and the position sensors 7P1, 7P2, and 7P3 are connected to a driver 7D.
- The servomotor of the tail 9 is connected to a driver 9D, and the position sensor 9P1 is also connected to the driver 9D.
- The position sensors of each foot, such as the position sensors 5P1, 5P2, and 5P3 and the position sensors 6P1, 6P2, and 6P3 of the left hindfoot 6, obtain position information at their respective positions.
- As these position sensors, rotation angle sensors such as potentiometers for joint angle detection can be used.
- The CPU 102 gives a command to each driver based on the fed-back position information. The corresponding driver then performs servo control of the corresponding motor, and the servomotor rotates to the command position given by the CPU 102.
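This command-and-feedback behaviour can be illustrated with a minimal proportional-control sketch. The gain, the per-cycle step clamp, and the function names are assumptions made for illustration, not values from the patent.

```python
# Illustrative servo loop: the CPU issues a command angle, and each cycle the
# driver moves the motor a bounded step toward it based on the sensed angle.
def servo_step(command_deg, sensed_deg, gain=0.5, max_step=5.0):
    """Return the bounded incremental motion applied in one control cycle."""
    error = command_deg - sensed_deg
    step = gain * error
    return max(-max_step, min(max_step, step))  # clamp the per-cycle motion

def run_servo(command_deg, sensed_deg, cycles=50):
    """Iterate the loop; the sensed angle converges on the command angle."""
    for _ in range(cycles):
        sensed_deg += servo_step(command_deg, sensed_deg)
    return sensed_deg
```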
- The head 7, the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6 are attached to the torso 8.
- Each of the legs 3 to 6 is provided with one of the joints 10, 11, 12, and 13, together with brake mechanisms 30.
- The posture of the multi-legged walking robot 1 shown in FIG. 4 is a basic posture in which the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6 are straightened.
- FIG. 5 shows a state in which the joints 11 and 30 of the left forefoot 4 have been moved from the basic posture of FIG. 4.
- On the software of the external editing instruction computer 400 connected to the control unit 100 shown in FIG. 3, the position of the center of gravity W0 of the multi-legged walking robot 1 shown in FIG. 5 is calculated, and from the position of the center of gravity W0, the angle of the joint of at least one of the other feet, i.e. the right forefoot 3, the right hindfoot 5, and the left hindfoot 6, can be automatically set so that the multi-legged walking robot 1 does not fall down.
- This instruction is given from the external editing instruction computer 400 to the CPU 102 of the control unit, so that the CPU 102 can issue an operation command to the corresponding feet.
- That is, the weight of each part of the multi-legged walking robot, i.e. the weights of the torso 8 and main body 2, the right forefoot 3, the left forefoot 4, the right hindfoot 5, the left hindfoot 6, and the head 7, is stored in advance in the memory 402 of the external editing instruction computer 400, and based on these weight data the position of the center of gravity W0 of the multi-legged walking robot 1 can be calculated.
- In step S1, information such as the weight and shape of each component of the multi-legged walking robot 1 is stored in the memory 101 of the multi-legged walking robot 1 in advance.
- That is, information on the weight and shape of each element, such as the main body 2, the torso 8, the head 7, the right forefoot 3, the left forefoot 4, the right hindfoot 5, the left hindfoot 6, and the tail 9, is stored.
- The information is transferred from the memory 101 to the memory 402 of the external editing instruction computer 400. This is the acquisition of information such as weight and shape in step S1.
- In step S2, posture editing for the multi-legged walking robot 1 is started. That is, from the basic posture shown in FIG. 4, the left forefoot 4 is made to protrude forward as shown in FIG. 5. At this time, the movement is taught to the joints 11 and 30. If left as it is, the multi-legged walking robot 1 will fall to the left front, because the center of gravity moves toward the left forefoot 4, as shown in FIG. 6.
- In step S3, the external editing instruction computer 400 connected to the control unit 100 shown in FIG. 3 calculates a new center of gravity W1 of the multi-legged walking robot 1, including the main body 2 and the torso 8, as shown in FIG. 5, and the resulting data is used as the newly calculated center of gravity.
- Then, as shown in FIG. 7, motion is given to the joints 10, 12, and 13 and the joints 30 of the right forefoot 3, the right hindfoot 5, and the left hindfoot 6. It is the external editing instruction computer 400 that gives this motion.
- The motion given to the joints 10, 12, and 13 and the joints 30 of the right forefoot 3, the right hindfoot 5, and the left hindfoot 6 is preferably determined as in steps S4 and S5. That is, the projection point IM of the new center of gravity W1 of the multi-legged walking robot 1 onto the ground plane 300 is made to lie within the triangular center-of-gravity position adjustment range AR.
- This appropriate range AR is a triangular area formed by connecting the ground contact point CP1 of the right forefoot 3, the ground contact point CP2 of the right hindfoot 5, and the ground contact point CP3 of the left hindfoot 6.
- Since the projection point IM of the center of gravity W1 is always kept within the appropriate range AR, the multi-legged walking robot 1 is prevented from falling while the motion of the joints 10, 12, and 13 and the joints 30 of the right forefoot 3, the right hindfoot 5, and the left hindfoot 6 is given, and such a stable posture can be selected with the least motion.
- After calculating the position of the center of gravity in step S3 in this way, it is checked in step S4 whether or not the multi-legged walking robot 1 falls. If it would fall, the movement (angle table) of the joints of the remaining feet is calculated or changed in step S5, and the position of the center of gravity is calculated again in step S3. If it is clear in step S4 that the robot does not fall, the flow proceeds to step S6, and the external editing instruction computer 400 ends editing of the motion pattern of the multi-legged walking robot 1. When the editing is completed, the external editing instruction computer 400 formally inputs the operation pattern to the CPU 102 of the multi-legged walking robot 1 (step S7).
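The stability criterion used in steps S3 to S5, keeping the projection point IM of the center of gravity inside the triangle of ground contact points CP1, CP2, and CP3, amounts to a standard point-in-triangle test. A minimal sketch, assuming 2-D ground-plane coordinates (the function names are illustrative):

```python
def cross(o, a, b):
    """Z component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def cg_is_stable(im, cp1, cp2, cp3):
    """True if the projection point im lies inside (or on the edge of) the
    triangle formed by the ground contact points cp1, cp2, cp3 (range AR)."""
    d1 = cross(cp1, cp2, im)
    d2 = cross(cp2, cp3, im)
    d3 = cross(cp3, cp1, im)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # mixed signs -> outside the triangle
```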
- The multi-legged walking robot 1 obtains acceleration information AccXt, AccYt, and AccZt in each axis (x, y, z) direction, detected by the three-axis (x, y, z) acceleration sensor 41 built into the main body 2.
- The control unit 100 detects a fall based on AccXt, AccYt, and AccZt, and when a fall state is detected, returns the posture to the normal posture state.
- The algorithm of the fall determination by the control unit 100 is shown in the flowchart of FIG. 9.
- That is, the control unit 100 performs fall detection as follows, based on the acceleration information AccXt, AccYt, and AccZt in each axis (x, y, z) direction detected by the acceleration sensor 41. First, in step S11 of the fall determination process, the oldest acceleration information AccXn, AccYn, and AccZn in the data buffer is discarded, and the time tags of the data in the data buffer are updated. In the multi-legged walking robot 1, the buffer amount of the data buffer is 50 samples for each axis.
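The buffer handling of step S11 can be sketched with a fixed-length queue per axis: pushing a new sample automatically discards the oldest once 50 samples are held. The class name is an assumption for illustration.

```python
from collections import deque

BUF = 50  # buffer amount per axis, as stated in the description

class AccelBuffer:
    """Keeps the newest BUF acceleration samples per axis (step S11:
    the oldest sample is discarded when a new one arrives)."""
    def __init__(self):
        self.axes = {k: deque(maxlen=BUF) for k in "xyz"}

    def push(self, ax, ay, az):
        for key, value in zip("xyz", (ax, ay, az)):
            self.axes[key].append(value)

    def mean(self):
        """Per-axis average over the buffered samples."""
        return tuple(sum(self.axes[k]) / len(self.axes[k]) for k in "xyz")
```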
- In step S15, it is determined whether or not the average acceleration (Euclidean distance) Acc is within an allowable error range. If it is out of this range, it is determined that a large force has been received from outside, for example by lifting, and the process exits the fall determination process.
- Next, the deviation angle θ between the average acceleration Acc and the Y-Z plane, and the angle φ between the projection component of the average acceleration Acc onto the Y-Z plane and the Z axis, are compared with the template angle θm and the template angle φm, which are template data for the current posture state (θm being the angle between the average acceleration and the Y-Z plane, and φm the angle between its Y-Z projection and the Z axis). If both are within the respective tolerances (Δθm, Δφm), the posture is determined to be normal; if either is outside its range, a fall or abnormal posture is determined.
- Note that when θ = ±π/2, the angle φ is arbitrary, since the projection of the average acceleration onto the Y-Z plane vanishes.
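Assuming the X axis is the normal of the Y-Z plane, the two angles and the template comparison described above can be sketched as follows; the function names, tolerance handling, and treatment of the θ = ±π/2 singularity are illustrative assumptions.

```python
import math

def fall_angles(ax, ay, az):
    """theta: deviation angle between the average acceleration and the
    Y-Z plane; phi: angle between its Y-Z projection and the Z axis."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    theta = math.asin(ax / norm)   # 0 when the vector lies in the Y-Z plane
    phi = math.atan2(ay, az)       # orientation within the Y-Z plane
    return theta, phi

def is_abnormal(theta, phi, theta_m, phi_m, d_theta, d_phi):
    """Compare against the template angles (theta_m, phi_m) of the current
    posture; outside either tolerance means a fall or abnormal posture.
    phi is indeterminate when theta is +/- pi/2, so it is ignored there."""
    if abs(abs(theta) - math.pi / 2) < 1e-9:
        return abs(theta - theta_m) > d_theta
    return abs(theta - theta_m) > d_theta or abs(phi - phi_m) > d_phi
```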
- If a fall is detected by the above fall determination process (step S17), the process proceeds to the fall return step S18, and the posture is returned to the normal posture as follows.
- Trajectory plan playback data for performing posture return from the above four fall states (Head Side Down, Right Side Down, Left Side Down, and Tail Side Down) are created in advance and stored in the memory 101, and the posture is returned to normal by playing them back.
- The fall state may change during execution of the fall return operation. For example, if the robot falls forward (Head Side Down) and the situation changes to a side-fall state after the fall return operation has started, the currently executing fall return operation is quickly ended and the return operation for the newly detected fall state is executed, so that the return from the fall state can be carried out promptly.
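The behaviour described above, aborting the current playback when the detected fall state changes and starting the playback for the new state, can be sketched as a small loop. The state names follow the four fall states in the text; the motion step names are placeholders, not the patent's actual trajectory data.

```python
# Hedged sketch of the interruptible fall-return playback described above.
RETURN_MOTIONS = {
    "head_side_down": ["tuck_legs", "roll_right", "push_up"],
    "right_side_down": ["swing_legs", "push_up"],
    "left_side_down": ["swing_legs", "push_up"],
    "tail_side_down": ["tuck_legs", "roll_left", "push_up"],
}

def recover(initial_state, detect_state):
    """Play back the return motion, re-checking the fall state after each
    step; a changed state aborts the current motion and starts the new one."""
    executed = []
    state = initial_state
    queue = list(RETURN_MOTIONS[state])
    while queue:
        executed.append(queue.pop(0))
        new_state = detect_state()
        if new_state != state:                   # fall state changed mid-motion
            state = new_state
            queue = list(RETURN_MOTIONS[state])  # restart for the new state
    return executed
```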
- FIG. 13 schematically shows the progress of a return operation from the forward fall state (Head Side Down) to the normal posture.
- The trajectory plan for restoring the posture from the forward fall state described above is based on the relative positional relationship between the right forefoot 3, the left forefoot 4, the right hindfoot 5, and the left hindfoot 6 of this articulated robot 1. It can be generated in advance by an operator performing position teaching by the direct teaching method described above, and stored in the memory 101.
- In the above description, the articulated robot 1 performs fall determination by the control unit 100 based on the acceleration information of the three-axis (x, y, z) acceleration sensor 41 built into the main body 2, and performs the return operation from the above four types of fall state (Head Side Down, Right Side Down, Left Side Down, and Tail Side Down) to the normal posture state.
- Alternatively, the control unit 100 may perform fall determination based on the detection outputs of an angular velocity sensor, an angular acceleration sensor, an inclination sensor, or the like built into the main body 2, and perform the return operation to the normal posture state.
- The control unit 100 may also perform fall determination based on image information obtained by the CCD camera 43 and the detection outputs of the contact sensors 45, and perform the return operation to the normal posture state. Further, the control unit 100 may perform fall determination using a combination of the detection outputs of various sensors and then perform the operation of returning to the normal posture state.
- An abnormal posture can also be detected by comparing the internal posture model with the outputs of the contact sensors installed on the legs and the body.
- Further, a robot device equipped with an image input device can recognize the road surface and correlate its position with the currently intended posture of the device, so that an abnormal posture can be detected.
- In the normal posture, the image output by the CCD camera 43 shows the floor surface F as in FIG. 17A. In an abnormal posture, an image in which the floor surface F is turned upside down, as shown in FIG. 17B, or inclined, as shown in FIGS. 17C and 17D, is obtained. The abnormal posture state can therefore be detected by determining the state of the floor surface F in the image output by the CCD camera 43.
- Specifically, the work of detecting an edge in the Y direction of the image coordinate system is repeated, and from the coordinates of the plurality of detected positions a line segment is calculated to obtain the horizontal edge of the floor surface F. The vertical edge of the floor surface F is similarly obtained from the detection position coordinates resulting from edge detection in the X direction. Further, by combining them, the line segment of an inclined floor surface F may be detected.
- Alternatively, the abnormal posture detection may be performed as follows.
- By providing a floor surface detecting device FD as shown in FIG. 20, it is possible to detect an abnormal posture such as a fall.
- As the floor surface detecting device FD, a non-contact sensor having a light-emitting portion and a light-receiving portion, or a contact sensor such as a microswitch, can be used.
- The operation of returning from the fall state is limited to specific state transitions depending on the shape of the robot device.
- The robot operation is therefore controlled in such a way that playback operation data are created by subdividing the fall return operation, and playback is performed in accordance with changes in the fall state.
- In this way, the fall return operation can be switched immediately.
- In addition, the return operation data can be created by dividing each return operation, which facilitates the creation of the operation data.
- As described above, according to the present invention, the posture of the device main body is recognized, it is determined based on the recognition result that the device main body is in a predetermined posture, and when the predetermined posture is determined, the posture of the device main body is corrected.
- In this way, the device can, for example, autonomously return from an abnormal posture state to a normal posture state.
- This realizes a robot device capable of autonomously returning to the normal posture state from an abnormal posture state such as a falling state.
- Since the robot apparatus has the function of autonomously returning from an abnormal posture state to the normal posture state, failures of and accidents involving the robot apparatus caused by use in an abnormal posture state such as a falling state can be prevented, destruction of the use environment can be avoided, and the user is freed from work such as restoring the robot's posture.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/485,184 US6330494B1 (en) | 1998-06-09 | 1999-06-09 | Robot and method of its attitude control |
KR10-2000-7001259A KR100529287B1 (ko) | 1998-06-09 | 1999-06-09 | 로봇 장치 및 그 자세 제어 방법 |
EP99923986A EP1034899B1 (en) | 1998-06-09 | 1999-06-09 | Robot and method of its attitude control |
DE69943312T DE69943312D1 (de) | 1998-06-09 | 1999-06-09 | Manipulator und verfahren zur steuerung seiner lage |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10/161091 | 1998-06-09 | ||
JP16109198 | 1998-06-09 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/485,184 A-371-Of-International US6330494B1 (en) | 1998-06-09 | 1999-06-09 | Robot and method of its attitude control |
US09/985,655 Continuation US6567724B2 (en) | 1998-06-09 | 2001-11-05 | Robot apparatus and method of controlling the posture thereof |
US09/985,655 Division US6567724B2 (en) | 1998-06-09 | 2001-11-05 | Robot apparatus and method of controlling the posture thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1999064208A1 (fr) | 1999-12-16
Family
ID=15728446
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1999/003089 WO1999064208A1 (fr) | 1998-06-09 | 1999-06-09 | Robot et procede de commande de son attitude |
Country Status (6)
Country | Link |
---|---|
US (2) | US6330494B1 (ja) |
EP (1) | EP1034899B1 (ja) |
KR (1) | KR100529287B1 (ja) |
CN (1) | CN1146492C (ja) |
DE (1) | DE69943312D1 (ja) |
WO (1) | WO1999064208A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000032360A1 (fr) | 1998-11-30 | 2000-06-08 | Sony Corporation | Robot et son procede de commande |
US6462498B1 (en) | 2000-05-09 | 2002-10-08 | Andrew J. Filo | Self-stabilizing walking apparatus that is capable of being reprogrammed or puppeteered |
US6705917B2 (en) | 2000-12-15 | 2004-03-16 | Andrew S. Filo | Self-phase synchronized walking and turning quadruped apparatus |
US6832131B2 (en) | 1999-11-24 | 2004-12-14 | Sony Corporation | Legged mobile robot and method of controlling operation of the same |
US7442107B1 (en) | 1999-11-02 | 2008-10-28 | Sega Toys Ltd. | Electronic toy, control method thereof, and storage medium |
Families Citing this family (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6337552B1 (en) * | 1999-01-20 | 2002-01-08 | Sony Corporation | Robot apparatus |
KR20010041969A (ko) * | 1999-01-18 | 2001-05-25 | 이데이 노부유끼 | 로봇 장치, 로봇 장치의 본체 유닛 및 로봇 장치의 결합유닛 |
WO2000068880A1 (fr) | 1999-05-10 | 2000-11-16 | Sony Corporation | Dispositif robot |
JP2001191283A (ja) * | 1999-12-31 | 2001-07-17 | Sony Corp | ロボット装置及びその制御方法 |
EP1286621A4 (en) * | 2000-05-08 | 2009-01-21 | Brainsgate Ltd | METHOD AND DEVICE FOR SPREADING THE SPHENOPALATINEN GANGLION TO CHANGE PROPERTIES OF THE BLOOD BRAIN BARRIER AND CEREBRAL BLOOD FLOW |
US20020059386A1 (en) * | 2000-08-18 | 2002-05-16 | Lg Electronics Inc. | Apparatus and method for operating toys through computer communication |
JP2002127059A (ja) * | 2000-10-20 | 2002-05-08 | Sony Corp | 行動制御装置および方法、ペットロボットおよび制御方法、ロボット制御システム、並びに記録媒体 |
JP4143305B2 (ja) * | 2001-01-30 | 2008-09-03 | 日本電気株式会社 | ロボット装置、照合環境判定方法、及び照合環境判定プログラム |
JP3837479B2 (ja) * | 2001-09-17 | 2006-10-25 | 独立行政法人産業技術総合研究所 | 動作体の動作信号生成方法、その装置及び動作信号生成プログラム |
JP3811072B2 (ja) | 2002-01-18 | 2006-08-16 | 本田技研工業株式会社 | 移動ロボットの異常検知装置 |
US7386364B2 (en) * | 2002-03-15 | 2008-06-10 | Sony Corporation | Operation control device for leg-type mobile robot and operation control method, and robot device |
KR101004820B1 (ko) * | 2002-03-18 | 2010-12-28 | 지니치 야마구치 | 이동체 장치, 이동체 장치의 제어 방법, 로봇 장치, 로봇 장치의 동작 제어 방법 |
US7493263B2 (en) * | 2002-04-30 | 2009-02-17 | Medco Health Solutions, Inc. | Prescription management system |
US7137861B2 (en) * | 2002-11-22 | 2006-11-21 | Carr Sandra L | Interactive three-dimensional multimedia I/O device for a computer |
US8222840B2 (en) * | 2002-12-12 | 2012-07-17 | Sony Corporation | Fuel cell mount apparatus and electric power supply system |
US7072740B2 (en) * | 2002-12-16 | 2006-07-04 | Sony Corporation | Legged mobile robot |
US7238079B2 (en) * | 2003-01-14 | 2007-07-03 | Disney Enterprise, Inc. | Animatronic supported walking system |
US7348746B2 (en) * | 2003-02-14 | 2008-03-25 | Honda Giken Kogyo Kabushiki Kaisha | Abnormality detection system of mobile robot |
CN100344416C (zh) * | 2003-03-23 | 2007-10-24 | 索尼株式会社 | 机器人装置和控制该装置的方法 |
US7761184B2 (en) * | 2003-03-23 | 2010-07-20 | Sony Corporation | Robot apparatus and control method thereof |
JP4246535B2 (ja) * | 2003-04-17 | 2009-04-02 | 本田技研工業株式会社 | Method for estimating the floor reaction force acting point of a bipedal walking body and method for estimating the joint moments of a bipedal walking body |
JP2005115654A (ja) * | 2003-10-08 | 2005-04-28 | Sony Corp | Information processing device and method, program storage medium, and program |
US20060099344A1 (en) * | 2004-11-09 | 2006-05-11 | Eastman Kodak Company | Controlling the vaporization of organic material |
US7339340B2 (en) * | 2005-03-23 | 2008-03-04 | Harris Corporation | Control system and related method for multi-limbed, multi-legged robot |
EP1864763A4 (en) * | 2005-03-30 | 2008-04-30 | Tmsuk Co Ltd | Quadruped walking robot |
US20070078565A1 (en) * | 2005-10-03 | 2007-04-05 | Modjtaba Ghodoussi | Telerobotic system that transmits changed states of a subsystem |
JP4812426B2 (ja) * | 2005-12-27 | 2011-11-09 | 富士通株式会社 | Robot control device |
US7348747B1 (en) | 2006-03-30 | 2008-03-25 | Vecna | Mobile robot platform |
KR101297388B1 (ko) * | 2006-06-16 | 2013-08-19 | 삼성전자주식회사 | Mobile device providing position correction function and position correction method |
KR100834572B1 (ko) * | 2006-09-29 | 2008-06-02 | 한국전자통신연구원 | Robot drive device responsive to external stimuli and control method thereof |
CN101219284A (zh) * | 2007-01-08 | 2008-07-16 | 鸿富锦精密工业(深圳)有限公司 | Bionic device |
US7996112B1 (en) | 2007-06-01 | 2011-08-09 | United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Robot and robot system |
US8385474B2 (en) * | 2007-09-21 | 2013-02-26 | Qualcomm Incorporated | Signal generator with adjustable frequency |
WO2010080448A1 (en) * | 2008-12-19 | 2010-07-15 | Honda Motor Co., Ltd. | Humanoid fall direction change among multiple objects |
US8352077B2 (en) * | 2008-12-19 | 2013-01-08 | Honda Motor Co., Ltd. | Inertia shaping for humanoid fall direction change |
US8833276B2 (en) | 2009-02-06 | 2014-09-16 | William Hunkyun Bang | Burner system for waste plastic fuel |
US8554370B2 (en) * | 2009-05-15 | 2013-10-08 | Honda Motor Co., Ltd | Machine learning approach for predicting humanoid robot fall |
CN102237114A (zh) * | 2010-05-07 | 2011-11-09 | 北京华旗随身数码股份有限公司 | Video playback device for fitness use |
US8880221B2 (en) * | 2011-03-21 | 2014-11-04 | Honda Motor Co., Ltd. | Damage reduction control for humanoid robot fall |
US9873556B1 (en) | 2012-08-14 | 2018-01-23 | Kenney Manufacturing Company | Product package and a method for packaging a product |
JP5850003B2 (ja) * | 2013-07-26 | 2016-02-03 | 株式会社安川電機 | Robot system, robot management computer for the robot system, and robot system management method |
KR20150075909A (ko) * | 2013-12-26 | 2015-07-06 | 한국전자통신연구원 | Method and apparatus for editing 3D character motion |
JP6338389B2 (ja) * | 2014-02-07 | 2018-06-06 | キヤノン株式会社 | Dynamics calculation method and program, and simulation device |
US9308648B2 (en) | 2014-07-24 | 2016-04-12 | Google Inc. | Systems and methods for robotic self-right |
CN104932493B (zh) * | 2015-04-01 | 2017-09-26 | 上海物景智能科技有限公司 | Autonomously navigating mobile robot and autonomous navigation method thereof |
GB2538714A (en) * | 2015-05-25 | 2016-11-30 | Robotical Ltd | Robot Leg |
CN105607632B (zh) * | 2015-10-15 | 2018-02-16 | 浙江大学 | Drive control method for jumping motion of a 3D underactuated biped robot |
CN105599816B (zh) * | 2015-10-20 | 2018-02-16 | 浙江大学 | Gait planning method for jumping motion of a 3D underactuated biped robot |
CN105500362B (zh) * | 2015-12-23 | 2016-10-26 | 福建省汽车工业集团云度新能源汽车股份有限公司 | Control system for a multi-joint omnidirectional external-pipe robot |
CN105666491B (zh) * | 2016-03-11 | 2017-12-05 | 福建省汽车工业集团云度新能源汽车股份有限公司 | Control system for a multi-joint pipeline maintenance robot |
JP6660242B2 (ja) * | 2016-04-25 | 2020-03-11 | 本田技研工業株式会社 | Optical fiber wiring structure for transmitting robot control signals |
US10059392B1 (en) | 2016-06-27 | 2018-08-28 | Boston Dynamics, Inc. | Control of robotic devices with non-constant body pitch |
WO2018198480A1 (ja) * | 2017-04-28 | 2018-11-01 | ソニー株式会社 | Control device and control method |
US10807246B2 (en) * | 2018-01-08 | 2020-10-20 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Mobile robotic device and method of controlling the same manipulator for locomotion and manipulation |
AU2019388551A1 (en) * | 2018-09-26 | 2021-08-12 | Ghost Robotics Corporation | Legged robot |
JP2020179453A (ja) * | 2019-04-25 | 2020-11-05 | セイコーエプソン株式会社 | Control method for robot system and robot system |
KR20200134817A (ko) | 2019-05-23 | 2020-12-02 | 삼성전자주식회사 | Electronic device providing feedback corresponding to input on housing |
CN110861084B (zh) * | 2019-11-18 | 2022-04-05 | 东南大学 | Fall self-recovery control method for a quadruped robot based on deep reinforcement learning |
CN111791221A (zh) * | 2020-06-08 | 2020-10-20 | 阳泉煤业(集团)股份有限公司 | Overturn self-recovery method for a snake-like robot |
KR102317058B1 (ko) * | 2020-06-26 | 2021-10-26 | 김한수 | Motion sensor network system |
CN114952867B (zh) * | 2022-07-26 | 2022-10-25 | 中国工业互联网研究院 | Control method and device for industrial robot, electronic device, and readable storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH028498U (ja) * | 1988-06-29 | 1990-01-19 | ||
JPH0631658A (ja) * | 1992-07-20 | 1994-02-08 | Honda Motor Co Ltd | Walking control device for legged mobile robot |
JPH06198582A (ja) * | 1992-11-05 | 1994-07-19 | Commiss Energ Atom | Walking robot foot |
JPH07205085A (ja) * | 1993-12-30 | 1995-08-08 | Honda Motor Co Ltd | Position detection and control device for mobile robot |
JPH0871967A (ja) * | 1994-09-09 | 1996-03-19 | Komatsu Ltd | Walking control device and walking control method for walking robot |
JPH09142347A (ja) * | 1995-11-24 | 1997-06-03 | Mitsubishi Heavy Ind Ltd | Rough terrain locomotion device |
JPH10113886A (ja) * | 1996-10-07 | 1998-05-06 | Sony Corp | Robot device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3911613A (en) * | 1974-02-15 | 1975-10-14 | Marvin Glass & Associates | Articulated figure toy and accessories |
JPS63191582A (ja) | 1987-02-05 | 1988-08-09 | 株式会社明電舎 | Walking device for robot |
JPH028498A (ja) | 1988-06-22 | 1990-01-11 | Kumagai Gumi Co Ltd | Fastening device for concrete members |
US5100362A (en) * | 1990-12-03 | 1992-03-31 | Fogarty A Edward | Propellable articulating animal toy |
US5289916A (en) * | 1991-11-08 | 1994-03-01 | S. R. Mickelberg Company, Inc. | Animated toy in package |
US5172806A (en) * | 1991-11-08 | 1992-12-22 | S. R. Mickelberg Company, Inc. | Animated toy in package |
US5349277A (en) * | 1992-03-12 | 1994-09-20 | Honda Giken Kogyo Kabushiki Kaisha | Control system for legged mobile robot |
US5606494A (en) * | 1993-11-25 | 1997-02-25 | Casio Computer Co., Ltd. | Switching apparatus |
US5626505A (en) * | 1996-02-06 | 1997-05-06 | James Industries, Inc. | Spring-animated toy figure |
1999
- 1999-06-09 WO PCT/JP1999/003089 patent/WO1999064208A1/ja active IP Right Grant
- 1999-06-09 DE DE69943312T patent/DE69943312D1/de not_active Expired - Lifetime
- 1999-06-09 EP EP99923986A patent/EP1034899B1/en not_active Expired - Lifetime
- 1999-06-09 CN CNB998013110A patent/CN1146492C/zh not_active Expired - Fee Related
- 1999-06-09 US US09/485,184 patent/US6330494B1/en not_active Expired - Lifetime
- 1999-06-09 KR KR10-2000-7001259A patent/KR100529287B1/ko not_active IP Right Cessation

2001
- 2001-11-05 US US09/985,655 patent/US6567724B2/en not_active Expired - Fee Related
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000032360A1 (fr) | 1998-11-30 | 2000-06-08 | Sony Corporation | Robot and control method thereof |
EP1155787A1 (en) * | 1998-11-30 | 2001-11-21 | Sony Corporation | Robot device and control method thereof |
EP1155787A4 (en) * | 1998-11-30 | 2010-01-13 | Sony Corp | ROBOT AND ITS CONTROL PROCEDURE |
US7442107B1 (en) | 1999-11-02 | 2008-10-28 | Sega Toys Ltd. | Electronic toy, control method thereof, and storage medium |
US6832131B2 (en) | 1999-11-24 | 2004-12-14 | Sony Corporation | Legged mobile robot and method of controlling operation of the same |
US7013201B2 (en) | 1999-11-24 | 2006-03-14 | Sony Corporation | Legged mobile robot and method of controlling operation of the same |
US6462498B1 (en) | 2000-05-09 | 2002-10-08 | Andrew J. Filo | Self-stabilizing walking apparatus that is capable of being reprogrammed or puppeteered |
US6705917B2 (en) | 2000-12-15 | 2004-03-16 | Andrew S. Filo | Self-phase synchronized walking and turning quadruped apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR100529287B1 (ko) | 2005-11-17 |
US6330494B1 (en) | 2001-12-11 |
US6567724B2 (en) | 2003-05-20 |
DE69943312D1 (de) | 2011-05-12 |
EP1034899A4 (en) | 2007-11-21 |
EP1034899A1 (en) | 2000-09-13 |
EP1034899B1 (en) | 2011-03-30 |
US20020116091A1 (en) | 2002-08-22 |
KR20010022664A (ko) | 2001-03-26 |
CN1146492C (zh) | 2004-04-21 |
CN1274310A (zh) | 2000-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1999064208A1 (fr) | Robot and attitude control method thereof | |
US8340823B2 (en) | Controller of mobile robot | |
JP5213023B2 (ja) | Robot | |
JP3811072B2 (ja) | Abnormality detection device for mobile robot | |
US7337040B2 (en) | Self-position estimating device for leg type movable robots | |
EP2199038B1 (en) | Robot and task execution system | |
WO2008004487A1 (fr) | Apparatus and method for controlling robot arm, robot, and robot arm control program | |
US20050151496A1 (en) | Two-leg walking humanoid robot | |
US7383717B2 (en) | Force sensor abnormality detection system for legged mobile robot | |
EP1440872A2 (en) | Ambulatory robot and method for controlling the same | |
WO2004071718A1 (ja) | Abnormality detection device for mobile robot | |
JP4810880B2 (ja) | Robot and control method thereof | |
EP2236251B1 (en) | Mobile robot controller | |
US9964956B2 (en) | Operating environment information generating device for mobile robot | |
JP2007152472A (ja) | Charging system, charging station, and robot guidance system | |
JP4905041B2 (ja) | Robot control device | |
JP3199059B2 (ja) | Robot device and attitude control method thereof | |
JP3811073B2 (ja) | Abnormality detection device for mobile robot | |
JP2001212785A (ja) | Robot device and attitude control method thereof | |
JP2569579Y2 (ja) | Legged mobile robot equipped with a visual sensor | |
JP2008093762A (ja) | Walking robot | |
JP4504769B2 (ja) | Abnormality detection device for legged mobile robot | |
CN113892848B (zh) | Flip-following trajectory planning method, device, and system for a flippable object | |
JP6218641B2 (ja) | Robot holding system | |
JP2004017181A (ja) | Walking robot | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 99801311.0 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020007001259 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1999923986 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 09485184 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 1999923986 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020007001259 Country of ref document: KR |
|
WWG | Wipo information: grant in national office |
Ref document number: 1020007001259 Country of ref document: KR |