CN103991492A - Intelligent trolley based on Kinect technology - Google Patents

Intelligent trolley based on Kinect technology

Info

Publication number
CN103991492A
CN103991492A CN201410255221.3A CN201410255221A CN103991492A
Authority
CN
China
Prior art keywords
motor
kinect
rotating shaft
chassis
lower computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410255221.3A
Other languages
Chinese (zh)
Inventor
王见
刘锐
汪宇辰
孔维仲
叶葱葱
卓著
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN201410255221.3A priority Critical patent/CN103991492A/en
Publication of CN103991492A publication Critical patent/CN103991492A/en
Pending legal-status Critical Current

Landscapes

  • Toys (AREA)

Abstract

The invention discloses an intelligent trolley based on the Kinect technology. The intelligent trolley comprises a chassis, rotating shaft assemblies, wheel assemblies, a Kinect sensor, a computer and a lower computer. Body movements of an operator, or voice instructions issued by the operator, are collected by the Kinect sensor and transmitted to the computer. The computer parses the body movement or voice instruction and sends a command to the lower computer. According to the command, the lower computer controls each motor to rotate forward, rotate in reverse or stop, so that the motion state of the intelligent trolley is controlled.

Description

Intelligent trolley based on Kinect technology
Technical field
The present invention relates to intelligent mobile trolleys.
Background technology
Intelligent trolleys are common devices in modern industrial production. An existing intelligent trolley can run automatically in an environment according to a predefined pattern without human management, and can be applied to material transport, assembly, welding, scientific exploration and the like. An intelligent trolley can display time, speed and mileage in real time, and has functions such as automatic track finding, light seeking, obstacle avoidance, program-controlled travel speed, accurate positioning and stopping, and remote image transmission.
Kinect is a motion-sensing peripheral developed by Microsoft; it is essentially a 3D motion-sensing camera. The device has three cameras: the centre one is an RGB colour camera, and on either side are an infrared projector (infrared transmitter) and an infrared camera (CMOS infrared sensor) used for depth measurement. The Kinect also incorporates focus-tracking technology, with a base motor that can move and rotate to follow the focused object. In addition, the Kinect has a built-in microphone array for speech recognition.
The drive mechanisms of existing intelligent trolleys are relatively simple in design: motion in all directions can be achieved only with complicated programmed control, and pivot turns are difficult to realize. This seriously limits the performance of intelligent trolleys.
Summary of the invention
The object of the present invention is to solve the problem that existing intelligent trolleys are difficult to maneuver.
To achieve this object, the technical scheme adopted is as follows: an intelligent trolley based on Kinect technology comprises a chassis, rotating shaft assemblies, wheel assemblies, a Kinect sensor, a computer and a lower computer.
The lower surface of the chassis is a horizontal surface, and four rotating shaft assemblies are mounted on the lower surface of the chassis. Each rotating shaft assembly comprises a motor, an output shaft and a bearing. The motor and the bearing are fixed on the lower surface of the chassis, and the output shaft is parallel to the lower surface of the chassis. One end of the output shaft is connected to the rotating shaft of the motor; the other end passes through the bearing, after which the wheel assembly is mounted on it. Among the four rotating shaft assemblies on the lower surface of the chassis, the output shafts of any two adjacent assemblies are mutually perpendicular.
After the Kinect sensor collects a body movement of the operator or a voice instruction issued by the operator, it transmits the data to the computer.
After parsing the operator's body movement or voice instruction, the computer sends a command to the lower computer.
According to the command, the lower computer controls each motor to rotate forward, rotate in reverse or stop, thereby controlling the motion state of the intelligent trolley.
Further, the trolley also comprises an emergency stop switch, which controls the on/off state of the power supply to each motor. When the trolley is moving, if the Kinect sensor detects that the distance between the trolley and an obstacle falls below a threshold, it sends a stop command to the lower computer. After receiving the stop command, the lower computer sends a signal to the emergency stop switch to cut off the power supply to each motor.
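The obstacle check described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the 500 mm threshold, the list-of-lists depth frame and the `send_stop` callback are all assumptions introduced here.

```python
STOP_THRESHOLD_MM = 500  # hypothetical safety distance, not from the patent

def obstacle_too_close(depth_frame, threshold_mm=STOP_THRESHOLD_MM):
    """Return True if any valid depth pixel is closer than the threshold.

    Kinect depth pixels with value 0 mean "no reading" and are ignored.
    """
    return any(
        0 < d < threshold_mm
        for row in depth_frame
        for d in row
    )

def control_step(depth_frame, send_stop):
    """One iteration of the safety loop: stop the motors if an obstacle
    is too close, otherwise keep running."""
    if obstacle_too_close(depth_frame):
        send_stop()  # lower computer cuts motor power via the e-stop switch
        return "stopped"
    return "running"
```

In the trolley, `send_stop` would be the serial message that makes the lower computer trip the emergency stop switch.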
The technical effect of the present invention is evident: the drive scheme of this trolley uses multiple rotating shafts and can therefore steer flexibly. In addition, the integrated Kinect system allows the trolley to understand the operator's instructions and to move accurately in all directions, giving excellent performance.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the invention;
Fig. 2 is a bottom view of the invention;
Fig. 3 is a structural schematic diagram of the wheel part;
Fig. 4 is the control flow chart of the computer;
Fig. 5 is the control flow chart of the lower computer.
In the figures: chassis 1, rotating shaft assembly 2, motor 201, output shaft 202, bearing 203, wheel assembly 3, wheel hub 301, mounting hole 302, spoke 303, regular polygon frame 304, roller 305, platform 4, Kinect sensor 5.
Specific embodiments
The invention is further described below with reference to the drawings and embodiments, but the subject matter of the invention should not be construed as limited to the following embodiments. Without departing from the above idea of the present invention, various replacements and changes made according to ordinary technical knowledge and customary means in the field shall all be included in the protection scope of the present invention.
Embodiment 1:
An intelligent trolley based on Kinect technology comprises a chassis 1, rotating shaft assemblies 2, wheel assemblies 3, a Kinect sensor 5, a computer and a lower computer.
The lower surface of the chassis 1 is a horizontal surface, and four rotating shaft assemblies 2 are mounted on the lower surface of the chassis 1. Each rotating shaft assembly 2 comprises a motor 201, an output shaft 202 and a bearing 203. The motor 201 and the bearing 203 are fixed on the lower surface of the chassis 1, and the output shaft 202 is parallel to the lower surface of the chassis 1. One end of the output shaft 202 is connected to the rotating shaft of the motor 201; the other end passes through the bearing 203, after which the wheel assembly 3 is mounted on it. Among the four rotating shaft assemblies 2 on the lower surface of the chassis 1, the output shafts 202 of any two adjacent assemblies are mutually perpendicular.
After the Kinect sensor 5 collects a body movement of the operator or a voice instruction issued by the operator, it transmits the data to the computer. In this embodiment, six voice instructions issued by the operator ("forward", "backward", "left", "right", "turn left" and "turn right") can be collected by the microphone of the Kinect sensor 5 and sent to the computer.
After parsing the operator's body movement or voice instruction, the computer sends a command to the lower computer.
According to the command, the lower computer controls each of the four motors 201 to rotate forward, rotate in reverse or stop, thereby controlling the motion state of the intelligent trolley.
In this embodiment, the motors 201 on the four rotating shaft assemblies 2 are designated the first motor, the second motor, the third motor and the fourth motor, respectively.
When the computer determines that the operator has issued the voice instruction "forward", the lower computer drives the second and fourth motors forward while the first and third motors remain stationary, so the trolley moves forward.
When the computer determines that the operator has issued the voice instruction "backward", the lower computer drives the second and fourth motors in reverse while the first and third motors remain stationary, so the trolley moves backward.
When the computer determines that the operator has issued the voice instruction "left", the lower computer drives the first and third motors forward while the second and fourth motors remain stationary, so the trolley moves to the left.
When the computer determines that the operator has issued the voice instruction "right", the lower computer drives the first and third motors in reverse while the second and fourth motors remain stationary, so the trolley moves to the right.
When the computer determines that the operator has issued the voice instruction "turn left", the lower computer drives the first motor forward while the second, third and fourth motors remain stationary, so the trolley turns left.
When the computer determines that the operator has issued the voice instruction "turn right", the lower computer drives the third motor forward while the first, second and fourth motors remain stationary, so the trolley turns right.
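The six-command behaviour of embodiment 1 amounts to a lookup table from recognized voice command to motor states. A minimal sketch: only *which* motors move for each command is taken from the text; the FORWARD/REVERSE/STOP encoding and the function name are illustrative assumptions.

```python
FORWARD, REVERSE, STOP = 1, -1, 0  # illustrative state encoding

# (first, second, third, fourth) motor state for each recognized command,
# as described in embodiment 1
COMMAND_TABLE = {
    "forward":    (STOP, FORWARD, STOP, FORWARD),
    "backward":   (STOP, REVERSE, STOP, REVERSE),
    "left":       (FORWARD, STOP, FORWARD, STOP),
    "right":      (REVERSE, STOP, REVERSE, STOP),
    "turn left":  (FORWARD, STOP, STOP, STOP),
    "turn right": (STOP, STOP, FORWARD, STOP),
}

def motor_states(command):
    """Return the four motor states for a recognized voice command."""
    return COMMAND_TABLE[command]
```

Because adjacent output shafts are perpendicular, the stationary wheels simply roll sideways on their free rollers, which is what lets two driven motors move the trolley in a straight line.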
Embodiment 2:
The main part of this embodiment is the same as embodiment 1. More specifically, in this embodiment the lower computer unit is mounted on the upper surface of an aluminium plate and mainly comprises an MCU circuit board, an MCU minimal-system board, speed-measuring modules, a lead-acid battery and speed-measuring circuit boards. The two speed-measuring circuit boards are symmetrically distributed in two corners of the square aluminium plate, oriented the same way; the lead-acid battery is located in the centre of the aluminium plate and supplies power. The battery is wired to the DC motors. Each motor is fixed to a speed-measuring module with screws and wired to a speed-measuring circuit board, which measures the motor speed in real time and transmits the data to the MCU circuit board. The MCU circuit board and its minimal-system board are connected by screws, and are connected to the upper computer (the computer together with the Kinect sensor) through an RS232 serial port; the rotation direction of the motors is controlled by the MCU circuit board. The top-layer control unit, i.e. the upper computer, consists of the Kinect sensor and the computer. The Kinect sensor is fixed on a small poly(methyl methacrylate) plate by a round band and receives voice and body-movement information. The computer is located in the centre of the middle-level organic glass platform; it processes the information transmitted by the Kinect and issues commands to the drive system. The computer is connected to the Kinect sensor by a USB data cable and to the MCU circuit board through the RS232 serial port.
The overall controls comprise a power switch and an emergency stop switch, symmetrically distributed in the two rear corners of the middle-level organic glass platform. The power switch controls the opening and closing of the power supply, and the emergency stop switch cuts off the power rapidly.
The nine specific implementation processes below are expansions and refinements of embodiment 1:
First, the user appears within the recognition range of the Kinect camera and makes a body movement or issues a voice instruction. The Kinect receives the image and voice information through its infrared camera and microphone and sends the data to the computer. The SDK program package on the computer processes the data with the LabVIEW language and sends the motion information to the MCU circuit board of the lower computer through the RS232 serial port; the MCU then controls the rotation direction (forward or reverse) and speed of each motor through electric wires.
1) The user appears within the recognition range of the Kinect camera and issues the voice instruction "march forward" or "draw back". The Kinect microphone receives the voice information and sends the data to the computer. The Kinect body/voice control platform on the computer processes the received information with the LabVIEW language (the forward/backward distance can be set) and sends the motion information to the MCU circuit board of the lower computer through the RS232 serial port, which then controls the motors through electric wires: the front and rear motors stop, while the left and right motors rotate clockwise/counterclockwise. (Motor rotation directions are given with respect to the same side of the trolley.)
2) The user appears within the recognition range of the Kinect camera and issues the voice instruction "turn right". The Kinect microphone receives the voice information and sends the data to the computer, which processes it with the LabVIEW language (the rotation angle can be set) and sends the motion information to the MCU circuit board through the RS232 serial port; the motors are then controlled as follows: the right motor rotates counterclockwise, the left motor clockwise, the front motor clockwise and the rear motor counterclockwise. (Same-side reference.)
3) The user appears within the recognition range of the Kinect camera and issues the voice instruction "turn left". The Kinect microphone receives the voice information and sends the data to the computer, which processes it with the LabVIEW language (the rotation angle can be set) and sends the motion information to the MCU circuit board through the RS232 serial port; the motors are then controlled as follows: the right motor rotates clockwise, the left motor counterclockwise, the front motor counterclockwise and the rear motor clockwise. (Same-side reference.)
4) The user appears within the recognition range of the Kinect camera and issues the voice instruction "face me". The Kinect microphone receives the voice information and sends the data to the computer, which processes it with the LabVIEW language and sends the motion information to the MCU circuit board through the RS232 serial port. When the camera recognizes part of the user's body features, the platform turns toward the direction of the sound by "turning left" or "turning right"; when no skeleton signal is present in the camera view, the platform rotates 180 degrees by "turning left" or "turning right". (Same-side reference.)
5) The user appears within the recognition range of the Kinect camera and issues the voice instruction "pause" or "end". The Kinect microphone receives the voice information and sends the data to the computer, which processes it with the LabVIEW language and sends the motion information to the MCU circuit board through the RS232 serial port; all motors then stop.
6) The user appears within the recognition range of the Kinect camera and makes the body-skeleton gesture "right hand pointing left". The Kinect camera captures the movement and sends the data to the computer, which processes it with the LabVIEW language (the rotation angle can be set) and sends the motion information to the MCU circuit board through the RS232 serial port; the motors are then controlled as follows: the right motor rotates clockwise, the left motor counterclockwise, the front motor counterclockwise and the rear motor clockwise. (Same-side reference.)
7) The user appears within the recognition range of the Kinect camera and makes the body-skeleton gesture "left hand pointing right". The Kinect camera captures the movement and sends the data to the computer, which processes it with the LabVIEW language (the rotation angle can be set) and sends the motion information to the MCU circuit board through the RS232 serial port; the motors are then controlled as follows: the right motor rotates counterclockwise, the left motor clockwise, the front motor clockwise and the rear motor counterclockwise. (Same-side reference.)
8) The user appears within the recognition range of the Kinect camera and makes the body-skeleton gesture "right hand raised" or "left hand raised". The Kinect camera captures the movement and sends the data to the computer, which processes it with the LabVIEW language (the forward/backward distance can be set) and sends the motion information to the MCU circuit board through the RS232 serial port; the front and rear motors then stop while the left and right motors rotate clockwise/counterclockwise. (Same-side reference.)
9) The user appears within the recognition range of the Kinect camera and makes the body-skeleton gesture "both hands crossed in front of the abdomen". The Kinect camera captures the movement and sends the data to the computer, which processes it with the LabVIEW language and sends the motion information to the MCU circuit board through the RS232 serial port; all motors then stop.
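All nine processes above end with the same step: motion information is sent to the MCU circuit board over the RS232 serial port. The patent does not specify a frame format, so the encoding sketched below — a start byte, one direction byte and one speed byte per motor, and a trailing checksum — is purely illustrative.

```python
def encode_frame(directions, speeds):
    """Pack four motor directions (-1/0/1) and speeds (0-255) into a
    byte frame. The layout (0xAA start byte, direction codes, simple
    additive checksum) is a hypothetical example, not the patent's protocol."""
    if len(directions) != 4 or len(speeds) != 4:
        raise ValueError("exactly four motors expected")
    payload = bytearray([0xAA])                           # start byte
    for d, s in zip(directions, speeds):
        payload.append({-1: 0x02, 0: 0x00, 1: 0x01}[d])   # direction code
        payload.append(s & 0xFF)                          # speed byte
    payload.append(sum(payload) & 0xFF)                   # checksum
    return bytes(payload)
```

On the PC side such a frame could be written out with a serial library (e.g. pyserial's `Serial.write`), and the MCU would verify the checksum before driving the motors.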
Embodiment 3:
The present embodiment further illustrates the control system of embodiment 1 or 2.
Referring to Fig. 4 and Fig. 5:
1. Hardware initialization stage. This is the first stage after the lower computer hardware is powered on. In this stage, the hardware connected to the system platform, such as the Kinect sensor and the lower computer hardware, is initialized.
2. Judge whether initialization succeeded. If "No", enter the error-handling stage; if "Yes", set the system parameters on the upper computer (the Kinect sensor and the computer), including the speed, travel distance and deflection angle of the system platform.
3. Judge whether the upper computer system receives the audio, video and depth data streams from the sensor. If "No", re-initialize the hardware; if "Yes", the upper computer directly samples and processes the sensor information.
4. Process the related data concurrently: speech recognition, skeleton tracking with gesture recognition, and colour image display are started simultaneously.
5. Voice and gesture command determination. In this stage, the system judges whether the speech recognition result matches a set voice command: if "No", speech recognition continues; if "Yes", the recognition result is matched and a specific command is generated. Meanwhile, the system judges whether the sensor has tracked a human skeleton: if "No", gesture recognition continues; if "Yes", it further judges whether the recognized gesture satisfies a set gesture algorithm: if "No", it keeps judging; if "Yes", a specific command is generated.
6. After a specific command is generated, the upper computer sends it to the serial port as a serial command.
7. The lower computer exchanges data with the upper computer through a serial data line, and judges whether it has received a command. If "No", the serial command is resent; if "Yes", the lower computer executes the command.
8. While executing a command, the lower computer checks for a new command. If "No", it continues executing and then waits for a new command; if "Yes", it executes the new command, overriding the previous one.
9. After a command has been executed, the system re-sets the system parameters; at this stage the speed, travel distance and deflection angle of the platform may be changed or left unchanged, and the system then performs a new round of speech and gesture recognition.
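Steps 7 and 8 above describe a lower computer in which a newly received command always overrides the one currently executing. A minimal simulation of that behaviour (the tick-based loop and the "idle" initial state are assumptions made for illustration, not from the patent):

```python
def lower_computer_loop(commands):
    """Simulate steps 7-8: each tick, a newly received command (if any)
    overrides the current one; otherwise the current command keeps
    executing. Returns which command is executing at each tick."""
    trace = []
    current = "idle"              # hypothetical initial state
    for received in commands:     # one entry per tick; None = nothing new
        if received is not None:  # step 8 "Yes": override with new command
            current = received
        trace.append(current)     # step 8 "No": continue current command
    return trace
```

For example, feeding the sequence `["forward", None, "stop", None]` yields a trace in which "forward" keeps executing until "stop" arrives and replaces it.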
Embodiment 4:
This embodiment further discloses a structure of the intelligent trolley on the basis of embodiment 1. Referring to Figs. 1-3, the trolley comprises a chassis 1, rotating shaft assemblies 2 and wheel assemblies 3. The lower surface of the chassis 1 is a horizontal surface, and four rotating shaft assemblies 2 are mounted on the lower surface of the chassis 1.
Referring to Fig. 2, each rotating shaft assembly 2 comprises a motor 201, an output shaft 202 and a bearing 203. The motor 201 and the bearing 203 are fixed on the lower surface of the chassis 1, and the output shaft 202 is parallel to the lower surface of the chassis 1.
A support rod is arranged on the upper surface of the chassis 1, and the Kinect sensor 5 is mounted on the upper end of the support rod.
One end of the output shaft 202 is connected to the rotating shaft of the motor 201; the other end passes through the bearing 203, after which the wheel assembly 3 is mounted on it. In this embodiment, the outer ring of the bearing 203 is fixed on the lower surface of the chassis 1; the output shaft 202 passes through and is fixed to the inner ring of the bearing 203. The end of the output shaft 202 that passes through the inner ring of the bearing 203 extends beyond the periphery of the chassis 1 and is left unsupported there, so that the wheel assembly 3 can be mounted.
In this embodiment, the chassis 1 is square. Among the four rotating shaft assemblies 2 on the lower surface of the chassis 1, the output shafts 202 of any two adjacent assemblies are mutually perpendicular. Drawing a "+" through the centre of the square chassis 1, the four rotating shaft assemblies 2 are mounted on the four branches of the "+", so that the four output shafts 202 are arranged radially on the lower surface of the square chassis 1.
Each wheel assembly 3 comprises a wheel hub 301, spokes 303, a regular polygon frame 304 and rollers 305. The centre of the wheel hub 301 has a mounting hole 302; the mounting hole 302 fits over the output shaft 202, mounting the wheel assembly 3 on the output shaft 202. Several spokes 303 are connected to the outer circumference of the wheel hub 301. The wheel hub 301 sits inside the regular polygon frame 304, which is concentric with it; one end of each spoke 303 is connected to the outer circumference of the wheel hub 301 and the other end to a vertex of the regular polygon frame 304. Several rollers 305, which can roll freely, are mounted on the sides of the regular polygon frame 304.
Embodiment 5:
The key structure of this embodiment is the same as that of embodiment 4. This embodiment also comprises a platform 4 mounted on the upper surface of the chassis 1. The regular polygon frame 304 is a dodecagon frame, the number of spokes 303 is 12 and the number of rollers 305 is 12. That is, the regular polygon frame 304 in the wheel assembly 3 is a dodecagonal metal frame bent from a metal strip of circular cross-section. The dodecagonal metal frame surrounds the wheel hub 301 and is fixed to it by the 12 spokes 303, with the centre of the wheel hub 301 coinciding with the centre of the dodecagon. In this embodiment, each roller 305 is a hollow cylinder open at both ends, which may be made of rubber. The 12 rollers 305 are fitted over the sides of the dodecagonal metal frame and can roll freely. Preferably, bearings are also mounted on the sides of the dodecagonal metal frame, and the rollers 305 are fitted over these bearings.
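The dodecagonal wheel geometry of this embodiment can be checked numerically: the 12 spokes end at the vertices of a regular 12-gon concentric with the hub, and each roller occupies one side. A sketch computing those positions for a hypothetical circumradius (the radius value and function names are illustrative, not dimensions from the patent):

```python
import math

def dodecagon_vertices(radius):
    """Vertices of a regular 12-gon of circumradius `radius` centred on
    the hub axis; one spoke runs from the hub to each vertex."""
    return [
        (radius * math.cos(2 * math.pi * k / 12),
         radius * math.sin(2 * math.pi * k / 12))
        for k in range(12)
    ]

def side_length(radius):
    """Length of each side of the frame, i.e. the span one roller occupies
    (the chord subtending 360/12 = 30 degrees)."""
    return 2 * radius * math.sin(math.pi / 12)
```

For a frame of circumradius R, each of the 12 rollers therefore spans about 0.518 R, which bounds the roller length the design can accommodate.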
Embodiment 6:
The main part of this embodiment is the same as embodiment 1, except that an emergency stop switch is also included. When the trolley is moving, if the Kinect sensor 5 detects that the distance between the trolley and an obstacle falls below a threshold, it sends a stop command to the lower computer. After receiving the stop command, the lower computer sends a signal to the emergency stop switch to cut off the power supply to each motor 201.

Claims (2)

1. An intelligent trolley based on Kinect technology, characterized by comprising a chassis (1), rotating shaft assemblies (2), wheel assemblies (3), a Kinect sensor (5), a computer and a lower computer;
the lower surface of the chassis (1) is a horizontal surface, and four rotating shaft assemblies (2) are mounted on the lower surface of the chassis (1); each rotating shaft assembly (2) comprises a motor (201), an output shaft (202) and a bearing (203); the motor (201) and the bearing (203) are fixed on the lower surface of the chassis (1); the output shaft (202) is parallel to the lower surface of the chassis (1); one end of the output shaft (202) is connected to the rotating shaft of the motor (201), and the other end of the output shaft (202) passes through the bearing (203), after which the wheel assembly (3) is mounted; among the four rotating shaft assemblies (2) on the lower surface of the chassis (1), the output shafts (202) of any two adjacent rotating shaft assemblies (2) are mutually perpendicular;
after the Kinect sensor (5) collects a body movement of the operator or a voice instruction issued by the operator, it transmits the data to the computer;
after parsing the operator's body movement or voice instruction, the computer sends a command to the lower computer;
according to the command, the lower computer controls each motor (201) to rotate forward, rotate in reverse or stop, thereby controlling the motion state of the intelligent trolley.
2. The intelligent trolley based on Kinect technology according to claim 1, characterized in that it further comprises an emergency stop switch; the emergency stop switch controls the on/off state of the power supply to each motor (201);
when the trolley is moving, if the Kinect sensor (5) detects that the distance between the trolley and an obstacle falls below a threshold, it sends a stop command to the lower computer; after receiving the stop command, the lower computer sends a signal to the emergency stop switch to cut off the power supply to each motor (201).
CN201410255221.3A 2014-06-11 2014-06-11 Intelligent trolley based on Kinect technology Pending CN103991492A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410255221.3A CN103991492A (en) 2014-06-11 2014-06-11 Intelligent trolley based on Kinect technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410255221.3A CN103991492A (en) 2014-06-11 2014-06-11 Intelligent trolley based on Kinect technology

Publications (1)

Publication Number Publication Date
CN103991492A true CN103991492A (en) 2014-08-20

Family

ID=51305899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410255221.3A Pending CN103991492A (en) 2014-06-11 2014-06-11 Intelligent trolley based on Kinect technology

Country Status (1)

Country Link
CN (1) CN103991492A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104811610A (en) * 2015-04-01 2015-07-29 广东欧珀移动通信有限公司 Method and device for controlling rotation of camera
CN105223954A (en) * 2015-10-14 2016-01-06 潍坊世纪元通工贸有限公司 A kind of path point type walking robot of identifiable design human body and control method thereof
CN105892620A (en) * 2014-10-28 2016-08-24 贵州师范学院 Kinect based method for remotely controlling intelligent car by motion sensing
CN105943163A (en) * 2016-06-27 2016-09-21 重庆金山科技(集团)有限公司 Minimally invasive surgery robot and control device thereof
CN107089458A (en) * 2017-07-03 2017-08-25 成都大学 A kind of new intelligent environment protection rubbish automatic recovery system
CN108107884A (en) * 2017-11-20 2018-06-01 北京理工华汇智能科技有限公司 Robot follows the data processing method and its intelligent apparatus of navigation
CN108170166A (en) * 2017-11-20 2018-06-15 北京理工华汇智能科技有限公司 The follow-up control method and its intelligent apparatus of robot
CN109886062A (en) * 2017-12-06 2019-06-14 东北林业大学 A kind of camellia oleifera fruit flower identification positioning system
CN111381594A (en) * 2020-03-09 2020-07-07 兰剑智能科技股份有限公司 AGV space obstacle avoidance method and system based on 3D vision

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4237990A (en) * 1979-01-02 1980-12-09 Hau T Omnidirectional vehicle
JPH0867268A (en) * 1994-08-30 1996-03-12 Rikagaku Kenkyusho Drive transmitting mechanism for omnidirectional moving vehicle
CN201080211Y (en) * 2006-12-29 2008-07-02 北京机械工业学院 Household intelligence moving platform device
JP2008155652A (en) * 2006-12-20 2008-07-10 Murata Mach Ltd Self-traveling conveying truck
CN203496595U (en) * 2013-09-13 2014-03-26 东北大学 Amphibious all-terrain intelligent rescue robot
CN103713554A (en) * 2013-12-26 2014-04-09 浙江师范大学 Motion-sensing following control system and carrier equipped with the same

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892620A (en) * 2014-10-28 2016-08-24 贵州师范学院 Kinect-based method for motion-sensing remote control of an intelligent car
CN104811610A (en) * 2015-04-01 2015-07-29 广东欧珀移动通信有限公司 Method and device for controlling rotation of camera
CN104811610B (en) * 2015-04-01 2017-11-17 广东欧珀移动通信有限公司 Method and device for controlling camera rotation
CN105223954A (en) * 2015-10-14 2016-01-06 潍坊世纪元通工贸有限公司 Waypoint-based walking robot capable of recognizing the human body, and control method thereof
CN105223954B (en) * 2015-10-14 2018-03-06 潍坊世纪元通工贸有限公司 Waypoint-based walking robot capable of recognizing the human body, and control method thereof
CN105943163A (en) * 2016-06-27 2016-09-21 重庆金山科技(集团)有限公司 Minimally invasive surgery robot and control device thereof
CN107089458A (en) * 2017-07-03 2017-08-25 成都大学 Novel intelligent environmentally friendly automatic waste collection system
CN107089458B (en) * 2017-07-03 2023-03-31 成都大学 Novel intelligent environmentally friendly automatic waste collection system
CN108107884A (en) * 2017-11-20 2018-06-01 北京理工华汇智能科技有限公司 Data processing method for robot following navigation and intelligent apparatus thereof
CN108170166A (en) * 2017-11-20 2018-06-15 北京理工华汇智能科技有限公司 Following control method for a robot and intelligent apparatus thereof
CN109886062A (en) * 2017-12-06 2019-06-14 东北林业大学 Camellia oleifera fruit and flower recognition and positioning system
CN111381594A (en) * 2020-03-09 2020-07-07 兰剑智能科技股份有限公司 AGV space obstacle avoidance method and system based on 3D vision

Similar Documents

Publication Publication Date Title
CN103991492A (en) Intelligent trolley based on Kinect technology
US20180143645A1 (en) Robotic creature and method of operation
CN202512439U (en) Human-robot cooperation system with webcam and wearable sensor
US11285611B2 (en) Robot and method of controlling thereof
CN102141797B (en) Airport terminal service robot and control method thereof
CN101954191B (en) Intelligent entertainment mobile robot
CN104898524A (en) Gesture-based unmanned aerial vehicle remote control system
CN102699914A (en) Robot
JP2017226350A (en) Imaging device, imaging method and program
CN104111655A (en) Remote control based smart home service robot system
CN105955279B (en) Image-vision-based path planning method and device for a mobile robot
CN106625701A (en) Restaurant robot based on machine vision
Jia et al. Research and development of mecanum-wheeled omnidirectional mobile robot implemented by multiple control methods
CN106730714A (en) A table tennis training partner robot
CN105082137A (en) Novel robot
CN105816303A (en) GPS and visual navigation-based blind guiding system and method thereof
CN105596157A (en) Multifunctional wheel chair
CN206200992U (en) A restaurant robot based on machine vision
CN105108757B (en) Smartphone-based wheeled soccer robot and method of operating the same
CN101770707B (en) Camera based virtual vehicle driving system and virtual driving method
CN205942440U (en) Intelligent business office robot
CN205042100U (en) Wheeled football robot
CN205750354U (en) An expression robot
CN110871440B (en) Method and device for controlling robot running
CN201576364U (en) Virtual automobile driving system based on camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2014-08-20