CN103336580A - Cursor control method of head-mounted device - Google Patents


Info

Publication number
CN103336580A
CN103336580A (application CN201310295425.5)
Authority
CN
China
Legal status
Granted
Application number
CN2013102954255A
Other languages
Chinese (zh)
Other versions
CN103336580B (en
Inventor
卫荣杰
Current Assignee
Taap Yi Hai (Shanghai) Technology Co. Ltd.
Original Assignee
卫荣杰
Application filed by 卫荣杰 filed Critical 卫荣杰
Priority to CN201310295425.5A priority Critical patent/CN103336580B/en
Publication of CN103336580A publication Critical patent/CN103336580A/en
Application granted granted Critical
Publication of CN103336580B publication Critical patent/CN103336580B/en
Status: Active


Abstract

The invention provides a cursor control method for a head-mounted device, comprising the following steps: a head-motion sensor maps detected head-motion data to a two-dimensional head-motion coordinate displacement and sends it to a microprocessor; meanwhile, an eye tracker maps detected eye-motion data to a two-dimensional eye-motion coordinate displacement and sends it to the microprocessor; the microprocessor superimposes the two displacements into a two-dimensional cursor-movement coordinate; a click event is recognized and executed by a voice-recognition module and is output together with the cursor-movement coordinate. Because the head and eyes naturally move in subconscious coordination, the mouse can be controlled by sampling and recognizing head movement, eye movement, and tooth taps. The smart device can be worn on the body and carried, freeing the hands while providing the most direct and precise means of expressing intent: what is seen can be controlled. Operation is natural and effortless, the required motion is small, the physical and mental energy consumed in operation is reduced, long sessions of use are easier, and disabled users can control a cursor conveniently.

Description

Cursor control method for a head-mounted device
Technical field
The invention belongs to the field of mobile communication and relates specifically to a cursor control method for a head-mounted device.
Background art
Most prior-art cursor control methods (the mouse, the touchpad, the head-operated mouse, the eye tracker, and so on) quantify and output a single kind of action from a single part of the body.
Eye-movement recognition in the prior art estimates the pupil's region of gaze within the visual field; its accuracy and stability are poor, so it cannot substitute for a mouse. It also lacks a practical confirmation-key (click) function and, more importantly, ignores the body-posture compensation behavior, described by ergonomics and visual psychology, that gaze-target error induces. Users therefore find it awkward to use, pointing accuracy is insufficient, and prolonged forced use can cause bodily injury.
The prior art is therefore deficient and in need of further improvement and development.
Summary of the invention
(1) Object of the invention: to provide a cursor control method in which eye movement, head movement, and tooth tapping together realize "what is seen is controlled"; that is, a head-mounted device that integrates eye-movement recognition, head-movement recognition, and tooth-tap recognition in a human-centered way. The method exploits the naturally subconscious coordinated motion of the head and eyes and, building on the subconscious body-posture compensation behavior that visual-psychology target error induces, applies behavioral-psychology compensation-motion recognition and Kalman filtering to establish a head-and-eye cursor control procedure that is natural and easy to use.
(2) technical scheme:
A cursor control method for a head-mounted device comprises the following steps:
Step A: the head-motion sensor maps the detected head-motion data to a two-dimensional head-motion coordinate displacement and sends it to the microprocessor; meanwhile, the eye tracker maps the detected eye-motion data to a two-dimensional eye-motion coordinate displacement and sends it to the microprocessor;
Step B: the microprocessor superimposes the two-dimensional head-motion and eye-motion coordinate displacements into a two-dimensional cursor-movement coordinate;
Step C: the microprocessor executes, via the voice-recognition module, the click event sent by the bone-conduction microphone, and outputs the click event together with the two-dimensional cursor-movement coordinate.
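Steps A to C can be sketched as a small pipeline: two 2-D displacements arrive, are superimposed, and are emitted together with any pending click event. This is a minimal illustration under assumed names and a fixed equal weighting, not the patent's implementation; the patent's superposition ratio is dynamic, as described later.

```python
# Minimal sketch of Steps A-C. The fixed 50/50 weighting and all function
# names are illustrative assumptions; the patent's ratio is dynamic.

def superimpose(head_dxdy, eye_dxdy, head_weight=0.5):
    """Step B: weighted superposition of head and eye 2-D displacements."""
    hw, ew = head_weight, 1.0 - head_weight
    return (hw * head_dxdy[0] + ew * eye_dxdy[0],
            hw * head_dxdy[1] + ew * eye_dxdy[1])

def cursor_event(head_dxdy, eye_dxdy, click=None):
    """Step C: bundle the superimposed displacement with any click event."""
    dx, dy = superimpose(head_dxdy, eye_dxdy)
    return {"dx": dx, "dy": dy, "click": click}
```

With equal weighting, a head displacement of (4, 2) and an eye displacement of (2, 0) yield a cursor displacement of (3, 1), emitted with whatever click the recognizer reported.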
In the cursor control method of the head-mounted device, the process in Step A by which the head-motion sensor maps the head-motion data to a two-dimensional head-motion coordinate displacement specifically comprises the following steps:
A-h1: the head-motion sensor optimizes the data collected in real time by the motion tracker into head-motion data and passes the head-motion data to the attitude-computation module;
A-h2: the attitude-computation module applies a jitter threshold and a weighted data-fusion algorithm to the angular velocity and acceleration in the head-motion data to obtain optimal attitude data, and passes the optimal attitude data to the head-motion plane-mapping calibration module;
A-h3: the head-motion plane-mapping calibration module maps the optimal attitude data to a two-dimensional head-motion coordinate parameter using the geometric-mapping calibration parameters obtained through user interaction, and sends the mapped two-dimensional head-motion coordinate parameter to the behavioral-psychology compensation module of the microprocessor;
The behavioral-psychology compensation module corrects the two-dimensional head-motion coordinate parameter mapped by the head-motion plane-mapping calibration module and feeds the correction value back to that module; the head-motion plane-mapping calibration module applies the correction value fed back by the behavioral-psychology compensation module and passes the corrected two-dimensional head-motion coordinate parameter to the head-eye coordinated-superposition computation module.
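One plausible reading of step A-h2 is a complementary filter: a jitter threshold suppresses tremor in the angular velocity, and a fixed weight fuses the integrated gyro attitude with the accelerometer's absolute tilt estimate. The constants and names below are assumptions, not the patent's tuning.

```python
# Illustrative sketch of step A-h2: fuse gyroscope angular velocity with an
# accelerometer tilt estimate via a jitter threshold and fixed fusion weights
# (a simple complementary filter). All names and constants are assumptions.

JITTER_THRESHOLD = 0.01   # rad/s below which gyro readings are treated as noise
GYRO_WEIGHT = 0.98        # trust in integrated gyro vs. accelerometer tilt

def fuse_attitude(angle, gyro_rate, accel_angle, dt):
    """Return the next attitude angle (radians) from one sensor sample."""
    if abs(gyro_rate) < JITTER_THRESHOLD:
        gyro_rate = 0.0                      # suppress hand/head tremor
    predicted = angle + gyro_rate * dt       # integrate angular velocity
    # weighted fusion with the accelerometer's absolute tilt estimate
    return GYRO_WEIGHT * predicted + (1.0 - GYRO_WEIGHT) * accel_angle
```

The high gyro weight keeps the estimate responsive, while the small accelerometer share slowly pulls out drift; the threshold is what keeps a resting head from jittering the cursor.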
In the cursor control method of the head-mounted device, the process in Step A by which the eye tracker maps the detected eye-motion data to a two-dimensional eye-motion coordinate displacement specifically comprises the following steps:
A-e1: the eye-movement recognition module collects the eye-motion data and transmits the collected data to the filter-tracking module;
A-e2: the filter-tracking module obtains the pupil center and the corneal reflection point by image-feature filtering, obtains pupil-center tracking data by noise reduction and temporal tracking filtering, and passes the pupil-center tracking data to the value-smoothing adjustment module;
A-e3: the value-smoothing adjustment module adjusts the data according to the smoothing-filter value and noise-reduction function set by the user, and passes the adjusted pupil-center tracking data to the eye-movement plane-mapping calibration module;
A-e4: the eye-movement plane-mapping calibration module maps the pupil-center tracking data using the geometric-mapping calibration parameters obtained through user interaction and sends the mapped pupil-center tracking data to the behavioral-psychology compensation module; the behavioral-psychology compensation module corrects the mapped pupil-center tracking data and sends the correction feedback to the eye-movement plane-mapping calibration module;
The eye-movement plane-mapping calibration module corrects the mapped pupil-center tracking data according to the fed-back correction value and sends the corrected value to the head-eye coordinated-superposition computation module.
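As one hedged illustration of the smoothing in steps A-e2 and A-e3, pupil-center samples can be run through an exponential moving average whose strength the user sets. The patent does not specify its noise-reduction and temporal tracking filter, so this class is a stand-in, not the actual method.

```python
# Illustrative sketch of steps A-e2/A-e3: smooth raw pupil-center samples with
# a user-set exponential smoothing factor, a simple stand-in for the patent's
# noise-reduction and temporal tracking filter. Names are assumptions.

class PupilSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha      # user-set smoothing strength (0..1)
        self.state = None       # last smoothed (x, y) pupil center

    def update(self, x, y):
        """Feed one raw pupil-center sample; return the smoothed center."""
        if self.state is None:
            self.state = (x, y)
        else:
            a = self.alpha
            sx, sy = self.state
            self.state = (a * x + (1 - a) * sx, a * y + (1 - a) * sy)
        return self.state
```

A small `alpha` gives a steadier cursor at the cost of lag; exposing it to the user mirrors the "smoothing-filter value set by the user" in step A-e3.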
In the cursor control method of the head-mounted device, the correction method of the behavioral-psychology compensation module comprises the following steps:
The behavioral-psychology compensation module takes the screen-center normal line, the user position, the screen-center point, the head-motion horizontal-plane center point, and the eye-movement center point as fixed references, then compares the head-motion and eye-motion coordinate displacements against the head-eye-movement feature library to find the differences.
In the cursor control method of the head-mounted device, the comparison of the head-motion and eye-motion coordinate displacements against the feature library comprises: recognizing the body-posture compensation actions triggered by tracking-psychology feedback when the gaze misses its target, so as to obtain an error value, and correcting the error value of each error component of the deviation process with a recursive mean-square-error estimation algorithm.
In the cursor control method of the head-mounted device, Step B specifically comprises the following steps:
The head-eye coordinated-superposition computation module of the microprocessor superimposes the optimized head-motion and eye-motion data in a dynamic proportion, so that the cursor uses different superposition ratios in different quadrants and different motion patterns, dynamically corrected by the behavioral-psychology compensation module.
In the cursor control method of the head-mounted device:
The step in Step C in which the microprocessor executes the click event sent by the bone-conduction microphone is:
C1. the bone-conduction microphone, pressed against the upper forehead, passes the collected oral-cavity signal data to the voice-recognition module of the microprocessor;
C2. the voice-recognition module determines the user's intended instruction by comparison against the bone-conduction waveform feature library, and sends the cursor click-event instruction to the cursor-instruction execution module;
C3. the cursor-instruction execution module outputs the click event together with the cursor coordinate.
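Steps C1 to C3 can be pictured as nearest-template matching against a small bone-conduction waveform feature library. The two features and the template values below are invented placeholders; a real recognizer would use richer waveform features than duration and peak amplitude.

```python
# Hypothetical sketch of steps C1-C3: classify a tooth-tap waveform by
# nearest-template matching against a small "bone-conduction waveform
# feature library". Features and templates here are invented placeholders.

# toy feature library: (duration_ms, peak_amplitude) -> mouse instruction
FEATURE_LIBRARY = {
    (30.0, 0.8): "left_click",      # single tooth tap
    (70.0, 0.8): "left_double",     # tap + tap
    (40.0, 0.3): "right_click",     # tongue click
}

def classify_tap(duration_ms, peak_amplitude):
    """Return the mouse instruction whose template is nearest in feature space."""
    def dist(t):
        # scale amplitude so both features contribute comparably
        return (t[0] - duration_ms) ** 2 + ((t[1] - peak_amplitude) * 100) ** 2
    best = min(FEATURE_LIBRARY, key=dist)
    return FEATURE_LIBRARY[best]
```

A sample near a template (say 32 ms at 0.75 amplitude) resolves to that template's instruction; thresholding the best distance would let ambiguous sounds be ignored instead.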
In the cursor control method of the head-mounted device, the cursor-instruction execution module, when outputting the click event, either executes the cursor instruction or sends the cursor instruction out through the first wireless transmission module.
In the cursor control method of the head-mounted device, the cursor control instruction is sent through the first wireless transmission unit to control another computer terminal; when that computer terminal uses a fixed terminal display, the user must, at two or more different positions, calibrate head movement and eye movement by interactively sighting feature points on the terminal display, thereby obtaining the geometric-mapping calibration parameters and screen-distance information.
In the cursor control method of the head-mounted device, the motion tracker comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer.
(3) Beneficial effects: the cursor control method of the head-mounted device of the present invention exploits the naturally subconscious coordinated motion of the head and eyes and performs mouse control by sampling and recognizing eye movement, head movement, and tooth taps. The smart device can be worn on the body, freeing both hands, while achieving the most direct and precise means of expressing intent: what is seen is controlled. Operation is natural and effortless, the required motion is small, and the physical energy consumed in operation is reduced, which aids long sessions of use and helps disabled users.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the head-mounted device of the present invention;
Fig. 2 is a framework schematic diagram of the head-eye cursor control procedure of the present invention;
Fig. 3 is a schematic diagram of how the present invention obtains the geometric-mapping calibration parameters and screen-distance information.
Detailed description of the embodiments
The present invention is described in further detail below with reference to preferred embodiments.
The head-mounted device provided by the invention, as shown in Fig. 1, comprises a head-motion sensor, a bone-conduction microphone, and an eye tracker, each connected to a microprocessor; the microprocessor is connected to a first wireless transmission module.
The head-motion sensor includes a motion tracker, which comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer and is used to capture the motion of the user's head.
The microprocessor comprises the behavioral-psychology compensation module, the head-eye coordinated-superposition computation module, the cursor-instruction execution module, the voice-recognition module, the head-eye-movement feature library, and the bone-conduction waveform feature library.
The head-motion sensor comprises the motion tracker, the attitude-computation module, and the head-motion plane-mapping calibration module.
The eye tracker comprises the eye-movement recognition module, the filter-tracking module, the value-smoothing adjustment module, and the eye-movement plane-mapping calibration module.
Using the head-motion sensor, the eye tracker, the bone-conduction microphone, and the microprocessor, the head-mounted device runs the head-eye cursor control method: the head-motion data obtained by the motion sensor and the eye-motion data obtained by the eye tracker are superimposed to compute the cursor displacement; the bone-conduction microphone picks up tooth sounds, from which cursor confirmation-key actions are judged; the cursor information is then reflected on a head-mounted display, or another terminal is controlled wirelessly through the first wireless transmission module. For example, the head-mounted device can send the cursor control information through the first wireless transmission module to a second wireless transmission module of a computer terminal, and the computer terminal shows the cursor control information on its terminal display.
The method by which the head-motion sensor and the eye tracker perform cursor control, as shown in Fig. 2, specifically comprises the following steps:
Step A: the head-motion sensor maps the detected head-motion data to a two-dimensional head-motion coordinate displacement and sends it to the microprocessor; meanwhile, the eye tracker maps the detected eye-motion data to a two-dimensional eye-motion coordinate displacement and sends it to the microprocessor.
Step B: the microprocessor superimposes the two-dimensional head-motion and eye-motion coordinate displacements into a two-dimensional cursor-movement coordinate, while feeding correction values back to both the head-motion and eye-motion two-dimensional coordinate displacements.
Step C: the microprocessor monitors for, and executes via the voice-recognition module, the click event sent by the bone-conduction microphone; the click event is output immediately together with the two-dimensional cursor-movement coordinate, and the cursor instruction is either executed locally or sent to another terminal through the first wireless transmission module.
In the cursor control method of the head-mounted device of the present invention, the process in Step A by which the head-motion sensor maps the head-motion data to a two-dimensional head-motion coordinate displacement specifically comprises the following steps:
A-h1: the head-motion sensor optimizes the data collected in real time by the motion tracker into head-motion data and passes the head-motion data to the attitude-computation module.
A-h2: the attitude-computation module applies a jitter threshold and a weighted data-fusion algorithm to the angular velocity and acceleration in the head-motion data to obtain optimal attitude data, and passes the optimal attitude data to the head-motion plane-mapping calibration module.
A-h3: the head-motion plane-mapping calibration module maps the optimal attitude data to a two-dimensional head-motion coordinate parameter using the geometric-mapping calibration parameters obtained through user interaction, and sends the mapped two-dimensional head-motion coordinate parameter to the behavioral-psychology compensation module of the microprocessor.
The behavioral-psychology compensation module corrects the two-dimensional head-motion coordinate parameter mapped by the head-motion plane-mapping calibration module and feeds the correction value back to that module; the head-motion plane-mapping calibration module applies the correction value fed back by the behavioral-psychology compensation module and passes the corrected two-dimensional head-motion coordinate parameter to the head-eye coordinated-superposition computation module.
The process in Step A by which the eye tracker maps the detected eye-motion data to a two-dimensional eye-motion coordinate displacement specifically comprises the following steps:
A-e1: the eye-movement recognition module collects the eye-motion data, and the eye tracker transmits the data collected by the eye-movement recognition module to the filter-tracking module.
A-e2: the filter-tracking module obtains the pupil center and the corneal reflection point by image-feature filtering, obtains pupil-center tracking data by noise reduction and temporal tracking filtering, and passes the pupil-center tracking data to the value-smoothing adjustment module.
A-e3: the value-smoothing adjustment module adjusts the data according to the smoothing-filter value and noise-reduction function set by the user, and passes the adjusted pupil-center tracking data to the eye-movement plane-mapping calibration module.
A-e4: the eye-movement plane-mapping calibration module maps the pupil-center tracking data using the geometric-mapping calibration parameters obtained through user interaction and sends the mapped pupil-center tracking data to the behavioral-psychology compensation module; the behavioral-psychology compensation module corrects the mapped pupil-center tracking data and sends the correction feedback to the eye-movement plane-mapping calibration module.
The eye-movement plane-mapping calibration module corrects the mapped pupil-center tracking data according to the fed-back correction value and sends the corrected value to the head-eye coordinated-superposition computation module.
The behavioral-psychology compensation module takes the screen-center normal line, the user position, the screen-center point, the head-motion horizontal-plane center point, and the eye-movement center point as fixed references, then compares the head-motion and eye-motion coordinate displacements against the head-eye-movement feature library to find the differences.
The comparison of the head-motion and eye-motion coordinate displacements against the feature library comprises: recognizing the body-posture compensation actions triggered by tracking-psychology feedback when the gaze misses its target (for example, repeated short back-and-forth head movements, or abnormal divergence between head and eye movement) so as to obtain error values; correcting the error value of each error component of the deviation process with a recursive mean-square-error estimation algorithm; feeding the specific weighted error-correction values back to the head-motion and eye-movement plane-mapping calibration modules to correct cursor-movement error; and simultaneously supplying the dynamic superposition ratio to the head-eye coordinated-superposition computation module.
The core of the behavioral-psychology compensation module is the subconscious body-posture compensation behavior induced by visual-psychology target error, recognized through the application of behavioral-psychology compensation-motion recognition and Kalman filtering.
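The recursive mean-square-error estimation mentioned above belongs to the same family as the scalar Kalman filter; a textbook one-dimensional form is sketched below under assumed noise variances, as a stand-in for the patent's unspecified algorithm.

```python
# Minimal sketch of the recursive mean-square-error (Kalman-style) estimator
# the compensation module is said to apply. This is a textbook scalar Kalman
# filter, not the patent's actual algorithm; all constants are assumptions.

def kalman_1d(measurements, q=1e-3, r=0.1):
    """Recursively estimate a slowly drifting error value from noisy samples.

    q: process-noise variance, r: measurement-noise variance.
    Returns the list of filtered estimates.
    """
    x, p = 0.0, 1.0          # initial estimate and its error variance
    out = []
    for z in measurements:
        p += q               # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update estimate toward the measurement
        p *= (1.0 - k)       # shrink variance by what the sample told us
        out.append(x)
    return out
```

Fed a stream of raw error samples, the filter converges toward the underlying bias, which is the kind of smoothed error value the module could feed back to the plane-mapping calibration modules.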
In Step B of the present invention, the step in which the microprocessor superimposes the two-dimensional head-motion and eye-motion coordinate displacements into a two-dimensional cursor-movement coordinate is as follows:
The head-eye coordinated-superposition computation module superimposes the optimized head-motion and eye-motion data in a dynamic proportion, namely the dynamic superposition ratio supplied by the behavioral-psychology compensation module, so that the cursor uses different superposition ratios in different quadrants and different motion patterns, dynamically corrected by the behavioral-psychology compensation module; the user may also choose to drive the cursor with the head-motion data or the eye-motion data alone.
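One way to picture this dynamic proportional superposition is a quadrant-indexed ratio table plus a user switch for head-only or eye-only driving. The ratio values below are invented placeholders; in the patent the compensation module adjusts the ratio at run time.

```python
# Illustrative sketch of Step B's dynamic proportional superposition. The
# quadrant -> head-weight table is an invented placeholder, not the patent's
# tuning, and a mode switch models the head-only / eye-only option.

QUADRANT_HEAD_WEIGHT = {1: 0.6, 2: 0.5, 3: 0.5, 4: 0.4}  # assumed ratios

def quadrant(dx, dy):
    """Quadrant of a displacement; axes resolve toward quadrants 1 and 2."""
    if dx >= 0:
        return 1 if dy >= 0 else 4
    return 2 if dy >= 0 else 3

def superimpose_dynamic(head, eye, mode="both"):
    """Superimpose with a quadrant-dependent ratio, or pass one source through."""
    if mode == "head":
        return head
    if mode == "eye":
        return eye
    w = QUADRANT_HEAD_WEIGHT[quadrant(*eye)]
    return (w * head[0] + (1 - w) * eye[0],
            w * head[1] + (1 - w) * eye[1])
```

A per-quadrant (and, more generally, per-motion-pattern) weight lets coarse head motion dominate in some directions and fine eye motion in others, matching the "different ratios in different quadrants and motion patterns" above.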
The step in Step C in which the microprocessor executes the click event sent by the bone-conduction microphone is:
C1. the bone-conduction microphone, pressed against the upper forehead, passes the collected oral-cavity signal data to the voice-recognition module of the microprocessor.
C2. the voice-recognition module determines the user's intended instruction by comparison against the bone-conduction waveform feature library, and sends the cursor click-event instruction to the cursor-instruction execution module.
C3. the cursor-instruction execution module can execute the cursor instruction, outputting the click event immediately together with the cursor coordinate, or send the cursor instruction out through the first wireless transmission module.
Because an eye-movement cursor that issues its own commands cannot escape the Midas-touch problem, the present invention adds a bone-conduction microphone that recognizes sounds conducted through the oral cavity and teeth: a tap of the teeth is "left click", "tap + tap" is "left double-click", the sound of a tongue click is "right click", and combinations such as "tongue click + tooth tap" and "tooth tap + tongue click + tooth tap" are likewise bound to corresponding mouse control instructions. Because tooth tapping is a primal confirmation action in humans, it suits people's nature.
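The tooth-and-tongue sound vocabulary above amounts to a sequence-to-instruction table; a sketch follows, in which the token names and the bindings for the combined gestures are illustrative assumptions (the patent leaves those unassigned).

```python
# Hypothetical encoding of the tooth/tongue sound vocabulary as a
# sequence-to-instruction table. Token names and the bindings of the
# combined gestures are illustrative assumptions.

GESTURE_MAP = {
    ("tap",): "left_click",
    ("tap", "tap"): "left_double_click",
    ("tongue",): "right_click",
    ("tongue", "tap"): "middle_click",          # assumed extra binding
    ("tap", "tongue", "tap"): "drag_toggle",    # assumed extra binding
}

def decode_gesture(tokens):
    """Map a recognized sound-token sequence to a mouse instruction."""
    return GESTURE_MAP.get(tuple(tokens), "ignored")
```

Unrecognized sequences map to "ignored", which is exactly the property that defeats the Midas touch: no cursor command fires unless a deliberate confirmation sound was made.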
The head-mounted device provided by the invention sends cursor control instructions through the first wireless transmission unit to control another computer terminal. When that computer terminal uses a fixed terminal display, the user must, at two or more different positions, calibrate head movement and eye movement by interactively sighting feature points on the terminal display, as shown in Fig. 3, thereby obtaining the geometric-mapping calibration parameters and screen-distance information. The motion tracker then tracks the displacement after the head moves and computes the changed parameters at the new position, ensuring the accuracy of cursor motion.
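The interactive calibration can be pictured, per axis, as solving a scale and offset from sightings of two known screen feature points. A real calibration at two or more user positions would use more points and a full geometric mapping with screen distance, so this minimal linear form is only an assumption.

```python
# Rough per-axis sketch of the interactive plane-mapping calibration: from
# sightings of two known screen feature points, solve a linear map
# (scale + offset) from sensor angle to screen pixels. A real calibration
# would use more points and a full geometric mapping; this is an assumption.

def calibrate_axis(sensor_a, screen_a, sensor_b, screen_b):
    """Return (scale, offset) such that screen = scale * sensor + offset."""
    scale = (screen_b - screen_a) / (sensor_b - sensor_a)
    offset = screen_a - scale * sensor_a
    return scale, offset

def apply_axis(params, sensor_value):
    """Map one sensor reading to a screen coordinate on this axis."""
    scale, offset = params
    return scale * sensor_value + offset
```

Sighting the left and right screen edges at known head angles fixes the horizontal map; repeating the procedure at a second standing position is what lets the device re-derive the parameters after the user moves.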
The head-mounted device provided by the invention realizes a cursor control method in which eye movement, head movement, and tooth tapping together achieve "what is seen is controlled": a head-mounted device that integrates eye-movement recognition, head-movement recognition, and tooth-tap recognition in a human-centered way. It exploits the naturally subconscious coordinated motion of the head and eyes and, building on the subconscious body-posture compensation behavior induced by visual-psychology target error, applies behavioral-psychology compensation-motion recognition and Kalman filtering to establish a head-eye cursor control procedure that is natural and easy to use.
The cursor control method and head-mounted device of the present invention exploit the naturally subconscious coordinated motion of the head and eyes and perform mouse control by sampling and recognizing eye movement, head movement, and tooth taps. The smart device can be worn on the body, freeing both hands, while achieving the most direct and precise means of expressing intent: what is seen is controlled. Operation is natural and effortless, the required motion is small, and the physical energy consumed in operation is reduced, which aids long sessions of use and helps disabled users.
The above describes preferred embodiments of the present invention and may help those skilled in the art understand its technical solution more fully. These embodiments are merely illustrative, however, and the specific embodiments of the invention are not limited to them. For an ordinary technician in the field of the invention, simple deductions and transformations made without departing from the inventive concept shall all be considered to fall within the scope of protection of the present invention.

Claims (10)

1. A cursor control method for a head-mounted device, comprising the following steps:
Step A: the head-motion sensor maps the detected head-motion data to a two-dimensional head-motion coordinate displacement and sends it to the microprocessor; meanwhile, the eye tracker maps the detected eye-motion data to a two-dimensional eye-motion coordinate displacement and sends it to the microprocessor;
Step B: the microprocessor superimposes the two-dimensional head-motion and eye-motion coordinate displacements into a two-dimensional cursor-movement coordinate;
Step C: the microprocessor executes, via the voice-recognition module, the click event sent by the bone-conduction microphone, and outputs the click event together with the two-dimensional cursor-movement coordinate.
2. The cursor control method of the head-mounted device according to claim 1, characterized in that the process in Step A by which the head-motion sensor maps the head-motion data to a two-dimensional head-motion coordinate displacement specifically comprises the following steps:
A-h1: the head-motion sensor optimizes the data collected in real time by the motion tracker into head-motion data and passes the head-motion data to the attitude-computation module;
A-h2: the attitude-computation module applies a jitter threshold and a weighted data-fusion algorithm to the angular velocity and acceleration in the head-motion data to obtain optimal attitude data, and passes the optimal attitude data to the head-motion plane-mapping calibration module;
A-h3: the head-motion plane-mapping calibration module maps the optimal attitude data to a two-dimensional head-motion coordinate parameter using the geometric-mapping calibration parameters obtained through user interaction, and sends the mapped two-dimensional head-motion coordinate parameter to the behavioral-psychology compensation module of the microprocessor;
The behavioral-psychology compensation module corrects the two-dimensional head-motion coordinate parameter mapped by the head-motion plane-mapping calibration module and feeds the correction value back to that module; the head-motion plane-mapping calibration module applies the correction value fed back by the behavioral-psychology compensation module and passes the corrected two-dimensional head-motion coordinate parameter to the head-eye coordinated-superposition computation module.
3. The cursor control method of a head-mounted device according to claim 2, wherein the process by which the eye tracker of step A maps the detected eye movement data to an eye-movement two-dimensional coordinate displacement comprises the following steps:
A-e1. The eye movement recognition module collects eye movement data and transmits the collected data to the tracking filter module;
A-e2. The tracking filter module extracts the pupil center and corneal reflection spot by image feature filtering, obtains pupil center tracking data through noise reduction and temporal tracking filtering, and passes the pupil center tracking data to the assignment smoothing adjustment module;
A-e3. The assignment smoothing adjustment module adjusts the data according to the smoothing filter value and noise reduction function set by the user, and passes the adjusted pupil center tracking data to the eye-movement planar mapping and calibration module;
A-e4. The eye-movement planar mapping and calibration module maps the pupil center tracking data using the geometric mapping adjustment parameters obtained through user interaction, and sends the mapped pupil center tracking data to the behavioral psychology compensation module; the behavioral psychology compensation module corrects the mapped pupil center tracking data and sends the correction feedback value to the eye-movement planar mapping and calibration module;
The eye-movement planar mapping and calibration module corrects the mapped pupil center tracking data according to the fed-back correction value, and sends the corrected value to the head-eye-movement cooperative superposition computing module.
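Steps A-e2 and A-e3 amount to temporal filtering plus user-tunable smoothing of the pupil-center track. A minimal sketch using exponential smoothing (the patent does not name a specific filter, so the filter choice here is an assumption):

```python
def smooth_track(samples, alpha=0.4):
    """Exponentially smooth a stream of (x, y) pupil-center samples.

    alpha plays the role of the user-set smoothing value: lower alpha
    means heavier smoothing (less jitter, more lag).
    """
    out = []
    sx, sy = samples[0]  # seed the filter with the first sample
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out


# A step from (0, 0) to (10, 0) is only partially followed per frame:
print(smooth_track([(0, 0), (10, 0)], alpha=0.5))  # [(0.0, 0.0), (5.0, 0.0)]
```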
4. The cursor control method of a head-mounted device according to claim 3, wherein the correction method of the behavioral psychology compensation module comprises the following step:
The behavioral psychology compensation module takes the screen-center normal line, the user position, the screen center point, the head-movement horizontal-plane center point, and the eye-movement center point as fixed references, then compares the head-movement and eye-movement coordinate displacements against a feature database to identify deviations.
5. The cursor control method of a head-mounted device according to claim 4, wherein comparing the head-movement and eye-movement coordinate displacements against the feature database comprises: comparing the tracking-psychology feedback produced when the gaze target deviates, which triggers a compensating head and body posture action, to obtain an error value; and correcting each error component in the deviation process with a mean-square-error recursive estimation algorithm.
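One standard mean-square-error recursive estimator is the recursive sample mean, which minimizes the accumulated squared deviation as each new error value arrives. The patent does not specify its exact algorithm, so the following is only an illustrative sketch of the idea:

```python
class RecursiveMSEEstimator:
    """Running bias estimate that minimizes the sum of squared errors."""

    def __init__(self):
        self.n = 0
        self.estimate = 0.0

    def update(self, error: float) -> float:
        """Fold one observed error value into the estimate.

        Recursive form of the sample mean (the MSE-minimizing constant):
        est_n = est_{n-1} + (e_n - est_{n-1}) / n
        """
        self.n += 1
        self.estimate += (error - self.estimate) / self.n
        return self.estimate


est = RecursiveMSEEstimator()
for e in (1.0, 2.0, 3.0):
    value = est.update(e)
print(value)  # 2.0 — the mean of the observed errors
```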
6. The cursor control method of a head-mounted device according to claim 5, wherein step B specifically comprises the following step:
The head-eye-movement cooperative superposition computing module of the microprocessor performs a dynamic proportional superposition of the optimized head movement data and eye movement data, so that the cursor uses different superposition ratios in different quadrants and different motion patterns, dynamically corrected by the behavioral psychology compensation module.
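The dynamic proportional superposition of claim 6 might look like the sketch below, where the blend ratio depends on the quadrant the cursor currently occupies. The per-quadrant weights are invented for illustration; the patent does not disclose concrete ratios:

```python
def superpose(head_d, eye_d, cursor_pos, ratios=None):
    """Blend head and eye 2D displacements into one cursor displacement.

    head_d, eye_d: (dx, dy) displacement tuples
    cursor_pos:    current (x, y) cursor position, origin at screen center
    ratios:        {quadrant: (head_weight, eye_weight)}; hypothetical defaults
    """
    if ratios is None:
        ratios = {0: (0.7, 0.3), 1: (0.6, 0.4), 2: (0.6, 0.4), 3: (0.7, 0.3)}
    x, y = cursor_pos
    # Quadrants numbered 0..3: 0 = upper right, 1 = upper left,
    # 2 = lower right, 3 = lower left
    quadrant = (0 if x >= 0 else 1) + (2 if y < 0 else 0)
    wh, we = ratios[quadrant]
    return (wh * head_d[0] + we * eye_d[0],
            wh * head_d[1] + we * eye_d[1])


# In the upper-right quadrant, head movement dominates (0.7 vs 0.3):
print(superpose((10, 0), (0, 10), (1, 1)))  # ≈ (7.0, 3.0)
```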
7. The cursor control method of a head-mounted device according to claim 6, wherein in step C the microprocessor executes the click event issued via the bone conduction microphone as follows:
C1. The bone conduction microphone, held against the upper forehead, collects tooth-tap signal data from the oral cavity and passes it to the voice recognition module of the microprocessor;
C2. The voice recognition module determines the user's intended instruction by comparison against a bone conduction waveform feature library, and sends a cursor click event instruction to the cursor instruction execution module;
C3. The cursor instruction execution module outputs the click event together with the cursor coordinates.
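The waveform-library comparison of step C2 can be approximated by template matching. The sketch below uses a normalized dot product as a crude similarity score over toy templates; a real system would extract proper signal features, and the labels and sample data here are invented:

```python
def classify_tap(signal, library):
    """Return the library label whose template best matches the signal."""
    def score(a, b):
        # Normalized dot product (cosine similarity) as the match score
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        if na == 0 or nb == 0:
            return 0.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)

    return max(library, key=lambda label: score(signal, library[label]))


# Toy bone-conduction waveform feature library (hypothetical envelopes)
library = {
    "single_tap": [0.0, 1.0, 0.2, 0.0],
    "double_tap": [0.0, 1.0, 0.0, 1.0],
}
print(classify_tap([0.1, 0.9, 0.1, 0.8], library))  # double_tap
```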
8. The cursor control method of a head-mounted device according to claim 7, wherein the cursor instruction execution module executes the cursor instruction, including the click event output, and sends the cursor instruction through the first wireless transmission module.
9. The cursor control method of a head-mounted device according to claim 8, wherein the cursor control instruction is sent through the first wireless transmission module to control another computer terminal; and when that computer terminal uses a fixed terminal display, the user must perform, at two or more different positions, interactive calibration of direct-view feature points on the terminal display with correct head movement and eye movement, so as to obtain the geometric mapping adjustment parameters and the screen distance information.
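Calibrating from fixations at two or more positions reduces, per axis, to fitting the mapping parameters from (raw value, screen coordinate) pairs. A minimal sketch assuming a pure linear gain-plus-offset model per axis (a real calibration would fit a fuller geometric model and also recover screen distance):

```python
def calibrate_axis(samples):
    """Least-squares fit of screen ≈ gain * raw + offset for one axis.

    samples: [(raw_value, screen_coord), ...] with at least two entries,
    e.g. one pair per calibration position/feature point.
    """
    n = len(samples)
    sx = sum(r for r, _ in samples)
    sy = sum(s for _, s in samples)
    sxx = sum(r * r for r, _ in samples)
    sxy = sum(r * s for r, s in samples)
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset


# Two fixation samples pin down the line exactly:
print(calibrate_axis([(0.0, 100.0), (1.0, 300.0)]))  # (200.0, 100.0)
```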
10. The cursor control method of a head-mounted device according to claim 9, wherein the motion tracking device comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer.
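Fusing the gyroscope with the accelerometer (and magnetometer for yaw) is commonly done with a complementary filter. The patent does not name a fusion algorithm, so the pitch-only sketch below is just one standard way to combine the claimed sensors:

```python
import math


def complementary_pitch(prev_pitch, gyro_rate, accel, dt, k=0.98):
    """Blend integrated gyro rate with the accelerometer tilt estimate.

    prev_pitch: previous pitch estimate in radians
    gyro_rate:  pitch angular rate in rad/s (fast but drifts)
    accel:      (ax, ay, az) in g (absolute but noisy tilt reference)
    k:          trust weight for the gyro path
    """
    accel_pitch = math.atan2(-accel[0], math.hypot(accel[1], accel[2]))
    return k * (prev_pitch + gyro_rate * dt) + (1 - k) * accel_pitch


# Stationary and level: both paths agree on zero pitch.
print(complementary_pitch(0.0, 0.0, (0.0, 0.0, 1.0), 0.01))  # 0.0
```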
CN201310295425.5A 2013-07-16 2013-07-16 Cursor control method of a head-mounted device Active CN103336580B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310295425.5A CN103336580B (en) 2013-07-16 2013-07-16 Cursor control method of a head-mounted device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310295425.5A CN103336580B (en) 2013-07-16 2013-07-16 Cursor control method of a head-mounted device

Publications (2)

Publication Number Publication Date
CN103336580A true CN103336580A (en) 2013-10-02
CN103336580B CN103336580B (en) 2016-08-24

Family

ID=49244768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310295425.5A Active CN103336580B (en) 2013-07-16 2013-07-16 Cursor control method of a head-mounted device

Country Status (1)

Country Link
CN (1) CN103336580B (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103513770A (en) * 2013-10-09 2014-01-15 中国科学院深圳先进技术研究院 Man-machine interface equipment and man-machine interaction method based on three-axis gyroscope
CN103995607A (en) * 2014-05-22 2014-08-20 百度在线网络技术(北京)有限公司 Control method, control device, controlled method and controlled device
CN104571528A (en) * 2015-01-27 2015-04-29 王露 Eyeball tracking-based IT (intelligent terminal) control device and method
CN104866105A (en) * 2015-06-03 2015-08-26 深圳市智帽科技开发有限公司 Eye movement and head movement interactive method for head display equipment
CN105078404A (en) * 2015-09-02 2015-11-25 北京津发科技股份有限公司 Fully automatic eye movement tracking distance measuring calibration instrument based on laser algorithm and use method of calibration instrument
CN105242790A (en) * 2015-11-04 2016-01-13 钱雄 Intelligent equipment control method and control device
CN106020452A (en) * 2016-05-10 2016-10-12 北京行云时空科技有限公司 Cursor movement method and system based on intelligent headset device
WO2016169476A1 (en) * 2015-04-20 2016-10-27 Beijing Zhigu Rui Tuo Tech Co., Ltd. Method and device for determining head movement
CN109032347A (en) * 2018-07-06 2018-12-18 昆明理工大学 One kind controlling mouse calibration method based on electro-ocular signal
CN110308794A (en) * 2019-07-04 2019-10-08 郑州大学 There are two types of the virtual implementing helmet of display pattern and the control methods of display pattern for tool
CN110727349A (en) * 2019-09-29 2020-01-24 上海猫虎网络科技有限公司 Man-machine interaction method and AR glasses based on bone conduction interaction
CN112363621A (en) * 2020-11-13 2021-02-12 北京达佳互联信息技术有限公司 Terminal control method and device, electronic equipment and storage medium
CN113220183A (en) * 2021-05-24 2021-08-06 中国银行股份有限公司 System operation method and device, computer equipment and readable storage medium
US20220230659A1 (en) * 2021-01-15 2022-07-21 Facebook Technologies, Llc System for non-verbal hands-free user input

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5513649A (en) * 1994-03-22 1996-05-07 Sam Technology, Inc. Adaptive interference canceler for EEG movement and eye artifacts
JPH11281324A (en) * 1998-03-30 1999-10-15 Isuzu Motors Ltd Measuring apparatus for line of sight
US20020077831A1 (en) * 2000-11-28 2002-06-20 Numa Takayuki Data input/output method and system without being notified
CN101308400A (en) * 2007-05-18 2008-11-19 肖斌 Novel human-machine interaction device based on eye-motion and head motion detection
KR20110116580A (en) * 2010-04-19 2011-10-26 국방과학연구소 Line of sight tracking system integrated with head tracker and eye tracker and mehtod thereof
CN102930252A (en) * 2012-10-26 2013-02-13 广东百泰科技有限公司 Sight tracking method based on neural network head movement compensation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5513649A (en) * 1994-03-22 1996-05-07 Sam Technology, Inc. Adaptive interference canceler for EEG movement and eye artifacts
JPH11281324A (en) * 1998-03-30 1999-10-15 Isuzu Motors Ltd Measuring apparatus for line of sight
US20020077831A1 (en) * 2000-11-28 2002-06-20 Numa Takayuki Data input/output method and system without being notified
CN101308400A (en) * 2007-05-18 2008-11-19 肖斌 Novel human-machine interaction device based on eye-motion and head motion detection
KR20110116580A (en) * 2010-04-19 2011-10-26 국방과학연구소 Line of sight tracking system integrated with head tracker and eye tracker and mehtod thereof
CN102930252A (en) * 2012-10-26 2013-02-13 广东百泰科技有限公司 Sight tracking method based on neural network head movement compensation

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103513770A (en) * 2013-10-09 2014-01-15 中国科学院深圳先进技术研究院 Man-machine interface equipment and man-machine interaction method based on three-axis gyroscope
CN103995607A (en) * 2014-05-22 2014-08-20 百度在线网络技术(北京)有限公司 Control method, control device, controlled method and controlled device
CN103995607B (en) * 2014-05-22 2018-09-07 百度在线网络技术(北京)有限公司 Control method, control device and controlled method and controlled device
CN104571528B (en) * 2015-01-27 2017-09-05 王露 It is a kind of to realize that eyeball controls the device and method of intelligent terminal based on eyeball tracking
CN104571528A (en) * 2015-01-27 2015-04-29 王露 Eyeball tracking-based IT (intelligent terminal) control device and method
US10936052B2 (en) 2015-04-20 2021-03-02 Beijing Zhigu Rui Tuo Tech Co., Ltd Method and device for determining head movement according to electrooculographic information
WO2016169476A1 (en) * 2015-04-20 2016-10-27 Beijing Zhigu Rui Tuo Tech Co., Ltd. Method and device for determining head movement
CN104866105A (en) * 2015-06-03 2015-08-26 深圳市智帽科技开发有限公司 Eye movement and head movement interactive method for head display equipment
CN108170279B (en) * 2015-06-03 2021-07-30 塔普翊海(上海)智能科技有限公司 Eye movement and head movement interaction method of head display equipment
CN108153424B (en) * 2015-06-03 2021-07-09 塔普翊海(上海)智能科技有限公司 Eye movement and head movement interaction method of head display equipment
CN108153424A (en) * 2015-06-03 2018-06-12 塔普翊海(上海)智能科技有限公司 The eye of aobvious equipment is moved moves exchange method with head
CN108170279A (en) * 2015-06-03 2018-06-15 塔普翊海(上海)智能科技有限公司 The eye of aobvious equipment is moved moves exchange method with head
CN104866105B (en) * 2015-06-03 2018-03-02 塔普翊海(上海)智能科技有限公司 The eye of aobvious equipment is dynamic and head moves exchange method
CN105078404A (en) * 2015-09-02 2015-11-25 北京津发科技股份有限公司 Fully automatic eye movement tracking distance measuring calibration instrument based on laser algorithm and use method of calibration instrument
CN105242790B (en) * 2015-11-04 2019-06-21 钱雄 A kind of smart machine control method and control device
CN105242790A (en) * 2015-11-04 2016-01-13 钱雄 Intelligent equipment control method and control device
CN106020452A (en) * 2016-05-10 2016-10-12 北京行云时空科技有限公司 Cursor movement method and system based on intelligent headset device
CN109032347A (en) * 2018-07-06 2018-12-18 昆明理工大学 One kind controlling mouse calibration method based on electro-ocular signal
CN110308794A (en) * 2019-07-04 2019-10-08 郑州大学 There are two types of the virtual implementing helmet of display pattern and the control methods of display pattern for tool
CN110727349A (en) * 2019-09-29 2020-01-24 上海猫虎网络科技有限公司 Man-machine interaction method and AR glasses based on bone conduction interaction
CN110727349B (en) * 2019-09-29 2023-11-21 光感(上海)科技有限公司 Human-computer interaction method and AR (augmented reality) glasses based on bone conduction interaction
CN112363621A (en) * 2020-11-13 2021-02-12 北京达佳互联信息技术有限公司 Terminal control method and device, electronic equipment and storage medium
US20220230659A1 (en) * 2021-01-15 2022-07-21 Facebook Technologies, Llc System for non-verbal hands-free user input
CN113220183A (en) * 2021-05-24 2021-08-06 中国银行股份有限公司 System operation method and device, computer equipment and readable storage medium

Also Published As

Publication number Publication date
CN103336580B (en) 2016-08-24

Similar Documents

Publication Publication Date Title
CN103336580A (en) Cursor control method of head-mounted device
US11402902B2 (en) Gesture based user interfaces, apparatuses and systems using eye tracking, head tracking, hand tracking, facial expressions and other user actions
US11481031B1 (en) Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
Yang et al. Gesture interaction in virtual reality
US20190265802A1 (en) Gesture based user interfaces, apparatuses and control systems
Webster et al. Systematic review of Kinect applications in elderly care and stroke rehabilitation
Interrante et al. Seven league boots: A new metaphor for augmented locomotion through moderately large scale immersive virtual environments
US20220269346A1 (en) Methods and apparatuses for low latency body state prediction based on neuromuscular data
CA2747814C (en) Hands-free pointer system
US20180224930A1 (en) Immersive virtual reality locomotion using head-mounted motion sensors
CN107789803B (en) Cerebral stroke upper limb rehabilitation training method and system
Schwind et al. Up to the finger tip: The effect of avatars on mid-air pointing accuracy in virtual reality
Jantz et al. A brain-computer interface for extended reality interfaces
WO2023087954A1 (en) Upper limb rehabilitation training system for stroke patients
US11497440B2 (en) Human-computer interactive rehabilitation system
Cannan et al. A wearable sensor fusion armband for simple motion control and selection for disabled and non-disabled users
Motti et al. Introduction to wearable computers
US20200341550A1 (en) System, Method, and Apparatus for Virtual Reality Feedback
CN112230777A (en) Cognitive training system based on non-contact interaction
CN113035000A (en) Virtual reality training system for central integrated rehabilitation therapy technology
US20240028129A1 (en) Systems for detecting in-air and surface gestures available for use in an artificial-reality environment using sensors at a wrist-wearable device, and methods of use thereof
CN110473602B (en) Body state data collection processing method for wearable body sensing game device
US11079845B2 (en) System, method, and apparatus for therapy and computer usage
CN110097805A (en) A kind of piano study hand-type correcting device
Sharma et al. Gesture-controlled user interfaces

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160819

Address after: 200235 Shanghai, Caobao Road, No. 120, building 38, building 3, building

Patentee after: SHANGHAI TAPU INSTRUMENT MANUFACTURING CO., LTD.

Address before: 556000, Qiandongnan Miao and Dong Autonomous Prefecture, Kaili, Beijing West Road, 43 (309) No. four, building, Guizhou

Patentee before: Wei Rongjie

DD01 Delivery of document by public notice

Addressee: Taap Xiang Hai (Shanghai) Technology Co. Ltd.

Document name: Notification that the Application is Deemed Not to Have Been Submitted

C56 Change in the name or address of the patentee
CP03 Change of name, title or address

Address after: 201802 Shanghai, Jiading District, Shanghai Yi Road, building 412, room 5, 1082

Patentee after: Taap Yi Hai (Shanghai) Technology Co. Ltd.

Address before: 200235 Shanghai, Caobao Road, No. 120, building 38, building 3, building

Patentee before: SHANGHAI TAPU INSTRUMENT MANUFACTURING CO., LTD.