CN104199547A - Man-machine interactive type virtual touch device, system and method - Google Patents

Man-machine interactive type virtual touch device, system and method

Info

Publication number
CN104199547A
Authority
CN
China
Prior art keywords
hand
coordinate
image
unit
central point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410436118.9A
Other languages
Chinese (zh)
Other versions
CN104199547B (en)
Inventor
廖裕民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockchip Electronics Co Ltd
Original Assignee
Fuzhou Rockchip Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou Rockchip Electronics Co Ltd
Priority to CN201410436118.9A
Publication of CN104199547A
Application granted
Publication of CN104199547B
Legal status: Active

Abstract

The invention provides a man-machine interactive virtual touch device, system and method. In the method, hand recognition is performed on image data collected by cameras while the user suspends the operating hand flat in an image-capture region, so as to determine the position of the hand center in the image; using the camera resolution, the pixel position of the hand center point is converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane, from which the coordinate value of the hand center point in an XYZ three-dimensional coordinate system is established. Based on the three-dimensional coordinate value, the working mode is determined by the number of fingers the user uses, the user's operating position and operation mode are judged, a picture of the hand position and the effective operation on a virtual touch screen is drawn from the judgment result and the coordinates of all currently operable icons, and the drawn image is displayed. With this device, system and method, the user can conveniently perform free man-machine interaction through the virtual touch screen anytime and anywhere.

Description

Man-machine interactive virtual touch device, system and method
Technical field
The present invention relates to the field of virtual touch technology, and in particular to a man-machine interactive virtual touch device, system and method.
Background technology
The touch screen (touch control screen) is an important component of prior-art human-machine interaction. Existing touch-screen input depends on a physical touch-screen device; that is, a physically present touch screen is required to complete the interaction, which greatly limits the places and conditions under which human-machine interaction can be carried out.
Summary of the invention
In view of the problems above, the invention provides a man-machine interactive virtual touch device, system and method that overcome, or at least partly solve, those problems.
The invention provides a man-machine interactive virtual touch device comprising a display control unit and a display unit. The device comprises:
A view recognition unit, which performs hand recognition on the image data, collected by a camera, of the one hand among the user's two hands that is used for operation, so as to determine the position of the hand center in the image.
A horizontal-plane two-dimensional coordinate establishing unit, which converts the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane according to the position of the hand center in the image identified by the view recognition unit and the pixel resolution of the camera.
A vertical-plane two-dimensional coordinate establishing unit, which converts the hand center pixel position into a two-dimensional coordinate value on the YZ coordinate plane according to the position of the hand center in the image identified by the view recognition unit and the pixel resolution of the camera.
A three-dimensional coordinate computing unit, which establishes the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes determined respectively by the horizontal-plane and vertical-plane two-dimensional coordinate establishing units.
A working-mode judging unit, which selects the working mode according to the number of fingers the user uses in the recognized single-hand image.
An action judging unit, which judges the user's operating position and operation mode from the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit and the working mode selected by the working-mode judging unit; and
A graphic drawing unit, which draws a picture of the hand position and the effective operation on the virtual touch screen according to the judgment result of the action judging unit and the coordinates of all currently operable icons, and calls the display control unit to control the display unit to show the image of the user's operation, so that the user learns the position on the virtual touch screen corresponding to the current hand center point and, guided by this feedback, continues to move the hand to perform virtual touch operations.
The present invention also provides a man-machine interactive virtual touch system comprising the man-machine interactive virtual touch device described above and two image-capture devices communicatively connected to the device.
The present invention also provides a man-machine interactive virtual touch method, the method comprising:
The user suspends the one hand used for operation, among both hands, flat in the image-capture region, and hand recognition is performed on the image data collected by the cameras, so as to determine the position of the hand center in the image.
According to the identified position of the hand center in the image and the pixel resolution of the camera, the hand center pixel position is converted into two-dimensional coordinate values on the XZ and YZ coordinate planes respectively.
The coordinate value of the hand center point in the XYZ three-dimensional coordinate system is established from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes.
The working mode is selected according to the number of fingers the user uses in the recognized single-hand image.
The user's operating position and operation mode are judged from the established hand three-dimensional coordinate value and the determined working mode.
A picture of the hand position and the effective operation on the virtual touch screen is drawn according to the judgment result and the coordinates of all currently operable icons; and
The drawn image is displayed for the user to watch, so that the user learns the position on the virtual touch screen corresponding to the current hand center point and, guided by this feedback, continues to move the hand according to the displayed image to perform virtual touch operations.
With the man-machine interactive virtual touch device, system and method provided by the invention, the cameras capture images, image recognition locates the hand and determines the operating position, and the recognized finger count determines the working mode; the resulting hand three-dimensional coordinates are mapped directly onto operational actions on a virtual touch screen, which are displayed and fed back to the user. Touch-screen input thus no longer requires a physical device: a virtual touch-screen input environment can be built quickly with smart glasses and a smart bracelet, or with the cameras of a smart portable mobile device, so the user can conveniently perform free man-machine interaction through the virtual touch screen anytime and anywhere.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware architecture of the man-machine interactive virtual touch system in an embodiment of the present invention;
Fig. 2 is a functional block diagram of the man-machine interactive virtual touch device in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the principle for converting the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane in an embodiment of the present invention;
Fig. 4 is a schematic flowchart of the man-machine interactive virtual touch method in an embodiment of the present invention.
Label declaration:
System 100
Device 10
Ambient brightness sensing unit 101
View recognition unit 102
Longitudinal-view recognition subunit 1021
Transverse-view recognition subunit 1022
Finger-number judging unit 103
Working-mode judging unit 104
Horizontal-plane two-dimensional coordinate establishing unit 105
Vertical-plane two-dimensional coordinate establishing unit 106
Three-dimensional coordinate computing unit 107
Action judging unit 108
Graphic drawing unit 109
Display control unit 110
Display unit 111
Image-capture device 20
First camera 201
Second camera 202
Display device 21
Embodiment
To explain the technical content, structural features, objects and effects of the present invention in detail, embodiments are described below in conjunction with the accompanying drawings.
Referring to Fig. 1, a schematic diagram of the hardware architecture of the man-machine interactive virtual touch system in an embodiment of the present invention: the system 100 comprises a man-machine interactive virtual touch device 10, two image-capture devices 20 and a display device 21, and realizes touch input by detecting the user's gestures.
Referring to Fig. 2, a functional block diagram of the man-machine interactive virtual touch device in an embodiment of the present invention: the device 10 comprises an ambient brightness sensing unit 101, a view recognition unit 102, a finger-number judging unit 103, a working-mode judging unit 104, a horizontal-plane two-dimensional coordinate establishing unit 105, a vertical-plane two-dimensional coordinate establishing unit 106, a three-dimensional coordinate computing unit 107, an action judging unit 108, a graphic drawing unit 109, a display control unit 110 and a display unit 111. The device 10 can be applied in electronic equipment such as cameras, mobile phones and tablet computers. Each image-capture device 20 is connected to the device 10 through a network whose transmission medium may be a wireless medium such as Bluetooth, ZigBee or Wi-Fi.
Each image-capture device 20 includes a first camera 201 and a second camera 202, serving respectively as a longitudinal image-capture device and a transverse image-capture device. The first camera 201, as the longitudinal image-capture device, can be in a mobile portable electronic device positioned above the user's hand, such as smart glasses; the second camera 202, as the transverse image-capture device, can be in a mobile portable electronic device placed in front of the user, such as a smart bracelet. Further, the first camera 201 and the second camera 202 of each image-capture device 20 are an ordinary camera and an infrared camera respectively. The ordinary camera collects images under good lighting conditions and sends them to the device 10 for analysis of the user's operating actions; the infrared camera collects images under poor lighting conditions and sends them to the device 10 for the same analysis. The view recognition unit 102 comprises a longitudinal-view recognition subunit 1021 and a transverse-view recognition subunit 1022, arranged to correspond to the longitudinal and transverse image-capture devices respectively, for recognition processing of the images they collect.
In the initial state, two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set mutually orthogonal, so that the hand's motions in the vertical and horizontal directions can be captured simultaneously. Typically, the two cameras in the smart glasses (one ordinary and one infrared) face downward, while the two cameras on the smart bracelet or smartphone (one ordinary and one infrared) are placed horizontally. The rectangular areas photographed by the two pairs of cameras jointly form the image-capture region.
The ambient brightness sensing unit 101 senses the brightness value of the environment and sends it to the view recognition unit 102, which decides between the ordinary camera and the infrared camera according to a preset brightness threshold. For example, with a brightness sensing range of 1 to 100 and a threshold of 50, the ordinary camera is used when the ambient brightness value exceeds 50, and the infrared camera is used when it is below 50.
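A minimal sketch of this selection logic, assuming the 1-100 sensing range and threshold of 50 from the example above (the names `CameraType` and `select_camera` are illustrative, not from the patent):

```python
from enum import Enum

class CameraType(Enum):
    ORDINARY = "ordinary"   # visible-light camera, used in good lighting
    INFRARED = "infrared"   # infrared camera, used in poor lighting

BRIGHTNESS_THRESHOLD = 50   # preset threshold on the 1-100 sensing range

def select_camera(ambient_brightness: int) -> CameraType:
    """Pick the camera type from the sensed ambient brightness value."""
    if ambient_brightness > BRIGHTNESS_THRESHOLD:
        return CameraType.ORDINARY
    return CameraType.INFRARED
```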
After the camera type to be used has been determined from the ambient brightness value, the initial positioning operation starts, as follows. When the device 10 performs the initial positioning operation, the user clenches the hand to be used for operation into a fist and suspends it flat at a position the two groups of selected cameras can photograph, that is, the image-capture region, keeping it still for a certain time. This completes the user's hand-position initialization flow, allowing the device 10 to recognize and locate the initial position of the hand for subsequent operation. The principle by which the device 10 recognizes and locates the hand position is described in detail below.
In the time carrying out interactive operation, user will need unsettled the lying against in picture catching region of a hand (hereinafter to be referred as one hand) of operation in both hands, the ambient brightness value judgement that this longitudinal view recognin unit 1021 detects according to this ambient brightness sensing unit 101 is used common camera or infrared camera, and when after definite camera using, the longitudinal common camera of picture pick-up device of conduct above one hand or the view data of infrared camera collection being carried out to hard recognition, to determine the position of hand center in image.The ambient brightness value judgement that this transverse views recognin unit 1022 detects according to this ambient brightness sensing unit 101 is used common camera or infrared camera, and when determining after the camera using to carrying out hard recognition in singlehanded front as the common camera of horizontal picture pick-up device or the view data of infrared camera collection, to determine the position of hand center in image.
The position of the hand center in the image determined by the longitudinal-view recognition subunit 1021 is the position of the hand center pixel in the XZ-coordinate-plane image; for example, the hand center pixel lies at row a, column b of the XZ-plane image. The position determined by the transverse-view recognition subunit 1022 is the position of the hand center pixel in the YZ-coordinate-plane image.
Further, the methods for determining the hand center point with the ordinary camera include a color-background method and a colored-glove assisted method. In the color-background method, the environment behind the operating hand must be of relatively simple, uniform color, so the hand image can be extracted directly through the color interval of human skin tone; the row number of the center point is then obtained as the mean of the highest and lowest points of the extracted hand region, and its column number as the mean of the leftmost and rightmost points. In the colored-glove assisted method, the user wears special pure-red gloves; since ordinary cameras sample in RGB (red-green-blue), the pure-red region can be extracted directly (green or blue may also be used as the glove color). The row and column numbers of the center point are then obtained from the extracted region in the same way.
The methods for determining the hand center point with the infrared camera include a temperature-filtering method and a colored-glove assisted method. In the temperature-filtering method, the higher-temperature hand region is extracted directly from the image, exploiting the fact that the human body surface is warmer than the environment; the row number of the center point is then obtained as the mean of the highest and lowest points of the extracted hand region, and its column number as the mean of the leftmost and rightmost points. In the colored-glove assisted method, the user wears special gloves whose surface generates heat, so the hot region in the image can be extracted directly; the center point is then obtained from the extracted region in the same way.
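All four extraction methods reduce to the same center computation once a binary hand mask is available. A sketch in Python/NumPy under that assumption (the red-glove channel thresholds are illustrative placeholders, not values from the patent):

```python
import numpy as np

def red_glove_mask(image: np.ndarray) -> np.ndarray:
    """Colored-glove assisted method: threshold a pure-red glove in an
    RGB image. The channel thresholds are illustrative placeholders."""
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    return (r > 180) & (g < 80) & (b < 80)

def hand_center(mask: np.ndarray) -> tuple[int, int]:
    """Row number = mean of the highest and lowest mask points; column
    number = mean of the leftmost and rightmost points, per the patent."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        raise ValueError("no hand pixels found in mask")
    return (int(rows.min() + rows.max()) // 2,
            int(cols.min() + cols.max()) // 2)
```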
The horizontal-plane two-dimensional coordinate establishing unit 105 converts the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane according to the position of the hand center in the image identified by the longitudinal-view recognition subunit 1021 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the hand center pixel position into a two-dimensional coordinate value on the YZ coordinate plane according to the position of the hand center in the image identified by the transverse-view recognition subunit 1022 and the pixel resolution of the camera.
Referring to Fig. 3, the principle for converting the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows. The pixel at the lower-left corner of the image is set as the origin 0 of the two-dimensional coordinate system, and the ratios of the coordinate-value ranges to the image's row and column counts are computed from the image resolution. For example, if the XZ-plane image resolution is 2000*1000 (width*height) and the coordinate ranges of the two-dimensional XZ coordinate system are 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z coordinate range to the image row count is 100/1000 and the ratio of the X coordinate range to the image column count is 150/2000. The pixel position of the hand center point is multiplied by these row and column ratios to obtain its two-dimensional coordinate value. For example, if the hand center pixel is at row 300, column 200, its Z coordinate is 300*100/1000 = 30 and its X coordinate is 200*150/2000 = 15. The conversion to a two-dimensional coordinate value on the YZ coordinate plane follows the same principle and is not repeated here.
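A sketch of this conversion, reproducing the worked example above (the function name and parameters are illustrative):

```python
def pixel_to_plane(row: int, col: int, img_height: int, img_width: int,
                   vert_range: float, horiz_range: float) -> tuple[float, float]:
    """Scale a pixel position (row, col) into plane coordinates using the
    ratio of each coordinate range to the image's row/column count."""
    vertical = row * vert_range / img_height    # e.g. Z on the XZ plane
    horizontal = col * horiz_range / img_width  # e.g. X on the XZ plane
    return horizontal, vertical

# Worked example from the description: a 2000*1000 image, X range 150,
# Z range 100; the pixel at row 300, column 200 maps to X=15, Z=30.
x, z = pixel_to_plane(300, 200, img_height=1000, img_width=2000,
                      vert_range=100, horiz_range=150)
assert (x, z) == (15.0, 30.0)
```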
The three-dimensional coordinate computing unit 107 establishes the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes determined respectively by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106.
The working principle for establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system is as follows. Since the XZ and YZ coordinate planes share the Z axis, the Z values of the coordinate endpoints on the XZ plane and on the YZ plane are extracted and compared; endpoints whose Z coordinate values agree or are closest can be regarded as the same endpoint, and the XZ-plane coordinate value and YZ-plane coordinate value judged to belong to the same endpoint are merged into one coordinate endpoint, giving the coordinate value in the XYZ three-dimensional coordinate system. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2; the X and Y values of the three-dimensional coordinate equal the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
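A sketch of this merge, pairing each XZ endpoint with the YZ endpoint whose Z value is closest and averaging the two Z readings (names are illustrative):

```python
def merge_planes(xz_points: list[tuple[float, float]],
                 yz_points: list[tuple[float, float]]) -> list[tuple[float, float, float]]:
    """Merge (x, z) and (y, z) endpoints into (x, y, z) triples by pairing
    each XZ endpoint with the YZ endpoint whose Z value is closest."""
    merged = []
    for x, z1 in xz_points:
        y, z2 = min(yz_points, key=lambda p: abs(p[1] - z1))
        merged.append((x, y, (z1 + z2) / 2))  # Z values may differ: average them
    return merged

# One hand center seen at Z=30 on the XZ plane and Z=32 on the YZ plane:
print(merge_planes([(15.0, 30.0)], [(40.0, 32.0)]))  # [(15.0, 40.0, 31.0)]
```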
The longitudinal-view recognition subunit 1021 sends the position of the hand center in the image to the horizontal-plane two-dimensional coordinate establishing unit 105, and also sends the recognized hand image to the finger-number judging unit 103.
The finger-number judging unit 103 identifies, from the hand image, the number of fingers the user is operating with in the longitudinal view.
Specifically, the finger-number judging unit 103 determines the finger count by recognizing the finger endpoints in the hand image. The methods for recognizing finger endpoints with the ordinary camera include the color-background method and the colored-glove assisted method. In the color-background method, the environment behind the operating hand must be of relatively simple, uniform color, so the hand image can be extracted directly through the color interval of human skin tone; the cut-off positions of each strip extending from the hand are then computed with a figure-endpoint algorithm as the endpoint positions of the fingers, and the endpoints are counted. In the colored-glove assisted method, the user wears special gloves whose fingertips are pure red; since ordinary cameras sample in RGB (red-green-blue), the pure-red points can be extracted directly (green or blue fingertip colors may also be used), and the endpoints are then counted.
The methods for recognizing finger endpoints with the infrared camera include the temperature-filtering method and the colored-glove assisted method. In the temperature-filtering method, the higher-temperature hand region is extracted directly from the image, exploiting the fact that the human body surface is warmer than the environment; the cut-off positions of each strip extending from the hand are then computed with a figure-endpoint algorithm as the endpoint positions of the fingers, and the endpoints are counted. In the colored-glove assisted method, the user wears special gloves with heated points at each fingertip, so the hot points in the image can be extracted directly and counted.
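For the glove-assisted variants in particular, counting endpoints amounts to counting connected fingertip blobs in the extracted mask. A sketch using SciPy's connected-component labeling (the mask is assumed to come from one of the extraction methods above; the minimum-size filter is an illustrative noise guard, not part of the patent):

```python
import numpy as np
from scipy import ndimage

def count_finger_endpoints(fingertip_mask: np.ndarray, min_pixels: int = 20) -> int:
    """Count connected fingertip blobs, ignoring components smaller than
    min_pixels so that stray pixels are not counted as fingers."""
    labeled, n_components = ndimage.label(fingertip_mask)
    sizes = ndimage.sum(fingertip_mask, labeled, range(1, n_components + 1))
    return int(np.count_nonzero(np.asarray(sizes) >= min_pixels))
```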
The working-mode judging unit 104 selects the working mode according to the number of fingers the finger-number judging unit 103 determines the user is using. For example, when no finger is extended (the hand is clenched into a fist), the operation mode on the virtual touch screen is changing the position coordinate of the hand on the screen; when only one finger is used (the others clenched), the operation mode is selecting an icon; when two fingers are used, the operation mode is dragging the selected icon; when three fingers are used, the operation mode is sliding the whole screen. At most six working modes can be defined, for zero through five fingers, and the mode corresponding to each finger count can be redefined.
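A sketch of this finger-count-to-mode mapping as a redefinable table (the patent assigns no default modes to four and five fingers, so those entries are left undefined here):

```python
from enum import Enum

class Mode(Enum):
    MOVE_POSITION = 0  # fist: change the hand's position coordinate on screen
    SELECT_ICON = 1    # one finger: select an icon
    DRAG_ICON = 2      # two fingers: drag the selected icon
    SLIDE_SCREEN = 3   # three fingers: slide the whole screen

# Redefinable mapping from finger count to working mode (0-5 fingers,
# up to six modes); callers may reassign entries to customize behavior.
MODE_TABLE: dict[int, Mode] = {
    0: Mode.MOVE_POSITION,
    1: Mode.SELECT_ICON,
    2: Mode.DRAG_ICON,
    3: Mode.SLIDE_SCREEN,
}

def select_mode(finger_count: int) -> Mode | None:
    return MODE_TABLE.get(finger_count)
```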
The action judging unit 108 judges the user's operating position and operation mode from the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit 107, the working mode determined by the working-mode judging unit 104, and the XZ coordinate-range regions of the operable icons fed back by the graphic drawing unit 109.
In the present embodiment, during the hand-position initialization phase, the action judging unit 108 takes the lowest endpoint in the vertical direction among the three-dimensional coordinate values of the hand center point (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click-judgment plane, maps the three-dimensional coordinate value of the hand center point onto the operable area, and judges, in combination with the current working mode, the operating result corresponding to the hand currently passing through the click-judgment plane. For example, in the zero-finger mode of changing the hand's position coordinate on the screen, a click has no meaning; in the one-finger mode of selecting an icon, a click selects the icon on the screen; in the two-finger mode of dragging the selected icon, a click marks the start or end of dragging an icon; in the three-finger mode of sliding the whole screen, a click marks the start or end of the user's drag of the whole screen picture; and so on.
Since this is the hand-position initialization phase, the action judging unit 108 uses the Y value to set the initial value of the click-judgment plane, so the Y values of the hand coordinates are all greater than or equal to the decision value of the click-judgment plane. When the user then moves the hand and operates in a normal working mode, each time the action judging unit 108 receives the three-dimensional coordinate of the hand center point it no longer resets the Y-axis value of the click-judgment plane, but judges directly from that Y-axis value whether an effective click action has occurred on the virtual touch screen.
Mapping the three-dimensional coordinate value of the hand center point onto the operable area of the touch screen is specifically: the operable-area range is set to the coordinate-value range of the XZ coordinate plane, so the XZ-plane coordinate of the hand center point maps directly to a planar position coordinate in the operable area.
Judging a click action from the three-dimensional coordinate value of the hand center point is specifically: once the Y-axis value of the click-judgment plane has been chosen, whenever the Y value in the hand-center three-dimensional coordinate falls below that Y-axis value, the endpoint is judged to have passed through the click-judgment plane and a click has occurred; the region and position the user clicked are then determined from the hand center point.
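A sketch tying the click-plane rules together as a small stateful tracker (class and method names are illustrative):

```python
class ClickJudge:
    """Hold the click-judgment plane fixed after initialization and report
    a click whenever the hand center's Y value passes below it."""

    def __init__(self) -> None:
        self.plane_y: float | None = None

    def initialize(self, hand_coords: list[tuple[float, float, float]]) -> None:
        # The plane's Y-axis value is the minimum Y coordinate observed
        # during the hand-position initialization phase.
        self.plane_y = min(y for _, y, _ in hand_coords)

    def update(self, x: float, y: float, z: float) -> tuple[bool, tuple[float, float]]:
        """Return (clicked, (x, z)) where (x, z) is the hand center's
        position mapped directly onto the operable XZ area."""
        if self.plane_y is None:
            raise RuntimeError("hand position not initialized")
        return y < self.plane_y, (x, z)
```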
The graphic drawing unit 109 draws the picture of the hand position and the effective operation on the virtual touch screen according to the judgment result of the action judging unit 108 and the coordinates of all currently operable icons. At startup, the graphic drawing unit 109 initializes the XZ-plane coordinate values of all operable icons and feeds the XZ coordinate region of each operable icon back to the action judging unit 108. The graphic drawing unit 109 changes icon positions according to the different operations, makes the specific response for the position region where a click occurred (for example highlighting the selection, dragging, or deleting), sends the drawn image to the display control unit 110, and, after a position change, updates the operable icons' coordinate values and feeds them back to the action judging unit 108.
The display control unit 110 converts the image drawn by the graphic drawing unit 109 into timing signals the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual touch screen on the display device 21 for the user to watch. From this feedback the user learns the position on the virtual touch screen corresponding to his or her current hand center point, and can then continue moving the hand according to the displayed content to proceed with virtual touch operations.
Referring to Fig. 4, a schematic flowchart of the man-machine interactive virtual touch method in an embodiment of the present invention, the method comprises:
Step S30: the ambient brightness sensing unit 101 senses the brightness value of the environment, and the view recognition unit 102 decides between the ordinary camera and the infrared camera according to the preset brightness threshold and the ambient brightness value sensed by the ambient brightness sensing unit 101.
In the initial state, two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set mutually orthogonal, so that the hand's motions in the vertical and horizontal directions can be captured simultaneously. Typically, the two cameras in the smart glasses (one ordinary and one infrared) face downward, while the two cameras on the smart bracelet or smartphone (one ordinary and one infrared) are placed horizontally. The rectangular areas photographed by the two pairs of cameras jointly form the image-capture region.
Step S31: the user clenches the operating hand into a fist and suspends it flat in the image-capture region, keeping it still for a certain time, so that the device 10 recognizes and locates the initial position of the hand, completing the user's hand-position initialization.
The principle by which the device 10 recognizes and locates the hand position is described in detail below.
Step S32: the user suspends the hand used for operation (hereinafter, the single hand) flat in the image-capture region. The longitudinal-view recognition subunit 1021 performs hand recognition on the image data collected by the ordinary or infrared camera of the longitudinal image-capture device above the single hand, so as to determine the position of the hand center in the image. The transverse-view recognition subunit 1022 performs hand recognition on the image data collected by the ordinary or infrared camera of the transverse image-capture device in front of the single hand, so as to determine the position of the hand center in the image.
Specifically, the position of the hand center in the image determined by the longitudinal-view recognition subunit 1021 is the position of the hand center pixel in the XZ-coordinate-plane image; for example, the hand center pixel lies at row a, column b of the XZ-plane image. The position determined by the transverse-view recognition subunit 1022 is the position of the hand center pixel in the YZ-coordinate-plane image.
Further, the methods for determining the hand center point with the ordinary camera (the color-background method and the colored-glove assisted method) and with the infrared camera (the temperature-filtering method and the colored-glove assisted method) are as described above and are not repeated here.
Step S33: the horizontal-plane two-dimensional coordinate establishing unit 105 converts the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane according to the position of the hand center in the image identified by the longitudinal-view recognition subunit 1021 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the hand center pixel position into a two-dimensional coordinate value on the YZ coordinate plane according to the position of the hand center in the image identified by the transverse-view recognition subunit 1022 and the pixel resolution of the camera.
The principle for converting the hand center pixel position into the two-dimensional coordinate values on the XZ and YZ coordinate planes is as described above and is not repeated here.
Step S34: the three-dimensional coordinate computing unit 107 establishes the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes determined respectively by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106.
The method for establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system is as described above and is not repeated here.
Step S35: the finger-number judging unit 103 identifies, from the hand image, the number of fingers the user is operating with in the longitudinal view.
Specifically, the finger-number judging unit 103 determines the finger count by recognizing the finger endpoints in the hand image; the recognition methods for the ordinary camera (the color-background method and the colored-glove assisted method) and for the infrared camera (the temperature-filtering method and the colored-glove assisted method) are as described above and are not repeated here.
Step S36: the working-mode judging unit 104 selects the working mode according to the number of fingers the finger-number judging unit 103 determines the user is using.
The example assignments (a fist changes the hand's position coordinate on the screen; one finger selects an icon; two fingers drag the selected icon; three fingers slide the whole screen) and the redefinable modes for zero through five fingers are as described above.
Step S37: the action judging unit 108 judges the user's operating position and operation mode from the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit 107, the working mode determined by the working-mode judging unit 104, and the XZ coordinate-range regions of the operable icons fed back by the graphic drawing unit 109.
As described above, during the hand-position initialization phase the action judging unit 108 takes the minimum Y-axis coordinate value of the hand center point as the Y-axis value of the click-judgment plane and maps the hand center's three-dimensional coordinate value onto the operable area; during normal operation it no longer resets this plane but judges directly against it whether an effective click action has occurred on the virtual touch screen. The mapping onto the operable area and the judgment of click actions follow the methods described above and are not repeated here.
Step S38: the graphic drawing unit 109 draws the picture of the hand position and the effective operation on the virtual touch screen according to the judgment result of the action judging unit 108 and the coordinates of all currently operable icons.
The initialization and feedback of the operable icons' XZ coordinate regions, the drawing responses to clicks (for example highlighting the selection, dragging, or deleting), and the updating of icon coordinates after a position change are as described above.
Step S39: the display control unit 110 converts the image drawn by the graphic drawing unit 109 into timing signals the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual touch screen on the display device 21 for the user to watch. From this feedback the user learns the position on the virtual touch screen corresponding to his or her current hand center point, and can then continue moving the hand according to the displayed content to proceed with virtual touch operations.
With the man-machine interactive virtual touch device, system and method provided by the invention, the cameras capture images, image recognition locates the hand and determines the operating position, and the recognized finger count determines the working mode; the resulting hand three-dimensional coordinates are mapped directly onto operational actions on a virtual touch screen, which are displayed and fed back to the user. Touch-screen input thus no longer requires a physical device: a virtual touch-screen input environment can be built quickly with smart glasses and a smart bracelet, or with the cameras of a smart portable mobile device, so the user can conveniently perform free man-machine interaction through the virtual touch screen anytime and anywhere.
The foregoing is only embodiments of the present invention and does not thereby limit the scope of its claims. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (14)

1. A man-machine interactive virtual touch device, comprising a display control unit and a display unit, characterized in that the device comprises:
a view recognition unit, for performing hand recognition on the image data, collected by a camera, of the one hand among the user's two hands that is used for operation, so as to determine the position of the hand center in the image;
a horizontal-plane two-dimensional coordinate establishing unit, for converting the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane according to the position of the hand center in the image identified by the view recognition unit and the pixel resolution of the camera;
a vertical-plane two-dimensional coordinate establishing unit, for converting the hand center pixel position into a two-dimensional coordinate value on the YZ coordinate plane according to the position of the hand center in the image identified by the view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate computing unit, for establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes determined respectively by the horizontal-plane and vertical-plane two-dimensional coordinate establishing units;
a working-mode judging unit, for selecting the working mode according to the number of fingers the user uses in the recognized single-hand image;
an action judging unit, for judging the user's operating position and operation mode from the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit and the working mode selected by the working-mode judging unit; and
a graphic drawing unit, for drawing a picture of the hand position and the effective operation on the virtual touch screen according to the judgment result of the action judging unit and the coordinates of all currently operable icons, and for calling the display control unit to control the display unit to show the image of the user's operation, so that the user learns the position on the virtual touch screen corresponding to the current hand center point and, guided by this feedback, continues to move the hand according to the displayed image to perform virtual touch operations.
2. The man-machine interactive virtual touch device as claimed in claim 1, characterized by further comprising an ambient brightness sensing unit for sensing the brightness value of the environment;
the view recognition unit comprises:
a longitudinal-view recognition subunit, for deciding between the ordinary camera and the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit and, once the camera in use is determined, performing hand recognition on the collected image data to determine the position of the hand center pixel in the XZ-coordinate-plane image; and
a transverse-view recognition subunit, for deciding between the ordinary camera and the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit and, once the camera in use is determined, performing hand recognition on the collected image data to determine the position of the hand center pixel in the YZ-coordinate-plane image.
3. The man-machine interactive virtual touch device as claimed in claim 2, characterized by further comprising a finger-number judging unit, for identifying, from the hand image recognized by the longitudinal-view recognition subunit, the number of fingers the user uses in the longitudinal view.
4. The man-machine interactive virtual touch device as claimed in claim 1, characterized in that the conversion by the horizontal-plane two-dimensional coordinate establishing unit of the hand center pixel position into the two-dimensional coordinate value on the XZ coordinate plane, and the conversion by the vertical-plane two-dimensional coordinate establishing unit of the hand center pixel position into the two-dimensional coordinate value on the YZ coordinate plane, are specifically: the pixel at the lower-left corner of the image is set as the origin 0 of the two-dimensional coordinate system, and the ratios of the converted coordinate-value ranges to the image's row and column counts are computed from the image resolution.
5. The man-machine interactive virtual touch device as claimed in claim 1, characterized in that the action judging unit is further for, during the hand-position initialization phase, taking the lowest endpoint in the vertical direction among the three-dimensional coordinate values of the hand center point as the Y-axis value of the click-judgment plane, mapping the three-dimensional coordinate value of the hand center point onto the operable area, and judging, in combination with the current working mode, the operating result corresponding to the hand currently passing through the click-judgment plane.
6. The man-machine interactive virtual touch device as claimed in claim 1, characterized in that the graphic drawing unit is further for feeding back to the action judging unit the coordinate regions on the XZ coordinate plane of the initialized operable icons; the action judging unit judges the user's operating position and operation mode from the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit, the working mode determined by the working-mode judging unit, and the XZ coordinate-range regions of the operable icons fed back by the graphic drawing unit.
7. A man-machine interactive virtual touch system, characterized by comprising the man-machine interactive virtual touch device as claimed in any one of claims 1-6 and two image-capture devices communicatively connected to the device.
8. The man-machine interactive virtual touch system as claimed in claim 7, characterized in that each image-capture device comprises a first camera and a second camera, serving respectively as a longitudinal image-capture device and a transverse image-capture device, with their shooting directions set mutually orthogonal.
9. The man-machine interactive virtual touch system as claimed in claim 8, characterized in that the first camera and the second camera are an ordinary camera and an infrared camera respectively.
10. A man-machine interactive virtual touch method, characterized in that the method comprises:
the user suspends the one hand used for operation, among both hands, flat in the image-capture region, and hand recognition is performed on the image data collected by the cameras, so as to determine the position of the hand center in the image;
according to the identified position of the hand center in the image and the pixel resolution of the camera, the hand center pixel position is converted into two-dimensional coordinate values on the XZ and YZ coordinate planes respectively;
the coordinate value of the hand center point in the XYZ three-dimensional coordinate system is established from the two-dimensional coordinate values of the hand center pixel position on the XZ and YZ coordinate planes;
the working mode is selected according to the number of fingers the user uses in the recognized single-hand image;
the user's operating position and operation mode are judged from the established hand three-dimensional coordinate value and the determined working mode;
a picture of the hand position and the effective operation on the virtual touch screen is drawn according to the judgment result and the coordinates of all currently operable icons; and
the drawn image is displayed for the user to watch, so that the user learns the position on the virtual touch screen corresponding to the current hand center point and, guided by this feedback, continues to move the hand according to the displayed image to perform virtual touch operations.
11. The man-machine interactive virtual touch method as claimed in claim 10, characterized in that, before the step of the user suspending the operating hand flat over the image capture region and performing hand recognition on the image data collected by the cameras to determine the position of the hand center in the image, the method comprises:
selecting the common camera or the infrared camera according to the sensed ambient brightness value and a preset brightness threshold; and
the user suspending a clenched fist of the operating hand flat over the image capture region and keeping it still for a certain time, so that the selected camera identifies and locates the initial position of the hand.
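The camera-selection step reduces to a threshold comparison on the sensed brightness. A minimal sketch, assuming the common camera is preferred in good light and the infrared camera in low light; the claim states only that the sensed value is judged against a preset threshold, not the direction of the comparison:

```python
def select_camera(ambient_brightness: float, threshold: float) -> str:
    """Choose the capture device from the sensed ambient brightness.
    The comparison direction is an assumption."""
    return "common" if ambient_brightness >= threshold else "infrared"
```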
12. The man-machine interactive virtual touch method as claimed in claim 11, characterized in that, after the step of the selected camera identifying and locating the initial position of the hand, the method further comprises:
taking the lowest vertical endpoint in the three-dimensional coordinate value of the hand central point as the Y-axis value of a click judgment plane, mapping the three-dimensional coordinate value of the hand central point onto the operable area, and judging, in combination with the current working mode, the operation result corresponding to the hand currently passing through the click judgment plane.
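A click is thus registered when the hand centre crosses a horizontal plane fixed at that lowest Y value. A sketch of frame-to-frame crossing detection; the function name and the downward-crossing semantics are assumptions:

```python
def crossed_click_plane(prev_y: float, curr_y: float, plane_y: float) -> bool:
    """True when the hand centre moves downward through the click judgment
    plane between two consecutive frames."""
    return prev_y > plane_y >= curr_y
```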
13. The man-machine interactive virtual touch method as claimed in claim 10, characterized in that the step of performing hand recognition on the image data collected by the cameras to determine the position of the hand center in the image is specifically:
determining the positions of the hand central point pixel in the image on the XZ coordinate plane and the YZ coordinate plane respectively.
14. The man-machine interactive virtual touch method as claimed in claim 13, characterized in that the step of converting the pixel position of the hand central point into two-dimensional coordinate values on the XZ and YZ coordinate planes, according to the identified position of the hand center in the image and the pixel resolution of the cameras, is specifically:
setting the pixel at the lower-left corner of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution, the ratio of the converted two-dimensional coordinate value range to the number of rows and columns of the image.
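Concretely, each pixel column and row is scaled by the ratio of the coordinate range to the image resolution, with the vertical axis flipped so the origin sits at the lower-left corner. A sketch of that conversion; pixel_to_plane is a hypothetical name, and top-left pixel indexing of the input image is an assumption:

```python
def pixel_to_plane(px: int, py: int, cols: int, rows: int,
                   range_u: float, range_v: float) -> tuple[float, float]:
    """Map a pixel (column px, row py, indexed from the image's top-left)
    to plane coordinates whose origin is the image's lower-left corner."""
    su = range_u / cols            # coordinate units per pixel column
    sv = range_v / rows            # coordinate units per pixel row
    return (px * su, (rows - 1 - py) * sv)  # flip the vertical axis
```

For example, with a 640x480 frame mapped onto a 32x24 coordinate range, pixel (320, 240) maps to (16.0, 11.95).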
CN201410436118.9A 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method Active CN104199547B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410436118.9A CN104199547B (en) 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method

Publications (2)

Publication Number Publication Date
CN104199547A true CN104199547A (en) 2014-12-10
CN104199547B CN104199547B (en) 2017-05-17

Family

ID=52084848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410436118.9A Active CN104199547B (en) 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method

Country Status (1)

Country Link
CN (1) CN104199547B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04271423A (en) * 1991-02-27 1992-09-28 Nippon Telegr & Teleph Corp <Ntt> Information input method
US20090077504A1 (en) * 2007-09-14 2009-03-19 Matthew Bell Processing of Gesture-Based User Interactions
CN101799717A (en) * 2010-03-05 2010-08-11 天津大学 Man-machine interaction method based on hand action catch
WO2013018099A2 (en) * 2011-08-04 2013-02-07 Eyesight Mobile Technologies Ltd. System and method for interfacing with a device via a 3d display
WO2013018099A3 (en) * 2011-08-04 2013-07-04 Eyesight Mobile Technologies Ltd. System and method for interfacing with a device via a 3d display

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107710114A (en) * 2015-06-25 2018-02-16 富士通株式会社 Electronic equipment and drive control method
CN105929957A (en) * 2016-04-26 2016-09-07 深圳市奋达科技股份有限公司 Control method, apparatus and device for intelligent glasses
CN106200904A (en) * 2016-06-27 2016-12-07 乐视控股(北京)有限公司 A kind of gesture identifying device, electronic equipment and gesture identification method
CN106802717A (en) * 2017-01-20 2017-06-06 深圳奥比中光科技有限公司 Space gesture remote control thereof and electronic equipment
CN106933347A (en) * 2017-01-20 2017-07-07 深圳奥比中光科技有限公司 The method for building up and equipment in three-dimensional manipulation space
CN106951087B (en) * 2017-03-27 2020-02-21 联想(北京)有限公司 Interaction method and device based on virtual interaction plane
CN106951087A (en) * 2017-03-27 2017-07-14 联想(北京)有限公司 A kind of exchange method and device based on virtual interacting plane
CN110275447A (en) * 2018-03-13 2019-09-24 发那科株式会社 Control device, control method and control program
CN110275447B (en) * 2018-03-13 2021-03-09 发那科株式会社 Control device, control method, and computer-readable medium
CN112306305A (en) * 2020-10-28 2021-02-02 黄奎云 Three-dimensional touch device
CN112306305B (en) * 2020-10-28 2021-08-31 黄奎云 Three-dimensional touch device
CN112506372A (en) * 2020-11-30 2021-03-16 广州朗国电子科技有限公司 Graffiti spray drawing method and device based on touch screen and storage medium
CN115617178A (en) * 2022-11-08 2023-01-17 润芯微科技(江苏)有限公司 Method for completing key and function triggering without contact between fingers and car machine
CN116661656A (en) * 2023-08-02 2023-08-29 安科优选(深圳)技术有限公司 Picture interaction method and shooting display system
CN116661656B (en) * 2023-08-02 2024-03-12 安科优选(深圳)技术有限公司 Picture interaction method and shooting display system

Also Published As

Publication number Publication date
CN104199547B (en) 2017-05-17

Similar Documents

Publication Publication Date Title
CN104199550B (en) Virtual keyboard operation device, system and method
CN104199547A (en) Man-machine interactive type virtual touch device, system and method
CN104199548A (en) Man-machine interactive type virtual touch device, system and method
EP2790089A1 (en) Portable device and method for providing non-contact interface
CN106959808A (en) A kind of system and method based on gesture control 3D models
Caputo et al. 3D hand gesture recognition based on sensor fusion of commodity hardware
CN102270037B (en) Manual human machine interface operation system and method thereof
US20140022171A1 (en) System and method for controlling an external system using a remote device with a depth sensor
CN103197885A (en) Method for controlling mobile terminal and mobile terminal thereof
CN103809866A (en) Operation mode switching method and electronic equipment
EP3679825A1 (en) A printing method and system of a nail printing apparatus, and a medium thereof
CN113209601B (en) Interface display method and device, electronic equipment and storage medium
CN112947825A (en) Display control method, display control device, electronic device, and medium
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
CN104267802A (en) Human-computer interactive virtual touch device, system and method
CN104808936A (en) Interface operation method and interface operation method applied portable electronic device
KR101512239B1 (en) System and method for transfering content among devices using touch command and unusual touch
CN107239222A (en) The control method and terminal device of a kind of touch-screen
CN108845752A (en) touch operation method, device, storage medium and electronic equipment
CN114816135B (en) Cross-device drawing system
CN104199549A (en) Man-machine interactive type virtual touch device, system and method
CN106325726A (en) A touch control interaction method
WO2007088430A1 (en) System, device, method and computer program product for using a mobile camera for controlling a computer
US11500453B2 (en) Information processing apparatus
CN103092389A (en) Touch screen device and method for achieving virtual mouse action

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant after: FUZHOU ROCKCHIP ELECTRONICS CO., LTD.

Address before: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant before: Fuzhou Rockchip Semiconductor Co., Ltd.

COR Change of bibliographic data
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee after: Ruixin Microelectronics Co., Ltd

Address before: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee before: Fuzhou Rockchips Electronics Co.,Ltd.
