WO1998008159A2 - Force feedback mouse - Google Patents

Force feedback mouse

Info

Publication number
WO1998008159A2
WO1998008159A2 PCT/CA1997/000585
Authority
WO
WIPO (PCT)
Prior art keywords
mouse
computer interface
force
cursor
freedom
Prior art date
Application number
PCT/CA1997/000585
Other languages
French (fr)
Other versions
WO1998008159A3 (en)
Inventor
Eric Gregory Kubica
Kevin Lloyd Tuer
Daniel Richard Madill
Kevin Brian Krauel
Matthew E. Cecile
Original Assignee
Control Advancements Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Control Advancements Inc. filed Critical Control Advancements Inc.
Priority to CA002263988A priority Critical patent/CA2263988A1/en
Priority to AU39361/97A priority patent/AU3936197A/en
Publication of WO1998008159A2 publication Critical patent/WO1998008159A2/en
Publication of WO1998008159A3 publication Critical patent/WO1998008159A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Definitions

  • the present invention relates generally to a computer interface and more particularly to a mouse having force feedback.
  • In the known computer interface, movement of a computer mouse in an X- and/or Y-direction on a table moves a cursor or other graphical element on a computer display in a corresponding direction.
  • the user operates numerous functions on a graphical user interface, such as pull down menus, activating icons, scrolling windows, etc. by moving the mouse and selectively activating a button on the mouse.
  • the known computer mouse does not provide tactile or force feedback relating to the interaction between the cursor and computer generated objects on the screen, i.e. the user cannot "feel" the objects displayed on the screen. As a result, many people have difficulty operating a computer mouse.
  • One proposed computer input device offers force feedback relating to the cursor interaction with objects on the computer screen. That device utilizes electromagnetic flat coil actuators to generate electromagnetic forces on a handle. However, the electromagnetic flat coil actuators utilized in this computer input device are expensive and generate strong magnetic fields which interfere with the operation of the computer or which could damage computer disks. This computer input device requires an additional computer dedicated solely to controlling the input device.
  • United States Patent Number 4,604,016 discloses a hand controller having force feedback for teleoperation of a tool for surgery.
  • the forces encountered by the tool are translated by a computer to torque motors, thereby providing a real time interactive feedback response enabling a surgeon to "feel" an operation.
  • the position and orientation of the controller are determined by the lengths of twelve lines between the controller and the support structure.
  • the twelve control lines are interconnected with the plurality of torque motors which are variably programmed by the computer to apply tension to each of the lines based upon the force encountered by the tool.
  • This device is large and requires a large number of control lines and motors.
  • the numerous control lines and motors complicate programming of software applications which could utilize the input device. Play in the numerous control lines and friction reduce the precision of the response and feedback of the device.
  • the patent does not disclose force feedback based upon the interaction between a cursor and objects on a computer screen or force feedback based upon the movement of the input device.
  • a force feedback mouse for use in a graphical user interface has been proposed, it has not been implemented in a commercially available operating system. Further, operation of the proposed force feedback mouse has required the use of an additional microprocessor controller dedicated to operation of the mouse.
  • the present invention provides a computer input device having force feedback which is less expensive and simpler than previously known computer input devices.
  • the computer interface of the present invention further imparts a force on a mouse which is based upon the movement of the mouse, thereby providing friction and inertia compensation and the simulation of such effects as viscosity.
  • the computer interface generally comprises an input device and a display connected to a CPU.
  • the input device generally comprises a mouse movable in two degrees of freedom (D.O.F.), preferably along an X-axis and a generally perpendicular Y-axis on a surface. Movement of the mouse in the X and Y-directions generally causes a corresponding movement of a cursor on the display.
  • the mouse is slidably mounted to a first rail generally parallel to the X-axis.
  • the first rail is in turn slidably mounted on a second rail for movement generally parallel to the Y-axis.
  • a pair of motors and belts impart forces on the mouse along the X and Y axes.
  • An encoder connected to each motor measures the movement of the mouse by the user or by the motors.
  • the mouse is fixed to a rail having a rack engaged by a gear and a motor imparting a force parallel to the Y-axis.
  • the mouse, rail, rack, gear and motor are slidably mounted to a first rail for movement in the X-direction.
  • a motor and belt impart a force on the mouse generally parallel to the X-axis.
  • Each of the motors includes a sensor, preferably an encoder, for indicating the displacement of the mouse, from which velocity and acceleration, including direction, can be calculated.
  • Each of the motors imparts force on the mouse along its associated axis based upon movement of the mouse.
  • a motor imparts a force upon the mouse to compensate for friction when it detects motion of the mouse along its associated axis.
  • Each motor also imparts a force corresponding to detected acceleration of the mouse in order to compensate for inertia along the associated axis.
  • the motors selectively impart a force upon the mouse which is generally linearly proportional to the detected velocity of the mouse, opposite the direction of the detected velocity.
  • the mouse is driven by each of the motors to extreme positions along either axis until the mouse or its associated hardware contacts a stop.
  • the CPU and input device detect the impact of the mouse with a stop at each extreme, thereby defining the range of motion of the mouse along each axis to calibrate the motion of the mouse with the motion of a cursor on a display.
  • the stops could comprise limit switches.
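The calibration pass described above can be sketched as follows. The `drive` and `read_encoder` callbacks, the step size and the stall-detection count are illustrative assumptions; the patent specifies only that each axis is driven to its stops and the extremes recorded.

```python
def calibrate_axis(drive, read_encoder, step=0.2, settle=5):
    """Drive one axis toward each stop; the encoder count stops changing
    once the hardware contacts the stop, marking one extreme of travel."""
    extremes = []
    for direction in (-1.0, +1.0):
        last, stalled = read_encoder(), 0
        while stalled < settle:
            drive(direction * step)       # constant push toward the stop
            now = read_encoder()
            stalled = stalled + 1 if now == last else 0
            last = now
        extremes.append(last)
        drive(0.0)                        # release the motor
    lo, hi = sorted(extremes)
    return lo, hi                         # encoder range for this axis
```

The returned range can then be mapped linearly onto the display's pixel range to calibrate mouse motion against cursor motion.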
  • the computer interface also provides force feedback based upon the interaction of a cursor on a display.
  • For example, in a graphical user interface, the user can "feel" icons, windows and menus on a display.
  • the motors also assist the user in operating the graphical user interface, such as by preventing the cursor from inadvertently sliding off the side of a pull down menu or a scroll bar thumb.
  • Figure 1 is a schematic of the computer interface of the present invention.
  • Figure 2 is a top view of the computer input device shown in Figure 1.
  • Figure 3 is a sectional view taken along line 3-3 of Figure 2.
  • Figure 4 is a perspective view of an alternate input device to use with the computer interface of Figure 1.
  • Figure 5 is a top view of the input device of Figure 4, partially broken away.
  • Figure 6 is a sectional view of the input device of Figure 5 taken along line 6-6.
  • Figure 7 is a graph showing a method for compensating for friction for the computer input device of Figure 1 or Figure 4.
  • Figure 8 is a graph of an alternate method for compensating for friction in the computer input device shown in Figure 1 or Figure 4.
  • Figure 9 is a graph of a method for compensating for inertia in the computer input device of Figure 1 or Figure 4.
  • Figure 10 is a graph of a method for providing a viscous force feedback for the computer input device of Figure 1 or Figure 4.
  • Figure 11 is a graph of a method for providing the feel of a wall in the computer interface of Figure 1.
  • Figure 12 is a graph of a method for providing the feel of gravity or a potential well for the computer interface of Figure 1.
  • Figure 13 is a graph of a method for providing variable friction areas in the computer interface of Figure 1.
  • Figure 14 is one potential screen displayed by the display of Figure 1.
  • Figure 15 is a schematic of the software operating the computer interface of Figure 1.
  • the present invention provides a computer interface 20 including an input device 22 and a display 24 connected to a CPU 26 as shown in Figure 1.
  • the input device 22 generally includes a mouse 28 moveable in an X-direction and a Y-direction on surface 30 and including user activated buttons 32.
  • the CPU 26 includes a microprocessor 33, memory 34 such as RAM and ROM and a mass storage 35 such as a hard drive or CD-ROM, tape, etc.
  • the memory 34 and storage 35 are programmed such that the microprocessor 33 sends and receives signals to and from the input device 22. Movement of the mouse 28 in the X- and Y-directions typically results in a corresponding motion of a cursor 36 on the display 24 interacting with screen objects 38, such as simulated objects, or graphical user interface items such as menus, windows, slider bars, etc.
  • "Cursor" 36 as used herein refers to any object on the display 24.
  • the mouse 28 is mounted to an X-slide 44 which is slidably mounted on an X-rail 46 for movement along the X-axis.
  • a non-slip belt 50 driven by a motor 52 imparts force on the mouse 28 and the X-slide 44 in either direction along the X-axis.
  • the motor 52 includes a sensor 54 indicating the amount of movement of the mouse 28, preferably an encoder 54 or alternatively a resolver.
  • the mouse 28 and X-slide 44 are moveable along the X-rail 46 between a stop 56 on an end of the X-rail 46 and a Y-slide 60 secured to the X-rail 46.
  • the Y-slide 60 is slidable in either direction along a Y-rail 62 extending along a Y-axis generally perpendicular to the X-axis.
  • the motor 52, belt 50, X-rail 46, X-slide 44 and mouse 28 are all mounted to the Y-slide for movement together along the Y-axis between stops 64 at either end of the Y-rail 62.
  • a belt 66 driven by a motor 68 having an encoder 70 imparts force on the Y-slide in either direction along the Y-axis.
  • Belts 50, 66 are preferably synchromesh cable, available commercially from Stock Drive Products.
  • the motors 52, 68 have a peak torque of at least
  • the motors 52, 68 are powered by servo drives 42 which preferably utilize pulse-width modulation. As is known for servo-controlled motors, the servo drives 42 regulate the current and voltage supplied to the motors 52, 68 and may at times monitor the consumption of current and voltage by the motors 52, 68 in order to precisely control the force generated by the motors 52, 68.
  • the encoders 54, 70 preferably have a resolution of at least 1000 pulses per revolution and utilize quadrature decoding. Unlike a conventional mouse, the mouse 28 need not actually make direct contact with the surface 30 because movement of the mouse 28 is sensed by the encoders 54, 70 via the motors 52, 68.
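Quadrature decoding of the two encoder channels can be sketched as below. The 2-bit state encoding (state = 2·A + B) and the sample streams are illustrative; any real decoder would run on transitions in hardware or an interrupt handler rather than on a stored list.

```python
# The two channels A and B are 90 degrees out of phase; each valid
# transition adds or subtracts one count, giving 4x the line resolution
# and, via the sign, the direction of rotation.
_TRANSITIONS = {
    (0, 1): +1, (1, 3): +1, (3, 2): +1, (2, 0): +1,   # one direction
    (0, 2): -1, (2, 3): -1, (3, 1): -1, (1, 0): -1,   # the other
}

def decode(states):
    """Accumulate a signed count from a sequence of 2-bit (A,B) states."""
    count = 0
    for prev, cur in zip(states, states[1:]):
        count += _TRANSITIONS.get((prev, cur), 0)     # ignore invalid jumps
    return count
```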
  • the belt 50 is looped around a pair of pulleys 72 and secured at either end to the X-slide 44.
  • a single motor 52 directly driving one of the pulleys 72 drives the X-slide 44 in either direction along the X-axis.
  • the Y-axis motor 68 having encoder 70 drives the belt 66 which imparts a force on the mouse 28 along the Y-axis.
  • movement of the mouse 28 along either the X-axis, Y-axis or a combination drives belts 50, 66 rotating motors 52, 68 and encoders 54, 70.
  • the encoders 54, 70 generate signals indicative of the displacement and direction of the mouse 28. From these signals, the CPU 26 can derive the velocity, acceleration, etc. of the mouse 28.
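The derivation of velocity and acceleration from encoder displacement can be sketched as below. The function name, sampling scheme and `counts_per_unit` scale are illustrative assumptions, not part of the patent; the patent only states that the CPU derives these quantities from the encoder signals.

```python
def derive_motion(counts, dt, counts_per_unit=1000.0):
    """Finite-difference estimates of displacement, velocity and
    acceleration from encoder counts sampled every dt seconds."""
    x = [c / counts_per_unit for c in counts]          # displacement
    v = [(b - a) / dt for a, b in zip(x, x[1:])]       # velocity
    acc = [(w - u) / dt for u, w in zip(v, v[1:])]     # acceleration
    return x, v, acc
```

The sign of each difference carries the direction of motion, which the force-feedback methods below rely on.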
  • the CPU 26 controls the servo drives 42 to control the motors 52, 68 to impart forces on the mouse 28 as will be described in more detail below.
  • the forces imparted by motors 52, 68 are precisely controlled by servo drives 42.
  • the servo drives 42 may also monitor the power consumption of motors 52, 68 at times.
  • An alternate input device 80 for use with the computer interface 20 of Figure 1 is shown in Figure 4, generally comprising a mouse 82 movable in two degrees of freedom (D.O.F.), such as along an X-axis and a Y-axis on a surface 84.
  • the surface 84 can be secured to a housing 86 or can be a desktop or tabletop surface 84.
  • the mouse 82 need not be located directly on a surface in order to operate, but preferably the mouse 82 is moveable in two degrees of freedom in a plane.
  • the mouse 82 preferably includes a plurality of buttons 88.
  • the input device 80 includes a Y-rail 92 at least a portion of which comprises a rack 94 engaging a gear 96 driven by a motor 98 with an encoder 100.
  • the mouse 82 is fixed to an end 102 of Y-rail 92.
  • the Y-rail 92 is slidably mounted for movement along the Y-axis on X-slide 106.
  • the X-slide 106 is slidably mounted for movement along the X-axis on X-rail 108 between a pair of stops 110.
  • a belt 112 driven by a motor 116 having an encoder 118 imparts a force along the X-axis to the X-slide 106 which transmits the force to the Y-rail 92 and the mouse 82.
  • the mouse 82 need not actually make direct contact with the surface 84 because movement of the mouse 82 is sensed by encoders 100, 118 via the motors 98, 116.
  • the gear 96 driven by motor 98 engages the rack portion 94 of Y-rail 92 through an opening 120 in X-slide 106.
  • the Y-rail 92 is slidably mounted on the X-slide 106 for relative motion along the Y-axis.
  • the X-slide 106 is slidably mounted on the X-rail 108 to provide relative motion in the X-direction.
  • the motors 98, 116 and encoders 100, 118 are the same as those described in the first embodiment.
  • synchromesh cable could be utilized instead of the rack 94 and gear 96.
  • movement of the mouse 82 along the surface 84 generates a signal to be sent to the CPU 26 from the encoders 100, 118.
  • Movement of the mouse 82 along the Y-axis causes the rack portion 94 of the Y-rail 92 to drive the gear 96 and the motor 98.
  • Figures 7-13 indicate the methods of controlling the motors 98, 116 to impart a force upon the mouse 82.
  • Figures 7-13 will be described specifically with respect to the embodiment shown in Figures 4-6 for purposes of illustration only. These methods are equally applicable to the embodiment shown in Figures 1-3.
  • Figures 7-10 are graphs of the output of the motors 98, 116 based upon input from the mouse 82 through the encoders 100, 118.
  • Figures 11-13 indicate the output of motors 98, 116 as seen by the mouse 82 based upon the position of a cursor 36 on the display 24.
  • two or more of the forces indicated by the graphs in Figures 7-13 can be superimposed to provide several effects simultaneously.
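The superposition of effects can be sketched as below. The effect functions and gain values are illustrative assumptions; only the principle, summing the forces of all active effects into one motor command, comes from the patent.

```python
def superpose(effects, v):
    """Net motor command for one axis: the sum of the forces from every
    active effect, each a function of the current mouse state."""
    return sum(effect(v) for effect in effects)

# Illustrative effects (names and gains are assumptions, not the patent's):
viscous = lambda v: -0.5 * v                                  # Figure 10: opposes velocity
friction_assist = lambda v: 0.2 if v > 0 else (-0.2 if v < 0 else 0.0)  # Figure 7

net = superpose([viscous, friction_assist], 2.0)
```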
  • the input device 80 preferably includes compensation for friction which would be experienced by the Y-rail 92 and the X-rail 108. As is known, friction generates a force opposite the direction of the movement of the mouse 82.
  • the CPU 26 compensates for this force of friction by controlling the servo drives 42 to send current and voltage to the motors 98, 116 sufficient to cancel the force of friction.
  • the CPU 26 monitors the velocity of the mouse 82 by monitoring the signal from the encoders 100, 118. As can be seen from Figure 7, when the mouse 82 begins to move, the motors 98, 116 impart a generally constant force in the direction of movement, offsetting the force of friction.
  • An alternative method for compensating for friction is indicated by the graph shown in Figure 8.
  • the velocity of the mouse 82 is continuously monitored by the CPU 26.
  • the CPU 26 controls the servo drive 42 and motors 98, 116 to impart a force upon the mouse 82 in the same direction as the input velocity V_in measured from the mouse 82. Again this friction compensation would be performed independently along the X-axis and Y-axis by the motors 98, 116.
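The two friction-compensation methods can be sketched as below. The friction level, gain and deadband values are illustrative assumptions; the patent's Figures 7 and 8 give only the shapes of the curves.

```python
def friction_comp_stepped(v, f_c=0.15, deadband=1e-3):
    """Figure 7: once motion is detected, add a constant force equal to
    the friction level, in the direction of motion."""
    if abs(v) < deadband:
        return 0.0
    return f_c if v > 0 else -f_c

def friction_comp_linear(v, gain=0.3, f_c=0.15):
    """Figure 8: ramp the assisting force up linearly with velocity,
    saturating at the friction level, which avoids chatter near zero."""
    return max(-f_c, min(f_c, gain * v))
```

Each axis would run its own instance of one of these, since the friction along the X-rail and Y-rail differs.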
  • the input device 80 preferably also compensates for the inertia of the mouse 82 and associated hardware.
  • the CPU 26 monitors the acceleration, both positive and negative, of the mouse 82 by monitoring the encoders 100, 118. The CPU 26 then calculates a force F_out which, based upon the mass of the mouse 82 and hardware moveable along the appropriate axis, would result in an acceleration equal to or slightly less than the acceleration input, A_in.
  • This inertia compensation provides both negative and positive acceleration to the mouse 82.
  • the inertia compensation is also implemented independently on the X-axis and Y-axis, based upon the characteristics of the specific hardware, including mass, which is moveable along that axis.
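The inertia compensation can be sketched as below. The `scale` factor and per-axis mass values are illustrative assumptions; the patent states only that the assisting force corresponds to an acceleration equal to or slightly less than the measured one.

```python
def inertia_comp(a_in, m_axis, scale=0.9):
    """Figure 9: output a force F = scale * m * a, where m is the mass
    moveable along this axis (mouse plus slide/rail hardware).
    scale < 1 keeps the assist slightly below full cancellation."""
    return scale * m_axis * a_in
```

Because the hardware moveable along each axis differs (the Y-axis carries the Y-rail only, while the X-axis also carries the X-slide), `m_axis` would differ per axis.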
  • Figure 10 indicates a method for providing a viscous force feedback to the mouse 82.
  • the force generated by the motors 98, 116 in a manner described above is imparted in a direction opposite to the movement of the mouse 82. Further, the imparted force is increased in magnitude as the velocity V_in of the mouse 82 increases, thereby creating a "viscous" feel as the user moves the mouse 82.
  • Figures 11-13 indicate methods for imparting a force upon the mouse 82 wherein the force depends upon the position of the cursor 36 on the display 24, relative to other objects 38 on the display 24. Although the graphs will be described with respect to displacement along the X-axis, the same methods would be used for imparting a force upon the mouse 82 in the Y-axis.
  • Figure 11 indicates force imparted on the mouse 82 in the X-axis based upon the position of the cursor 36 relative to an object 38 on the display 24.
  • motion of the cursor 36 across the display 24 is free from any force resistance until the cursor 36 contacts screen object 38, which in this case would "feel" like the cursor 36 is contacting a wall.
  • the motor 52 imparts a force opposite the direction of the movement of the cursor 36.
  • the mouse 82 is moved through the screen object 38 against the resistive force until the cursor 36 is past the center of the screen object 38, where the resistive force reverses and pushes the cursor 36 out the opposite side of the screen object 38.
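The "wall" profile of Figure 11 can be sketched along one axis as below. The force magnitude `k` and the hard switch at the object's center are illustrative assumptions; the figure shows only the qualitative shape.

```python
def wall_force(x, x0, x1, k=2.0):
    """Figure 11: zero force outside the object spanning [x0, x1];
    inside, a force toward the nearer edge, so the near half resists
    entry and the far half pushes the cursor out the opposite side
    once it is past center."""
    if x <= x0 or x >= x1:
        return 0.0
    center = 0.5 * (x0 + x1)
    return -k if x <= center else k   # negative pushes back toward x0
```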
  • Figure 12 indicates a method for providing the feel of gravity or a potential well.
  • the force F_out provides an attractive force toward the center of a screen object 38.
  • the motor 52 first provides a force toward the center of the screen object 38 in the same direction as the movement of the mouse 82.
  • the motor 52 provides a high resistive force, which gradually decreases as the cursor 36 moves off of the screen object 38. In this manner, the screen object 38 "feels" like a detent or potential well.
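The "gravity" or potential-well profile of Figure 12 can be sketched as below. The spring-like linear pull is an illustrative assumption; the figure requires only that the force always points toward the object's center while the cursor is on the object.

```python
def well_force(x, x0, x1, k=1.5):
    """Figure 12: inside the object spanning [x0, x1] the force always
    points toward its center, assisting entry and resisting exit, so
    the object feels like a detent or potential well."""
    if x < x0 or x > x1:
        return 0.0
    center = 0.5 * (x0 + x1)
    return k * (center - x)           # pull toward the center
```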
  • a similar method imparts a force upon the mouse 82 when the cursor 36 is moved along the Y-axis across the screen object 38 or from right to left across the screen object 38. Generally, within the screen object 38, a force is imparted on the mouse 82 toward the center of the screen object 38. As shown in Figure 13, the motor 52 can be used to impart a force to the mouse 82 simulating different levels of friction.
  • the motor 52 could impart zero force resistance in a first area 130, slight resistance force in the second area 132, high resistance force in a third area 134 and return to zero resistive force in a fourth area 136.
  • the first, second, third and fourth areas 130, 132, 134, 136 could be different screen objects 38.
  • the user can "feel" the different screen objects 38 on the display 24.
  • the same method is used for movement of the mouse along the Y-axis.
  • another effect such as viscosity, or some combination of effects could be used other than friction.
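The variable-friction areas of Figure 13 can be sketched along one axis as below. The zone boundaries and friction coefficients are illustrative assumptions chosen to mirror areas 130, 132, 134 and 136.

```python
def area_friction_force(x, v, areas):
    """Figure 13: different screen areas impose different friction
    levels. areas maps an X interval to a friction coefficient; the
    output force opposes the current velocity, scaled by the coefficient
    of whichever area contains the cursor."""
    sign = 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)
    for x0, x1, mu in areas:
        if x0 <= x < x1:
            return -mu * sign
    return 0.0

# Illustrative zones for areas 130/132/134/136 (values are assumptions):
zones = [(0, 10, 0.0), (10, 20, 0.1), (20, 30, 0.5), (30, 40, 0.0)]
```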
  • Figure 14 shows one possible screen 140 to be displayed by display 24 including a pull down menu 142 having side edges 144.
  • the screen 140 further includes a plurality of icons 148 and a window 150 having edges 152 about its periphery.
  • the window 150 further includes a pair of slider bars 154 having side edges 155 for scrolling through the information displayed in window 150, either by clicking on one of a plurality of arrows 156, or by dragging one of the two boxes 158, in a manner well known to computer users.
  • the window 150 may display simulated objects 160 and 162, such as in a game or a CAD or word-processing program. The simulated objects 160, 162 can be dragged or rotated utilizing the cursor 36 in a manner common to many computer programs.
  • the motors 98, 116 preferably operate to compensate for friction in the computer input device 80 utilizing either a constant stepped force output as shown in Figure 7 or a linear force output preload as shown in Figure 8. Further, the motors 98, 116 also compensate for inertia as described with respect to the graph shown in Figure 9.
  • the side edges 144 of the pull down menu 142 are simulated elastic bands utilizing force output proportional to the distance from the center of the pull down menu 142 in the X-direction only.
  • each item selectable from the menu provides an elastic resistance force against the mouse 82 as the mouse 82 is moved in the Y-direction.
  • each item in the menu provides an elastic force attracting the cursor 36 toward the center of each item, until the cursor 36 is moved onto another item. In this manner, the user can "feel" the pull down menu 142 and the items on the menu 142 and avoid inadvertently slipping off the left or right edges 144 of the pull down menu 142.
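The simulated elastic bands at the menu's side edges 144 can be sketched as below. The gain `k` is an illustrative assumption; the patent specifies a force proportional to the distance from the menu's center in the X-direction only.

```python
def menu_edge_force(x, menu_x0, menu_x1, k=0.8):
    """Side edges 144 as elastic bands: while the cursor is within the
    pull-down menu's vertical extent, a force proportional to the X
    distance from the menu centerline pulls the mouse back toward it,
    resisting a slip off either edge."""
    center = 0.5 * (menu_x0 + menu_x1)
    return -k * (x - center)
```

The same spring form, applied per menu item along the Y-axis, would give the item-snapping attraction described above.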
  • edges 152 of the window 150 are also “walls" as shown in Figure 11.
  • the left and right edges 152 form walls in the X-axis and the top and bottom edges 152 of the window 150 form walls along the Y-axis. In this manner, the user can "feel" the periphery of the window 150.
  • the slider bars 154 operate similarly to the pull down menu 142. Once the user begins to drag a box 158, the side edges 155 of the slider bar 154 become "walls" as shown in the graph of Figure 11. In this manner, the user can "feel" the slider bar 154, moving the cursor 36 and box 158 freely lengthwise along the slider bar 154 axis while preventing the user from inadvertently moving the cursor off of the slider bar 154.
  • the computer interface 20 further preferably provides icons 148 which simulate "gravity" or a potential well or detent. By moving the mouse 82, the user moves the cursor 36 near one of the icons 148.
  • the motors 98, 116 impart a force on the mouse 82 toward the center of the icon 148. While the mouse 82 is being moved away from the center of the icon 148, the motors 98, 116 impart a force toward the center of the icon 148, offering resistance. When the cursor 36 is moved off of icon 148, this force is eliminated. In this manner, the user can "feel" the icons 148. Further, this will assist the user in double clicking on the icon 148 without inadvertently dragging the icon 148 between clicks.
  • moving the cursor 36 across the screen 140 can provide different areas of friction resistance.
  • the screen 140 outside of window 150 provides zero friction area as in area 130 of Figure 13.
  • the window 150 provides a second area 132 of slight friction.
  • the object 162 is the third area 134 of high friction and the center 164 of the object 160 is the fourth area 136 of zero friction.
  • the computer input device 80 would provide more than one of the force outputs in Figures 7-13, with the multiple force graphs superimposed upon each other.
  • the computer interface 20 of the present invention provides a relatively inexpensive, simple computer input device 22, 80 imparting a force output on a mouse 28, 82 which is based upon the movement of the mouse 28, 82, thereby providing friction and inertia compensation and the simulation of effects such as viscosity.
  • the computer interface 20 further provides force feedback relative to the position of a cursor 36 on a display 24, thereby enabling a user to "feel" the objects 38 on the display 24.
  • the memory 34 of the CPU 26 includes an operating system 170 which, among other things, controls the contents of display 24.
  • the operating system 170 is Microsoft Windows 3.1 or Microsoft Windows '95, or Macintosh OS8, their subsequent versions, or equivalent operating systems.
  • the operating system 170 includes a system metrics file 172 which contains information regarding the characteristics of common screen objects 38, such as the size, shape, location of specific types of screen objects 38.
  • the operating system 170 also includes an OS data structure 174 containing the specific coordinates for each of the screen objects 38.
  • the OS data structure 174 is a tree data structure as is utilized in Microsoft Windows 3.1, Windows '95 and Macintosh OS8.
  • the display 24 displays the screen objects 38 based upon information in the data structure 174 and system metrics file 172. Changes to the data structure 174 result in corresponding changes to the screen objects 38 and the display 24.
  • the card 40 sends data to and receives commands from an I/O board driver 178 in communication with a mouse driver 180.
  • the card 40 could alternately be located on the input device 22, 80. It should be recognized that the I/O board driver 178 and mouse driver 180 could be separate, as shown, or combined.
  • the mouse driver 180 includes an X register 182 and a Y register 184 containing the current coordinates of the cursor 36 on the display 24 as modified by signals from the encoders 100, 118 on the input device 22.
  • the operating system 170 periodically interrogates the mouse driver 180 generally at regular intervals to determine the coordinates stored in the X register 182 and the Y register 184. The operating system 170 then causes the display 24 to display the cursor 36 at the current coordinates.
  • the operating system 170 also determines the location of the cursor 36 with respect to the screen objects 38.
  • a configuration utility 188 permits the user to selectively define certain values for the operation of the input device 22. For example, a user can define the magnitudes of feedback forces for different types of screen objects 38. Further, the user can define which of the different types of force feedback, detailed above, are to be associated with types of screen objects 38. In this manner, a user can customize the computer interface 20 to personal preferences. Data indicating these personal preferences is stored in the configuration data 190.
  • a texture mapper 194 creates a layering object 196 (a second data structure) which associates areas on the display 24 with one or more effects. The areas preferably coincide (exactly or at least partially) with the areas occupied by the screen objects 38.
  • the layering object 196 is preferably a tree data structure wherein each sub branch contains coordinates within the parent branch.
  • the ultimate parent branch includes coordinates for the entire desktop.
  • a child branch of the desktop parent branch may be a window occupying an area included within the desktop.
  • the window includes a plurality of screen objects 38 each occupying an area included within the parent window.
  • a child branch need not be totally contained within its parent branch.
  • the layering object 196 is preferably a double-linked list, thereby providing layer ordering. Essentially, the order of the linked list is analogous to the Z-order of all pop-up windows to the desktop window. Therefore, when a hidden application receives the focus, its layer object moves to the front of the linked list.
  • each area includes preferably at least one associated effect.
  • the entire desktop includes inertia and friction compensation.
  • the texture mapper 194 must construct the entire layering object 196.
  • most, if not all, of the layering object 196 is generated by the texture mapper 194 sending API calls to the operating system 170.
  • the texture mapper 194 can determine what screen objects 38 are displayed, how many screen objects 38, such as windows, are displayed, the size of the objects, the location of the objects, the relationship with other objects (i.e. parent and child windows), special windows, Z-order, etc. Other information, such as menu item size can be determined by accessing the operating system data structure 174.
  • the layering object 196 comprises a complete second data structure associating the desktop and each area on the screen which corresponds to a screen object 38 displayed on display 24, with preferably at least one effect.
  • the layering object 196 must be updated as the display 24 changes.
  • a hook 198 in the operating system 170 flags events which cause the texture mapper 194 to update the layering object 196. For example, movement or resizing of windows, movement of screen objects 38, icons, opening or closing windows, changes to the Z-order, cause the hook 198 to indicate to the texture mapper 194 that the layering object 196 should be updated.
  • An effect mapper 200 determines which effect to apply given the current position of the cursor 36 on the display 24.
  • the effect mapper 200 interrogates the mouse driver 180 and X and Y registers 182, 184 periodically, preferably every 10 ms, for the current coordinates of the cursor 36.
  • the effect mapper 200 indexes the layering object 196 utilizing the current coordinates of the cursor 36. Since each child branch of the layering object 196 can only include coordinates within its parent branch, if the coordinates of the cursor 36 are not located within a parent branch, the effect mapper 200 will skip to the next parent branch. If the current coordinates of the cursor 36 are located within the parent branch, the effect mapper 200 begins searching through the child branches to determine the effect area. When the effect mapper 200 determines that the current coordinates of the cursor 36 are located within an effect area, the effect mapper 200 reads the effect or effects to be applied from the layering object 196, then determines any modifications, such as to the magnitude of the forces to be applied, from the configuration data 190 and calculates a resultant output signal based upon the superposition of all the effects to be applied. This resultant output signal is sent to the I/O board driver 178.
  • the I/O board driver 178 and the computer card 40 drive the input device 22 based upon the resultant output signal.
  • the computer card 40 sends a signal to the input device 22 to generate a force feedback output as described above based upon the superposition of the associated effects as well as customized preferences.
  • the effect mapper 200 controls a pointer to the current layer object so that search time is reduced. If the cursor 36 is within a specified layer, then it is most likely to be contained within that same layer while it is being moved.

Abstract

A computer interface (20) includes an input device (22) and display (24) connected to a CPU (26). The input device (22) generally comprises a mouse (28) movable along an X-axis or Y-axis, resulting in a corresponding movement of a cursor (36) on the display (24). A pair of motors impart a force upon the mouse (28) along the X-axis and the Y-axis. The motors provide force feedback based upon movement of the mouse (28) and based upon interaction of the cursor (36) with objects (38) displayed on the display (24).

Description

FORCE FEEDBACK MOUSE
This application claims priority to provisional patent application Serial No. 60/024,425 filed August 20, 1996 and is a continuation-in-part of U.S. Patent Application serial number 08/802,581 filed February 19, 1997.
BACKGROUND OF THE INVENTION
The present invention relates generally to a computer interface and more particularly to a mouse having force feedback.
In the known computer interface, movement of a computer mouse in an X- and/or Y-direction on a table moves a cursor or other graphical element on a computer display in a corresponding direction. As is well known, the user operates numerous functions on a graphical user interface, such as pull down menus, activating icons, scrolling windows, etc. by moving the mouse and selectively activating a button on the mouse. The known computer mouse does not provide tactile or force feedback relating to the interaction between the cursor and computer generated objects on the screen, i.e. the user cannot "feel" the objects displayed on the screen. As a result, many people have difficulty operating a computer mouse.
For example, many people have difficulty "double clicking" on a computer icon because they inadvertently move the mouse while clicking twice, thereby dragging, rather than double clicking, the icon. The known computer mouse is particularly difficult to operate for the visually impaired or those with poor motor skills, poor hand-eye coordination or those with muscular or nervous disorders.
One proposed computer input device offers force feedback relating to the cursor interaction with objects on the computer screen. That device utilizes electromagnetic flat coil actuators to generate electromagnetic forces on a handle. However, the electromagnetic flat coil actuators utilized in this computer input device are expensive and generate strong magnetic fields which interfere with the operation of the computer or which could damage computer disks. This computer input device requires an additional computer dedicated solely to controlling the input device.
United States Patent Number 4,604,016 discloses a hand controller having force feedback for teleoperation of a tool for surgery. The forces encountered by the tool are translated by a computer to torque motors, thereby providing a real time interactive feedback response enabling a surgeon to "feel" an operation. The position and orientation of the controller are determined by the lengths of twelve lines between the controller and the support structure. The twelve control lines are interconnected with the plurality of torque motors which are variably programmed by the computer to apply tension to each of the lines based upon the force encountered by the tool. This device is large and requires a large number of control lines and motors. The numerous control lines and motors complicate programming of software applications which could utilize the input device. Play in the numerous control lines and friction reduce the precision of the response and feedback of the device. The patent does not disclose force feedback based upon the interaction between a cursor and objects on a computer screen or force feedback based upon the movement of the input device. Although a force feedback mouse for use in a graphical user interface has been proposed, it has not been implemented in a commercially available operating system. Further, operation of the proposed force feedback mouse has required the use of an additional microprocessor controller dedicated to operation of the mouse.
SUMMARY OF THE INVENTION
The present invention provides a computer input device having force feedback which is less expensive and simpler than previously known computer input devices. The computer interface of the present invention further imparts a force on a mouse which is based upon the movement of the mouse, thereby providing friction and inertia compensation and the simulation of such effects as viscosity.
The computer interface generally comprises an input device and a display connected to a CPU. The input device generally comprises a mouse movable in two degrees of freedom (D.O.F.), preferably along an X-axis and a generally perpendicular Y-axis on a surface. Movement of the mouse in the X and Y-directions generally causes a corresponding movement of a cursor on the display. In a first embodiment, the mouse is slidably mounted to a first rail generally parallel to the X-axis. The first rail is in turn slidably mounted on a second rail for movement generally parallel to the Y-axis. A pair of motors and belts impart forces on the mouse along the X and Y axes. An encoder connected to each motor measures the movement of the mouse by the user or by the motors.
In a second embodiment, the mouse is fixed to a rail having a rack engaged by a gear and a motor imparting a force parallel to the Y-axis. The mouse, rail, rack, gear and motor are slidably mounted to a first rail for movement in the X-direction. A motor and belt impart a force on the mouse generally parallel to the X-axis. Each of the motors include a sensor for indicating the displacement of the mouse, preferably an encoder, from which velocity and acceleration, including direction, can be calculated.
Each of the motors, in either embodiment, impart force on the mouse along its associated axis based upon movement of the mouse. For example, a motor imparts a force upon the mouse to compensate for friction when it detects motion of the mouse along its associated axis. Each motor also imparts a force corresponding to detected acceleration of the mouse in order to compensate for inertia along the associated axis. Further, in order to provide a "viscous" feel in some areas of the display, the motors selectively impart a force upon the mouse which is generally linearly proportional to the detected velocity of the mouse, opposite the direction of the detected velocity. In order to calibrate the input device in either embodiment, the mouse is driven by each of the motors to extreme positions along either axis until the mouse or its associated hardware contacts a stop. The CPU and input device detect the impact of the mouse with a stop at each extreme, thereby defining the range of motion of the mouse along each axis to calibrate the motion of the mouse with the motion of a cursor on a display. Alternatively the stops could comprise limit switches.
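The calibration sequence described above, driving the mouse against a mechanical stop at each extreme of an axis, can be sketched as follows. The `drive`, `read_encoder` and `stalled` callbacks are hypothetical placeholders for the servo drive and encoder interfaces, not part of the disclosed hardware.

```python
def calibrate_axis(drive, read_encoder, stalled, force=1.0):
    """Drive the mouse to both mechanical stops on one axis and return
    the encoder counts at each extreme.  The pair (low, high) defines
    the range of travel used to map mouse motion to cursor motion."""
    drive(-force)                 # push toward the negative stop
    while not stalled():
        pass                      # wait for the impact to be detected
    low = read_encoder()          # encoder count at the negative stop
    drive(+force)                 # push toward the positive stop
    while not stalled():
        pass
    high = read_encoder()         # encoder count at the positive stop
    drive(0.0)                    # release the motor
    return low, high
```

The same routine would be run once per axis; with limit switches, `stalled` would simply read the switch state instead of detecting an impact.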
The computer interface also provides force feedback based upon the interaction of a cursor on a display. For example, in a graphical user interface, the user can "feel" icons, windows and menus on a display. The motors also assist the user in operating the graphical user interface, such as by preventing the cursor from inadvertently sliding off the side of a pull down menu or a scroll bar thumb.
BRIEF DESCRIPTION OF THE DRAWINGS
The above, as well as other advantages of the present invention, will become readily apparent to those skilled in the art from the following detailed description of a preferred embodiment when considered in the light of the accompanying drawings in which:
Figure 1 is a schematic of the computer interface of the present invention.
Figure 2 is a top view of the computer input device shown in Figure 1.
Figure 3 is a sectional view taken along line 3-3 of Figure 2. Figure 4 is a perspective view of an alternate input device to use with the computer interface of Figure 1.
Figure 5 is a top view of the input device of Figure 4, partially broken away.
Figure 6 is a sectional view of the input device of Figure 5 taken along line 6-6. Figure 7 is a graph showing a method for compensating for friction for the computer input device of Figure 1 or Figure 4. Figure 8 is a graph of an alternate method for compensating for friction in the computer input device shown in Figure 1 or Figure 4.
Figure 9 is a graph of a method for compensating for inertia in the computer input device of Figure 1 or Figure 4. Figure 10 is a graph of a method for providing a viscous force feedback for the computer input device of Figure 1 or Figure 4.
Figure 11 is a graph of a method for providing the feel of a wall in the computer interface of Figure 1.
Figure 12 is a graph of a method for providing the feel of gravity or a potential well for the computer interface of Figure 1.
Figure 13 is a graph of a method for providing variable friction areas in the computer interface of Figure 1.
Figure 14 is one potential screen displayed by the display of Figure 1.
Figure 15 is a schematic of the software operating the computer interface of Figure 1.
DESCRIPTION OF A PREFERRED EMBODIMENT
The present invention provides a computer interface 20 including an input device 22 and a display 24 connected to a CPU 26 as shown in Figure
1. The input device 22 generally includes a mouse 28 moveable in an X-direction and a Y-direction on surface 30 and including user activated buttons 32. The CPU 26 includes a microprocessor 33, memory 34 such as RAM and ROM and a mass storage 35 such as a hard drive or CD-ROM, tape, etc. The memory 34 and storage 35 are programmed such that the microprocessor 33 sends and receives signals to and from the input device 22. Movement of the mouse 28 in the X and Y-directions typically results in a corresponding motion of a cursor 36 on the display 24 interacting with screen objects 38, such as simulated objects, or graphical user interface items such as menus, windows, slider bars, etc. "Cursor" 36 as used herein refers to any object on the display
24 which is directly controlled by movement of the mouse 28. A cursor 36 in a word-processing application will differ from that utilized in graphics applications, games, file-management applications, etc. Force feedback to the mouse 28 is controlled by a computer card 40 having a pair of servo drives 42. Referring to Figure 2, the mouse 28 is mounted to an X-slide 44 which is slidably mounted on an X-rail 46 for movement along the X-axis. A non-slip belt 50 driven by a motor 52 imparts force on the mouse 28 and the X-slide 44 in either direction along the X-axis. The motor 52 includes a sensor 54 indicating the amount of movement of the mouse 28, preferably an encoder 54 or alternatively a resolver. The mouse 28 and X-slide are moveable along the X-rail between a stop 56 on an end of the X-rail 46 and a Y-slide 60 secured to the X-rail 46.
The Y-slide 60 is slidable in either direction along a Y-rail 62 extending along a Y-axis generally perpendicular to the X-axis. The motor 52, belt 50, X-rail 46, X-slide 44 and mouse 28 are all mounted to the Y-slide for movement together along the Y-axis between stops 64 at either end of the Y-rail 62. A belt 66 driven by a motor 68 having a encoder 70 imparts force on the Y-slide in either direction along the Y-axis.
Belts 50, 66 are preferably synchromesh cable, available commercially from Stock Drive Products. The motors 52, 68 have a peak torque of at least
5 ounce-inches, and most preferably 10 ounce-inches. The motors 52, 68 are powered by servo drives 42 which preferably utilize pulse-width modulation. As is known for servo-controlled motors, the servo drives 42 regulate the current and voltage supplied to the motors 52, 68 and may at times monitor the consumption of current and voltage by the motors 52, 68 in order to precisely control the force generated by the motors 52, 68. The encoders 54, 70 preferably have a resolution of at least 1000 pulses per revolution and utilize quadrature decoding. Unlike a conventional mouse, the mouse 28 need not actually make direct contact with the surface 30 because movement of the mouse 28 is sensed by the encoders 54, 70 via the motors 52, 68. As can be seen in Figure 3, the belt 50 is looped around a pair of pulleys 72 and secured at either end to the X-slide 44. A single motor 52 directly driving one of the pulleys 72 drives the X-slide 44 in either direction along the X-axis. The Y-axis motor 68 having encoder 70 drives the belt 66 which imparts a force on the mouse 28 along the Y-axis.
In operation, movement of the mouse 28 along either the X-axis, Y-axis or a combination drives belts 50, 66 rotating motors 52, 68 and encoders 54, 70. The encoders 54, 70 generate signals indicative of the displacement and direction of the mouse 28. From these signals, the CPU 26 can derive the velocity, acceleration, etc. of the mouse 28. The CPU 26 controls the servo drives 42 to control the motors 52, 68 to impart forces on the mouse 28 as will be described in more detail below. The forces imparted by motors 52, 68 are precisely controlled by servo drives 42. The servo drives 42 may also monitor the power consumption of motors 52, 68 at times. An alternate input device 80 for use in the computer interface 20 of
Figure 1 is shown in Figure 4 generally comprising a mouse 82 movable in two degrees of freedom (D.O.F.), such as along an X-axis and a Y-axis on a surface 84. The surface 84 can be secured to a housing 86 or can be a desktop or tabletop surface 84. It should be appreciated that, unlike a conventional mouse, the mouse 82 need not be located directly on a surface in order to operate, but preferably the mouse 82 is moveable in two degrees of freedom in a plane. The mouse 82 preferably includes a plurality of buttons 88.
As can be seen in Figure 5, the input device 80 includes a Y-rail 92 at least a portion of which comprises a rack 94 engaging a gear 96 driven by a motor 98 with an encoder 100. The mouse 82 is fixed to an end 102 of Y-rail 92. The Y-rail 92 is slidably mounted for movement along the Y-axis on X-slide 106. The X-slide 106 is slidably mounted for movement along the X-axis on X-rail 108 between a pair of stops 110. A belt 112 driven by a motor 116 having an encoder 118 imparts a force along the X-axis to the X-slide 106 which transmits the force to the Y-rail 92 and the mouse 82. Unlike a conventional mouse, the mouse 82 need not actually make direct contact with the surface 84 because movement of the mouse 82 is sensed by encoders 100, 118 via the motors 98, 116.
Referring to Figure 6, the gear 96 driven by motor 98 engages the rack portion 94 of Y-rail 92 through an opening 120 in X-slide 106. As can be seen in Figure 6, the Y-rail 92 is slidably mounted on the X-slide 106 for relative motion along the Y-axis. The X-slide is slidably mounted on the X-rail 108 to provide relative motion in the X-directions. Preferably, the motors 98, 116 and encoders 100, 118 are the same as those described in the first embodiment. Alternatively, synchromesh cable could be utilized instead of the rack 94 and gear 96.
In operation, movement of the mouse 82 along the surface 84 generates a signal to be sent to the CPU 26 from the encoders 100, 118. Movement of the mouse 82 along the Y-axis causes the rack portion 94 of the Y-rail 92 to drive the gear 96 and the motor 98. The encoder 100 connected to the motor
98 generates a signal indicating the displacement of the mouse 82 along the Y-axis. Movement of the mouse 82 along the X-axis moves the X-slide 106 along the X-axis, thereby driving belt 112 and motor 116 and causing the encoder 118 to generate a signal indicating the displacement of the mouse 82 in the X-axis. The CPU 26 controls the servo drives 42 to power the motors
98, 116 to impart forces on the mouse 82, in a manner which will be described in more detail below.
Figures 7-13 indicate the methods of controlling the motors 98, 116 to impart a force upon the mouse 82. Figures 7-13 will be described specifically with respect to the embodiment shown in Figures 4-6 for purposes of illustration only. These methods are equally applicable to the embodiment shown in Figures 1-3. Figures 7-10 are graphs of the output of the motors 98, 116 based upon input from the mouse 82 through the encoders 100, 118. Figures 11-13 indicate the output of motors 98, 116 as seen by the mouse 82 based upon the position of a cursor 36 on the display 24. Generally, two or more of the forces indicated by the graphs in Figures 7-13 can be superimposed to provide several effects simultaneously.
The input device 80 preferably includes compensation for friction which would be experienced by the Y-rail 92 and the X-rail 108. As is known, friction generates a force opposite the direction of the movement of the mouse
82. Preferably the CPU 26 compensates for this force of friction by controlling the servo drives 42 to send current and voltage to the motors 98, 116 sufficient to cancel the force of friction. The CPU 26 monitors the velocity of the mouse 82 by monitoring the signal from the encoders 100, 118. As can be seen from Figure 7, when the mouse 82 begins to move, the motors
98, 116 impart a force on the mouse 82 in the same direction as the velocity of the mouse 82 and of a magnitude equal to the force of the friction. This friction compensation occurs independently along the X-axis and Y-axis.
An alternative method for compensating for friction is indicated by the graph shown in Figure 8. The velocity of the mouse 82 is continuously monitored by the CPU 26. The CPU 26 controls the servo drive 42 and motors 98, 116 to impart a force upon the mouse 82 in the same direction as the input velocity Vin measured from the mouse 82. Again this friction compensation would be performed independently along the X-axis and Y-axis by the motors 98, 116.
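The two friction-compensation profiles of Figures 7 and 8 can be sketched for a single axis as follows; the force and gain values are illustrative only, and the functions return the motor force to command for a measured velocity.

```python
def friction_comp_stepped(v, f_friction=0.05):
    """Figure 7 style: a constant force equal to the friction force,
    applied in the direction of motion; zero when the mouse is still."""
    if v > 0:
        return f_friction
    if v < 0:
        return -f_friction
    return 0.0

def friction_comp_linear(v, gain=0.5, f_max=0.05):
    """Figure 8 style: a force proportional to the measured velocity,
    saturating at the full friction force, again in the direction of
    motion, so compensation ramps in smoothly from rest."""
    f = gain * v
    return max(-f_max, min(f_max, f))
```

Either function would be evaluated independently for the X-axis and Y-axis on each control cycle.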
The input device 80 preferably also compensates for the inertia of the mouse 82 and associated hardware. The CPU 26 monitors the acceleration, both positive and negative, of the mouse 82 by monitoring the encoders 100, 118. The CPU 26 then calculates a force Fout which, based upon the mass of the mouse 82 and hardware moveable along the appropriate axis, would result in an acceleration equal to or slightly less than the acceleration input, Ain. This inertia compensation provides both negative and positive acceleration to the mouse 82. The inertia compensation is also implemented independently on the X-axis and Y-axis, based upon the characteristics of the specific hardware, including mass, which is moveable along that axis. Figure 10 indicates a method for providing a viscous force feedback to the mouse 82. The force generated by the motors 98, 116 in a manner described above is imparted in a direction opposite to the movement of the mouse 82. Further, the imparted force is increased in magnitude as the velocity of the mouse 82 movement Vin increases, thereby creating a "viscous" feel as the user moves the mouse 82.
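The inertia compensation of Figure 9 and the viscous effect of Figure 10 reduce to two simple force laws; the mass and damping constants below are illustrative placeholders.

```python
def inertia_comp(a_in, mass):
    """Figure 9 style: output a force that would itself produce the
    measured acceleration of the moving hardware, cancelling the
    inertia felt by the user (works for positive and negative a_in)."""
    return mass * a_in

def viscous_force(v_in, damping=0.3):
    """Figure 10 style: a force opposing motion, linearly proportional
    to the measured velocity, giving the mouse a 'viscous' feel."""
    return -damping * v_in
```

In practice these outputs would be superimposed with the friction compensation and any cursor-position effects before being sent to the servo drives.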
Figures 11-13 indicate methods for imparting a force upon the mouse 82 wherein the force depends upon the position of the cursor 36 on the display 24, relative to other objects 38 on the display 24. Although the graphs will be described with respect to displacement along the X-axis, the same methods would be used for imparting a force upon the mouse 82 in the Y-axis.
Figure 11 indicates force imparted on the mouse 82 in the X-axis based upon the position of the cursor 36 relative to an object 38 on the display 24. As can be seen from the graph, motion of the cursor 36 across the display 24 is free from any force resistance until the cursor 36 contacts screen object 38, which in this case would "feel" like the cursor 36 is contacting a wall. As the cursor 36 contacts the screen object 38, the motor 52 imparts a force opposite the direction of the movement of the cursor 36. The mouse 82 is moved through the screen object 38 against the resistive force until the cursor 36 is past the center of the screen object 38, where the resistive force reverses and
"pushes" the cursor 36 and mouse 82 off of the screen object 38 where the force returns to zero. Again, preferably two or more of the forces indicated by the graphs in Figures 7-13 would be superimposed in order to compensate for friction and inertia and provide a response relative to the position of the cursor 36 on the display 24.
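The "wall" profile of Figure 11 can be sketched as a one-axis force function of cursor position; the object bounds and peak force are illustrative values, not taken from the patent.

```python
def wall_force(x, left, right, f_max=1.0):
    """Figure 11 style 'wall': zero force outside the screen object;
    inside it, a force pushing back toward the near edge until the
    cursor passes the object's center, then a force pushing it out
    the far side, so the object feels solid from either direction."""
    if x <= left or x >= right:
        return 0.0
    center = (left + right) / 2.0
    return -f_max if x < center else f_max
```

The same profile applied to the Y-axis gives the top and bottom walls of a window.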
Figure 12 indicates a method for providing the feel of gravity or a potential well. In this graph, the force, Fout, provides an attractive force to the center of a screen object 38. As the cursor 36 is moved from left to right across a screen object 38, the motor 52 first provides a force toward the center of the screen object 38 in the same direction as the movement of the mouse
82. The force imparted on the mouse 82 then drops to zero in the center of
the screen object 38. As the cursor 36 continues to move from left to right from the center of the screen object 38, the motor 52 provides a high resistive force, which gradually decreases as the cursor 36 moves off of the screen object 38. In this manner, the screen object 38 "feels" like a detent or potential well. A similar method imparts a force upon the mouse 82 when the cursor 36 is moved along the Y-axis across the screen object 38 or from right to left across the screen object 38. Generally, within the screen object 38, a force is imparted on the mouse 82 toward the center of the screen object 38. As shown in Figure 13, the motor 52 can be used to impart a force to the mouse 82 simulating different levels of friction. For example, moving the cursor 36 from left to right across the display, the motor 52 could impart zero force resistance in a first area 130, slight resistance force in the second area 132, high resistance force in a third area 134 and return to zero resistive force in a fourth area 136. The first, second, third and fourth areas 130, 132, 134, 136 could be different screen objects 38. In this manner, the user can "feel" the different screen objects 38 on the display 24. Again, the same method is used for movement of the mouse along the Y-axis. Alternatively another effect, such as viscosity, or some combination of effects could be used other than friction. Figure 14 shows one possible screen 140 to be displayed by display 24 including a pull down menu 142 having side edges 144. The screen 140 further includes a plurality of icons 148 and a window 150 having edges 152 about its periphery. The window 150 further includes a pair of slider bars 154 having side edges 155 for scrolling through the information displayed in window 150, either by clicking on one of a plurality of arrows 156, or by dragging one of the two boxes 158, in a manner well known to computer users.
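The potential-well profile of Figure 12 and the variable-friction areas of Figure 13 can be sketched as follows; the spring constant, area boundaries and friction levels are illustrative values only.

```python
def well_force(x, left, right, k=0.1):
    """Figure 12 style potential well: inside the screen object the
    force always points toward the center, proportional to the
    distance from it, so the object 'feels' like a detent."""
    if x <= left or x >= right:
        return 0.0
    center = (left + right) / 2.0
    return k * (center - x)      # attractive: sign points at the center

# Figure 13 style: per-area friction levels looked up by position.
# Each entry is (low bound, high bound, friction level).
AREAS = [(0, 100, 0.0), (100, 200, 0.02), (200, 300, 0.10), (300, 400, 0.0)]

def friction_level(x):
    """Return the friction level of the area containing position x."""
    for lo, hi, level in AREAS:
        if lo <= x < hi:
            return level
    return 0.0
```

The looked-up friction level would then scale a friction-style resistive force opposing the mouse velocity in that area.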
The window 150 may display simulated objects 160 and 162, such as in a game or a CAD or word-processing program. The simulated objects 160, 162 can be dragged or rotated utilizing the cursor 36 in a manner common to many computer programs. The operation of the embodiment shown in Figures 4-6 of the computer input device 80 will be described with respect to Figure 14 for purposes of illustration only; operation of the embodiment shown in Figures 1-3 would be identical. Preferably, movement of the mouse 82 over surface 84 along the X-axis or Y-axis causes a corresponding movement of cursor 36 on the screen
140 of display 24. Generally, the motors 98, 116 preferably operate to compensate for friction in the computer input device 80 utilizing either a constant stepped force output as shown in Figure 7 or a linear force output preload as shown in Figure 8. Further, the motors 98, 116 also compensate for inertia as described with respect to the graph shown in Figure 9.
Preferably the side edges 144 of the pull down menu 142 are simulated elastic bands utilizing force output proportional to the distance from the center of the pull down menu 142 in the X-direction only. Further, each item selectable from the menu provides an elastic resistance force against the mouse 82 as the mouse 82 is moved in the Y-direction. As the mouse 82 moves the cursor 36 along the Y-direction across the pull down menu 142, each item in the menu provides an elastic force attracting the cursor 36 toward the center of each item, until the cursor 36 is moved onto another item. In this manner, the user can "feel" the pull down menu 142 and the items on the menu 142 and avoid inadvertently slipping off the left or right edges 144 of the pull down menu 142.
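The elastic-band behavior of the pull down menu's side edges, a force proportional to the cursor's distance from the menu center in X, is a linear spring; the spring constant below is an illustrative placeholder.

```python
def elastic_edge_force(x, menu_center_x, k=0.05):
    """Pull down menu side edges as elastic bands: the further the
    cursor strays from the menu's horizontal center, the harder the
    restoring force pulling it back toward the center."""
    return -k * (x - menu_center_x)
```

The per-item attraction along the Y-axis would use the same spring law, centered on each menu item in turn.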
Similarly, the edges 152 of the window 150 are also "walls" as shown in Figure 11. The left and right edges 152 form walls in the X-axis and the top and bottom edges 152 of the window 150 form walls along the Y-axis. In this manner, the user can "feel" the periphery of the window 150.
The slider bars 154 operate similarly to the pull down menu 142. Once the user begins to drag a box 158, the side edges 155 of the slider bar 154 become "walls" as shown in the graph of Figure 11. In this manner, the user can "feel" the slider bar 154, moving the cursor 36 and box 158 freely lengthwise along the slider bar 154 axis while preventing the user from inadvertently moving the cursor off of the slider bar 154. The computer interface 20 further preferably provides icons 148 which simulate "gravity" or a potential well or detent. By moving the mouse 82, the user moves the cursor 36 near one of the icons 148. When the cursor 36 is on or near the icon 148, the motors 98, 116 impart a force on the mouse 82 toward the center of the icon 148. While the mouse 82 is being moved away from the center of the icon 148, the motors 98, 116 impart a force toward the center of the icon 148, offering resistance. When the cursor 36 is moved off of icon 148, this force is eliminated. In this manner, the user can "feel" the icons 148. Further, this will assist the user in double clicking on the icon 148 without inadvertently dragging the icon 148 between clicks.
Alternatively, or in addition to those effects described above, moving the cursor 36 across the screen 140 can provide different areas of friction resistance. For example, the screen 140 outside of window 150 provides zero friction area as in area 130 of Figure 13. The window 150 provides a second area 132 of slight friction. The object 162 is the third area 134 of high friction and the center 164 of the object 160 is the fourth area 136 of zero friction. Again, the computer input device 80 would provide more than one of the force outputs in Figures 7-13, with the multiple force graphs superimposed upon each other. The computer interface 20 of the present invention provides a relatively inexpensive, simple computer input device 22, 80 imparting a force output on a mouse 28, 82 which is based upon the movement of the mouse 28, 82, thereby providing friction and inertia compensation and the simulation of effects such as viscosity. The computer interface 20 further provides force feedback relative to the position of a cursor 36 on a display 24, thereby enabling a user to "feel" the objects 38 on the display 24.
A schematic for operating the computer interface 20 is shown in Figure 15. The memory 34 of the CPU 26 includes an operating system 170 which, among other things, controls the contents of display 24. Preferably the operating system 170 is Microsoft Windows 3.1 or Microsoft Windows '95, or Macintosh OS 8, their subsequent versions, or equivalent operating systems. As is known, the operating system 170 includes a system metrics file 172 which contains information regarding the characteristics of common screen objects 38, such as the size, shape and location of specific types of screen objects 38. The operating system 170 also includes an OS data structure 174 containing the specific coordinates for each of the screen objects 38.
Preferably, the OS data structure 174 is a tree data structure as is utilized in Microsoft Windows 3.1, Windows '95 and Macintosh OS 8. The display 24 displays the screen objects 38 based upon information in the data structure 174 and system metrics file 172. Changes to the data structure 174 result in corresponding changes to the screen objects 38 and the display 24.
The card 40 sends data to and receives commands from an I/O board driver 178 in communication with a mouse driver 180. The card 40 could alternately be located on the input device 22, 80. It should be recognized that the I/O board driver 178 and mouse driver 180 could be separate, as shown, or combined. The mouse driver 180 includes an X register 182 and a Y register 184 containing the current coordinates of the cursor 36 on the display 24 as modified by signals from the encoders 100, 118 on the input device 22. As is known, the operating system 170 periodically interrogates the mouse driver 180 generally at regular intervals to determine the coordinates stored in the X register 182 and the Y register 184. The operating system 170 then causes the display 24 to display the cursor 36 at the current coordinates. The operating system 170 also determines the location of the cursor 36 with respect to the screen objects 38.
A configuration utility 188 permits the user to selectively define certain values for the operation of the input device 22. For example, a user can define the magnitudes of feedback forces for different types of screen objects 38. Further, the user can define which of the different types of force feedback, detailed above, are to be associated with types of screen objects 38. In this manner, a user can customize the computer interface 20 to personal preferences. Data indicating these personal preferences is stored in the configuration data 190. A texture mapper 194 creates a layering object 196 (a second data structure) which associates areas on the display 24 with one or more effects. The areas preferably coincide (exactly or at least partially) with the areas occupied by the screen objects 38. The layering object 196 is preferably a tree data structure wherein each sub branch contains coordinates within the parent branch. For example, the ultimate parent branch includes coordinates for the entire desktop. A child branch of the desktop parent branch may be a window occupying an area included within the desktop. In turn, the window includes a plurality of screen objects 38 each occupying an area included within the parent window. A child branch need not be totally contained within its parent branch. The layering object 196 is preferably a double-linked list, thereby providing layer ordering. Essentially, the order of the linked list is analogous to the Z-order of all pop-up windows to the desktop window. Therefore, when a hidden application receives the focus, its layer object moves to the front of the linked list.
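The layering object described above, a tree of screen areas with associated effects whose top-level layers are kept in front-to-back order like the window Z-order, can be sketched as follows; the class and field names are hypothetical, not taken from the patent.

```python
class Layer:
    """One node of the layering object: a screen rectangle plus the
    effects active inside it, with child layers nested within (or
    partly within) the parent's area."""
    def __init__(self, rect, effects, children=None):
        self.rect = rect                  # (x, y, width, height)
        self.effects = effects            # e.g. ['friction', 'inertia']
        self.children = children or []

    def contains(self, x, y):
        rx, ry, rw, rh = self.rect
        return rx <= x < rx + rw and ry <= y < ry + rh

# Top-level layers in Z-order, front first, mirroring the linked list
# in the text: the ultimate parent is the desktop itself.
layers = [Layer((0, 0, 640, 480), ['friction', 'inertia'])]

def bring_to_front(layer):
    """When a hidden application receives the focus, its layer moves
    to the front of the list, as the text describes."""
    layers.remove(layer)
    layers.insert(0, layer)
```

A window layer would be appended as a child (or top-level layer) when the hook reports it opening, and `bring_to_front` called when it gains focus.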
Within this layering object 196 each area includes preferably at least one associated effect. As discussed above, preferably, the entire desktop includes inertia and friction compensation. At startup, the texture mapper 194 must construct the entire layering object 196. Preferably, most, if not all, of the layering object 196 is generated by the texture mapper 194 sending API calls to the operating system 170. The texture mapper 194 can determine what screen objects 38 are displayed, how many screen objects 38, such as windows, are displayed, the size of the objects, the location of the objects, the relationship with other objects (i.e. parent and child windows), special windows, Z-order, etc. Other information, such as menu item size can be determined by accessing the operating system data structure 174. When completed, the layering object 196 comprises a complete second data structure associating the desktop and each area on the screen which corresponds to a screen object 38 displayed on display 24, with preferably at least one effect. The layering object 196 must be updated as the display 24 changes.
A hook 198 in the operating system 170 flags events which cause the texture mapper 194 to update the layering object 196. For example, movement or resizing of windows, movement of screen objects 38 such as icons, opening or closing of windows, and changes to the Z-order cause the hook 198 to indicate to the texture mapper 194 that the layering object 196 should be updated. An effect mapper 200 determines which effect to apply given the current position of the cursor 36 on the display 24. The effect mapper 200 interrogates the mouse driver 180 and the X and Y registers 182, 184 periodically, preferably every 10 ms, for the current coordinates of the cursor 36. The effect mapper 200 indexes the layering object 196 utilizing the current coordinates of the cursor 36. Since each child branch of the layering object
196 can only include coordinates within its parent branch, if the coordinates of the cursor 36 are not located within a parent branch, the effect mapper 200 will skip to the next parent branch. If the current coordinates of the cursor 36 are located within the parent branch, the effect mapper 200 begins searching through the child branches to determine the effect area. When the effect mapper 200 determines that the current coordinates of the cursor 36 are located within an effect area, the effect mapper 200 reads the effect or effects to be applied from the layering object 196, then determines any modifications, such as to the magnitude of the forces to be applied, from the configuration data 190, and calculates a resultant output signal based upon the superposition of all the effects to be applied. This resultant output signal is sent to the I/O board driver 178. The I/O board driver 178 and the computer card 40 drive the input device 22 based upon the resultant output signal. The computer card 40 sends a signal to the input device 22 to generate a force feedback output as described above, based upon the superposition of the associated effects as well as customized preferences. Preferably, the effect mapper 200 maintains a pointer to the current layer object so that search time is reduced: if the cursor 36 is within a specified layer, it is most likely to remain within that same layer while it is being moved.

In accordance with the provisions of the patent statutes, the present invention has been described in what is considered to represent its preferred embodiment. However, it should be noted that the invention can be practiced otherwise than as specifically illustrated and described without departing from its spirit.
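The lookup, superposition, and cached-pointer optimization just described can be sketched as below. All names (`find_layer`, `locate`, `resultant`) and the dictionary layout are illustrative assumptions, not the patent's code; the cached-layer check is the heuristic the text motivates, tried first because the cursor usually stays in one layer between 10 ms polls.

```python
def contains(rect, pt):
    """True if point pt = (x, y) lies inside rect = (left, top, right, bottom)."""
    x, y = pt
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom

def find_layer(layer, pt):
    """Depth-first search: skip any branch whose area excludes the cursor,
    and return the deepest containing effect area (frontmost child first)."""
    if not contains(layer["rect"], pt):
        return None
    for child in layer["children"]:        # children kept in front-to-back order
        hit = find_layer(child, pt)
        if hit is not None:
            return hit
    return layer

def locate(root, pt, cached=None):
    """Try the cached current layer first, then fall back to a full search."""
    if cached is not None and contains(cached["rect"], pt):
        hit = find_layer(cached, pt)
        if hit is not None:
            return hit
    return find_layer(root, pt)

def resultant(layer, gains):
    """Superpose all effects at this area; per-effect gains model the
    magnitude adjustments drawn from the user's configuration data."""
    return sum(gains.get(effect, 1.0) for effect in layer["effects"])

# Minimal two-layer example: a desktop branch with one window child.
desktop = {"name": "desktop", "rect": (0, 0, 1024, 768),
           "effects": ["inertia"], "children": []}
window = {"name": "editor", "rect": (100, 100, 500, 400),
          "effects": ["edge", "texture"], "children": []}
desktop["children"].append(window)
```

Note the cached path is only a speed heuristic; when an overlapping sibling sits in front of the cached layer, a full front-to-back search is still needed for a strictly correct answer.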

Claims

WHAT IS CLAIMED IS:
1. A computer interface comprising: a mouse moveable generally in a plane in a first degree of freedom and a second degree of freedom; a display displaying a cursor at a plurality of positions on said display based upon movement of said mouse; and a first motor imparting a force on said mouse along said first degree of freedom.
2. The computer interface of Claim 1 further including: a second motor imparting a force on said mouse along said second degree of freedom.
3. The computer interface of Claim 1 wherein said first degree of freedom is a first axis and said second degree of freedom is a generally perpendicular second axis.
4. The computer interface of Claim 3 wherein said first motor imparts said force in a first direction and an opposite second direction along said first axis.
5. The computer interface of Claim 1 further including a first sensor generating a first signal based upon movement of said mouse along said first degree of freedom.
6. The computer interface of Claim 5 further including a second sensor generating a second signal based upon movement of said mouse along said second degree of freedom.
7. The computer interface of Claim 6 wherein said display moves said cursor on said display based upon said first signal from said first sensor and said second signal from said second sensor.
8. The computer interface of Claim 5 wherein said first motor imparts a force on said mouse based upon said first signal from said first sensor.
9. The computer interface of Claim 5 wherein said first sensor is an encoder generating pulses based upon movement of said mouse.
10. The computer interface of Claim 1 wherein said first motor imparts force on said mouse based upon said position of said cursor on said display.
11. The computer interface of Claim 1 wherein said first motor imparts force on said mouse based upon said movement of said mouse.
12. The computer interface of Claim 2 further comprising: a first rail extending generally parallel to said first axis; and said mouse slidably mounted on said first rail.
13. The computer interface of Claim 12 further comprising: a second rail extending generally parallel to said second axis; and said mouse slidably mounted on said second rail.
14. The computer interface of Claim 13 wherein said second rail is slidably mounted on said first rail for movement generally parallel to said first axis.
15. The computer interface of Claim 12 further including: a second rail mounted to said mouse; and a second motor imparting a force on said second rail and said mouse generally parallel to said second axis.
16. The computer interface of Claim 15 wherein said second rail includes a rack, said second motor driving a gear engaging said rack.
17. The computer interface of Claim 16 wherein said second rail is slidably mounted on said first rail for movement generally parallel to said first axis.
18. The computer interface of Claim 17 wherein said first motor drives a gear engaging a belt to impart force upon said mouse along said first axis.
19. The computer interface of Claim 1 wherein said first motor imparts said force on said mouse based upon a first signal from a first sensor, said force compensating for friction encountered during said movement of said mouse along said first degree of freedom.
20. The computer interface of Claim 1 wherein said force imparted by said first motor is based upon a first signal from a first sensor, said force generally compensating for inertia encountered during said movement of said mouse along said first degree of freedom.
21. The computer interface of Claim 1 further including a first stop and a second stop at opposite extremes of movement of said mouse along said first degree of freedom, said first motor moving said mouse to said first stop and to said second stop along said first degree of freedom, said computer interface further receiving a feedback signal indicating when said mouse abuts said first stop and said second stop, thereby defining a range of motion along said first degree of freedom and calibrating said computer interface along said first degree of freedom.
22. A computer input device comprising: a mouse moveable generally in a plane along a first degree of freedom and a second degree of freedom, movement of said mouse generating a signal to be input to a computer; and a first motor imparting a force on said mouse along said first degree of freedom based upon movement of said mouse along said first degree of freedom.
23. The computer interface of Claim 22 wherein said first motor imparts said force on said mouse based upon a first signal from a first sensor, said force compensating for friction encountered during said movement of said mouse along said first degree of freedom.
24. The computer interface of Claim 23 wherein said first sensor determines a velocity of mouse movement along said first degree of freedom, said first motor imparting force based upon said velocity in order to compensate for friction.
25. The computer interface of Claim 24 wherein said force imparted by said first motor is selected to move said mouse along said first degree of freedom generally at said velocity.
26. The computer interface of Claim 22 wherein said force imparted by said first motor is based upon a first signal from a first sensor, said force generally compensating for inertia encountered during said movement of said mouse along said first degree of freedom.
27. The computer interface of Claim 22 wherein said first sensor determines acceleration along said first degree of freedom, said force imparted by said first motor on said mouse being generally proportional to said acceleration, thereby compensating for inertia.
28. The computer interface of Claim 27 wherein said force increases substantially linearly with said acceleration.
29. The computer interface of Claim 22 further including a first stop and a second stop at opposite extremes of movement of said mouse along said first degree of freedom, said first motor moving said mouse to said first stop and to said second stop along said first degree of freedom, said computer interface further receiving a feedback signal indicating when said mouse abuts said first stop and said second stop, thereby defining a range of motion along said first degree of freedom and calibrating said computer interface along said first degree of freedom.
30. The computer interface of Claim 29 wherein said feedback signal is generated by a first sensor.
31. The computer input device of Claim 22 further comprising: a display displaying at least one computer-generated object, said display modifying said object relative to said display based upon movement of said mouse.
32. The computer input device of Claim 22 further comprising a first rail generally parallel to a first degree of freedom, said mouse slidably mounted on said first rail.
33. A computer interface comprising: a mouse moveable generally in a plane along a first axis and a generally perpendicular second axis; a display displaying a cursor at a plurality of positions on said display based upon movement of said mouse; a first rail extending generally parallel to said first axis; said mouse slidably mounted on said first rail; a first motor imparting a force on said mouse in a first direction and an opposite second direction along said first axis; and a second motor imparting a force on said mouse in a first direction and an opposite second direction along said second axis.
34. The computer interface of Claim 33 wherein said first motor and said second motor impart force on said mouse based upon said movement of said mouse.
35. A method for generating a feedback signal based upon the position of a cursor on a display relative to an object including the steps of: a) determining coordinates of an object displayed on said display; b) generating an effect data structure associating the coordinates of said object with at least one feedback signal; c) monitoring an input signal from an input device; d) changing the coordinates of said cursor based upon said input signal; e) determining the position of said cursor relative to said object on said display by comparing said coordinates of said cursor to said effect data structure; f) generating said at least one feedback signal based upon said position of said cursor relative to said object; g) sending said at least one feedback signal to said input device.
36. The method of Claim 35, wherein said step a) includes the step of generating an API call to an operating system.
37. The method according to Claim 35, wherein said step a) includes the step of utilizing a hook in the operating system to indicate when the coordinates of said object on said display have changed and updating said effect data structure based upon said change in coordinates of said object.
38. The method according to Claim 35, wherein said step a) further includes the step of indexing an OS data structure maintained by an operating system, said display displaying said object based upon said OS data structure.
39. The method according to Claim 35, further including the steps of: monitoring changes in the coordinates of said object; updating said effect data structure based upon a change in coordinates of said object.
40. The method of Claim 35, wherein said effect data structure associates coordinates on said display corresponding to said object with said associated feedback signal.
41. A method for generating a feedback signal based upon a cursor position on a display including the steps of: a) displaying each of a plurality of objects and a cursor on a display; b) determining the coordinates of each said object; c) determining a type of each said object; d) generating an effect data structure associating each said object with at least one of a plurality of effects; e) generating an input signal with an input/output device; f) moving said cursor on said display based upon said input signal; g) determining cursor coordinates of said cursor; h) determining that said cursor is located at coordinates associated with a first object of said plurality of objects by comparing the cursor coordinates with said effect data structure; i) determining at least one effect associated with said first object; j) generating a force output with said input/output device based upon said at least one effect associated with said first object.
42. The method of Claim 41, further including the steps of: determining that at least two effects are associated with said first object; determining a resultant output signal based upon said at least two effects; generating said resultant output signal to said input device; and generating said force output based upon said resultant output signal.
43. A computer interface comprising: a display displaying a plurality of objects based upon said first data structure, said display further displaying a cursor at cursor coordinates; an input device generating an input signal, said display changing said cursor coordinates based upon said input signal; a first data structure indicating an area occupied by each of a plurality of objects; a texture mapper generating a second data structure associating each said area associated with each said object with at least one effect; an effect mapper comparing the cursor coordinates of said cursor with said second data structure to determine the location of said cursor relative to said objects, said effect mapper determining at least one effect to be applied based upon said location of said cursor relative to said objects, said effect mapper generating a first output signal indicating an effect to be applied; said input device receiving said output signal from said effect mapper, said input device generating a feedback signal.
44. The computer interface of Claim 43, wherein said input device is a mouse.
45. The computer interface of Claim 44, wherein said mouse includes at least one encoder generating said input signal.
46. The computer interface of Claim 44, wherein said mouse generates a force feedback based upon said output signal.
47. The computer interface of Claim 46, wherein said mouse includes at least one motor generating said force feedback based upon said output signal.
48. The computer interface of Claim 47, wherein said at least one motor generates said force feedback based upon movement of said mouse.
49. The computer interface of Claim 44, further including: an operating system changing said areas of said objects in said first data structure, said display changing the areas in which said objects are displayed based upon said changes to said first data structure, said texture mapper updating said second data structure based upon said changes.
50. The computer interface of Claim 49, further including a hook in said operating system indicating when said operating system changes said areas of said objects.
51. The computer interface of Claim 49, wherein said texture mapper generates said second data structure by sending API calls to said operating system.
PCT/CA1997/000585 1996-08-20 1997-08-20 Force feedback mouse WO1998008159A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CA002263988A CA2263988A1 (en) 1996-08-20 1997-08-20 Force feedback mouse
AU39361/97A AU3936197A (en) 1996-08-20 1997-08-20 Force feedback mouse

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US2442596P 1996-08-20 1996-08-20
US60/024,425 1996-08-20
US08/802,581 US5990869A (en) 1996-08-20 1997-02-19 Force feedback mouse
US08/802,581 1997-02-19

Publications (2)

Publication Number Publication Date
WO1998008159A2 true WO1998008159A2 (en) 1998-02-26
WO1998008159A3 WO1998008159A3 (en) 1998-08-06

Family

ID=26698429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA1997/000585 WO1998008159A2 (en) 1996-08-20 1997-08-20 Force feedback mouse

Country Status (4)

Country Link
US (1) US5990869A (en)
AU (1) AU3936197A (en)
CA (1) CA2263988A1 (en)
WO (1) WO1998008159A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0903662A2 (en) * 1997-09-17 1999-03-24 Sun Microsystems, Inc. Invisible and one-pixel wide scroll bars
US6243078B1 (en) * 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
EP1450241A3 (en) * 2003-02-20 2007-06-06 Alps Electric Co., Ltd. Force-applying input device
EP1333369A3 (en) * 2001-12-25 2008-10-08 Alps Electric Co., Ltd. Force feedback joystick knob
US10152131B2 (en) 2011-11-07 2018-12-11 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
GR20180100401A (en) * 2018-09-04 2020-05-11 Γιωργος Αθανασιου Χατζηαυγουστιδης Console for tv set, computer and electronic games controlled from a sofa
EP4101414A1 (en) * 2021-06-08 2022-12-14 Kawasaki Jukogyo Kabushiki Kaisha Robotic surgical system

Families Citing this family (74)

Publication number Priority date Publication date Assignee Title
US5691898A (en) 1995-09-27 1997-11-25 Immersion Human Interface Corp. Safe and low cost computer peripherals with force feedback for consumer applications
US6166723A (en) * 1995-11-17 2000-12-26 Immersion Corporation Mouse interface device providing force feedback
US5999168A (en) 1995-09-27 1999-12-07 Immersion Corporation Haptic accelerator for force feedback computer peripherals
US5959613A (en) 1995-12-01 1999-09-28 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US6704001B1 (en) 1995-11-17 2004-03-09 Immersion Corporation Force feedback device including actuator with moving magnet
US5825308A (en) 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6639581B1 (en) 1995-11-17 2003-10-28 Immersion Corporation Flexure mechanism for interface device
US6100874A (en) * 1995-11-17 2000-08-08 Immersion Corporation Force feedback mouse interface
US8508469B1 (en) 1995-12-01 2013-08-13 Immersion Corporation Networked applications including haptic feedback
US6028593A (en) 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US6161126A (en) 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US6300936B1 (en) 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6078308A (en) * 1995-12-13 2000-06-20 Immersion Corporation Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
US6374255B1 (en) 1996-05-21 2002-04-16 Immersion Corporation Haptic authoring
US6125385A (en) 1996-08-01 2000-09-26 Immersion Corporation Force feedback implementation in web pages
US6411276B1 (en) 1996-11-13 2002-06-25 Immersion Corporation Hybrid control of haptic feedback for host computer and interface device
US6128006A (en) 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6154201A (en) 1996-11-26 2000-11-28 Immersion Corporation Control knob with multiple degrees of freedom and force feedback
US6686911B1 (en) 1996-11-26 2004-02-03 Immersion Corporation Control knob with control modes and force feedback
US6278441B1 (en) * 1997-01-09 2001-08-21 Virtouch, Ltd. Tactile interface system for electronic data display system
US6954899B1 (en) * 1997-04-14 2005-10-11 Novint Technologies, Inc. Human-computer interface including haptically controlled interactions
US6020876A (en) 1997-04-14 2000-02-01 Immersion Corporation Force feedback interface with selective disturbance filter
US6252579B1 (en) 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6292174B1 (en) 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
US6088019A (en) 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6211861B1 (en) * 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
US6252583B1 (en) 1997-11-14 2001-06-26 Immersion Corporation Memory and force output management for a force feedback system
US6448977B1 (en) * 1997-11-14 2002-09-10 Immersion Corporation Textures and other spatial sensations for a relative haptic interface device
US8020095B2 (en) 1997-11-14 2011-09-13 Immersion Corporation Force feedback system including multi-tasking graphical host environment
US6256011B1 (en) 1997-12-03 2001-07-03 Immersion Corporation Multi-function control device with force feedback
US6437770B1 (en) 1998-01-26 2002-08-20 University Of Washington Flat-coil actuator having coil embedded in linkage
US6292172B1 (en) * 1998-03-20 2001-09-18 Samir B. Makhlouf System and method for controlling and integrating various media devices in a universally controlled system
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6707443B2 (en) * 1998-06-23 2004-03-16 Immersion Corporation Haptic trackball device
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6697043B1 (en) 1999-12-21 2004-02-24 Immersion Corporation Haptic interface device and actuator assembly providing linear haptic sensations
US6686901B2 (en) * 1998-06-23 2004-02-03 Immersion Corporation Enhancing inertial tactile feedback in computer interface devices having increased mass
US7038667B1 (en) 1998-10-26 2006-05-02 Immersion Corporation Mechanisms for control knobs and other interface devices
JP3791221B2 (en) * 1999-01-21 2006-06-28 株式会社ソニー・コンピュータエンタテインメント Resistance generator and operating device equipped with the same
US6781569B1 (en) 1999-06-11 2004-08-24 Immersion Corporation Hand controller
US6565554B1 (en) 1999-04-07 2003-05-20 Intuitive Surgical, Inc. Friction compensation in a minimally invasive surgical apparatus
US6424356B2 (en) 1999-05-05 2002-07-23 Immersion Corporation Command of force sensations in a forceback system using force effect suites
US6762745B1 (en) 1999-05-10 2004-07-13 Immersion Corporation Actuator control providing linear and continuous force output
US6903721B2 (en) * 1999-05-11 2005-06-07 Immersion Corporation Method and apparatus for compensating for position slip in interface devices
DE20080209U1 (en) 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US7084854B1 (en) 2000-09-28 2006-08-01 Immersion Corporation Actuator for providing tactile sensations and device for directional tactile sensations
US20030117378A1 (en) 2001-12-21 2003-06-26 International Business Machines Corporation Device and system for retrieving and displaying handwritten annotations
US7181502B2 (en) * 2002-03-21 2007-02-20 International Business Machines Corporation System and method for locating on electronic documents items referenced in a physical document
US8059088B2 (en) 2002-12-08 2011-11-15 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
GB2413416B8 (en) 2002-12-08 2006-09-07 Immersion Corp Haptic massaging in handheld communication devices
US8830161B2 (en) 2002-12-08 2014-09-09 Immersion Corporation Methods and systems for providing a virtual touch haptic effect to handheld communication devices
US7339574B2 (en) * 2003-01-16 2008-03-04 Korean Advanced Institute Of Science And Technology Haptic mouse interface system for providing force and tactile feedbacks to user's fingers and arm
US7310779B2 (en) 2003-06-26 2007-12-18 International Business Machines Corporation Method for creating and selecting active regions on physical documents
US20050076161A1 (en) * 2003-10-03 2005-04-07 Amro Albanna Input system and method
US7724239B2 (en) * 2005-02-22 2010-05-25 Research In Motion Limited Handheld electronic device, cursor positioning sub-system and method employing cursor scaling control
JP2007004705A (en) * 2005-06-27 2007-01-11 Mitsumi Electric Co Ltd Joy stick device
US20080111791A1 (en) * 2006-11-15 2008-05-15 Alex Sasha Nikittin Self-propelled haptic mouse system
US20080231601A1 (en) * 2007-03-22 2008-09-25 Research In Motion Limited Input device for continuous gesturing within a user interface
US8212771B1 (en) 2008-03-24 2012-07-03 Sanchez-Garcia Raul A Interconnected computer mouse and pad
US8159455B2 (en) * 2008-07-18 2012-04-17 Apple Inc. Methods and apparatus for processing combinations of kinematical inputs
US20110237400A1 (en) * 2008-12-02 2011-09-29 Marcus James King Arm Exercise Device and System
US20100295667A1 (en) * 2009-05-22 2010-11-25 Electronics And Telecommunications Research Institute Motion based pointing apparatus providing haptic feedback and control method thereof
US8542105B2 (en) 2009-11-24 2013-09-24 Immersion Corporation Handheld computer interface with haptic feedback
JP5732218B2 (en) * 2010-09-21 2015-06-10 任天堂株式会社 Display control program, display control device, display control system, and display control method
CN102274635A (en) * 2011-07-30 2011-12-14 周海涛 Game controller
JP6298705B2 (en) * 2014-05-01 2018-03-20 日本放送協会 Haptic guidance presentation device
KR102373337B1 (en) 2014-09-02 2022-03-11 애플 인크. Semantic framework for variable haptic output
WO2016072196A1 (en) * 2014-11-04 2016-05-12 アルプス電気株式会社 Manipulation device
DK180122B1 (en) 2016-06-12 2020-05-19 Apple Inc. Devices, methods and graphical user interfaces for providing haptic feedback
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces

Citations (8)

Publication number Priority date Publication date Assignee Title
US3919691A (en) * 1971-05-26 1975-11-11 Bell Telephone Labor Inc Tactile man-machine communication system
JPS5886672A (en) * 1981-11-17 1983-05-24 Syst Soken:Kk Input and output device for graph, character, and the like
EP0489469A1 (en) * 1990-12-05 1992-06-10 Koninklijke Philips Electronics N.V. A data input device for use with a data processing apparatus and a data processing apparatus provided with such a device
GB2260389A (en) * 1991-01-23 1993-04-14 William Alexander Courtney Device to assist graphic data input using computer mouse or scanner: general desktop equipment
US5305429A (en) * 1989-11-30 1994-04-19 Makoto Sato Input apparatus using three-dimensional image
JPH0713693A (en) * 1993-06-22 1995-01-17 Oki Electric Ind Co Ltd Pointing device and its control method
WO1996007965A2 (en) * 1994-09-07 1996-03-14 Philips Electronics N.V. Virtual workspace with user-programmable tactile feedback
WO1996018942A1 (en) * 1994-12-14 1996-06-20 Moore Robert S Force feedback for virtual reality

Family Cites Families (62)

Publication number Priority date Publication date Assignee Title
US2424773A (en) * 1944-02-26 1947-07-29 Interval Instr Inc Locating device
US2984720A (en) * 1959-06-10 1961-05-16 Warner Swasey Co Control unit
US3091130A (en) * 1960-06-27 1963-05-28 Morse Instr Co Single lever control for multiple actions
US3550466A (en) * 1968-11-26 1970-12-29 Byron Jackson Inc Multidirectional control
US3707093A (en) * 1970-09-10 1972-12-26 Marotta Scientific Controls Multi-power control system with single control stick
JPS5149303B1 (en) * 1971-02-03 1976-12-25
US3940674A (en) * 1972-04-14 1976-02-24 The United States Of America As Represented By The Secretary Of The Navy Submarine or vehicle steering system
JPS5846722B2 (en) * 1976-07-30 1983-10-18 東芝機械株式会社 Multi-directional steering mechanism
US4216467A (en) * 1977-12-22 1980-08-05 Westinghouse Electric Corp. Hand controller
US4200780A (en) * 1978-01-18 1980-04-29 Atari, Inc. Control assembly with rotating disc cover for sliding control
JPS5871230U (en) * 1981-11-06 1983-05-14 クラリオン株式会社 Tuning shaft of push button tuner
US4414438A (en) * 1982-06-04 1983-11-08 International Jensen Incorporated Video game controller
US4685678A (en) * 1982-08-13 1987-08-11 Bally Manufacturing Corporation Position transducer system for a joystick
US4509383A (en) * 1982-12-01 1985-04-09 Championship Electronics (Usa) Inc. Joystick controller
SE443672B (en) * 1982-12-23 1986-03-03 Akermans Verkstad Ab CONTROL lever means
US4459578A (en) * 1983-01-13 1984-07-10 Atari, Inc. Finger control joystick utilizing Hall effect
US4491325A (en) * 1983-01-26 1985-01-01 Thomas Bersheim Game control apparatus
US4660828A (en) * 1983-06-15 1987-04-28 Allen Schwab Reactive control apparatus
GB2142711A (en) * 1983-07-04 1985-01-23 Philips Electronic Associated Manually operable x-y signal generator
US4604016A (en) * 1983-08-03 1986-08-05 Joyce Stephen A Multi-dimensional force-torque hand controller having force feedback
US4584443A (en) * 1984-05-14 1986-04-22 Honeywell Inc. Captive digit input device
US4782327A (en) * 1985-01-02 1988-11-01 Victor B. Kley Computer control
US4590339A (en) * 1985-02-19 1986-05-20 Gravis Computer Peripherals Inc. Joystick
US5103404A (en) * 1985-12-06 1992-04-07 Tensor Development, Inc. Feedback for a manipulator
JPH0668758B2 (en) * 1986-01-07 1994-08-31 株式会社日立製作所 Cursor control method and three-dimensional graphic display device
US4748441A (en) * 1986-09-17 1988-05-31 Brzezinski Stephen R M Multiple function control member
JPH0628144B2 (en) * 1986-10-08 1994-04-13 株式会社日立製作所 Device for driving sample stage such as microscope
NL8602697A (en) * 1986-10-27 1988-05-16 Huka Bv Developments JOYSTICK.
US4800721A (en) * 1987-02-13 1989-01-31 Caterpillar Inc. Force feedback lever
US4870389B1 (en) * 1987-02-23 1997-06-17 Ascii Corp Joystick
US4769517A (en) * 1987-04-13 1988-09-06 Swinney Carl M Joystick switch assembly
US4961138A (en) * 1987-05-01 1990-10-02 General Datacomm, Inc. System and apparatus for providing three dimensions of input into a host processor
US4868549A (en) * 1987-05-18 1989-09-19 International Business Machines Corporation Feedback mouse
US4820162A (en) * 1987-11-23 1989-04-11 Robert Ross Joystick control accessory for computerized aircraft flight simulation program
GB2212888A (en) * 1987-12-02 1989-08-02 Philips Electronic Associated X-y signal generating device
US4908791A (en) * 1988-01-15 1990-03-13 Giorgio Paul J Switch display encoder apparatus
US4962448A (en) * 1988-09-30 1990-10-09 Demaio Joseph Virtual pivot handcontroller
US4947701A (en) * 1989-08-11 1990-08-14 Honeywell Inc. Roll and pitch palm pivot hand controller
US5087904A (en) * 1990-02-09 1992-02-11 Devolpi Dean Joy stick
US5059789A (en) * 1990-10-22 1991-10-22 International Business Machines Corp. Optical position and orientation sensor
US5191320A (en) * 1990-12-15 1993-03-02 Sony Corporation Of America Variable scale input device
US5354162A (en) * 1991-02-26 1994-10-11 Rutgers University Actuator system for providing force feedback to portable master support
US5146566A (en) * 1991-05-29 1992-09-08 IBM Corporation Input/output system for computer user interface using magnetic levitation
US5223828A (en) * 1991-08-19 1993-06-29 International Business Machines Corporation Method and system for enabling a blind computer user to handle message boxes in a graphical user interface
CA2068476C (en) * 1991-08-19 1996-07-23 Frank A. Mckiel, Jr. Audio user interface with stereo and filtered sound effects
US5186629A (en) * 1991-08-22 1993-02-16 International Business Machines Corporation Virtual graphics display capable of presenting icons and windows to the blind computer user and method
US5228356A (en) * 1991-11-25 1993-07-20 Chuang Keh Shih K Variable effort joystick
JPH05150795A (en) * 1991-11-27 1993-06-18 Mitsubishi Electric Corp Sound presence detector
US5189355A (en) * 1992-04-10 1993-02-23 Ampex Corporation Interactive rotary controller system with tactile feedback
US5296871A (en) * 1992-07-27 1994-03-22 Paley W Bradford Three-dimensional mouse with tactile feedback
US5377950A (en) * 1992-09-10 1995-01-03 The University Of British Columbia Platform mountings
US5429140A (en) * 1993-06-04 1995-07-04 Greenleaf Medical Systems, Inc. Integrated virtual reality rehabilitation system
US5396266A (en) * 1993-06-08 1995-03-07 Technical Research Associates, Inc. Kinesthetic feedback apparatus and method
US5513100A (en) * 1993-06-10 1996-04-30 The University Of British Columbia Velocity controller with force feedback stiffness control
US5345214A (en) * 1993-06-17 1994-09-06 Std Electronic International Ltd. Variable range position indicator
US5739811A (en) * 1993-07-16 1998-04-14 Immersion Human Interface Corporation Method and apparatus for controlling human-computer interface systems providing force feedback
US5701140A (en) * 1993-07-16 1997-12-23 Immersion Human Interface Corp. Method and apparatus for providing a cursor control interface with force feedback
US5382885A (en) * 1993-08-09 1995-01-17 The University Of British Columbia Motion scaling tele-operating system with force feedback suitable for microsurgery
US5491477A (en) * 1993-09-13 1996-02-13 Apple Computer, Inc. Anti-rotation mechanism for direct manipulation position input controller for computer
WO1995020787A1 (en) * 1994-01-27 1995-08-03 Exos, Inc. Multimode feedback display technology
US5882206A (en) * 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US5724068A (en) * 1995-09-07 1998-03-03 Microsoft Corporation Joystick with uniform center return force

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3919691A (en) * 1971-05-26 1975-11-11 Bell Telephone Labor Inc Tactile man-machine communication system
JPS5886672A (en) * 1981-11-17 1983-05-24 Syst Soken:Kk Input and output device for graph, character, and the like
US5305429A (en) * 1989-11-30 1994-04-19 Makoto Sato Input apparatus using three-dimensional image
EP0489469A1 (en) * 1990-12-05 1992-06-10 Koninklijke Philips Electronics N.V. A data input device for use with a data processing apparatus and a data processing apparatus provided with such a device
GB2260389A (en) * 1991-01-23 1993-04-14 William Alexander Courtney Device to assist graphic data input using computer mouse or scanner: general desktop equipment
JPH0713693A (en) * 1993-06-22 1995-01-17 Oki Electric Ind Co Ltd Pointing device and its control method
WO1996007965A2 (en) * 1994-09-07 1996-03-14 Philips Electronics N.V. Virtual workspace with user-programmable tactile feedback
WO1996018942A1 (en) * 1994-12-14 1996-06-20 Moore Robert S Force feedback for virtual reality

Non-Patent Citations (5)

Title
"FORCE-FEEDBACK CURSOR CONTROL", NTIS Tech Notes, 1 May 1990, page 413, XP000137380 *
HIROTA K ET AL: "DEVELOPMENT OF SURFACE DISPLAY", Proceedings of the Virtual Reality Annual International Symposium, Seattle, 18-22 September 1993, Symp. 1, Institute of Electrical and Electronics Engineers, pages 256-262, XP000457695 *
Patent Abstracts of Japan, vol. 007, no. 184 (P-216), 13 August 1983 & JP 58 086672 A (SHISUTEMU SOUKEN:KK), 24 May 1983 *
Patent Abstracts of Japan, vol. 095, no. 004, 31 May 1995 & JP 07 013693 A (OKI ELECTRIC IND CO LTD; OTHERS: 01), 17 January 1995 *
YUKIO FUKUI ET AL: "EDGE TRACING OF VIRTUAL SHAPE USING INPUT DEVICE WITH FORCE FEEDBACK", Systems & Computers in Japan, vol. 23, no. 5, 1 January 1992, pages 94-104, XP000291594 *

Cited By (10)

Publication number Priority date Publication date Assignee Title
EP0903662A2 (en) * 1997-09-17 1999-03-24 Sun Microsystems, Inc. Invisible and one-pixel wide scroll bars
EP0903662A3 (en) * 1997-09-17 2000-03-01 Sun Microsystems, Inc. Invisible and one-pixel wide scroll bars
US6882354B1 (en) 1997-09-17 2005-04-19 Sun Microsystems, Inc. Scroll bars with user feedback
US6243078B1 (en) * 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
EP1333369A3 (en) * 2001-12-25 2008-10-08 Alps Electric Co., Ltd. Force feedback joystick knob
EP1450241A3 (en) * 2003-02-20 2007-06-06 Alps Electric Co., Ltd. Force-applying input device
US10152131B2 (en) 2011-11-07 2018-12-11 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US10775895B2 (en) 2011-11-07 2020-09-15 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
GR20180100401A (en) * 2018-09-04 2020-05-11 Γιωργος Αθανασιου Χατζηαυγουστιδης Console for TV set, computer and electronic games controlled from a sofa
EP4101414A1 (en) * 2021-06-08 2022-12-14 Kawasaki Jukogyo Kabushiki Kaisha Robotic surgical system

Also Published As

Publication number Publication date
CA2263988A1 (en) 1998-02-26
WO1998008159A3 (en) 1998-08-06
AU3936197A (en) 1998-03-06
US5990869A (en) 1999-11-23

Similar Documents

Publication Publication Date Title
WO1998008159A2 (en) Force feedback mouse
US8462116B2 (en) Haptic trackball device
US7136045B2 (en) Tactile mouse
MacKenzie Input devices and interaction techniques for advanced computing
US8212772B2 (en) Haptic interface device and actuator assembly providing linear haptic sensations
US7199790B2 (en) Providing force feedback to a user of an interface device based on interactions of a user-controlled cursor in a graphical user interface
US6078308A (en) Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
Herndon et al. The challenges of 3D interaction: a CHI'94 workshop
US6525711B1 (en) Haptic interface including clutch control
US6750877B2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment
US6803924B1 (en) Flexible variation of haptic interface resolution
US20060209037A1 (en) Method and system for providing haptic effects
US8350843B2 (en) Virtual hand: a new 3-D haptic interface and system for virtual environments
WO2004081776A1 (en) A method and system for providing haptic effects
WO2002057885A2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment
EP1182536A1 (en) Navigation in large data sets
Bernard Design and evaluation of spatial interfaces in virtual reality
Smyth Design and evaluation of Pokespace: A bimanual haptic interaction technique

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN YU ZW AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: The EPO has been informed by WIPO that EP was designated in this application
ENP Entry into the national phase

Ref document number: 2263988

Country of ref document: CA

Kind code of ref document: A

Ref document number: 2263988

Country of ref document: CA

NENP Non-entry into the national phase

Ref document number: 1998510201

Country of ref document: JP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: PCT application non-entry in European phase