WO2011062895A2 - Systems and methods for increasing haptic bandwidth in an electronic device - Google Patents

Systems and methods for increasing haptic bandwidth in an electronic device

Info

Publication number
WO2011062895A2
WO2011062895A2 (PCT/US2010/056829)
Authority
WO
WIPO (PCT)
Prior art keywords
actuator
processor
haptic
haptic effect
actuators
Prior art date
Application number
PCT/US2010/056829
Other languages
French (fr)
Other versions
WO2011062895A3 (en)
Inventor
Juan Manuel Cruz-Hernandez
Original Assignee
Immersion Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corporation filed Critical Immersion Corporation
Priority to EP10787610.4A priority Critical patent/EP2502215B1/en
Priority to KR1020127015581A priority patent/KR101719507B1/en
Priority to CN201080051254.6A priority patent/CN102713793B/en
Priority to JP2012539970A priority patent/JP5668076B2/en
Publication of WO2011062895A2 publication Critical patent/WO2011062895A2/en
Publication of WO2011062895A3 publication Critical patent/WO2011062895A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B06GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS IN GENERAL
    • B06BMETHODS OR APPARATUS FOR GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS OF INFRASONIC, SONIC, OR ULTRASONIC FREQUENCY, e.g. FOR PERFORMING MECHANICAL WORK IN GENERAL
    • B06B1/00Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency
    • B06B1/02Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy
    • B06B1/06Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy operating with piezoelectric effect or with electrostriction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/02Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners ; Linear motors
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10NELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10N30/00Piezoelectric or electrostrictive devices
    • H10N30/20Piezoelectric or electrostrictive devices with electrical input and mechanical output, e.g. functioning as actuators or vibrators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Definitions

  • the present disclosure relates generally to systems and methods for increasing haptic bandwidth in an electronic device.
  • Tactile confirmation has generally been addressed, or at the very least substituted, through the use of programmable mechanical-click effects, typically produced by a single actuator such as a vibrating motor.
  • Such conventional haptic effects include vibrations to indicate an incoming call or text message, or to indicate error conditions.
  • a system includes an apparatus having a first actuator; a second actuator; and a processor coupled to the first and second actuators, the processor configured to apply a first command signal to the first actuator to output a first haptic effect from a first start time to a first stop time, the processor configured to apply a second command signal to the second actuator to output a second haptic effect from a second start time to a second stop time.
  • the method comprises receiving an interaction signal at a processor of an interaction occurring within a graphical environment, the interaction corresponding to a haptic effect; applying a first input signal to a first actuator to output a first haptic effect, wherein the first actuator outputs the first haptic effect beginning at a first time and terminating at a second time; and applying a second input signal to a second actuator to output a second haptic effect, wherein the second actuator outputs the second haptic effect beginning at a third time, wherein the third time occurs after the second time.
  • a computer-readable medium comprises program code for causing a processor to execute such a method.
  • Figure 1 shows a system for increasing haptic bandwidth in electronic devices according to an embodiment of the present invention
  • Figures 2 and 3 illustrate an actuator's response to a pulsing signal at frequencies of 5 and 10 Hz, respectively;
  • Figure 4 illustrates a block diagram of an electronic device in accordance with an embodiment of the present invention
  • Figure 5 illustrates a QWERTY keyboard having haptic areas in accordance with an embodiment of the present invention
  • Figure 6 illustrates scheduled activation of multiple actuators in response to interaction of the QWERTY keyboard in Figure 5 in accordance with an embodiment of the present invention
  • Figure 7 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth in an electronic device in accordance with an embodiment of the present invention.
  • Figure 8 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth in an electronic device in accordance with an embodiment.
  • Example embodiments are described herein in the context of systems and methods for increasing haptic bandwidth in an electronic device. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following description to refer to the same or like items.
  • Figure 1 shows a system 50 for increasing haptic bandwidth in an electronic device according to one illustrative embodiment of the present invention.
  • a cell phone 60 comprises a touch screen 66 and several actuators 70-76 for outputting various haptic effects to the cell phone 60.
  • two of the actuators 70, 72 are piezo-electric actuators and the other two actuators 74, 76 are rotary motors having an eccentric rotating mass (commonly referred to as an "ERM").
  • the cell phone 60 also includes a processor 62, a memory 64, and a sensor 68.
  • the processor 62 executes software stored in memory 64 and displays graphical user interface (GUI) elements on the touch screen 66.
  • a user interacts with the cell phone 60 by touching the touch screen 66 to select one or more GUI elements or by making gestures on the touch screen 66.
  • the sensor 68 detects the various contacts with the touch screen 66 and provides sensor signals to the processor 62, which interprets the signals based on the position of GUI elements displayed on the touch screen 66 and any detected gestures.
  • the processor 62 may determine that one or more haptic effects are to be output to the cell phone 60 based on user inputs or on events occurring within the GUI or other applications executed by the processor 62, such as text messaging software.
  • the processor 62 selects one or more actuators 70-76 to use to output the haptic effects.
  • memory 64 stores parametric information about each of the actuators, including frequency ranges, resonant frequencies, startup and stop times, power consumption, or physical coupling information, such as whether the actuator is coupled to the housing of the cell phone 60, the touch screen 66, or other parts of the cell phone 60, such as physical keys or buttons (not shown).
  • Based on the actuator information, the processor 62 generates actuator signals for the haptic effects, selects the actuator or actuators to output the haptic effects, and transmits the actuator signals to the actuator(s) at the appropriate times to generate the desired haptic effects.
  • each key "pressed" by the user may result in a haptic effect.
  • the processor 62 determines that sharp, high-frequency haptic effects are needed for each key press.
  • the processor 62 determines that the ERM actuators 74, 76 should be used to output the haptic effects.
  • the processor 62 determines that the ERM actuators 74, 76 are capable of generating high-magnitude forces and are coupled to the housing of the cell phone 60 based on stored actuator profiles for each of the actuators 70-76.
  • both ERM actuators 74, 76 should be used and should be alternated because the startup and stop characteristics of the ERM actuators 74, 76 may take too long to fully stop a haptic effect before the next haptic effect is to be output, i.e. the individual ERM actuators 74, 76 may have insufficient bandwidth to support haptic effects that occur as rapidly as keystrokes.
  • One way of defining the bandwidth of a vibrating motor actuator is the maximum pulse frequency that can be obtained from the actuator before the pulses output by the actuator begin to feel like a mushy, continuous vibration.
  • the pulses 10 are generated by a single vibrating actuator in response to a non-continuous or pulsed signal 20, whereby the pulsed signal 20 is approximately at 5 Hz.
  • the response or deceleration output by the actuator is such that the actuator is able to vibrate for some time and come to an almost complete stop (Point A) before it is instructed to accelerate again.
  • Figure 3 illustrates the same actuator in which the pulsing signal is at a frequency of 10 Hz.
  • the magnitude of the pulse vibrations 30 output by the actuator is not able to approach a zero value before the pulsing signal 40 instructs it to begin accelerating again (see Point B in Figure 3).
  • the actuator is unable to decelerate to a magnitude where the magnitude of the haptic effect cannot be felt before the actuator begins to accelerate toward the maximum magnitude again.
  • This can lead to "mushy" haptic effects where each effect tends to be hard to distinguish from the next, which tends to degrade the user's experience.
  • the illustrative system in Figure 1 employs multiple actuators and novel methods of actuating those actuators.
  • the processor may determine that additional haptic effects are needed. For example, when the user presses the "send" button, the processor 62 determines that a haptic effect should be output to indicate that the send button was pressed. In this illustrative embodiment, the processor 62 determines that a texture haptic effect should be output in addition to a vibration effect. In such an embodiment, the processor 62 generates the vibration effects by sending signals alternately to one ERM actuator 74 and then the other ERM actuator 76, as will be described in greater detail below.
  • the processor 62 generates actuator signals with high frequencies (e.g. >20kHz) and determines that the ERM actuators are already in use, that the ERM actuators are not suitable for generating such high frequencies, and that the piezoelectric actuators 70, 72 are capable of generating the necessary frequencies. Further, the processor 62 determines, based on the stored actuator parameter information, that each piezoelectric actuator 70, 72 is configured to output a haptic effect in only one dimension and that the two piezoelectric actuators 70, 72 are oriented along orthogonal axes. Therefore, in this embodiment, the processor 62 determines that each of the piezoelectric actuators 70, 72 should be actuated to generate the texture effect. Thus, the processor 62 transmits high-frequency actuator signals to each of the piezoelectric actuators 70, 72 to generate a haptic effect to simulate a textured surface on the touch screen 66.
  • Such an illustrative embodiment provides increased haptic bandwidth by selectively actuating actuators 70-76 based on performance characteristics of the actuators stored within the cell phone's memory 64. Further, because a plurality of different actuators are provided, multiple effects may be output (or played) simultaneously, or may be output with high fidelity despite insufficient performance characteristics of one or more of the actuators 70-76 for the haptic effects to be output. For example, precise, high-magnitude vibrations can be output at a rate greater than the peak bandwidth of one of the ERM actuators 74, 76 by outputting the vibrations alternately between the two ERM actuators 74, 76.
  • Figure 4 illustrates a block diagram of an electronic device in accordance with an embodiment.
  • Figure 4 illustrates an electronic device 100 having a body or housing 102, a processor 104 within the body 102 and coupled to a memory 106.
  • the processor 104 is able to store information to and retrieve from the memory 106.
  • Such information may include, but is not limited to, actuator profiles, haptic effect profiles, haptic effect output time sequences, programmed voltages to send to the actuators, game data, software data, etc.
  • the electronic device 100 is shown with one or more optional touch screens, touch pads or other touch sensitive components 108 coupled to the processor 104. It should be noted that some embodiments of the present invention may not include a touch sensitive component 108. For instance, some embodiments of the present invention may be applied to other types of devices, such as a joystick, rotatable knob, stand-alone kiosk, computer mouse, virtual reality simulation, computer peripheral, smart phone, handheld computer, game peripheral, etc. However, for explanation purposes, the touch sensitive component 108 will be used to describe embodiments of systems and methods for increasing haptic bandwidth in an electronic device.
  • the device 100 includes a sensor 110 coupled to the touch screen 108 and processor 104, whereby the sensor 110 monitors the position, pressure, and/or movement of the user's finger(s), stylus or other input means during interaction with the touch sensitive component 108.
  • the sensor 110 provides sensor signals to the processor 104 to indicate the pressure, position and/or movement of the user's input, whereby the processor 104 running the software program updates the display shown through the touch sensitive component 108 in response thereto.
  • the touch sensitive component 108 incorporates the sensor 110 therein as an integral component, and thus the sensor 110 is not a separate component. However, for purposes of discussion, the sensor 110 is referred to herein as a separate component.
  • the electronic device 100 includes a plurality of actuators 112, 114, 116 within the body. It should be noted that although three actuators are shown in Figure 4, as few as two actuators, or more than three actuators, are also contemplated.
  • the actuators 112, 114, 116 are all mounted to the body 102 of the device 100 to impart a haptic effect thereto.
  • one or more of the actuators are mounted to the touch sensitive component 108 or other respective user input device to impart a localized haptic effect thereto. It is contemplated that one or more of the actuators may be mounted to the touch sensitive component 108 or other respective user input device while the remaining actuators are mounted to the body 102 or to one or more physical buttons (not shown).
  • At least one actuator is suspended within the body 102 and may be configured to impart haptic effects to the touch sensitive component and/or the body 102.
  • the actuator may be designed to utilize a flexible or resilient material to amplify haptic effects produced therefrom.
  • one or more actuators are part of an external device or peripheral that is externally mounted to the body 102 to output haptic effects thereto.
  • the actuators 112-116 are configured to output one or more haptic effects upon receiving an input command signal from the processor 104.
  • the input command signal may be from an interaction which may occur between the user and a graphical object within a graphical environment run by a software program, whereby the software program may be run on the local processor or a host computer separate from the electronic device.
  • the interaction may also be user independent in which the user's action does not cause the interaction (e.g. text message received, asteroid hitting the user's vehicle in a game).
  • the interaction may, however, cause a haptic event to occur or may be the product of the user selecting a haptic area, both of which are discussed in more detail below.
  • actuators can be of various types including, but not limited to, eccentric rotational mass (ERM) actuators, linear resonant actuators (LRA), piezoelectric actuator, voice coil actuator, electro-active polymer (EAP) actuators, memory shape alloys, pager or DC motors, AC motors, moving magnet actuators, E-core actuators, smartgels, electrostatic actuators, electrotactile actuators, etc.
  • the actuators 112-116 output their respective haptic effects in response to one or more haptic events occurring in the graphical environment.
  • a haptic event is referred to herein as any interaction, action, collision, or other event which occurs during operation of the device and which can potentially have an associated haptic effect, that haptic effect then being output to the user.
  • a haptic event may occur when a graphical vehicle the user is controlling experiences wind turbulence during game play, whereby an example haptic effect associated with that haptic event could be a vibration.
  • a haptic event may occur when a missile collides with the user's character in the game, whereby an example haptic effect associated with the haptic event is a jolt or pulse.
  • Haptic events may not be associated with the game play, but nonetheless provide the user with important device information while the user is playing a game (e.g. receiving a text message, completion of a song download, battery level low, etc.).
  • the interaction may correlate with a graphical object of a graphical environment which the user interacts with on a display screen.
  • a haptic effect may be output by the system in response to an interaction where the user selects a designated area in a graphical environment, hereinafter referred to as a displayed haptic-enabled area or simply a "haptic area."
  • the boundaries of a displayed key of a keyboard may each be designated a haptic area.
  • the left boundary 202, right boundary 204, bottom boundary 206 and top boundary 208 of the "shift" key may each be designated a haptic area, whereby the processor 104 instructs the actuators to output respective haptic effects when the sensor 110 indicates that the user's finger or stylus is moving over one or more of the displayed boundaries. It is also contemplated that the area between the boundaries 202-208 within the "shift" key may be designated a haptic area. In some embodiments, haptic areas are designated when developing the software that is to be run on the device 100. In some embodiments, however, a user may be able to customize existing haptic areas or develop/designate new ones, such as via a Preferences or Options menu.
  • the present system and method utilizes multiple actuators to operate in successive order for a duration of time during which the interaction occurs.
  • the staggered output of the multiple actuators increases the effective output bandwidth of the actuators at faster intervals and produces distinct, discrete haptic effects which are discernable to the user.
  • the processor 104 applies an input command signal with a designated voltage and current to the actuator 112 at a start time to cause the actuator 112 to accelerate to a maximum designated magnitude to output a corresponding haptic effect.
  • the processor 104 terminates the input command signal at a stop time (such as based on programmed parameters of the haptic effect which are stored in memory), upon which the actuator 112 decelerates from the maximum magnitude to a stop.
  • the processor 104 then applies a designated voltage and current to the second actuator 114 at a respective start time to cause the actuator 114 to accelerate to a maximum designated magnitude to output a corresponding haptic effect.
  • the processor 104 terminates the pulse signal to the actuator 114 to allow the second actuator 114 to decelerate from its maximum magnitude to a stop.
  • the processor 104 then again sends the input command signal to the first actuator 112 to begin outputting a haptic effect, and so on.
  • this process is repeated between the actuators 112, 114 to thus cause the actuators 112, 114 to alternately and successively output their respective haptic effects.
  • a particular actuator does not begin operating until the haptic effect output by the other actuator is at least at a magnitude and/or frequency that is not able to be discernibly felt by the user. However, in some embodiments, a particular actuator does not begin operating until the haptic effect output by the other actuator is at a zero magnitude and/or frequency.
  • the scheduling of the start and stop times of the input command signals toward each of the actuators are predetermined and stored in the memory. This allows the processor to quickly retrieve the scheduling data and thus ease computational burdens when a haptic effect is to be output.
  • the stored scheduling information may be in the form of a lookup table or other stored configuration in which the start and stop times for each actuator, in relation to the other actuators, are already established, such that the processor 104 merely processes the stored information and accordingly activates the actuators based on the designated scheduling instructions. A sketch of such a precomputed schedule, using assumed timing values, appears at the end of this section.
  • the scheduling instructions may be based on the type of actuators used (e.g. ERM, LRA, piezoelectric, etc.), the desired maximum and minimum magnitudes to be output by the actuators, the voltages and frequencies at which the actuators will operate, the type of haptic effect to be output (e.g. vibration, pop, click, etc.), and the overall operating characteristics of the actuators (e.g. heavy or light actuators, etc.).
  • the particular operating characteristics of the actuator 112 will be known to the processor 104, in that the processor 104 is provided information on how long it takes the actuator 112 to accelerate from a stopped position to the desired magnitude and frequency based on the applied voltage and current.
  • the memory 106 may store information regarding how long it takes for the actuator 112 to decelerate from its maximum operating magnitude and frequency back to the stopped position. This is because, in one embodiment, the acceleration and deceleration time of the actuator 112, based on the type of current (i.e. AC vs. DC), is already known and is stored in the memory 106 as data or an instruction to be read by the processor and accordingly provided to the actuators.
  • memory 106 comprises one or more actuator profiles associated with the actuators 112-116.
  • the actuator profiles comprise a plurality of parameters associated with the actuators, such as start-up time, stop time, minimum and maximum frequencies, maximum magnitudes, resonant frequencies, haptic effect types, axis(es) of operation, or power consumption.
  • the processor 104 may then access the actuator profiles to determine which actuators, and how many actuators, to employ to generate one or more haptic effects.
  • Figure 6 is a graph illustrating the scheduled haptic effects output by two actuators in the system in accordance with an embodiment.
  • the top graph 300 illustrates the pulsed haptic effect output by the first actuator 112
  • the bottom graph 400 illustrates the pulsed haptic effect output by the second actuator 114, in which both graphs share a common time line.
  • upon a haptic event occurring or a haptic area being determined, the processor 104 sends its command signal to the actuator 112 at time t0, at which point the actuator 112 begins its operation.
  • the input command signal is a square wave signal, and the processor 104 terminates its command signal at time tn, whereby time tn occurs before t1.
  • the processor determines time tn based on actuator parameters stored in memory. For example, in one embodiment, the processor 104 determines a percentage of the stop time for an actuator 112, 114 to determine a minimum amount of time to wait after an actuator signal has been terminated before a new signal may begin.
  • the processor 104 determines an amount of time to wait after an actuator signal has been terminated before beginning a haptic effect of the same type.
  • a device may comprise multiple different types of actuators, such as ERM actuators, DC motors, piezoelectric actuators, LRAs, etc.
  • a processor may simultaneously actuate multiple actuators to output different types of effects, such as textures, vibrations, and torques.
  • a processor may cause texture effects to be output irrespective of the status of vibrational effects or torsional effects.
  • the processor 104 may determine that no wait time is required, as a first haptic effect may be output substantially simultaneously with a second haptic effect without the two effects interfering.
  • the actuator 112 decelerates to a magnitude such that no discernable haptic effect is felt by the user.
  • the actuator 112 decelerates to a zero magnitude around time t1.
  • different input command signals or actuator signals may be employed other than square waves.
  • actuator signals may be generated to accelerate or decelerate actuators to provide high-fidelity haptic effects, such as is disclosed in U.S. Patent No. 7,639,232, filed November 30, 2005, entitled
  • the processor 104 sends an input command signal to the actuator 114 in which the actuator 114 begins its operation and accelerates to a maximum magnitude.
  • the command signal is a square wave signal in which the processor 104 terminates its command signal at time tm, whereby time tm occurs before t2.
  • the actuator 114 has sufficiently decelerated so that the processor 104 determines that the next actuator may be actuated. For example, in this embodiment, the processor 104 determines a portion of the stop time stored as a parameter for actuator 114 in memory. In an embodiment, the actuator 114 comes to or near a complete stop around time tm.
  • the processor 104 delays a fixed amount of time before actuating the next actuator 112. Thereafter, the processor 104 then instructs actuator 112 to begin operation at time t2 and so on.
  • This alternating pattern of output from multiple actuators can generate discrete haptic effects which are distinct and discernable when felt by the user, because the actuators are scheduled to operate in a staggered manner to provide the user with the feeling that the pulse from a prior haptic effect has sufficiently degenerated before a subsequent pulse is felt.
  • while a single actuator may not be able to achieve this result at frequencies around or greater than 10 Hz, the scheduling of multiple actuators is able to achieve such a result at such higher frequencies.
  • a QWERTY keyboard has keys approximately 6 millimeters wide in which the processor 104 instructs a single actuator to output a haptic effect upon the sensor 110 indicating that the user's finger (or stylus) is positioned on one boundary of a particular key.
  • the user's finger runs across a series of keys (in particular, keys “z” to "m") at a rate of 7 keys per second.
  • the actuators are required to output haptic effects on the order of one key boundary approximately every 70 ms, which translates into approximately 14 key boundaries every second (or 71.4 milliseconds per boundary).
  • a single actuator tasked to output a vibration for each of the haptic areas may generate a continuous, or nearly continuous, vibration, and thus the user may not feel any distinction between key boundaries. This is because the single actuator does not have the time to stop completely before the next pulse is already being output.
  • multiple actuators are employed to successively output the haptic effects to provide this tactile information.
  • the processor 104 applies a first command signal to the actuator 112.
  • the processor 104 applies a second command signal to actuator 114.
  • the processor 104 applies a third command signal to actuator 112. This alternating pattern between the multiple actuators 112, 114 produces definitive and distinct haptic effects which are able to be distinguished by the user.
  • a single actuator (such as actuator 112) may be used to output multiple haptic effects when the amount of time between triggering haptic events and/or haptic areas is longer than the amount of time needed for the actuator to come to a complete stop or at least decelerate to a magnitude that is not able to be felt by the user.
  • the processor 104 activates multiple actuators (e.g. 2, 3, or more) successively when the amount of time between triggering haptic events and/or haptic areas is less than the amount of time needed for the actuator to come to a complete stop or at least decelerate to a magnitude that is not able to be felt by the user.
  • the amount of time needed is based on the operating parameters and type of actuators used as well as the amount of current and voltage applied to the actuators.
  • the haptic effects that can be produced by the actuators vary depending on the current, voltage, frequency as well as start and stop times. Such haptic effects include, but are not limited to, vibrations, pulses, pops, clicks, damping characteristics, and varying textures.
  • the multiple actuators are utilized to generate different haptic effects for different applications.
  • the two actuators are configured to provide a vibration or pop upon the user's finger or stylus passing over the boundaries of a graphical object (e.g. keyboard keys), as discussed above.
  • one or more actuators coupled to the touch sensitive component are activated when the user is detected within the boundaries to generate a texture-like haptic effect.
  • the actuator is an eccentric rotating mass (ERM) actuator which is driven using a continuous DC voltage, whereby the ERM actuator is pulsed by the processor 104 to output the haptic effect and also achieve relatively short start and stop times at lower frequencies.
  • the ERM actuator's response, especially the ability to accelerate and decelerate quickly enough to the desired magnitude, may be slower than needed to produce the distinct haptic effects described above.
  • for a given DC driving voltage, the response of the actuator will be at a predetermined magnitude and frequency. In other words, increasing the magnitude of the DC driving voltage will proportionally result in an acceleration response with higher magnitude and higher acceleration. In the same vein, decreasing the magnitude of the DC driving voltage will proportionally result in a deceleration response with a lower magnitude and a lower deceleration.
  • an ERM actuator may not be able to generate vibrations that are clear and distinct, with a magnitude of 0.4 Gpp at 120 Hz, when the processor applies only a DC voltage to the actuator.
  • the processor 104 applies an AC signal to the actuator, whereby the actuator responds to the driving signal with an acceleration profile having the same frequency content as the input signal.
  • This technique of overdriving the actuators in an AC (bipolar) mode dramatically improves the bandwidth of the actuator in the frequency domain. The actuator is thus able to generate different vibration effects at specific magnitudes and accelerations by superimposing the AC and DC input signals.
  • the main advantage of using multiple actuators in AC mode is that the overall system can achieve the principle of superposition. Applying two different input signals to the actuators, in which each input signal has different frequency and magnitude parameters, will result in a vibration effect having those frequencies and proportional magnitudes. A single actuator is not capable of generating this superposition effect because it was not originally designed to have as high a bandwidth as is obtained when driving it in AC mode. This superposition principle is important when generating high-fidelity vibration feedback (textures, pops and vibrations at the same time).
  • although the actuators described above are ERM actuators, the actuators may also be linear resonant actuators (LRAs).
  • the LRA actuator is a DC motor with a resonant mass-spring system in which the mass is actuated linearly back and forth in a one-dimensional direction.
  • the device is capable of generating a high acceleration response at a specific frequency, for instance 175 Hz. At other frequencies, however, the acceleration is close to 0 for the same input magnitude. Nevertheless, if the magnitude of the input signal is increased in those areas where the response is weak, the resulting acceleration is strong enough to provide a good vibration effect at those specific frequencies and with a magnitude dependent on the magnitude of the driving signal.
  • other types of actuators may be employed.
  • smart gel actuators may be employed to provide textures or physical boundaries on the touch screen that correspond to objects shown by the touch screen, such as keys on a keyboard.
  • actuators 112-116 may comprise ERM or LRA actuators and piezoelectric actuators.
  • piezoelectric actuators may provide different types of haptic effects than ERM or LRA actuators.
  • piezoelectric actuators may provide low magnitude effects, but may have wide frequency ranges in which effects may be output.
  • piezoelectric actuators may be well-suited to applying haptic effects to a touch screen.
  • memory 106 may comprise parameters associated with each of the actuators 112-116.
  • memory 106 comprises parametric information about each of the actuators, such as minimum and maximum operational frequencies, minimum and maximum operational magnitudes, start-up and stop times, and axis(es) of operation.
  • the ERM actuators have minimum and maximum operational frequencies of approximately 100 and 300 Hz respectively, while the piezoelectric actuators have minimum and maximum operational frequencies from 100 to 25,000 Hz.
  • the processor 104 determines a vibrational haptic effect is to be output at approximately 200 Hz and generates a first actuator signal configured to cause a vibration at 200 Hz. Based at least in part on the actuator parameter information, the processor selects one of the ERM actuators. The processor then transmits the first actuator signal to the selected ERM actuator. The processor also determines that a texture haptic effect is to be output at approximately 25,000 Hz and generates a second actuator signal configured to cause a vibration at 25,000 Hz. Based at least in part on the actuator parameter information, the processor selects one of the piezoelectric actuators. The processor then transmits the second actuator signal to the selected piezoelectric actuator.
  • the two haptic effects may be output at approximately the same time.
  • the actuator sequencing described above need not be performed.
  • the processor 104 may output the first actuator signal alternately to the two ERM actuators according to embodiments of the present invention.
  • in addition to ERM and piezoelectric actuators, other combinations of actuators may be used.
  • a combination of ERM and LRA actuators may be used.
  • multiple ERM or LRA actuators of different sizes may be included to provide a range of vibrational frequencies, which may be actuated individually or simultaneously.
  • memory 106 comprises parameters associated with each actuator, including minimum and maximum operational frequencies, minimum and maximum operational magnitudes, start-up and stop times, and axis(es) of operation.
  • the parameters may also comprise a resonant frequency associated with each actuator, if the respective actuator has such a characteristic.
  • the processor 104 may select a suitable actuator or actuators to generate the desired haptic effects.
  • the processor 104 selects the appropriate actuator or actuators based upon the haptic effect to be output and the parameters describing each of the actuators. In some embodiments, the processor 104 may further select an actuator based on the operational status of an actuator, such as whether the actuator is in use or is still stopping.
  • Figure 7 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth of the actuators in an electronic device.
  • the processor is provided with information as to whether a haptic event occurs (e.g. a collision in a video game) and/or a haptic area has been selected (e.g. user's finger or stylus moving over a boundary of a displayed key). This information may be provided from the sensor 110 and/or from the software running on the processor or a separate host computer.
  • Upon the processor 104 being notified that a haptic effect is to be output, the processor 104 applies an input command signal to the first actuator at predetermined start and stop times, as in 504.
  • the processor applies an input command signal to the second actuator at predetermined start and stop times, as in 506, whereby the start time of the second actuator does not occur until after the stop time of the input command signal to the first actuator.
  • this process repeats between the first and second actuators 112, 114 for a predetermined duration of time, as in 506.
  • the processor 104 confirms that the haptic event and/or haptic area is still activated, or in other words that the interaction is still occurring, when the predetermined duration of time has expired, as in 508. If the interaction which is causing the haptic effect is still occurring when the duration expires, the processor 104 continues to alternate between the actuators, as in 504. On the other hand, if the interaction is over when the duration ends, the processor 104 terminates the input command signal to the actuators, as in 510. It is contemplated that the processor 104 is informed if the interaction ceases prior to the expiration of the duration, whereby the processor 104 will prematurely terminate the input command signal to the actuators to end the outputted haptic effects.
  • Figure 8 illustrates another flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth of the actuators in an electronic device.
  • the processor determines whether two or more distinct haptic effects are to be output as a result of the action requiring the haptic effect, as shown at 602. If it is determined that fewer than two distinct haptic effects are to be produced, the processor 104 instructs only a single actuator to output the haptic effect, as in 604 in Figure 8. Again, this determination may be based on the sensor information and/or software instruction as well as the assigned haptic effect for the particular action.
  • a collision may occur on the display in which the designated haptic effect for the collision is a single vibration, whereby the processor 104 would instruct only one actuator to output the vibration.
  • step 604 is optional, as the processor may alternatively choose to have more than one actuator output the haptic effect simultaneously or in sequence.
  • if two or more distinct haptic effects are to be produced for the haptic event/haptic area, it is determined whether the actuators would be operating at a frequency and magnitude such that the haptic effects would not be distinct and individually discernable to the user if only a single actuator were employed, as shown at 606 in Figure 8, or whether the two (or more) haptic effects are of different types such that different types of actuators should be used (e.g. ERM and piezoelectric).
  • the processor 104 would then send input command signals to multiple actuators, as in 608, whereby the command signals would selectively activate the actuators in an alternating manner, such as according to the embodiment shown in Figure 7, to output clear, distinct, and discernable haptic effects from the actuators.
  • the processor 104 determines that the multiple haptic effects could be output by a single actuator based on the parameters describing the actuator and characteristics of the haptic effects. In some embodiments, the processor 104 makes these determinations in real-time. However, in some embodiments, each of the assigned haptic effects, along with frequency, magnitude, start and stop time data, other parameters of the actuators, and instructions on whether single or multiple actuators are to be used, are stored in the memory 106 such that the processor 104 can easily process the instructions and accordingly instruct which actuators to activate.
  • this determination may be based on the sensor information, actuator parameters stored in memory 106, and/or software instruction as well as the assigned haptic effect for the particular action. For example, a collision may occur on the display in which the designated haptic effect for the collision is a single vibration, whereby the processor 104 would instruct only one actuator to output the vibration.
  • the processor 104 determines that multiple haptic effects are to be output by multiple actuators based on the type of haptic effects to be output. For example, at step 600, the processor 104 determines that a vibrational haptic effect is to be output based on a user contacting the edge of a key on a virtual keyboard and that a textured haptic effect should be output to simulate the feel of a keyboard. In such an embodiment, at step 602, the processor 104 determines that multiple haptic effects are to be output and the method proceeds to step 606.
  • a texture effect may be output by outputting a vibration at a frequency of greater than approximately 20 kHz and adjusting the magnitude of the vibration, such as by setting the magnitude of vibration as a percentage of the maximum magnitude or by modulating the magnitude according to a second signal, such as a sine wave or other periodic or non-periodic waveform.
  • the magnitude of the vibration may be set to 0% outside of a haptic region and to 50% or 100% for contact within the haptic region.
  • a second or modulating frequency may have a frequency of 10Hz such that the magnitude of the 20kHz vibration varies from 0 to 100% at a rate of 10Hz. In some embodiments, higher modulating frequencies may be used, such as 100Hz, 500Hz or 1000Hz, or other suitable frequencies.
  • the processor 104 analyzes parameters stored in memory 106 that are associated with each actuator. Based on the parameters, the processor 104 determines that the ERM actuators are not capable of producing such effects. Therefore the processor 104 determines that a piezoelectric actuator should be selected to output the texture effect.
  • a vibration to indicate the edge of a key on a virtual keyboard may have a high-magnitude vibration frequency between approximately 100-300Hz, such as 200Hz.
  • the processor 104 selects an ERM actuator to output the haptic effects.
  • the processor 104 may further determine that multiple vibrational effects are to be output and that multiple ERM actuators should be employed, such as by employing techniques described above. After determining which actuators are associated with each haptic effect, the method proceeds to step 608.
  • In step 608, the processor generates a first actuator signal configured to cause a vibration at a frequency of greater than approximately 20 kHz to generate the texture haptic effect.
  • the processor also generates a second actuator signal configured to cause a vibration at 200Hz.
  • the processor then transmits the first actuator signal to the piezoelectric actuator and transmits the second actuator signal to the ERM actuator.
  • the processor 104 may alternately transmit the second actuator signal to multiple ERM actuators as described above.
  • a computer may comprise a processor or processors.
  • the processor comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor.
  • the processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs for editing an image.
  • Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines.
  • Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
  • Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor.
  • Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions.
  • Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
  • the processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures.
  • the processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
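
The staggered activation described around Figures 6 and 7 can be pictured as a precomputed table of alternating start and stop times of the kind the memory is said to hold. The following sketch is illustrative only and not part of the disclosure: the 100 ms slot, the 60 ms drive time, the Python form, and the helper names are assumptions, and the actuator labels simply echo reference numerals 112 and 114.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScheduleEntry:
    actuator: str     # which actuator receives the command signal in this slot
    start_ms: float   # t0, t1, t2, ... in Figure 6
    stop_ms: float    # the signal ends early in the slot so the motor can settle

# Assumed timing values for illustration: each actuator is driven for 60 ms
# of a 100 ms slot, and the two actuators take alternate slots so that one
# is always decelerating while the other produces the next pulse.
SLOT_MS, DRIVE_MS = 100.0, 60.0

def build_schedule(num_pulses: int) -> List[ScheduleEntry]:
    """Precompute alternating start/stop times, as a stored lookup table might hold them."""
    entries = []
    for i in range(num_pulses):
        actuator = "actuator_112" if i % 2 == 0 else "actuator_114"
        start = i * SLOT_MS
        entries.append(ScheduleEntry(actuator, start, start + DRIVE_MS))
    return entries

for entry in build_schedule(4):
    print(f"{entry.actuator}: drive from {entry.start_ms:.0f} ms to {entry.stop_ms:.0f} ms")
```

At playback time a processor would only need to walk such a table, starting and stopping each command signal at the stored times, which matches the intent of offloading the scheduling decision to data prepared ahead of time.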

Abstract

Systems and methods for increasing the haptic bandwidth of an electronic device are disclosed. One disclosed embodiment of a system is an apparatus having a first actuator; a second actuator; and a processor coupled to the first and second actuators, the processor configured to apply a first command signal to the first actuator to output a first haptic effect from a first start time to a first stop time, the processor configured to apply a second command signal to the second actuator to output a second haptic effect from a second start time to a second stop time.

Description

SYSTEMS AND METHODS FOR INCREASING HAPTIC BANDWIDTH IN AN
ELECTRONIC DEVICE
CROSS-REFERENCES TO RELATED APPLICATION
This application claims priority to U.S. Provisional Patent Application No. 61/262,041, filed November 17, 2009, entitled "System and Method for Increasing Haptic Bandwidth in an Electronic Device," the entirety of which is hereby incorporated by reference.
FIELD OF THE INVENTION
The present disclosure relates generally to systems and methods for increasing haptic bandwidth in an electronic device.
BACKGROUND
With the increase in popularity of handheld devices, especially mobile phones having touch sensitive surfaces (i.e. touch screens), the physical tactile sensations that have traditionally been provided by mechanical buttons no longer apply in the realm of this new generation of devices. Tactile confirmation has generally been addressed, or at the very least substituted, through the use of programmable mechanical-click effects, typically produced by a single actuator such as a vibrating motor. Such conventional haptic effects include vibrations to indicate an incoming call or text message, or to indicate error conditions.
SUMMARY
Embodiments of the present invention provide systems and methods for increasing haptic bandwidth in an electronic device. For example, in one embodiment, a system includes an apparatus having a first actuator; a second actuator; and a processor coupled to the first and second actuators, the processor configured to apply a first command signal to the first actuator to output a first haptic effect from a first start time to a first stop time, the processor configured to apply a second command signal to the second actuator to output a second haptic effect from a second start time to a second stop time.
In one embodiment of a method, the method comprises receiving an interaction signal at a processor of an interaction occurring within a graphical environment, the interaction corresponding to a haptic effect; applying a first input signal to a first actuator to output a first haptic effect, wherein the first actuator outputs the first haptic effect beginning at a first time and terminating at a second time; and applying a second input signal to a second actuator to output a second haptic effect, wherein the second actuator outputs the second haptic effect beginning at a third time, wherein the third time occurs after the second time. In another embodiment, a computer-readable medium comprises program code for causing a processor to execute such a method.
These illustrative embodiments are mentioned not to limit or define the invention, but rather to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more examples of embodiments and, together with the description of example embodiments, serve to explain the principles and implementations of the embodiments.
Figure 1 shows a system for increasing haptic bandwidth in electronic devices according to an embodiment of the present invention;
Figures 2 and 3 illustrate an actuator's response to a pulsing signal at frequencies of 5 and 10 Hz, respectively;
Figure 4 illustrates a block diagram of an electronic device in accordance with an embodiment of the present invention;
Figure 5 illustrates a QWERTY keyboard having haptic areas in accordance with an embodiment of the present invention;
Figure 6 illustrates scheduled activation of multiple actuators in response to interaction of the QWERTY keyboard in Figure 5 in accordance with an embodiment of the present invention;
Figure 7 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth in an electronic device in accordance with an embodiment of the present invention; and
Figure 8 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth in an electronic device in accordance with an embodiment. DETAILED DESCRIPTION
Example embodiments are described herein in the context of systems and methods for increasing haptic bandwidth in an electronic device. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following description to refer to the same or like items.
In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another.
Illustrative System for Increasing Haptic Bandwidth in an Electronic Device
Referring now to Figure 1, Figure 1 shows a system 50 for increasing haptic bandwidth in an electronic device according to one illustrative embodiment of the present invention. In the embodiment shown in Figure 1, a cell phone 60 comprises a touch screen 66 and several actuators 70-76 for outputting various haptic effects to the cell phone 60. In this illustrative embodiment, two of the actuators 70, 72 are piezo-electric actuators and the other two actuators 74, 76 are rotary motors having an eccentric rotating mass (commonly referred to as an "ERM"). In addition to these components, the cell phone 60 also includes a processor 62, a memory 64, and a sensor 68.
During ordinary operation, the processor 62 executes software stored in memory 64 and displays graphical user interface (GUI) elements on the touch screen 66. A user interacts with the cell phone 60 by touching the touch screen 66 to select one or more GUI elements or by making gestures on the touch screen 66. The sensor 68 detects the various contacts with the touch screen 66 and provides sensor signals to the processor 62, which interprets the signals based on the position of GUI elements displayed on the touch screen 66 and any detected gestures. At some time during operation, the processor 62 may determine that one or more haptic effects are to be output to the cell phone 60 based on user inputs or on events occurring within the GUI or other applications executed by the processor 62, such as text messaging software. After determining one or more haptic effects to be output, the processor 62 selects one or more actuators 70-76 to use to output the haptic effects. In the embodiment shown in Figure 1 , memory 64 stores parametric information about each of the actuators, including frequency ranges, resonant frequencies, startup and stop times, power consumption, or physical coupling information, such as whether the actuator is coupled to the housing of the cell phone 60, the touch screen 66, or other parts of the cell phone 60, such as physical keys or buttons (not shown). Based on the actuator information, the processor 62 generates actuator signals for the haptic effects, selects the actuator or actuators to output the haptic effects, and transmits the actuator signals to the actuator(s) at the appropriate times to generate the desired haptic effects.
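The selection step can be pictured as a filter over stored actuator profiles. The sketch below is illustrative only: the profile fields, the actuator names, and the select_actuators helper are assumptions rather than anything specified by the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ActuatorProfile:
    """Hypothetical per-actuator parameters of the kind the memory 64 is said to store."""
    name: str
    kind: str            # e.g. "ERM" or "piezo"
    min_freq_hz: float   # minimum operational frequency
    max_freq_hz: float   # maximum operational frequency
    startup_ms: float    # time to reach the commanded magnitude
    stop_ms: float       # time to decelerate to an imperceptible level
    coupled_to: str      # e.g. "housing" or "touch_screen"

def select_actuators(profiles: List[ActuatorProfile],
                     effect_freq_hz: float, target: str) -> List[ActuatorProfile]:
    """Return the actuators whose stored parameters can render the requested effect."""
    return [p for p in profiles
            if p.min_freq_hz <= effect_freq_hz <= p.max_freq_hz and p.coupled_to == target]

profiles = [
    ActuatorProfile("erm_74", "ERM", 100, 300, 40, 120, "housing"),
    ActuatorProfile("erm_76", "ERM", 100, 300, 40, 120, "housing"),
    ActuatorProfile("piezo_70", "piezo", 100, 25_000, 1, 2, "touch_screen"),
    ActuatorProfile("piezo_72", "piezo", 100, 25_000, 1, 2, "touch_screen"),
]
# A 200 Hz key-press vibration goes to the housing-coupled ERMs; a 25 kHz
# texture effect goes to the screen-coupled piezoelectric actuators.
print([p.name for p in select_actuators(profiles, 200, "housing")])
print([p.name for p in select_actuators(profiles, 25_000, "touch_screen")])
```

The numeric values here are placeholders; the point is only that routing decisions fall out of a lookup over parameters stored per actuator, as the paragraph above describes.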
For example, if a user is typing on a virtual keyboard displayed on the touch screen 66, each key "pressed" by the user may result in a haptic effect. In this embodiment, the processor 62 determines that sharp, high-frequency haptic effects are needed for each key press. The processor 62 then determines that the ERM actuators 74, 76 should be used to output the haptic effects. For example, the processor 62 determines that the ERM actuators 74, 76 are capable of generating high-magnitude forces and are coupled to the housing of the cell phone 60 based on stored actuator profiles for each of the actuators 70-76. Further, the processor 62 determines that because key presses may occur in rapid succession, both ERM actuators 74, 76 should be used and should be alternated because the startup and stop characteristics of the ERM actuators 74, 76 may take too long to fully stop a haptic effect before the next haptic effect is to be output, i.e. the individual ERM actuators 74, 76 may have insufficient bandwidth to support haptic effects that occur as rapidly as keystrokes.
One way of defining the bandwidth of a vibrating motor actuator is the maximum pulse frequency that can be obtained from the actuator before the pulses output by the actuator begin to feel like a mushy, continuous vibration. For example, as shown in Figure 2, the pulses 10 are generated by a single vibrating actuator in response to a non-continuous or pulsed signal 20, whereby the pulsed signal 20 is at approximately 5 Hz. For the 5 Hz pulsing signal, the response or deceleration output by the actuator is such that the actuator is able to vibrate for some time and come to an almost complete stop (Point A) before it is instructed to accelerate again. Figure 3 illustrates the same actuator in which the pulsing signal is at a frequency of 10 Hz. As can be seen in Figure 3, the magnitude of pulse vibrations 30 output by the actuator is not able to approach a zero value before the pulsing signal 40 instructs it to begin accelerating again (see Point B in Figure 3). In other words, the actuator is unable to decelerate to a magnitude at which the haptic effect can no longer be felt before the actuator begins to accelerate toward the maximum magnitude again. This can lead to "mushy" haptic effects in which each effect is hard to distinguish from the next, which tends to degrade the user's experience. Thus, to increase haptic bandwidth, the illustrative system in Figure 1 employs multiple actuators and novel methods of actuating those actuators.
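To make the bandwidth limit concrete, the following minimal sketch estimates the highest pulse rate a single actuator can render as distinct pulses; the 50 ms stop time and 50% duty cycle are illustrative assumptions rather than measured values.

```python
def max_distinct_pulse_rate_hz(stop_time_s, duty_cycle=0.5):
    """Rough upper bound on the pulse rate one actuator can render distinctly:
    the off-portion of each period must be long enough for the actuator to
    decelerate to an imperceptible level before the next pulse begins."""
    return (1.0 - duty_cycle) / stop_time_s

# With an assumed 50 ms stop time, pulses above ~10 Hz begin to blur into a
# continuous, "mushy" vibration, consistent with the Figure 2 / Figure 3 contrast.
print(max_distinct_pulse_rate_hz(stop_time_s=0.050))   # 10.0
```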
After determining that keyboard presses should be generated by the ERM actuators
74, 76, the processor may determine that additional haptic effects are needed. For example, when the user presses the "send" button, the processor 62 determines that a haptic effect should be output to indicate that the send button was pressed. In this illustrative embodiment, the processor 62 determines that a texture haptic effect should be output in addition to a vibration effect. In such an embodiment, the processor 62 generates the vibration effects by sending signals alternately to one ERM actuator 74 and then the other ERM actuator 76, as will be described in greater detail below.
In addition, the processor 62 generates actuator signals with high frequencies (e.g. >20kHz) and determines that the ERM actuators are already in use, that the ERM actuators are not suitable for generating such high frequencies, and that the piezoelectric actuators 70, 72 are capable of generating the necessary frequencies. Further, the processor 62 determines, based on the stored actuator parameter information, that each piezoelectric actuator 70, 72 is configured to output a haptic effect in only one dimension and that the two piezoelectric actuators 70, 72 are oriented along orthogonal axes. Therefore, in this embodiment, the processor 62 determines that each of the piezoelectric actuators 70, 72 should be actuated to generate the texture effect. Thus, the processor 62 transmits high-frequency actuator signals to each of the piezoelectric actuators 70, 72 to generate a haptic effect to simulate a textured surface on the touch screen 66.
Such an illustrative embodiment provides increased haptic bandwidth by selectively actuating actuators 70-76 based on performance characteristics of the actuators stored within the cell phone's memory 64. Further, because a plurality of different actuators are provided, multiple effects may be output (or played) simultaneously, or may be output with high fidelity despite insufficient performance characteristics of one or more of the actuators 70-76 for the haptic effects to be output. For example, high-magnitude precise vibrations can be output at a rate greater than the peak bandwidth of one of the ERM actuators 74, 76 by outputting the vibrations alternately between the two ERM actuators 74, 76.
This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of systems and methods for increasing haptic bandwidth in an electronic device.
Referring now to Figure 4, Figure 4 illustrates a block diagram of an electronic device in accordance with an embodiment. In particular, Figure 4 illustrates an electronic device 100 having a body or housing 102, a processor 104 within the body 102 and coupled to a memory 106. The processor 104 is able to store information in and retrieve information from the memory 106. Such information may include, but is not limited to, actuator profiles, haptic effect profiles, haptic effect output time sequences, programmed voltages to send to the actuators, game data, software data, etc.
The electronic device 100 is shown with one or more optional touch screens, touch pads or other touch sensitive components 108 coupled to the processor 104. It should be noted that some embodiments of the present invention may not include a touch sensitive component 108. For instance, some embodiments of the present invention may be applied to other types of devices, such as a joystick, rotatable knob, stand-alone kiosk, computer mouse, virtual reality simulation, computer peripheral, smart phone, handheld computer, game peripheral, etc. However, for explanation purposes, the touch sensitive component 108 will be used to describe embodiments of systems and methods for increasing haptic bandwidth in an electronic device.
In addition, as shown in Figure 4, the device 100 includes a sensor 110 coupled to the touch screen 108 and processor 104, whereby the sensor 110 monitors the position, pressure, and/or movement of the user's finger(s), stylus or other input means during interaction with the touch sensitive component 108. The sensor 110 provides sensor signals to the processor 104 to indicate the pressure, position and/or movement of the user's input, whereby the processor 104 running the software program updates the display shown through the touch sensitive component 108 in response thereto. In an embodiment, the touch sensitive component 108 incorporates the sensor 110 therein as an integral component, and thus the sensor 110 is not a separate component. However, for purposes of discussion, the sensor 110 is referred to herein as a separate component. In addition, the electronic device 100 includes a plurality of actuators 112, 114, 116 within the body. It should be noted that although three actuators are shown in Figure 4, as few as two actuators or more than three actuators are also contemplated. In an embodiment, the actuators 112, 114, 116 are all mounted to the body 102 of the device 100 to impart a haptic effect thereto. In an embodiment, one or more of the actuators are mounted to the touch sensitive component 108 or other respective user input device to impart a localized haptic effect thereto. It is contemplated that one or more of the actuators may be mounted to the touch sensitive component 108 or other respective user input device while the remaining actuators are mounted to the body 102 or to one or more physical buttons (not shown). In an embodiment, at least one actuator is suspended within the body 102 and may be configured to impart haptic effects to the touch sensitive component and/or the body 102. The actuator may be designed to utilize a flexible or resilient material to amplify haptic effects produced therefrom. In an embodiment, one or more actuators are part of an external device or peripheral that is externally mounted to the body 102 to output haptic effects thereto.
In the embodiment shown, the actuators 112-116 are configured to output one or more haptic effects upon receiving an input command signal from the processor 104. The input command signal may be from an interaction which may occur between the user and a graphical object within a graphical environment run by a software program, whereby the software program may be run on the local processor or a host computer separate from the electronic device. The interaction may also be user independent in which the user's action does not cause the interaction (e.g. text message received, asteroid hitting the user's vehicle in a game). The interaction may, however, cause a haptic event to occur or may be the product of the user selecting a haptic area, both of which are discussed in more detail below.
The above mentioned actuators can be of various types including, but not limited to, eccentric rotational mass (ERM) actuators, linear resonant actuators (LRA), piezoelectric actuators, voice coil actuators, electro-active polymer (EAP) actuators, shape memory alloys, pager or DC motors, AC motors, moving magnet actuators, E-core actuators, smartgels, electrostatic actuators, electrotactile actuators, etc.
As stated above, the actuators 112-116 output their respective haptic effects in response to one or more haptic events occurring in the graphical environment. A haptic event, as referred to herein, is any interaction, action, collision, or other event which occurs during operation of the device and which can potentially have a haptic effect associated with it, which is then output to the user in the form of the haptic effect.
For instance, a haptic event may occur when a graphical vehicle the user is controlling experiences wind turbulence during game play, whereby an example haptic effect associated with that haptic event could be a vibration. Another example is that a haptic event may occur when a missile collides with the user's character in the game, whereby an example haptic effect associated with the haptic event is a jolt or pulse. Haptic events may not be associated with the game play, but nonetheless provide the user with important device information while the user is playing a game (e.g. receiving a text message, completion of a song download, battery level low, etc.).
As also mentioned above, the interaction may correlate with a graphical object of a graphical environment which the user interacts with on a display screen. For instance, a haptic effect may be output by the system in response to an interaction where the user selects a designated area in a graphical environment, hereafter referred to as a displayed haptic enabled area or just "haptic area." In an example, as shown in Figure 5, the boundaries of a displayed key of a keyboard may each be designated a haptic area. In Figure 5, the left boundary 202, right boundary 204, bottom boundary 206 and top boundary 208 of the "shift" key may each be designated a haptic area, whereby the processor 104 instructs the actuators to output respective haptic effects when the sensor 110 indicates that the user's finger or stylus is moving over one or more of the displayed boundaries. It is also contemplated that the area between the boundaries 202-208 within the "shift" key may be designated a haptic area. In some embodiments, haptic areas are designated when developing the software that is to be run on the device 100. In some embodiments, however, a user may be able to customize existing haptic areas or develop/designate new ones, such as via a Preferences or Options menu.
Referring again to Figure 4, the present system and method utilizes multiple actuators operating in successive order for the duration of time during which the interaction occurs. The staggered output of the multiple actuators increases the output bandwidth of the actuators at faster intervals and produces distinct, discrete haptic effects which are discernable to the user. In an embodiment, when a haptic event occurs (or a haptic area is selected), the processor 104 applies an input command signal with a designated voltage and current to the actuator 112 at a start time to cause the actuator 112 to accelerate to a maximum designated magnitude to output a corresponding haptic effect. Thereafter, the processor 104 terminates the input command signal at a stop time (such as based on programmed parameters of the haptic effect which are stored in memory), upon which the actuator 112 decelerates from the maximum magnitude to a stop. The processor 104 then applies a designated voltage and current to the second actuator 114 at a respective start time to cause the actuator 114 to accelerate to a maximum designated magnitude to output a corresponding haptic effect. Upon reaching a stop time of the input command signal for the second actuator 114, the processor 104 terminates the pulse signal to the actuator 114 to allow the second actuator 114 to decelerate from its maximum magnitude to a stop. The processor 104 then again sends the input command signal to the first actuator 112 to begin outputting a haptic effect, and so on.
In this embodiment, this process is repeated between the actuators 112, 114 to thus cause the actuators 112, 114 to alternately and successively output their respective haptic effects. In some embodiments, a particular actuator does not begin operating until the haptic effect output by the other actuator is at least at a magnitude and/or frequency that is not able to be discernibly felt by the user. In other embodiments, a particular actuator does not begin operating until the haptic effect output by the other actuator is at a zero magnitude and/or frequency.
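The alternating actuation just described may be sketched roughly as follows; the actuator objects, their start()/stop() methods, and the timing argument are assumed interfaces for illustration only.

```python
import itertools
import time

def play_alternating(actuators, pulse_on_s, interaction_active):
    """Cycle a pulsed effect across two (or more) actuators so that each one
    may decelerate while another produces the next pulse."""
    for actuator in itertools.cycle(actuators):
        if not interaction_active():
            break
        actuator.start()        # accelerate toward the designated magnitude
        time.sleep(pulse_on_s)  # hold until the scheduled stop time
        actuator.stop()         # decelerate; the next actuator takes the next pulse
```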
In an embodiment, the scheduling of the start and stop times of the input command signals for each of the actuators is predetermined and stored in the memory. This allows the processor to quickly retrieve the scheduling data and thus ease computational burdens when a haptic effect is to be output. The stored scheduling information may be in the form of a lookup table or other stored configuration in which the start and stop times for each actuator, in relation to the other actuators, are already established, whereby the processor 104 merely processes the stored information and accordingly activates the actuators based on the designated scheduling instructions. The scheduling instructions may be based on the type of actuators used (e.g. ERM, LRA, piezoelectric, etc.), the desired maximum and minimum magnitudes to be output by the actuators, voltages and frequencies at which the actuators will operate, the type of haptic effect to be output (e.g. vibration, pop, click, etc.), and the overall operating characteristics of the actuators (e.g. heavy or light actuators, etc.).
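Such stored scheduling information might, for example, take the form of the lookup table sketched below; the actuator identifiers and millisecond values are illustrative assumptions.

```python
# Hypothetical precomputed schedule: (actuator_id, start_ms, stop_ms) relative
# to the moment the haptic effect is triggered.
ALTERNATING_SCHEDULE = [
    (112, 0,   35),
    (114, 70,  105),
    (112, 140, 175),
    (114, 210, 245),
]

def run_schedule(schedule, send_command):
    """Replay a stored schedule; send_command(actuator_id, start_ms, stop_ms)
    is assumed to hand the command signal to the actuator driver."""
    for actuator_id, start_ms, stop_ms in schedule:
        send_command(actuator_id, start_ms, stop_ms)
```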
In an embodiment, the particular operating characteristics of the actuator 112 will be known to the processor 104, whereby the processor 104 is provided information on how long it takes the actuator 112 to accelerate from a stopped position to the desired magnitude and frequency based on the applied voltage and current. Further, the memory 106 may store information regarding how long it takes for the actuator 112 to decelerate from its maximum operating magnitude and frequency back to the stopped position. This is because, in one embodiment, the acceleration and deceleration time of the actuator 112, based on the type of current (i.e. AC vs. DC), is already known and is stored in the memory 106 as data or an instruction to be read by the processor and accordingly provided to the actuators. For example, in one embodiment, memory 106 comprises one or more actuator profiles associated with the actuators 112-116. In one embodiment, the actuator profiles comprise a plurality of parameters associated with the actuators, such as start-up time, stop time, minimum and maximum frequencies, maximum magnitudes, resonant frequencies, haptic effect types, axis(es) of operation, or power consumption. The processor 104 may then access the actuator profiles to determine which actuators, and how many actuators, to employ to generate one or more haptic effects.
Figure 6 is a graph illustrating the scheduled haptic effects output by two actuators in the system in accordance with an embodiment. As shown in Figure 6, the top graph 300 illustrates the pulsed haptic effect output by the first actuator 112 and the bottom graph 400 illustrates the pulsed haptic effect output by the second actuator 114, in which both graphs share a common time line. As shown in Figure 6, upon a haptic event occurring or a haptic area being determined, the processor 104 sends its command signal to the actuator 112 at time t0, at which the actuator 112 begins its operation. As shown in this embodiment, the input command signal is a square wave signal in which the processor 104 terminates its command signal at time ts, whereby time ts occurs before t1. In this embodiment, the processor determines time ts based on actuator parameters stored in memory. For example, in one embodiment, the processor 104 determines a percentage of the stop time for an actuator 112, 114 to determine a minimum amount of time to wait after an actuator signal has been terminated before a new signal may be begun.
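The percentage-of-stop-time determination mentioned above can be expressed as a simple helper; the 50% default is an illustrative assumption.

```python
def min_wait_after_stop_ms(stop_time_ms, fraction=0.5):
    """Minimum time to wait after terminating an actuator signal before the
    next signal may begin, taken as a fraction of the stored stop time."""
    return stop_time_ms * fraction

# An actuator with an assumed 60 ms stop time would not be re-driven for ~30 ms.
print(min_wait_after_stop_ms(60))   # 30.0
```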
In one embodiment, the processor 104 determines an amount of time to wait after an actuator signal has been terminated before beginning a haptic effect of the same type. For example, in one embodiment, a device may comprise multiple different types of actuators, such as ERM actuators, DC motors, piezoelectric actuators, LRAs, etc. In such an embodiment, a processor may simultaneously actuate multiple actuators to output different types of effects, such as textures, vibrations, and torques. In such an embodiment, a processor may cause texture effects to be output irrespective of the status of vibrational effects or torsional effects. In such an embodiment, the processor 104 may determine that no wait time is required, as a first haptic effect may be output substantially simultaneously with a second haptic effect without the two effects interfering with one another.
Around time ts, the actuator 112 decelerates to a magnitude such that no discernable haptic effect is felt by the user. In an embodiment, the actuator 112 decelerates to a zero magnitude around time ts. In some embodiments, different input command signals or actuator signals may be employed other than square waves. For example, actuator signals may be generated to accelerate or decelerate actuators to provide high-fidelity haptic effects such as is disclosed in U.S. Patent No. 7,639,232, filed November 30, 2005, entitled
"Systems and Methods for Controlling a Resonant Device for Generating Vibrotactile Haptic Effects," the entirety of which is hereby incorporated by reference.
At time t1, the processor 104 sends an input command signal to the actuator 114, at which the actuator 114 begins its operation and accelerates to a maximum magnitude. As shown in this embodiment, the command signal is a square wave signal in which the processor 104 terminates its command signal at time tm, whereby time tm occurs before t2. Around time tm, the actuator 114 has sufficiently decelerated so that the processor 104 determines that the next actuator may be actuated. For example, in this embodiment, the processor 104 determines a portion of the stop time stored as a parameter for actuator 114 in memory. In an embodiment, the actuator 114 comes to or near a complete stop around time tm. In some embodiments, the processor 104 delays a fixed amount of time before actuating the next actuator 112. Thereafter, the processor 104 then instructs actuator 112 to begin operation at time t2, and so on. This alternating pattern of output from multiple actuators can generate discrete haptic effects which are distinct and discernable when felt by the user, because the actuators are scheduled to operate in a staggered manner to provide the user with the feeling that the pulse from a prior haptic effect has sufficiently degenerated before a subsequent pulse is felt. Considering that in some embodiments a single actuator may not be able to achieve this result at frequencies around or greater than 10 Hz, the scheduling of multiple actuators is able to achieve such a result at such higher frequencies.
In another example, a QWERTY keyboard has keys approximately 6 millimeters wide, in which the processor 104 instructs a single actuator to output a haptic effect upon the sensor 110 indicating that the user's finger (or stylus) is positioned on one boundary of a particular key. In this example, the user's finger runs across a series of keys (in particular, keys "z" to "m") at a rate of 7 keys per second. At the rate of 7 keys per second, the actuators are required to output haptic effects for approximately 14 key boundaries every second, i.e. on the order of one key boundary every 70 ms (approximately 71.4 milliseconds per boundary). A single actuator tasked to output a vibration for each of the haptic areas may generate a continuous, or nearly continuous, vibration, and thus the user may not feel any distinction between key boundaries. This is because the single actuator does not have the time to stop completely before the next pulse is already being output.
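The arithmetic of this example, and the resulting decision to bring in a second actuator, can be checked with the short sketch below; the per-pulse drive time and stop time are illustrative assumptions.

```python
keys_per_second = 7
boundaries_per_key = 2                                          # left and right edges
boundaries_per_second = keys_per_second * boundaries_per_key    # 14
ms_per_boundary = 1000.0 / boundaries_per_second                # ~71.4 ms

pulse_on_ms = 25      # assumed time each pulse is driven
stop_ms = 60          # assumed time to decelerate to an imperceptible level
single_actuator_ok = ms_per_boundary >= (pulse_on_ms + stop_ms)
print(round(ms_per_boundary, 1), single_actuator_ok)            # 71.4 False -> alternate actuators
```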
To ensure proper triggering of the haptic effects as well as clear, distinct and discernable haptic effects at the key boundaries, multiple actuators are employed to successively output the haptic effects to provide this tactile information. As the sensor 110 detects the user's input over the left boundary 202 of the "shift" key (see Figure 5), the processor 104 applies a first command signal to the actuator 112. As the sensor 110 detects the user's input over the right boundary 204 of the "shift" key, the processor 104 applies a second command signal to actuator 114. Accordingly, as the sensor 110 detects the user's input over the left boundary of key "z", the processor 104 applies a third command signal to actuator 112. This alternating pattern between the multiple actuators 112, 114 produces definitive and distinct haptic effects which are able to be distinguished by the user.
It should be noted that a single actuator (such as actuator 112) may be used to output multiple haptic effects when the amount of time between triggering haptic events and/or haptic areas is longer than the amount of time needed for the actuator to come to a complete stop or at least decelerate to a magnitude that is not able to be felt by the user. However, in some embodiments, the processor 104 activates multiple actuators (e.g. 2, 3, or more) successively when the amount of time between triggering haptic events and/or haptic areas is less than the amount of time needed for the actuator to come to a complete stop or at least decelerate to a magnitude that is not able to be felt by the user. The amount of time needed is based on the operating parameters and type of actuators used, as well as the amount of current and voltage applied to the actuators.
The haptic effects that can be produced by the actuators vary depending on the current, voltage, frequency as well as start and stop times. Such haptic effects include, but are not limited to, vibrations, pulses, pops, clicks, damping characteristics, and varying textures. In an embodiment, the multiple actuators are utilized to generate different haptic effects for different applications. For example, the two actuators are configured to provide a vibration or pop upon the user's finger or stylus passing over the boundaries of a graphical object (e.g. keyboard keys), as discussed above. In addition, one or more actuators coupled to the touch sensitive component are activated when the user is detected within the boundaries to generate a texture-like haptic effect.
In an embodiment, the actuator is an eccentric rotating mass (ERM) actuator which is driven using a continuous DC voltage, whereby the ERM actuator is pulsed by the processor 104 to output the haptic effect and also achieve relatively short start and stop times at lower frequencies. However, when operating at higher frequencies (i.e. > 50Hz), the ERM actuator's response, especially the ability to accelerate and decelerate quickly enough to the desired magnitude, may be slower than needed to produce the distinct haptic effects described above. This is because, for a given constant DC driving voltage, the response of the actuator will be at a predetermined magnitude and frequency. In other words, increasing the magnitude of the DC driving voltage will proportionally result in an acceleration response with higher magnitude and higher acceleration. In the same vein, decreasing the magnitude of the DC driving voltage will proportionally result in a deceleration response with a lower magnitude and a lower deceleration.
For example, an ERM actuator may not be able to generate vibrations that are clear and distinct and have a magnitude of 0.4 Gpp at 120Hz upon the processor applying only a DC voltage to the actuator. Instead of driving the actuator only in DC mode, the processor 104 applies an AC signal to the actuator, whereby the actuator responds to the driving signal with an acceleration profile having the same frequency content as the input signal. This results in the ERM actuator having a considerably higher acceleration response than typical DC-driven ERM actuators. This technique of overdriving the actuators in an AC (bipolar) mode dramatically improves the bandwidth of the actuator in the frequency domain. The actuator is thus able to generate different vibration effects at specific magnitudes and accelerations by superimposing the AC and DC input signals.
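A minimal sketch of such an AC-overdriven drive signal follows; the voltage levels and sample rate are illustrative assumptions, while the 120 Hz target echoes the example above.

```python
import math

def overdriven_erm_drive(t_s, v_dc=1.5, v_ac=1.0, freq_hz=120.0):
    """Drive voltage at time t_s: a DC component keeps the mass spinning while
    a superimposed bipolar AC component imposes the 120 Hz acceleration content."""
    return v_dc + v_ac * math.sin(2.0 * math.pi * freq_hz * t_s)

# Sample one millisecond of the waveform at an assumed 8 kHz update rate.
samples = [overdriven_erm_drive(n / 8000.0) for n in range(8)]
```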
The main advantage of using multiple actuators in AC mode is that the overall system can achieve the principle of superposition. Applying two different input signals to the actuators, in which each input signal has different frequency and magnitude parameters, will result in a vibration effect having those frequencies and proportional magnitudes. A single actuator is not capable of generating this superposition effect because it was not originally meant to have as high a bandwidth as is obtained when driving it in AC mode. This superposition principle is important when generating high fidelity vibration feedback (textures, pops and vibrations at the same time). Although the actuators described above are ERM actuators, the actuators may also be linear resonant actuators (LRA). The LRA actuator is a DC motor with a resonant mass-spring system in which the mass is actuated linearly back and forth in one dimension. The device is capable of generating a high acceleration response at a specific frequency, for instance 175Hz; at other frequencies, however, the acceleration is close to 0 for the same input magnitude. If the magnitude of the input signal is increased in those areas where the response is weak, the resulting acceleration is strong enough to provide a good vibration effect at those specific frequencies and with a magnitude dependent on the magnitude of the driving signal. In some embodiments, other types of actuators may be employed. For example, smart gel actuators may be employed to provide textures or physical boundaries on the touch screen that correspond to objects shown by the touch screen, such as keys on a keyboard.
As discussed previously, some embodiments of the present invention may comprise a plurality of different types of actuators. For example, in one embodiment, actuators 112-116 may comprise ERM or LRA actuators and piezoelectric actuators. As noted previously, piezoelectric actuators may provide different types of haptic effects than ERM or LRA actuators. For example, piezoelectric actuators may provide low magnitude effects, but may have wide frequency ranges in which effects may be output. In some embodiments, piezoelectric actuators may be well-suited to applying haptic effects to a touch screen.
In one embodiment, memory 106 may comprise parameters associated with each of the actuators 112-116. In such an embodiment, memory 106 comprises parametric information about each of the actuators, such as minimum and maximum operational frequencies, minimum and maximum operational magnitudes, start-up and stop times, and axis(es) of operation. For example, in this embodiment, the ERM actuators have minimum and maximum operational frequencies of approximately 100 and 300 Hz, respectively, while the piezoelectric actuators have minimum and maximum operational frequencies of approximately 100 and 25,000 Hz, respectively.
In this embodiment, the processor 104 determines a vibrational haptic effect is to be output at approximately 200Hz and generates a first actuator signal configured to cause a vibration at 200 Hz. Based at least in part on the actuator parameter information, the processor selects one of the ERM actuators. The processor then transmits the first actuator signal to the selected ERM actuator. The processor also determines that a texture haptic effect is to be output at approximately 25,000 Hz and generates a second actuator signal configured to cause a vibration at 25,000 Hz. Based at least in part on the actuator parameter information, the processor selects one of the piezoelectric actuators. The processor then transmits the second actuator signal to the selected piezoelectric actuator. In this
embodiment, the two haptic effects may be output at approximately the same time. Thus, the actuator sequencing described above need not be performed. However, if multiple haptic effects are to be output in rapid succession, the processor 104 may output the first actuator signal alternately to the two ERM actuators according to embodiments of the present invention.
While the prior embodiment disclosed a combination of ERM and piezoelectric actuators, other combinations of actuators may be used. For example, in one embodiment a combination of ERM and LRA actuators may be used. For example, multiple ERM or LRA actuators of different sizes may be included to provide a range of vibrational frequencies, which may be actuated individually or simultaneously. In such an embodiment, memory 106 comprises parameters associated with each actuator, including minimum and maximum operational frequencies, minimum and maximum operational magnitudes, start-up and stop times, and axis(es) of operation. The parameters further comprise a resonant frequency associated with each actuator, if the respective actuator has such a characteristic. Thus, the processor 104 may select a suitable actuator or actuators to generate the desired haptic effects.
As discussed with respect to the embodiment with a combination of piezoelectric and
ERM actuators, the processor 104 selects the appropriate actuator or actuators based upon the haptic effect to be output and the parameters describing each of the actuators. In some embodiments, the processor 104 may further select an actuator based on the operational status of an actuator, such as whether the actuator is in use or is still stopping.
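Selection on the basis of operational status might be sketched as follows; the status-tracking class and its timing model are illustrative assumptions rather than part of the described embodiments.

```python
import time

class ActuatorStatus:
    """Track whether an actuator is in use or still decelerating after a stop."""
    def __init__(self, stop_time_s):
        self.stop_time_s = stop_time_s
        self.running = False
        self.last_stop = float("-inf")

    def is_available(self, now=None):
        now = time.monotonic() if now is None else now
        return (not self.running) and (now - self.last_stop) >= self.stop_time_s

def pick_free_actuator(capable_ids, status_by_id):
    """Among actuators whose parameters suit the effect, prefer one that is
    neither running nor still stopping; fall back to the first candidate."""
    for actuator_id in capable_ids:
        if status_by_id[actuator_id].is_available():
            return actuator_id
    return capable_ids[0] if capable_ids else None
```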
Referring now to Figure 7, Figure 7 illustrates a flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth of the actuators in an electronic device. In particular, in 502 the processor is provided with information as to whether a haptic event occurs (e.g. a collision in a video game) and/or a haptic area has been selected (e.g. user's finger or stylus moving over a boundary of a displayed key). This information may be provided from the sensor 110 and/or from the software run by the processor or a separate host computer. Upon the processor 104 being notified that a haptic effect is to be output, the processor 104 applies an input command signal to the first actuator at predetermined start and stop times, as in 504. Thereafter, the processor applies an input command signal to the second actuator at predetermined start and stop times, as in 506, whereby the start time of the second actuator does not occur until after the stop time of the input command signal to the first actuator. In some embodiments, this process repeats between the first and second actuators 112, 114 for a predetermined duration of time, as in 506.
The processor 104 confirms that the haptic event and/or haptic area is still activated, or in other words that the interaction is still occurring, when the predetermined duration of time has expired, as in 508. If the interaction which is causing the haptic effect is still occurring when the duration expires, the processor 104 continues to alternate between the actuators, as in 504. On the other hand, if the interaction is over when the duration ends, the processor 104 terminates the input command signal to the actuators, as in 510. It is contemplated that the processor 104 is informed if the interaction ceases prior to the expiration of the duration, whereby the processor 104 will prematurely terminate the input command signal to the actuators to end the outputted haptic effects.
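The flow of Figure 7 may be rendered in pseudocode roughly as below; the actuator interface and timing arguments are illustrative assumptions, and the numbered comments refer to the blocks of Figure 7.

```python
import time

def output_alternating_effect(actuator_a, actuator_b, pulse_s, duration_s,
                              interaction_active):
    """Alternate pulses between two actuators for a predetermined duration,
    then re-check whether the interaction is still occurring."""
    while interaction_active():                       # 502 / 508
        deadline = time.monotonic() + duration_s
        turn = 0
        while time.monotonic() < deadline and interaction_active():
            actuator = (actuator_a, actuator_b)[turn % 2]
            actuator.start()                          # 504 / 506: predetermined start time
            time.sleep(pulse_s)
            actuator.stop()                           # predetermined stop time
            turn += 1
    actuator_a.stop()                                 # 510: terminate command signals
    actuator_b.stop()
```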
Referring now to Figure 8, Figure 8 illustrates another flow chart directed to the method of outputting haptic effects to increase the haptic bandwidth of the actuators in an electronic device. It should be noted that the methods in Figures 7 and 8 can be combined completely or partially and the methods are not mutually exclusive. As shown in Figure 8, the processor determines whether two or more distinct haptic effects are to be output as a result of the action requiring the haptic effect, as shown at 602. If it is determined that fewer than two distinct haptic effects are to be produced, the processor 104 instructs only a single actuator to output the haptic effect, as in 604 in Figure 8. Again, this determination may be based on the sensor information and/or software instruction as well as the assigned haptic effect for the particular action. For example, a collision may occur on the display in which the designated haptic effect for the collision is a single vibration, whereby the processor 104 would instruct only one actuator to output the vibration. It should be noted that 604 is optional, as the processor may alternatively choose to have more than one actuator simultaneously, or in sequence, output the haptic effect.
However, if it is determined that two or more distinct haptic effects are to be produced based on the haptic event/haptic area, it is determined whether the multiple actuators would be operating at a frequency and magnitude such that the haptic effects would not be distinct and individually discernable to the user if only a single actuator were employed, as shown at 606 in Figure 8, or whether the two (or more) haptic effects are of different types such that different types of actuators should be used (e.g. ERM and piezoelectric). For example, based on the frequency and magnitude of the input command signal, if an ERM actuator would not be able to decelerate to a negligible magnitude, or for a sufficient percentage of its stop time as stored in an actuator profile within memory 106, before it is required to accelerate again to the maximum magnitude, the resulting haptic effects may feel mushy and indistinct, as described above. Accordingly, in such a case, the processor 104 would then send input command signals to multiple actuators, as in 608, whereby the command signals would selectively activate the actuators in an alternating manner, such as according to the embodiment shown in Figure 7, to output clear, distinct, and discernable haptic effects from the actuators. In contrast, if it is determined that the multiple haptic effects could be output by a single actuator based on the parameters describing the actuator and the characteristics of the haptic effects, the processor 104 generates input command signals based on the haptic effects and applies the input command signals to only one actuator to output the haptic effects. In some embodiments, the processor 104 makes these determinations in real time. However, in some embodiments, each of the assigned haptic effects, along with frequency, magnitude, start and stop time data, other parameters of the actuators, and instructions on whether single or multiple actuators are to be employed, are stored in the memory 106 such that the processor 104 can easily process the instructions and accordingly instruct which actuators to activate. Again, this determination may be based on the sensor information, actuator parameters stored in memory 106, and/or software instruction as well as the assigned haptic effect for the particular action. For example, a collision may occur on the display in which the designated haptic effect for the collision is a single vibration, whereby the processor 104 would instruct only one actuator to output the vibration.
In another embodiment, the processor 104 determines that multiple haptic effects are to be output by multiple actuators based on the type of haptic effects to be output. For example, at step 600, the processor 104 determines that a vibrational haptic effect is to be output based on a user contacting the edge of a key on a virtual keyboard and that a textured haptic effect should be output to simulate the feel of a keyboard. In such an embodiment, at step 602, the processor 104 determines that multiple haptic effects are to be output and the method proceeds to step 606.
In step 606, the processor determines which effects are to be output by which actuators by determining which actuators are capable of outputting the haptic effects. For example, in one embodiment, a texture effect may be output by outputting a vibration at a frequency of greater than approximately 20kHz and adjusting the magnitude of the vibration, such as by setting the magnitude of vibration as a percentage of the maximum magnitude or by modulating the magnitude according to a second signal, such as a sine wave or other periodic or non-periodic waveform. For example, in one embodiment, the magnitude of the vibration may be set to 0% outside of a haptic region and to 50% or 100% for contact within the haptic region. In one embodiment, a second or modulating signal may have a frequency of 10Hz such that the magnitude of the 20kHz vibration varies from 0 to 100% at a rate of 10Hz. In some embodiments, higher modulating frequencies may be used, such as 100Hz, 500Hz or 1000Hz, or other suitable frequencies. The processor 104 analyzes parameters stored in memory 106 that are associated with each actuator. Based on the parameters, the processor 104 determines that the ERM actuators are not capable of producing such effects. Therefore, the processor 104 determines that a piezoelectric actuator should be selected to output the texture effect.
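The texture drive described above, a high-frequency carrier whose magnitude is modulated at a low frequency and gated by the haptic region, might be sketched as follows; the sample-based formulation and default values are illustrative assumptions.

```python
import math

def texture_sample(t_s, in_haptic_region, carrier_hz=20000.0, mod_hz=10.0, depth=1.0):
    """One sample of a texture drive signal: a ~20 kHz carrier whose magnitude is
    swept from 0 to 100% at the modulating frequency, and forced to zero outside
    the haptic region."""
    if not in_haptic_region:
        return 0.0
    envelope = 0.5 * (1.0 + math.sin(2.0 * math.pi * mod_hz * t_s))   # 0..1 at 10 Hz
    return depth * envelope * math.sin(2.0 * math.pi * carrier_hz * t_s)
```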
Similarly, a vibration to indicate the edge of a key on a virtual keyboard may have a high-magnitude vibration frequency between approximately 100-300Hz, such as 200Hz. In such a case, the processor 104 selects an ERM actuator to output the haptic effects. The processor 104 may further determine that multiple vibrational effects are to be output and that multiple ERM actuators should be employed, such as by employing techniques described above. After determining which actuators are associated with each haptic effect, the method proceeds to step 608.
In step 608, the processor generates a first actuator signal configured to cause a vibration at a frequency of greater than approximately 20kHz to generate the texture haptic effect. The processor also generates a second actuator signal configured to cause a vibration at 200Hz. The processor then transmits the first actuator signal to the piezoelectric actuator and transmits the second actuator signal to the ERM actuator. In an embodiment, the processor 104 may alternately transmit the second actuator signal to multiple ERM actuators as described above.
While the methods and systems herein are described in terms of software executing on various machines, the methods and systems may also be implemented as specifically-configured hardware, such as a field-programmable gate array (FPGA) specifically configured to execute the various methods. For example, referring again to Figures 1 and 2, embodiments can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. In one embodiment, a computer may comprise a processor or processors. The processor comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs for editing an image. Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
The foregoing description of some embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, operation, or other characteristic described in connection with the embodiment may be included in at least one implementation of the invention. The invention is not restricted to the particular embodiments described as such. The appearance of the phrase "in one embodiment" or "in an embodiment" in various places in the specification does not necessarily refer to the same embodiment. Any particular feature, structure, operation, or other characteristic described in this specification in relation to "one embodiment" may be combined with other features, structures, operations, or other characteristics described in respect of any other embodiment.

Claims

CLAIMS That which is claimed is:
1. An apparatus comprising:
a first actuator;
a second actuator;
a processor coupled to the first and second actuators, the processor configured to apply a first command signal to the first actuator to output a first haptic effect from a first start time to a first stop time, the processor configured to apply a second command signal to the second actuator to output a second haptic effect from a second start time to a second stop time.
2. The apparatus of claim 1, wherein the first haptic effect and the second haptic effect are vibrations.
3. The apparatus of claim 1, wherein the first actuator is an eccentric rotating mass.
4. The apparatus of claim 1, wherein the second actuator is an eccentric rotating mass.
5. The apparatus of claim 1, wherein the first actuator is a linear resonating actuator.
6. The apparatus of claim 1, wherein the second actuator is a linear resonating actuator.
7. The apparatus of claim 1, further comprising:
a touch sensitive component coupled to the processor, the touch sensitive component configured to display a graphical object thereon; and
a sensor coupled to the touch sensitive component and the processor, the sensor configured to detect a position of a user input on the touch sensitive component, wherein the processor is configured to activate the first and second actuators upon the sensor indicating the user's input on a haptic enabled area in the graphical object.
8. The apparatus of claim 1, wherein the processor is configured to activate the first and second actuators upon a software program indicating a haptic event has occurred.
9. The apparatus of claim 1, further comprising a third actuator coupled to the processor, wherein the third actuator outputs a third haptic effect different than the first and second haptic effects.
10. The apparatus of claim 9, wherein the third actuator is coupled to a touch sensitive component, wherein the third haptic effect is a high-frequency vibration applied to the touch sensitive component to provide a texture effect or to reduce a friction force between the touch sensitive component and a user's input.
11. The apparatus of claim 1, wherein the processor applies an AC voltage to at least a portion of the first command signal to achieve a desired change in velocity from the first actuator.
12. The apparatus of claim 1, wherein the processor applies an AC voltage to at least a portion of the second command signal to achieve a desired change in velocity from the second actuator.
13. A method comprising:
receiving an interaction signal at a processor of an interaction occurring within a graphical environment, the interaction corresponding to a haptic effect;
applying a first input signal to a first actuator to output a first haptic effect, wherein the first actuator outputs the first haptic effect beginning at a first time and terminating at a second time; and
applying a second input signal to a second actuator to output a second haptic effect, wherein the second actuator outputs the second haptic effect beginning at a third time, wherein the third time occurs after the second time.
14. The method of claim 13, wherein the second haptic effect terminates at a fourth time, the method further comprising: applying the first input signal to the first actuator to output the first haptic effect beginning at a fifth time, wherein the fifth time occurs after the fourth time.
15. The method of claim 13, further comprising:
displaying the graphical environment via a touch sensitive component coupled to the processor;
detecting a selection of a haptic area in the graphical environment; and
sending the interaction signal corresponding to the selection of the haptic area to the processor.
16. The method of claim 13, further comprising outputting a third haptic effect via a third actuator upon receiving a corresponding input command signal from the processor, wherein the third haptic effect is different than the first and second haptic effects.
17. An electronic device comprising:
a body;
a processor within the body; and a plurality of actuators within the body and coupled to the processor, each actuator configured to output a corresponding haptic effect upon receiving a respective input signal from the processor,
wherein the processor is configured to:
receive an interaction signal indicating an interaction, the interaction corresponding to a haptic effect;
apply a first input signal to a first actuator of the plurality of actuators to output a first haptic effect, wherein the first actuator outputs the first haptic effect beginning at a first time and terminating at a second time; and
apply a second input signal to a second actuator of the plurality of actuators to output a second haptic effect, wherein the second actuator outputs the second haptic effect beginning at a third time, wherein the third time occurs after the second time.
18. The device of claim 17, further comprising:
a touch sensitive component coupled to the processor and the body, the touch sensitive component configured to display a graphical object thereon; and
a sensor coupled to the touch sensitive component and the processor, the sensor configured to detect a position of a user input on the touch sensitive component, wherein the processor is configured to activate at least the first and second actuators upon the sensor indicating the user's input on a haptic enabled area in the graphical object.
19. The device of claim 17, wherein the processor is configured to activate the first and second actuators upon a software program indicating a haptic event has occurred.
20. The device of claim 17, further comprising a third actuator coupled to the processor, wherein the third actuator outputs a third haptic effect different than the first and second haptic effects.
21. A system comprising:
a piezoelectric actuator;
a second actuator; and
a processor in communication with the piezoelectric actuator and the second actuator, the processor configured to:
generate a first actuator signal, the first actuator signal configured to cause a vibration at a frequency of greater than approximately 20kHz;
generate a second actuator signal, the second actuator signal configured to cause a vibration between approximately 100-300Hz; transmit the first actuator signal to the piezoelectric actuator; and
transmit the second actuator signal to the second actuator.
22. The system of claim 21, further comprising a computer-readable medium, the computer-readable medium configured to store first and second actuator information, the first actuator information comprising at least one parameter describing a characteristic of the first actuator, and the second actuator information comprising at least one parameter describing a characteristic of the second actuator.
23. The system of claim 22, wherein the processor is configured to:
receive a command;
determine a haptic effect based on the command;
select one of the piezoelectric actuator or the second actuator based at least in part on the haptic effect, the first actuator information, and the second actuator information;
if the piezoelectric actuator is selected, generate the first actuator signal and transmit the first actuator signal to the piezoelectric actuator,
if the second actuator is selected, generate the second actuator signal and transmit the second actuator signal to the second actuator.
24. The system of claim 21, wherein the processor is further configured to:
receive a command,
determine a haptic effect based at least in part on the command,
transmit the first actuator signal to the piezoelectric actuator if the haptic effect comprises a friction haptic effect, and
transmit the second actuator signal to the second actuator if the haptic effect comprises a vibrational haptic effect.
25. The system of claim 21, further comprising a touch-sensitive input device, and wherein the piezoelectric actuator is coupled to the touch-sensitive input device.
26. The system of claim 21, wherein the second actuator comprises one of an eccentric rotating mass, a linear resonant actuator, or a piezoelectric actuator.
27. The system of claim 21, further comprising a third actuator, the third actuator comprising a second piezoelectric actuator,
wherein the piezoelectric actuator is a first piezoelectric actuator and is configured to output haptic effects in a first direction, and
wherein the second piezoelectric actuator is configured to output haptic effects in a second direction, the second direction different from the first direction.
PCT/US2010/056829 2009-11-17 2010-11-16 Systems and methods for increasing haptic bandwidth in an electronic device WO2011062895A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP10787610.4A EP2502215B1 (en) 2009-11-17 2010-11-16 Systems and methods for increasing haptic bandwidth in an electronic device
KR1020127015581A KR101719507B1 (en) 2009-11-17 2010-11-16 Systems and methods for increasing haptic bandwidth in an electronic device
CN201080051254.6A CN102713793B (en) 2009-11-17 2010-11-16 For increasing the system and method for the haptic bandwidth in electronic equipment
JP2012539970A JP5668076B2 (en) 2009-11-17 2010-11-16 System and method for increasing haptic bandwidth in electronic devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26204109P 2009-11-17 2009-11-17
US61/262,041 2009-11-17

Publications (2)

Publication Number Publication Date
WO2011062895A2 true WO2011062895A2 (en) 2011-05-26
WO2011062895A3 WO2011062895A3 (en) 2011-12-15

Family

ID=43969405

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/056829 WO2011062895A2 (en) 2009-11-17 2010-11-16 Systems and methods for increasing haptic bandwidth in an electronic device

Country Status (6)

Country Link
US (1) US20110115709A1 (en)
EP (1) EP2502215B1 (en)
JP (1) JP5668076B2 (en)
KR (1) KR101719507B1 (en)
CN (1) CN102713793B (en)
WO (1) WO2011062895A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013206019A (en) * 2012-03-28 2013-10-07 Kyocera Corp Portable terminal equipment and control method
JP2014102819A (en) * 2012-11-20 2014-06-05 Immersion Corp Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
JP2014203457A (en) * 2013-04-01 2014-10-27 レノボ・シンガポール・プライベート・リミテッド Touch input device haptic feedback
US9690422B2 (en) 2014-01-30 2017-06-27 Kyocera Document Solutions Inc. Touch panel apparatus and touch panel control method

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8294600B2 (en) * 2008-02-15 2012-10-23 Cody George Peterson Keyboard adaptive haptic response
US10007340B2 (en) 2009-03-12 2018-06-26 Immersion Corporation Systems and methods for interfaces featuring surface-based haptic effects
US9927873B2 (en) * 2009-03-12 2018-03-27 Immersion Corporation Systems and methods for using textures in graphical user interface widgets
US10564721B2 (en) * 2009-03-12 2020-02-18 Immersion Corporation Systems and methods for using multiple actuators to realize textures
US9746923B2 (en) 2009-03-12 2017-08-29 Immersion Corporation Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
US9874935B2 (en) * 2009-03-12 2018-01-23 Immersion Corporation Systems and methods for a texture engine
US9696803B2 (en) * 2009-03-12 2017-07-04 Immersion Corporation Systems and methods for friction displays and additional haptic effects
US9965094B2 (en) 2011-01-24 2018-05-08 Microsoft Technology Licensing, Llc Contact geometry tests
US8988087B2 (en) 2011-01-24 2015-03-24 Microsoft Technology Licensing, Llc Touchscreen testing
US9448626B2 (en) 2011-02-11 2016-09-20 Immersion Corporation Sound to haptic effect conversion system using amplitude value
US8717152B2 (en) 2011-02-11 2014-05-06 Immersion Corporation Sound to haptic effect conversion system using waveform
US8982061B2 (en) 2011-02-12 2015-03-17 Microsoft Technology Licensing, Llc Angular contact geometry
US9542092B2 (en) 2011-02-12 2017-01-10 Microsoft Technology Licensing, Llc Prediction-based touch contact tracking
US8773377B2 (en) 2011-03-04 2014-07-08 Microsoft Corporation Multi-pass touch contact tracking
US8913019B2 (en) 2011-07-14 2014-12-16 Microsoft Corporation Multi-finger detection and component resolution
US9378389B2 (en) 2011-09-09 2016-06-28 Microsoft Technology Licensing, Llc Shared item account selection
US9785281B2 (en) 2011-11-09 2017-10-10 Microsoft Technology Licensing, Llc. Acoustic touch sensitive testing
US20130227411A1 (en) * 2011-12-07 2013-08-29 Qualcomm Incorporated Sensation enhanced messaging
US8914254B2 (en) 2012-01-31 2014-12-16 Microsoft Corporation Latency measurement
KR101181505B1 * 2012-02-28 2012-09-10 Korea Advanced Institute of Science and Technology Haptic interface having asymmetric reflecting points
US9715276B2 (en) 2012-04-04 2017-07-25 Immersion Corporation Sound to haptic effect conversion system using multiple actuators
US9368005B2 (en) * 2012-08-31 2016-06-14 Immersion Corporation Sound to haptic effect conversion system using mapping
US9092059B2 (en) 2012-10-26 2015-07-28 Immersion Corporation Stream-independent sound to haptic effect conversion system
EP2743800B1 (en) * 2012-12-13 2019-02-20 Immersion Corporation Haptic system with increased LRA bandwidth
JP5781495B2 * 2012-12-28 2015-09-24 Kyocera Document Solutions Inc. Touch panel device
US9489047B2 (en) * 2013-03-01 2016-11-08 Immersion Corporation Haptic device with linear resonant actuator
US9189098B2 (en) * 2013-03-14 2015-11-17 Immersion Corporation Systems and methods for syncing haptic feedback calls
JP6093659B2 * 2013-06-24 2017-03-08 Sharp Corporation Information processing apparatus and information processing program
US11229239B2 (en) * 2013-07-19 2022-01-25 Rai Strategic Holdings, Inc. Electronic smoking article with haptic feedback
US9317120B2 (en) * 2013-09-06 2016-04-19 Immersion Corporation Multiplexing and demultiplexing haptic signals
US9898085B2 (en) * 2013-09-06 2018-02-20 Immersion Corporation Haptic conversion system using segmenting and combining
US10599218B2 (en) 2013-09-06 2020-03-24 Immersion Corporation Haptic conversion system using frequency shifting
JP6172284B2 * 2013-09-26 2017-08-02 Fujitsu Limited Drive control apparatus, electronic device, and drive control method
MX338463B (en) * 2013-09-26 2016-04-15 Fujitsu Ltd Drive control apparatus, electronic device, and drive control method.
FR3015108B1 * 2013-12-13 2019-05-31 Dav Control of actuators of a touch-sensitive control surface with haptic feedback
US20150205352A1 (en) * 2013-12-29 2015-07-23 Immersion Corporation Distributed control architecture for haptic devices
US10067566B2 (en) 2014-03-19 2018-09-04 Immersion Corporation Systems and methods for a shared haptic experience
US9645646B2 (en) 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
JP5956525B2 * 2014-10-01 2016-07-27 Lenovo (Singapore) Pte. Ltd. Input device
US10331214B2 (en) * 2015-09-08 2019-06-25 Sony Corporation Information processing device, method, and computer program
CN105446646B * 2015-12-11 2019-01-11 Xiaomi Technology Co., Ltd. Content input method, device and touch control device based on virtual keyboard
WO2017191639A1 (en) * 2016-05-05 2017-11-09 ContinUse Biometrics Ltd. System and method for use in tissue monitoring and analysis
CN105897120B * 2016-05-19 2018-09-21 AAC Technologies Pte. Ltd. Signal generation method for accurately controlling a motor
JP6992045B2 2016-07-22 2022-01-13 Harman International Industries, Incorporated Tactile guidance system
US10304298B2 (en) * 2016-07-27 2019-05-28 Immersion Corporation Braking characteristic detection system for haptic actuator
CN106293089B * 2016-08-15 2023-04-25 Nanjing University of Information Science and Technology Vibration sensing device and working method based on same
US11500538B2 (en) * 2016-09-13 2022-11-15 Apple Inc. Keyless keyboard with force sensing and haptic feedback
CN110520040B * 2017-01-27 2022-05-31 Northwestern University Epidermal virtual reality device
FR3064504B1 * 2017-03-31 2022-02-04 Commissariat Energie Atomique Interface offering localized friction modulation by acoustic lubrication
CN117270637A 2017-07-26 2023-12-22 Apple Inc. Computer with keyboard
CN108325806B * 2017-12-29 2020-08-21 AAC Technologies Pte. Ltd. Vibration signal generation method and device
CN111566595A * 2018-01-09 2020-08-21 Sony Corporation Information processing apparatus, information processing method, and program
JP7146425B2 * 2018-03-19 2022-10-04 Sony Group Corporation Information processing device, information processing method, and recording medium
JP7138024B2 2018-11-28 2022-09-15 Kyocera Corporation Electronic device
US11537209B2 (en) * 2019-12-17 2022-12-27 Activision Publishing, Inc. Systems and methods for guiding actors using a motion capture reference system
CN111399645B * 2020-03-13 2023-07-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Wearable device, tactile feedback method, device and storage medium
WO2024008507A1 (en) * 2022-07-07 2024-01-11 Jt International Sa An aerosol generating device

Family Cites Families (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
WO1997020305A1 (en) * 1995-11-30 1997-06-05 Virtual Technologies, Inc. Tactile feedback man-machine interface device
US5703624A (en) * 1996-02-09 1997-12-30 Van Kruistum; Timothy Portable image viewer
US6046527A (en) * 1996-07-05 2000-04-04 Honeybee Robotics, Inc. Ultrasonic positioner with multiple degrees of freedom of movement
US7815436B2 (en) * 1996-09-04 2010-10-19 Immersion Corporation Surgical simulation interface device and method
US6429846B2 (en) * 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6693622B1 (en) * 1999-07-01 2004-02-17 Immersion Corporation Vibrotactile haptic feedback devices
US6822635B2 (en) * 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
AU2001293056A1 (en) * 2000-09-25 2002-04-08 Motorwiz, Inc. Model-based machine diagnostics and prognostics using theory of noise and communications
CN100468294C * 2000-09-28 2009-03-11 Immersion Corporation Directional tactile feedback for haptic feedback interface devices
AU2001294852A1 (en) * 2000-09-28 2002-04-08 Immersion Corporation Directional tactile feedback for haptic feedback interface devices
JP4061105B2 * 2002-03-29 2008-03-12 Alps Electric Co., Ltd. Haptic device
US7161580B2 (en) * 2002-04-25 2007-01-09 Immersion Corporation Haptic feedback using rotary harmonic moving mass
AU2003297716A1 (en) * 2002-12-08 2004-06-30 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
CA2422265A1 (en) * 2003-03-14 2004-09-14 Handshake Interactive Technologies Inc. A method and system for providing haptic effects
JP4478436B2 * 2003-11-17 2010-06-09 Sony Corporation Input device, information processing device, remote control device, and input device control method
US7112737B2 (en) * 2003-12-31 2006-09-26 Immersion Corporation System and method for providing a haptic effect to a musical instrument
US20060209037A1 (en) * 2004-03-15 2006-09-21 David Wang Method and system for providing haptic effects
JP4046095B2 * 2004-03-26 2008-02-13 Sony Corporation Input device with tactile function, information input method, and electronic device
US9046922B2 (en) * 2004-09-20 2015-06-02 Immersion Corporation Products and processes for providing multimodal feedback in a user interface device
JP4756916B2 * 2005-05-31 2011-08-24 Canon Inc. Vibration wave motor
JP5275025B2 * 2005-06-27 2013-08-28 Coactive Drive Corporation Synchronous vibrator for tactile feedback
US8780053B2 (en) * 2007-03-21 2014-07-15 Northwestern University Vibrating substrate for haptic interface
US8405618B2 (en) * 2006-03-24 2013-03-26 Northwestern University Haptic device with indirect haptic feedback
US8210942B2 (en) * 2006-03-31 2012-07-03 Wms Gaming Inc. Portable wagering game with vibrational cues and feedback mechanism
US10152124B2 (en) * 2006-04-06 2018-12-11 Immersion Corporation Systems and methods for enhanced haptic effects
US20070236474A1 (en) * 2006-04-10 2007-10-11 Immersion Corporation Touch Panel with a Haptically Generated Reference Key
US8174512B2 (en) * 2006-06-02 2012-05-08 Immersion Corporation Hybrid haptic device utilizing mechanical and programmable haptic effects
JP2007331066A (en) * 2006-06-15 2007-12-27 Canon Inc Contact presenting device and method
US20080084384A1 (en) * 2006-10-05 2008-04-10 Immersion Corporation Multiple Mode Haptic Feedback System
US7468573B2 (en) * 2006-10-30 2008-12-23 Motorola, Inc. Method of providing tactile feedback
US7626579B2 (en) * 2006-11-01 2009-12-01 Immersion Corporation Sanitizing a touch panel surface
US8120585B2 (en) * 2006-11-16 2012-02-21 Nokia Corporation Method, apparatus, and computer program product providing vibration control interface
US20090102805A1 (en) * 2007-10-18 2009-04-23 Microsoft Corporation Three-dimensional object simulation using audio, visual, and tactile feedback
KR100954529B1 * 2007-11-27 2010-04-23 Korea Institute of Science and Technology A ring-type piezoelectric ultrasonic resonator and a piezoelectric ultrasonic rotary motor using the same
US20090207129A1 (en) * 2008-02-15 2009-08-20 Immersion Corporation Providing Haptic Feedback To User-Operated Switch
NL2003141A1 (en) * 2008-07-30 2010-02-02 Asml Holding Nv Actuator system using multiple piezoelectric actuators.
US8749495B2 (en) * 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7639232B2 (en) 2004-11-30 2009-12-29 Immersion Corporation Systems and methods for controlling a resonant device for generating vibrotactile haptic effects

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013206019A (en) * 2012-03-28 2013-10-07 Kyocera Corp Portable terminal equipment and control method
JP2014102819A (en) * 2012-11-20 2014-06-05 Immersion Corp Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
US10078384B2 (en) 2012-11-20 2018-09-18 Immersion Corporation Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
JP2014203457A * 2013-04-01 2014-10-27 Lenovo (Singapore) Pte. Ltd. Touch input device haptic feedback
US9690422B2 (en) 2014-01-30 2017-06-27 Kyocera Document Solutions Inc. Touch panel apparatus and touch panel control method

Also Published As

Publication number Publication date
CN102713793A (en) 2012-10-03
EP2502215B1 (en) 2020-06-03
EP2502215A2 (en) 2012-09-26
CN102713793B (en) 2016-08-31
JP5668076B2 (en) 2015-02-12
KR101719507B1 (en) 2017-03-24
KR20120116935A (en) 2012-10-23
JP2013511108A (en) 2013-03-28
US20110115709A1 (en) 2011-05-19
WO2011062895A3 (en) 2011-12-15

Similar Documents

Publication Publication Date Title
EP2502215B1 (en) Systems and methods for increasing haptic bandwidth in an electronic device
JP6251716B2 (en) System and method for pre-touch and true touch
EP2264572B1 (en) Method and apparatus for generating haptic feedback and an actuator
EP1748350B1 (en) Touch device and method for providing tactile feedback
CN104020844B (en) Haptic apparatus with linear resonance actuator
JP6283622B2 (en) Virtual detent mechanism by vibrotactile feedback
US20170336871A1 (en) User Interface Impact Actuator
KR101618665B1 (en) Multi-touch device having dynamichaptic effects
CN113711163A (en) Method and apparatus for providing haptic output signals to haptic actuators
JP5606462B2 (en) System and method using multiple actuators to achieve texture
EP2264562A2 (en) Method and apparatus for generating haptic feedback and a haptic interface
JP2018032417A (en) System and method using multiple actuators to realize textures
EP2339427A2 (en) Method and apparatus for generating vibrations in portable terminal
KR20090078342A (en) Multiple mode haptic feedback system
CN103324305A (en) Eccentric rotating mass actuator optimization for haptic effects
JP6731866B2 (en) Control device, input system and control method
US10656716B2 (en) Control device, input system, and control method
WO2020258074A1 (en) Method and device for generating haptic feedback
WO2018051668A1 (en) Tactile sensation presentation device

Legal Events

Date Code Title Description
WWE WIPO information: entry into national phase
Ref document number: 201080051254.6
Country of ref document: CN
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 10787610
Country of ref document: EP
Kind code of ref document: A1
WWE WIPO information: entry into national phase
Ref document number: 2010787610
Country of ref document: EP
WWE WIPO information: entry into national phase
Ref document number: 2012539970
Country of ref document: JP
NENP Non-entry into the national phase
Ref country code: DE
ENP Entry into the national phase
Ref document number: 20127015581
Country of ref document: KR
Kind code of ref document: A