EP2406704A1 - Systems and methods for a texture engine

Systems and methods for a texture engine

Info

Publication number
EP2406704A1
Authority
EP
European Patent Office
Prior art keywords
haptic effect
processor
haptic
signal
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP10712202A
Other languages
German (de)
French (fr)
Inventor
Juan Manuel Cruz-Hernandez
Danny A. Grant
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/697,037 external-priority patent/US9927873B2/en
Priority claimed from US12/697,042 external-priority patent/US10564721B2/en
Priority claimed from US12/696,900 external-priority patent/US9696803B2/en
Priority claimed from US12/696,893 external-priority patent/US9746923B2/en
Priority claimed from US12/696,908 external-priority patent/US10007340B2/en
Priority claimed from US12/697,010 external-priority patent/US9874935B2/en
Application filed by Immersion Corp filed Critical Immersion Corp
Publication of EP2406704A1 publication Critical patent/EP2406704A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B06GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS IN GENERAL
    • B06BMETHODS OR APPARATUS FOR GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS OF INFRASONIC, SONIC, OR ULTRASONIC FREQUENCY, e.g. FOR PERFORMING MECHANICAL WORK IN GENERAL
    • B06B1/00Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency
    • B06B1/02Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy
    • B06B1/06Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy operating with piezoelectric effect or with electrostriction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1601Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/04Texture mapping
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/02Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners ; Linear motors
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/02Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners ; Linear motors
    • H02N2/06Drive circuits; Control arrangements or methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00Tactile signalling systems, e.g. personal calling systems

Definitions

  • the present invention generally relates to haptic feedback and more particularly to systems and methods for a texture engine.
  • a system for a texture engine comprises: a processor configured to receive a display signal comprising a plurality of pixels, determine a haptic effect comprising a texture, and transmit a haptic signal associated with the haptic effect to an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.
  • Figure 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention.
  • Figure 2 is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • Figure 3a is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • Figure 3b is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • Figure 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention.
  • Figure 5a is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5b is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5c is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5d is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5e is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5f is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5g is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Figure 5h is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • Embodiments of the present invention provide systems and methods for a texture engine.
  • One illustrative embodiment of the present invention comprises a messaging device, such as a mobile phone.
  • the messaging device comprises the Samsung Haptic Phone (SCH-W420) equipped with Immersion Corporation's TouchSense® 3000, TouchSense® 4000, or TouchSense® 5000 vibrotactile feedback systems, formerly known as Immersion Corporation's VibeTonz® vibrotactile feedback system.
  • different messaging devices and haptic feedback systems may be utilized.
  • the illustrative messaging device comprises a display, a speaker, a network interface, a memory, and a processor in communication with each of these elements.
  • the illustrative messaging device also comprises a touch-sensitive interface and an actuator, both of which are in communication with the processor.
  • the touch-sensitive interface is configured to sense a user's interaction with the messaging device, and the actuator is configured to output a haptic effect.
  • the illustrative messaging device may further comprise a manipulandum configured to detect a user interaction and transmit an interface signal associated with the user interaction to the processor.
  • the display is configured to display a graphical user interface to the user.
  • the graphical user interface may comprise virtual objects, for example icons, buttons, or a virtual keyboard.
  • the illustrative messaging device further comprises a touch-sensitive interface, such as a touch-screen, mounted overtop of the display.
  • the touch-sensitive interface allows the user to interact with the virtual objects displayed in the graphical user interface.
  • the graphical user interface may comprise a virtual keyboard, and in such an embodiment, the touch-sensitive interface allows the user to touch a key on the virtual keyboard to input the alphanumeric character associated with the key. This functionality may be used to type messages, or otherwise interact with objects in the graphical user interface.
  • the processor is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to an actuator configured to output the haptic effect.
  • this haptic effect simulates a texture that the user feels on the surface of the touch-sensitive interface.
  • the simulated texture may be associated with the user interface shown on the display.
  • the display may show an icon comprising the shape of a rock.
  • the processor may determine a haptic effect configured to simulate the texture of the rock on the surface of the touch-sensitive interface. Then, the processor will transmit a haptic signal to an actuator configured to output the haptic effect.
  • when the actuator receives the haptic signal, it will output a haptic effect, such as a vibration, at a frequency configured to cause the surface of the touch-sensitive interface to approximate the texture of the rock.
  • the processor may implement a haptic map to determine the haptic effect.
  • the processor may receive a display signal comprising a plurality of pixels, each of the pixels associated with a color.
  • each pixel in the display signal may be associated with the color red, green, or blue, and it may further be associated with an intensity for each color.
  • the processor will assign a haptic value to each color and further assign a haptic intensity associated with the intensity of each color. Then, the processor will transmit a haptic signal comprising the haptic values and haptic intensities to an actuator configured to output the haptic effect.
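  • purely as an illustrative sketch of this haptic-map step (the per-color values, the intensity scale, and all names below are assumptions rather than the claimed implementation):

      # Hypothetical haptic map: each color carries a haptic value (here, a
      # drive frequency in Hz), and the haptic intensity simply follows the
      # color intensity. Names and numbers are illustrative assumptions.
      HAPTIC_VALUE = {"red": 120.0, "green": 180.0, "blue": 240.0}

      def haptic_signal_from_display(pixels):
          """Map (color, color intensity) pixels to (haptic value, haptic intensity)."""
          return [(HAPTIC_VALUE[color], intensity) for color, intensity in pixels]

      # Example: a two-pixel display signal.
      print(haptic_signal_from_display([("red", 5), ("green", 2)]))
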
  • the processor may further determine the haptic effect based on an external trigger.
  • the processor is configured to receive an interface signal from a touch-sensitive interface configured to detect a user interaction. Then, in the illustrative embodiment, the processor will determine the haptic effect based at least in part on the interface signal. For example, the processor may modify the haptic value or haptic intensity based at least in part on the interface signal. In the illustrative embodiment, if the touch-sensitive interface detects a high speed or high pressure user interaction, the processor will determine a higher intensity haptic effect.
  • the illustrative messaging device may output a haptic effect for a multitude of purposes.
  • the haptic effect may act as a confirmation that the processor has received an interface signal associated with a user interaction.
  • the graphical user interface may comprise a button, and the touch-sensitive interface may detect user interaction associated with pressing the button and transmit an interface signal to the processor.
  • the processor may determine a haptic effect to confirm receiving the interface signal.
  • the haptic effect may cause the user to feel a texture on the surface of the touch-sensitive interface.
  • the processor may further determine haptic effects for other purposes.
  • the illustrative messaging device may output a texture to alert the user to boundaries on the display or as an identification for objects such as icons on the surface of the display.
  • Figure 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention.
  • the system 100 comprises a messaging device 102, such as a mobile phone, portable digital assistant (PDA), portable media player, portable computer, portable gaming device, or some other mobile device.
  • messaging device 102 may comprise a laptop, tablet, desktop PC, or other similar device.
  • the messaging device may comprise an external monitor for use with a PC or some other device.
  • the messaging device 102 comprises a processor 110 in communication with a network interface 112, a touch-sensitive interface 114, a display 116, an actuator 118, a speaker 120, and a memory 122.
  • the processor 110 is configured to execute computer-executable program instructions stored in memory 122.
  • processor 110 may execute one or more computer programs for messaging or for generating haptic feedback.
  • Processor 110 may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), or state machines.
  • Processor 110 may further comprise a programmable electronic device such as a programmable logic controller (PLC), a programmable interrupt controller (PIC), a programmable logic device (PLD), a programmable read-only memory (PROM), an electronically programmable read-only memory (EPROM or EEPROM), or other similar devices.
  • Memory 122 comprises a computer-readable medium that stores instructions, which when executed by processor 110, cause processor 110 to perform various steps, such as those described herein.
  • Embodiments of computer-readable media may comprise, but are not limited to, electronic, optical, magnetic, or other storage or transmission devices capable of providing processor 110 with computer-readable instructions.
  • examples of such media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
  • various other devices may include computer-readable media such as a router, private or public network, or other transmission devices.
  • the processor 110 and the processing described may be in one or more structures, and may be dispersed throughout one or more structures.
  • the processor 110 is in communication with the network interface 112.
  • the network interface 112 may comprise one or more methods of mobile communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other variations, the network interface 112 comprises a wired network interface, such as Ethernet.
  • the messaging device 102 can be configured to exchange messages or virtual message objects with other devices (not shown) over networks such as a cellular network and/or the Internet. Embodiments of messages exchanged between devices may comprise voice messages, text messages, data messages, or other forms of digital messages.
  • the processor 110 is also in communication with one or more touch-sensitive interfaces 114. In some embodiments, touch-sensitive interface 114 may comprise a touchscreen or a touch-pad.
  • touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user.
  • touch-sensitive interface 114 may comprise an optical sensor or another type of sensor.
  • the touch-sensitive interface may comprise an LED detector.
  • touch-sensitive interface 114 may comprise an LED finger detector mounted on the side of display 116.
  • the processor is in communication with a single touch-sensitive interface 114; in other embodiments, the processor is in communication with a plurality of touch-sensitive interfaces, for example, a first touch-screen and a second touch-screen.
  • the touch-sensitive interface 114 is configured to detect user interaction, and based on the user interaction, transmit signals to processor 110.
  • touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction, and incorporate this information into the interface signal.
  • the processor 110 is also in communication with a display 116.
  • the processor 110 can be configured to generate a graphical representation of a user interface to be shown on display 116, then transmit a display signal comprising the graphical representation to display 116.
  • display 116 is configured to receive a display signal from another device.
  • display 116 may comprise an external display, such as a computer monitor.
  • Display 116 is configured to receive a display signal and output an image associated with that display signal.
  • the display signal may comprise a VGA, HDMI, SVGA, video, S-video, or other type of display signal known in the art.
  • display 116 comprises a flat screen display, such as a Liquid Crystal Display (LCD) or Plasma Screen Display.
  • display 116 comprises a Cathode Ray Tube (CRT) or other type of display known in the art.
  • display 116 may comprise touch-sensitive interface 114, for example, display 116 may comprise a touch- screen LCD.
  • display 116 may comprise a flexible screen or flexible display.
  • display 116 may comprise a haptic substrate mounted underneath its surface.
  • display 116 is made of a flexible material, and in response to signals received from processor 110, the haptic substrate flexes, forming ridges, troughs, or other features on the surface of display 116.
  • the haptic substrate may comprise a plasma actuator, a piezoelectric actuator, an electro-active polymer, a micro-electro-mechanical system, a shape memory alloy, a grid of fluid or gas-filled cells.
  • processor 110 receives signals from touch-sensitive interface 114 that are associated with an interaction with the graphical user interface shown on display 116.
  • touch-sensitive interface 114 may comprise a touch-screen and a graphical user interface on display 116 may comprise a virtual keyboard.
  • when the user interacts with a section of the touch-screen that overlays one of the keys of the virtual keyboard, the touch-screen will send an interface signal to processor 110 corresponding to that user interaction. Based on the interface signal, processor 110 will determine that the user pressed one of the keys on the virtual keyboard.
  • This functionality allows the user to interact with other icons and virtual objects on the display 116. For example, in some embodiments the user may flick the touch-screen to move a virtual ball or turn a virtual knob.
  • processor 110 is also in communication with an actuation system comprising one or more actuators 118, a suspension system for each actuator, and electrical power and control wiring for each actuator.
  • messaging device 102 comprises more than one actuation system.
  • Processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to actuator 118.
  • the haptic effect comprises a vibrotactile texture felt on the surface of display 116, touch-sensitive interface 114, or the housing of messaging device 102.
  • determining a haptic effect may comprise performing a series of calculations.
  • determining the haptic effect may comprise accessing a lookup table.
  • determining the haptic effect may comprise a combination of lookup tables and algorithms.
  • determining the haptic effect may comprise a haptic map.
  • determining the haptic effect may comprise mapping the display signal to the actuators.
  • the display signal may comprise a plurality of pixels, each of the pixels associated with a color.
  • each pixel may be associated with the color red, green, or blue; each color may further be associated with an intensity, for example, an intensity of 1-8.
  • determining the haptic effect may comprise assigning a haptic effect to each color.
  • the haptic effect may comprise a direction and intensity of operation, for example, in one embodiment the haptic signal may be configured to cause a rotary actuator to rotate clockwise at one-half power.
  • the intensity of operation may be associated with the intensity of the color.
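  • a minimal sketch of such a direction-and-intensity command, assuming the 1-8 color intensity scale described above; the command format itself is hypothetical:

      def rotary_command(color_intensity, max_intensity=8):
          """Clockwise rotation whose power tracks the pixel's color intensity."""
          return {"direction": "clockwise", "power": color_intensity / max_intensity}

      print(rotary_command(4))  # intensity 4 of 8 -> one-half power, clockwise
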
  • once processor 110 determines a haptic effect, it transmits a haptic signal comprising the haptic effect.
  • processor 110 may assign a haptic effect to only some of the pixels in the display signal.
  • the haptic effect may be associated with only a portion of the display signal.
  • processor 110 may utilize a haptic map to determine the haptic effect and then output the display signal to display 116.
  • processor 110 may determine the haptic effect using a haptic map, and then not transmit the display signal to display 116.
  • the display 116 may stay dark, or off, while actuator 118 is outputting the haptic effect.
  • processor 110 may receive a display signal from a digital camera associated with messaging device 102.
  • the user may have deactivated display 116.
  • the processor may utilize a haptic map to provide the user with a haptic effect simulating a texture on the surface of the display. This texture may be used to alert the user when the camera is in focus, or when some other event has occurred.
  • processor 110 may use facial recognition software to determine haptic effects simulating textures at locations on display 116 that would be associated with faces if display 116 were activated.
  • processor 110 may determine the haptic effect based at least in part on a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine the haptic effects based on the location of the user interaction detected by touch- sensitive interface 114. For example, in such an embodiment, processor 110 may determine a haptic effect that simulates the texture of a virtual object that the user is touching on the display 116. In other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the interface signal.
  • if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low intensity haptic effect. In still other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the speed of the user interaction. For example, in one embodiment, processor 110 may determine a low intensity haptic effect when touch-sensitive interface 114 detects low speed user interaction. In still other embodiments, processor 110 may determine no haptic effect, unless it receives an interface signal associated with user interaction from touch-sensitive interface 114.
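  • the following sketch illustrates this speed- and pressure-dependent intensity scaling; the weights, the normalization, and the 0-10 clamping range are illustrative assumptions only:

      def haptic_intensity(base, speed, pressure, w_speed=0.5, w_pressure=0.5):
          """Scale a base intensity up for faster or harder interactions (0-10)."""
          scale = 1.0 + w_speed * speed + w_pressure * pressure
          return max(0.0, min(10.0, base * scale))

      def determine_effect(interface_signal, base=3.0):
          # No interface signal: some embodiments determine no haptic effect.
          if interface_signal is None:
              return None
          speed, pressure = interface_signal  # both normalized to 0.0-1.0 here
          return haptic_intensity(base, speed, pressure)

      print(determine_effect((0.9, 0.8)))  # high speed/pressure -> higher intensity
      print(determine_effect(None))        # no interaction -> no effect
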
  • once processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to actuator 118.
  • Actuator 118 is configured to receive a haptic signal from processor 110 and generate the haptic effect.
  • Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
  • actuator 118 may comprise a plurality of actuators, for example an ERM and an LRA.
  • processor 110 may determine a haptic effect configured to simulate the texture of a rock on the surface of touch-sensitive interface 114. Then, processor 110 will transmit a haptic signal associated with the haptic effect to actuator 118, which outputs the haptic effect. For example, when actuator 118 receives the haptic signal, it may output a vibration at a frequency configured to cause the surface of the touch-sensitive interface to comprise the texture of a rock.
  • actuator 118 may be configured to output a vibration at a frequency that causes the surface of display 116 or touch-sensitive interface 114 to comprise the texture of: water, ice, leather, sand, gravel, snow, skin, fur, or some other surface.
  • the haptic effect may be output onto a different portion of messaging device 102, for example onto its housing.
  • actuator 118 may output a multitude of vibrations configured to output multiple textures at the same time.
  • actuator 118 may output a vibration configured to cause the surface of display 116 to comprise the texture of sand.
  • actuator 118 may be configured to output additional vibrations, configured to cause the user to feel the texture of rocks in the sand.
  • Processor 110 may determine a haptic effect for many reasons. For example, in some embodiments, processor 110 may output a haptic effect that corresponds to the texture of an object shown on display 116. In such an embodiment, the display may show multiple objects, and the processor may determine a different haptic effect as the user moves his/her finger from object to object, thus simulating a different texture for each object. In some embodiments, the haptic effect may act as a confirmation that processor 110 has received a signal associated with user interaction.
  • the graphical user interface may comprise a button and touch-sensitive interface 114 may detect user interaction associated with pressing the button.
  • processor 110 may determine a haptic effect to confirm receipt of the interface signal.
  • the haptic effect may cause the user to feel a texture on the surface of touch-sensitive interface 114.
  • the processor may output a haptic effect that simulates the texture of sand to confirm that processor 110 has received the user input.
  • the processor may determine a different texture, for example, the texture of water, ice, oil, rocks, or skin.
  • the haptic effect may serve a different purpose, for example, alerting the user of boundaries on display 116, or providing the user with haptic information about the image on display 116.
  • each icon on display 116 may comprise a different texture and when the user moves his/her finger from one icon to another, the processor will determine a haptic effect that simulates the texture of each icon.
  • the processor may change the texture when the user's finger moves from contact with an icon to contact with the background of the display, thus alerting the user that he/she is no longer touching the icon.
  • processor 110 is also in communication with speaker 120.
  • Speaker 120 is configured to receive audio signals from processor 110 and output them to the user.
  • the audio signals may be associated with the haptic effect output by actuator 118, or the image output by display 116. In other embodiments, the audio signal may not correspond to the haptic effect or the image.
  • processor 110 may further be in communication with one or more sensors, for example, a GPS sensor, an imaging sensor, an accelerometer, a location sensor, a rotary velocity sensor, a light sensor, a camera, a microphone, or some other type of sensor.
  • the sensor may be configured to detect changes in acceleration, inclination, inertia, or location.
  • messaging device 102 may comprise an accelerometer configured to measure the messaging device's acceleration.
  • the sensor is configured to transmit sensor signals to processor 110.
  • the sensor signals may comprise one or more parameters associated with a position, a movement, an acceleration, or a "jerk" (i.e., the derivative of acceleration) of the messaging device 102.
  • the sensor may generate and transmit a sensor signal comprising a plurality of parameters, each parameter associated with a movement along or about one measured translational or rotational axis.
  • the sensor outputs voltages or currents that processor 110 is programmed to interpret to indicate movement along one or more axes.
  • processor 110 will receive the sensor signal and determine that it should activate a virtual workspace and interpret sensed movement of the messaging device 102 in an X, Y, or Z direction as corresponding to a virtual movement "within" the virtual workspace.
  • the user may then move device 102 within the virtual workspace to select functions or files by gesturing within the virtual space, for example by moving the messaging device 102 in the Z-axis overtop of a function within the virtual workspace.
  • the user may use gestures within the virtual workspace to modify the haptic effects output by messaging device 102.
  • Figure 2 is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • Figure 2 comprises a messaging device 200, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer.
  • the messaging device 200 is configured to send and receive signals, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet.
  • the messaging device 200 may comprise a wireless network interface and/or a wired network interface (not shown in Figure 2).
  • the device 200 is illustrated as a handheld messaging device in Figure 2, other embodiments may comprise different devices, such as video game systems and/or personal computers.
  • the messaging device 200 comprises a housing 202 and a display 216.
  • display 216 may comprise an LCD display.
  • display 216 may comprise a plasma display, or other type of display known in the art.
  • Display 216 is configured to receive a display signal and output an image associated with that display signal.
  • the display signal may comprise a VGA, HDMI, SVGA, video, S-video, or other type of display signal known in the art.
  • display 216 comprises a textured ball 204.
  • Display 216 further comprises texture selection icons 206, which comprise the textures of rocks, sand, and water.
  • the messaging device 200 further comprises a manipulandum 214.
  • the manipulandum 214 comprises a roller ball and buttons.
  • the messaging device 200 also comprises a touch-sensitive interface 218.
  • touch-sensitive interface 218 comprises a touch-screen positioned overtop of display 216.
  • display 216 and the touch-screen may comprise a single integrated component, such as a touch-screen display.
  • Manipulandum 214 and touch-sensitive interface 218 are configured to detect user interaction and transmit interface signals corresponding to the user interaction to the processor.
  • the user interaction is associated with a graphical user interface shown on display 216.
  • the processor receives the interface signal and, based at least in part on the interface signal, manipulates the graphical user interface.
  • the user may use either manipulandum 214 or touch-sensitive interface 218 to select one of texture selection icons 206.
  • the processor may manipulate the display to give textured ball 204 the appearance of a sandy surface, and further determine a haptic effect that causes the user to feel a sandy texture when interacting with textured ball 204.
  • the processor may determine a haptic effect that causes the user to feel a rocky texture when the user interacts with textured ball 204.
  • Messaging device 200 further comprises an actuator configured to receive a haptic signal and output a haptic effect (not shown in Figure 2).
  • the haptic effect comprises a vibrotactile texture felt by the user of messaging device 200.
  • Processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to the actuator.
  • determining a haptic effect may comprise a series of calculations to determine the haptic effect.
  • determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect.
  • determining the haptic effect may comprise using a combination of lookup tables and algorithms.
  • once processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to the actuator.
  • the actuator receives the haptic signal from processor 110 and generates the haptic effect.
  • the user may feel the haptic effect via the surface of display 216 or through some other part of messaging device 200, for example via manipulandum 214 or housing 202.
  • the processor may modify this haptic effect as the user moves his/her finger over the surface of textured ball 204, in order to simulate changes in texture.

Illustrations of Systems for a Texture Engine

  • Figure 3a is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • Figure 3a comprises a messaging device 300, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer.
  • the messaging device 300 is configured to send and receive signals comprising messages, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet.
  • the messaging device 300 may comprise a wireless network interface and/or a wired network interface (not shown in Figure 3a).
  • the device 300 is illustrated as a handheld messaging device in Figure 3a, other embodiments may comprise different devices, such as video game systems and/or personal computers.
  • messaging device 300 comprises a display 316.
  • Display 316 is configured to receive a display signal, and output an image based at least in part on the display signal.
  • Messaging device 300 further comprises a processor (not shown in Figure 3a) configured to transmit the display signal to display 316.
  • Messaging device 300 further comprises a touch-sensitive interface 314 mounted overtop of display 316.
  • Touch-sensitive interface 314 is configured to detect a user interaction and transmit an interface signal corresponding to the user interaction to the processor.
  • Display 316 comprises two icons 302 and 304. When the user interacts with one of icons 302 and 304, touch-sensitive interface 314 will detect the user interaction and transmit a corresponding interface signal to the processor. Based on this interface signal, the processor may determine that the user has opened a file linked to one of the icons or performed some other action known in the art.
  • each of icons 302 and 304 comprises a texture.
  • icon 302 comprises the texture of bricks and icon 304 comprises the texture of rocks.
  • different textures may be used, for example, the texture of sand, water, oil, grass, fur, skin, leather, ice, wood, or some other texture known in the art.
  • when the user interacts with the section of display 316 associated with icon 302, the processor will determine a haptic effect associated with the texture of bricks.
  • this haptic effect may be characterized by a random signal punctuated with high powered pulses as the user's finger 306 moves across the mortar.
  • other haptic effects will be used to simulate different textures that may correspond to the image shown on display 316.
  • Figure 3b is an illustration of a system for a texture engine according to one embodiment of the present invention.
  • determining the haptic effect comprises mapping the display signal to the actuator.
  • the embodiment shown in Figure 3b comprises a magnified section of a display 350.
  • Display 350 is configured to receive a display signal from the processor.
  • the display signal comprises a plurality of pixels that are each associated with a color and an intensity of that color.
  • Display 350 receives this display signal and outputs an image associated with the display signal.
  • the magnified portion of display 350 comprises six pixels: 351, 352, 353, 354, 355, and 356.
  • Each pixel is associated with a color and an intensity for that color ranging from 1-10.
  • pixel 355 is associated with the color green, and the color intensity 3 out of 10.
  • the display 350 will output the color green at an intensity of 3 at the location of pixel 355.
  • the processor will determine the haptic effect based, at least in part, on the display signal and an interface signal received from a touch- sensitive interface mounted overtop of display 350 (not shown in Figure 3b). For example, in the embodiment shown in Figure 3b, the processor uses the display signal to associate, or "map," a haptic effect with each pixel. For example, in the embodiment shown in Figure 3b, the processor may determine a different frequency haptic effect for each color. The processor may further associate the intensity of the haptic effect at each pixel with the intensity of the color at each pixel. For example, the processor may determine that a pixel with a color intensity of 8 will also have a haptic intensity of 8.
  • when the processor receives an interface signal associated with user interaction overtop of the pixels on the display, the processor will output a haptic signal associated with the pixels the user is interacting with. This haptic effect is configured to cause the user to feel a texture on the surface of the display.
  • the processor may determine blue pixels are associated with a knocking haptic effect, red pixels are associated with a pulsing vibration, and green pixels are associated with a clicking haptic effect.
  • when the touch-sensitive interface detects that the user's finger has passed over pixel 351, the processor will determine a knocking effect with an intensity of 1. Then, as the user's finger moves over pixel 352, the processor will determine a pulsing vibration with an intensity of 5. And, as the user's finger continues to move across display 350 to pixel 353, the processor may determine a clicking effect with an intensity of 3.
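  • the Figure 3b walkthrough above can be summarized in a short, hypothetical table-driven sketch; the pixel data and effect names repeat the example, while the code itself is illustrative rather than the claimed implementation:

      # Effect type per color; effect intensity equals the color intensity.
      EFFECT_FOR_COLOR = {"blue": "knock", "red": "pulse", "green": "click"}

      # (color, color intensity 1-10) for three pixels of the magnified region.
      PIXELS = {351: ("blue", 1), 352: ("red", 5), 353: ("green", 3)}

      def effect_under_finger(pixel_id):
          color, intensity = PIXELS[pixel_id]
          return EFFECT_FOR_COLOR[color], intensity

      for pid in (351, 352, 353):
          print(pid, effect_under_finger(pid))
      # -> a knock at intensity 1, a pulse at 5, a click at 3, as described above
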
  • Figure 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention, which is discussed with respect to the device shown in Figure 1. As shown in Figure 4, the method 400 begins when processor 110 receives a display signal comprising a plurality of pixels 402.
  • the display signal may comprise a VGA, HDMI, SVGA, video, S-video, or other type of display signal known in the art.
  • the display signal may comprise a graphical user interface, or other image that the messaging device will display to the user via display 116.
  • touch-sensitive interface 114 transmits an interface signal to processor 110, which receives the interface signal 404.
  • touch-sensitive interface 114 may comprise a touch-screen or a touch-pad.
  • touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user.
  • the touch-sensitive interface may comprise a button, switch, scroll wheel, roller ball, or some other type of physical device interface known in the art.
  • processor 110 is in communication with a single touch-sensitive interface 114.
  • processor 110 is in communication with a plurality of touch-sensitive interfaces 114, for example, a touch-screen and a roller ball.
  • Touch-sensitive interface 114 is configured to detect user interaction, and based at least in part on the user interaction, transmit signals to the processor.
  • touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction and incorporate this information into the interface signal.
  • processor 110 determines a haptic effect comprising a texture 406.
  • the haptic effect may comprise a vibration that the user may feel through the surface of a touch-sensitive interface or a manipulandum. In some embodiments, this vibration may cause the user to feel a texture on the surface of the touch-sensitive interface, for example, the texture of leather, snow, sand, ice, skin, or some other surface.
  • determining a haptic effect may comprise a series of calculations to determine the haptic effect.
  • determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect.
  • determining the haptic effect may comprise a combination of lookup tables and algorithms.
  • determining the haptic effect may comprise a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators.
  • the display signal may comprise a plurality of pixels, each of the pixels associated with a color.
  • determining the haptic effect may comprise assigning a haptic effect to each color.
  • processor 110 will output a haptic signal comprising the haptic effect.
  • processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal.
  • processor 110 may determine the haptic effect based, at least in part, on a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine different intensity haptic effects based on the interface signal received from touch-sensitive interface 114. For example, if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low intensity haptic effect.
  • processor 110 may determine a low intensity haptic effect when touch-sensitive interface 114 detects low speed user interaction. Further, processor 110 may determine high intensity haptic effects when touch-sensitive interface 114 detects a high speed user interaction. In still other embodiments, processor 110 may determine no haptic effect, unless it receives an interface signal comprising a user interaction from touch-sensitive interface 114. Finally, processor 110 transmits a haptic signal associated with the haptic effect to actuator 118, which is configured to receive the haptic signal and output the haptic effect 408. Actuator 118 is configured to receive a haptic signal from processor 110 and generate the haptic effect.
  • Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
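  • putting the steps together, one way to read method 400 end to end is the following sketch, assuming a simple lookup-table determination; every name and value here is a placeholder rather than the claimed implementation:

      from dataclasses import dataclass

      @dataclass
      class HapticEffect:
          texture: str
          intensity: float

      # Hypothetical lookup table from (pixel color, pressure band) to an effect.
      LOOKUP = {("blue", "high"): HapticEffect("water", 8.0),
                ("blue", "low"): HapticEffect("water", 2.0)}

      def method_400(display_pixels, interface_signal):
          """Steps 402-408: receive both signals, determine and return the effect."""
          if interface_signal is None:
              return None                         # no interaction -> no haptic effect
          color, _intensity = display_pixels[0]   # step 402: display signal received
          effect = LOOKUP[(color, interface_signal)]  # steps 404/406: lookup table
          return effect                           # step 408: sent to the actuator

      print(method_400([("blue", 7)], "high"))
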
  • Figure 5a is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the embodiment shown in Figure 5a comprises brick.
  • the texture of brick is characterized by the rough, irregular feel of the bricks themselves, punctuated with the sensation of gritty valleys from the mortar.
  • a system for a texture engine may generate the rough irregular texture of brick by driving an actuator, such as an LRA, LPA, or FPA, with a random signal with medium to high maximum variance while the user's finger is moving. In some embodiments, this variance may be adjusted for different roughness.
  • the transition from brick to mortar may be rendered by a high duration pop created by an ERM. Additionally, if the mortar is thick enough, a fine texture may be rendered by driving an actuator with a lower magnitude signal with a higher variance than that used to drive the actuator outputting the texture of the brick.
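  • a hedged sketch of such a brick drive signal, with the variance, pop magnitude, and sampling logic chosen as illustrative assumptions:

      import random

      def brick_drive_sample(finger_moving, on_mortar_edge, variance=0.6):
          """One actuator drive sample in [-1, 1] for the brick texture."""
          if on_mortar_edge:
              return 1.0               # high magnitude pop at the brick-mortar edge
          if not finger_moving:
              return 0.0               # the texture is rendered only while moving
          # Rough, irregular surface: random sample, clamped to the drive range.
          return max(-1.0, min(1.0, random.gauss(0.0, variance)))

      # Ten samples with a mortar transition at the sixth sample.
      samples = [brick_drive_sample(True, i == 5) for i in range(10)]
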
  • Figure 5b is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the embodiment shown in Figure 5b comprises rocks.
  • the texture of rocks is characterized by smooth surfaces punctuated with transitions as the user moves from rock to rock.
  • the smooth surfaces of the rocks may be simulated by driving an actuator, such as an FPA, with a signal configured to reduce the friction on the surface of the touch-sensitive interface.
  • Individual rocks may be rendered by applying a non-visual edge map of the displayed image, and outputting a high magnitude haptic signal to an actuator, such as an LPA or ERM, when the touch-sensitive interface detects the user's movement. For example, outputting the haptic effect whenever the touch-sensitive interface detects that the user's finger is transitioning from one rock to another.
  • Figure 5c is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the embodiment shown in Figure 5c comprises sand or sandpaper.
  • Sand is characterized by a rough, gritty feel as well as the sensation of a pile of sand particles building up in front of the user's finger.
  • an actuator such as an LRA, LPA or FPA is driven with a random signal with high maximum variance while the user's finger is moving.
  • the processor may adjust the variance of the signal to create different degrees of roughness.
  • an actuator such as an FPA may be used.
  • the processor will drive the actuator with a signal that starts with a low intensity and builds as the user moves his/her finger in one direction.
  • the texture shown in Figure 5c may comprise sandpaper.
  • Sandpaper is characterized by having a rough, gritty feel.
  • the processor drives an actuator, such as an LRA, LPA or FPA with a random signal with high maximum variance.
  • this signal is output only while the user's finger is moving across the surface of the touch-sensitive interface.
  • the processor may adjust the variance of the signal to change the level of roughness.
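  • both variants described above can be sketched with one hypothetical generator: a high-variance random signal emitted while the finger moves, with the variance exposed so the processor can tune roughness, and an optional ramp for the sensation of sand piling up in front of the finger (all values are assumptions):

      import random

      def sand_drive(num_samples, roughness=0.9, buildup=False):
          """Random gritty drive signal; ramps up when simulating sand build-up."""
          signal = []
          for i in range(num_samples):
              sample = random.uniform(-roughness, roughness)
              if buildup:
                  sample *= i / num_samples  # builds as the finger travels
              signal.append(sample)
          return signal

      grit = sand_drive(100)                # sandpaper: constant grit
      pile = sand_drive(100, buildup=True)  # sand piling up ahead of the finger
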
  • Figure 5d is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the texture comprises the texture of grass. Grass is characterized by a periodic light sensation that almost tickles the user's finger.
  • the processor may drive an actuator, such as an FPA, with a signal configured to create patches of low friction overlaid with patches of grass.
  • the processor may render individual grass blades by having a non-visual edge map of the displayed image and outputting a low magnitude signal to an actuator, such as an LPA or ERM, when the user interface detects the user interaction.
  • Figure 5e is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the texture comprises the texture of fabric. Fabric is characterized by a light smooth sensation.
  • the processor may drive an actuator, such as an LPA or an LRA with low magnitude high frequency signals, as the user's finger moves across the surface of the touch-sensitive interface.
  • Figure 5f is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the texture comprises the texture of water or molasses. Water is characterized by having almost no sensation. However, water that is disturbed may splash around and hit against the user's finger.
  • the processor may drive an actuator such as an FPA to reduce the friction on the surface of the touch-sensitive interface.
  • the processor may output the haptic signal only when the user is touching the screen.
  • the processor may drive the actuator with a signal configured to increase the friction on the user's finger as it moves across the surface of the touch-sensitive interface.
  • Figure 5g is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the texture comprises the texture of leather.
  • Leather is characterized by an overall smooth feeling that comprises the bumps and valleys of the surface of the leather.
  • the processor may drive an actuator, such as an FPA, with a signal configured to output a haptic effect that reduces friction as the user's finger moves across the surface of the touch-sensitive interface.
  • the processor can output the cracks and bumps by driving the actuator with a very short low magnitude haptic signal at times when the touch-sensitive interface detects that the user's finger is moving.
  • Figure 5h is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
  • the texture comprises the texture of wood.
  • Wood may be characterized by an irregular bumpy texture punctuated by a sharp transition as the user moves from board to board.
  • the processor may drive an actuator such as an
  • the processor may output a haptic signal configured to cause the actuator to generate a high magnitude, short duration pop.
  • haptic effects associated with different textures may be output.
  • the processor may transmit a haptic signal configured to cause the actuator to output a haptic effect configured to cause the user to feel a texture associated with the texture of ice. Ice is characterized by low friction; in some embodiments, ice has a completely smooth texture. In other embodiments, ice comprises a fine low magnitude gritty texture.
  • the processor may determine a haptic signal configured to cause the actuator to reduce the friction as much as possible while the user moves their finger across the surface of the touch-sensitive interface.
  • the processor may drive an actuator, such as an LPA or LRA, with a haptic signal configured to output low magnitude effects while the user moves their finger. These low magnitude effects may be associated with imperfections or grit on the surface of the ice (an illustrative sketch of this behavior follows this list).
  • the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of lizard skin.
  • Lizard skin is characterized by an overall smooth sensation punctuated by transitions from bump to bump on the skin.
  • the processor may drive an actuator with a haptic signal configured to cause the actuator to create patches of low friction on the touch-sensitive interface.
  • the processor may render cracks on the surface of the skin by outputting high magnitude haptic signals periodically, when the touch-sensitive interface detects that the user's finger is moving across its surface. These high magnitude signals may approximate the cracks in the surface of the skin.
  • the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of fur. Fur is characterized by a periodic light sensation that is very soft to the touch.
  • the processor may drive the actuator with a haptic signal configured to cause the actuator to output a haptic effect configured to reduce the friction the user feels on the surface of the touch-sensitive interface.
  • the processor may further render individual hairs by outputting low magnitude pulsing haptic signals as the touch-sensitive interface detects the user's movement.
  • the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of metal.
  • Metal is characterized by a smooth low friction surface that, in some embodiments, includes light grit.
  • the processor may drive the actuator with a signal configured to lower the friction the user feels on the surface of the touch-sensitive interface.
  • the processor may render individual bumps by outputting brief high magnitude haptic signals when the touch-sensitive interface detects that the user is moving over its surface. These brief high magnitude signals may approximate grit on the surface of the metal.
  • the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating another sensation, for example, heat.
  • the processor may output a haptic signal configured to cause the actuator to output a high frequency jolting effect when the user touches elements of the display that are associated with heat.
  • systems and methods of a texture engine add a previously unused haptic effect to a mobile device. This new effect provides a new avenue for the user to receive information from the mobile device, without the user having to look at the display of the mobile device.
  • systems and methods of a texture engine may allow the user to assign different textures to different icons, buttons, or other components of their display. Thus, the user may be able to determine which icon they are touching, without having to look at that icon. This may increase usability of the device, and may make a device more useful to the visually impaired.
  • because systems and methods for a texture engine provide the user with more information without distracting the user from other tasks, they will reduce user error. For example, users will be less likely to hit the wrong icon or press the wrong key if they are utilizing systems and methods for a texture engine.
  • This functionality may serve both to increase user satisfaction and increase the adoption rate for technology that incorporates systems and methods for a texture engine.
  • The use of "adapted to" or "configured to" herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of "based on" is meant to be open and inclusive, in that a process, step, calculation, or other action "based on" one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting. Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors.
  • the processor comprises or has access to a computer-readable medium, such as a random access memory (RAM) coupled to the processor.
  • the processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, a haptic effect selection routine, and suitable programming to produce signals to generate the selected haptic effects as noted above.
  • Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices. Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor.
  • Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions.
  • Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
  • various other devices may include computer-readable media, such as a router, private or public network, or other transmission device.
  • the processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures.
  • the processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
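As a purely illustrative aid, the ice rendering described in the bullets above might be sketched as follows in Python. The frame layout, function name, grit probability, and magnitudes are all assumptions introduced here for illustration; they are not part of the disclosed system.

    import random

    def render_ice_frame(finger_moving, grit=True, grit_probability=0.02):
        """One frame of drive parameters for the ice texture: friction is
        held as low as possible, with rare, faint pulses standing in for
        imperfections or grit on the surface of the ice."""
        frame = {"friction": 0.0, "pulse_magnitude": 0.0}
        if finger_moving and grit and random.random() < grit_probability:
            frame["pulse_magnitude"] = 0.1  # low magnitude grit pulse
        return frame

    # Example: sample a few frames while the finger moves across the interface.
    print([render_ice_frame(finger_moving=True) for _ in range(5)])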

Abstract

Systems and methods for a texture engine are disclosed. For example, one disclosed system includes: a processor configured to receive a display signal including a plurality of pixels, determine a haptic effect comprising a texture, and transmit a haptic signal associated with the haptic effect to an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.

Description

SYSTEMS AND METHODS FOR A TEXTURE ENGINE CROSS-REFERENCES TO RELATED APPLICATIONS
This patent application claims priority to U.S. Provisional Patent Application No. 61/159,482, entitled "Locating Features Using a Friction Display," filed March 12, 2009, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Provisional Patent Application No. 61/262,041, entitled "System and Method for Increasing Haptic Bandwidth in an Electronic Device" filed November 17, 2009, which is incorporated by reference herein in its entirety. This patent application claims priority to U.S. Provisional Patent Application No.
61/262,038, entitled "Friction Rotary Device for Haptic Feedback" filed November 17, 2009, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,893, entitled "Systems And Methods For Providing Features In A Friction Display" filed January 29, 2010, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,900, entitled "Systems And Methods For Friction Displays And Additional Haptic Effects" filed January 29, 2010, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,908, entitled "Systems And Methods For Interfaces Featuring Surface-Based Haptic Effects" filed January 29, 2010, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Utility Patent Application No. 12/697,010, entitled "Systems And Methods For A Texture Engine" filed January 29, 2010, which is incorporated by reference herein in its entirety. This patent application claims priority to U.S. Utility Patent Application No.
12/697,037, entitled "Systems And Methods For Using Textures In Graphical User Interface Widgets" filed January 29, 2010, which is incorporated by reference herein in its entirety.
This patent application claims priority to U.S. Utility Patent Application No. 12/697,042, entitled "Systems And Methods For Using Multiple Actuators To Realize Textures" filed January 29, 2010, which is incorporated by reference herein in its entirety.
FIELD OF THE INVENTION
The present invention generally relates to haptic feedback and more particularly to systems and methods for a texture engine.
BACKGROUND
Over the past several years, the use of handheld devices of all types has grown exponentially. These devices are used as portable organizers, telephones, music players, and gaming systems. Many modern handheld devices now incorporate some type of haptic feedback. As haptic technology improves, devices may incorporate haptic feedback simulating a texture. Accordingly, a haptic texture engine is needed.
SUMMARY
Embodiments of the present invention provide systems and methods for a texture engine. For example, in one embodiment, a system for a texture engine comprises: a processor configured to receive a display signal comprising a plurality of pixels, determine a haptic effect comprising a texture, and transmit a haptic signal associated with the haptic effect to an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.
This illustrative embodiment is mentioned not to limit or define the invention, but rather to provide an example to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
Figure 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention;
Figure 2 is an illustration of a system for a texture engine according to one embodiment of the present invention;
Figure 3a is an illustration of a system for a texture engine according to one embodiment of the present invention;
Figure 3b is an illustration of a system for a texture engine according to one embodiment of the present invention;
Figure 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention;
Figure 5a is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5b is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5c is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5d is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5e is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5f is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention;
Figure 5g is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention; and
Figure 5h is another illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention.
DETAILED DESCRIPTION
Embodiments of the present invention provide systems and methods for a texture engine. Illustrative Embodiment of a Texture Engine
One illustrative embodiment of the present invention comprises a messaging device, such as a mobile phone. In the illustrative embodiment, the messaging device comprises the Samsung Haptic Phone (SCH-W420) equipped with Immersion Corporation's TouchSense® 3000, TouchSense® 4000, or TouchSense® 5000 vibrotactile feedback systems, formerly known as Immersion Corporation's VibeTonz® vibrotactile feedback system. In other embodiments, different messaging devices and haptic feedback systems may be utilized.
The illustrative messaging device comprises a display, a speaker, a network interface, a memory, and a processor in communication with each of these elements. The illustrative messaging device also comprises a touch-sensitive interface and an actuator, both of which are in communication with the processor. The touch-sensitive interface is configured to sense a user's interaction with the messaging device, and the actuator is configured to output a haptic effect. The illustrative messaging device may further comprise a manipulandum configured to detect a user interaction and transmit an interface signal associated with the user interaction to the processor.
In the illustrative messaging device, the display is configured to display a graphical user interface to the user. The graphical user interface may comprise virtual objects, for example icons, buttons, or a virtual keyboard. The illustrative messaging device further comprises a touch-sensitive interface, such as a touch-screen, mounted overtop of the display. The touch-sensitive interface allows the user to interact with the virtual objects displayed in the graphical user interface. For example, in one embodiment, the graphical user interface may comprise a virtual keyboard, and in such an embodiment, the touch-sensitive interface allows the user to touch a key on the virtual keyboard to input the alphanumeric character associated with the key. This functionality may be used to type messages, or otherwise interact with objects in the graphical user interface.
In the illustrative messaging device the processor is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to an actuator configured to output the haptic effect. In the illustrative messaging device, this haptic effect simulates a texture that the user feels on the surface of the touch-sensitive interface. The simulated texture may be associated with the user interface shown on the display. For example, the display may show an icon comprising the shape of a rock. In such an embodiment, the processor may determine a haptic effect configured to simulate the texture of the rock on the surface of the touch-sensitive interface. Then, the processor will transmit a haptic signal to an actuator configured to output the haptic effect. When the actuator receives the haptic signal, it will output a haptic effect, such as a vibration, at a frequency configured to cause the surface of the touch-sensitive interface to approximate the texture of the rock.
In the illustrative embodiment, the processor may implement a haptic map to determine the haptic effect. For example, in the illustrative embodiment, the processor may receive a display signal comprising a plurality of pixels, each of the pixels associated with a color. For example, in the illustrative embodiment, each pixel in the display signal may be associated with the color red, green, or blue, and it may further be associated with an intensity for each color. In the illustrative embodiment, the processor will assign a haptic value to each color and further assign a haptic intensity associated with the intensity of each color. Then, the processor will transmit a haptic signal comprising the haptic values and haptic intensities to an actuator configured to output the haptic effect.
In the illustrative embodiment, the processor may further determine the haptic effect based on an external trigger. For example, in the illustrative embodiment, the processor is configured to receive an interface signal from a touch-sensitive interface configured to detect a user interaction. Then, in the illustrative embodiment, the processor will determine the haptic effect based at least in part on the interface signal. For example, the processor may modify the haptic value or haptic intensity based at least in part on the interface signal. In the illustrative embodiment, if the touch-sensitive interface detects a high speed or high pressure user interaction, the processor will determine a higher intensity haptic effect.
The illustrative messaging device may output a haptic effect for a multitude of purposes. For example, in one embodiment, the haptic effect may act as a confirmation that the processor has received an interface signal associated with a user interaction. For example, the graphical user interface may comprise a button, and the touch-sensitive interface may detect user interaction associated with pressing the button and transmit an interface signal to the processor. In response, the processor may determine a haptic effect to confirm receiving the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of the touch-sensitive interface. In the illustrative embodiment, the processor may further determine haptic effects for other purposes. For example, the illustrative messaging device may output a texture to alert the user to boundaries on the display or as an identification for objects such as icons on the surface of the display.
This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of systems and methods for a texture engine.
Illustrated System for a Texture Engine
Referring now to the drawings in which like numerals indicate like elements throughout the several figures, Figure 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention. As shown in Figure 1, the system 100 comprises a messaging device 102, such as a mobile phone, portable digital assistant (PDA), portable media player, portable computer, portable gaming device, or some other mobile device. In some embodiments, messaging device 102 may comprise a laptop, tablet, desktop PC, or other similar device. In still other embodiments, the messaging device may comprise an external monitor for use with a PC or some other device. The messaging device 102 comprises a processor 110 in communication with a network interface 112, a touch-sensitive interface 114, a display 116, an actuator 118, a speaker 120, and a memory 122. The processor 110 is configured to execute computer-executable program instructions stored in memory 122. For example, processor 110 may execute one or more computer programs for messaging or for generating haptic feedback. Processor 110 may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), or state machines. Processor 110 may further comprise a programmable electronic device such as a programmable logic controller (PLC), a programmable interrupt controller (PIC), a programmable logic device (PLD), a programmable read-only memory (PROM), an electronically programmable read-only memory (EPROM or EEPROM), or other similar devices. Memory 122 comprises a computer-readable medium that stores instructions, which when executed by processor 110, cause processor 110 to perform various steps, such as those described herein. Embodiments of computer-readable media may comprise, but are not limited to, electronic, optical, magnetic, or other storage or transmission devices capable of providing processor 110 with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. In addition, various other devices may include computer-readable media such as a router, private or public network, or other transmission devices. The processor 110 and the processing described may be in one or more structures, and may be dispersed throughout one or more structures.
Processor 110 is in communication with the network interface 112. The network interface 112 may comprise one or more methods of mobile communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other variations, the network interface 112 comprises a wired network interface, such as Ethernet. The messaging device 102 can be configured to exchange messages or virtual message objects with other devices (not shown) over networks such as a cellular network and/or the Internet. Embodiments of messages exchanged between devices may comprise voice messages, text messages, data messages, or other forms of digital messages. The processor 110 is also in communication with one or more touch-sensitive interfaces 114. In some embodiments, touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user. In other embodiments, touch-sensitive interface 114 may comprise an optical sensor or another type of sensor. In one embodiment, touch-sensitive interface may comprise an LED detector. For example, in one embodiment, touch-sensitive interface 114 may comprise an LED finger detector mounted on the side of display 116. In some embodiments, the processor is in communication with a single touch-sensitive interface 114; in other embodiments, the processor is in communication with a plurality of touch-sensitive interfaces, for example, a first touch-screen and a second touch-screen. The touch-sensitive interface 114 is configured to detect user interaction, and based on the user interaction, transmit signals to processor 110. In some embodiments, touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction, and incorporate this information into the interface signal.
In the embodiment shown in Figure 1, the processor 110 is also in communication with a display 116. The processor 110 can be configured to generate a graphical representation of a user interface to be shown on display 116, then transmit a display signal comprising the graphical representation to display 116. In other embodiments, display 116 is configured to receive a display signal from another device. For example, in some embodiments, display 116 may comprise an external display, such as a computer monitor. Display 116 is configured to receive a display signal and output an image associated with that display signal. In some embodiments, the display signal may comprise a vga, hdmi, svga, video, s-video, or other type of display signal known in the art. In some embodiments, display 116 comprises a flat screen display, such as a Liquid Crystal Display (LCD) or Plasma Screen Display. In other embodiments, display 116 comprises a Cathode Ray Tube (CRT) or other type of display known in the art. In still other embodiments, display 116 may comprise touch-sensitive interface 114, for example, display 116 may comprise a touch-screen LCD. In still other embodiments, display 116 may comprise a flexible screen or flexible display. For example, in some embodiments, display 116 may comprise a haptic substrate mounted underneath its surface. In such an embodiment, display 116 is made of a flexible material, and in response to signals received from processor 110, the haptic substrate flexes, forming ridges, troughs, or other features on the surface of display 116. In some embodiments, the haptic substrate may comprise a plasma actuator, a piezoelectric actuator, an electro-active polymer, a micro-electro-mechanical system, a shape memory alloy, or a grid of fluid- or gas-filled cells.
In some embodiments, processor 110 receives signals from touch-sensitive interface 114 that are associated with an interaction with the graphical user interface shown on display 116. For example, in one embodiment, touch-sensitive interface 114 may comprise a touch-screen, and a graphical user interface on display 116 may comprise a virtual keyboard. In such an embodiment, when the user interacts with a section of the touch-screen that overlays one of the keys of the virtual keyboard, the touch-screen will send an interface signal to processor 110 corresponding to that user interaction. Based on the interface signal, processor 110 will determine that the user pressed one of the keys on the virtual keyboard. This functionality allows the user to interact with other icons and virtual objects on the display 116. For example, in some embodiments the user may flick the touch-screen to move a virtual ball or turn a virtual knob.
As shown in Figure 1, processor 110 is also in communication with an actuation system comprising one or more actuators 118, a suspension system for each actuator, and electrical power and control wiring for each actuator. In some embodiments, messaging device 102 comprises more than one actuation system. Processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to actuator 118. In some embodiments, the haptic effect comprises a vibrotactile texture felt on the surface of display 116, touch-sensitive interface 114, or the housing of messaging device 102. In some embodiments, determining a haptic effect may comprise performing a series of calculations. In other embodiments, determining the haptic effect may comprise accessing a lookup table. In still other embodiments, determining the haptic effect may comprise a combination of lookup tables and algorithms.
In some embodiments, determining the haptic effect may comprise a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, each pixel may be associated with the color red, green, or blue; each color may further be associated with an intensity, for example, an intensity of 1-8. In such an embodiment, determining the haptic effect may comprise assigning a haptic effect to each color. In some embodiments, the haptic effect may comprise a direction and intensity of operation, for example, in one embodiment the haptic signal may be configured to cause a rotary actuator to rotate clockwise at one-half power. In some embodiments, the intensity of operation may be associated with the intensity of the color. Once processor 110 determines a haptic effect, it transmits a haptic signal comprising the haptic effect. In some embodiments, processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal. In some embodiments, processor 110 may utilize a haptic map to determine the haptic effect and then output the display signal to display 116. In other embodiments, processor 110 may determine the haptic effect using a haptic map, and then not transmit the display signal to display 116. In such an embodiment, the display 116 may stay dark, or off, while actuator 118 is outputting the haptic effect. For example, in such an embodiment, processor 110 may receive a display signal from a digital camera associated with messaging device 102. In some embodiments, in order to conserve battery power, the user may have deactivated display 116. In such an embodiment, the processor may utilize a haptic map to provide the user with a haptic effect simulating a texture on the surface of the display. This texture may be used to alert the user when the camera is in focus, or when some other event has occurred. For example, processor 110 may use facial recognition software to determine haptic effects simulating textures at locations on display 116 that would be associated with faces if display 116 were activated.
In some embodiments, processor 110 may determine the haptic effect based at least in part on a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine the haptic effects based on the location of the user interaction detected by touch-sensitive interface 114. For example, in such an embodiment, processor 110 may determine a haptic effect that simulates the texture of a virtual object that the user is touching on the display 116. In other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the interface signal. For example, if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low intensity haptic effect. In still other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the speed of the user interaction. For example, in one embodiment, processor 110 may determine a low intensity haptic effect when touch-sensitive interface 114 detects low speed user interaction. In still other embodiments, processor 110 may determine no haptic effect, unless it receives an interface signal associated with user interaction from touch-sensitive interface 114.
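As a rough illustration of the intensity determination just described, the following Python sketch scales a haptic intensity with normalized pressure and speed values and outputs no effect absent an interface signal. The weighting, normalization, and function name are assumptions for illustration only.

    def interaction_intensity(pressure, speed, touching):
        """Scale haptic intensity with the interface signal: higher pressure
        or speed yields a stronger effect, and no touch yields no effect.
        Inputs are assumed normalized to the range 0..1."""
        if not touching:
            return 0.0  # no interface signal, no haptic effect
        return max(0.0, min(1.0, 0.5 * pressure + 0.5 * speed))

    print(interaction_intensity(pressure=0.9, speed=0.2, touching=True))  # firm press
    print(interaction_intensity(pressure=0.1, speed=0.1, touching=True))  # light, slow touch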
Once processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to actuator 118. Actuator 118 is configured to receive a haptic signal from processor 110 and generate the haptic effect. Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA). In some embodiments, actuator 118 may comprise a plurality of actuators, for example an ERM and an LRA. In one embodiment of the present invention, the haptic effect generated by actuator
118 is configured to simulate a texture that the user feels on the surface of touch-sensitive interface 114 or display 116. This texture may be associated with the graphical user interface shown on display 116. For example, display 116 may show an icon comprising the shape of a rock. In such an embodiment, processor 110 may determine a haptic effect configured to simulate the texture of a rock on the surface of touch-sensitive interface 114. Then, processor 110 will transmit a haptic signal associated with the haptic effect to actuator 118, which outputs the haptic effect. For example, when actuator 118 receives the haptic signal, it may output a vibration at a frequency configured to cause the surface of the touch-sensitive interface to comprise the texture of a rock. In other embodiments, actuator 118 may be configured to output a vibration at a frequency that causes the surface of display 116 or touch-sensitive interface 114 to comprise the texture of: water, ice, leather, sand, gravel, snow, skin, fur, or some other surface. In some embodiments, the haptic effect may be output onto a different portion of messaging device 102, for example onto its housing. In some embodiments, actuator 118 may output a multitude of vibrations configured to output multiple textures at the same time. For example, actuator 118 may output a vibration configured to cause the surface of display 116 to comprise the texture of sand. In such an embodiment, actuator 118 may be configured to output additional vibrations, configured to cause the user to feel the texture of rocks in the sand.
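The superposition of textures described above (a sand base with rocks felt through it) might be sketched as follows; the sample rate, frequencies, and magnitudes are illustrative guesses rather than disclosed values.

    import math
    import random

    SAMPLE_RATE = 1000  # samples per second, illustrative only

    def sand_with_rocks(t, over_rock):
        """Superimpose two vibrations: a low-level noisy 'sand' component,
        plus a stronger periodic 'rock' component while the finger is over
        a rock in the sand."""
        sand = random.gauss(0.0, 0.15)
        rock = 0.6 * math.sin(2 * math.pi * 40.0 * t) if over_rock else 0.0
        return max(-1.0, min(1.0, sand + rock))

    samples = [sand_with_rocks(i / SAMPLE_RATE, over_rock=i > 500)
               for i in range(1000)]
    print(samples[:3])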
Processor 110 may determine a haptic effect for many reasons. For example, in some embodiments, processor 110 may output a haptic effect that corresponds to the texture of an object shown on display 116. In such an embodiment, the display may show multiple objects, and the processor may determine a different haptic effect as the user moves his/her finger from object to object, thus simulating a different texture for each object. In some embodiments, the haptic effect may act as a confirmation that processor 110 has received a signal associated with user interaction. For example, in one embodiment, the graphical user interface may comprise a button and touch-sensitive interface 114 may detect user interaction associated with pressing the button. When touch-sensitive interface 114 transmits an interface signal associated with the user interaction to processor 110, processor 110 may determine a haptic effect to confirm receipt of the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of touch-sensitive interface 114. For example, the processor may output a haptic effect that simulates the texture of sand to confirm that processor 110 has received the user input. In other embodiments, the processor may determine a different texture, for example, the texture of water, ice, oil, rocks, or skin. In some embodiments, the haptic effect may serve a different purpose, for example, alerting the user of boundaries on display 116, or providing the user with haptic information about the image on display 116. For example, in some embodiments, each icon on display 116 may comprise a different texture and when the user moves his/her finger from one icon to another, the processor will determine a haptic effect that simulates the texture of each icon. In further embodiments, the processor may change the texture when the user's finger moves from contact with an icon to contact with the background of the display, thus alerting the user that he/she is no longer touching the icon.
As shown in Figure 1, processor 110 is also in communication with speaker 120. Speaker 120 is configured to receive audio signals from processor 110 and output them to the user. In some embodiments, the audio signals may be associated with the haptic effect output by actuator 118, or the image output by display 116. In other embodiments, the audio signal may not correspond to the haptic effect or the image.
In some embodiments, processor 110 may further comprise one or more sensors, for example, a GPS sensor, an imaging sensor, accelerometer, location sensor, rotary velocity sensor, light sensor, camera, microphone, or some other type of sensor. The sensor may be configured to detect changes in acceleration, inclination, inertia, or location. For example, messaging device 102 may comprise an accelerometer configured to measure the messaging device's acceleration. The sensor is configured to transmit sensor signals to processor 110.
The sensor signals may comprise one or more parameters associated with a position, a movement, an acceleration, or a "jerk" (i.e. the derivative of acceleration) of the messaging device 102. For example, in one embodiment, the sensor may generate and transmit a sensor signal comprising a plurality of parameters, each parameter associated with a movement along or about one measured translational or rotational axis. In some embodiments, the sensor outputs voltages or currents that processor 110 is programmed to interpret to indicate movement along one or more axes.
In some embodiments, processor 110 will receive the sensor signal and determine that it should activate a virtual workspace and interpret sensed movement of the messaging device 102 in an X, Y, or Z direction as corresponding to a virtual movement "within" the virtual workspace. The user may then move device 102 within the virtual workspace to select functions or files by gesturing within the virtual space, for example, by moving the messaging device 102 along the Z-axis overtop of a function within the virtual workspace. In some embodiments, the user may use gestures within the virtual workspace to modify the haptic effects output by messaging device 102.
Figure 2 is an illustration of a system for a texture engine according to one embodiment of the present invention. Figure 2 comprises a messaging device 200, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer. The messaging device 200 is configured to send and receive signals, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet. The messaging device 200 may comprise a wireless network interface and/or a wired network interface (not shown in Figure 2). Although the device 200 is illustrated as a handheld messaging device in Figure 2, other embodiments may comprise different devices, such as video game systems and/or personal computers.
As shown in Figure 2, the messaging device 200 comprises a housing 202 and a display 216. In some embodiments, display 216 may comprise an LCD display. In other embodiments, display 216 may comprise a plasma display, or other type of display known in the art. Display 216 is configured to receive a display signal and output an image associated with that display signal. In some embodiments, the display signal may comprise a vga, hdmi, svga, video, s-video, or other type of display signal known in the art. In the embodiment shown in Figure 2, display 216 comprises a textured ball 204. Display 216 further comprises texture selection icons 206, which comprise the textures of rocks, sand, and water.
Referring still to Figure 2, the messaging device 200 further comprises a manipulandum 214. In the embodiment shown in Figure 2, the manipulandum 214 comprises a roller ball and buttons. The messaging device 200 also comprises a touch-sensitive interface 218. In the embodiment shown in Figure 2, touch-sensitive interface 218 comprises a touch-screen positioned overtop of display 216. In some embodiments, display 216 and the touch-screen may comprise a single integrated component, such as a touch-screen display. Manipulandum 214 and touch-sensitive interface 218 are configured to detect user interaction and transmit interface signals corresponding to the user interaction to the processor. In some embodiments, the user interaction is associated with a graphical user interface shown on display 216. In such an embodiment, the processor receives the interface signal and, based at least in part on the interface signal, manipulates the graphical user interface. For example, in the embodiment shown in Figure 2, the user may use either manipulandum 214 or touch-sensitive interface 218 to select one of texture selection icons 206. Once the user has selected a texture for textured ball 204, its appearance on the screen may change to correspond to that texture. For example, if the user selects the sand texture icon, the processor may manipulate the display to give textured ball 204 the appearance of a sandy surface, and further determine a haptic effect that causes the user to feel a sandy texture when interacting with textured ball 204. Or, in another embodiment, if the user selects the rocky texture icon, the processor may determine a haptic effect that causes the user to feel a rocky texture when the user interacts with textured ball 204.
Messaging device 200 further comprises an actuator configured to receive a haptic signal and output a haptic effect (not shown in Figure 2). In some embodiments, the haptic effect comprises a vibrotactile texture felt by the user of messaging device 200. Processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to the actuator. In some embodiments, determining a haptic effect may comprise a series of calculations to determine the haptic effect. In other embodiments, determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect. In still other embodiments, determining the haptic effect may comprise using a combination of lookup tables and algorithms. Once processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to the actuator. The actuator receives the haptic signal from processor 110 and generates the haptic effect. The user may feel the haptic effect via the surface of display 216 or through some other part of messaging device 200, for example via manipulandum 214 or housing 202. In some embodiments, the processor may modify this haptic effect as the user moves his/her finger over the surface of textured ball 204, in order to simulate changes in texture.
Illustrations of Systems for a Texture Engine
Figure 3a is an illustration of a system for a texture engine according to one embodiment of the present invention. Figure 3a comprises a messaging device 300, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer. The messaging device 300 is configured to send and receive signals comprising messages, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet. The messaging device 300 may comprise a wireless network interface and/or a wired network interface (not shown in Figure 3a). Although the device 300 is illustrated as a handheld messaging device in Figure 3a, other embodiments may comprise different devices, such as video game systems and/or personal computers. As shown in Figure 3a, messaging device 300 comprises a display 316. Display 316 is configured to receive a display signal, and output an image based at least in part on the display signal. Messaging device 300 further comprises a processor (not shown in Figure 3a) configured to transmit the display signal to display 316. Messaging device 300 further comprises a touch-sensitive interface 314 mounted overtop of display 316. Touch-sensitive interface 314 is configured to detect a user interaction and transmit an interface signal corresponding to the user interaction to the processor. Display 316 comprises two icons 302 and 304. When the user interacts with one of icons 302 and 304, touch-sensitive interface 314 will detect the user interaction and transmit a corresponding interface signal to the processor. Based on this interface signal, the processor may determine that the user has opened a file linked to one of the icons or performed some other action known in the art.
As shown in Figure 3a, each of icons 302 and 304 comprises a texture. In the embodiment shown, icon 302 comprises the texture of bricks and icon 304 comprises the texture of rocks. In other embodiments, different textures may be used, for example, the texture of sand, water, oil, grass, fur, skin, leather, ice, wood, or some other texture known in the art. When the user, shown in Figure 3a as finger 306, interacts with the section of display 316 associated with each icon, the processor will determine a haptic effect configured to simulate the texture of that icon. The processor will then output a signal associated with the haptic effect to an actuator (not shown in Figure 3a) configured to output the haptic effect. For example, in the embodiment shown in Figure 3a, when the user interacts with the section of display 316 associated with the icon 302, the processor will determine a haptic effect associated with the texture of bricks. This haptic effect may be characterized by a random signal punctuated with high powered pulses as the user's finger 306 moves across the mortar. In other embodiments, other haptic effects will be used to simulate different textures that may correspond to the image shown on display 316.
Figure 3b is an illustration of a system for a texture engine according to one embodiment of the present invention. In the embodiment shown in Figure 3b, determining the haptic effect comprises mapping the display signal to the actuator. The embodiment shown in Figure 3b, comprises a magnified section of a display 350. Display 350 is configured to receive a display signal from the processor. The display signal comprises a plurality of pixels that are each associated with a color and an intensity of that color. Display 350 receives this display signal and outputs an image associated with the display signal. In the embodiment shown in Figure 3b, the magnified portion of display 350 comprises six pixels: 351, 352, 353, 354, 355, and 356. Each pixel is associated with a color and an intensity for that color ranging from 1-10. For example, pixel 355 is associated with the color green, and the color intensity 3 out of 10. Thus, the display 350 will output the color green at an intensity of 3 at the location of pixel 355.
In the embodiment shown in Figure 3b, the processor will determine the haptic effect based, at least in part, on the display signal and an interface signal received from a touch- sensitive interface mounted overtop of display 350 (not shown in Figure 3b). For example, in the embodiment shown in Figure 3b, the processor uses the display signal to associate, or "map," a haptic effect with each pixel. For example, in the embodiment shown in Figure 3b, the processor may determine a different frequency haptic effect for each color. The processor may further associate the intensity of the haptic effect at each pixel with the intensity of the color at each pixel. For example, the processor may determine that a pixel with a color intensity of 8 will also have a haptic intensity of 8. When the processor receives an interface signal associated with user interaction overtop of the pixels on the display, the processor will output a haptic signal associated with the pixels the user is interacting with. This haptic effect is configured to cause the user to feel a texture on the surface of the display.
For example, in the embodiment shown in Figure 3b, the processor may determine blue pixels are associated with a knocking haptic effect, red pixels are associated with a pulsing vibration, and green pixels are associated with a clicking haptic effect. In such an embodiment, when the touch-sensitive interface detects that the user's finger has passed over pixel 351, the processor will determine a knocking with an intensity of 1. Then, as the user's finger moves over pixel 352, the processor will determine a pulsing vibration with an intensity of 5. And, as the user's finger continues to move across display 350 to pixel 353, the processor may determine a clicking effect with an intensity of 3.
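A minimal Python sketch of this per-pixel mapping, using the knocking, pulsing, and clicking assignments and intensities from the example above; the dictionary layout and function name are assumptions introduced for illustration.

    EFFECT_BY_COLOR = {"blue": "knock", "red": "pulsing vibration",
                       "green": "click"}

    def haptic_for_pixel(color, color_intensity):
        """Map a pixel's color to a haptic effect and reuse its color
        intensity (1-10) as the haptic intensity."""
        return EFFECT_BY_COLOR[color], color_intensity

    print(haptic_for_pixel("blue", 1))   # pixel 351 -> ('knock', 1)
    print(haptic_for_pixel("red", 5))    # pixel 352 -> ('pulsing vibration', 5)
    print(haptic_for_pixel("green", 3))  # pixel 353 -> ('click', 3)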
These haptic effects are configured to cause the user to feel a texture on the surface of display 350 as the user moves his/her finger over the surface of display 350. In some embodiments, the processor may be in communication with more than one actuator, and each color may be associated with its own actuator. In other embodiments, different combinations of colors, intensities, and haptic effects may be used to cause the user to feel a texture on the surface of the display.
Figure 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention, which is discussed with respect to the device shown in Figure 1. As shown in Figure 4, the method 400 begins when processor 110 receives a display signal comprising a plurality of pixels 402. The display signal may comprise a vga, hdmi, svga, video, s-video, or other type of display signal known in the art. The display signal may comprise a graphical user interface, or other image that the messaging device will display to the user via display 116.
Then, touch-sensitive interface 114 transmits an interface signal to processor 110, which receives the interface signal 404. In some embodiments, touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user. In other embodiments, the touch-sensitive interface may comprise a button, switch, scroll wheel, roller ball, or some other type of physical device interface known in the art. In some embodiments, processor 110 is in communication with a single touch-sensitive interface 114. In other embodiments, processor 110 is in communication with a plurality of touch-sensitive interfaces 114, for example, a touch-screen and a roller ball. Touch-sensitive interface 114 is configured to detect user interaction, and based at least in part on the user interaction, transmit signals to the processor. In some embodiments, touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction and incorporate this information into the interface signal.
Next, processor 110 determines a haptic effect comprising a texture 406. The haptic effect may comprise a vibration that the user may feel through the surface of a touch- sensitive interface or a manipulandum. In some embodiments, this vibration may cause the user to feel a texture on the surface of the touch-sensitive interface. For example, the texture of leather, snow, sand, ice, skin, or some other surface. In some embodiments, determining a haptic effect may comprise a series of calculations to determine the haptic effect. In other embodiments, determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect. In still other embodiments, determining the haptic effect may comprise a combination of lookup tables and algorithms.
In some embodiments, determining the haptic effect may comprise a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, determining the haptic effect may comprise assigning a haptic effect to each color. Then processor 110 will output a haptic signal comprising the haptic effect. In some embodiments, processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal. In some embodiments, processor 110 may determine the haptic effect based, at least in part, on a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine different intensity haptic effects based on the interface signal received from touch-sensitive interface 114. For example, if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low intensity haptic effect. In still other embodiments, processor 110 may determine a low intensity haptic effect when touch-sensitive interface 114 detects low speed user interaction. Further, processor 110 may determine high intensity haptic effects when touch-sensitive interface 114 detects a high speed user interaction. In still other embodiments, processor 110 may determine no haptic effect, unless it receives an interface signal comprising a user interaction from touch-sensitive interface 114. Finally, processor 110 transmits a haptic signal associated with the haptic effect to actuator 118, which is configured to receive the haptic signal and output the haptic effect 408. Actuator 118 is configured to receive a haptic signal from processor 110 and generate the haptic effect. Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
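Tying the steps of method 400 together, a highly simplified Python sketch might look like the following. The pixel map, lookup table, and pressure scaling are illustrative assumptions, not the claimed implementation.

    def texture_engine_step(display_pixels, interface_signal, effect_table):
        """One pass of method 400: given a received display signal (a pixel
        map) and an interface signal, determine a haptic effect from a
        lookup table and return the haptic signal to transmit."""
        if interface_signal is None:
            return None  # no user interaction, no haptic effect
        x, y, pressure = interface_signal
        color, color_intensity = display_pixels[(x, y)]
        return {"effect": effect_table[color],
                "intensity": color_intensity * pressure}

    display_pixels = {(0, 0): ("red", 5)}
    print(texture_engine_step(display_pixels, (0, 0, 0.8), {"red": "pulse"}))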
Figure 5a is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Figure 5a comprises brick. The texture of brick is characterized by having a rough irregular texture from bricks, punctuated with the feel of gritty valleys from the mortar. A system for a texture engine may generate the rough irregular texture of brick by driving an actuator, such as an LRA, LPA, or FPA, with a random signal with medium to high maximum variance while the user's finger is moving. In some embodiments, this variance may be adjusted for different roughness. In some embodiments, the transition from brick to mortar may be rendered by a high duration pop created by an ERM. Additionally, if the mortar is thick enough, a fine texture may be rendered by driving an actuator with a lower magnitude signal with a higher variance than that used to drive the actuator outputting the texture of the brick.
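A possible sketch of this brick rendering, assuming a normalized drive signal and a caller that fires the ERM pop when flagged; the class layout, variance value, and transition logic are illustrative assumptions.

    import random

    class BrickTexture:
        """Brick: medium-to-high-variance noise while the finger moves, and
        a flag telling the caller to fire the long ERM 'pop' exactly when
        the finger crosses from brick onto mortar."""

        def __init__(self, roughness=0.6):
            self.roughness = roughness
            self.was_on_mortar = False

        def sample(self, finger_moving, on_mortar):
            pop = on_mortar and not self.was_on_mortar  # transition only
            self.was_on_mortar = on_mortar
            noise = random.gauss(0.0, self.roughness) if finger_moving else 0.0
            return max(-1.0, min(1.0, noise)), pop

    brick = BrickTexture()
    print(brick.sample(finger_moving=True, on_mortar=False))
    print(brick.sample(finger_moving=True, on_mortar=True))  # pop fires here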
Figure 5b is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Figure 5b comprises rocks. The texture of rocks is characterized by smooth surfaces punctuated with transitions as the user moves from rock to rock. In order to output the texture of a rock, an actuator, such as an FPA, is used to create patches of low friction. Individual rocks may be rendered by maintaining a non-visual edge map of the displayed image and outputting a high magnitude haptic signal to an actuator, such as an LPA or ERM, when the touch-sensitive interface detects the user's movement. For example, outputting the haptic effect whenever the touch-sensitive interface detects that the user's finger is transitioning from one rock to another.
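One way this rock rendering might be sketched, assuming the non-visual edge map is represented as a set of boundary positions; the friction and pulse values are illustrative, not disclosed.

    def rock_signal(edge_map, position, base_friction=0.1):
        """Rocks: low friction everywhere, with a high magnitude pulse when
        the finger sits on a boundary position in the non-visual edge map
        (a set of (x, y) pixels separating one rock from the next)."""
        pulse = 1.0 if position in edge_map else 0.0
        return {"friction": base_friction, "pulse": pulse}

    edge_map = {(3, 0), (7, 0)}  # boundaries between three rocks on one row
    path = [(x, 0) for x in range(10)]
    print([rock_signal(edge_map, p)["pulse"] for p in path])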
Figure 5c is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Figure 5c comprises sand or sandpaper. Sand is characterized by a rough, gritty feel as well as the sensation of a pile of sand particles building up in front of the user's finger. In order to output the rough, gritty texture of sand, an actuator, such as an LRA, LPA, or FPA, is driven with a random signal with high maximum variance while the user's finger is moving. In some embodiments, the processor may adjust the variance of the signal to create different degrees of roughness. To create the feeling of sand piling up, an actuator such as an FPA may be used. In such an embodiment, when the user moves their finger across the touch-screen, the processor will drive the actuator with a signal that starts with a low intensity and builds as the user moves his/her finger in one direction.
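The sand buildup might be sketched as follows, assuming the caller tracks how many consecutive steps the finger has moved in the same direction; the ramp length and variance are illustrative choices.

    import random

    def sand_samples(steps_in_same_direction, max_variance=0.8):
        """Sand: random high-variance grit whose envelope builds while the
        finger keeps moving the same way, imitating sand piling up in
        front of the finger."""
        out = []
        for steps in steps_in_same_direction:
            envelope = min(1.0, steps / 50.0)  # ramp over 50 steps
            out.append(envelope * random.uniform(-max_variance, max_variance))
        return out

    # A finger moving steadily in one direction for 100 steps:
    print(sand_samples(range(100))[:5])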
In another embodiment, the texture shown in Figure 5c may comprise sandpaper. Sandpaper is characterized by having a rough, gritty feel. To create the rough, gritty feel, the processor drives an actuator, such as an LRA, LPA, or FPA, with a random signal with high maximum variance. In some embodiments, this signal is output only while the user's finger is moving across the surface of the touch-sensitive interface. In some embodiments, the processor may adjust the variance of the signal to change the level of roughness.

Figure 5d is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Figure 5d, the texture comprises the texture of grass. Grass is characterized by a periodic light sensation that almost tickles the user's finger. In order to create the sensation of grass, the processor may drive an actuator, such as an FPA, with a signal configured to create patches of low friction overlaid with patches of grass. In some embodiments, the processor may render individual grass blades by having a non-visual edge map of the displayed image and outputting a low magnitude signal to an actuator, such as an LPA or ERM, when the user interface detects the user interaction.

Figure 5e is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Figure 5e, the texture comprises the texture of fabric. Fabric is characterized by a light, smooth sensation. In order to create the sensation of the texture of fabric, the processor may drive an actuator, such as an LPA or an LRA, with low magnitude, high frequency signals as the user's finger moves across the surface of the touch-sensitive interface.
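Taken together, the sandpaper, grass, and fabric descriptions suggest a small per-texture table of drive-signal parameters; every number below is a guess chosen for illustration, since the disclosure states only qualitative relationships.

```python
# Sketch: per-texture drive-signal presets for the textures described above.
# All numeric values are assumptions; only the qualitative contrasts
# (e.g., sandpaper rough, fabric smooth and high-frequency) come from the text.

TEXTURE_PRESETS = {
    "sandpaper": dict(magnitude=0.8, variance=0.7, frequency_hz=None, moving_only=True),
    "grass":     dict(magnitude=0.2, variance=0.1, frequency_hz=20,   moving_only=True),
    "fabric":    dict(magnitude=0.2, variance=0.0, frequency_hz=250,  moving_only=True),
}

def drive_parameters(texture, finger_moving):
    preset = TEXTURE_PRESETS[texture]
    if preset["moving_only"] and not finger_moving:
        return None  # no signal while the finger is stationary
    return preset

print(drive_parameters("fabric", finger_moving=True))
```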
Figure 5f is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Figure 5f, the texture comprises the texture of water or molasses. Water is characterized by having almost no sensation; however, water that is disturbed may splash around and hit against the user's finger. To emulate the texture of water, the processor may drive an actuator such as an FPA to reduce the friction on the surface of the touch-sensitive interface. To emulate the water sloshing, the processor may output the haptic signal only while the user is touching the screen. To emulate the texture of a more viscous fluid, such as molasses or oil, the processor may drive the actuator with a signal configured to increase the friction on the user's finger as it moves across the surface of the touch-sensitive interface.
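The water/molasses contrast might reduce to modulating a single friction level on a variable-friction actuator, as in this sketch; the coefficient values and the linear speed term are assumptions.

```python
# Sketch: friction modulation for fluid textures, per the description above.
# Coefficient values and the speed-dependent drag term are assumptions.

def fluid_friction(texture, touching, finger_speed):
    """Return a target friction level for a variable-friction (FPA-style) actuator."""
    if not touching:
        return None  # no sloshing output when the user is not touching
    if texture == "water":
        return 0.05  # near-frictionless surface
    if texture == "molasses":
        # Viscous drag: friction grows with finger speed.
        return min(1.0, 0.5 + 0.5 * finger_speed)
    raise ValueError(f"unknown fluid texture: {texture}")

print(fluid_friction("water", True, 0.4))      # 0.05
print(fluid_friction("molasses", True, 0.4))   # 0.7
```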
Figure 5g is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Figure 5g, the texture comprises the texture of leather. Leather is characterized by an overall smooth feeling punctuated by the bumps and valleys of the surface of the leather. In order to create the sensations of the texture of leather, the processor may drive an actuator, such as an FPA, with a signal configured to output a haptic effect that reduces friction as the user's finger moves across the surface of the touch-sensitive interface. The processor may output the cracks and bumps by driving the actuator with very short, low magnitude haptic signals at times when the touch-sensitive interface detects that the user's finger is moving.
Figure 5h is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Figure 5h, the texture comprises the texture of wood. Wood may be characterized by an irregular bumpy texture punctuated by a sharp transition as the user moves from board to board. In order to create the irregular bumpy texture, the processor may use a non-visual edge map of the displayed image and drive an actuator, such as an LRA, LPA, or FPA, with a very short, low magnitude signal at various times when the user's finger is moving. To output the transition from plank to plank, the processor may output a haptic signal configured to cause the actuator to generate a high magnitude, short duration pop.

In other embodiments, haptic effects associated with different textures may be output. For example, in one embodiment, the processor may transmit a haptic signal configured to cause the actuator to output a haptic effect configured to cause the user to feel a texture associated with the texture of ice. Ice is characterized by low friction; in some embodiments, ice has a completely smooth texture, while in other embodiments ice comprises a fine, low magnitude gritty texture. In order to create the texture of ice, the processor may determine a haptic signal configured to cause the actuator to reduce the friction as much as possible while the user moves a finger across the surface of the touch-sensitive interface. In another embodiment, the processor may drive an actuator, such as an LPA or LRA, with a haptic signal configured to output low magnitude effects while the user moves a finger. These low magnitude effects may be associated with imperfections or grit on the surface of the ice.

In another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of lizard skin. Lizard skin is characterized by an overall smooth sensation punctuated by transitions from bump to bump on the skin. In order to implement a haptic effect comprising the texture of lizard skin, the processor may drive an actuator with a haptic signal configured to cause the actuator to create patches of low friction on the touch-sensitive interface. The processor may render cracks on the surface of the skin by periodically outputting high magnitude haptic signals when the touch-sensitive interface detects that the user's finger is moving across its surface. These high magnitude signals may approximate the cracks in the surface of the skin.

In yet another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of fur. Fur is characterized by a periodic light sensation that is very soft to the touch. In order to implement a haptic effect comprising the texture of fur, the processor may drive the actuator with a haptic signal configured to cause the actuator to output a haptic effect configured to reduce the friction the user feels on the surface of the touch-sensitive interface. The processor may further render individual hairs by outputting low magnitude pulsing haptic signals as the touch-sensitive interface detects the user's movement.
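The periodic pulses used above for fur hairs, and at higher magnitude for lizard-skin cracks, could be generated from the distance traveled, as in this sketch; the spacings and magnitudes are assumptions.

```python
# Sketch: periodic pulse trains for fur hairs (low magnitude, dense) and
# lizard-skin cracks (high magnitude, sparse). All values are assumptions.

def pulse_train(distance_moved, spacing, magnitude):
    """Emit one brief pulse each time the finger travels `spacing` units."""
    n_pulses = int(distance_moved // spacing)
    return [{"magnitude": magnitude, "duration_ms": 5}] * n_pulses

fur_pulses   = pulse_train(distance_moved=12.0, spacing=1.0, magnitude=0.15)
crack_pulses = pulse_train(distance_moved=12.0, spacing=4.0, magnitude=0.9)
print(len(fur_pulses), len(crack_pulses))  # 12 dense soft pulses vs 3 sharp ones
```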
In yet another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of metal. Metal is characterized by a smooth low friction surface that, in some embodiments, includes light grit. In order to implement a haptic effect comprising the texture of metal, the processor may drive the actuator with a signal configured to lower the friction the user feels on the surface of the touch-sensitive interface. In some embodiments, the processor may render individual bumps by outputting brief high magnitude haptic signals when the touch-sensitive interface detects that the user is moving over its surface. These brief high magnitude signals may approximate grit on the surface of the metal.
In yet other embodiments, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating another sensation, for example, heat. In such an embodiment, the processor may output a haptic signal configured to cause the actuator to output a high frequency jolting effect when the user touches elements of the display that are associated with heat.
Advantages of Systems and Methods for a Texture Engine

There are many advantages of systems and methods for a texture engine. For example, systems and methods for a texture engine add a previously unused haptic effect to a mobile device. This new effect provides a new avenue for the user to receive information from the mobile device, without the user having to look at its display. For example, systems and methods for a texture engine may allow the user to assign different textures to different icons, buttons, or other components of the display. Thus, the user may be able to determine which icon they are touching without having to look at that icon. This may increase the usability of the device, and may make the device more useful to the visually impaired.
Further, because systems and methods for a texture engine provide the user with more information without distracting the user from other tasks, they may reduce user error. For example, users will be less likely to hit the wrong icon or press the wrong key if they are utilizing systems and methods for a texture engine. This functionality may serve both to increase user satisfaction and to increase the adoption rate for technology that incorporates systems and methods for a texture engine.

General Considerations
The use of "adapted to" or "configured to" herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of "based on" is meant to be open and inclusive, in that a process, step, calculation, or other action "based on" one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting. Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors. The processor comprises or has access to a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer- executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, a haptic effect selection routine, and suitable programming to produce signals to generate the selected haptic effects as noted above.
Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices. Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may include computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.

Claims

That which is claimed is:

1. A system comprising:
a processor configured to:
receive a display signal comprising a plurality of pixels;
determine a haptic effect comprising a texture; and
transmit a haptic signal associated with the haptic effect;
an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.
2. The system of claim 1, wherein the texture is a vibrotactile effect.
3. The system of claim 1, wherein the texture comprises the texture of sand, lizard skin, or brick.
4. The system of claim 1, wherein the actuator comprises one of: an eccentric rotating mass motor, a linear resonant actuator, a shape memory alloy, an electroactive polymer, or a piezoelectric actuator.
5. The system of claim 1, wherein the haptic effect is determined based at least in part on the display signal.
6. The system of claim 5, wherein each of the plurality of pixels is associated with a color, and wherein determining the haptic effect comprises assigning a haptic value to the color.
7. The system of claim 6, wherein determining the haptic effect comprises assigning a haptic value to only some of the plurality of pixels.
8. The system of claim 6, wherein each color comprises an intensity, and determining the haptic effect further comprises adjusting the haptic value to correspond to the intensity.
9. The system of claim 1, further comprising a display in communication with the processor, the display configured to receive the display signal and output an image.
10. The system of claim 9, wherein the texture is output onto a surface of the display.
11. The system of claim 9, wherein the actuator is coupled to the display.
12. The system of claim 1, further comprising a housing configured to enclose the actuator and the processor.
13. The system of claim 12, wherein the housing comprises a mobile device housing.
14. The system of claim 12, wherein the actuator is coupled to the housing.
15. The system of claim 1, further comprising a touch-sensitive interface configured to detect user interaction and transmit a sensor signal to the processor based at least in part on the user interaction.
16. The system of claim 15, wherein the processor is further configured to determine the haptic effect based at least in part on the sensor signal.
17. The system of claim 16, wherein the touch-sensitive interface is configured to detect the speed of the user interaction and wherein determining the haptic effect comprises adjusting the haptic effect to correspond with the speed of the user interaction.
18. The system of claim 16, wherein the touch-sensitive interface is configured to detect the pressure of the user interaction and wherein determining the haptic effect comprises adjusting the intensity of the haptic effect to correspond with the pressure of the user interaction.
19. A method for outputting a haptic effect comprising:
receiving a display signal comprising a plurality of pixels;
determining a haptic effect comprising a texture; and
transmitting a haptic signal associated with the haptic effect to an actuator configured to receive the haptic signal and output the haptic effect.
20. The method of claim 19, wherein the haptic effect is determined based at least in part on the display signal.
21. The method of claim 20, wherein each of the plurality of pixels is associated with a color, and wherein determining the haptic effect comprises assigning a haptic value to each color.
22. The method of claim 20, wherein each color comprises an intensity and determining the haptic effect further comprises associating the haptic value with the intensity.
23. The method of claim 20, further comprising receiving an interface signal from a touch-sensitive interface, and wherein the haptic effect is determined based at least in part on the interface signal.
24. A system comprising:
a touch-sensitive interface configured to detect a user interaction and transmit a signal corresponding to the user interaction, the touch-sensitive interface configured to detect the speed and pressure of the user interaction;
a processor in communication with the touch-sensitive interface, the processor configured to:
receive a display signal comprising a plurality of pixels that each comprise a color and an intensity;
determine a haptic effect based at least in part on the color and intensity of each pixel and the speed and pressure of the user interaction; and
transmit a haptic signal associated with the haptic effect;
an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.
EP10712202A 2009-03-12 2010-03-11 Systems and methods for a texture engine Ceased EP2406704A1 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US15948209P 2009-03-12 2009-03-12
US26204109P 2009-11-17 2009-11-17
US26203809P 2009-11-17 2009-11-17
US12/697,037 US9927873B2 (en) 2009-03-12 2010-01-29 Systems and methods for using textures in graphical user interface widgets
US12/697,042 US10564721B2 (en) 2009-03-12 2010-01-29 Systems and methods for using multiple actuators to realize textures
US12/696,900 US9696803B2 (en) 2009-03-12 2010-01-29 Systems and methods for friction displays and additional haptic effects
US12/696,893 US9746923B2 (en) 2009-03-12 2010-01-29 Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
US12/696,908 US10007340B2 (en) 2009-03-12 2010-01-29 Systems and methods for interfaces featuring surface-based haptic effects
US12/697,010 US9874935B2 (en) 2009-03-12 2010-01-29 Systems and methods for a texture engine
PCT/US2010/026909 WO2010105012A1 (en) 2009-03-12 2010-03-11 Systems and methods for a texture engine

Publications (1)

Publication Number Publication Date
EP2406704A1 true EP2406704A1 (en) 2012-01-18

Family

ID=73451201

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10712202A Ceased EP2406704A1 (en) 2009-03-12 2010-03-11 Systems and methods for a texture engine

Country Status (5)

Country Link
EP (1) EP2406704A1 (en)
JP (1) JP5779508B2 (en)
KR (2) KR102003426B1 (en)
CN (1) CN102349038B (en)
WO (1) WO2010105012A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10073527B2 (en) 2009-03-12 2018-09-11 Immersion Corporation Systems and methods for providing features in a friction display including a haptic effect based on a color and a degree of shading

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9927873B2 (en) 2009-03-12 2018-03-27 Immersion Corporation Systems and methods for using textures in graphical user interface widgets
US9448713B2 (en) * 2011-04-22 2016-09-20 Immersion Corporation Electro-vibrotactile display
WO2013157626A1 (en) * 2012-04-20 2013-10-24 株式会社ニコン Electronic device and vibration control method
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
WO2013169846A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying additional information in response to a user contact
EP2847659B1 (en) 2012-05-09 2019-09-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
AU2013259613B2 (en) 2012-05-09 2016-07-21 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
EP3264252B1 (en) 2012-05-09 2019-11-27 Apple Inc. Device, method, and graphical user interface for performing an operation in accordance with a selected mode of operation
KR101806350B1 (en) 2012-05-09 2017-12-07 애플 인크. Device, method, and graphical user interface for selecting user interface objects
CN109298789B (en) 2012-05-09 2021-12-31 苹果公司 Device, method and graphical user interface for providing feedback on activation status
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
CN102662477A (en) * 2012-05-10 2012-09-12 孙晓颖 Touch representation device based on electrostatic force
CN104737096B (en) * 2012-05-31 2018-01-02 诺基亚技术有限公司 Display device
JP6071372B2 (en) 2012-09-21 2017-02-01 キヤノン株式会社 Electronic device and control method of electronic device
US9196134B2 (en) * 2012-10-31 2015-11-24 Immersion Corporation Method and apparatus for simulating surface features on a user interface with haptic effects
JP6020083B2 (en) * 2012-11-19 2016-11-02 アイシン・エィ・ダブリュ株式会社 Operation support system, operation support method, and computer program
US9330544B2 (en) * 2012-11-20 2016-05-03 Immersion Corporation System and method for simulated physical interactions with haptic effects
CN107831991B (en) 2012-12-29 2020-11-27 苹果公司 Device, method and graphical user interface for determining whether to scroll or select content
WO2014105274A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for navigating user interface hierarchies
CN105144057B (en) 2012-12-29 2019-05-17 苹果公司 For moving the equipment, method and graphic user interface of cursor according to the cosmetic variation of the control icon with simulation three-dimensional feature
AU2013368441B2 (en) 2012-12-29 2016-04-14 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US9405369B2 (en) * 2013-04-26 2016-08-02 Immersion Corporation, Inc. Simulation of tangible user interface interactions and gestures using array of haptic cells
EP3144780A1 (en) 2013-06-11 2017-03-22 Immersion Corporation Systems and methods for pressure-based haptic effects
EP2876533A4 (en) * 2013-09-26 2015-08-19 Fujitsu Ltd Drive control apparatus, electronic device, and drive control method
JP2015130168A (en) * 2013-12-31 2015-07-16 イマージョン コーポレーションImmersion Corporation Friction augmented control, and method to convert buttons of touch control panels to friction augmented controls
JP6644466B2 (en) * 2013-12-31 2020-02-12 イマージョン コーポレーションImmersion Corporation System and method for providing tactile notification
JP6289100B2 (en) * 2014-01-06 2018-03-07 キヤノン株式会社 Information processing apparatus, information processing method, and program
WO2015121964A1 (en) * 2014-02-14 2015-08-20 富士通株式会社 Input device
JP6319328B2 (en) * 2014-02-14 2018-05-09 富士通株式会社 Educational tactile sensation providing apparatus and system
WO2015121969A1 (en) * 2014-02-14 2015-08-20 富士通株式会社 Tactile device and system
WO2015121971A1 (en) * 2014-02-14 2015-08-20 富士通株式会社 Tactile device and system
CN111399646A (en) * 2014-03-21 2020-07-10 意美森公司 Systems, methods, and computer-readable media for force-based object manipulation and haptic detection
US9645646B2 (en) * 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
JP2016057764A (en) * 2014-09-08 2016-04-21 株式会社東海理化電機製作所 Tactile sense presentation device
KR102398389B1 (en) 2014-11-12 2022-05-16 엘지디스플레이 주식회사 Method for modeling of haptic signal from haptic object, display apparatus and method for driving thereof
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
EP3659011B1 (en) * 2017-07-27 2022-09-21 Telefonaktiebolaget LM Ericsson (PUBL) Improved perception of haptic objects
US10747404B2 (en) * 2017-10-24 2020-08-18 Microchip Technology Incorporated Touchscreen including tactile feedback structures and corresponding virtual user interface elements
JP7087367B2 (en) * 2017-12-08 2022-06-21 富士フイルムビジネスイノベーション株式会社 Information processing equipment, programs and control methods
US11158220B2 (en) * 2018-12-10 2021-10-26 Universal City Studios Llc Interactive animated protection window with haptic feedback system
JP7345387B2 (en) * 2019-12-26 2023-09-15 Kddi株式会社 Tactile sensation presentation system, local terminal and server device of the tactile sensation presentation system, tactile sensation presentation method, and tactile sensation presentation program
WO2023108131A1 (en) * 2021-12-10 2023-06-15 Shaw Industries Group, Inc. Visceral surface covering simulator and method of use
KR102504937B1 (en) 2021-12-22 2023-03-02 현대건설기계 주식회사 Remote Control System for Construction Equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117490A1 (en) * 2001-12-20 2003-06-26 Kirk Tecu Tactile display apparatus
US20030234768A1 (en) * 2002-05-16 2003-12-25 Junichi Rekimoto Input method and input device
US20060024647A1 (en) * 2004-07-30 2006-02-02 France Telecom Method and apparatus for communicating graphical information to a visually impaired person using haptic feedback
US20080122589A1 (en) * 2006-11-28 2008-05-29 Ivanov Yuri A Tactile Output Device
US20090167701A1 (en) * 2007-12-28 2009-07-02 Nokia Corporation Audio and tactile feedback based on visual environment
WO2009097866A1 (en) * 2008-02-04 2009-08-13 Nokia Corporation Device and method for providing tactile information

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6429846B2 (en) * 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
JP2001290572A (en) * 2000-04-05 2001-10-19 Fuji Xerox Co Ltd Information processor
JP2003099177A (en) * 2001-09-21 2003-04-04 Fuji Xerox Co Ltd Method for preparing haptic information and method for presenting haptic information and its device
JP2004310518A (en) * 2003-04-08 2004-11-04 Fuji Xerox Co Ltd Picture information processor
DE10340188A1 (en) 2003-09-01 2005-04-07 Siemens Ag Screen with a touch-sensitive user interface for command input
US20060209037A1 (en) 2004-03-15 2006-09-21 David Wang Method and system for providing haptic effects
JP4860625B2 (en) * 2004-10-08 2012-01-25 イマージョン コーポレーション Haptic feedback for simulating buttons and scrolling motion on touch input devices
US7616192B2 (en) * 2005-07-28 2009-11-10 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Touch device and method for providing tactile feedback
KR100791379B1 (en) * 2006-01-02 2008-01-07 삼성전자주식회사 System and method for user interface


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2010105012A1 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10073527B2 (en) 2009-03-12 2018-09-11 Immersion Corporation Systems and methods for providing features in a friction display including a haptic effect based on a color and a degree of shading
US10747322B2 (en) 2009-03-12 2020-08-18 Immersion Corporation Systems and methods for providing features in a friction display

Also Published As

Publication number Publication date
CN102349038A (en) 2012-02-08
JP5779508B2 (en) 2015-09-16
KR102003426B1 (en) 2019-07-24
JP2012520137A (en) 2012-09-06
KR20110130469A (en) 2011-12-05
KR102051180B1 (en) 2019-12-02
WO2010105012A1 (en) 2010-09-16
CN102349038B (en) 2016-08-24
KR20160110547A (en) 2016-09-21

Similar Documents

Publication Publication Date Title
US10198077B2 (en) Systems and methods for a texture engine
US10564721B2 (en) Systems and methods for using multiple actuators to realize textures
EP2406701B1 (en) System and method for using multiple actuators to realize textures
JP5779508B2 (en) System and method for a texture engine
CN105892921B (en) System and method for implementing texture using multiple actuators
US10379618B2 (en) Systems and methods for using textures in graphical user interface widgets
EP2406705A1 (en) Systems and methods for using textures in graphical user interface widgets
KR101992070B1 (en) Systems and methods for a texture engine

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110824

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

AX Request for extension of the european patent

Extension state: AL BA ME RS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: IMMERSION CORPORATION

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: IMMERSION CORPORATION

17Q First examination report despatched

Effective date: 20170519

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20200122