US20160034088A1 - Touch Force Deflection Sensor - Google Patents

Touch Force Deflection Sensor

Info

Publication number
US20160034088A1
US20160034088A1 (application US14/776,610)
Authority
US
United States
Prior art keywords
force
map
deflection
displacement
moment
Prior art date
Legal status
Granted
Application number
US14/776,610
Other versions
US9851828B2
Inventor
Peter W. Richards
Sinan Filiz
Current Assignee
Apple Inc
Original Assignee
Apple Inc.
Priority date
Filing date
Publication date
Application filed by Apple Inc.
Publication of US20160034088A1
Application granted
Publication of US9851828B2
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L 1/00 Measuring force or stress, in general
    • G01L 1/20 Measuring force or stress, in general by measuring variations in ohmic resistance of solid materials or of electrically-conductive fluids; by making use of electrokinetic cells, i.e. liquid-containing cells wherein an electrical potential is produced or varied upon the application of stress
    • G01L 1/205 Measuring force or stress, in general by measuring variations in ohmic resistance of solid materials or of electrically-conductive fluids; by making use of electrokinetic cells, i.e. liquid-containing cells wherein an electrical potential is produced or varied upon the application of stress using distributed sensing elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L 1/00 Measuring force or stress, in general
    • G01L 1/14 Measuring force or stress, in general by measuring variations in capacitance or inductance of electrical elements, e.g. by measuring variations of frequency of electrical oscillators
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L 1/00 Measuring force or stress, in general
    • G01L 1/20 Measuring force or stress, in general by measuring variations in ohmic resistance of solid materials or of electrically-conductive fluids; by making use of electrokinetic cells, i.e. liquid-containing cells wherein an electrical potential is produced or varied upon the application of stress
    • G01L 1/22 Measuring force or stress, in general by measuring variations in ohmic resistance of solid materials or of electrically-conductive fluids; by making use of electrokinetic cells, i.e. liquid-containing cells wherein an electrical potential is produced or varied upon the application of stress using resistance strain gauges
    • G01L 1/2287 Measuring force or stress, in general by measuring variations in ohmic resistance of solid materials or of electrically-conductive fluids; by making use of electrokinetic cells, i.e. liquid-containing cells wherein an electrical potential is produced or varied upon the application of stress using resistance strain gauges constructional details of the strain gauges
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G06F 3/04144 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position using an array of force sensing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F 3/0446 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/045 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection

Definitions

  • The subject matter of this disclosure relates generally to electronic devices, and specifically to touch screens and other electronic display components.
  • the disclosure relates to touch screen (or touchscreen) sensors, track pads (or trackpads), and other input devices for mobile phones, personal and tablet computers, and other portable and stationary electronics applications.
  • Touch pads, touch screens and other input-sensing devices have a broad range of applications including computer systems, mobile phones, media players, personal digital assistants, gaming platforms, and other electronic devices. Suitable technologies include a variety of resistive and capacitive-coupled sensor systems, as well as optical, electromagnetic, and surface acoustic wave devices.
  • a conducting grid may be utilized, for example with sets of orthogonal traces separated by a dielectric insulator.
  • the grid functions as a capacitive array, which is sensitive to contact (or proximity) based on changes in the corresponding voltage or charge capacity, for example as manifested in a current output.
  • contact with the input surface causes changes in resistance across the insulating layer, which are registered by an increase or decrease in corresponding sense currents.
  • control functions can also be integrated into a single device or form factor, including, but not limited to, real-time operation and control of voice and data communications, messaging, media playback and development, gaming, internet access, navigational services, and personal digital assistant functions including alarms, reminders and calendar tasks.
  • the disclosure relates to force-sensitive input devices, including track pads, touch screens and other control systems with sensitivity to contact forces.
  • the disclosure encompasses a touch sensitive input system for an electronic device, with a deflection sensor disposed adjacent to or along the control surface.
  • the deflection sensor is configured to generate a deflection signal, for example based on a touch pattern generated by a user on the surface of a touch screen, track pad, or other force-sensing control device.
  • a processor is provided in signal communication with the deflection sensor.
  • the processor is operable to generate a deflection map characterizing deflection of the sensing surface, based on the deflection signal.
  • the processor is also operable to generate a force map characterizing force on the sensing surface, based on a transformation of the deflection map.
  • the transformation may be based on a generalized inverse of a compliance operator, which relates the deflection map to the force map.
  • the compliance operator is not necessarily square, and may instead have a rectangular representation, with no strictly defined inverse.
  • the compliance operator may have more rows than columns, where the rows correspond to entries in the deflection map and the columns correspond to entries in the force map.
  • the compliance operator may have more columns than rows.
  • the deflection sensor can be formed of an array of conductive traces disposed in a generally parallel sense with respect to the sensing surface, and configured to generate the deflection signal based on capacitive or resistive coupling.
  • a position sensor may also be disposed with respect to the sensing surface, either combined with the deflection sensor, or provided as an independent system.
  • the position sensor can be configured to generate a position signal based on a touch pattern on the sensing surface, and the processor can be configured to generate a position map characterizing the touch pattern, based on the position signal.
  • a variety of electronic devices may utilize such a touch-sensitive input system, for example a mobile device or smartphone in which the sensing surface comprises a cover glass configured for viewing a touch screen display.
  • a computing device may include a track pad comprising the sensing surface of the input system.
  • Such devices may be controlled based on characteristics of the force map, for example based on a force magnitude or centroid location, as determined in real time operation of the device.
  • Exemplary methods of operation include sensing deflection of a control surface on the electronic device, in response to the touch pattern.
  • a displacement or deflection map may be generated, based on the deflection, and a transformation (or transformation operator) may be defined to relate the displacement map to forces imposed on the control surface.
  • a force moment map can be generated by operation of the transformation on the displacement map, where the force moment map describes the imposed force.
  • the transformation may include a generalized inverse of a compliance operator (or compliance matrix), where the compliance operator relates the displacement map to the imposed forces.
  • the compliance operator need not necessarily be square or invertible. Alternatively, another transformation may be used.
  • the force moment map can be generated based only on the deflection data. That is, the displacement map can be transformed independently of any separate position mapping, so that the force moment map describes the force imposed on the control surface based only on deflection of the control surface, absent any other additional position data.
  • the position of the touch pattern can also be sensed on the control surface, and a centroid of the force moment map can be determined, based in whole or part on the sensed position.
  • the sensor array may provide both deflection and position data, where the position data are also used to describe the force imposed on the control surface, based on the centroid of the touch pattern. This contrasts with independent analysis methods, as described above, where the force magnitudes and centroids are generated independently, based only on the deflection data and displacement mapping, without reference to any other position data or position mapping.
  • a compliance operator C may have a substantially rectangular representation, so that the compliance operator C has no inverse C^{-1}.
  • the moment operator M can be defined to determine the characteristics of the force(s) imposed by the touch pattern, for example one or more scalar or “zeroth-order” moments, in order to determine the magnitude of the force, and any number of first, second or higher-order moments, for example to define centroid values based on linear and two-dimensional positioning of the imposed forces across the control or sensing surface.
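  • As an illustrative numerical sketch of these moments (assuming, purely for illustration, a force map already reconstructed on a regular grid of nodes 36 and an arbitrary node pitch, neither of which is taken from the disclosure), the zeroth-order moment gives the total force and the first-order moments give the centroid:

        import numpy as np

        def force_moments(force_map, pitch_mm=1.0):
            # force_map: 2-D array of reconstructed per-node forces (illustrative units)
            # pitch_mm:  assumed node spacing; an illustrative parameter only
            rows, cols = np.indices(force_map.shape)
            total = force_map.sum()                    # zeroth-order (scalar) moment: force magnitude
            cy = (rows * force_map).sum() / total      # first-order moments define the centroid
            cx = (cols * force_map).sum() / total
            return total, (cy * pitch_mm, cx * pitch_mm)

        # a single touch centered near node (3, 4) of an 8 x 8 grid
        f = np.zeros((8, 8))
        f[2:5, 3:6] = [[0.1, 0.2, 0.1], [0.2, 0.4, 0.2], [0.1, 0.2, 0.1]]
        magnitude, centroid = force_moments(f)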
  • Mechanical stress analysis can also be utilized, based on the material properties of the cover glass, spring membrane, or other control or sensing surface components.
  • the transformation can be defined by physical calibration of the sensing surface, for example using known force and moment inputs, or by any combination of finite element analysis, mechanical stress analysis, and physical calibrations.
  • the transformation can be stored in the form of a lookup table, in memory on the device, and the force moment map can be generated in real time, based on the displacement map in combination with the lookup table.
  • a force moment map can be generated in real time based on the displacement map and such a device lookup table, in order to describe operational input forces and magnitudes imposed on the control surface, according to the corresponding user-defined touch patterns.
  • the force moment map may also describe a force-moment distribution imposed on the surface by the touch pattern, for example to describe both force magnitudes and centroids.
  • program code may be stored or embedded on a non-volatile computer readable storage medium, where the program code is executable by a processor on the electronic device to perform any of the control methods and functions described herein.
  • the program code may be executable to sense deflection of a control surface on the electronic device in response to a (e.g., user-defined) touch pattern.
  • a displacement map of the control surface can be generated, based on the deflection, and the displacement map can be transformed into a force moment map.
  • the transformation operator may include or be based on a generalized inverse of a compliance operator, where the compliance operator relates the displacement map to forces imposed on the control surface by the touch pattern. Forces imposed on the control surface can be determined based on the force moment map, in order to control operation of the electronic device, based on the imposed forces.
  • FIG. 1 is a block diagram of an exemplary electronic device with a force-sensitive input system, for example a touch screen or track pad.
  • FIG. 2 is a schematic diagram of a force-sensitive input system for the device.
  • FIG. 3A is a schematic illustration of a representative force pattern sensed by the device.
  • FIG. 3B is a schematic illustration of an alternate force pattern sensed by the device.
  • FIG. 4A is a perspective view of an exemplary electronic device utilizing the force sensing system, in a touch screen application.
  • FIG. 4B is a perspective view of an exemplary electronic device utilizing the force sensing system, in a track pad application.
  • FIG. 5 is a block diagram illustrating an exemplary method for operating an electronic device, based on the sensed input force.
  • FIG. 1 is a block diagram of an exemplary electronic device 10 with force-sensitive input system 12 , for example a touch screen, track pad or other electromechanical system configured to receive input from external force 14 .
  • Force-sensitive input system 12 provides improved flexibility and functionality for electronic devices 10 , as configured for a wide range of different applications including smartphones (smart phones) and other mobile devices, media players, digital assistants, gaming platforms, computing devices and other electronics systems, in both portable and stationary configurations.
  • force-sensitive input system or apparatus 12 may include both a displacement/force or deflection sensor 12 A and a position sensor 12 B, in either discrete or integrated form.
  • System 12 may also include an internal processor or microprocessor 12 C, and may be combined with an integrated graphical display 16 , for example a touch screen.
  • electronic device 10 may incorporate force sensing system 12 with a separate or discrete display component 16 , for example a force sensing track pad in combination with a separate computer monitor or multimedia display.
  • Device 10 may also include a separate device controller or processor 18 , along with other control or input devices 20 , such as home, menu, and hold buttons, volume controls, mute switches, and other control mechanisms 20 .
  • device 10 may include various internal and external accessories 22 and 24 , for example audio (speaker and microphone) components, cameras and other sensors, lighting, flash, and indicator systems, and additional specialized systems such as acceleration and motion sensors, gyro and GPS sensors, and other accessory or peripheral features, devices and components.
  • Device 10 is typically provided within housing 28 , for example utilizing a variety of metal, plastic and cover glass components to house force-sensitive input system 12 , display 16 , controller/processor 18 and additional control and accessory features 20 and 22 .
  • One or more external accessories 24 can be coupled to electronic device 10 via a variety of different device ports 26 in housing 28 , for example SCSI (small computer system interface), USB (universal serial bus), and other serial and parallel (e.g., SATA and PATA) device ports 26 , or using a wireless interface (I/F), such as an infrared (IR), Bluetooth, or radio frequency (RF) device.
  • Device controller 18 includes microprocessor (μP) and memory components configured to load and execute a combination of operating system and application firmware and software, in order to provide a range of functionality for device 10 .
  • Representative device functions include, but are not limited to, voice communications, voice control, media playback and development, internet browsing, email, messaging, gaming, security, transactions, navigation, calendaring, alarms, reminders, and other personal assistant tasks.
  • controller/processor 18 can be coupled in electronic and data communication with force sensing system 12 and one or more additional control buttons or other input mechanisms 20 , along with various internal and external components including display 16 and accessory features 22 and 24 . Controller/processor 18 can also provide for additional input-output (I/O) and communications features, utilizing a variety of different hard-wired and wireless interfaces and ports 26 , as described above.
  • input system 12 can be configured to provide a combination of position and force sensing capabilities, offering greater input sensing sensitivity, range, and flexibility.
  • input device may include force/deflection sensor components 12 A, either alone or in combination with position sensor components 12 B, in order to provide increased control capabilities for user operation of electronics system or device 10 , as described below.
  • FIG. 2 is a schematic diagram of force-sensitive input system 12 , for example a force-sensitive touch screen or touch pad device.
  • force-sensitive input system 12 includes a force or deflection/displacement sensor (or circuit) 12 A and position sensor (or circuit) 12 B, in combination with a processor 12 C and driver circuit 12 D.
  • force/deflection sensor circuit 12 A may be provided independently, without position sensor circuit 12 B.
  • Force-sensitive input system 12 is configured to recognize single and multiple touch events or touch patterns 30 , as defined along touch-sensitive surface 32 by drive and sensor traces 34 A and 34 B.
  • system 12 may be configured to generate one or both of force/deflection (e.g., displacement) data 30 A and position data 30 B for time-separated, near-simultaneous and substantially simultaneous touch patterns 30 , including data characterizing the corresponding touch position, velocity, size, shape, and force magnitude, in order to provide for more flexible control and operation of electronic devices 10 , as described above with respect to FIG. 1 .
  • touch-sensitive surface 32 is defined by an array of crossed sensor traces 34 A and 34 B, for example a set of substantially parallel driver traces 34 A, in combination with a set of substantially perpendicular or orthogonal sensor traces 34 B, as shown in FIG. 2 .
  • sensor trace and drive trace designations may be reversed, and trace sets 34 A and 34 B can both be active, or both passive.
  • drive circuit 12 D is coupled to drive traces 34 A, with force/deflection and position sensor circuits 12 A and 12 B coupled to sensor traces 34 B.
  • Sensor traces 34 B may also be oriented in a substantially perpendicular (e.g., vertical) or orthogonal sense with respect to (e.g. horizontal) drive traces 34 A. More generally, traces 34 A and 34 B may have any orientation, horizontal, vertical, or otherwise, and may intersect at a range of angles, or be formed in a polar coordinate arrangement, or other form.
  • Force/deflection and position sensor circuits 12 A and 12 B can also be combined into an integrated force/position sensor configuration, or utilize separate sensor traces 34 B, for example with two or more sets of sensor traces 34 B arranged at different skew angles.
  • Processor 12 C generates force/deflection and position data 30 A and 30 B based on signals from one or more force, position, and drive circuits 12 A, 12 B and 12 D, as described above.
  • processor 12 C may be provided as an “on-board” or integrated processor element, as shown in FIG. 2 , or as an off-board or external processor component, for example as provided within a device controller 18 , as shown in FIG. 1 , or using other internal or external data processing components.
  • touch sensitive surface 32 may incorporate or accommodate a display device 16 , for example a graphical or media display for use in a force-sensitive touch screen embodiment of input system 12 .
  • input system 12 may be provided independently of any display 16 , for example with touch-sensitive surface 32 configured for use in a track pad, or similar input device.
  • Sensing points or nodes 36 on touch-sensitive surface 32 may be defined by the intersections of drive and sensor traces 34 A and 34 B, as shown in FIG. 2 .
  • crossed resistive or capacitive traces 34 A and 34 B may be separated by an electrically insulating dielectric spacer or spring membrane layer, with sensitivity to force (or deflection) based on changes in the capacitive or resistive coupling, as defined at various sensing points 36 .
  • Capacitive systems encompass both self-capacitance and mutual capacitance-based measurements, in which the capacitance at different sensing points 36 varies based on the shape and force characteristics of touch patterns 30 , as well as the proximity and electrical properties of the external object(s) used to generate touch patterns 30 .
  • an array of discrete sensor components 36 may also be utilized, for example resistive, capacitive, piezoelectric, or optical sensors, either alone or in combination with trace arrays 34 A and 34 B.
  • discrete sensors may be provided at intersections 36 of traces 34 A and 34 B, as shown in FIG. 2 , or between traces 34 A and 34 B, or at a combination of intersections and other locations.
  • sensing system or device 12 is operable to detect and track a range of attributes for each touch pattern 30 , including, but not limited to, position, size, shape, force, and deflection.
  • system 12 is also configurable to identify distinct touch patterns 30 , and to determine additional attributes including centroid, moment, velocity and acceleration, as individual touch patterns 30 track across sensing surface 32 .
  • The number and distribution of sensing points 36 may vary, depending on desired resolution, sensitivity, and other characteristics. In touch-screen applications, the distribution of nodes 36 may also depend upon desired optical properties, for example transparency, as determined by the density of nodes 36 in an indium tin oxide (ITO) matrix, or other transparent material used to form sensing surface 32 .
  • Force/deflection and position data 30 A and 30 B are utilized to define the shape and other characteristics of touch patterns 30 at particular points in time, allowing corresponding commands to be executed on the host device.
  • individual sensor points or nodes 36 may function substantially independently, with each sensor point 36 generating separate sensor signals.
  • multiplexed signals may be generated along crossed drive and sensor traces 34 A and 34 B, and processor 12 C may provide demultiplexing capability.
  • Drive traces 34 A can also be sequentially activated in order to generate time-separated sensor signals, providing substantially independent force/deflection and position data 30 A and 30 B.
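  • A minimal sketch of such a sequential scan (the read_sense_traces routine below is a hypothetical stand-in for the actual drive and sense electronics, simulated here with noise purely for illustration) assembles the time-separated readings into a pixelated deflection map, one entry per node 36:

        import numpy as np

        NUM_DRIVE, NUM_SENSE = 16, 12   # illustrative trace counts, not taken from the disclosure

        def read_sense_traces(drive_index):
            # hypothetical sensor front end: one reading per sense trace 34B while
            # drive trace 34A number `drive_index` is energized (simulated as noise here)
            return 0.001 * np.random.randn(NUM_SENSE)

        def scan_deflection_map():
            # sequentially activate each drive trace to obtain time-separated signals,
            # then assemble them row by row into a two-dimensional deflection map
            deflection = np.zeros((NUM_DRIVE, NUM_SENSE))
            for i in range(NUM_DRIVE):
                deflection[i, :] = read_sense_traces(i)
            return deflection

        deflection_map = scan_deflection_map()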
  • For force-related data 30 A, however, touch patterns 30 typically result in deflections across a substantial area of sensing surface 32 .
  • force/deflection data 30 A from separate sensor points 36 are not generally independent, and data 30 A from a number of separate sensor points 36 may be required to determine the corresponding force magnitudes, centroids, and moments. This analysis may be performed based on a pixelated image or displacement map of sensing surface 32 , as described below.
  • FIG. 3A is a schematic illustration of a representative force pattern sensed by electronic device 10 .
  • touch pattern 30 is presented as an applied force on sensing surface 32 of input system 12 , for example by touching a finger or stylus to the touch screen display on a mobile phone or smartphone application of electronic device 10 , or using a calibrated weight.
  • Input system 12 generates a two-dimensional displacement or deflection map 40 based on pattern 30 , using an array of overlapping traces 34 A and 34 B or sensors 36 to measure deflection, as described above.
  • maximal displacement region 42 may shift toward the center of sensing surface 32 , along with centroid 44 , which represents the “center of mass” or amplitude-weighted mean of the displacement function, as characterized by the (e.g. first-order) moments of deflection map 40 .
  • force amplitudes, moments and other information may thus be generated from deflection data 30 A (see FIG. 2 ), and utilized to generate two-dimensional deflection map 40 .
  • Deflection map 40 can be transformed to produce a force moment mapping, with moments including the scalar input force and centroid 44 .
  • system 12 may also utilize single or multi-touch position data 30 B to determine the input forces associated with a particular touch pattern 30 , for example at the corresponding finger or stylus positions.
  • the displacement-to-force algorithm may also accommodate force/centroid accuracy and noise effects, capacitive (or other sensor) accuracy, pixel-by-pixel noise contributions, and mechanical tolerances.
  • a linear deflection force model is utilized, in which deflection map 40 is determined based on a sum of deflections due to multiple individual touch patterns 30 , or from a single touch pattern 30 extending across an area of sensing surface 32 .
  • the deflection due to a 200 g weight equivalent input (e.g., a force of about 1.96 N) may be approximately twice the deflection due to a 100 g weight equivalent (about 0.98 N), as extended across a range of different pixels or nodes (i.e., sensing positions) 36 .
  • deflection map 40 at a given point A (e.g., midway between points B and C) is not necessarily the average of the deflection maps for the same input force applied at the endpoints B and C.
  • System 12 may also utilize capacitances and other sensor measurements that do not necessarily scale linearly with displacement, for example by compensating for nonlinearities in the conversion algorithm, and based on physical calibrations.
  • Deflection map 40 can be converted into a force measurement (or force mapping) based on a compliance analysis, as performed in vector, matrix, or tensor form. That is, various forces F applied to the cover glass or other sensing surface 32 can be modeled based on the corresponding measured displacements X (e.g., deflection map 40 ), using a compliance operator (or compliance matrix) C that relates force to displacement, for example by X = CF.
  • displacement X and force F can each be modeled as two-dimensional matrix or tensor quantities, based on the values of deflection map 40 , and the corresponding force distribution on the cover glass, track pad, or other sensing surface 32 .
  • compliance tensor C may have a four-dimensional or higher-order form, based on the two-dimensional forms of displacement and force terms X and F.
  • force term F can be “unwrapped” into a one-dimensional vector, for example with m entries corresponding to the number of discrete locations at which the input forces are modeled along sensing surface 32 , rearranged into a linear (e.g., row or column) vector form.
  • displacements X can also be represented as a one-dimensional quantity, with deflection map 40 unwrapped into k discrete entries, in either column or row vector form.
  • compliance term C can be represented as a two-dimensional matrix or tensor operator, for example with k rows corresponding to the entries in displacement term X, and m columns corresponding to the entries in force term F.
  • displacements X and forces F can be represented as vector, matrix, or tensor quantities of arbitrary dimension, in either covariant or contravariant form, and compliance term C may vary accordingly.
  • n applied forces or touch patterns 30 may be associated with n displacement mappings X, utilizing m ⁇ n and k ⁇ n matrix or tensor forms for force F and displacement X, respectively, with a correspondingly higher-order representation for compliance tensor or matrix operator C.
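  • The unwrapping described above can be sketched directly; the array sizes and the random compliance operator below are illustrative placeholders for values that would be obtained by analysis or calibration:

        import numpy as np

        k_rows, k_cols = 16, 12            # deflection map 40 sampled at k = 192 nodes
        m_rows, m_cols = 8, 6              # force modeled at m = 48 locations
        k, m = k_rows * k_cols, m_rows * m_cols

        C = np.random.rand(k, m)           # compliance operator C (k x m); placeholder values

        F2d = np.zeros((m_rows, m_cols))
        F2d[3, 2] = 1.0                    # a unit force at one modeled location

        f = F2d.reshape(m)                 # unwrap the two-dimensional force term into an m-entry vector
        x = C @ f                          # forward model X = C F
        X2d = x.reshape(k_rows, k_cols)    # re-wrap the displacement vector into deflection-map form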
  • compliance operator C can be derived from plate mechanics, utilizing the known stress and strain properties of the cover glass or tracking surface to relate input forces F and displacements X.
  • Various calibration methods may also be employed, either alone or in combination with mechanical analysis and FEA techniques.
  • a robot arm or other mechanical system can be utilized to physically position a set of different calibration weights or other predefined force generating elements along a particular touch screen, track pad, or other sensing surface 32 , in order to generate various known touch patterns 30 .
  • tabulating displacements X based on the resulting deflection map 40 will give compliance term C directly, for use in modeling force distributions F based on arbitrary input.
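  • A hedged sketch of that calibration, with apply_calibration_force and measure_deflection as hypothetical stand-ins for the robot arm and the sensor readout, builds compliance term C one column per modeled force location:

        import numpy as np

        def calibrate_compliance(apply_calibration_force, measure_deflection,
                                 m_locations, k_nodes, test_force_newtons=0.98):
            # apply_calibration_force(j, newtons): hypothetical; place a known weight at location j
            # measure_deflection(): hypothetical; return the unwrapped k-entry displacement vector X
            C = np.zeros((k_nodes, m_locations))
            for j in range(m_locations):
                apply_calibration_force(j, test_force_newtons)
                x = measure_deflection()
                C[:, j] = x / test_force_newtons   # column j: deflection per unit force at location j
            return C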
  • FIG. 3B is a schematic illustration of an alternate force pattern in which multiple touch patterns 30 are presented, for example by touching two or more fingers to the cover glass or other sensing surface 32 of a smartphone or other mobile device 10 , as shown in FIG. 3A .
  • Input system 12 generates a two-dimensional deflection map 40 based on touch patterns 30 , corresponding to displacements X for use in generating a force input model F based on compliance operator C, as described above.
  • compliance operator C can be constrained to have more row entries k than column entries m, because displacements X can be sampled for any force model F on an arbitrarily fine grid.
  • finite element analysis and other techniques may be utilized, based on an arbitrary number of discrete displacement sampling points k, as described above.
  • compliance operator C may be defined in a substantially square form, with an equal number of rows k and columns m, or with more columns m than rows k.
  • modeling or test forces F may be selected to generate compliance operator C in full rank form; that is, with k independent rows (or m independent columns), such that the rank of C is equal to the smaller of k and m.
  • This property may be utilized to generalize a force/displacement mapping algorithm, as described below.
  • a derived result Y may be desired, for example a derived result Y that is related to applied force F in some predictable fashion.
  • in particular, a derived result Y may be desired for which displacements X can be used to generate a force map or force-moment mapping based on a particular moment M, for example with Y = MF.
  • the desired transformation (or operator) J can be expressed as J = MC^+, where C^+ is a generalized inverse (or pseudoinverse) of compliance operator C.
  • a traditional inverse C^{-1} does not necessarily exist, because compliance operator C is not necessarily square.
  • the pseudoinverse C^+ may be constrained to exist by selection (or construction) of a full-rank compliance operator C, where C^T C has inverse (C^T C)^{-1}, as described above, so that the left (pseudo) inverse C^+ = (C^T C)^{-1} C^T is well defined.
  • alternatively, where C is defined as a rectangular matrix with more column entries m than row entries k, the right (pseudo) inverse may be used, for example C^+ = C^T (C C^T)^{-1}.
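  • Both conventions can be sketched numerically as follows; np.linalg.pinv computes the same generalized inverse by singular value decomposition and is used here only as a cross-check (the random matrices are illustrative placeholders):

        import numpy as np

        # tall case: more rows k (displacement entries) than columns m (force entries)
        C_tall = np.random.rand(20, 8)
        C_plus_left = np.linalg.inv(C_tall.T @ C_tall) @ C_tall.T      # (C^T C)^{-1} C^T
        assert np.allclose(C_plus_left, np.linalg.pinv(C_tall))

        # wide case: more columns m than rows k
        C_wide = np.random.rand(8, 20)
        C_plus_right = C_wide.T @ np.linalg.inv(C_wide @ C_wide.T)     # C^T (C C^T)^{-1}
        assert np.allclose(C_plus_right, np.linalg.pinv(C_wide))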
  • the generalized inverse or pseudoinverse C^+ provides a "best" or least squares approximation, which can serve the function of an inverse for operations on strictly rectangular (non-square) matrices and other non-invertible forms.
  • the generalized inverse C^+ may be suitable for operations on linear matrices (or vectors) X and F, as described above, in order to generate a more general force/displacement mapping algorithm suitable for physical applications including force-sensitive input systems for electronic devices.
  • the individual values in (C^T C)^{-1} C^T can also be corrected for operational effects, such as temperature, aging, etc., for example by storing an initial (or "ab initio") value for the transformation in the lookup table, and scaling appropriately.
  • the transformation operator, in turn, can be determined based on a combination of analytical and calibration methods, as described above, and stored in a lookup table in order to provide real-time force and moment analysis, based on a particular displacement mapping X.
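  • A minimal sketch of that real-time use, assuming the transformation has already been characterized offline and stored (the operational scale factor and placeholder matrices below are hypothetical, not taken from the disclosure):

        import numpy as np

        # offline: characterize once and store (the lookup-table entries)
        k, m = 192, 48
        C = np.random.rand(k, m)            # placeholder for FEA, stress-analysis, or calibration values
        M = np.ones((1, m))                 # zeroth-order moment operator: total force magnitude
        J0 = M @ np.linalg.pinv(C)          # ab initio transformation J = M C^+  (1 x k)

        # online, per frame: apply the stored transformation to the unwrapped displacement map
        def force_moment_from_displacement(x, operational_scale=1.0):
            # x: k-entry unwrapped deflection map; returns the requested force moment(s) M F
            return operational_scale * (J0 @ x)

        total_force = force_moment_from_displacement(np.random.rand(k))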
  • the singular value decomposition (SVD) of compliance matrix C can also be used to identify or determine which degrees of freedom in the force map may be more difficult or challenging to reconstruct from displacements X, for example based on noise and numerical stability considerations.
  • the characteristics of compliance matrix C may be determined by its singular value decomposition, for example C = U Σ V^T, where the diagonal entries of Σ are the singular values of C.
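  • A small diagnostic sketch of that decomposition (the threshold below is an arbitrary illustration of a noise or stability cutoff, and the compliance matrix is again a placeholder):

        import numpy as np

        k, m = 192, 48
        C = np.random.rand(k, m)                           # compliance operator, as above

        U, s, Vt = np.linalg.svd(C, full_matrices=False)   # C = U diag(s) V^T

        # force-map degrees of freedom (rows of V^T) whose singular values are small relative
        # to the largest are amplified most by the pseudoinverse, and so are the hardest to
        # reconstruct from noisy displacements X
        rel = s / s.max()
        hard_modes = Vt[rel < 1e-3]
        print(f"well-conditioned modes: {np.sum(rel >= 1e-3)}, poorly conditioned: {len(hard_modes)}")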
  • FIG. 4A is a perspective view of an exemplary electronic device 10 utilizing force-sensing input system 12 .
  • device 10 is configured for use in a portable device application such as a mobile phone or smartphone, e.g., as shown in FIGS. 3A and 3B , above.
  • device 10 may be configured as a media player, digital assistant, touch screen, tablet computer, personal computer, computer display, or other electronic device, in either portable or stationary form.
  • force-sensitive input system 12 is integrated into a front cover glass component, for example with sensing surface 32 provided in the form of a display window for a touch screen or other display system, as defined between border regions 50 .
  • sensing surface 32 may be formed of a glass or other durable transparent material, for example silica glass, indium tin oxide, a clear plastic polymer such as acrylic or polycarbonate, or a transparent ceramic such as crystalline aluminum oxide or sapphire.
  • Force and position sensitivity to different touch patterns 30 on surface 32 can be provided via a layer array of capacitive or resistive traces, or using discrete piezoelectric devices or other force and position sensing devices, for example as described above with respect to FIG. 2 .
  • Housing assembly 28 and frame 28 A are typically formed of metals such as aluminum and steel, or from plastic, glass, ceramics, composite materials, and combinations thereof.
  • Frame 28 A may be used to attach cover glass/sensing surface 32 to various top, bottom and side housing components 28 B, 28 C, and 28 D, as shown in FIG. 4A , or via an adhesive coupling.
  • device 10 can also accommodate a number of additional control and accessory features, including menu and hold buttons, volume switches, and other control mechanisms 20 , and audio, camera, lighting, and other accessory features 22 .
  • One or more ports or connector apertures 26 may also be provided for power and data communications with device 10 , with various mechanical fasteners 52 and access ports 54 to fix cover glass/sensing surface 32 to housing assembly 28 , and to provide access to internal components such as a flash memory device or subscriber identity module (SIM card).
  • FIG. 4B is a perspective view of electronic device 10 in an alternate configuration, for example a laptop or notebook computer, a minicomputer, a personal computer, or other portable or stationary data processing application.
  • housing 28 may accommodate an input or keyboard component 56 and separate (e.g., upright or hinged) display 58 .
  • input system 12 is provided in the form of a force-sensitive track pad or multi-touch tracking system, with sensing or tracking surface 32 positioned below or adjacent to keyboard 56 .
  • sensing surface 32 may incorporate layered sets of capacitive or resistively coupled conductive traces, for example with a dielectric spring membrane or other dielectric separator.
  • piezoelectric devices or other discrete position and force sensors may be used, as described above.
  • force-sensitive input system 12 is configured to measure the spatial profile of deflection or deformation in sensing surface 32 , in response to the forces applied in touch pattern(s) 30 . Due to the bending plate (or spring membrane) mechanics, however, deflection is not necessarily localized to the specific area or region of touch pattern 30 , but may be distributed and shifted, as shown above in FIGS. 3A and 3B . Thus, it is not trivial to recover or determine an accurate estimate of the applied forces corresponding to touch pattern(s) 30 , from the displacement of sensing surface 32 alone.
  • Different algorithms may thus be applied, as described above, in order to estimate the touch forces based on a displacement or deflection map of the screen or track pad surface 32 .
  • the algorithm may also utilize knowledge of the finger positions or touch patterns 30 to improve the force estimate, for example as obtained from a capacitive (e.g., multi-touch) sensor or other position sensing system.
  • the force reconstruction algorithm may be defined based on the deflection map alone, absent any such independent position data.
  • Such techniques are applicable to a range of different electronics applications and devices 10 for which force-sensitive input is desirable, including both display-based and non-display based input devices 12 , such as force-sensitive touch screens and track pads.
  • FIG. 5 is a block diagram illustrating an exemplary method 60 for operating an electronic device based on a sensed input force, for example an electronic device 10 with force-sensitive input system 12 , as described above.
  • method 60 may include one or more steps including, but not limited to, sensing deflection of a control surface (step 61 ), mapping displacement of the surface (step 62 ), and transforming the displacement map (step 63 ) to generate a force map or force-moment mapping (step 64 ).
  • the force-moment mapping may include one or more force magnitudes (step 65 ) or centroids (step 66 ), which can also be utilized to control the device (step 67 ).
  • the device may also provide for position sensing (step 68 ) of the touch pattern on the control surface, which can be used to generate a position map (step 70 ), for example for use in refining the force-moment mapping (step 64 ).
  • Sensing deflection can be performed for touch patterns on the control surface of a track pad, touch screen display, or similar control input device, using a capacitive or resistive grid or discrete sensor array to generate deflection data, as described above.
  • the deflection data (e.g., a set of deflection sensor signals) can be used to generate a two-dimensional displacement map of the control surface (step 62 ).
  • the two-dimensional data can also be unwrapped into one-dimensional vector form, for example a row or column vector, in either covariant or contravariant form.
  • the sensor array may also provide for position sensing (step 68 ), as described above, in either a single-touch or multi-touch configuration.
  • the position data can also be mapped (step 70 ), providing a two-dimensional image of the touch pattern(s) on the control surface.
  • the position map can be utilized in conjunction with the force mapping, for example in order to refine the force magnitude determination (step 65 ), or to determine one or more centroids of the force distribution (step 66 ).
  • force-moment mapping may be performed independently of position sensing (step 68 ), and absent any other explicit position data characterizing the touch patterns on the control surface (step 70 ), outside the deflection data (step 61 ) and displacement map (step 62 ).
  • one or both of the force magnitudes (step 65 ) and centroids (step 66 ) of the force distribution are determined based on deflection alone, by generating the displacement map (step 62 ) based only on deflection data (step 61 ). That is, the displacement map (step 62 ) is transformed (step 63 ) independently of, and without reference to, any separate position mapping (step 70 ) or position data (step 68 ).
  • the generalized inverse operator or pseudoinverse C^+ can be defined in either a left-side or right-side convention, based on the number of rows and columns in compliance matrix C, and the corresponding dimension (or number of entries) in the force and displacement mappings F and X, for example as defined in one-dimensional row or column vector form, as described above, or using covariant or contravariant vectors of arbitrary order.
  • the resulting transformation data can be stored in a lookup table, so that the desired force moment mapping MF can be determined based on the displacement mapping X (step 76 ), using the lookup table in real time.
  • the form of the force mapping output (step 64 ) can be defined by the desired moment mapping M, and may include one or more force magnitudes (step 65 ) and centroids (step 66 ), which characterize the force distribution due to the touch patterns on the control surface.
  • Device control is performed based on the force outputs (step 64 ), either alone or in combination with independent position data (steps 68 and 70 ).
  • the force magnitude may be utilized to define a scalar input, such as a volume, playback speed, or zoom ratio, with or without reference to position.
  • a position may also be derived from the deflection data, for example in order to generate one or more centroids (step 66 ), in order to control any of the electronic device operations described herein.
  • the force outputs (step 64 ) may be used in conjunction with an independent position mapping (step 70 ), with one or both of the force magnitude (step 65 ) and centroid (step 66 ) determined at least in part based on position sensing (step 68 ), as well as deflection sensing (step 61 ).
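  • Read end to end, steps 61 through 67 can be sketched as a single processing pass; the sensor access and stored transformations below are hypothetical placeholders, and the centroid is derived from first-order moments of the reconstructed force distribution:

        import numpy as np

        def process_touch_frame(read_deflection_map, J_magnitude, J_first_moments, issue_command):
            # read_deflection_map: hypothetical callable returning deflection map 40 as a 2-D array
            # J_magnitude:         stored k-entry transformation row (zeroth-order moment times C^+)
            # J_first_moments:     stored 2 x k transformation for the first-order moments
            # issue_command:       hypothetical device-control callback (step 67)
            deflection = read_deflection_map()                     # step 61: sense deflection
            x = deflection.reshape(-1)                             # step 62: unwrapped displacement map
            magnitude = float(J_magnitude @ x)                     # steps 63-65: transform; force magnitude
            first = J_first_moments @ x                            # step 66: first-order moments
            centroid = first / magnitude if magnitude else None    # amplitude-weighted centroid
            issue_command(magnitude, centroid)                     # step 67: control the device
            return magnitude, centroid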

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A touch sensitive input system for an electronic device includes a deflection sensor configured to generate a deflection signal based on deflection of a control or sensing surface, and a processor in signal communication with the deflection sensor. The processor is operable to generate a deflection or displacement map characterizing displacement of the surface based on the deflection signal, and a force map characterizing force on the surface based on a transformation of the displacement map. The transformation may be based on a generalized inverse of a compliance operator, where the compliance operator relates the displacement map to the force map. The compliance operator is not necessarily square, and does not necessarily have a traditional inverse.

Description

    TECHNICAL FIELD
  • The subject matter of this disclosure relates generally to electronic devices, and specifically to touch screens and other electronic display components. In particular, the disclosure relates to touch screen (or touchscreen) sensors, track pads (or trackpads), and other input devices for mobile phones, personal and tablet computers, and other portable and stationary electronics applications.
  • BACKGROUND
  • Touch pads, touch screens and other input-sensing devices have a broad range of applications including computer systems, mobile phones, media players, personal digital assistants, gaming platforms, and other electronic devices. Suitable technologies include a variety of resistive and capacitive-coupled sensor systems, as well as optical, electromagnetic, and surface acoustic wave devices.
  • In capacitive sensing systems, a conducting grid may be utilized, for example with sets of orthogonal traces separated by a dielectric insulator. The grid functions as a capacitive array, which is sensitive to contact (or proximity) based on changes in the corresponding voltage or charge capacity, for example as manifested in a current output. In resistive devices, contact with the input surface causes changes in resistance across the insulating layer, which are registered by an increase or decrease in corresponding sense currents.
  • Increasingly, capacitive, resistive and other touch-sensitive systems are incorporated into track pads and visual display devices, providing increased input sensitivity for more flexible device control. As device technologies advance, moreover, an increasing number of control functions can also be integrated into a single device or form factor, including, but not limited to, real-time operation and control of voice and data communications, messaging, media playback and development, gaming, internet access, navigational services, and personal digital assistant functions including alarms, reminders and calendar tasks.
  • As the range of electronics device functions increases, there is also a desire for more advanced touch screens, track pads, and other input devices. In particular, there is a desire for more advanced input systems that can be adapted to real-time control and display functions for an ever-wider range of different electronic device applications, including track pad and touch screen display devices with improved input sensitivity and tracking capabilities.
  • SUMMARY
  • This disclosure relates to force-sensitive input devices, including track pads, touch screens and other control systems with sensitivity to contact forces. In particular examples and embodiments, the disclosure encompasses a touch sensitive input system for an electronic device, with a deflection sensor disposed adjacent to or along the control surface. The deflection sensor is configured to generate a deflection signal, for example based on a touch pattern generated by a user on the surface of a touch screen, track pad, or other force-sensing control device.
  • A processor is provided in signal communication with the deflection sensor. The processor is operable to generate a deflection map characterizing deflection of the sensing surface, based on the deflection signal. The processor is also operable to generate a force map characterizing force on the sensing surface, based on a transformation of the deflection map. The transformation may be based on a generalized inverse of a compliance operator, which relates the deflection map to the force map.
  • The compliance operator is not necessarily square, and may instead have a rectangular representation, with no strictly defined inverse. For example, the compliance operator may have more rows than columns, where the rows correspond to entries in the deflection map and the columns correspond to entries in the force map. Alternatively, the compliance operator may have more columns than rows.
  • The deflection sensor can be formed of an array of conductive traces disposed in a generally parallel sense with respect to the sensing surface, and configured to generate the deflection signal based on capacitive or resistive coupling. A position sensor may also be disposed with respect to the sensing surface, either combined with the deflection sensor, or provided as an independent system. The position sensor can be configured to generate a position signal based on a touch pattern on the sensing surface, and the processor can be configured to generate a position map characterizing the touch pattern, based on the position signal.
  • A variety of electronic devices may utilize such a touch-sensitive input system, for example a mobile device or smartphone in which the sensing surface comprises a cover glass configured for viewing a touch screen display. Alternatively, a computing device may include a track pad comprising the sensing surface of the input system. Such devices may be controlled based on characteristics of the force map, for example based on a force magnitude or centroid location, as determined in real time operation of the device.
  • Exemplary methods of operation include sensing deflection of a control surface on the electronic device, in response to the touch pattern. A displacement or deflection map may be generated, based on the deflection, and a transformation (or transformation operator) may be defined to relate the displacement map to forces imposed on the control surface. A force moment map can be generated by operation of the transformation on the displacement map, where the force moment map describes the imposed force.
  • For example, the transformation may include a generalized inverse of a compliance operator (or compliance matrix), where the compliance operator relates the displacement map to the imposed forces. The compliance operator need not necessarily be square or invertible. Alternatively, another transformation may be used.
  • Depending upon application, the force moment map can be generated based only on the deflection data. That is, the displacement map can be transformed independently of any separate position mapping, so that the force moment map describes the force imposed on the control surface based only on deflection of the control surface, absent any other additional position data.
  • Alternatively, the position of the touch pattern can also be sensed on the control surface, and a centroid of the force moment map can be determined, based in whole or part on the sensed position. For example, the sensor array may provide both deflection and position data, where the position data are also used to describe the force imposed on the control surface, based on the centroid of the touch pattern. This contrasts with independent analysis methods, as described above, where the force magnitudes and centroids are generated independently, based only on the deflection data and displacement mapping, without reference to any other position data or position mapping.
  • In one particular example, the transformation may comprise a generalized inverse C^+ of a compliance operator C, which relates a particular displacement map X to the force F imposed on the surface, for example with X = CF. Such a compliance operator C may have a substantially rectangular representation, so that the compliance operator C has no inverse C^{-1}.
  • Generating the force moment map of the control surface may also include generating a derived output Y, which is related to the force F imposed on the surface and a moment M, for example with Y = MF. The transformation may also determine the derived output via Y = JX, and the transformation may include the moment M, so that the derived result is Y = MC^+X.
  • That is, the transformation operator J may be defined by J = MC^+, and the force moment mapping may be defined by operation of the transformation operator on the displacement mapping, for example with MF = MC^+X. The moment operator M, in turn, can be defined to determine the characteristics of the force(s) imposed by the touch pattern, for example one or more scalar or "zeroth-order" moments, in order to determine the magnitude of the force, and any number of first, second or higher-order moments, for example to define centroid values based on linear and two-dimensional positioning of the imposed forces across the control or sensing surface.
  • The transformation (or transformation operator, e.g. J = MC^+) can be defined by finite element analysis, utilizing a grid of desired accuracy to define the force (or force-moment) mapping based on an arbitrary number of deflection points. Mechanical stress analysis can also be utilized, based on the material properties of the cover glass, spring membrane, or other control or sensing surface components. Alternatively, the transformation can be defined by physical calibration of the sensing surface, for example using known force and moment inputs, or by any combination of finite element analysis, mechanical stress analysis, and physical calibrations.
  • The transformation can be stored in the form of a lookup table, in memory on the device, and the force moment map can be generated in real time, based on the displacement map in combination with the lookup table. In particular, a force moment map can be generated in real time based on the displacement map and such a device lookup table, in order to describe operational input forces and magnitudes imposed on the control surface, according to the corresponding user-defined touch patterns. The force moment map may also describe a force-moment distribution imposed on the surface by the touch pattern, for example to describe both force magnitudes and centroids.
  • To perform the method on a particular electronic device, program code may be stored or embedded on a non-volatile computer readable storage medium, where the program code is executable by a processor on the electronic device to perform any of the control methods and functions described herein. For example, the program code may be executable to sense deflection of a control surface on the electronic device in response to a (e.g., user-defined) touch pattern. A displacement map of the control surface can be generated, based on the deflection, and the displacement map can be transformed into a force moment map.
  • The transformation operator may include or be based on a generalized inverse of a compliance operator, where the compliance operator relates the displacement map to forces imposed on the control surface by the touch pattern. Forces imposed on the control surface can be determined based on the force moment map, in order to control operation of the electronic device, based on the imposed forces.
  • In one particular application, the program code may be executable to generate a particular derived output Y, which can be related to force(s) F imposed on the surface via a moment M; that is, with Y=MF. The transformation operator J may also be utilized to define the derived output Y, such that Y=JX.
  • Such a transformation operator J can also include a generalized inverse C+ of a compliance operator C, which relates the deflection map X to the force F imposed on the surface via X=CF. One such transformation operator, for example, is J=MC+, which may also be defined by J=M(CTC)−1CT, as expressed in a left-side generalized inverse form. This particular form may correspond to a compliance operator C with a substantially rectangular representation, for example with more rows k than columns m, and with no well-defined inverse C−1, as defined by CC−1=I.
  • The transformation operator J may also include the moment M, for example with J=MC+. Thus, the derived output Y may be determined by Y=MC+X and by Y=MF. Thus, a particular force moment mapping MF and displacement mapping X may be related by the transformation operator J, for example according to MF=MC+X. Such a transformation operator J=MC+ may be determined by a combination of finite element analysis, calibration, and mechanical stress calculations, as described above, and stored in a lookup table for real-time operation of the electronic device, based on the force input as determined from the deflection mapping.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an exemplary electronic device with a force-sensitive input system, for example a touch screen or track pad.
  • FIG. 2 is a schematic diagram of a force-sensitive input system for the device.
  • FIG. 3A is a schematic illustration of a representative force pattern sensed by the device.
  • FIG. 3B is a schematic illustration of an alternate force pattern sensed by the device.
  • FIG. 4A is a perspective view of an exemplary electronic device utilizing the force sensing system, in a touch screen application.
  • FIG. 4B is a perspective view of an exemplary electronic device utilizing the force sensing system, in a track pad application.
  • FIG. 5 is a block diagram illustrating an exemplary method for operating an electronic device, based on the sensed input force.
  • DETAILED DESCRIPTION
  • FIG. 1 is a block diagram of an exemplary electronic device 10 with force-sensitive input system 12, for example a touch screen, track pad or other electromechanical system configured to receive input from external force 14. Force-sensitive input system 12 provides improved flexibility and functionality for electronic devices 10, as configured for a wide range of different applications including smartphones (smart phones) and other mobile devices, media players, digital assistants, gaming platforms, computing devices and other electronics systems, in both portable and stationary configurations.
  • Depending upon desired functionality, force-sensitive input system or apparatus 12 may include both a displacement/force or deflection sensor 12A and a position sensor 12B, in either discrete or integrated form. System 12 may also include an internal processor or microprocessor 12C, and may be combined with an integrated graphical display 16, for example a touch screen. Alternatively, electronic device 10 may incorporate force sensing system 12 with a separate or discrete display component 16, for example a force sensing track pad in combination with a separate computer monitor or multimedia display.
  • Device 10 may also include a separate device controller or processor 18, along with other control or input devices 20, such as home, menu, and hold buttons, volume controls, mute switches, and other control mechanisms 20. In addition, device 10 may include various internal and external accessories 22 and 24, for example audio (speaker and microphone) components, cameras and other sensors, lighting, flash, and indicator systems, and additional specialized systems such as acceleration and motion sensors, gyro and GPS sensors, and other accessory or peripheral features, devices and components.
  • Device 10 is typically provided within housing 28, for example utilizing a variety of metal, plastic and cover glass components to house force-sensitive input system 12, display 16, controller/processor 18 and additional control and accessory features 20 and 22. One or more external accessories 24 can be coupled to electronics device 10 via a variety of different device ports 26 in housing 28, for example SCSI (small computer system interface), USB (universal serial bus), and other serial and parallel (e.g., SATA and PATA) device ports 26, or using a wireless interface (I/F), such as an infrared (IR), Bluetooth, or radio frequency (RF) device.
  • Device controller 18 includes microprocessor (μP) and memory components configured to load and execute a combination of operating system and application firmware and software, in order to provide a range of functionality for device 10. Representative device functions include, but are not limited to, voice communications, voice control, media playback and development, internet browsing, email, messaging, gaming, security, transactions, navigation, calendaring, alarms, reminders, and other personal assistant tasks. In order to obtain user input, controller/processor 18 can be coupled in electronic and data communication with force sensing system 12 and one or more additional control buttons or other input mechanisms 20, along with various internal and external components including display 16 and accessory features 22 and 24. Controller/processor 18 can also provide for additional input-output (I/O) and communications features, utilizing a variety of different hard-wired and wireless interfaces and ports 26, as described above.
  • To accommodate the wide variety of different functionalities contemplated for device 10, input system 12 can be configured to provide a combination of position and force sensing capabilities, offering greater input sensing sensitivity, range, and flexibility. In particular, input system 12 may include force/deflection sensor components 12A, either alone or in combination with position sensor components 12B, in order to provide increased control capabilities for user operation of electronics system or device 10, as described below.
  • FIG. 2 is a schematic diagram of force-sensitive input system 12, for example a force-sensitive touch screen or touch pad device. In this particular example, force-sensitive input system 12 includes a force or deflection/displacement sensor (or circuit) 12A and position sensor (or circuit) 12B, in combination with a processor 12C and driver circuit 12D. Alternatively, force/deflection sensor circuit 12A may be provided independently, without position sensor circuit 12B.
  • Force-sensitive input system 12 is configured to recognize single and multiple touch events or touch patterns 30, as defined along touch-sensitive surface 32 by drive and sensor traces 34A and 34B. In particular, system 12 may be configured to generate one or both of force/deflection (e.g., displacement) data 30A and position data 30B for time-separated, near-simultaneous and substantially simultaneous touch patterns 30, including data characterizing the corresponding touch position, velocity, size, shape, and force magnitude, in order to provide for more flexible control and operation of electronic devices 10, as described above with respect to FIG. 1.
  • In the grid or array-based configuration of FIG. 2, touch-sensitive surface 32 is defined by an array of crossed sensor traces 34A and 34B, for example a set of substantially parallel driver traces 34A, in combination with a set of substantially perpendicular or orthogonal sensor traces 34B, as shown in FIG. 2. Alternatively, the sensor trace and drive trace designations may be reversed, and trace sets 34A and 34B can both be active, or both passive.
  • In one particular example, drive circuit 12D is coupled to drive traces 34A, with force/deflection and position sensor circuits 12A and 12B coupled to sensor traces 34B. Sensor traces 34B may also be oriented in a substantially perpendicular (e.g., vertical) or orthogonal sense with respect to (e.g. horizontal) drive traces 34A. More generally, traces 34A and 34B may have any orientation, horizontal, vertical, or otherwise, and may intersect at a range of angles, or be formed in a polar coordinate arrangement, or other form. Force/deflection and position sensor circuits 12A and 12B can also be combined into an integrated force/position sensor configuration, or utilize separate sensor traces 34B, for example with two or more sets of sensor traces 34B arranged at different skew angles.
  • Processor 12C generates force/deflection and position data 30A and 30B based on signals from one or more force, position, and drive circuits 12A, 12B and 12D, as described above. Depending upon configuration, processor 12C may be provided as an “on-board” or integrated processor element, as shown in FIG. 2, or as an off-board or external processor component, for example as provided within a device controller 18, as shown in FIG. 1, or using other internal or external data processing components.
  • In combined sensor/display configurations, as shown in FIG. 2, touch sensitive surface 32 may incorporate or accommodate a display device 16, for example a graphical or media display for use in a force-sensitive touch screen embodiment of input system 12. Alternatively, input system 12 may be provided independently of any display 16, for example with touch-sensitive surface 32 configured for use in a track pad, or similar input device.
  • Sensing points or nodes 36 on touch-sensitive surface 32 may be defined by the intersections of drive and sensor traces 34A and 34B, as shown in FIG. 2. For example, crossed resistive or capacitive traces 34A and 34B may be separated by an electrically insulating dielectric spacer or spring membrane layer, with sensitivity to force (or deflection) based on changes in the capacitive or resistive coupling, as defined at various sensing points 36. Capacitive systems encompass both self-capacitance and mutual capacitance-based measurements, in which the capacitance at different sensing points 36 varies based on the shape and force characteristics of touch patterns 30, as well as the proximity and electrical properties of the external object(s) used to generate touch patterns 30.
  • Alternatively, an array of discrete sensor components 36 may also be utilized, for example resistive, capacitive, piezoelectric, or optical sensors, either alone or in combination with trace arrays 34A and 34B. In combined trace/sensor array configurations, discrete sensors may be provided at intersections 36 of traces 34A and 34B, as shown in FIG. 2, or between traces 34A and 34B, or at a combination of intersections and other locations.
  • In each of these configurations, sensing system or device 12 is operable to detect and track a range of attributes for each touch pattern 30, including, but not limited to, position, size, shape, force, and deflection. In addition, system 12 is also configurable to identify distinct touch patterns 30, and to determine additional attributes including centroid, moment, velocity and acceleration, as individual touch patterns 30 track across sensing surface 32.
  • The number and configuration of sensing points 36 may vary, depending on desired resolution, sensitivity, and other characteristics. In touch-screen applications, the distribution of nodes 36 may also depend upon desired optical properties, for example transparency, as determined by the density of nodes 36 in an indium tin oxide (ITO) matrix, or other transparent material used to form sensing surface 32.
  • Force/deflection and position data 30A and 30B are utilized to define the shape and other characteristics of touch patterns 30 at particular points in time, allowing corresponding commands to be executed on the host device. For example, individual sensor points or nodes 36 may function substantially independently, with each sensor point 36 generating separate sensor signals. Alternatively, multiplexed signals may be generated along crossed drive and sensor traces 34A and 34B, and processor 12C may provide demultiplexing capability. Drive traces 34A can also be sequentially activated in order to generate time-separated sensor signals, providing substantially independent force/deflection and position data 30A and 30B.
  • For force-related data 30A, however, touch patterns 30 typically result in deflections across a substantial area of sensing surface 32. Thus, force/deflection data 30A from separate sensor points 36 are not generally independent, and data 30A from a number of separate sensor points 36 may be required to determine the corresponding force magnitudes, centroids, and moments. This analysis may be performed based on a pixelated image or displacement map of sensing surface 32, as described below.
  • FIG. 3A is a schematic illustration of a representative force pattern sensed by electronic device 10. As shown in FIG. 3A, touch pattern 30 is presented as an applied force on sensing surface 32 of input system 12, for example by touching a finger or stylus to the touch screen display on a mobile phone or smartphone application of electronic device 10, or using a calibrated weight. Input system 12 generates a two-dimensional displacement or deflection map 40 based on pattern 30, using an array of overlapping traces 34A and 34B or sensors 36 to measure deflection, as described above.
  • Note that the position of touch pattern 30 does not necessarily coincide with the maximal displacement in deflection map 40, due to the stress response properties and perimeter mount configuration of the cover glass, track pad or other sensing/control surface 32. In particular, maximal displacement region 42 may shift toward the center of sensing surface 32, along with centroid 44, which represents the “center of mass” or amplitude-weighted mean of the displacement function, as characterized by the (e.g. first-order) moments of deflection map 40.
  • To determine the input forces associated with touch pattern 30, force amplitudes, moments and other information may thus be generated from deflection data 30A (see FIG. 2), and utilized to generate two-dimensional deflection map 40. Deflection map 40, in turn, can be transformed to produce a force moment mapping, with moments including the scalar input force and centroid 44. Alternatively, system 12 may also utilize single or multi-touch position data 30B to determine the input forces associated with a particular touch pattern 30, for example at the corresponding finger or stylus positions. The displacement-to-force algorithm may also accommodate force/centroid accuracy and noise effects, capacitive (or other sensor) accuracy, pixel-by-pixel noise contributions, and mechanical tolerances.
  • In one approach, a linear deflection force model is utilized, in which deflection map 40 is determined based on a sum of deflections due to multiple individual touch patterns 30, or from a single touch pattern 30 extending across an area of sensing surface 32. For example, the deflection due to a 200 g weight equivalent input (e.g., a force of about 1.96 N) may be approximately twice the deflection due to a 100 g weight equivalent (0.98 N), as extended across a range of different pixels or nodes (i.e., sensing positions) 36.
  • Note that the linear (or linear superposition) model does not imply that deflection scales linearly with distance, only with applied force. Thus, the value of deflection map 40 at a given point A, e.g., midway between points B and C, is not necessarily the average of the deflection map for the same input force applied at the endpoints B and C. However, this assumption may become approximately true for short distances (that is, as points B and C get closer together). System 12 may also utilize capacitances and other sensor measurements that do not necessarily scale linearly with displacement, for example by compensating for nonlinearities in the conversion algorithm, and based on physical calibrations.
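  • A minimal numerical sketch of this linear superposition property, using an arbitrary stand-in compliance matrix (the sizes, random values, and node indices below are assumptions for illustration only), is:

```python
import numpy as np

rng = np.random.default_rng(0)
k, m = 48, 12                        # assumed: k displacement pixels, m force nodes
C = rng.random((k, m))               # stand-in compliance matrix (not a real calibration)

f_100g = np.zeros(m); f_100g[3] = 0.98    # ~100 g-equivalent touch at one node (N)
f_200g = np.zeros(m); f_200g[7] = 1.96    # ~200 g-equivalent touch at another node

# Deflection doubles when the applied force doubles ...
assert np.allclose(C @ (2 * f_100g), 2 * (C @ f_100g))
# ... and simultaneous touches superpose linearly.
assert np.allclose(C @ (f_100g + f_200g), C @ f_100g + C @ f_200g)
```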
  • Deflection map 40 can be converted into a force measurement (or force mapping) based on a compliance analysis, as performed in vector, matrix, or tensor form. That is, various forces F applied to the cover glass or other sensing surface 32 can be modeled based on the corresponding measured displacements X (e.g., deflection map 40), using a compliance operator (or compliance matrix) C that relates force to displacement, for example by:

  • X=CF.   [1]
  • The particular formats used for the force and displacement terms F and X are arbitrary, and vary from application to application (along with compliance term C). For example, displacement X and force F can each be modeled as two-dimensional matrix or tensor quantities, based on the values of deflection map 40, and the corresponding force distribution on the cover glass, track pad, or other sensing surface 32. In this approach, compliance tensor C may have a four-dimensional or higher-order form, based on the two-dimensional forms of displacement and force terms X and F.
  • Alternatively, force term F can be “unwrapped” into a one-dimensional vector, for example with m entries corresponding to the number of discrete locations at which the input forces are modeled along sensing surface 32, rearranged into a linear (e.g., row or column) vector form. Similarly, displacements X can also be represented as a one-dimensional quantity, with deflection map 40 unwrapped into k discrete entries, in either column or row vector form. In this approach, compliance term C can be represented as a two-dimensional matrix or tensor operator, for example with k rows corresponding to the entries in displacement term X, and m columns corresponding to the entries in force term F.
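  • For illustration, the unwrapped vector representation can be sketched as follows; the displacement and force grid sizes are arbitrary assumptions chosen only to show the resulting shapes of X, F, and C:

```python
import numpy as np

disp_rows, disp_cols = 12, 8          # assumed displacement sampling grid (k = 96 pixels)
force_rows, force_cols = 6, 4         # assumed force modeling grid (m = 24 nodes)

deflection_map = np.zeros((disp_rows, disp_cols))   # two-dimensional deflection map 40
X = deflection_map.reshape(-1)                      # unwrapped k-entry displacement vector
F = np.zeros(force_rows * force_cols)               # unwrapped m-entry force vector
C = np.zeros((X.size, F.size))                      # compliance matrix: k rows, m columns

assert C.shape == (96, 24)            # more rows than columns, as discussed below
```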
  • More generally, displacements X and forces F can be represented as vector, matrix, or tensor quantities of arbitrary dimension, in either covariant or contravariant form, and compliance term C may vary accordingly. In particular, n applied forces or touch patterns 30 may be associated with n displacement mappings X, utilizing m×n and k×n matrix or tensor forms for force F and displacement X, respectively, with a correspondingly higher-order representation for compliance tensor or matrix operator C.
  • In physical applications of system 12, compliance operator C can be derived from plate mechanics, utilizing the known stress and strain properties of the cover glass or tracking surface to relate input forces F and displacements X. Alternatively, or in addition, finite element analysis (FEA) techniques may also be used, for example by simulating a set of applied force scenarios F based on different touch patterns 30, tabulating the resulting displacements X based on simulated deflection mappings 40, and deriving the elements of compliance term C based on the relationship X=CF.
  • These may be considered direct approaches, as applicable to a range of electronic devices 10 with cover glass/touch screen and track pad systems 12 having well defined stress response characteristics. Various calibration methods may also be employed, either alone or in combination with mechanical analysis and FEA techniques. For example, a robot arm or other mechanical system can be utilized to physically position a set of different calibration weights or other predefined force generating elements along a particular touch screen, track pad, or other sensing surface 32, in order to generate various known touch patterns 30. Thus, tabulating displacements X based on the resulting deflection map 40 will give compliance term C directly, for use in modeling force distributions F based on arbitrary input.
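  • A hedged sketch of such a fit is shown below: simulated (or robot-applied) force trials and the resulting displacement vectors are tabulated, and the compliance matrix is recovered by least squares from X=CF. The grid sizes, trial count, noise level, and synthetic data are all assumptions; the least-squares solve stands in for whatever fitting procedure a given implementation actually uses:

```python
import numpy as np

rng = np.random.default_rng(1)
k, m, n_trials = 96, 24, 200

C_true = rng.random((k, m))                  # "physical" compliance, unknown in practice
F_all = rng.random((m, n_trials))            # known applied forces, one calibration trial per column
X_all = C_true @ F_all + 1e-4 * rng.standard_normal((k, n_trials))   # measured deflections

# Least-squares fit of X = C F over all trials: solve F_all.T @ C.T ~= X_all.T.
C_fit_T, *_ = np.linalg.lstsq(F_all.T, X_all.T, rcond=None)
C_fit = C_fit_T.T                            # estimated k x m compliance matrix

print(np.max(np.abs(C_fit - C_true)))        # small residual when the trials excite all nodes
```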
  • FIG. 3B is a schematic illustration of an alternate force pattern in which multiple touch patterns 30 are presented, for example by touching two or more fingers to the cover glass or other sensing surface 32 of a smartphone or other mobile device 10, as shown in FIG. 3A. Input system 12 generates a two-dimensional deflection map 40 based on touch patterns 30, corresponding to displacements X for use in generating a force input model F based on compliance operator C, as described above.
  • In general, compliance operator C can be constrained to have more row entries k than column entries m, because displacements X can be sampled for any force model F on an arbitrarily fine grid. For example, finite element analysis and other techniques may be utilized, based on an arbitrary number of discrete displacement sampling points k, as described above. Alternatively, compliance operator C may be defined in a substantially square form, with an equal number of rows k and columns m, or with more columns m than rows k.
  • More generally, however, modeling or test forces F may be selected to generate compliance operator C in full rank form; that is, with m independent columns (where k is greater than m) or k independent rows (where m is greater than k), such that the rank of C is equal to the smaller of k and m. Thus, compliance operator C may be defined such that CTC (C-transpose C) is invertible, even when there is no strictly defined inverse C−1 (i.e., where C−1 does not exist, there being no C−1 that satisfies C−1C=I). This property, in turn, may be utilized to generalize a force/displacement mapping algorithm, as described below.
  • In particular, when the displacement sensor response to an applied force F is described by X=CF, forces F are related to displacements X, but the relationship is not obvious because one must work “backwards” from displacement X to forces F, while there is no guarantee that any particular compliance operator C has an inverse C−1 (that is, there is no guarantee of a closed form such as F=C−1X). In fact, there is not even any guarantee that C is a square matrix, because in the general case there may be more rows k than columns m (or vice-versa), based on the ability to model displacements in an arbitrary number of discrete locations, as described above.
  • In order to determine the force mapping F, therefore, a derived result Y may be desired, for example a derived result Y that is related to applied force F in some predictable fashion. In particular, a derived result Y is desired for which displacements X can be used to generate a relationship for a force map or force-moment mapping based on a particular moment M. For example, a relationship of the form MF=Y may be defined, from which imposed forces F can be determined from a particular displacement mapping X, for various different choices of moment M; that is:

  • X→MF=Y.   [2]
  • More generally, this may be considered the common case for deriving a scalar force Ftotal for example according to:

  • X→Ftotal=[1 1 . . . 1]m F,   [3]
  • or a force+moment vector from a corresponding matrix of appropriate dimension, e.g., of dimension 2+1 (three rows) for a two-dimensional mapping, with matrix rows separated by semicolons below:
  • X→[Ftotal; Mx; My]=[1 1 . . . 1; x1 x2 . . . xm; y1 y2 . . . ym]F.   [4]
  • The above forms can also be utilized to generate forces F according to n different finger positions or other touch patterns 30, for example:
  • X→[F1 F2 . . . Fn],   [5]
  • or, to generate a full two-dimensional force mapping:

  • X→F.   [6]
  • For a linear force/displacement algorithm, the desired transformation (or operator) J can be expressed as

  • JX=MF,   [7]
  • where operator (or matrix) J transforms displacement term X into a desired system output Y=MF. Using the relationship X=CF, this is:

  • JCF=MF.   [8]
  • Conceptually, therefore, J=MC−1 would in principle be a suitable form, except that compliance operator C is not necessarily square, as described above, and C−1 does not necessarily exist. Thus, there is no general, closed-form solution, and the desired force/displacement conversion algorithm is neither simple, nor obvious.
  • An alternative approach is to consider the generalized inverse or “pseudoinverse” C+, for example the left (pseudo) inverse, as defined by:

  • C+[L]=(CTC)−1CT.   [9A]
  • This form may be appropriate for a rectangular matrix, for example with more row entries k than column entries m. Based on this definition, moreover, the pseudoinverse C+ may be constrained to exist by selection (or construction) of a full-rank compliance operator C, where CTC has inverse (CTC)−1, as described above. Alternatively, the right (pseudo) inverse may be used:

  • C+[R]=CT(CCT)−1,   [9B]
  • for example where C is defined as a rectangular matrix with more column entries m than row entries k.
  • In the case that C does happen to have an inverse C−1, the generalized inverse C+ can be defined so that the generalized inverse or pseudoinverse is substantially the same as the inverse (that is, with C+=C−1). In the more general case that C does not have an inverse (that is, C−1 is not well defined), however, the generalized inverse or pseudoinverse C+ provides a “best” or least squares approximation, which can serve the function of an inverse for operations on strictly rectangular (non-square) matrices and other non-invertible forms.
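  • The left and right generalized inverse forms can be checked numerically against a library pseudoinverse, as in the sketch below; the matrix sizes and random entries are assumptions, chosen so that C has full column (or row) rank with probability one:

```python
import numpy as np

rng = np.random.default_rng(2)

C_tall = rng.random((96, 24))                              # more rows k than columns m
C_plus_left = np.linalg.inv(C_tall.T @ C_tall) @ C_tall.T  # (CTC)^-1 CT, as in equation [9A]
assert np.allclose(C_plus_left, np.linalg.pinv(C_tall))

C_wide = rng.random((24, 96))                              # more columns m than rows k
C_plus_right = C_wide.T @ np.linalg.inv(C_wide @ C_wide.T) # CT (CCT)^-1, as in equation [9B]
assert np.allclose(C_plus_right, np.linalg.pinv(C_wide))
```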
  • In particular, the generalized inverse C+ may be suitable for operations on the linear matrix (or vector) forms of X and F, as described above, in order to generate a more general force/displacement mapping algorithm suitable for physical applications, including force-sensitive input systems for electronic devices. For example, the selection J=MC+ (left inverse case) gives

  • J=M(CTC)−1CT,   [10]
  • and, for the derived result Y:

  • Y=JX=M(CTC)−1CTX.   [11]
  • The transformation operator J=M(CTC)−1CT (or MC+) can be calculated offline and stored in a device lookup table (LUT), for example via finite element analysis, or using a combination of finite element analysis, mechanical stress/displacement analysis, and physical calibrations, as described above. The individual values in (CTC)−1CT can also be corrected for operational effects, such as temperature, aging, etc., for example by storing an initial (or “ab initio”) value for the transformation in the lookup table, and scaling appropriately.
  • Alternatively, a right pseudoinverse may be used. For the general case, however,

  • Y=JX=MC+X,   [12]
  • which yields derived result Y based on displacement mapping X, as desired. The derived result also has the form MF=Y, as described above, so that:

  • MF=MC+X.   [13]
  • Thus, a particular or selected force-moment mapping MF may be determined in terms of the displacements X, based on the transformation operator J=MC+. The transformation operator, in turn, can be determined based on a combination of analytical and calibration methods, as described above, and stored in a lookup table in order to provide real-time force and moment analysis, based on a particular displacement mapping X.
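  • The overall flow can be sketched end to end as follows, with an assumed (random) compliance matrix standing in for the FEA- or calibration-derived operator; J=MC+ is computed once offline (the value that would be stored in the lookup table) and then applied to each unwrapped displacement map at run time. Grid sizes, the example touch, and all numerical values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed force-node grid (m = 24 nodes) and displacement grid (k = 96 pixels).
fx, fy = np.meshgrid(np.arange(6), np.arange(4), indexing="ij")
m, k = fx.size, 96

C = rng.random((k, m))                              # stand-in compliance matrix
M = np.vstack([np.ones(m), fx.ravel(), fy.ravel()]) # zeroth- and first-order moment rows

J = M @ np.linalg.pinv(C)                           # offline: J = M C+, stored for run time

# Run time: a single ~200 g-equivalent touch at node 9 produces a deflection map.
F_true = np.zeros(m); F_true[9] = 1.96
X = C @ F_true                                      # unwrapped displacement measurement

Ftotal, Mx, My = J @ X                              # force-moment map MF = J X
centroid = (Mx / Ftotal, My / Ftotal)               # recovers the coordinates of node 9
```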
  • The singular value decomposition (SVD) of compliance matrix C can also be used to identify or determine which degrees of freedom in the force map may be more difficult or challenging to reconstruct from displacements X, for example based on noise and numerical stability considerations. In particular, the characteristics of compliance matrix C may be determined by

  • C=UΣVT,   [14]

  • where

  • Σii=σi.   [15]
  • Right-singular vectors (columns of V) corresponding to small singular values σi give degrees of freedom in the force map that may affect displacement more weakly (e.g., as compared to larger values). Thus, recovering the components of a given force map F corresponding to each value σi will tend to be a more poorly conditioned problem, as the values of σi get smaller.
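  • Such a conditioning check can be sketched with a standard SVD routine, as below; the compliance matrix and the threshold used to call a singular value “small” are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
C = rng.random((96, 24))                      # assumed k x m compliance matrix

U, s, Vt = np.linalg.svd(C, full_matrices=False)
weak = s < 0.05 * s.max()                     # assumed cutoff for "small" singular values

# Rows of Vt flagged here are right-singular vectors spanning force-map
# directions that produce little displacement and amplify noise on inversion.
print(int(np.count_nonzero(weak)), "weakly observable force-map directions")
```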
  • FIG. 4A is a perspective view of an exemplary electronic device 10 utilizing force-sensing input system 12. In this particular example, device 10 is configured for use in a portable device application such as a mobile phone or smartphone, e.g., as shown in FIGS. 3A and 3B, above. Alternatively, device 10 may be configured as a media player, digital assistant, touch screen, tablet computer, personal computer, computer display, or other electronic device, in either portable or stationary form.
  • As shown in FIG. 4A, force-sensitive input system 12 is integrated into a front cover glass component, for example with sensing surface 32 provided in the form of a display window for a touch screen or other display system, as defined between border regions 50. In cover glass implementations, sensing surface 32 may be formed of a glass or other durable transparent material, for example silica glass, indium tin oxide, a clear plastic polymer such as acrylic or polycarbonate, or a transparent ceramic such as crystalline aluminum oxide or sapphire. Force and position sensitivity to different touch patterns 30 on surface 32 can be provided via a layer array of capacitive or resistive traces, or using discrete piezoelectric devices or other force and position sensing devices, for example as described above with respect to FIG. 2.
  • Housing assembly 28 and frame 28A are typically formed of metals such as aluminum and steel, or from plastic, glass, ceramics, composite materials, and combinations thereof. Frame 28A may be used to attach cover glass/sensing surface 32 to various top, bottom and side housing components 28B, 28C, and 28D, as shown in FIG. 4A, or via an adhesive coupling. Depending on configuration, device 10 can also accommodate a number of additional control and accessory features, including menu and hold buttons, volume switches, and other control mechanisms 20, and audio, camera, lighting, and other accessory features 22. One or more ports or connector apertures 26 may also be provided for power and data communications with device 10, with various mechanical fasteners 52 and access ports 54 to fix cover glass/sensing surface 32 to housing assembly 28, and to provide access to internal components such as a flash memory device or subscriber identity module (SIM card).
  • FIG. 4B is a perspective view of electronic device 10 in an alternate configuration, for example a laptop or notebook computer, a minicomputer, a personal computer, or other portable or stationary data processing application. As shown in FIG. 4B, housing 28 may accommodate an input or keyboard component 56 and separate (e.g., upright or hinged) display 58.
  • In this particular example, input system 12 is provided in the form of a force-sensitive track pad or multi-touch tracking system, with sensing or tracking surface 32 positioned below or adjacent to keyboard 56. In track pad applications of input sensing system 12, sensing surface 32 may incorporate layered sets of capacitive or resistively coupled conductive traces, for example with a dielectric spring membrane or other dielectric separator. Alternatively, piezoelectric devices or other discrete position and force sensors may be used, as described above.
  • As illustrated in FIGS. 4A and 4B, force-sensitive input system 12 is configured to measure the spatial profile of deflection or deformation in sensing surface 32, in response to the forces applied in touch pattern(s) 30. Due to the bending plate (or spring membrane) mechanics, however, deflection is not necessarily localized to the specific area or region of touch pattern 30, but may be distributed and shifted, as shown above in FIGS. 3A and 3B. Thus, it is not trivial to recover or determine an accurate estimate of the applied forces corresponding to touch pattern(s) 30, from the displacement of sensing surface 32 alone.
  • Different algorithms may thus be applied, as described above, in order to estimate the touch forces based on a displacement or deflection map of the screen or track pad surface 32. Depending upon application, the algorithm may also utilize knowledge of the finger positions or touch patterns 30 to improve the force estimate, for example as obtained from a capacitive (e.g., multi-touch) sensor or other position sensing system. Alternatively, the force reconstruction algorithm may be defined based on the deflection map alone, absent any such independent position data. Such techniques are applicable to a range of different electronics applications and devices 10 for which force-sensitive input is desirable, including both display-based and non-display based input devices 12, such as force-sensitive touch screens and track pads.
  • FIG. 5 is a block diagram illustrating an exemplary method 60 for operating an electronic device based on a sensed input force, for example an electronic device 10 with force-sensitive input system 12, as described above. In this particular example, method 60 may include one or more steps including, but not limited to, sensing deflection of a control surface (step 61), mapping displacement of the surface (step 62), and transforming the displacement map (step 63) to generate a force map or force-moment mapping (step 64).
  • The force-moment mapping (step 64) may include one or more force magnitudes (step 65) or centroids (step 66), which can also be utilized to control the device (step 67). Depending on application, the device may also provide for position sensing (step 68) of the touch pattern on the control surface, which can be used to generate a position map (step 70), for example for use in refining the force-moment mapping (step 64).
  • Sensing deflection (step 61) can be performed for touch patterns on the control surface of a track pad, touch screen display, or similar control input device, using a capacitive or resistive grid or discrete sensor array to generate deflection data, as described above. The deflection data (e.g., a set of deflection sensor signals) can be converted into a displacement map (step 62), for example using a two-dimensional pixel grid to characterize displacements across the (e.g., horizontal) control surface, with the displacements measured in a third (e.g., vertical) direction. The two-dimensional data can also be unwrapped into one-dimensional vector form, for example a row or column vector, in either covariant or contravariant form.
  • Depending on control surface configuration, the sensor array may also provide for position sensing (step 68), as described above, in either a single-touch or multi-touch configuration. The position data can also be mapped (step 70), providing a two-dimensional image of the touch pattern(s) on the control surface. The position map can be utilized in conjunction with the force mapping, for example in order to refine the force magnitude determination (step 65), or to determine one or more centroids of the force distribution (step 66).
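  • One hedged sketch of such position-assisted refinement, in the spirit of equation [5], restricts the least-squares fit to the columns of C at the nodes reported by the position sensor; the node indices, forces, and compliance values below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(5)
k, m = 96, 24
C = rng.random((k, m))                        # assumed compliance matrix

touched_nodes = [5, 17]                       # node indices reported by the position sensor
F_true = np.zeros(m)
F_true[touched_nodes] = [0.98, 1.96]          # two-finger touch: ~100 g and ~200 g equivalents
X = C @ F_true                                # measured (here simulated) displacement map

# Fit only the touched columns of C to the displacement map to get per-finger forces.
F_fingers, *_ = np.linalg.lstsq(C[:, touched_nodes], X, rcond=None)
print(F_fingers)                              # approximately [0.98, 1.96]
```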
  • More generally, force-moment mapping (step 64) may be performed independently of position sensing (step 68), and absent any other explicit position data characterizing the touch patterns on the control surface (step 70), outside the deflection data (step 61) and displacement map (step 62). In these applications, one or both of the force magnitudes (step 65) and centroids (step 66) of the force distribution are determined based on deflection alone, by generating the displacement map (step 62) based only on deflection data (step 61). That is, the displacement map (step 62) is transformed (step 63) independently of, and without reference to, any separate position mapping (step 70) or position data (step 68).
  • Transformation of the displacement data is non-trivial, however, as described above. In particular, a compliance matrix C may be defined (step 71), relating the displacement map X to the force distribution F via X=CF. Because the compliance mapping C is not necessarily square, however, and may not have an inverse C−1, a derived result or system output Y is desired (step 72), where the desired system output Y is related to the forces imposed on the control surface by a moment mapping M; that is, with Y=MF.
  • The derived result Y, in turn, is determined based on a transformation of the displacement map (step 73), for example with transformation operator J defined by Y=JX. Based on the definition of the compliance operator C, the transformation operator J can also be described in terms of the force mapping F (step 74), for example with JX=JCF.
  • Thus, a generalized inverse or pseudoinverse operator approach (step 75) may be utilized, where C+ is the generalized inverse of compliance operator C, and the transformation operator is J=MC+. The generalized inverse operator or pseudoinverse C+, in turn, can be defined in either a left-side or right-side convention, based on the number of rows and columns in compliance matrix C, and the corresponding dimension (or number of entries) in the force and displacement mappings F and X, for example as defined in one-dimensional row or column vector form, as described above, or using covariant or contravariant vectors of arbitrary order.
  • The transformation operator J may also include a particular moment M (e.g., J=MC+, as shown above), and may be defined offline, based on a combination of physical calibrations, stress and strain analysis, and finite element analysis techniques. The resulting transformation data can be stored in a lookup table, so that the desired force moment mapping MF can be determined based on the displacement mapping X (step 76), using the lookup table in real time. The form of the force mapping output (step 64) can be defined by the desired moment mapping M, and may include one or more force magnitudes (step 65) and centroids (step 66), which characterize the force distribution due to the touch patterns on the control surface.
  • Device control (step 67) is performed based on the force outputs (step 64), either alone or in combination with independent position data (steps 68 and 70). For example, the force magnitude (step 65) may be utilized to define a scalar input, such as a volume, playback speed, or zoom ratio, with or without reference to position. A position may also be derived from the deflection data, for example in order to generate one or more centroids (step 66), in order to control any of the electronic device operations described herein. Alternatively, the force outputs (step 64) may be used in conjunction with an independent position mapping (step 70), with one or both of the force magnitude (step 65) and centroid (step 66) determined at least in part based on position sensing (step 68), as well as deflection sensing (step 61).
  • While this invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes can be made and equivalents may be substituted for elements thereof, without departing from the spirit and scope of the invention. In addition, modifications may be made to adapt the teachings of the invention to particular situations and materials, without departing from the essential scope thereof. Thus, the invention is not limited to the particular examples that are disclosed herein, but encompasses all embodiments falling within the scope of the appended claims.

Claims (20)

We claim:
1. A touch sensitive input system for an electronic device, the system comprising:
a deflection sensor disposed with respect to a sensing surface of the electronic device, the deflection sensor configured to generate a deflection signal based on deflection of the sensing surface; and
a processor in signal communication with the deflection sensor, the processor operable to generate a displacement map characterizing deflection of the sensing surface based on the deflection signal, the processor further configured to generate a force map characterizing force on the sensing surface based on a transformation of the displacement map;
wherein the transformation of the displacement map comprises a generalized inverse of a compliance operator relating the displacement map to the force map.
2. The system of claim 1, wherein the compliance operator has a rectangular representation, such that the compliance operator has no inverse.
3. The system of claim 1, wherein the compliance operator has more rows than columns, the rows corresponding to entries in the displacement map and the columns corresponding to entries in the force map.
4. The system of claim 1, wherein the deflection sensor comprises an array of conductive traces disposed in a generally parallel sense with respect to the sensing surface, the array of conducting traces configured to generate the deflection signal based on capacitive coupling.
5. The system of claim 1, further comprising a position sensor disposed with respect to the sensing surface, the position sensor configured to generate a position signal based on a touch pattern on the sensing surface, wherein the processor is further operable to generate a position map characterizing the touch pattern, based on the position signal.
6. A mobile device comprising the touch sensitive input system of claim 1, wherein the sensing surface comprises a cover glass configured for viewing a touch screen display.
7. A computing device comprising the touch sensitive input system of claim 1, and further comprising a track pad comprising the sensing surface.
8. A method of operating an electronic device, the method comprising:
sensing deflection of a control surface of the electronic device in response to a touch pattern on the control surface;
generating a displacement map of the control surface, based on the deflection;
defining a transformation relating the displacement map to force imposed on the control surface by the touch pattern; and
generating a force moment map by operation of the transformation on the displacement map, wherein the force moment map describes the force imposed on the control surface by the touch pattern.
9. The method of claim 8, wherein generating the force moment map comprises transforming the displacement map absent other position data relating to the touch pattern on the control surface, such that the force moment map describes the force imposed on the control surface based only on the deflection data, independently of any other such position data.
10. The method of claim 8, further comprising:
sensing a position of the touch pattern on the control surface;
determining a centroid of the force moment map based on the position of the touch pattern; and
describing the force imposed on the control surface by the touch pattern, based on the centroid.
11. The method of claim 8, wherein the transformation is based on a generalized inverse C+ of a compliance operator C relating a displacement map X to force F imposed on the control surface, such that X=CF.
12. The method of claim 11, wherein the compliance operator C has a substantially rectangular representation, such that the compliance operator C has no inverse C−1.
13. The method of claim 11, wherein generating the force moment map of the control surface comprises generating a derived output Y related to the force F imposed on the control surface and a moment M, such that Y=MF.
14. The method of claim 13, wherein the transformation further includes the moment M, such that the derived output Y=MC+X and the force F, moment M, and displacement map X are related by MF=MC+X.
15. The method of claim 8, further comprising determining the transformation by one or more of finite element analysis, mechanical stress analysis, and physical calibration of the displacement of the control surface by known force and moment inputs.
16. The method of claim 15, further comprising storing the transformation operator in the form of a lookup table.
17. The method of claim 16, wherein transforming the displacement map comprises generating a force moment map based on the displacement map and the lookup table, such that the force moment map describes the force imposed on the control surface by the touch pattern.
18. The method of claim 16, wherein transforming the displacement map comprises generating a force moment map based on the displacement map and the lookup table, such that the force moment map describes a force-moment distribution imposed on the control surface by the touch pattern.
19. A non-volatile computer readable storage medium having program code embedded thereon, the program code executable by a processor of an electronic device to perform a method comprising:
sensing deflection of a surface of the electronic device in response to a touch pattern thereon;
generating a displacement map of the surface, based on the deflection;
transforming the displacement map into a force moment map via a transformation operator, the transformation operator based on a generalized inverse of a compliance operator relating the displacement map to force imposed on the surface by the touch pattern;
determining the force imposed on the surface based on the force moment map; and
controlling operation of the electronic device, based on the force.
20. The storage medium of claim 19, the method further comprising generating a derived output Y related to a force F imposed on the surface and a moment M, such that Y=MF;
wherein the transformation operator J determines the derived output Y according to the displacement map X, such that Y=JX;
wherein the transformation operator J comprises a generalized inverse C+ of a compliance operator C relating a displacement map X to the force F imposed on the surface, such that X=CF; and
wherein the transformation operator J further comprises the moment M, such that the force F imposed on the surface, the moment M, and the displacement mapping X are related by the transformation operator J=MC+ according to MF=MC+X.
US14/776,610 2013-03-15 2013-03-15 Touch force deflection sensor Active US9851828B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2013/032697 WO2014143066A1 (en) 2013-03-15 2013-03-15 Touch force deflection sensor

Publications (2)

Publication Number Publication Date
US20160034088A1 true US20160034088A1 (en) 2016-02-04
US9851828B2 US9851828B2 (en) 2017-12-26

Family

ID=48045107

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/776,610 Active US9851828B2 (en) 2013-03-15 2013-03-15 Touch force deflection sensor

Country Status (2)

Country Link
US (1) US9851828B2 (en)
WO (1) WO2014143066A1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160320914A1 (en) * 2013-12-27 2016-11-03 Fujikura Ltd. Input device and method for controlling input device
US20160328067A1 (en) * 2013-12-27 2016-11-10 Fujikura Ltd. Electronic apparatus and method for controlling electronic apparatus
US20170060314A1 (en) * 2015-08-31 2017-03-02 Synaptics Incorporated Estimating force applied by an input object to a touch sensor
US20170308075A1 (en) * 2016-04-26 2017-10-26 Ford Global Technologies, Llc Determination of continuous user interaction and intent through measurement of force variability
DE102017100686A1 (en) 2016-09-12 2018-03-15 Shanghai Tianma Micro-electronics Co., Ltd. DISPLAY FIELD AND DISPLAY DEVICE
US10006937B2 (en) 2015-03-06 2018-06-26 Apple Inc. Capacitive sensors for electronic devices and methods of forming the same
US10007343B2 (en) 2016-03-31 2018-06-26 Apple Inc. Force sensor in an input device
US10048789B2 (en) 2014-02-12 2018-08-14 Apple Inc. Force determination employing sheet sensor and capacitive array
CN108415630A (en) * 2017-02-09 2018-08-17 晶门科技(中国)有限公司 A kind of device and its manufacturing method combining capacitance touching control sensor
CN108415629A (en) * 2017-02-09 2018-08-17 晶门科技(中国)有限公司 A kind of device and its manufacturing method combining capacitance touching control sensor
US10162444B2 (en) 2012-12-14 2018-12-25 Apple Inc. Force sensor incorporated into display
US10162446B2 (en) 2015-08-04 2018-12-25 Apple Inc. Proximity edge sensing
US10168814B2 (en) 2012-12-14 2019-01-01 Apple Inc. Force sensing based on capacitance changes
US10198123B2 (en) 2014-04-21 2019-02-05 Apple Inc. Mitigating noise in capacitive sensor
US10254894B2 (en) 2015-12-23 2019-04-09 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10262179B2 (en) 2013-07-25 2019-04-16 Apple Inc. Input member with capacitive sensor
US10282046B2 (en) 2015-12-23 2019-05-07 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10310659B2 (en) 2014-12-23 2019-06-04 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10318038B2 (en) 2014-12-23 2019-06-11 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10386970B2 (en) 2013-02-08 2019-08-20 Apple Inc. Force determination based on capacitive sensing
US10409421B2 (en) * 2016-06-12 2019-09-10 Apple Inc. Devices and methods for processing touch inputs based on adjusted input parameters
US10613695B2 (en) 2016-01-20 2020-04-07 Parade Technologies, Ltd. Integrated touch sensing and force sensing in a touch detection device
WO2020169953A1 (en) * 2019-02-19 2020-08-27 Cambridge Touch Technologies Ltd. Force sensing touch panel
US10817116B2 (en) 2017-08-08 2020-10-27 Cambridge Touch Technologies Ltd. Device for processing signals from a pressure-sensing touch panel
US20210089168A1 (en) * 2016-02-19 2021-03-25 Apple Inc. Force Sensing Architectures
US11093088B2 (en) 2017-08-08 2021-08-17 Cambridge Touch Technologies Ltd. Device for processing signals from a pressure-sensing touch panel
US11143497B2 (en) * 2017-09-22 2021-10-12 International Business Machines Corporation Determination of a flexible display
US11249638B2 (en) * 2016-08-25 2022-02-15 Parade Technologies, Ltd. Suppression of grip-related signals using 3D touch

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101712346B1 (en) * 2014-09-19 2017-03-22 주식회사 하이딥 Touch input device
US10928180B2 (en) * 2017-04-22 2021-02-23 Tactual Labs Co. Flexible deformation sensor
US10866683B2 (en) 2018-08-27 2020-12-15 Apple Inc. Force or touch sensing on a mobile device using capacitive or pressure sensing
GB202005910D0 (en) * 2020-03-26 2020-06-10 Peratech Holdco Ltd Gaming Device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030214485A1 (en) * 2002-05-17 2003-11-20 Roberts Jerry B. Calibration of force based touch panel systems
US20060066582A1 (en) * 2004-09-24 2006-03-30 Apple Computer, Inc. Raw data track pad device and system
US20100024573A1 (en) * 2008-07-29 2010-02-04 Dodge Daverman Single Sided Capacitive Force Sensor for Electronic Devices
US20100053116A1 (en) * 2008-08-26 2010-03-04 Dodge Daverman Multi-touch force sensing touch-screen devices and methods
US20100117989A1 (en) * 2008-11-12 2010-05-13 Hon Hai Precision Industry Co., Ltd. Touch panel module and touch panel system with same
US20120086666A1 (en) * 2010-10-12 2012-04-12 Cypress Semiconductor Corporation Force Sensing Capacitive Hybrid Touch Sensor
US20120105358A1 (en) * 2010-11-03 2012-05-03 Qualcomm Incorporated Force sensing touch screen
US20120188202A1 (en) * 2009-09-30 2012-07-26 Sharp Kabushiki Kaisha Liquid-crystal panel equipped with touch sensor function

Family Cites Families (141)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59104764A (en) 1982-12-03 1984-06-16 Canon Inc Liquid crystal display keyboard
US5343064A (en) 1988-03-18 1994-08-30 Spangler Leland J Fully integrated single-crystal silicon-on-insulator process, sensors and circuits
US5929517A (en) 1994-12-29 1999-07-27 Tessera, Inc. Compliant integrated circuit package and method of fabricating the same
WO1997018528A1 (en) 1995-11-13 1997-05-22 Synaptics, Inc. Stylus input capacitive touchpad sensor
WO1997040482A1 (en) 1996-04-24 1997-10-30 Logitech, Inc. Touch and pressure sensing method and apparatus
GB2313195A (en) 1996-05-02 1997-11-19 Univ Bristol Data entry device
JP3053007B2 (en) 1997-07-28 2000-06-19 日本電気株式会社 Fingerprint sensor
KR100595924B1 (en) 1998-01-26 2006-07-05 웨인 웨스터만 Method and apparatus for integrating manual input
US7800592B2 (en) 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
EP0967468A1 (en) 1998-06-26 1999-12-29 Bossard AG Capacitive force sensor
JP2003501128A (en) 1999-06-02 2003-01-14 シーメンス アクチエンゲゼルシヤフト Method and apparatus for extracting physiological data
JP3779515B2 (en) 1999-11-10 2006-05-31 グンゼ株式会社 Touch panel
NO20001360D0 (en) 2000-03-15 2000-03-15 Thin Film Electronics Asa Vertical electrical connections in stack
US7289083B1 (en) 2000-11-30 2007-10-30 Palm, Inc. Multi-sided display for portable computer
US6677932B1 (en) 2001-01-28 2004-01-13 Finger Works, Inc. System and method for recognizing touch typing under limited tactile feedback conditions
US6570557B1 (en) 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
US6545495B2 (en) 2001-04-17 2003-04-08 Ut-Battelle, Llc Method and apparatus for self-calibration of capacitive sensors
DE10130373B4 (en) 2001-06-23 2006-09-21 Abb Patent Gmbh Capacitive differential pressure sensor
US6995752B2 (en) 2001-11-08 2006-02-07 Koninklijke Philips Electronics N.V. Multi-point touch pad
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US20040100448A1 (en) 2002-11-25 2004-05-27 3M Innovative Properties Company Touch display
JP4531358B2 (en) 2003-07-10 2010-08-25 株式会社エヌ・ティ・ティ・ドコモ Touch panel display device
JP2005031425A (en) 2003-07-14 2005-02-03 Olympus Corp Microscope objective lens
US6989728B2 (en) 2003-10-14 2006-01-24 Duraswitch Industries, Inc. Flexible magnetically coupled pushbutton switch
DE102004016155B3 (en) 2004-04-01 2006-05-24 Infineon Technologies Ag Force sensor with organic field effect transistors, pressure sensor based thereon, position sensor and fingerprint sensor
CN1332297C (en) 2004-06-09 2007-08-15 夏普株式会社 Protection hood of sensor surface capable of conducting fixed point operation
US7609178B2 (en) 2006-04-20 2009-10-27 Pressure Profile Systems, Inc. Reconfigurable tactile sensor input device
US7373843B2 (en) 2005-06-02 2008-05-20 Fidelica Microsystems Flexible imaging pressure sensor
US7337085B2 (en) 2005-06-10 2008-02-26 Qsi Corporation Sensor baseline compensation in a force-based touch device
CN101000529B (en) 2006-01-13 2011-09-14 北京汇冠新技术股份有限公司 Device for detecting touch of infrared touch screen
US7538760B2 (en) 2006-03-30 2009-05-26 Apple Inc. Force imaging input device and system
US7511702B2 (en) 2006-03-30 2009-03-31 Apple Inc. Force and location sensitive display
JP4736944B2 (en) 2006-05-17 2011-07-27 パナソニック株式会社 Manufacturing method of wiring substrate with conductive layer
US20070272919A1 (en) 2006-05-24 2007-11-29 Matsushita Electric Industrial Co., Ltd. Stressed organic semiconductor devices
US7920134B2 (en) 2007-06-13 2011-04-05 Apple Inc. Periodic sensor autocalibration and emulation by varying stimulus level
EP2073107A1 (en) 2007-12-10 2009-06-24 Research In Motion Limited Electronic device and touch screen having discrete touch-sensitive areas
KR101237640B1 (en) 2008-01-29 2013-02-27 Melfas Inc. Touchscreen apparatus having a structure for preventing formation of parasitic capacitance
US20090237374A1 (en) 2008-03-20 2009-09-24 Motorola, Inc. Transparent pressure sensor and method for using
US8169332B2 (en) 2008-03-30 2012-05-01 Pressure Profile Systems Corporation Tactile device with force sensitive touch input surface
KR100943989B1 (en) 2008-04-02 2010-02-26 MIDT Co., Ltd. Capacitive touch screen
KR20090112118A (en) 2008-04-23 2009-10-28 LG Innotek Co., Ltd. Display device
ATE475928T1 (en) 2008-05-29 2010-08-15 Research In Motion Ltd Electronic device and touch screen display
US20100220065A1 (en) 2009-02-27 2010-09-02 Research In Motion Limited Touch-sensitive display including a force-sensor and portable electronic device including same
KR101016221B1 (en) 2008-11-14 2011-02-25 Korea Research Institute of Standards and Science Method for embodiment of algorithm using force sensor
US20100123686A1 (en) 2008-11-19 2010-05-20 Sony Ericsson Mobile Communications Ab Piezoresistive force sensor integrated in a display
EP2368170B1 (en) 2008-11-26 2017-11-01 BlackBerry Limited Touch-sensitive display method and apparatus
EP2202619A1 (en) 2008-12-23 2010-06-30 Research In Motion Limited Portable electronic device including tactile touch-sensitive input device and method of controlling same
US8760413B2 (en) 2009-01-08 2014-06-24 Synaptics Incorporated Tactile surface
WO2010096499A2 (en) * 2009-02-17 2010-08-26 Noah Anglin Floating plane touch detection system
JP5493739B2 (en) 2009-03-19 2014-05-14 Sony Corporation Sensor device and information processing device
US9024907B2 (en) 2009-04-03 2015-05-05 Synaptics Incorporated Input device with capacitive force sensor and method for constructing the same
JP5146389B2 (en) 2009-04-03 2013-02-20 Sony Corporation Information processing apparatus and estimation method
US9383881B2 (en) 2009-06-03 2016-07-05 Synaptics Incorporated Input device and method with pressure-sensitive layer
JP2011017626A (en) 2009-07-09 2011-01-27 Sony Corp Mechanical quantity detection member and mechanical quantity detection apparatus
CN102473235A (en) 2009-07-09 2012-05-23 Bilcare Technologies Singapore Pte. Ltd. Reading device able to identify a tag or an object adapted to be identified, related methods and systems
US9323398B2 (en) 2009-07-10 2016-04-26 Apple Inc. Touch and hover sensing
US20110012845A1 (en) 2009-07-20 2011-01-20 Rothkopf Fletcher R Touch sensor structures for displays
US20110037706A1 (en) 2009-08-14 2011-02-17 Research In Motion Limited Electronic device including tactile touch-sensitive input device and method of controlling same
US8390481B2 (en) 2009-08-17 2013-03-05 Apple Inc. Sensing capacitance changes of a housing of an electronic device
US8334849B2 (en) 2009-08-25 2012-12-18 Pixart Imaging Inc. Firmware methods and devices for a mutual capacitance touch sensing device
US8072437B2 (en) 2009-08-26 2011-12-06 Global Oled Technology Llc Flexible multitouch electroluminescent display
US8780055B2 (en) 2009-10-02 2014-07-15 Blackberry Limited Low power wakeup detection circuit and a portable electronic device having a low power wakeup detection circuit
EP2315102B1 (en) 2009-10-02 2018-01-31 BlackBerry Limited A low power wakeup detection circuit and a portable electronic device having a low power wakeup detection circuit
TWI419028B (en) 2009-10-07 2013-12-11 Wintek Corp Touch panel and display device using the same
TWI420375B (en) 2009-10-09 2013-12-21 Egalax Empia Technology Inc Device and method for parallel-scanning differential touch panel
US10068728B2 (en) 2009-10-15 2018-09-04 Synaptics Incorporated Touchpad with capacitive force sensing
EP2315186B1 (en) 2009-10-26 2016-09-21 Lg Electronics Inc. Mobile terminal with flexible body for inputting a signal upon bending said body
TWI408449B (en) 2009-11-03 2013-09-11 Wintek Corp Liquid crystal display panel
JP5347913B2 (en) 2009-11-06 2013-11-20 Sony Corporation Sensor device, electronic device, and method for manufacturing sensor device
US8633916B2 (en) 2009-12-10 2014-01-21 Apple, Inc. Touch pad with force sensors and actuator feedback
JP5526761B2 (en) 2009-12-22 2014-06-18 Sony Corporation Sensor device and information processing device
EP2357547B1 (en) 2010-01-04 2014-03-26 BlackBerry Limited Portable electronic device and method of controlling same
US8669963B2 (en) 2010-02-03 2014-03-11 Interlink Electronics, Inc. Sensor system
US9092129B2 (en) 2010-03-17 2015-07-28 Logitech Europe S.A. System and method for capturing hand annotations
JP5540797B2 (en) 2010-03-19 2014-07-02 Sony Corporation Sensor device and display device
US20110235156A1 (en) 2010-03-26 2011-09-29 Qualcomm Mems Technologies, Inc. Methods and devices for pressure detection
US9057653B2 (en) 2010-05-11 2015-06-16 Synaptics Incorporated Input device with force sensing
US8274495B2 (en) 2010-05-25 2012-09-25 General Display, Ltd. System and method for contactless touch screen
EP2580647A1 (en) 2010-06-11 2013-04-17 3M Innovative Properties Company Positional touch sensor with force measurement
US20120026123A1 (en) 2010-07-30 2012-02-02 Grunthaner Martin Paul Compensation for Capacitance Change in Touch Sensing Device
US8963874B2 (en) 2010-07-31 2015-02-24 Symbol Technologies, Inc. Touch screen rendering system and method of operation thereof
EP2418561B1 (en) 2010-08-11 2012-10-03 Research In Motion Limited Electronic device including touch-sensitive display
US20120038577A1 (en) 2010-08-16 2012-02-16 Floatingtouch, Llc Floating plane touch input device and method
US8923014B2 (en) 2010-08-19 2014-12-30 Lg Display Co., Ltd. Display device having touch panel
US8982081B2 (en) * 2010-09-12 2015-03-17 Shenzhen New Degree Technology Co., Ltd. Displacement sensing touch panel and touch screen using the same
JP5625669B2 (en) 2010-09-17 2014-11-19 Sony Corporation Sensor device and information processing device
US8351993B2 (en) 2010-10-08 2013-01-08 Research In Motion Limited Device having side sensor
KR101704536B1 (en) 2010-10-08 2017-02-09 Samsung Electronics Co., Ltd. Slim touch panel and portable device including the same
EP2628069B1 (en) 2010-10-12 2020-12-02 New York University Apparatus for sensing utilizing tiles, sensor having a set of plates, object identification for multi-touch surfaces, and method
US8743083B2 (en) 2010-10-15 2014-06-03 Logitech Europe, S.A. Dual mode touchpad with a low power mode using a proximity detection mode
US20120092279A1 (en) 2010-10-18 2012-04-19 Qualcomm Mems Technologies, Inc. Touch sensor with force-actuated switched capacitor
TWI428811B (en) 2010-10-22 2014-03-01 Hon Hai Prec Ind Co Ltd Electronic device including a touch sensitive screen and wrist worn electronic device
CN201828892U (en) 2010-11-02 2011-05-11 Mobase Electronics (Suzhou) Co., Ltd. Touch pad
US8724861B1 (en) 2010-12-06 2014-05-13 University Of South Florida Fingertip force, location, and orientation sensor
TWM422119U (en) 2010-12-30 2012-02-01 Egalax Empia Technology Inc Capacitive touch screen
US20120169612A1 (en) 2010-12-30 2012-07-05 Motorola, Inc. Method and apparatus for a touch and nudge interface
US8577289B2 (en) 2011-02-17 2013-11-05 Apple Inc. Antenna with integrated proximity sensor for proximity-based radio-frequency power control
US8638316B2 (en) 2011-03-11 2014-01-28 Cypress Semiconductor Corporation Two prong capacitive sensor pattern
GB2494482A (en) 2011-04-06 2013-03-13 Research In Motion Ltd Gesture recognition on a portable device with force-sensitive housing
JP5730240B2 (en) 2011-04-25 2015-06-03 Shin-Etsu Polymer Co., Ltd. Capacitance sensor sheet manufacturing method and capacitance sensor sheet
US20120274602A1 (en) 2011-04-29 2012-11-01 Qualcomm Mems Technologies, Inc. Wiring and periphery for integrated capacitive touch devices
JP5666698B2 (en) 2011-05-20 2015-02-12 Alps Electric Co., Ltd. Load detecting device, electronic device using the load detecting device, and method for manufacturing load detecting device
TW201250530A (en) 2011-06-07 2012-12-16 Hannstar Display Corp Integrated touch panel structure and manufacturing method thereof
US20120319987A1 (en) 2011-06-15 2012-12-20 Synaptics Incorporated System and method for calibrating an input device
JP5710837B2 (en) 2011-06-20 2015-04-30 Synaptics Incorporated Touch and display device with integrated sensor controller
US9490804B2 (en) 2011-09-28 2016-11-08 Cypress Semiconductor Corporation Capacitance sensing circuits, methods and systems having conductive touch surface
KR101984161B1 (en) 2011-11-08 2019-05-31 Samsung Electronics Co., Ltd. Touch screen panel and portable device
JP5273328B1 (en) 2011-11-11 2013-08-28 Panasonic Corporation Touch panel device
US10144669B2 (en) 2011-11-21 2018-12-04 Apple Inc. Self-optimizing chemical strengthening bath for glass
US8922523B2 (en) 2011-11-29 2014-12-30 Apple Inc. Embedded force measurement
US9360977B2 (en) 2011-12-08 2016-06-07 Sony Mobile Communications Ab Input interface, handheld electronic device and method of producing an input interface
US20130176270A1 (en) 2012-01-09 2013-07-11 Broadcom Corporation Object classification for touch panels
US9459738B2 (en) 2012-03-06 2016-10-04 Apple Inc. Calibration for pressure effects on touch sensor panels
US8919211B1 (en) 2012-04-17 2014-12-30 Alarm.Com Incorporated Force-sensitive occupancy sensing technology
US9086768B2 (en) 2012-04-30 2015-07-21 Apple Inc. Mitigation of parasitic capacitance
JP6023879B2 (en) 2012-05-18 2016-11-09 Apple Inc. Apparatus, method and graphical user interface for operating a user interface based on fingerprint sensor input
US9030440B2 (en) 2012-05-18 2015-05-12 Apple Inc. Capacitive sensor packaging
CN104604207A (en) 2012-06-05 2015-05-06 NEC Casio Mobile Communications, Ltd. Mobile terminal device
KR20130137438A (en) 2012-06-07 2013-12-17 Samsung Electro-Mechanics Co., Ltd. Touch sensor and manufacturing method thereof
US9430102B2 (en) 2012-07-05 2016-08-30 Apple Inc. Touch interface using patterned bulk amorphous alloy
WO2014018121A1 (en) 2012-07-26 2014-01-30 Changello Enterprise Llc Fingerprint-assisted force estimation
WO2014018119A1 (en) 2012-07-26 2014-01-30 Changello Enterprise Llc Ultrasound-based force and touch sensing
US20140085247A1 (en) 2012-09-21 2014-03-27 Apple Inc. Force Sensing Using Dual-Layer Cover Glass with Gel Adhesive and Capacitive Sensing
US20140111953A1 (en) 2012-10-19 2014-04-24 Apple Inc. Electronic Devices With Components Mounted to Touch Sensor Substrates
US9104898B2 (en) 2012-11-21 2015-08-11 Lenovo (Singapore) Pte. Ltd. Utilizing force information to improve fingerprint reading
US9088282B2 (en) 2013-01-25 2015-07-21 Apple Inc. Proximity sensors with optical and electrical sensing capabilities
US10386970B2 (en) 2013-02-08 2019-08-20 Apple Inc. Force determination based on capacitive sensing
US8577644B1 (en) 2013-03-11 2013-11-05 Cypress Semiconductor Corp. Hard press rejection
WO2014143065A1 (en) 2013-03-15 2014-09-18 Rinand Solutions Llc Force-sensitive fingerprint sensing input
US10229258B2 (en) 2013-03-27 2019-03-12 Samsung Electronics Co., Ltd. Method and device for providing security content
US9207134B2 (en) 2013-04-04 2015-12-08 Chung-Yuan Christian University Electronic device and quad-axial force and torque measurement sensor thereof
US9671889B1 (en) 2013-07-25 2017-06-06 Apple Inc. Input member with capacitive sensor
US9697409B2 (en) 2013-09-10 2017-07-04 Apple Inc. Biometric sensor stack structure
WO2015080696A1 (en) 2013-11-26 2015-06-04 Rinand Solutions Llc Self-calibration of force sensors and inertial compensation
US9375874B2 (en) 2013-12-03 2016-06-28 Htc Corporation Accessory, electronic assembly, control method, and method for forming an accessory
US20150185946A1 (en) 2013-12-30 2015-07-02 Google Inc. Touch surface having capacitive and resistive sensors
US9411458B2 (en) 2014-06-30 2016-08-09 Synaptics Incorporated System and method for determining input object information from proximity and force measurements
US9390308B2 (en) 2014-09-12 2016-07-12 Blackberry Limited Fingerprint scanning method
WO2016172713A1 (en) 2015-04-23 2016-10-27 Shenzhen Huiding Technology Co., Ltd. Multifunction fingerprint sensor
CN204650590U (en) 2015-05-12 2015-09-16 Tianjin Qicheng Technology Development Co., Ltd. Goods delivery manipulator
US9715301B2 (en) 2015-08-04 2017-07-25 Apple Inc. Proximity edge sensing

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030214485A1 (en) * 2002-05-17 2003-11-20 Roberts Jerry B. Calibration of force based touch panel systems
US20060066582A1 (en) * 2004-09-24 2006-03-30 Apple Computer, Inc. Raw data track pad device and system
US20100024573A1 (en) * 2008-07-29 2010-02-04 Dodge Daverman Single Sided Capacitive Force Sensor for Electronic Devices
US20100053116A1 (en) * 2008-08-26 2010-03-04 Dodge Daverman Multi-touch force sensing touch-screen devices and methods
US20100117989A1 (en) * 2008-11-12 2010-05-13 Hon Hai Precision Industry Co., Ltd. Touch panel module and touch panel system with same
US20120188202A1 (en) * 2009-09-30 2012-07-26 Sharp Kabushiki Kaisha Liquid-crystal panel equipped with touch sensor function
US20120086666A1 (en) * 2010-10-12 2012-04-12 Cypress Semiconductor Corporation Force Sensing Capacitive Hybrid Touch Sensor
US20120105358A1 (en) * 2010-11-03 2012-05-03 Qualcomm Incorporated Force sensing touch screen

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10162444B2 (en) 2012-12-14 2018-12-25 Apple Inc. Force sensor incorporated into display
US10168814B2 (en) 2012-12-14 2019-01-01 Apple Inc. Force sensing based on capacitance changes
US10386970B2 (en) 2013-02-08 2019-08-20 Apple Inc. Force determination based on capacitive sensing
US10262179B2 (en) 2013-07-25 2019-04-16 Apple Inc. Input member with capacitive sensor
US20160328067A1 (en) * 2013-12-27 2016-11-10 Fujikura Ltd. Electronic apparatus and method for controlling electronic apparatus
US20160320914A1 (en) * 2013-12-27 2016-11-03 Fujikura Ltd. Input device and method for controlling input device
US10048789B2 (en) 2014-02-12 2018-08-14 Apple Inc. Force determination employing sheet sensor and capacitive array
US10198123B2 (en) 2014-04-21 2019-02-05 Apple Inc. Mitigating noise in capacitive sensor
US10318038B2 (en) 2014-12-23 2019-06-11 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10310659B2 (en) 2014-12-23 2019-06-04 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10006937B2 (en) 2015-03-06 2018-06-26 Apple Inc. Capacitive sensors for electronic devices and methods of forming the same
US10162446B2 (en) 2015-08-04 2018-12-25 Apple Inc. Proximity edge sensing
US20170060314A1 (en) * 2015-08-31 2017-03-02 Synaptics Incorporated Estimating force applied by an input object to a touch sensor
US10261619B2 (en) * 2015-08-31 2019-04-16 Synaptics Incorporated Estimating force applied by an input object to a touch sensor
US10282046B2 (en) 2015-12-23 2019-05-07 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10254894B2 (en) 2015-12-23 2019-04-09 Cambridge Touch Technologies Ltd. Pressure-sensitive touch panel
US10613695B2 (en) 2016-01-20 2020-04-07 Parade Technologies, Ltd. Integrated touch sensing and force sensing in a touch detection device
US20210089168A1 (en) * 2016-02-19 2021-03-25 Apple Inc. Force Sensing Architectures
US11803276B2 (en) * 2016-02-19 2023-10-31 Apple Inc. Force sensing architectures
US10007343B2 (en) 2016-03-31 2018-06-26 Apple Inc. Force sensor in an input device
US20170308075A1 (en) * 2016-04-26 2017-10-26 Ford Global Technologies, Llc Determination of continuous user interaction and intent through measurement of force variability
US10372121B2 (en) * 2016-04-26 2019-08-06 Ford Global Technologies, Llc Determination of continuous user interaction and intent through measurement of force variability
US10409421B2 (en) * 2016-06-12 2019-09-10 Apple Inc. Devices and methods for processing touch inputs based on adjusted input parameters
US11449214B2 (en) 2016-08-25 2022-09-20 Parade Technologies, Ltd. 3D touch enabled gestures
US11249638B2 (en) * 2016-08-25 2022-02-15 Parade Technologies, Ltd. Suppression of grip-related signals using 3D touch
DE102017100686A1 (en) 2016-09-12 2018-03-15 Shanghai Tianma Micro-electronics Co., Ltd. Display panel and display device
CN108415630A (en) * 2017-02-09 2018-08-17 Solomon Systech (China) Limited Device combining a capacitive touch sensor and manufacturing method thereof
CN108415629A (en) * 2017-02-09 2018-08-17 Solomon Systech (China) Limited Device combining a capacitive touch sensor and manufacturing method thereof
US10235002B2 (en) * 2017-02-09 2019-03-19 Solomon Systech Limited Touch sensor
US10817116B2 (en) 2017-08-08 2020-10-27 Cambridge Touch Technologies Ltd. Device for processing signals from a pressure-sensing touch panel
US11093088B2 (en) 2017-08-08 2021-08-17 Cambridge Touch Technologies Ltd. Device for processing signals from a pressure-sensing touch panel
US11143497B2 (en) * 2017-09-22 2021-10-12 International Business Machines Corporation Determination of a flexible display
US11494022B2 (en) 2019-02-19 2022-11-08 Cambridge Touch Technologies Ltd. Force sensing touch panel
WO2020169953A1 (en) * 2019-02-19 2020-08-27 Cambridge Touch Technologies Ltd. Force sensing touch panel

Also Published As

Publication number Publication date
US9851828B2 (en) 2017-12-26
WO2014143066A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US9851828B2 (en) Touch force deflection sensor
US9958994B2 (en) Shear force detection using capacitive sensors
US8169332B2 (en) Tactile device with force sensitive touch input surface
US9454255B2 (en) Device and method for localized force sensing
CN105992991B (en) Low-profile pointing stick
US8026906B2 (en) Integrated force sensitive lens and software
JP5897055B2 (en) Pressure sensor and touch panel
US8800385B2 (en) Detection device, electronic apparatus, and robot
KR101793769B1 (en) System and method for determining object information using an estimated deflection response
KR20180016132A (en) electronic device including fingerprint sensor
US9195339B2 (en) System and method for determining object information using an estimated rigid motion response
US20130257744A1 (en) Piezoelectric tactile interface
CN106030467A (en) Flexible sensor
CN107436694B (en) Force sensor with uniform response along axis
US10394364B2 (en) Touch pressure sensitivity correction method and computer-readable recording medium
US20150169123A1 (en) Touch sensor controller and method for driving the same
CN105992992A (en) Low-profile pointing stick
KR102363707B1 (en) Electronic apparatus comprising a force sensor and method for controlling the same
WO2014058005A1 (en) Input device and multiple point load detection method employing input device
US20190064984A1 (en) Force electrodes with non-uniform spacing
US11928292B2 (en) Touchscreen device
KR20220133623A (en) Display including strain gauge sensor and electronic device including the same
KR20210123913A (en) Electronic device including force sensors disposed on the same layer and control method thereof

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4