US20110292268A1 - Multi-region touchpad device
- Publication number
- US20110292268A1 (application number US 12/851,421)
- Authority
- US
- United States
- Prior art keywords
- touch
- menu
- band
- touchpad
- recited
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D1/00—Steering controls, i.e. means for initiating a change of direction of the vehicle
- B62D1/02—Steering controls, i.e. means for initiating a change of direction of the vehicle vehicle-mounted
- B62D1/04—Hand wheels
- B62D1/046—Adaptations on rotatable parts of the steering wheel for accommodation of switches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/021—Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
- G06F3/0213—Arrangements providing an integrated pointing device in a keyboard, e.g. trackball, mini-joystick
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10T—TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
- Y10T74/00—Machine element or mechanism
- Y10T74/20—Control lever and linkage systems
- Y10T74/20576—Elements
- Y10T74/20732—Handles
- Y10T74/20834—Hand wheels
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10T—TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
- Y10T74/00—Machine element or mechanism
- Y10T74/20—Control lever and linkage systems
- Y10T74/20576—Elements
- Y10T74/20732—Handles
- Y10T74/20834—Hand wheels
- Y10T74/2087—Rim grips and covers
Definitions
- Handheld devices have become more and more prevalent, in forms such as cellular phones, wireless phones, smartphones, music players, video players, netbooks, laptop computers, e-reading devices, tablet computers, cameras, controllers, remote controls, analytic devices, sensors, and many other types of devices.
- Many such devices have touch sensitive color displays that can detect touching by a finger or stylus.
- Various types of touch sensitive displays exist, including those using capacitive sensors, resistive sensors, and active digitizers. Some displays are limited to detecting only single touches, while others are capable of sensing multiple simultaneous touches.
- Touch sensitive displays are convenient in handheld devices because of the simplicity of their operation to the user. Menu items can be displayed and a user can interact directly with the menu items by touching or tapping them, without the need to position or manipulate an on-screen indicator such as a pointer, arrow, or cursor. Furthermore, the touch capabilities of the display reduce the need for additional hardware input devices such as buttons, knobs, switches, mice, pointing sticks, track pads, joysticks, and other types of input devices.
- One disadvantage of touch sensitive user interfaces, however, is that a user's finger can often obstruct the user's view of the display, and repeated touching of the display can result in fingerprints and smudges that obscure the display. Furthermore, it may be awkward in some devices for a user to both hold the device and provide accurate touch input via the display, especially with one hand. Because of this, many devices are more awkward in operation than would be desirable.
- FIG. 1 is a rear perspective view of a handheld device utilizing a rear touch panel.
- FIG. 2 is a rear view of the handheld device of FIG. 1 , showing a possible hand and finger placement relative to a rear touch panel.
- FIG. 3 is a rear view of the handheld device of FIG. 1 , showing another possible hand and finger placement relative to the rear touch panel.
- FIG. 4 is a front perspective view of an alternative handheld device utilizing an edge touch panel.
- FIG. 5 is a front view of the handheld device of FIG. 1 , showing an embodiment of a banded menu structure that can be used in conjunction with the rear touch panel shown in FIGS. 1 and 2 .
- FIG. 6 is a front perspective view of the handheld device of FIG. 1 , showing the relationship between its rear touch panel and the banded menu structure shown in FIG. 5 .
- FIG. 7 is a close-up of a banded menu structure such as might be implemented in conjunction with a handheld device.
- FIG. 8 is a front view of a handheld device such as shown in FIG. 1 , illustrating an example of a possible user interaction with the handheld device.
- FIGS. 9-15 are close-ups of banded menu configurations illustrating user interface examples.
- FIG. 16 is a flowchart showing how a menu structure such as shown in FIG. 7 might be utilized in a handheld device.
- FIG. 17 is a block diagram showing relevant components of a handheld device that might be used to support the menus and related components described herein.
- FIG. 18 is a drawing of a general-purpose computer having a keyboard that incorporates a multi-level touchpad in accordance with the techniques described herein.
- FIG. 19 is a perspective view of a portion of the keyboard shown in FIG. 18 , showing more details of the multi-level touchpad.
- FIG. 20 illustrates a menu structure that is used in conjunction with the touchpad shown in FIGS. 18 and 19 .
- FIGS. 21 and 22 illustrate a usage scenario that utilizes the touchpad and menu structure of FIGS. 18-20 .
- FIGS. 23 and 24 illustrate another usage scenario that utilizes the touchpad and menu structure of FIGS. 18-20 .
- FIGS. 25 and 26 illustrate a usage scenario that utilizes the touchpad of FIGS. 18-19 .
- FIG. 27 is a flowchart illustrating a generalized procedure for implementing the usage scenarios described above with reference to FIGS. 18-26 .
- FIG. 28 is a perspective view of a computer mouse that incorporates a multi-region touchpad.
- FIG. 29 is a perspective view of a digitizer pad that incorporates a multi-region touchpad.
- FIG. 30 is a perspective view of a tablet computer that incorporates a multi-region touchpad.
- FIG. 31 is a perspective view of a remote control that incorporates a multi-region touchpad.
- FIG. 32 is a perspective view of a game controller that incorporates a multi-region touchpad.
- FIG. 33 is a perspective view of a digital camera that incorporates a multi-region touchpad.
- FIG. 34 is a perspective view of an automobile interior having a steering wheel that incorporates a multi-region touchpad.
- FIG. 1 shows a handheld device 100 featuring a front surface 101 (not visible in FIG. 1 ) and an alternate surface (in this case a back or rear surface) 102 .
- Handheld device 100 may be held in one hand by a user, with front surface 101 facing and visible to the user.
- Alternate surface 102 is, in this embodiment, opposite front surface 101 , and faces away from the user during typical handheld operation.
- Front surface 101 may have a display and/or other user interface elements.
- Handheld device 100 has a touch sensitive sensor 103 , also referred to herein as a touch panel or multi-region touchpad.
- Touch panel 103 is situated in the alternate surface, in this embodiment facing away from a user who is holding handheld device 100 .
- A user's finger, such as the index finger, may be positioned over or on touch panel 103; touch panel 103 is positioned in such a way as to make this finger placement comfortable and convenient.
- FIGS. 2 and 3 show two examples of how device 100 might be grasped by a user.
- In FIG. 2, the user holds device 100 with a single hand 201 in a portrait orientation, with index finger 202 positioned over touch panel 103 for operation of touch panel 103.
- In FIG. 3, the user holds device 100 in a landscape position with left hand 301 and right hand 302, with index finger 303 of the left hand positioned over touch panel 103.
- Touch panel 103 has multiple areas that are tactually delineated from each other so that a user can distinguish between the areas by touch.
- In the illustrated embodiment, the areas comprise a plurality of successively nested or hierarchically arranged annular rings or bands 104.
- Bands 104 may be concentric in some embodiments, and may surround a common central touch area 105 . Individual bands 104 may be referred to as touch bands in the following discussion.
- In the illustrated embodiment, each of bands 104 has a different elevation or depth relative to alternate surface 102 of handheld device 100.
- Each successively inward band is stepped down in elevation from alternate surface 102 or from its outwardly neighboring band.
- Outer band 104(a) is stepped down from alternate surface 102 and therefore is deeper, or has a lower elevation, than alternate surface 102.
- Middle band 104(b) is stepped down from its outwardly neighboring band 104(a) and is therefore deeper and has a lower elevation than outer band 104(a).
- Inner band 104(c) is stepped down from its outwardly neighboring band 104(b) and is therefore deeper and has a lower elevation than middle band 104(b).
- Central area 105 is stepped down from surrounding inner band 104(c) and is therefore deeper and has a lower elevation than inner band 104(c).
- Alternatively, touch bands 104 may each successively extend upward from the bordering larger band.
- In this configuration, outer band 104(a) may be lower than middle band 104(b), which in turn is lower than inner band 104(c), which in turn is lower than central area 105, thus forming a convex arrangement.
- In still other configurations, the respective bands may all share the same level but be tactually detectable by virtue of a raised border between them. For purposes of simplicity, however, the discussion below will address only a concave arrangement of touchpad 103.
- Bands 104 can be irregularly shaped or can form a wide variety of shapes such as circles, ovals, rectangles, or squares. In the illustrated embodiment, bands 104 are irregularly shaped to allow easy finger positioning at desired locations. The irregular shape of bands 104 allows a user to learn the orientation of the bands and thus aids in non-visual interaction with touch panel 103 .
- Touch panel 103 is sensitive to touch, and can detect the particular location at which it is touched or pressed. Thus, it can detect which individual band 104 is touched, and the position or coordinates along the band of the touched location. A user can slide his or her finger radially between bands 104 or around a single band 104 , and touch panel 103 can detect the movement and absolute placement of the finger as it moves along or over the bands. Central area 105 is also sensitive to touch in the same manner.
- Touch panel 103 can be implemented using capacitive, resistive, or pressure sensing technology, or using other technologies that can detect a user's finger placement. Touch panel 103 may also integrate additional sensors, such as sensors that detect the pressing or depression of central area 105 or other areas of touch panel 103 .
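The band/position detection described above can be sketched as a simple hit test. The concentric-circle geometry, band names, and radii below are hypothetical simplifications (the patent's bands may be irregularly shaped); the sketch only illustrates resolving a raw touch coordinate into a band identity and a position along that band.

```python
import math

# Hypothetical band geometry: concentric circular bands, defined by their
# outer radii in millimeters from the center of the touch panel. The
# patent's bands are irregular; circles are used purely for illustration.
BAND_RADII = [
    ("center", 8.0),   # central touch area 105
    ("inner", 16.0),   # inner band 104(c)
    ("middle", 24.0),  # middle band 104(b)
    ("outer", 32.0),   # outer band 104(a)
]

def hit_test(x, y):
    """Map a raw touch coordinate (relative to the panel center) to a
    (band_name, angle_degrees) pair, or None if the touch is outside
    all bands. The angle locates the finger along the band."""
    r = math.hypot(x, y)
    angle = math.degrees(math.atan2(y, x)) % 360.0
    for name, outer_r in BAND_RADII:
        if r <= outer_r:
            return name, angle
    return None

print(hit_test(0.0, 5.0))    # ('center', 90.0)
print(hit_test(20.0, 0.0))   # ('middle', 0.0)
```

As the finger slides radially between bands or around a single band, repeated calls to a hit test of this kind yield the stream of (band, position) readings the device would act on.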
- Different embodiments may utilize different numbers of bands; for example, a single band or two bands may be used. Furthermore, the bands may be shaped and positioned differently.
- FIG. 4 shows an embodiment of handheld device 100 having two straight or linear touch-sensitive areas or bands 401 and 402 , positioned adjacently along the vertical length of the right side or edge 403 of handheld device 100 .
- Front touch band 401 is positioned on the right edge 403 , toward or adjacent front surface 101 .
- Rear touch band 402 is positioned on the right edge 403 , toward or adjacent rear surface 102 .
- Tactile delineation between touch bands 401 and 402 can be provided by a ridge or valley between the bands.
- Alternatively, the bands can have different elevations relative to right side surface 403.
- FIG. 5 is a front view of handheld device 100 (in this embodiment, a cellular phone), showing one possible configuration of front surface 101 .
- Handheld device 100 has a front-facing display or display panel 501 in front surface 101.
- Display panel 501 may be a touch sensitive display panel.
- Other user interface elements such as buttons, indicators, speakers, microphones, etc., may also be located on or around front surface 101 , although they are not shown in FIG. 5 .
- Display panel 501 can be used as part of a user interface to operate handheld device 100 . It can also be used to display content, such as text, video, pictures, etc.
- A graphical menu 502 can be displayed at times on front display 501.
- Menu 502 has a plurality of graphically- or visually-delineated menu areas or bands 504 corresponding respectively to the tactually-delineated touch sensitive areas 104 on alternate surface 102 .
- Menu areas 504 include an outer band 504(a), a middle band 504(b), and an inner band 504(c).
- Menu 502 also includes a center visual area 505.
- FIG. 6 illustrates relative positions of touch panel 103 and graphical menu 502 in one embodiment.
- In this embodiment, rear touch panel 103 is positioned opposite and directly behind display panel 501.
- Bands 504 of graphical menu 502 are shaped and sized the same as their corresponding touch-panel bands 104 , and are positioned at the corresponding or same lateral coordinates along front surface 101 and alternate surface 102 .
- Outer touch band 104(a) has generally the same size, shape, and lateral position as outer menu band 504(a); middle touch band 104(b) has generally the same size, shape, and lateral position as middle menu band 504(b); inner touch band 104(c) has generally the same size, shape, and lateral position as inner menu band 504(c); and center area 105 of touch panel 103 has generally the same size, shape, and lateral position as center area 505 of front display panel 501.
- Graphical menu 502 faces the user, and touch panel 103 faces away from the user.
- Display panel 501 and touch panel 103 may or may not be precisely parallel with each other.
- In some embodiments, there may be a lateral and/or angular offset between graphical menu 502 and touch panel 103, such that touch panel 103 is not directly behind menu 502 or is not parallel with the surface of display panel 501.
- Likewise, the correspondence in size and shape between the menu bands and the touch bands may not be exact in all embodiments.
- The bands and center area of touch panel 103 and menu 502 may differ from one another, but will be similar enough that when a user interacts with touch panel 103, the user perceives a one-to-one positional correspondence with the elements of menu 502.
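The perceived one-to-one correspondence between the rear touch panel and the front menu could be realized by a coordinate mapping along these lines. The panel and display dimensions, the units, and the assumption that the horizontal axis must be mirrored (because the two surfaces face in opposite directions, each viewed face-on) are all illustrative assumptions, not details taken from the patent.

```python
# Hypothetical dimensions: rear touch panel in millimeters, front display
# in pixels. Both values are assumed for illustration only.
PANEL_W, PANEL_H = 40.0, 60.0
DISPLAY_W, DISPLAY_H = 480, 800

def panel_to_display(px, py):
    """Convert a (px, py) touch on the rear panel, in panel mm, to front
    display pixels, mirroring the horizontal axis so that the user
    perceives a one-to-one positional correspondence with the menu."""
    nx = 1.0 - (px / PANEL_W)   # mirror horizontally across the device
    ny = py / PANEL_H
    return round(nx * DISPLAY_W), round(ny * DISPLAY_H)

print(panel_to_display(0.0, 0.0))    # (480, 0): panel's left edge, display's right edge
print(panel_to_display(20.0, 30.0))  # (240, 400): the two centers coincide
```

Any lateral or angular offset between the panel and the display, as mentioned above, would add a small correction to this mapping; the essential point is only that each touch location resolves to one menu location.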
- Menu items are displayed in menu bands 504.
- Each displayed menu item is located at a particular point on a menu band 504, and therefore corresponds to a similar point on the corresponding touch band 104 of touch panel 103.
- A particular menu band 504 can be selected or activated by touching its corresponding touch band.
- A particular menu item can be selected or activated by touching the corresponding position or location on the corresponding touch band 104.
- Touching any particular location on touchpad 103 can be considered similar to touching or clicking on the corresponding location on graphical menu 502. If a user desires to select a menu item or some other graphical object positioned at a particular point on menu 502, for example, he or she presses the corresponding point or location on touch panel 103.
- The tactual delineations between bands of touch panel 103 help the user identify and move between graphical menu bands to locate particular menu item groups.
- FIG. 7 shows details of how such a menu 502 might be structured.
- FIG. 7 shows a menu structure 700 as an example of both menu 502 and its corresponding touch panel 103 .
- This example uses two selection bands: an outer band 701 and an inner band 702 , both of which surround a center area 703 .
- Outer band 701 corresponds to an outer displayed menu band and a correspondingly positioned outer touch band on alternate surface 102 .
- Inner band 702 corresponds to a displayed inner menu band and a correspondingly positioned inner touch band on alternate surface 102 .
- Center area 703 corresponds to an area within the displayed menu as well as a correspondingly positioned touch sensitive area on touch panel 103 .
- In this example, touch panel 103 has two touch bands, corresponding to the two bands shown in FIG. 7.
- Each of the menu bands 701 and 702 contains a group of related menu items. Each menu item may be represented by text or by a graphical element, object, or icon. In this example, the items are represented by text.
- Inner menu band 702 contains menu items labeled “ITEM A1”, “ITEM A2”, “ITEM A3”, “ITEM A4”, “ITEM A5”, and “ITEM A6”.
- Outer menu band 701 contains menu items labeled “ITEM B1”, “ITEM B2”, “ITEM B3”, “ITEM B4”, “ITEM B5”, “ITEM B6”, and “ITEM B7”.
- Each menu band 701 and 702 may also have a band heading or title, indicating the category or type of menu items contained within the band.
- Inner menu band 702 has the heading “GROUP A”, and outer menu band 701 has the heading “GROUP B”.
- Handheld device 100 is configured to initiate actions associated respectively with the menu items in response to their selection.
- FIG. 7 illustrates one of many variations of band shapes that might be utilized when implementing both menu 502 and its corresponding touch panel 103 .
- In FIG. 7, the bands have larger widths toward their right-hand and lower sides. This configuration is intended to work well when the device is held in the user's left hand, with the left index finger interacting with touch panel 103. This leaves the right hand free to interact with display panel 501 on front surface 101.
- Alternatively, touch panel 103 may be symmetrical, with bands that are the same width on their left and right sides.
- Menu 502 might nevertheless be non-symmetrical, similar to menu structure 700.
- The non-symmetry of menu 502 might allow menu item labels and icons to fit easily within its right-hand side.
- The slight differences between the shapes of the touch bands and the corresponding menu bands will likely be nearly imperceptible to a user, or at least easily ignored.
- This arrangement allows menu 502 to be displayed using either a right-hand or left-hand orientation, depending on preferences of a user, while using the same touch panel 103 .
- For purposes of discussion, interaction with touch panel 103 will be described with reference to bands and locations of menu structure 700. Thus, “touching” or “tapping” ITEM A1 is understood to mean that the user touches the corresponding location on touch panel 103.
- Menu structure 700 can be sensitive to the context that is otherwise presented by handheld device 100 .
- The particular menu items found on menu 700 may vary depending on the activity that is being performed on handheld device 100.
- Furthermore, different bands of menu 700 can have menu items that vary depending on a previous selection within a different band. Specific examples will be described below.
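This context sensitivity amounts to a lookup from the current activity (and band) to a heading and item list. In the sketch below, the "EDIT"/"SHARE" context mirrors the picture example described later with reference to FIGS. 9-11, while the music-player context and all of its items are invented for illustration.

```python
# Context-sensitive menu model: the heading and items shown in each band
# depend on the current activity on the device. The "viewing_picture"
# entries follow the patent's picture example; "playing_music" and its
# items are hypothetical.
MENU_CONTEXTS = {
    "viewing_picture": {
        "inner": ("EDIT", ["Paint", "Copy", "Crop", "Effects", "Text", "Save"]),
        "outer": ("SHARE", ["Email", "Text", "IM", "Facebook", "Twitter", "Blog"]),
    },
    "playing_music": {
        "inner": ("PLAYBACK", ["Play", "Pause", "Next", "Previous"]),
        "outer": ("LIBRARY", ["Playlists", "Albums", "Artists"]),
    },
}

def band_items(context, band):
    """Return the (heading, items) pair for a band in the given context."""
    return MENU_CONTEXTS[context][band]

heading, items = band_items("viewing_picture", "inner")
print(heading, items)  # EDIT ['Paint', 'Copy', 'Crop', 'Effects', 'Text', 'Save']
```

A band whose items depend on a prior selection in another band, as described below for outer band 701, could be modeled the same way, keyed on the previously selected item rather than on the activity.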
- Menu 700 may be activated or initiated by touching center touch area 105 of touch panel 103.
- In response, the handheld device displays menu 700.
- Alternatively, menu 700 might be activated by touching any portion of touch panel 103, or by some other means, such as by interaction with front-surface elements of handheld device 100.
- Upon initially displaying menu structure 700, individual menu items may or may not be displayed. For example, upon initial display, each menu band may indicate only its group heading or title, while the individual menu items remain hidden.
- The user may touch one of the touch bands to activate or reveal the menu items within that band.
- For example, the user may touch inner band 702, which causes device 100 to activate that band and to display or reveal its individual menu items.
- Activating a particular band might result in that band being highlighted in some manner, such as by an animation, bold text, or distinguishing shades or colors.
- Activation or selection of a band might also be indicated by enlarging that band on displayed menu 700 in relation to other, non-activated bands.
- Another band might be activated by touching it, or by selecting an item from a first band.
- Outer band 701 may contain items that depend on a previous selection made from the items of inner band 702.
- For example, touching or selecting an item within inner band 702 may activate outer band 701, and outer band 701 might in this scenario contain items or commands related to the menu item selected from inner band 702.
- Selection of a band or menu item may be made by touching and releasing the corresponding location on touch panel 103 .
- Alternatively, selection may be made by touching at one location, sliding to another location, and releasing.
- For example, menu structure 700 may be implemented such that touching center area 703 opens menu structure 700, and sliding to inner band 702 allows the user to move to a menu item on inner band 702. Releasing over a particular menu item might select or activate that menu item.
- Selection within menu structure 700, or within a band of menu structure 700, may be accompanied by a highlight indicating the location of the user's finger at any time within the menu structure. For example, touching a location on touch panel 103 corresponding to ITEM A1 may cause ITEM A1 to become bold or otherwise highlighted. Furthermore, any area that is currently being touched can be made to glow on display panel 501, or some similar visual mechanism can be used to indicate finger placement and movement on menu structure 700. Thus, a user might touch a menu band, move his or her finger along the band until the desired menu item is highlighted, and then release the touch, thereby activating the menu item that was highlighted upon release.
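The touch-slide-release behavior described above is essentially a small state machine: touching highlights the item under the finger, sliding moves the highlight, and releasing activates whichever item was highlighted. The event names, the normalized band position, and the evenly spaced item layout below are assumptions for illustration.

```python
class BandMenu:
    """Minimal sketch of one menu band's touch-slide-release gesture."""

    def __init__(self, items):
        self.items = items          # item names in band order
        self.highlighted = None     # index currently under the finger
        self.activated = None       # item chosen on release

    def _item_at(self, position):
        # Map a normalized position (0.0-1.0 along the band) to an item
        # index, assuming evenly spaced items.
        return min(int(position * len(self.items)), len(self.items) - 1)

    def touch(self, position):
        # Finger lands on the band: highlight the item beneath it.
        self.highlighted = self._item_at(position)

    def slide(self, position):
        # Finger slides along the band: move the highlight with it.
        if self.highlighted is not None:
            self.highlighted = self._item_at(position)

    def release(self):
        # Finger lifts: activate whichever item was highlighted.
        if self.highlighted is not None:
            self.activated = self.items[self.highlighted]
            self.highlighted = None
        return self.activated

menu = BandMenu(["Paint", "Copy", "Crop", "Effects", "Text", "Save"])
menu.touch(0.1)        # finger lands near "Paint"
menu.slide(0.45)       # slides along the band toward "Crop"
print(menu.release())  # Crop
```

A display layer would redraw the highlighted item (bold text, glow, enlargement) whenever `highlighted` changes, giving the visual confirmation of finger movement described above.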
- For simplicity, touch panel 103 will not be explicitly shown in the figures accompanying this discussion. It is assumed that in the examples described, touch panel 103 lies directly behind the illustrated graphical menus, and that the touch bands of the touch panel have shapes and sizes that correspond at least roughly with the menu bands of the displayed graphical menus. User interactions with the touch panel will be described with reference to corresponding points on the displayed graphical menus.
- FIGS. 8-11 illustrate how the elements and techniques described above might be used to edit and share a picture that is stored on a handheld device such as a cellular telecommunications device.
- In FIG. 8, handheld device 100 is displaying a photograph 801 on its display surface 501.
- Touch panel 103 is represented in dashed lines to indicate its location relative to display panel 501.
- A menu is not displayed in FIG. 8.
- FIG. 9 shows a menu 901 that is displayed on display panel 501 in response to a user touching center area 105 of touch panel 103 .
- This menu is configured to allow a user to perform various operations with respect to the displayed picture 801 .
- The object of these operations, picture 801, is displayed or represented within center area 703.
- Inner band 702 is configured to correspond to various editing operations that can be performed on picture 801 , and has a band heading 901 that reads “EDIT”.
- Outer band 701 is configured to correspond to various communications options that can be performed in conjunction with picture 801 , and has a band heading 902 that reads “SHARE”.
- A user can touch anywhere in inner band 702 to activate or reveal the menu items of that band.
- Similarly, a user can touch anywhere in outer band 701 to activate or reveal the menu items of that band.
- FIG. 10 shows the result of a user touching inner band 702 .
- When a band is touched, it is activated or highlighted.
- In this example, an activated band is enlarged and its menu items are revealed.
- Menu items 1001 of inner band 702 comprise “Paint”, “Copy”, “Crop”, “Effects”, “Text”, and “Save”.
- the user can move his or her finger around inner band 702 until it is positioned corresponding to a desired menu item.
- the location at which the user is touching the band will be highlighted or somehow indicated on display 501 so that finger movement can be visually confirmed.
- when the finger is at the desired menu item, the user releases the finger touch and the menu item is selected or activated.
- the user wants to crop the displayed picture 801 .
- the user first touches and releases center area 703 to activate menu 700 .
- the user then touches inner band 702 , which reveals menu items 1001 relating to editing actions.
- the user moves his or her finger until touching the menu item “Crop”, and releases.
- This causes device 100 to display an on-screen tool for cropping picture 801 .
- picture 801 may be again displayed in full size on front display panel 501 , as in FIG. 8 , and a moveable rectangle may be shown for the user to position in the desired cropping location.
- the user may drag the displayed rectangle by pressing and dragging on display panel 501 to achieve the desired positioning of the rectangle, and the desired cropping of picture 801 .
- FIG. 11 shows a subsequent operation that may be performed on the cropped picture 801 .
- the cropped picture 801 is displayed in center area 703 as the object of a proposed action.
- Menu 700 may reappear after the cropping operation, or may be reactivated by the user again touching center area 703 .
- the user has touched the outer band 701 to reveal the menu items 1101 of that band, which relate to different communications options that are available with regard to the targeted picture. These options include “Email”, “Text”, “IM”, “Facebook”, “Twitter”, and “Blog”. These menu items correspond to actions that device 100 or an application program within device 100 will initiate upon selection of the menu items. Notice that in this example, as with FIG. 10 , the activated menu band is enlarged to indicate that it is active. Enlarging the active menu band also allows its menu items to occupy more screen space and therefore make them more visible to the user.
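The band-and-item selection behavior walked through in FIGS. 8-11 can be sketched in code. The following is a minimal model assuming, hypothetically, that each band's items occupy equal angular sectors around the ring; the class and method names are illustrative and not part of the disclosure.

```python
class MenuBand:
    """Sketch of one menu band: a heading plus items laid out around a ring."""

    def __init__(self, heading, items):
        self.heading = heading
        self.items = items

    def item_at_angle(self, angle_deg):
        """Map an angular finger position (degrees) to the item touched,
        assuming items occupy equal angular sectors around the band."""
        sector = 360.0 / len(self.items)
        return self.items[int(angle_deg % 360.0 // sector)]

# Bands configured as in the picture-editing example of FIGS. 9-11.
edit_band = MenuBand("EDIT", ["Paint", "Copy", "Crop", "Effects", "Text", "Save"])
share_band = MenuBand("SHARE", ["Email", "Text", "IM", "Facebook", "Twitter", "Blog"])

print(edit_band.item_at_angle(120))  # a release at 120 degrees selects "Crop"
```

Releasing the touch while over an item's sector would then trigger the selection described above.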
- FIGS. 12-15 illustrate how the elements and techniques described might be used to select and interact with different contacts, using a menu structure 1200 that is displayed on handheld device 100 .
- Example menu 1200 uses three levels of menu bands and corresponding touch bands: an outer band 1201 , a middle band 1202 , and an inner band 1203 . These bands surround a center area 1204 .
- FIG. 13 shows the menu items 1301 revealed upon activating inner band 1203 .
- inner band 1203 contains menu items corresponding to contacts that the user has designated as belonging to a particular group. It contains a group heading or label 1302 , which in this example reads “FAMILY”, indicating that the contacts within this band are part of the “FAMILY” contact group.
- the menu items include “Mom”, “Dad”, “Aric”, “Janelle”, “Grandma”, and “Jim”. A user can touch or select any one of these menu items to select the corresponding contact.
- FIG. 14 shows menu items 1401 that are revealed upon activating middle band 1202 . These menu items relate to activities that can be performed with respect to a contact that has been selected from inner band 1203 .
- Middle band 1202 has a group heading or label 1402 , which in this example reads “COMM”, indicating that the band contains communications options.
- “Jim” has been previously selected from inner band 1203 and is displayed in center area 1204 as the object of any selected operations.
- the menu items and corresponding operations include “eMail”, “Text”, “Call”, “Chat”, and “Twitter”.
- the available menu items might vary depending on the information available for the selected contact. For example, some contacts might only include a telephone number, and communications options might therefore be limited to texting and calling. Other contacts might include other information such as Chat IDs, and a “Chat” activity might therefore be available for these contacts.
- the menu items available in this band are sensitive to the menu context selected in previous interactions with menu 1200 .
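The context sensitivity just described, where communications options depend on the information stored for the selected contact, might be modeled as below. The field names and option labels are assumptions for illustration only.

```python
# Sketch: build the "COMM" band's menu items from whatever contact
# information is available, as described for middle band 1202.
def comm_options(contact):
    options = []
    if "email" in contact:
        options.append("eMail")
    if "phone" in contact:          # a phone number enables texting and calling
        options += ["Text", "Call"]
    if "chat_id" in contact:        # a Chat ID enables the "Chat" activity
        options.append("Chat")
    if "twitter" in contact:
        options.append("Twitter")
    return options

jim = {"phone": "555-0100", "chat_id": "jim42"}
print(comm_options(jim))  # ['Text', 'Call', 'Chat']
```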
- FIG. 15 shows menu items 1501 that are revealed upon activating outer band 1201 .
- Outer band 1201 contains menu items corresponding to different contact groups that a user has defined, and contains a group heading or title 1502 that reads “GROUPS”.
- these contact groups include “Family”, “Office”, “Friends”, and “Favorites”. Selecting one of these groups changes the context of menu 1200 . In particular, it changes the contact group that is shown within inner band 1203 . After selecting “Office” from outer band 1201 , for example, the label 1302 of inner band 1203 will change to “OFFICE”, and the listed menu items 1301 within inner band 1203 will change to those that the user has included in the “Office” group.
- the described menu structure might be used as an application launcher, with different types of applications being organized within different menu bands. End-users may be given the ability to organize applications within menu bands in accordance with personal preferences.
- the described menu structure might also be used as a general context menu, presenting operations such as copy, paste, delete, add bookmark, refresh, etc., depending on operations that might be appropriate at a particular time when the menu structure is opened. Again, different types of operations might be presented in different menu bands, such as “edit” operations in an inner band and “sharing” operations in an outward band.
- support for the menu structure can be provided through an application programming interface (API) and corresponding software development kit (SDK) to allow the menu functionality to be used and customized by various application programs.
- the operating system of the handheld device can expose APIs allowing application programs to register certain activities and actions that might be performed with respect to certain types of objects, or in certain contexts. Registering in this manner would result in the indicated activities or actions being included in the contextual menus described above.
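The registration mechanism described above might look roughly like the following sketch, in which applications register actions against object types and the contextual menu for an object is assembled from the registered actions. All class, method, and type names here are hypothetical.

```python
# Sketch of an OS-level registry for contextual menu actions, as described
# above: applications register (label, callback) pairs for object types.
class MenuRegistry:
    def __init__(self):
        self._actions = {}  # object type -> list of (label, callback)

    def register_action(self, object_type, label, callback):
        self._actions.setdefault(object_type, []).append((label, callback))

    def menu_items_for(self, object_type):
        """Labels to show in the contextual menu for this object type."""
        return [label for label, _ in self._actions.get(object_type, [])]

registry = MenuRegistry()
registry.register_action("picture", "Crop", lambda obj: None)
registry.register_action("picture", "Email", lambda obj: None)
print(registry.menu_items_for("picture"))  # ['Crop', 'Email']
```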
- FIG. 16 illustrates the above user interface techniques in simplified flowchart form.
- An action 1601 comprises displaying a menu on a front-facing display of a handheld device.
- the menu may have visually-delineated menu areas or bands corresponding in shape and position to the nested or hierarchical touch bands of a rear-facing touch sensor of the handheld device.
- An action 1602 comprises displaying menu items in the menu bands. As already described, each menu item corresponds to a position on the rear-facing touch sensor of the handheld device.
- An action 1603 comprises navigating among the menu bands and menu items in response to rear touch sensor input.
- Action 1604 comprises selecting a particular one of the menu items in response to the user touching its corresponding position on the rear-facing touch sensor.
- some of the user interactions might be performed by touching the display itself at the desired menu location, as an alternative to touching the corresponding location on the rear touch panel.
- Some embodiments may allow the user to touch either the front displayed menu or the corresponding rear touch panel, at the user's discretion.
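Actions 1601-1604 of FIG. 16 can be sketched as a simple event loop. The event structure, display interface, and equal-sector item layout below are illustrative assumptions, not details specified by the disclosure.

```python
from collections import namedtuple

# Hypothetical touch event from the rear-facing sensor: which band was
# touched and at what angular position (degrees).
TouchEvent = namedtuple("TouchEvent", "kind band angle")

def run_menu(display, events, bands):
    """Sketch of actions 1601-1604: show the menu, navigate on finger
    movement, and select the highlighted item when the touch is released.

    bands maps a band name to its list of menu item labels; items are
    assumed to occupy equal angular sectors around the band.
    """
    display.show_menu(bands)                        # actions 1601 and 1602
    current = None
    for ev in events:
        if ev.kind == "move":                       # action 1603: navigate
            items = bands[ev.band]
            index = int(ev.angle % 360 // (360 / len(items)))
            current = items[index]
            display.highlight(ev.band, current)
        elif ev.kind == "release" and current:      # action 1604: select
            return current
    return None
```

A caller would supply a display object implementing `show_menu` and `highlight`, plus the stream of touch events from the sensor.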
- FIG. 17 shows relevant components of an exemplary device 1700 implementing the features described herein.
- Device 1700 of FIG. 17 comprises one or more processors 1701 and memory 1702 .
- Memory 1702 is accessible and readable by processors 1701 and can store programs and operational logic for implementing the functionality described herein.
- memory 1702 can contain instructions that are executable by processors 1701 to perform and implement the described functionality.
- OS 1703 contains operational logic for basic device operation, while applications 1704 work in conjunction with OS 1703 to implement additional, higher-level functionality.
- Applications 1704 may in many embodiments be installed by device manufacturers, resellers, retailers, or end-users. In other embodiments, the OS and applications may be built into the device at manufacture. Furthermore, some implementations may be specially programmed with a dedicated program or set of instructions, and may or may not have separate operating system and application layers.
- memory 1702 may include internal device memory as well as other memory that may be removable or installable.
- Internal memory may include different types of machine-readable media, such as electronic memory, flash memory, and/or magnetic memory, and may include both volatile and non-volatile memory.
- External memory may similarly be of different machine-readable types, including rotatable magnetic media, flash storage media, so-called “memory sticks,” external hard drives, network-accessible storage, etc. Both applications and operating systems may be distributed on such external memory and installed from there. Applications and operating systems may also be installed and/or updated from remote sources that are accessed using wireless means, such as WiFi, cellular telecommunications technology, and so forth.
- device 1700 may have a display 1705 and a touch panel 1706 , the characteristics of which are described herein.
- OS 1703 and/or applications 1704 interact with display 1705 and touch panel 1706 to implement the user interface behaviors and techniques described above.
- device 1700 might have an application programming interface (API) 1707 that exposes the functionality of display 1705 and touch panel 1706 to applications through high-level function calls, allowing third-party applications to utilize the described functionality without interacting with device components at a low level.
- API 1707 may include function calls for performing the actions described with reference to FIG. 16 .
- API 1707 may allow application programs to register certain functions or actions, along with potential objects of those functions or actions, allowing the handheld device to include those functions and activities as menu items in appropriate contexts.
- Device 1700 may also include other input/output elements 1708 , such as different types of displays, touch-sensitive display panels, controls, buttons, lenses, image capture devices, storage devices, microphones, etc.
- FIG. 18 shows an embodiment in which a multi-region or hierarchical touchpad is used in conjunction with a conventional computing device.
- the computing device is a personal desktop computer 1800 comprising a keyboard 1801 , a display 1802 , and a system controller or processor 1803 .
- Keyboard 1801 has traditional input mechanisms, including keys or buttons 1804 that are operable by a user to enter text on computer 1800 .
- the keyboard may also have a conventional touchpad 1805 that can be used in place of a computer mouse to control an on-screen pointer or cursor.
- touchpad 1806 is positioned apart from and independently of display 1802 , rather than directly behind or otherwise aligned with an on-screen menu 1807 .
- hierarchical touchpad 1806 can be used with a variety of different types of computing devices, such as laptop computers, netbook computers, tablet computers, mobile devices, gaming devices, cameras, input peripherals, special purpose computing devices, and so forth.
- a touchpad such as this can be implemented as a stand-alone accessory or integrated with different types of input devices. For example, it might be integrated with a mouse or digitizer pad.
- FIG. 19 shows hierarchical touchpad 1806 in more detail.
- hierarchical touchpad 1806 has multiple areas that are tactually delineated from each other so that a user can distinguish between the areas by touch.
- the areas comprise a plurality of hierarchically-arranged and tactually delineated touch-sensitive areas or bands.
- the touch-sensitive areas or bands are arranged concentrically, surrounding a central touch-sensitive area 1901 .
- An inner or primary touch-sensitive band 1902 has an annular or ring-like shape, and is immediately adjacent central touch-sensitive area 1901 .
- An outer or secondary touch-sensitive band 1903 also has an annular or ring-like shape. Outer touch-sensitive band 1903 surrounds and is immediately adjacent inner touch-sensitive band 1902 .
- the central area 1901 may be omitted in some embodiments, and the inner touch-sensitive band 1902 may comprise a circular area rather than an annular area.
- each of the touch-sensitive bands 1902 and 1903 has a different elevation or depth relative to a surface 1904 in which it is positioned. There are steps or discontinuous edges 1905 between the different elevations that provide tactile differentiation between bands, allowing a user to reliably locate a particular touch band by feel, without looking at touchpad 1806 .
- each successively inward band or area is stepped down in elevation from surface 1904 or from its outwardly neighboring band.
- outer band 1903 is stepped down from surface 1904 and therefore is deeper or has a lower elevation than surface 1904 .
- Inner band 1902 is stepped down from its outwardly neighboring band 1903 and is therefore deeper and has a lower elevation than outer band 1903 .
- central area 1901 is stepped down from surrounding inner band 1902 and is therefore deeper and has a lower elevation than inner band 1902 .
- the progressively and inwardly increasing depths of bands 1902 and 1903 and central area 1901 relative to surface 1904 create a concavity or depression relative to surface 1904 .
- the position and dimensions of touchpad 1806 can be chosen so that a user's finger naturally locates and rests within the concavity, such that it is comfortable to move the finger to different locations around touchpad 1806 .
- the touch-sensitive bands may each successively extend upward from the bordering larger band.
- outer band 1903 may be lower than inner band 1902 , which in turn may be lower than central area 1901 , thus forming a convex arrangement.
- the respective bands may all share the same level, but may be tactually detectable by virtue of a raised border between them.
- for simplicity, the remaining discussion will address only the concave arrangement shown in FIG. 19 .
- Touchpad 1806 is sensitive to touch, and can detect the particular location at which it is touched or pressed. Thus, it can detect which individual band is touched, and the position or coordinates along the band of the touched location. A user can slide his or her finger radially between bands or around a single band, and touch panel 1806 can detect the movement and absolute placement of the finger as it moves along or over the bands.
- Central area 1901 is also sensitive to touch in some embodiments.
- Touchpad 1806 can be implemented using capacitive, resistive, or pressure sensing technology, or using other technologies that can detect a user's finger placement. Touchpad 1806 may also integrate additional sensors, such as sensors that detect the pressing or depression of central area 1901 or other areas of touchpad 1806 . In addition to being sensitive to touching with a finger, the touch-sensitive bands can also be sensitive to touching by a stylus or other object.
- Different embodiments may utilize different numbers of bands; for example, a single band or three bands may be used. Furthermore, the bands may be shaped and positioned differently than illustrated here.
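The detection capability just described, resolving a raw touch point into a particular band and a position along that band, can be sketched as follows. The concentric radii used here (central area inside radius 10, inner band from 10 to 20, outer band from 20 to 30, in arbitrary units) are illustrative assumptions; actual geometry would be device-specific.

```python
import math

# Sketch: classify a raw (x, y) touch coordinate into one of the concentric
# regions of touchpad 1806 plus an angular position along the band.
def classify_touch(x, y, cx=0.0, cy=0.0):
    r = math.hypot(x - cx, y - cy)                       # radial distance
    angle = math.degrees(math.atan2(y - cy, x - cx)) % 360  # 0-360 degrees
    if r < 10:
        return ("center", angle)   # central area 1901
    elif r < 20:
        return ("inner", angle)    # inner band 1902
    elif r < 30:
        return ("outer", angle)    # outer band 1903
    return (None, angle)           # touch outside the touchpad

print(classify_touch(15, 0))  # ('inner', 0.0)
```

Tracking successive classifications as the finger moves gives the sliding and sweeping behavior described above.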
- touchpad 1806 can be used with an on-screen menu having a graphical appearance that is similar to that of the touchpad itself. Referring back to FIG. 18 , an example of such an on-screen menu 1807 is shown on display 1802 .
- FIG. 20 shows on-screen menu 1807 in more detail.
- On-screen menu 1807 comprises a first-level menu 2001 and a second-level menu 2002 .
- First-level menu 2001 corresponds to inner touch-sensitive band 1902 and has a shape similar to that of inner touch-sensitive band 1902 .
- Second-level menu 2002 corresponds to outer touch-sensitive band 1903 and has a shape similar to that of outer touch-sensitive band 1903 .
- First-level menu 2001 and second-level menu 2002 are also arranged relative to each other similar to the arrangement of the bands of touchpad 1806 , with second-level menu 2002 surrounding and immediately adjacent first-level menu 2001 .
- Each of menus 2001 and 2002 comprises an annular or ring-shaped area within which menu items or choices can be displayed.
- FIG. 21 illustrates a usage embodiment in which there is a one-to-one correspondence between positions of inner touch-sensitive band 1902 and positions of first-level menu 2001 .
- FIG. 21 shows a graphical window or pane 2100 upon which on-screen menu 1807 is displayed.
- first-level menu 2001 is shown as having four menu choices: File, Edit, View, and Help. These choices are distributed around first-level menu 2001 at approximately equal intervals, at the left, top, right, and bottom of first-level menu 2001 , respectively.
- Dashed arrows indicate, for each of these choices, corresponding locations on inner touch-sensitive band 1902 of touchpad 1806 .
- the position of a particular menu choice on inner touch-sensitive band 1902 is assumed to be the same as the choice's position on first-level menu 2001 .
- the “File” choice is displayed at the left of first-level menu 2001 , and is assumed to also be located at the left of inner touch-sensitive band 1902 . Touching the left of the first-level menu is equivalent to selecting or “touching” the “File” menu item.
- the “Edit” choice is displayed at the top of first-level menu 2001 , and is assumed to also be located at the top of inner touch-sensitive band 1902 . Touching the top of the first-level menu is equivalent to selecting or “touching” the “Edit” menu item.
- the “View” and “Help” choices may be selected by touching their corresponding locations on touchpad 1806 .
- a cursor or pointer 2101 can be used in some embodiments to indicate on on-screen menu 1807 the current position of a finger on touchpad 1806 .
- the cursor or pointer can be a graphical arrow, dot, highlight, or other type of graphical delineation.
- a finger 2102 is shown touching a point on touch-sensitive band 1902 , and the corresponding location of on-screen menu 1807 is indicated by circular cursor 2101 .
- Cursor 2101 provides graphical feedback to the user as the user moves his or her finger to various locations around touch-sensitive bands 1902 and 1903 .
- touching touchpad 1806 may cause menu 1807 to appear on window 2100 , along with cursor 2101 that indicates where the touch is occurring.
- cursor 2101 follows its movement in the corresponding areas of menu 1807 .
- the user may visually watch cursor 2101 to verify finger placement, and to guide cursor 2101 over a desired menu choice. For example, the user may use their finger to guide cursor 2101 over the “Edit” choice of first-level menu 2001 . Releasing or removing the touch contact with touchpad 1806 , while the cursor is over a particular menu choice, results in the selection of that menu choice.
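The cursor feedback of FIG. 21, where the finger's angular position on a touch band is echoed at the matching point on the on-screen ring menu, can be sketched as a simple coordinate mapping. The menu center and ring radius below are assumed screen values for illustration.

```python
import math

# Sketch: map an angular finger position on a touch band to the matching
# screen point on the on-screen ring menu, where cursor 2101 is drawn.
def cursor_position(angle_deg, menu_cx=400, menu_cy=300, ring_radius=80):
    theta = math.radians(angle_deg)
    return (menu_cx + ring_radius * math.cos(theta),
            menu_cy - ring_radius * math.sin(theta))  # screen y grows downward

print(cursor_position(90))  # (400.0, 220.0): the top of the ring
```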
- FIG. 22 illustrates the result of selecting the “Edit” choice from first-level menu 2001 .
- second-level menu 2002 is displayed with second-level choices that depend on the selected first-level choice.
- the second-level choices comprise “Undo”, “Copy”, “Cut”, and “Paste”. These choices are distributed around second-level menu 2002 at approximately equal intervals, at the left, top, right, and bottom of second-level menu 2002 , respectively.
- Dashed arrows indicate, for each of these choices, corresponding locations on outer touch-sensitive band 1903 of touchpad 1806 .
- the position of a particular menu choice on outer touch-sensitive band 1903 is assumed to be the same as the choice's position on second-level menu 2002 .
- the “Undo” choice is displayed at the left of second-level menu 2002 , and is assumed to also be located at the left of outer touch-sensitive band 1903 .
- Touching the left of the second-level menu is equivalent to selecting or “touching” the “Undo” menu item.
- the “Copy” choice is displayed at the top of second-level menu 2002 , and is assumed to also be located at the top of outer touch-sensitive band 1903 . Touching the top of second-level menu 2002 is equivalent to selecting or “touching” the “Copy” menu item.
- the “Cut” and “Paste” choices may be selected by touching their corresponding locations on touchpad 1806 .
- Cursor 2101 can be used to indicate the current position of finger 2102 on touchpad 1806 as described above.
- finger 2102 is shown touching a point on outer touch-sensitive band 1903 , and the corresponding location of on-screen menu 1807 is indicated by cursor 2101 .
- a particular menu choice can be selected by moving the cursor over it and then releasing touchpad 1806 .
- FIG. 23 shows another embodiment, having an on-screen menu that does not correspond in size or shape to the hierarchical touchpad.
- FIG. 23 shows a graphical window or pane 2300 upon which a first-level menu 2301 is displayed.
- First-level menu 2301 is shown as having four menu choices: File, Edit, View, and Help. These choices are arranged horizontally and linearly, from left to right, along the top of window 2300 .
- Dashed arrows indicate, for each of these choices, corresponding locations on inner touch-sensitive band 1902 of touchpad 1806 .
- the menu choices are arranged on inner touch-sensitive band 1902 in the same sequence as their presentation within first-level menu 2301 .
- the “File” choice corresponds to the left of inner touch-sensitive band 1902
- the “Edit” choice corresponds to the top of inner touch-sensitive band 1902
- the “View” choice corresponds to the right of inner touch-sensitive band 1902
- the “Help” choice corresponds to the bottom of inner touch-sensitive band 1902 .
- touching the left of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “File” menu item.
- Touching the top of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “Edit” menu item.
- the “View” and “Help” choices may be selected by touching their corresponding locations on touchpad 1806 .
- cursor or pointer can be used as in previous embodiments to show the current selection.
- cursor functionality is provided by underlining any menu choice that the user is currently “touching.”
- the user's finger is assumed to be touching the location on touchpad 1806 corresponding to the “Edit” choice, and the “Edit” choice is therefore underlined.
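The FIG. 23 mapping, where positions around the circular band correspond in sequence to a linearly arranged menu, might be implemented by snapping the touched angle to the nearest designated band position. The angle convention below (right = 0 degrees, counter-clockwise positive, so left = 180, top = 90, bottom = 270) is an illustrative assumption.

```python
# Sketch: map the left/top/right/bottom positions of inner band 1902, in
# order, to the linearly displayed first-level menu choices of FIG. 23.
FIRST_LEVEL = ["File", "Edit", "View", "Help"]
BAND_ANGLES = [180, 90, 0, 270]  # left, top, right, bottom (assumed)

def choice_for_angle(angle_deg, choices=FIRST_LEVEL):
    def angular_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)   # shortest way around the circle
    nearest = min(range(len(choices)),
                  key=lambda i: angular_dist(angle_deg, BAND_ANGLES[i]))
    return choices[nearest]

print(choice_for_angle(85))  # "Edit": 85 degrees is nearest the top
```

The underlining described above would then track whichever choice `choice_for_angle` currently returns.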
- FIG. 24 shows the result of selecting the “Edit” choice from first-level menu 2301 .
- second-level menu 2401 is displayed with second-level choices that are selectable by touching the secondary touch-sensitive area 1903 of touchpad 1806 .
- the second-level choices comprise Undo, Copy, Cut, and Paste. These choices are arranged vertically and linearly, beneath the selected first-level menu choice.
- Dashed arrows indicate, for each of these choices, corresponding locations on outer touch-sensitive band 1903 of touchpad 1806 .
- the menu choices are arranged on outer touch-sensitive band 1903 in the same sequence as their presentation within second-level menu 2401 .
- the “Undo” choice corresponds to the left of outer touch-sensitive band 1903
- the “Copy” choice corresponds to the top of outer touch-sensitive band 1903
- the “Cut” choice corresponds to the right of outer touch-sensitive band 1903
- the “Paste” choice corresponds to the bottom of outer touch-sensitive band 1903 .
- touching the left of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Undo” menu item.
- Touching the top of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Copy” menu item.
- the “Cut” and “Paste” choices may be selected by touching their corresponding locations on touchpad 1806 .
- Underlining is again used to indicate a current selection, which in this case is “Cut”.
- a particular menu choice can be selected by moving the underlining to the appropriate choice and then releasing the touching of touchpad 1806 .
- FIG. 25 illustrates yet another usage scenario for a hierarchical touchpad.
- inner touch-sensitive band 1902 corresponds to menu choices within a vertical, linear first-level menu 2501 .
- First-level menu 2501 has the menu choices “Start Date”, “Start Time”, “End Date”, and “End Time”, along with corresponding values for those menu choices.
- the value of a particular menu choice can be selected or changed by selecting that menu choice from first-level menu 2501 .
- the “Start Date” choice can be selected by touching the left of inner touch-sensitive band 1902 .
- the “Start Time” choice can be selected by touching the top of inner touch-sensitive band 1902 .
- the “End Date” choice can be selected by touching the right of inner touch-sensitive band 1902 .
- the “End Time” choice can be selected by touching the bottom of inner touch-sensitive band 1902 .
- the correspondence between menu choices and positions on inner touch-sensitive band 1902 is indicated by dashed arrows.
- FIG. 26 shows the result of selecting the “End Date” choice. Selecting any choice from the first-level menu 2501 causes a second-level menu 2601 to open, containing menu choices that vary depending on the selected first-level menu choice.
- second-level menu 2601 is a scrollable window having second-level menu choices that scroll vertically in response to sweeping outer touch-sensitive band 1903 in a circular motion.
- a blocked or highlighted line 2602 indicates the current selection: June 16. Sweeping a finger around outer touch-sensitive band 1903 in a clockwise direction scrolls in one direction. Sweeping a finger around outer touch-sensitive band 1903 in a counter-clockwise direction scrolls in the other direction.
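The sweep-to-scroll behavior of FIG. 26 can be sketched by accumulating angular motion and converting it into row movements. The degrees-per-row ratio below is an assumed tuning parameter, and the sign convention (positive delta = clockwise) is illustrative.

```python
# Sketch: sweeping outer band 1903 in a circular motion scrolls the
# highlighted selection through a list, as in second-level menu 2601.
class SweepScroller:
    DEGREES_PER_ROW = 30.0  # assumed: one row per 30 degrees of sweep

    def __init__(self, items, selected=0):
        self.items = items
        self.selected = selected
        self._accum = 0.0

    def sweep(self, delta_deg):
        """Positive delta = clockwise; scrolls one way, negative the other."""
        self._accum += delta_deg
        rows = int(self._accum / self.DEGREES_PER_ROW)
        if rows:
            self._accum -= rows * self.DEGREES_PER_ROW
            self.selected = max(0, min(len(self.items) - 1, self.selected + rows))
        return self.items[self.selected]

dates = ["June 14", "June 15", "June 16", "June 17", "June 18"]
scroller = SweepScroller(dates, selected=2)   # "June 16" highlighted
print(scroller.sweep(65))   # two rows clockwise -> "June 18"
print(scroller.sweep(-35))  # one row back      -> "June 17"
```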
- FIG. 27 shows a generalized procedure 2700 to implement the usage scenarios described above, in conjunction with a hierarchical touchpad such as touchpad 1806 .
- Procedure 2700 is described in terms of actions or steps that can be implemented by programs or instruction sequences that are stored in memory and executed by a processor, for example by processor 1701 of FIG. 17 or system controller 1803 of FIG. 18 . Other types of operational logic might also be used to implement the described procedure.
- An action 2701 comprises displaying first-level choices in a first-level menu, wherein the first-level choices are selectable by touching corresponding locations of a primary band of a hierarchical touchpad such as described above.
- the first-level menu has a graphical shape like that of the primary band of the hierarchical touchpad.
- the first-level menu may be arranged and shaped differently than the primary band of the hierarchical touchpad.
- An action 2702 comprises accepting user selection of a first-level menu choice.
- a particular choice may be selected by touching the corresponding position on the primary band of the hierarchical touchpad, or by touching and releasing the corresponding position.
- An action 2703 , performed in response to the user selecting a first-level choice from the first-level menu, comprises displaying second-level menu choices in a second-level menu, wherein the second-level choices are selectable by touching a secondary band of the hierarchical touchpad.
- the second-level menu may be displayed initially, upon initial display of the first-level menu.
- the second-level menu may be initially hidden, and may be made visible only upon selection of a particular choice from the first-level menu.
- the second-level choices of the second-level menu may vary depending on the selected first-level choice.
- the second-level menu has a graphical shape like that of the secondary band of the hierarchical touchpad.
- the second-level menu may be arranged and shaped differently than the secondary band of the hierarchical touchpad.
- An action 2704 comprises accepting user selection of a second-level menu choice.
- a particular second-level choice may be selected by touching a corresponding position on the secondary band of the hierarchical touchpad, or by touching and releasing the corresponding position.
- touching the secondary band of the hierarchical touchpad in a circular motion may scroll choices through a highlighted cursor, or may scroll a cursor through a list of choices.
- FIG. 27 also shows actions 2705 and 2706 , relating to a third menu level, which may be implemented in some embodiments.
- Action 2705 , performed in response to the user selecting a second-level choice from the second-level menu, comprises displaying third-level menu choices in a third-level menu, wherein the third-level choices are selectable by touching a third band of the hierarchical touchpad.
- the third-level menu may be displayed initially, upon initial display of the first-level and second-level menus.
- the third-level menu may be initially hidden, and may be made visible only upon selection of a particular choice from the second-level menu.
- the third-level choices of the third-level menu may vary depending on the selected second-level choice.
- the third-level menu has a graphical shape like that of a third band of the hierarchical touchpad.
- the third-level menu may be arranged and shaped differently than the third band of the hierarchical touchpad.
- An action 2706 comprises accepting user selection of a third-level menu choice.
- a particular third-level choice may be selected by touching a corresponding position on the third band of the hierarchical touchpad, or by touching and releasing the corresponding position.
- touching the third band of the hierarchical touchpad in a circular motion may scroll choices through a highlighted cursor, or may scroll a cursor through a list of choices.
- the hierarchical touchpad can be used in various different ways, with various different types of graphical menus that are not limited to the examples shown above.
- Nested, multi-level, or multi-region touchpads such as described above can be incorporated in many different types of devices, including computer input devices like keyboards, computer mice, handheld computers such as PDAs and tablet PCs, computerized devices such as cameras and media players, vehicle controls such as steering wheels, and so forth.
- FIG. 28 shows an example of a computer input device for use with a graphical user interface such as the graphical user interface discussed with reference to FIG. 18 .
- a device such as this allows a user to manipulate an element such as a cursor or pointer on the graphical user interface.
- Devices falling into this category include mice, trackballs, joysticks, gyroscopic controllers, and similar devices.
- these devices include at least one control mechanism that a user can move or interact with to control the position of an on-screen cursor, pointer, or tool, or to otherwise control or influence some element of the displayed graphical user interface.
- The input device of FIG. 28 comprises a computer mouse 2800 having a body 2801 designed to be grasped by the hand of a user and moved or slid across a flat surface 2802 .
- movement of the mouse over the flat surface produces a similar movement of a cursor or pointer across the associated graphical user interface.
- Movement of mouse 2800 is detected by a mechanical roller or other type of sensor in the bottom of mouse 2800 .
- Mouse 2800 can communicate with an associated computer device via an electrical cord or using common wireless communication technologies. Mice or mice-like devices can be implemented in various ways, using various shapes and motion-sensing technologies.
- Computer mouse 2800 may have buttons or other controls 2803 that interact with an associated computer device to perform various functions.
- computer mouse 2800 has a multi-region touchpad 2804 similar to the touchpad embodiments described above.
- Multi-region touchpad 2804 is positioned on the top of body 2801, between buttons 2803. It could alternatively be positioned on the side of body 2801, or in other convenient positions.
- multi-region touchpad 2804 can provide an additional way for a user to interact with a computer device.
- multi-region touchpad 2804 can be used in conjunction with a hierarchical menu, as described above, to perform operations with respect to an object or element indicated by the current position of a cursor.
- Multi-region touchpad 2804 can also be used to select a current tool and/or tool characteristic, such as selecting a type of paintbrush and a color for the paintbrush, or selecting a line type and width for a line-drawing tool.
- the locations around the touch bands of multi-region touchpad 2804 can be associated with particular menu choices, so that a user becomes accustomed to the locations of those choices over time.
- one of the touch bands may correspond to a continuous range of selections.
- the outer touch band may correspond to a continuous gamut of colors as on a color wheel.
- a combination of three touch bands might be used in this fashion to specify hue, saturation, and brightness, or “red”, “green”, and “blue” components of a color.
- Many different control schemes might be implemented with touchpad 2804 and mouse 2800 , not limited to the examples described here. Furthermore, different applications might implement different control schemes, even when running on the same computer and utilizing the same mouse 2800 .
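To make the continuous-band idea concrete, the fractional position of the finger along each of three bands could be treated as hue, saturation, and brightness and converted to an RGB color. This Python sketch is purely illustrative; the function name and the 0.0-to-1.0 position convention are assumptions, not details of the device described here:

```python
import colorsys

def band_color(hue_pos, sat_pos, val_pos):
    """Map fractional positions (0.0 to 1.0) along three touch bands,
    treated as hue, saturation, and brightness controls, to an
    8-bit RGB color."""
    r, g, b = colorsys.hsv_to_rgb(hue_pos, sat_pos, val_pos)
    return tuple(round(c * 255) for c in (r, g, b))

# Touching the hue band at its start, with the other two bands at
# full scale, yields pure red.
print(band_color(0.0, 1.0, 1.0))  # (255, 0, 0)
```

Sliding a finger around the hue band would then sweep continuously through the color wheel, matching the color-gamut example above.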
- FIG. 29 shows another example of a computer input device.
- the computer input device comprises a digitizer pad 2900 and an associated stylus 2901 .
- a digitizer pad is commonly used in conjunction with a graphical user interface to control placement and operation of an on-screen tool, cursor, pointer, or other element.
- Digitizer pad 2900 has a two-dimensional surface 2902 with an active digitizer configured to sense placement of stylus 2901 when its tip 2903 touches surface 2902.
- Stylus 2901 is held in the hand of a user like a pencil, and can be moved across surface 2902. In many usage scenarios, movement of the stylus over surface 2902 produces a similar movement of a tool, cursor, or pointer across the associated graphical user interface.
- Digitizer pad 2900 can communicate with an associated computer device via an electrical cord or using common wireless communication technologies. Digitizer pads and similar devices can be implemented in various ways, using various configurations and stylus detection technologies. Also, some digitizer pads may sense finger placement on surface 2902, in addition or alternatively to stylus placement.
- Digitizer pad 2900 has a multi-region touchpad 2904 similar to the touchpad embodiments described above.
- Multi-region touchpad 2904 is positioned at a corner of digitizer pad 2900. It can be touched by the user's finger or by stylus 2901 to activate menu choices or selections. It could alternatively be positioned in other positions relative to surface 2902, or within surface 2902.
- multi-region touchpad 2904 can provide another way for a user to interact with a computer device.
- multi-region touchpad 2904 can be used in conjunction with menus, as described above, to perform operations with respect to an object or element indicated by the current position of a cursor.
- Multi-region touchpad 2904 can also be used to select a current tool and/or tool characteristic. This might be particularly useful in conjunction with a drafting or painting program, in which the stylus is often used as a brush or other drawing tool. In these situations, multi-region touchpad 2904 can be used in a manner similar to a paintwell or inkwell, for choosing current tools and tool characteristics.
- touchpad 2904 can make it easier for a user to touch the touchpad in a particular location without having to look directly at the touchpad.
- Although touchpad 2904 is shown as being circular, other shapes might be utilized to provide better tactile feedback and to allow easier identification of a “home” or default position.
- the touch bands of multi-region touchpad 2904 can be used in combination and in a hierarchically contextual manner to select and specify tools and characteristics. For example, a tool might be selected by touching a location on an inner touch band. This might result in a dependent set of options being available on an outer touch band. Thus, selecting a “pencil” tool using an inner touch band might result in “pencil width” options being made available through an outer touch band. As already described, currently available menu choices for any active touch band can be indicated via an on-screen menu structure, which in some cases can be shaped similarly to the touch band whose menu choices are being displayed.
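The dependent inner/outer relationship described above can be modeled as a simple lookup from the current inner-band selection to the set of options exposed on the outer band. The tool names and option lists below are hypothetical examples, not taken from this description:

```python
# Hypothetical dependent-menu table: choosing a tool on the inner
# band determines which options appear on the outer band.
TOOL_OPTIONS = {
    "pencil": ["fine", "medium", "broad"],        # pencil widths
    "brush":  ["watercolor", "oil", "airbrush"],  # brush types
    "eraser": ["small", "large"],
}

def outer_band_items(inner_selection):
    """Return the outer-band menu items that depend on the current
    inner-band selection, or an empty list if there are none."""
    return TOOL_OPTIONS.get(inner_selection, [])

print(outer_band_items("pencil"))  # ['fine', 'medium', 'broad']
```

Selecting "pencil" on the inner band would thus repopulate the outer band with width choices, mirroring the hierarchical behavior described above.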
- Users may be given the ability to configure multi-region touchpad 2904 themselves, so that frequently used commands or menu choices can be made easily available.
- one or more of the touch bands may correspond to continuous ranges of selections or choices, rather than to discrete menu choices.
- FIG. 30 shows another example of a device that can be used with a multi-region touchpad.
- This example is a tablet or other handheld computer 3000 having a touch-screen display 3001 .
- Some tablet computers may have functionality similar to a general-purpose desktop or laptop computer, but in a relatively thin form factor.
- Other tablet computers may have more limited functionality, and be designed for specific tasks.
- buttons or keys 3002 may be used for basic functionality.
- Touch-screen display 3001 may be sensitive to touch, using capacitive or resistive technology, to detect the touch of a finger. Alternatively, or in addition, touch-screen display 3001 may employ an active digitizer for use in conjunction with a stylus. Other embodiments may use other types of user input that do not necessarily include touch input.
- A hierarchical or multi-region touchpad 3003 may be positioned on the front of tablet computer 3000 to provide one means of user input. Multi-region touchpad 3003 can be used to perform various contextually-dependent operations in accordance with the different usage scenarios described above.
- A menu 3004, corresponding in shape to multi-region touchpad 3003, may be displayed on touch-screen display 3001 to indicate current choices available to a user via multi-region touchpad 3003.
- Multi-region touchpad 3003 can be receptive to finger touch, to stylus touch, or both. Although touchpad 3003 is shown as being circular, other shapes might be utilized to provide better tactile feedback and to allow easier identification of a “home” or default position.
- FIG. 31 shows another example: a remote control 3100 such as might be used to control a television, stereo system, or other system.
- a remote control such as this typically uses radio or infrared frequencies to communicate with different controlled devices and includes a number of physical buttons 3101 or other control elements that a user presses to designate different actions.
- some remote controls might include a small display, and some might include a touch-sensitive display that might replace many of buttons 3101 .
- a multi-region touchpad 3102 can be positioned on the operating surface of remote control 3100 for operation by a user's thumb or other finger.
- Multi-region touchpad 3102 can be used in conjunction with a menu structure that is displayed on a graphical or video component of the system being controlled, such as on a television screen.
- Multi-region touchpad 3102 might be used in conjunction with a touch-screen that is part of the remote control itself. Note also that touchpad 3102 can be placed on the rear of remote control 3100.
- FIG. 32 shows a game controller 3200 as yet another example of a device that can utilize a multi-region touchpad.
- a game controller typically operates in conjunction with dedicated video game consoles or other computerized gaming devices to interface with a user and to allow the user to control aspects of game play that are presented on some form of video display.
- the controller has a number of buttons or other control elements 3201 .
- a multi-region touchpad 3202 can be positioned at a convenient location on the top, side, or bottom of controller 3200 for use in accordance with the usage scenarios described above.
- Touchpad 3202 can be used in conjunction with menus displayed or overlaid on the video game display to select options and to allow quick, on-the-fly selection of various aspects of game play.
- an inner touch band might be used to allow the player to select from different items of inventory, while an outer touch band is used to allow the player to select different weapons.
- an inner touch band might be used to specify an action or verb to perform in the game, while a dependent outer touch band is used to specify items that might be available as the object of that action or verb.
- a combination of three touch bands might be used to indicate a tool, an action to perform with the tool, and an object of the action. More than one multi-region touchpad might be used on a single controller, allowing convenient access to a greater number of game functions.
- FIG. 33 shows a digital camera 3300 as an example of a device that can utilize a multi-region touchpad.
- Digital camera 3300 has a lens 3301 and an internal image sensor (not shown), as well as various mechanical, electrical, and optical controls that are used to capture still and/or moving images.
- the illustrated example also has an LCD (liquid-crystal display) 3302 or other flat-screen display that is used for previewing images, for displaying captured images, and for user interaction to set various operational parameters.
- A multi-region touchpad 3303 is positioned on the back of camera 3300, adjacent display 3302.
- Touchpad 3303 can be used in conjunction with display 3302 in accordance with many of the techniques described above to set operational parameters of the camera, including exposure parameters.
- an inner touch band of touchpad 3303 can be used to set exposure length, while an outer touch band is used to select aperture.
- Another touch band might be used to select equivalent film speed.
- Touchpad 3303 might be dedicated to controlling specific camera/exposure parameters, or might be used in a contextually dependent fashion to control many different parameters. Furthermore, touchpad 3303 might be used without an associated display, and might be positioned differently, such as on the front of camera 3300 .
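One way to realize such band-based exposure control is to snap the fractional touch position along a band to the nearest entry in a table of discrete stops. The stop tables below are hypothetical placeholders; the description does not specify particular exposure values:

```python
# Hypothetical stop tables; actual values are not given in the text.
SHUTTER = ["1/1000", "1/500", "1/250", "1/125", "1/60", "1/30"]
APERTURE = ["f/2.8", "f/4", "f/5.6", "f/8", "f/11", "f/16"]

def pick_stop(band_fraction, stops):
    """Snap a fractional position (0.0 to 1.0) along a touch band
    to the nearest discrete stop in the given table."""
    index = min(int(band_fraction * len(stops)), len(stops) - 1)
    return stops[index]

print(pick_stop(0.0, SHUTTER))   # 1/1000
print(pick_stop(0.99, APERTURE)) # f/16
```

An inner band could call this with the shutter table and an outer band with the aperture table, giving each band its own continuous-to-discrete mapping.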
- FIG. 34 shows another example of a control device that might incorporate a multi-region touchpad such as described herein.
- This example comprises an automotive steering wheel 3400 such as commonly used in automobiles and other vehicles.
- Steering wheel 3400 includes a hub 3401, one or more spokes 3402, and a rim 3403 that is grasped by a driver to rotate the steering wheel.
- A multi-region touchpad 3404 is positioned on hub 3401 or one of the spokes 3402, adjacent rim 3403, so that it can be accessed by the thumb or finger of the driver. Note that although touchpad 3404 is shown on the side of the steering wheel facing the driver, it might also be located on the side of the steering wheel facing away from the driver.
- the touchpad 3404 can also be located in different locations within the vehicle, such as on various other primary controls that are grasped by a driver to control different aspects of vehicle operation and navigation.
- the touchpad 3404 might be positioned on the top or end of a shift lever or some other physical vehicle control that is operated by a hand.
- Touchpad 3404 can be used with various vehicle control systems to set various operational parameters of a vehicle or to interact with on-board systems such as entertainment systems or route guidance systems. Menus can be presented on a so-called “heads-up” display, as transparent images on the vehicle windshield, within the driver's normal line of sight. This allows the driver to interact with vehicle subsystems without looking away from the road. Additionally, the position and tactile nature of touchpad 3404 allows the driver to navigate various menus without moving their hand from the steering wheel. Touch bands corresponding to different menu levels are easily identified by tactile differentiation, while positions within each level are easily located by their correspondence in shape with displayed menu structures.
Description
- This application is a continuation-in-part of and claims the benefit of U.S. patent application Ser. No. 12/788,239, filed on May 26, 2010, which is incorporated by reference herein in its entirety.
- Handheld devices have become more and more prevalent, in forms such as cellular phones, wireless phones, smartphones, music players, video players, netbooks, laptop computers, e-reading devices, tablet computers, cameras, controllers, remote controls, analytic devices, sensors, and many other types of devices.
- User interfaces for handheld devices have become increasingly sophisticated, and many user interfaces now include color bitmap displays. Furthermore, many user interfaces utilize touch sensitive color displays that can detect touching by a finger or stylus. There are many varieties of touch sensitive displays, including those using capacitive sensors, resistive sensors, and active digitizers. Some displays are limited to detecting only single touches, while others are capable of sensing multiple simultaneous touches.
- Touch sensitive displays are convenient in handheld devices because of the simplicity of their operation to the user. Menu items can be displayed and a user can interact directly with the menu items by touching or tapping them, without the need to position or manipulate an on-screen indicator such as a pointer, arrow, or cursor. Furthermore, the touch capabilities of the display reduce the need for additional hardware input devices such as buttons, knobs, switches, mice, pointing sticks, track pads, joysticks, and other types of input devices.
- One disadvantage of touch sensitive user interfaces, however, is that a user's finger can often obstruct the user's view of the display, and repeated touching of the display can result in fingerprints and smudges that obscure the display. Furthermore, it may be awkward in some devices for a user to both hold the device and to provide accurate touch input via the display, especially with one hand. Because of this, many devices are more awkward in operation than would be desirable.
- The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
- FIG. 1 is a rear perspective view of a handheld device utilizing a rear touch panel.
- FIG. 2 is a rear view of the handheld device of FIG. 1, showing a possible hand and finger placement relative to a rear touch panel.
- FIG. 3 is a rear view of the handheld device of FIG. 1, showing another possible hand and finger placement relative to the rear touch panel.
- FIG. 4 is a front perspective view of an alternative handheld device utilizing an edge touch panel.
- FIG. 5 is a front view of the handheld device of FIG. 1, showing an embodiment of a banded menu structure that can be used in conjunction with the rear touch panel shown in FIGS. 1 and 2.
- FIG. 6 is a front perspective view of the handheld device of FIG. 1, showing the relationship between its rear touch panel and the banded menu structure shown in FIG. 5.
- FIG. 7 is a close-up of a banded menu structure such as might be implemented in conjunction with a handheld device.
- FIG. 8 is a front view of a handheld device such as shown in FIG. 1, illustrating an example of a possible user interaction with the handheld device.
- FIGS. 9-15 are close-ups of banded menu configurations illustrating user interface examples.
- FIG. 16 is a flowchart showing how a menu structure such as shown in FIG. 7 might be utilized in a handheld device.
- FIG. 17 is a block diagram showing relevant components of a handheld device that might be used to support the menus and related components described herein.
- FIG. 18 is a drawing of a general-purpose computer having a keyboard that incorporates a multi-level touchpad in accordance with the techniques described herein.
- FIG. 19 is a perspective view of a portion of the keyboard shown in FIG. 18, showing more details of the multi-level touchpad.
- FIG. 20 illustrates a menu structure that is used in conjunction with the touchpad shown in FIGS. 18 and 19.
- FIGS. 21 and 22 illustrate a usage scenario that utilizes the touchpad and menu structure of FIGS. 18-20.
- FIGS. 23 and 24 illustrate another usage scenario that utilizes the touchpad and menu structure of FIGS. 18-20.
- FIGS. 25 and 26 illustrate a usage scenario that utilizes the touchpad of FIGS. 18-19.
- FIG. 27 is a flowchart illustrating a generalized procedure for implementing the usage scenarios described above with reference to FIGS. 18-26.
- FIG. 28 is a perspective view of a computer mouse that incorporates a multi-region touchpad.
- FIG. 29 is a perspective view of a digitizer pad that incorporates a multi-region touchpad.
- FIG. 30 is a perspective view of a tablet computer that incorporates a multi-region touchpad.
- FIG. 31 is a perspective view of a remote control that incorporates a multi-region touchpad.
- FIG. 32 is a perspective view of a game controller that incorporates a multi-region touchpad.
- FIG. 33 is a perspective view of a digital camera that incorporates a multi-region touchpad.
- FIG. 34 is a perspective view of an automobile interior having a steering wheel that incorporates a multi-region touchpad.
- FIG. 1 shows a handheld device 100 featuring a front surface 101 (not visible in FIG. 1) and an alternate surface (in this case a back or rear surface) 102. Handheld device 100 may be held in one hand by a user, with front surface 101 facing and visible to the user. Alternate surface 102 is, in this embodiment, opposite front surface 101, and faces away from the user during typical handheld operation. In some embodiments, front surface 101 may have a display and/or other user interface elements.
- Handheld device 100 has a touch-sensitive sensor 103, also referred to herein as a touch panel or multi-region touchpad. Touch panel 103 is situated in the alternate surface, in this embodiment facing away from a user who is holding handheld device 100. In operation, a user's finger, such as the user's index finger, may be positioned over or on touch panel 103; touch panel 103 is positioned in such a way as to make this finger placement comfortable and convenient.
- FIGS. 2 and 3 show two examples of how device 100 might be grasped by a user. In FIG. 2, the user holds device 100 with a single hand 201 in a portrait orientation, with index finger 202 positioned over touch panel 103 for operation of touch panel 103. In FIG. 3, the user holds device 100 in a landscape position with left hand 301 and right hand 302, with index finger 303 of the left hand positioned over touch panel 103.
- Touch panel 103 has multiple areas that are tactually delineated from each other so that a user can distinguish between the areas by touch. In the described embodiment, the areas comprise a plurality of successively nested or hierarchically arranged annular rings or bands 104. In the illustrated example, there are three such bands: an outer band 104(a), a middle band 104(b), and an inner band 104(c). Bands 104 may be concentric in some embodiments, and may surround a common central touch area 105. Individual bands 104 may be referred to as touch bands in the following discussion.
- In the described embodiment, each of bands 104 has a different elevation or depth relative to alternate surface 102 of handheld device 100. There are steps or discontinuous edges between the different elevations that provide tactile differentiation between areas or bands 104, allowing a user to reliably locate a particular touch band, via tactile feedback with a finger, without visually looking at touch panel 103.
- In this example, each successively inward band is stepped down in elevation from alternate surface 102 or from its outwardly neighboring band. In particular, outer band 104(a) is stepped down from alternate surface 102 and therefore is deeper or has a lower elevation than alternate surface 102. Middle band 104(b) is stepped down from its outwardly neighboring band 104(a) and is therefore deeper and has a lower elevation than outer band 104(a). Inner band 104(c) is stepped down from its outwardly neighboring band 104(b) and is therefore deeper and has a lower elevation than middle band 104(b). Similarly, central area 105 is stepped down from surrounding inner band 104(c) and is therefore deeper and has a lower elevation than inner band 104(c). Those of skill in the art will understand that touch bands 104 may each successively extend upward from the bordering larger band. Thus, outer band 104(a) may be lower than middle band 104(b), which in turn is lower than inner band 104(c), which is in turn lower than central area 105, thus forming a convex arrangement. In another embodiment, the respective bands may all share the same level, but may be tactually detectable by virtue of a raised border between them. For purposes of simplicity, however, the disclosed embodiment will address only a concave arrangement of touchpad 103.
- The progressively and inwardly increasing depths of bands 104 and central area 105 relative to alternate surface 102 create a concavity or depression 106 relative to alternate surface 102. Position and dimensions of touch panel 103 can be chosen so that a user's index finger naturally locates and rests within concavity 106, such that it is comfortable to move the finger to different locations around touch panel 103.
- Bands 104 can be irregularly shaped or can form a wide variety of shapes such as circles, ovals, rectangles, or squares. In the illustrated embodiment, bands 104 are irregularly shaped to allow easy finger positioning at desired locations. The irregular shape of bands 104 allows a user to learn the orientation of the bands and thus aids in non-visual interaction with touch panel 103.
- Touch panel 103 is sensitive to touch, and can detect the particular location at which it is touched or pressed. Thus, it can detect which individual band 104 is touched, and the position or coordinates along the band of the touched location. A user can slide his or her finger radially between bands 104 or around a single band 104, and touch panel 103 can detect the movement and absolute placement of the finger as it moves along or over the bands. Central area 105 is also sensitive to touch in the same manner.
- Touch panel 103 can be implemented using capacitive, resistive, or pressure sensing technology, or using other technologies that can detect a user's finger placement. Touch panel 103 may also integrate additional sensors, such as sensors that detect the pressing or depression of central area 105 or other areas of touch panel 103.
- Different embodiments may utilize different numbers of bands, and a single band or two bands may be used in different embodiments. Furthermore, the bands may be shaped and positioned differently.
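For an idealized touch panel with concentric circular bands, resolving a raw touch coordinate into a band index and an angular position along that band reduces to comparing the touch radius against each band's inner and outer radii. The radii below are arbitrary assumptions for illustration, and the bands described above may in practice be irregularly shaped:

```python
import math

# Idealized concentric circular bands, listed outer to inner, as
# (inner_radius, outer_radius) pairs in arbitrary units.
BAND_RADII = [(30.0, 40.0), (20.0, 30.0), (10.0, 20.0)]
CENTER_RADIUS = 10.0

def classify_touch(x, y):
    """Map a touch coordinate (origin at the panel center) to
    ('center', None), ('band', index, angle_in_degrees), or None
    if the touch falls outside the panel."""
    r = math.hypot(x, y)
    if r < CENTER_RADIUS:
        return ("center", None)
    for i, (inner, outer) in enumerate(BAND_RADII):
        if inner <= r < outer:
            angle = math.degrees(math.atan2(y, x)) % 360
            return ("band", i, angle)
    return None

print(classify_touch(0, 0))   # ('center', None)
print(classify_touch(35, 0))  # ('band', 0, 0.0)
```

The returned angle gives the position along the band, which a menu implementation could then translate into the menu item nearest that position.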
- As an example of a different touch area configuration, FIG. 4 shows an embodiment of handheld device 100 having two straight or linear touch-sensitive areas or bands 401 and 402 along one edge of handheld device 100. Front touch band 401 is positioned on the right edge 403, toward or adjacent front surface 101. Rear touch band 402 is positioned on the right edge 403, toward or adjacent rear surface 102.
- Tactile delineation between touch bands 401 and 402 can be provided by a ridge or valley between the two bands, or the bands may be separated by some distance along the right side surface 403.
- FIG. 5 is a front view of handheld device 100 (in this embodiment, a cellular phone), showing one possible configuration of front surface 101. In this embodiment, there is a front-facing display or display panel 501 in front surface 101. In some embodiments, display panel 501 may be a touch sensitive display panel. Other user interface elements, such as buttons, indicators, speakers, microphones, etc., may also be located on or around front surface 101, although they are not shown in FIG. 5.
- Display panel 501 can be used as part of a user interface to operate handheld device 100. It can also be used to display content, such as text, video, pictures, etc.
- A graphical menu 502 can be displayed at times on front display 501. Menu 502 has a plurality of graphically- or visually-delineated menu areas or bands 504 corresponding respectively to the tactually-delineated touch sensitive areas 104 on alternate surface 102. In this example, menu areas 504 include an outer band 504(a), a middle band 504(b), and an inner band 504(c). In addition, menu 502 includes a center visual area 505.
- FIG. 6 illustrates relative positions of touch panel 103 and graphical menu 502 in one embodiment. In this embodiment, rear touch panel 103 is positioned opposite and directly behind display panel 501. Bands 504 of graphical menu 502 are shaped and sized the same as their corresponding touch-panel bands 104, and are positioned at the corresponding or same lateral coordinates along front surface 101 and alternate surface 102. Thus, outer touch band 104(a) has generally the same size, shape, and lateral position as outer menu band 504(a); middle touch band 104(b) has generally the same size, shape, and lateral position as middle menu band 504(b); inner touch band 104(c) has generally the same size, shape, and lateral position as inner menu band 504(c); and center area 105 of touch panel 103 has generally the same size, shape, and lateral position as center area 505 of front display panel 501.
- Generally, graphical menu 502 faces the user, and touch panel 103 faces away from the user. However, display panel 501 and touch panel 103 may or may not be precisely parallel with each other. Although in particular embodiments it may be desirable to position graphical menu 502 so that it is directly in front of and aligned with touch panel 103 as illustrated, other arrangements may work well in certain situations. In particular, in some embodiments there may be a lateral and/or angular offset between graphical menu 502 and touch panel 103, such that touch panel 103 is not directly behind menu 502 or is not parallel with the surface of display panel 501. Furthermore, the correspondence in size and shape between the menu bands and the touch bands may not be exact in all embodiments. Thus, the bands and center area of touch panel 103 and menu 502 may differ from one another, but will be similar enough that when a user interacts with touch panel 103, the user perceives it to have a one-to-one positional correspondence with the elements of menu 502.
- In operation, as will be described in more detail below, menu items are displayed in menu bands 504. Each displayed menu item is located at a particular point on a menu band 504, and therefore corresponds to a similar point on the corresponding touch band 104 of touch panel 103. A particular menu band 504 can be selected or activated by touching its corresponding touch band. A particular menu item can be selected or activated by touching the corresponding position or location on the corresponding touch band 104.
- Generally, touching any particular location on touchpad 103 can be considered similar to touching or clicking on the corresponding location on graphical menu 502. If a user desires to select a menu item or some other graphical object positioned at a particular point on menu 502, for example, he or she presses the corresponding point or location on touch panel 103. The tactual delineations between bands of touch panel 103 help the user identify and move between graphical menu bands to locate particular menu item groups.
- FIG. 7 shows details of how such a menu 502 might be structured. FIG. 7 shows a menu structure 700 as an example of both menu 502 and its corresponding touch panel 103. This example uses two selection bands: an outer band 701 and an inner band 702, both of which surround a center area 703. Outer band 701 corresponds to an outer displayed menu band and a correspondingly positioned outer touch band on alternate surface 102. Inner band 702 corresponds to a displayed inner menu band and a correspondingly positioned inner touch band on alternate surface 102. Center area 703 corresponds to an area within the displayed menu as well as a correspondingly positioned touch sensitive area on touch panel 103. Thus, it is assumed in this example that touch panel 103 has two touch bands, corresponding to the two touch bands shown in FIG. 7.
- Generally, each of the menu bands 701 and 702 contains or represents a group of menu items. Inner menu band 702 contains menu items labeled “ITEM A1”, “ITEM A2”, “ITEM A3”, “ITEM A4”, “ITEM A5” and “ITEM A6”. Outer menu band 701 contains menu items labeled “ITEM B1”, “ITEM B2”, “ITEM B3”, “ITEM B4”, “ITEM B5”, “ITEM B6”, and “ITEM B7”.
- Each menu band 701 and 702 may also have a heading or title. In this example, inner menu band 702 has a heading “GROUP A”, and outer menu band 701 has a heading “GROUP B”.
- Generally, individual menu items correspond to actions, and selecting a menu item initiates the corresponding action. Thus, hand-held device 100 is configured to initiate actions associated respectively with the menu items in response to their selection.
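The item-to-action association just described can be modeled as a dispatch table that maps each menu item to a handler invoked on selection. The item names and handlers below are hypothetical illustrations, not part of the described device:

```python
# Hypothetical handlers; a real device would invoke system actions.
def open_browser():
    return "browser opened"

def mute_audio():
    return "audio muted"

# Dispatch table mapping menu items to their associated actions.
MENU_ACTIONS = {
    "ITEM A1": open_browser,
    "ITEM A2": mute_audio,
}

def select_item(item):
    """Initiate the action associated with a selected menu item,
    or do nothing if the item has no associated action."""
    action = MENU_ACTIONS.get(item)
    return action() if action else None

print(select_item("ITEM A1"))  # browser opened
```

Populating the table per application context would also support the context-sensitive menus discussed below.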
- FIG. 7 illustrates one of many variations of band shapes that might be utilized when implementing both menu 502 and its corresponding touch panel 103. In this non-symmetrical variation, the bands have larger widths toward their right-hand and lower sides. This configuration is intended to work well when the device is held in the left hand of a user, who uses his or her left index finger to interact with touch panel 103. This leaves the right hand free to interact with display panel 501 on front surface 101.
- In a configuration such as this, touch panel 103 may be symmetrical, with bands that are the same width on their left and right sides. Menu 502 might be non-symmetrical, similar to menu structure 700. The non-symmetry of menu 502 might allow menu item labels and icons to easily fit within its right-hand side. However, the slight differences between the shapes of the touch bands and the corresponding menu bands will likely be nearly imperceptible to a user, or at least easily ignored. This arrangement allows menu 502 to be displayed using either a right-hand or left-hand orientation, depending on preferences of a user, while using the same touch panel 103.
- User interaction can be implemented in different ways. For purposes of discussion, interaction with touch panel 103 will be described with reference to bands and locations of menu structure 700. Thus, “touching” or “tapping” ITEM A1 is understood to mean that the user touches the corresponding location on touch panel 103.
Menu structure 700 can be sensitive to the context that is otherwise presented byhandheld device 100. In other words, the particular menu items found onmenu 700 may vary depending on the activity that is being performed onhandheld device 100. Furthermore, different bands ofmenu 700 can have menu items that vary depending on a previous selection within a different band. Specific examples will be described below. - In certain embodiments,
menu 700 may be activated or initiated by touchingcenter touch area 105 oftouch panel 103. In response, handhelddevice displays menu 700. Alternatively,menu 700 might be activated by touching any portion oftouch panel 103, or by some other means such as by interaction with front-surface elements ofhandheld device 100. - Upon initially displaying
menu structure 700, individual menu items may or may not be displayed. For example, upon initial display, each menu band may only indicate its group heading or title, and the individual menu items may be hidden. - After activating
menu structure 700 by touching center area 703, the user may touch one of the touch bands to activate or reveal the menu items within that touch band. For example, the user may touch inner band 702, which causes device 100 to activate that band and to display or reveal its individual menu items. In addition, activating a particular band might result in that band being highlighted in some manner, such as by an animation, bold text, or distinguishing shades or colors. Activation or selection of a band might also be indicated by enlarging that band on displayed menu 700 in relation to other, non-activated bands. - Another band might be activated by touching it, or by selecting an item from a first band. For example,
outer band 701 may contain items that depend on a previous selection made from the items of inner band 702. Thus, touching or selecting an item within inner band 702 may activate outer band 701, and outer band 701 might in this scenario contain items or commands related to the menu item selected from inner band 702. - Selection of a band or menu item may be made by touching and releasing the corresponding location on
touch panel 103. Alternatively, selection may be made by touching at one location, sliding to another location, and releasing. For example, menu structure 700 may be implemented such that touching center area 703 opens menu structure 700, and sliding to inner band 702 allows the user to move to a menu item on inner band 702. Releasing when over a particular menu item might select or activate that menu item. - Selection within
menu structure 700 or within a band of menu structure 700 may be accompanied by a highlight indicating the location of the user's finger at any time within the menu structure. For example, touching a location on touch panel 103 corresponding to ITEM A1 may cause ITEM A1 to become bold or otherwise highlighted. Furthermore, any area that is currently being touched can be made to glow on display panel 501, or some similar visual mechanism can be used to indicate finger placement and movement on menu structure 700. Thus, a user might touch a menu band, move his or her finger along the menu band until the desired menu item is highlighted, and then release his or her touch, thereby activating the menu item that was highlighted upon the touch release. - The user interface arrangement described above can be used in a variety of ways. The following examples assume the use of front-facing
display panel 501 and rear-facing touch panel 103. For purposes of example and illustration, touch panel 103 will not be explicitly shown in the figures accompanying this discussion. It is assumed that in the examples described, touch panel 103 lies directly behind the illustrated graphical menus, and that the touch bands of the touch panel have shapes and sizes that correspond at least roughly with the menu bands of the displayed graphical menus. User interactions with the touch panel will be described with reference to corresponding points on the displayed graphical menus. -
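The touch, slide, and release selection model described above can be sketched as a small state machine. This is an illustrative sketch only; the class and method names are hypothetical and not part of the described device.

```python
# Illustrative sketch of the touch-slide-release model described above:
# touching the center area opens the menu, sliding along a band moves a
# highlight among items, and releasing activates the highlighted item.
class BandMenu:
    def __init__(self, items):
        self.items = items          # menu items on one band
        self.open = False
        self.highlighted = None     # item currently under the finger

    def touch_center(self):
        """Touching the center area activates the menu."""
        self.open = True

    def slide_to(self, index):
        """Sliding the finger along the band highlights the item there."""
        if self.open:
            self.highlighted = self.items[index]

    def release(self):
        """Releasing the touch selects whichever item is highlighted."""
        selected, self.highlighted = self.highlighted, None
        return selected
```

For example, with the band items of FIG. 10, touching the center, sliding to the third item, and releasing would return “Crop”.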
FIGS. 8-11 illustrate how the elements and techniques described above might be used to edit and share a picture that is stored on a handheld device such as a cellular telecommunications device. In FIG. 8, handheld device 100 is displaying a photograph 801 on its display panel 501. Touch panel 103 is represented in dashed lines to indicate its location relative to display panel 501. A menu is not displayed in FIG. 8. -
FIG. 9 shows a menu that is displayed on display panel 501 in response to a user touching center area 105 of touch panel 103. This menu is configured to allow a user to perform various operations with respect to the displayed picture 801. The object of these operations, picture 801, is displayed or represented within center area 703. Inner band 702 is configured to correspond to various editing operations that can be performed on picture 801, and has a band heading 901 that reads “EDIT”. Outer band 701 is configured to correspond to various communications options that can be performed in conjunction with picture 801, and has a band heading 902 that reads “SHARE”. A user can touch anywhere in inner band 702 to activate or reveal the menu items of that band. A user can touch anywhere in outer band 701 to activate or reveal the menu items of that band. -
FIG. 10 shows the result of a user touching inner band 702. In response to touching a band, it is activated or highlighted. In this example, an activated band is enlarged and its menu items are revealed. Menu items 1001 of inner band 702 comprise “Paint”, “Copy”, “Crop”, “Effects”, “Text”, and “Save”. While still touching inner band 702, the user can move his or her finger around inner band 702 until it is positioned corresponding to a desired menu item. In some embodiments, the location at which the user is touching the band will be highlighted or somehow indicated on display 501 so that finger movement can be visually confirmed. When the finger is at the desired menu item, the user releases the finger touch and the menu item is selected or activated. - Suppose, for example, that the user wants to crop the displayed
picture 801. The user first touches and releases center area 703 to activate menu 700. The user then touches inner band 702, which reveals menu items 1001 relating to editing actions. The user moves his or her finger until touching the menu item “Crop”, and releases. This causes device 100 to display an on-screen tool for cropping picture 801. Although this tool is not illustrated, picture 801 may be again displayed in full size on front display panel 501, as in FIG. 8, and a moveable rectangle may be shown for the user to position in the desired cropping location. The user may drag the displayed rectangle by pressing and dragging on display panel 501 to achieve the desired positioning of the rectangle, and the desired cropping of picture 801. -
FIG. 11 shows a subsequent operation that may be performed on the cropped picture 801. After the cropping operation described above, the cropped picture 801 is displayed in center area 703 as the object of a proposed action. Menu 700 may reappear after the cropping operation, or may be reactivated by the user again touching center area 703. - In the example of
FIG. 11, the user has touched the outer band 701 to reveal the menu items 1101 of that band, which relate to different communications options that are available with regard to the targeted picture. These options include “Email”, “Text”, “IM”, “Facebook”, “Twitter”, and “Blog”. These menu items correspond to actions that device 100 or an application program within device 100 will initiate upon selection of the menu items. Notice that in this example, as with FIG. 10, the activated menu band is enlarged to indicate that it is active. Enlarging the active menu band also allows its menu items to occupy more screen space and therefore makes them more visible to the user. -
FIGS. 12-15 illustrate how the elements and techniques described might be used to select and interact with different contacts, using a menu structure 1200 that is displayed on handheld device 100. Example menu 1200 uses three levels of menu bands and corresponding touch bands: an outer band 1201, a middle band 1202, and an inner band 1203. These bands surround a center area 1204. -
FIG. 13 shows the menu items 1301 revealed upon activating inner band 1203. In this example, inner band 1203 contains menu items corresponding to contacts that the user has designated as belonging to a particular group. It contains a group heading or label 1302, which in this example reads “FAMILY”, indicating that the contacts within this band are part of the “FAMILY” contact group. In this example, the menu items include “Mom”, “Dad”, “Aric”, “Janelle”, “Grandma”, and “Jim”. A user can touch or select any one of these menu items to select the corresponding contact. -
FIG. 14 shows menu items 1401 that are revealed upon activating middle band 1202. These menu items relate to activities that can be performed with respect to a contact that has been selected from inner band 1203. Middle band 1202 has a group heading or label 1402, which in this example reads “COMM”, indicating that the band contains communications options. - In this example, “Jim” has been previously selected from
inner band 1203 and is displayed in center area 1204 as the object of any selected operations. The menu items and corresponding operations include “eMail”, “Text”, “Call”, “Chat”, and “Twitter”. The available menu items might vary depending on the information available for the selected contact. For example, some contacts might only include a telephone number, and communications options might therefore be limited to texting and calling. Other contacts might include other information such as chat IDs, and a “Chat” activity might therefore be available for these contacts. Thus, the menu items available in this band are sensitive to the menu context selected in previous interactions with menu 1200. -
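The context sensitivity described above, in which the communications options offered depend on the fields present in a contact record, might be sketched as follows. The field names here are hypothetical, chosen only for illustration.

```python
# Hypothetical sketch of context-sensitive band contents: the communication
# options offered for a contact depend on which fields the contact record
# actually contains, as described above.
def comm_options(contact):
    options = []
    if "phone" in contact:
        options += ["Text", "Call"]      # a phone number enables texting and calling
    if "email" in contact:
        options.append("eMail")
    if "chat_id" in contact:
        options.append("Chat")           # chat ID enables a "Chat" activity
    if "twitter" in contact:
        options.append("Twitter")
    return options
```

A contact with only a telephone number would thus yield just “Text” and “Call”, matching the limitation described above.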
FIG. 15 shows menu items 1501 that are revealed upon activating outer band 1201. Outer band 1201 contains menu items corresponding to different contact groups that a user has defined, and contains a group heading or title 1502 that reads “GROUPS”. In this example, these contact groups include “Family”, “Office”, “Friends”, and “Favorites”. Selecting one of these groups changes the context of menu 1200. In particular, it changes the contact group that is shown within inner band 1203. After selecting “Office” from outer band 1201, for example, the label 1302 of inner band 1203 will change to “OFFICE”, and the listed menu items 1301 within inner band 1203 will change to those that the user has included in the “Office” group. - The above usage scenarios are only examples, and the described interaction techniques might be useful in many different situations. As another example, the described menu structure might be used as an application launcher, with different types of applications being organized within different menu bands. End-users may be given the ability to organize applications within menu bands in accordance with personal preferences.
- The described menu structure might also be used as a general context menu, presenting operations such as copy, paste, delete, add bookmark, refresh, etc., depending on operations that might be appropriate at a particular time when the menu structure is opened. Again, different types of operations might be presented in different menu bands, such as “edit” operations in an inner band and “sharing” operations in an outer band.
- Furthermore, support for the menu structure can be provided through an application programming interface (API) and corresponding software development kit (SDK) to allow the menu functionality to be used and customized by various application programs. In addition, the operating system of the handheld device can expose APIs allowing application programs to register certain activities and actions that might be performed with respect to certain types of objects, or in certain contexts. Registering in this manner would result in the indicated activities or actions being included in the contextual menus described above.
-
FIG. 16 illustrates the above user interface techniques in simplified flowchart form. An action 1601 comprises displaying a menu on a front-facing display of a handheld device. As described above, the menu may have visually-delineated menu areas or bands corresponding in shape and position to the nested or hierarchical touch bands of a rear-facing touch sensor of the handheld device. - An
action 1602 comprises displaying menu items in the menu bands. As already described, each menu item corresponds to a position on the rear-facing touch sensor of the handheld device. - An
action 1603 comprises navigating among the menu bands and menu items in response to rear touch sensor input. Action 1604 comprises selecting a particular one of the menu items in response to the user touching its corresponding position on the rear-facing touch sensor. - Note that in embodiments such as those described above, in which the device has a front-facing touch-sensitive display, some of the user interactions might be performed by touching the display itself at the desired menu location, as an alternative to touching the corresponding location on the rear touch panel. Some embodiments may allow the user to touch either the front displayed menu or the corresponding rear touch panel, at the user's discretion.
-
FIG. 17 shows relevant components of an exemplary device 1700 implementing the features described herein. -
Device 1700 of FIG. 17 comprises one or more processors 1701 and memory 1702. Memory 1702 is accessible and readable by processors 1701 and can store programs and operational logic for implementing the functionality described herein. Specifically, memory 1702 can contain instructions that are executable by processors 1701 to perform and implement the described functionality. - In some cases, the programs and logic of
memory 1702 will be organized as an operating system (OS) 1703 and applications 1704. OS 1703 contains operational logic for basic device operation, while applications 1704 work in conjunction with OS 1703 to implement additional, higher-level functionality. Applications 1704 may in many embodiments be installed by device manufacturers, resellers, retailers, or end-users. In other embodiments, the OS and applications may be built into the device at manufacture. Furthermore, some implementations may be specially programmed with a dedicated program or set of instructions, and may or may not have separate operating system and application layers. - Note that
memory 1702 may include internal device memory as well as other memory that may be removable or installable. Internal memory may include different types of machine-readable media, such as electronic memory, flash memory, and/or magnetic memory, and may include both volatile and non-volatile memory. External memory may similarly be of different machine-readable types, including rotatable magnetic media, flash storage media, so-called “memory sticks,” external hard drives, network-accessible storage, etc. Both applications and operating systems may be distributed on such external memory and installed from there. Applications and operating systems may also be installed and/or updated from remote sources that are accessed using wireless means, such as WiFi, cellular telecommunications technology, and so forth. - Some embodiments of
device 1700 may have a display 1705 and a touch panel 1706, the characteristics of which are described herein. OS 1703 and/or applications 1704 interact with display 1705 and touch panel 1706 to implement the user interface behaviors and techniques described above. In many embodiments, device 1700 might have an application programming interface (API) 1707 that exposes the functionality of display 1705 and touch panel 1706 to applications through high-level function calls, allowing third-party applications to utilize the described functionality without the need for interacting with device components at a low level. API 1707 may include function calls for performing the actions described with reference to FIG. 16, including: -
- displaying a menu on a front-facing display, the menu having visually-delineated menu bands corresponding in shape and position to the nested touch bands of the rear-facing touch sensor;
- displaying menu items in the menu bands, each menu item corresponding to a position on the rear-facing touch sensor; and
- selecting a particular one of the menu items in response to the user touching its corresponding position on the rear-facing touch sensor.
- Similarly,
API 1707 may allow application programs to register certain functions or actions, along with potential objects of those functions or actions, allowing the handheld device to include those functions and activities as menu items in appropriate contexts. -
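The registration mechanism suggested above might be sketched as follows. The class and method names are hypothetical illustrations, not part of any actual API of the described device.

```python
# Hypothetical sketch of the registration API suggested above: application
# programs register actions for particular object types, and a contextual
# menu is then built from whatever has been registered for the object
# currently targeted (e.g., shown in the center area).
class MenuRegistry:
    def __init__(self):
        self._actions = {}   # object type -> ordered list of action names

    def register(self, object_type, action):
        """An application registers an action it can perform on an object type."""
        self._actions.setdefault(object_type, []).append(action)

    def menu_items_for(self, object_type):
        """Build the contextual menu items for the targeted object type."""
        return list(self._actions.get(object_type, []))
```

For example, a photo-editing application might register “Crop” for pictures and an email client might register “Email”; a contextual menu opened on a picture would then offer both.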
Device 1700 may also include other input/output elements 1708, such as different types of displays, touch-sensitive display panels, controls, buttons, lenses, image capture devices, storage devices, microphones, etc. - Note that various embodiments include programs, devices, and components that are configured or programmed to perform in accordance with the descriptions herein, as well as computer-readable storage media containing programs or instructions for implementing the described functionality. Various different device embodiments, utilizing different types of operational logic and input/output elements, will be described below. The architecture shown in
FIG. 17 is an example illustrating basic operational concepts, and the various described embodiments might be implemented in many different ways. -
FIG. 18 shows an embodiment in which a multi-region or hierarchical touchpad is used in conjunction with a conventional computing device. In this embodiment, the computing device is a personal desktop computer 1800 comprising a keyboard 1801, a display 1802, and a system controller or processor 1803. -
Keyboard 1801 has traditional input mechanisms, including keys or buttons 1804 that are operable by a user to enter text on computer 1800. The keyboard may also have a conventional touchpad 1805 that can be used in place of a computer mouse to control an on-screen pointer or cursor. In addition, there is a multi-level or hierarchical touchpad 1806 positioned at the lower left corner of the top surface of keyboard 1801. In this embodiment, touchpad 1806 is positioned apart from and independently of display 1802, not directly behind or otherwise aligned with an on-screen menu 1807. - Although illustrated in conjunction with a traditional desktop computer,
hierarchical touchpad 1806 can be used with a variety of different types of computing devices, such as laptop computers, netbook computers, tablet computers, mobile devices, gaming devices, cameras, input peripherals, special-purpose computing devices, and so forth. Furthermore, a touchpad such as this can be implemented as a stand-alone accessory or integrated with different types of input devices. For example, it might be integrated with a mouse or digitizer pad. -
FIG. 19 shows hierarchical touchpad 1806 in more detail. As in some of the previously described embodiments, hierarchical touchpad 1806 has multiple areas that are tactually delineated from each other so that a user can distinguish between the areas by touch. In the described embodiment, the areas comprise a plurality of hierarchically-arranged and tactually delineated touch-sensitive areas or bands. In this embodiment, the touch-sensitive areas or bands are arranged concentrically, surrounding a central touch-sensitive area 1901. An inner or primary touch-sensitive band 1902 has an annular or ring-like shape, and is immediately adjacent central touch-sensitive area 1901. An outer or secondary touch-sensitive band 1903 also has an annular or ring-like shape. Outer touch-sensitive band 1903 surrounds and is immediately adjacent inner touch-sensitive band 1902. Note that the central area 1901 may be omitted in some embodiments, and the inner touch-sensitive band 1902 may comprise a circular area rather than an annular area. - In the described embodiment, each of the touch-sensitive bands has an elevation that differs from that of the surface 1904 in which it is positioned. There are steps or discontinuous edges 1905 between the different elevations that provide tactile differentiation between bands, allowing a user to reliably locate a particular touch band via tactile feedback with a finger, without visually looking at touchpad 1806. - In this example, each successively inward band or area is stepped down in elevation from
surface 1904 or from its outwardly neighboring band. In particular, outer band 1903 is stepped down from surface 1904 and therefore is deeper, or has a lower elevation, than surface 1904. Inner band 1902 is stepped down from its outwardly neighboring band 1903 and is therefore deeper and has a lower elevation than outer band 1903. Similarly, central area 1901 is stepped down from surrounding inner band 1902 and is therefore deeper and has a lower elevation than inner band 1902. - The progressively and inwardly increasing depths of
bands 1902 and 1903 and central area 1901 relative to surface 1904 create a concavity or depression relative to surface 1904. The position and dimensions of touch panel 1806 can be chosen so that a user's finger naturally locates and rests within the concavity, such that it is comfortable to move the finger to different locations around touch panel 1806. - Those of skill in the art will understand that the touch-sensitive bands may each successively extend upward from the bordering larger band. Thus,
outer band 1903 may be lower than inner band 1902, which in turn may be lower than central area 1901, thus forming a convex arrangement. In another embodiment, the respective bands may all share the same level, but may be tactually detectable by virtue of a raised border between them. For purposes of simplicity, however, the disclosed embodiment will address only the concave arrangement shown in FIG. 19. -
Touchpad 1806 is sensitive to touch, and can detect the particular location at which it is touched or pressed. Thus, it can detect which individual band is touched, and the position or coordinates along the band of the touched location. A user can slide his or her finger radially between bands or around a single band, and touch panel 1806 can detect the movement and absolute placement of the finger as it moves along or over the bands. Central area 1901 is also sensitive to touch in some embodiments. -
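Resolving a raw touch coordinate into the central area, inner band, or outer band described above amounts to a radius comparison. The sketch below is illustrative; the radii are assumed example values, not dimensions from this disclosure.

```python
import math

# Illustrative sketch of classifying a touch point on the concentric
# touchpad into the region it falls in. Radii are hypothetical values.
CENTER_R, INNER_R, OUTER_R = 5.0, 10.0, 15.0   # assumed radii, e.g. in mm

def region_of(x, y):
    """Return the touched region for a point (x, y) measured from the
    touchpad's center, or None if the touch lies outside the pad."""
    r = math.hypot(x, y)
    if r <= CENTER_R:
        return "central area"   # corresponds to central area 1901
    if r <= INNER_R:
        return "inner band"     # corresponds to inner band 1902
    if r <= OUTER_R:
        return "outer band"     # corresponds to outer band 1903
    return None
```

The position along a band could similarly be derived from the angle of the same point, so a single (x, y) report from the sensor suffices for both band identification and position within the band.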
Touchpad 1806 can be implemented using capacitive, resistive, or pressure-sensing technology, or using other technologies that can detect a user's finger placement. Touchpad 1806 may also integrate additional sensors, such as sensors that detect the pressing or depression of central area 1901 or other areas of touchpad 1806. In addition to being sensitive to touching with a finger, the touch-sensitive bands can also be sensitive to touching by a stylus or other object. - Different embodiments may utilize different numbers of bands; a single band or three bands may be used, for example. Furthermore, the bands may be shaped and positioned differently than illustrated here.
- In operation,
touchpad 1806 can be used with an on-screen menu having a graphical appearance that is similar to that of the touchpad itself. Referring back to FIG. 18, an example of such an on-screen menu 1807 is shown on display 1802. -
FIG. 20 shows on-screen menu 1807 in more detail. On-screen menu 1807 comprises a first-level menu 2001 and a second-level menu 2002. First-level menu 2001 corresponds to inner touch-sensitive band 1902 and has a shape similar to that of inner touch-sensitive band 1902. Second-level menu 2002 corresponds to outer touch-sensitive band 1903 and has a shape similar to that of outer touch-sensitive band 1903. First-level menu 2001 and second-level menu 2002 are also arranged relative to each other similar to the arrangement of the bands of touchpad 1806, with second-level menu 2002 surrounding and immediately adjacent first-level menu 2001. Each of menus 2001 and 2002 contains menu choices that are selectable via the corresponding band of touchpad 1806. -
FIG. 21 illustrates a usage embodiment in which there is a one-to-one correspondence between positions of inner touch-sensitive band 1902 and positions of first-level menu 2001. FIG. 21 shows a graphical window or pane 2100 upon which on-screen menu 1807 is displayed. Within menu 1807, first-level menu 2001 is shown as having four menu choices: File, Edit, View, and Help. These choices are distributed around first-level menu 2001 at approximately equal intervals, at the left, top, right, and bottom of first-level menu 2001, respectively. - Dashed arrows indicate, for each of these choices, corresponding locations on inner touch-sensitive band 1902 of touchpad 1806. Generally, the position of a particular menu choice on inner touch-sensitive band 1902 is assumed to be the same as the choice's position on first-level menu 2001. Thus, the “File” choice is displayed at the left of first-level menu 2001, and is assumed to also be located at the left of inner touch-sensitive band 1902. Touching the left of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “File” menu item. The “Edit” choice is displayed at the top of first-level menu 2001, and is assumed to also be located at the top of inner touch-sensitive band 1902. Touching the top of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “Edit” menu item. Similarly, the “View” and “Help” choices may be selected by touching their corresponding locations on touchpad 1806. - A cursor or
pointer 2101 can be used in some embodiments to indicate on on-screen menu 1807 the current position of a finger on touchpad 1806. The cursor or pointer can be a graphical arrow, dot, highlight, or other type of graphical delineation. In FIG. 21, a finger 2102 is shown touching a point on touch-sensitive band 1902, and the corresponding location of on-screen menu 1807 is indicated by circular cursor 2101. Cursor 2101 provides graphical feedback to the user as the user moves his or her finger to various locations around touch-sensitive bands 1902 and 1903. - In operation, touching
touchpad 1806 may cause menu 1807 to appear on window 2100, along with cursor 2101 indicating where the touch is occurring. As the finger moves along touchpad 1806, cursor 2101 follows its movement in the corresponding areas of menu 1807. The user may visually watch cursor 2101 to verify finger placement, and to guide cursor 2101 over a desired menu choice. For example, the user may use his or her finger to guide cursor 2101 over the “Edit” choice of first-level menu 2001. Releasing or removing the touch contact with touchpad 1806, while the cursor is over a particular menu choice, results in the selection of that menu choice. -
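The one-to-one mapping of FIG. 21, with four choices placed at the left, top, right, and bottom of a band, can be sketched by quantizing the angle of the touch point. The coordinate conventions (origin at the band's center, y growing upward) and all names below are assumptions for illustration.

```python
import math

# Illustrative mapping from a touch point on an annular band to one of four
# menu choices at the right, top, left, and bottom of the band, matching the
# File/Edit/View/Help layout described above (File left, Edit top,
# View right, Help bottom).
def choice_at(x, y, choices=("View", "Edit", "File", "Help")):
    # atan2 gives the touch point's angle; shift by 45 degrees and divide by
    # 90 to snap it to the nearest compass sector:
    # sector 0 = right, 1 = top, 2 = left, 3 = bottom.
    angle = math.degrees(math.atan2(y, x)) % 360
    sector = int(((angle + 45) % 360) // 90)
    return choices[sector]
```

A touch at the left of the band would thus resolve to “File”, and one at the top to “Edit”, mirroring the dashed-arrow correspondence in FIG. 21.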
FIG. 22 illustrates the result of selecting the “Edit” choice from first-level menu 2001. In response to the selected first-level choice, second-level menu 2002 is displayed with second-level choices that depend on the selected first-level choice. In this example, the second-level choices comprise “Undo”, “Copy”, “Cut”, and “Paste”. These choices are distributed around second-level menu 2002 at approximately equal intervals, at the left, top, right, and bottom of second-level menu 2002, respectively. - Dashed arrows indicate, for each of these choices, corresponding locations on outer touch-sensitive band 1903 of touchpad 1806. Generally, the position of a particular menu choice on outer touch-sensitive band 1903 is assumed to be the same as the choice's position on second-level menu 2002. Thus, the “Undo” choice is displayed at the left of second-level menu 2002, and is assumed to also be located at the left of outer touch-sensitive band 1903. Touching the left of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Undo” menu item. The “Copy” choice is displayed at the top of second-level menu 2002, and is assumed to also be located at the top of outer touch-sensitive band 1903. Touching the top of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Copy” menu item. Similarly, the “Cut” and “Paste” choices may be selected by touching their corresponding locations on touchpad 1806. -
Cursor 2101 can be used to indicate the current position of finger 2102 on touchpad 1806 as described above. In this example, finger 2102 is shown touching a point on outer touch-sensitive band 1903, and the corresponding location of on-screen menu 1807 is indicated by cursor 2101. A particular menu choice can be selected by moving the cursor over it and then releasing touchpad 1806. -
FIG. 23 shows another embodiment, having an on-screen menu that does not correspond in size or shape to the hierarchical touchpad. FIG. 23 shows a graphical window or pane 2300 upon which a first-level menu 2301 is displayed. First-level menu 2301 is shown as having four menu choices: File, Edit, View, and Help. These choices are arranged horizontally and linearly, from left to right, along the top of window 2300. - Dashed arrows indicate, for each of these choices, corresponding locations on inner touch-sensitive band 1902 of touchpad 1806. Generally, the menu choices are arranged on inner touch-sensitive band 1902 in the same sequence as their presentation within first-level menu 2301. Thus, the “File” choice corresponds to the left of inner touch-sensitive band 1902, the “Edit” choice corresponds to the top of inner touch-sensitive band 1902, the “View” choice corresponds to the right of inner touch-sensitive band 1902, and the “Help” choice corresponds to the bottom of inner touch-sensitive band 1902. Thus, touching the left of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “File” menu item. Touching the top of inner touch-sensitive band 1902 is equivalent to selecting or “touching” the “Edit” menu item. Similarly, the “View” and “Help” choices may be selected by touching their corresponding locations on touchpad 1806. - A cursor or pointer can be used as in previous embodiments to show the current selection. In this example, cursor functionality is provided by underlining any menu choice that the user is currently “touching.” In
FIG. 23, the user's finger is assumed to be touching the location on touchpad 1806 corresponding to the “Edit” choice, and the “Edit” choice is therefore underlined. -
FIG. 24 shows the result of selecting the “Edit” choice from first-level menu 2301. In response to the selected first-level choice, second-level menu 2401 is displayed with second-level choices that are selectable by touching the secondary touch-sensitive area 1903 of touchpad 1806. In this example, the second-level choices comprise Undo, Copy, Cut, and Paste. These choices are arranged vertically and linearly, beneath the selected first-level menu choice. - Dashed arrows indicate, for each of these choices, corresponding locations on outer touch-sensitive band 1903 of touchpad 1806. Generally, the menu choices are arranged on outer touch-sensitive band 1903 in the same sequence as their presentation within second-level menu 2401. Thus, the “Undo” choice corresponds to the left of outer touch-sensitive band 1903, the “Copy” choice corresponds to the top of outer touch-sensitive band 1903, the “Cut” choice corresponds to the right of outer touch-sensitive band 1903, and the “Paste” choice corresponds to the bottom of outer touch-sensitive band 1903. Thus, touching the left of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Undo” menu item. Touching the top of outer touch-sensitive band 1903 is equivalent to selecting or “touching” the “Copy” menu item. Similarly, the “Cut” and “Paste” choices may be selected by touching their corresponding locations on touchpad 1806. - Underlining is again used to indicate a current selection, which in this case is “Cut”. A particular menu choice can be selected by moving the underlining to the appropriate choice and then releasing the touch on
touchpad 1806. -
FIG. 25 illustrates yet another usage scenario for a hierarchical touchpad. In this example, inner touch-sensitive band 1902 corresponds to menu choices within a vertical, linear first-level menu 2501. First-level menu 2501 has the menu choices “Start Date”, “Start Time”, “End Date”, and “End Time”, along with corresponding values for those menu choices. The value of a particular menu choice can be selected or changed by selecting that menu choice from first-level menu 2501. In this case, the “Start Date” choice can be selected by touching the left of inner touch-sensitive band 1902. The “Start Time” choice can be selected by touching the top of inner touch-sensitive band 1902. The “End Date” choice can be selected by touching the right of inner touch-sensitive band 1902. The “End Time” choice can be selected by touching the bottom of inner touch-sensitive band 1902. The correspondence between menu choices and positions on inner touch-sensitive band 1902 is indicated by dashed arrows. -
FIG. 26 shows the result of selecting the “End Date” choice. Selecting any choice from first-level menu 2501 causes a second-level menu 2601 to open, containing menu choices that vary depending on the selected first-level menu choice. In this example, second-level menu 2601 is a scrollable window having second-level menu choices that scroll vertically in response to sweeping outer touch-sensitive band 1903 in a circular motion. A blocked or highlighted line 2602 indicates the current selection: June 16. Sweeping a finger around outer touch-sensitive band 1903 in a clockwise direction scrolls in one direction. Sweeping a finger around outer touch-sensitive band 1903 in a counter-clockwise direction scrolls in the other direction. -
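The circular-scroll behavior of FIG. 26 can be sketched as follows. This is an illustrative implementation, not the patent's: it assumes touch samples arrive as (x, y) offsets from the band center (y up), and that each 30 degrees of sweep, a threshold chosen here arbitrarily, produces one scroll step, positive for clockwise motion.

```python
import math

class BandScroller:
    """Accumulate circular sweep on a touch band into discrete scroll steps."""
    STEP_DEG = 30.0  # assumed sweep per scroll step; tune to taste

    def __init__(self):
        self.last_angle = None
        self.accum = 0.0

    def touch(self, x, y):
        """Feed one touch sample; return scroll steps (+ = clockwise)."""
        angle = math.degrees(math.atan2(y, x))
        steps = 0
        if self.last_angle is not None:
            delta = angle - self.last_angle
            # unwrap across the +/-180 degree seam so sweeps never jump
            if delta > 180:
                delta -= 360
            elif delta < -180:
                delta += 360
            self.accum -= delta  # clockwise sweep decreases the math angle
            while self.accum >= self.STEP_DEG:
                steps += 1
                self.accum -= self.STEP_DEG
            while self.accum <= -self.STEP_DEG:
                steps -= 1
                self.accum += self.STEP_DEG
        self.last_angle = angle
        return steps
```

Lifting the finger would reset `last_angle` to `None` so the next touch starts a fresh sweep.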
FIG. 27 shows a generalized procedure 2700 to implement the usage scenarios described above, in conjunction with a hierarchical touchpad such as touchpad 1806. Procedure 2700 is described in terms of actions or steps that can be implemented by programs or instruction sequences that are stored in memory and executed by a processor, for example by processor 1701 of FIG. 17 or system controller 1803 of FIG. 18. Other types of operational logic might also be used to implement the described procedure. - An
action 2701 comprises displaying first-level choices in a first-level menu, wherein the first-level choices are selectable by touching corresponding locations of a primary band of a hierarchical touchpad such as described above. In some embodiments, the first-level menu has a graphical shape like that of the primary band of the hierarchical touchpad. In other embodiments, the first-level menu may be arranged and shaped differently than the primary band of the hierarchical touchpad. - An
action 2702 comprises accepting user selection of a first-level menu choice. A particular choice may be selected by touching the corresponding position on the primary band of the hierarchical touchpad, or by touching and releasing the corresponding position. - An
action 2703, performed in response to the user selecting a first-level choice from the first-level menu, comprises displaying second-level menu choices in a second-level menu, wherein the second-level choices are selectable by touching a secondary band of a hierarchical touchpad. In one embodiment, the second-level menu may be displayed initially, upon initial display of the first-level menu. In other embodiments, the second-level menu may be initially hidden, and may be made visible only upon selection of a particular choice from the first-level menu. In either embodiment, the second-level choices of the second-level menu may vary depending on the selected first-level choice. In some embodiments, the second-level menu has a graphical shape like that of the secondary band of the hierarchical touchpad. In other embodiments, the second-level menu may be arranged and shaped differently than the secondary band of the hierarchical touchpad. - An
action 2704 comprises accepting user selection of a second-level menu choice. In some embodiments, a particular second-level choice may be selected by touching a corresponding position on the secondary band of the hierarchical touchpad, or by touching and releasing the corresponding position. In some embodiments, touching the secondary band of the hierarchical touchpad in a circular motion may scroll choices through a highlighted cursor, or may scroll a cursor through a list of choices. - Although certain of the above embodiments are described as having two menu levels, other embodiments may have additional menu levels. This is described by
actions 2705 and 2706. -
Action 2705, performed in response to the user selecting a second-level choice from the second-level menu, comprises displaying third-level menu choices in a third-level menu, wherein the third-level choices are selectable by touching a third band of a hierarchical touchpad. In one embodiment, the third-level menu may be displayed initially, upon initial display of the first-level and second-level menus. In another embodiment, the third-level menu may be initially hidden, and may be made visible only upon selection of a particular choice from the second-level menu. In either embodiment, the third-level choices of the third-level menu may vary depending on the selected second-level choice. In some embodiments, the third-level menu has a graphical shape like that of a third band of the hierarchical touchpad. In other embodiments, the third-level menu may be arranged and shaped differently than the third band of the hierarchical touchpad. - An
action 2706 comprises accepting user selection of a third-level menu choice. In some embodiments, a particular third-level choice may be selected by touching a corresponding position on the third band of the hierarchical touchpad, or by touching and releasing the corresponding position. In some embodiments, touching the third band of the hierarchical touchpad in a circular motion may scroll choices through a highlighted cursor, or may scroll a cursor through a list of choices. - As illustrated by the various embodiments, the hierarchical touchpad can be used in various different ways, with various different types of graphical menus that are not limited to the examples shown above.
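Actions 2701 through 2706 amount to descending one level of a menu tree per band selection. A minimal sketch, assuming a nested-dictionary menu definition (the choice names here are invented for illustration, not taken from the patent):

```python
# Hypothetical three-level menu tree: one band picks a category, the next
# picks an item within it, and a third picks a characteristic of that item.
MENU_TREE = {
    "Tools": {
        "Pencil": {"Thin": {}, "Thick": {}},
        "Brush": {"Red": {}, "Blue": {}},
    },
}

class HierarchicalMenu:
    def __init__(self, tree):
        self.level = tree  # choices currently displayed (action 2701)
        self.path = []     # selections made so far

    def select(self, choice):
        """Accept a band selection (actions 2702/2704/2706) and return the
        dependent choices to display at the next level (actions 2703/2705)."""
        self.level = self.level[choice]  # raises KeyError on an invalid choice
        self.path.append(choice)
        return sorted(self.level)
```

Each touch band would call `select()` with the choice corresponding to the touched position, and the returned list would be rendered as the next menu level.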
- Nested, multi-level, or multi-region touchpads such as described above can be incorporated in many different types of devices, including computer input devices like keyboards, computer mice, handheld computers such as PDAs and tablet PCs, computerized devices such as cameras and media players, vehicle controls such as steering wheels, and so forth.
-
FIG. 28 shows an example of a computer input device for use with a graphical user interface such as the graphical user interface discussed with reference to FIG. 18. A device such as this allows a user to manipulate an element such as a cursor or pointer on the graphical user interface. Devices falling into this category include mice, trackballs, joysticks, gyroscopic controllers, and similar devices. Generally, these devices include at least one control mechanism that a user can move or interact with to control the position of an on-screen cursor, pointer, or tool, or to otherwise control or influence some element of the displayed graphical user interface. - The example of
FIG. 28 comprises a computer mouse 2800 having a body 2801 designed to be grasped by the hand of a user to move or slide across a flat surface 2802. In many usage scenarios, movement of the mouse over the flat surface produces a similar movement of a cursor or pointer across the associated graphical user interface. Movement of mouse 2800 is detected by a mechanical roller or other type of sensor in the bottom of mouse 2800. Mouse 2800 can communicate with an associated computer device via an electrical cord or using common wireless communication technologies. Mice and mouse-like devices can be implemented in various ways, using various shapes and motion-sensing technologies. -
Computer mouse 2800 may have buttons or other controls 2803 that interact with an associated computer device to perform various functions. In addition, computer mouse 2800 has a multi-region touchpad 2804 similar to the touchpad embodiments described above. In this embodiment, multi-region touchpad 2804 is positioned on the top of body 2801, between buttons 2803. It could alternatively be positioned on the side of body 2801, or in other convenient positions. - When used with
mouse 2800, multi-region touchpad 2804 can provide an additional way for a user to interact with a computer device. For example, multi-region touchpad 2804 can be used in conjunction with a hierarchical menu, as described above, to perform operations with respect to an object or element indicated by the current position of a cursor. Multi-region touchpad 2804 can also be used to select a current tool and/or tool characteristic, such as selecting a type of paintbrush and a color for the paintbrush, or selecting a line type and width for a line-drawing tool. Rather than scrolling, the locations around the touch bands of multi-region touchpad 2804 can be associated with particular menu choices, so that a user becomes accustomed to the locations of those choices over time. For example, a user may learn that pressing the top of the inner touch band selects the “thin” paintbrush tool, and subsequently touching the left of the outer touch band sets the paintbrush tool to use “red” as its color. In some embodiments, users may be given the ability to configure multi-region touchpad 2804 themselves, so that frequently used commands or menu choices can be made easily available. In some embodiments, one of the touch bands may correspond to a continuous range of selections. For example, the outer touch band may correspond to a continuous gamut of colors as on a color wheel. A combination of three touch bands might be used in this fashion to specify hue, saturation, and brightness, or “red”, “green”, and “blue” components of a color. Many different control schemes might be implemented with touchpad 2804 and mouse 2800, not limited to the examples described here. Furthermore, different applications might implement different control schemes, even when running on the same computer and utilizing the same mouse 2800. -
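The continuous-range idea, with three bands specifying hue, saturation, and brightness, can be sketched with the standard library's colorsys module. This is an illustrative mapping, not the patent's implementation; it assumes each band reports a touch position as a fraction in [0, 1) around the band.

```python
import colorsys

def bands_to_rgb(hue_frac, sat_frac, bright_frac):
    """Combine three band positions (fractions around each band) into RGB."""
    # hue wraps around the band like a color wheel; the other two are clamped
    sat = min(max(sat_frac, 0.0), 1.0)
    val = min(max(bright_frac, 0.0), 1.0)
    return colorsys.hsv_to_rgb(hue_frac % 1.0, sat, val)
```

Because the hue band wraps, a full circuit of the band returns to the same color, matching the color-wheel behavior described above.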
FIG. 29 shows another example of a computer input device. In this case, the computer input device comprises a digitizer pad 2900 and an associated stylus 2901. A digitizer pad is commonly used in conjunction with a graphical user interface to control placement and operation of an on-screen tool, cursor, pointer, or other element. Digitizer pad 2900 has a two-dimensional surface 2902 with an active digitizer configured to sense placement of stylus 2901 when its tip 2903 touches surface 2902. Stylus 2901 is held in the hand of a user like a pencil, and can be moved across surface 2902. In many usage scenarios, movement of the stylus over surface 2902 produces a similar movement of a tool, cursor, or pointer across the associated graphical user interface. Digitizer pad 2900 can communicate with an associated computer device via an electrical cord or using common wireless communication technologies. Digitizer pads and similar devices can be implemented in various ways, using various configurations and stylus detection technologies. Also, some digitizer pads may sense finger placement on surface 2902, in addition to or instead of stylus placement. -
Digitizer pad 2900 has a multi-region touchpad 2904 similar to the touchpad embodiments described above. In this embodiment, multi-region touchpad 2904 is positioned at a corner of digitizer pad 2900. It can be touched by the user's finger or by stylus 2901 to activate menu choices or selections. It could alternatively be positioned elsewhere relative to surface 2902, or within surface 2902. - When used with
digitizer pad 2900, multi-region touchpad 2904 can provide another way for a user to interact with a computer device. For example, multi-region touchpad 2904 can be used in conjunction with menus, as described above, to perform operations with respect to an object or element indicated by the current position of a cursor. Multi-region touchpad 2904 can also be used to select a current tool and/or tool characteristic. This might be particularly useful in conjunction with a drafting or painting program, in which the stylus is often used as a brush or other drawing tool. In these situations, multi-region touchpad 2904 can be used in a manner similar to a paintwell or inkwell, for choosing current tools and tool characteristics. The tactile delineations of multi-region touchpad 2904 can make it easier for a user to touch the touchpad in a particular location without having to look directly at the touchpad. Furthermore, although touchpad 2904 is shown as being circular, other shapes might be utilized to provide better tactile feedback and to allow easier identification of a “home” or default position. - As described above, the touch bands of
multi-region touchpad 2904 can be used in combination and in a hierarchically contextual manner to select and specify tools and characteristics. For example, a tool might be selected by touching a location on an inner touch band. This might result in a dependent set of options being available on an outer touch band. Thus, selecting a “pencil” tool using an inner touch band might result in “pencil width” options being made available through an outer touch band. As already described, currently available menu choices for any active touch band can be indicated via an on-screen menu structure, which in some cases can be shaped similarly to the touch band whose menu choices are being displayed. - As in the previous embodiment, users may be given the ability to configure
multi-region touchpad 2904 themselves, so that frequently used commands or menu choices can be made easily available. In addition, one or more of the touch bands may correspond to continuous ranges of selections or choices, rather than to discrete menu choices. -
FIG. 30 shows another example of a device that can be used with a multi-region touchpad. This example is a tablet or other handheld computer 3000 having a touch-screen display 3001. Some tablet computers may have functionality similar to a general-purpose desktop or laptop computer, but in a relatively thin form factor. Other tablet computers may have more limited functionality, and be designed for specific tasks. - Many tablet computers do not have a full hardware alphanumeric keyboard, relying instead on virtual keyboards displayed on touch-
screen display 3001. A limited number of buttons or keys 3002 may be used for basic functionality. - Touch-
screen display 3001 may be sensitive to touch, using capacitive or resistive technology, to detect the touch of a finger. Alternatively, or in addition, touch-screen display 3001 may employ an active digitizer for use in conjunction with a stylus. Other embodiments may use other types of user input that do not necessarily include touch input. - A hierarchical or
multi-region touchpad 3003 may be positioned on the front of tablet computer 3000 to provide one means of user input. Multi-region touchpad 3003 can be used to perform various contextually-dependent operations in accordance with the different usage scenarios described above. A menu 3004, corresponding in shape to multi-region touchpad 3003, may be displayed on touch-screen display 3001 to indicate current choices available to a user via multi-region touchpad 3003. -
Multi-region touchpad 3003 can be receptive to finger touch, to stylus touch, or both. Although touchpad 3003 is shown as being circular, other shapes might be utilized to provide better tactile feedback and to allow easier identification of a “home” or default position. -
FIG. 31 shows another example: a remote control 3100 such as might be used to control a television, stereo system, or other system. A remote control such as this typically uses radio or infrared frequencies to communicate with different controlled devices and includes a number of physical buttons 3101 or other control elements that a user presses to designate different actions. Although not shown in this embodiment, some remote controls might include a small display, and some might include a touch-sensitive display that might replace many of buttons 3101. - A
multi-region touchpad 3102 can be positioned on the operating surface of remote control 3100 for operation by a user's thumb or other finger. Multi-region touchpad 3102 can be used in conjunction with a menu structure that is displayed on a graphical or video component of the system being controlled, such as on a television screen. Alternatively, multi-region touchpad 3102 might be used in conjunction with a touch-screen that is part of the remote control itself. Note that touchpad 3102 can also be placed on the rear of remote control 3100. -
FIG. 32 shows a game controller 3200 as yet another example of a device that can utilize a multi-region touchpad. A game controller typically operates in conjunction with dedicated video game consoles or other computerized gaming devices to interface with a user and to allow the user to control aspects of game play that are presented on some form of video display. In this example, the controller has a number of buttons or other control elements 3201. In addition, a multi-region touchpad 3202 can be positioned at a convenient location on the top, side, or bottom of controller 3200 for use in accordance with the usage scenarios described above. -
Touchpad 3202 can be used in conjunction with menus displayed or overlaid on the video game display to select options and to allow quick, on-the-fly selection of various aspects of game play. For example, an inner touch band might be used to allow the player to select from different items of inventory, while an outer touch band is used to allow the player to select different weapons. As another example, an inner touch band might be used to specify an action or verb to perform in the game, while a dependent outer touch band is used to specify items that might be available as the object of that action or verb. A combination of three touch bands might be used to indicate a tool, an action to perform with the tool, and an object of the action. More than one multi-region touchpad might be used on a single controller, allowing convenient access to a greater number of game functions. -
FIG. 33 shows a digital camera 3300 as an example of a device that can utilize a multi-region touchpad. Digital camera 3300 has a lens 3301 and an internal image sensor (not shown), as well as various mechanical, electrical, and optical controls that are used to capture still and/or moving images. The illustrated example also has an LCD (liquid-crystal display) 3302 or other flat-screen display that is used for previewing images, for displaying captured images, and for user interaction to set various operational parameters. - A
multi-region touchpad 3303 is positioned on the back of camera 3300, adjacent display 3302. Touchpad 3303 can be used in conjunction with display 3302 in accordance with many of the techniques described above to set operational parameters of the camera, including exposure parameters. As an example, an inner touch band of touchpad 3303 can be used to set exposure length, while an outer touch band is used to select aperture. Another touch band might be used to select equivalent film speed. -
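The exposure-control scheme can be sketched as follows, assuming each band reports a touch position as a fraction in [0, 1) around the band. The value tables are standard full-stop camera settings chosen for illustration; they are not taken from the patent.

```python
# Illustrative full-stop exposure tables; positions around each band index into them.
SHUTTER = ["1/1000", "1/500", "1/250", "1/125", "1/60", "1/30"]  # inner band
APERTURE = ["f/2.8", "f/4", "f/5.6", "f/8", "f/11", "f/16"]      # outer band
ISO = ["100", "200", "400", "800"]                               # third band

def band_value(table, fraction):
    """Map a touch position (fraction in [0, 1) around a band) to a setting."""
    return table[int(fraction * len(table)) % len(table)]
```

Dividing each band into as many arcs as its table has entries lets the user land on a specific stop by touch position alone, in keeping with the fixed-location menu choices described earlier.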
Touchpad 3303 might be dedicated to controlling specific camera/exposure parameters, or might be used in a contextually dependent fashion to control many different parameters. Furthermore, touchpad 3303 might be used without an associated display, and might be positioned differently, such as on the front of camera 3300. -
FIG. 34 shows another example of a control device that might incorporate a multi-region touchpad such as described herein. This example comprises an automotive steering wheel 3400 such as commonly used in automobiles and other vehicles. Steering wheel 3400 includes a hub 3401, one or more spokes 3402, and a rim 3403 that is grasped by a driver to rotate the steering wheel. A multi-region touchpad 3404 is positioned on hub 3401 or one of the spokes 3402, adjacent rim 3403, so that it can be accessed by the thumb or finger of the driver. Note that although touchpad 3404 is shown on the side of the steering wheel facing the driver, it might also be located on the side of the steering wheel facing away from the driver. The touchpad 3404 can also be located in different locations within the vehicle, such as on various other primary controls that are grasped by a driver to control different aspects of vehicle operation and navigation. For example, the touchpad 3404 might be positioned on the top or end of a shift lever or some other physical vehicle control that is operated by a hand. -
Touchpad 3404 can be used with various vehicle control systems to set various operational parameters of a vehicle or to interact with on-board systems such as entertainment systems or route guidance systems. Menus can be presented on a so-called “heads-up” display, as transparent images on the vehicle windshield, within the driver's normal line of sight. This allows the driver to interact with vehicle subsystems without looking away from the road. Additionally, the position and tactile nature of touchpad 3404 allow the driver to navigate various menus without moving a hand from the steering wheel. Touch bands corresponding to different menu levels are easily identified by tactile differentiation, while positions within each level are easily located by their correspondence in shape with displayed menu structures. - The embodiments described above may be implemented using various different means of operational logic, including the architecture illustrated by
FIG. 17. - Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.
- Further, it should be noted that the system configurations illustrated above are purely exemplary of systems in which the implementations may be provided, and the implementations are not limited to the particular hardware configurations illustrated. In the description, numerous details are set forth for purposes of explanation in order to provide a thorough understanding of the disclosure. However, it will be apparent to one skilled in the art that not all of these specific details are required.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/851,421 US20110292268A1 (en) | 2010-05-26 | 2010-08-05 | Multi-region touchpad device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/788,239 US20110291946A1 (en) | 2010-05-26 | 2010-05-26 | Touchpad interaction |
US12/851,421 US20110292268A1 (en) | 2010-05-26 | 2010-08-05 | Multi-region touchpad device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/788,239 Continuation-In-Part US20110291946A1 (en) | 2010-05-26 | 2010-05-26 | Touchpad interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110292268A1 true US20110292268A1 (en) | 2011-12-01 |
Family
ID=45021820
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/851,421 Abandoned US20110292268A1 (en) | 2010-05-26 | 2010-08-05 | Multi-region touchpad device |
Country Status (1)
Country | Link |
---|---|
US (1) | US20110292268A1 (en) |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100026473A1 (en) * | 2008-07-25 | 2010-02-04 | Phoenix Contact Gmbh & Co. Kg | Touch-sensitive front panel for a touch screen |
US20100300235A1 (en) * | 2007-05-11 | 2010-12-02 | Continental Automotive Gmbh | Shift Lever Head |
US20110271193A1 (en) * | 2008-08-27 | 2011-11-03 | Sony Corporation | Playback apparatus, playback method and program |
US20110291946A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Touchpad interaction |
US20120019999A1 (en) * | 2008-06-27 | 2012-01-26 | Nokia Corporation | Touchpad |
US20120086846A1 (en) * | 2010-10-06 | 2012-04-12 | National Cheng Kung University | Interface system of an image-capturing device |
US20130100034A1 (en) * | 2011-10-19 | 2013-04-25 | Matthew Nicholas Papakipos | Mobile Device with Concave Shaped Back Side |
US20130117715A1 (en) * | 2011-11-08 | 2013-05-09 | Microsoft Corporation | User interface indirect interaction |
US20130215313A1 (en) * | 2012-02-21 | 2013-08-22 | Kyocera Corporation | Mobile terminal and imaging key control method |
US20140176454A1 (en) * | 2012-12-20 | 2014-06-26 | Institute For Information Industry | Touch control method and handheld device utilizing the same |
US20150074614A1 (en) * | 2012-01-25 | 2015-03-12 | Thomson Licensing | Directional control using a touch sensitive device |
US20150227163A1 (en) * | 2012-08-30 | 2015-08-13 | Delphi Technologies, Inc. | Control module comprising a touch-sensitive surface |
US20150242102A1 (en) * | 2012-10-02 | 2015-08-27 | Denso Corporation | Manipulating apparatus |
US20150293616A1 (en) * | 2014-04-09 | 2015-10-15 | Wei-Chih Cheng | Operating system with shortcut touch panel having shortcut function |
US20150323992A1 (en) * | 2014-05-09 | 2015-11-12 | Microsoft Corporation | Sculpted displays for clickable user interactions |
US20160110206A1 (en) * | 2014-10-17 | 2016-04-21 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method thereof |
US20160124532A1 (en) * | 2013-07-22 | 2016-05-05 | Hewlett-Packard Development Company, L.P. | Multi-Region Touchpad |
EP2962902A4 (en) * | 2013-02-28 | 2016-10-05 | Nippon Seiki Co Ltd | Vehicular operating device |
US20170010804A1 (en) * | 2015-07-10 | 2017-01-12 | Hyundai Motor Company | Vehicle and control method for the vehicle |
EP3222471A4 (en) * | 2014-11-19 | 2017-11-29 | Panasonic Intellectual Property Management Co., Ltd. | Input device and input method therefor |
DE102016014507A1 (en) * | 2016-12-07 | 2018-06-07 | Leopold Kostal Gmbh & Co. Kg | Operating arrangement for a motor vehicle and method for selecting a list item from a list displayed on a display |
US20180273050A1 (en) * | 2014-06-25 | 2018-09-27 | Tomtom International B.V. | Vehicular human machine interfaces |
EP3338172A4 (en) * | 2015-08-20 | 2019-07-03 | Inpris Innovative Products Ltd | Device, system, and methods for entering commands or characters using a touch screen |
US10764502B2 (en) * | 2017-11-08 | 2020-09-01 | Canon Kabushiki Kaisha | Imaging apparatus |
WO2020223958A1 (en) | 2019-05-09 | 2020-11-12 | Microsoft Technology Licensing, Llc | Quick menu selection device and method |
US11561639B2 (en) * | 2017-11-13 | 2023-01-24 | Samsung Electronics Co., Ltd. | Display device and control method for performing operations relating to user input and display state |
US20230077557A1 (en) * | 2021-09-10 | 2023-03-16 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Operation device |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030023353A1 (en) * | 2000-02-18 | 2003-01-30 | Ziad Badarneh | Arrangement for a switch-equipped steering wheel |
US6765557B1 (en) * | 2000-04-10 | 2004-07-20 | Interlink Electronics, Inc. | Remote control having touch pad to screen mapping |
US20050052426A1 (en) * | 2003-09-08 | 2005-03-10 | Hagermoser E. Scott | Vehicle touch input device and methods of making same |
US20050190144A1 (en) * | 2004-02-26 | 2005-09-01 | Microsoft Corporation | Multi-modal navigation in a graphical user interface computing system |
US20060029451A1 (en) * | 2003-07-31 | 2006-02-09 | Microsoft Corporation | Dual navigation control computer keyboard |
US20070050597A1 (en) * | 2005-08-24 | 2007-03-01 | Nintendo Co., Ltd. | Game controller and game system |
US20070057922A1 (en) * | 2005-09-13 | 2007-03-15 | International Business Machines Corporation | Input having concentric touch pads |
US20070086764A1 (en) * | 2005-10-17 | 2007-04-19 | Konicek Jeffrey C | User-friendlier interfaces for a camera |
US20070097090A1 (en) * | 2005-10-31 | 2007-05-03 | Battles Amy E | Digital camera user interface |
US20090194343A1 (en) * | 2008-01-31 | 2009-08-06 | Fujifilm Corporation | Operation apparatus and electronic device equipped therewith |
WO2009155952A1 (en) * | 2008-06-27 | 2009-12-30 | Nokia Corporation | Touchpad |
US20110291946A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Touchpad interaction |
US20110291956A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Hierarchical touchpad interaction |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030023353A1 (en) * | 2000-02-18 | 2003-01-30 | Ziad Badarneh | Arrangement for a switch-equipped steering wheel |
US6765557B1 (en) * | 2000-04-10 | 2004-07-20 | Interlink Electronics, Inc. | Remote control having touch pad to screen mapping |
US20060029451A1 (en) * | 2003-07-31 | 2006-02-09 | Microsoft Corporation | Dual navigation control computer keyboard |
US20050052426A1 (en) * | 2003-09-08 | 2005-03-10 | Hagermoser E. Scott | Vehicle touch input device and methods of making same |
US20050190144A1 (en) * | 2004-02-26 | 2005-09-01 | Microsoft Corporation | Multi-modal navigation in a graphical user interface computing system |
US20070050597A1 (en) * | 2005-08-24 | 2007-03-01 | Nintendo Co., Ltd. | Game controller and game system |
US20070057922A1 (en) * | 2005-09-13 | 2007-03-15 | International Business Machines Corporation | Input having concentric touch pads |
US20070086764A1 (en) * | 2005-10-17 | 2007-04-19 | Konicek Jeffrey C | User-friendlier interfaces for a camera |
US20070097090A1 (en) * | 2005-10-31 | 2007-05-03 | Battles Amy E | Digital camera user interface |
US20090194343A1 (en) * | 2008-01-31 | 2009-08-06 | Fujifilm Corporation | Operation apparatus and electronic device equipped therewith |
WO2009155952A1 (en) * | 2008-06-27 | 2009-12-30 | Nokia Corporation | Touchpad |
US20120019999A1 (en) * | 2008-06-27 | 2012-01-26 | Nokia Corporation | Touchpad |
US20110291946A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Touchpad interaction |
US20110291956A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Hierarchical touchpad interaction |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100300235A1 (en) * | 2007-05-11 | 2010-12-02 | Continental Automotive Gmbh | Shift Lever Head |
US20120019999A1 (en) * | 2008-06-27 | 2012-01-26 | Nokia Corporation | Touchpad |
US8198990B2 (en) * | 2008-07-25 | 2012-06-12 | Phoenix Contact Gmbh & Co. Kg | Touch-sensitive front panel for a touch screen |
US20100026473A1 (en) * | 2008-07-25 | 2010-02-04 | Phoenix Contact Gmbh & Co. Kg | Touch-sensitive front panel for a touch screen |
US20110271193A1 (en) * | 2008-08-27 | 2011-11-03 | Sony Corporation | Playback apparatus, playback method and program |
US8294018B2 (en) * | 2008-08-27 | 2012-10-23 | Sony Corporation | Playback apparatus, playback method and program |
US20110291946A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | Touchpad interaction |
US20120086846A1 (en) * | 2010-10-06 | 2012-04-12 | National Cheng Kung University | Interface system of an image-capturing device |
US20130100034A1 (en) * | 2011-10-19 | 2013-04-25 | Matthew Nicholas Papakipos | Mobile Device with Concave Shaped Back Side |
US20130117715A1 (en) * | 2011-11-08 | 2013-05-09 | Microsoft Corporation | User interface indirect interaction |
US9594504B2 (en) * | 2011-11-08 | 2017-03-14 | Microsoft Technology Licensing, Llc | User interface indirect interaction |
US20150074614A1 (en) * | 2012-01-25 | 2015-03-12 | Thomson Licensing | Directional control using a touch sensitive device |
US20130215313A1 (en) * | 2012-02-21 | 2013-08-22 | Kyocera Corporation | Mobile terminal and imaging key control method |
US9001253B2 (en) * | 2012-02-21 | 2015-04-07 | Kyocera Corporation | Mobile terminal and imaging key control method for selecting an imaging parameter value |
US20150227163A1 (en) * | 2012-08-30 | 2015-08-13 | Delphi Technologies, Inc. | Control module comprising a touch-sensitive surface |
US9690320B2 (en) * | 2012-08-30 | 2017-06-27 | Delphi Technologies, Inc. | Control module comprising a touch-sensitive surface |
US20150242102A1 (en) * | 2012-10-02 | 2015-08-27 | Denso Corporation | Manipulating apparatus |
US20140176454A1 (en) * | 2012-12-20 | 2014-06-26 | Institute For Information Industry | Touch control method and handheld device utilizing the same |
EP2962902A4 (en) * | 2013-02-28 | 2016-10-05 | Nippon Seiki Co Ltd | Vehicular operating device |
US9886108B2 (en) * | 2013-07-22 | 2018-02-06 | Hewlett-Packard Development Company, L.P. | Multi-region touchpad |
US20160124532A1 (en) * | 2013-07-22 | 2016-05-05 | Hewlett-Packard Development Company, L.P. | Multi-Region Touchpad |
US9274620B2 (en) * | 2014-04-09 | 2016-03-01 | Wei-Chih Cheng | Operating system with shortcut touch panel having shortcut function |
US20150293616A1 (en) * | 2014-04-09 | 2015-10-15 | Wei-Chih Cheng | Operating system with shortcut touch panel having shortcut function |
US20150323992A1 (en) * | 2014-05-09 | 2015-11-12 | Microsoft Corporation | Sculpted displays for clickable user interactions |
US9841817B2 (en) * | 2014-05-09 | 2017-12-12 | Microsoft Technology Licensing, Llc | Sculpted displays for clickable user interactions |
US10654489B2 (en) * | 2014-06-25 | 2020-05-19 | Tomtom Global Content B.V. | Vehicular human machine interfaces |
US20180273050A1 (en) * | 2014-06-25 | 2018-09-27 | Tomtom International B.V. | Vehicular human machine interfaces |
US20160110206A1 (en) * | 2014-10-17 | 2016-04-21 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method thereof |
EP3222471A4 (en) * | 2014-11-19 | 2017-11-29 | Panasonic Intellectual Property Management Co., Ltd. | Input device and input method therefor |
US20170010804A1 (en) * | 2015-07-10 | 2017-01-12 | Hyundai Motor Company | Vehicle and control method for the vehicle |
CN106335368A (en) * | 2015-07-10 | 2017-01-18 | 现代自动车株式会社 | Vehicle and control method for the vehicle |
EP3338172A4 (en) * | 2015-08-20 | 2019-07-03 | Inpris Innovative Products Ltd | Device, system, and methods for entering commands or characters using a touch screen |
DE102016014507A1 (en) * | 2016-12-07 | 2018-06-07 | Leopold Kostal Gmbh & Co. Kg | Operating arrangement for a motor vehicle and method for selecting a list item from a list displayed on a display |
US10764502B2 (en) * | 2017-11-08 | 2020-09-01 | Canon Kabushiki Kaisha | Imaging apparatus |
US11561639B2 (en) * | 2017-11-13 | 2023-01-24 | Samsung Electronics Co., Ltd. | Display device and control method for performing operations relating to user input and display state |
WO2020223958A1 (en) | 2019-05-09 | 2020-11-12 | Microsoft Technology Licensing, Llc | Quick menu selection device and method |
EP3966672A4 (en) * | 2019-05-09 | 2022-12-14 | Microsoft Technology Licensing, LLC | Quick menu selection device and method |
US20230077557A1 (en) * | 2021-09-10 | 2023-03-16 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Operation device |
Similar Documents
Publication | Title |
---|---|
US20110292268A1 (en) | Multi-region touchpad device | |
US20110291946A1 (en) | Touchpad interaction | |
US10353570B1 (en) | Thumb touch interface | |
US8775966B2 (en) | Electronic device and method with dual mode rear TouchPad | |
US8739053B2 (en) | Electronic device capable of transferring object between two display units and controlling method thereof | |
US20050162402A1 (en) | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback | |
TWI633461B (en) | Computer-implemented method,non-transitory computer-readable storage medium,and electronic device for manipulating user interface objects | |
US9766739B2 (en) | Method and apparatus for constructing a home screen in a terminal having a touch screen | |
AU2011243470B2 (en) | Method for providing Graphical User Interface and mobile device adapted thereto | |
TWI381305B (en) | Method for displaying and operating user interface and electronic device | |
US8638315B2 (en) | Virtual touch screen system | |
KR101038459B1 (en) | Text selection using a touch sensitive screen of a handheld mobile communication device | |
EP2726966B1 (en) | An apparatus and associated methods related to touch sensitive displays | |
US20140055384A1 (en) | Touch panel and associated display method | |
US20160062467A1 (en) | Touch screen control | |
US20100037183A1 (en) | Display Apparatus, Display Method, and Program | |
US20140053102A1 (en) | Terminal and method for providing user interface | |
US20090153527A1 (en) | User interface for selecting and controlling plurality of parameters and method for selecting and controlling plurality of parameters | |
KR101983290B1 (en) | Method and apparatus for displaying a ketpad using a variety of gestures | |
WO2014100953A1 (en) | An apparatus and associated methods | |
US20110291956A1 (en) | Hierarchical touchpad interaction | |
KR20110074663A (en) | Information processing apparatus and control method therefor | |
KR20150092672A (en) | Apparatus and Method for displaying plural windows | |
KR20140033839A (en) | Method??for user's??interface using one hand in terminal having touchscreen and device thereof | |
KR20110085189A (en) | Operation method of personal portable device having touch panel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: T-MOBILE USA, INC., WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MANN, JONATHAN L.;EWING, RICHARD ALAN, JR;KUNCL, PARKER RALPH;AND OTHERS;SIGNING DATES FROM 20100729 TO 20100803;REEL/FRAME:024798/0044
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: DEUTSCHE TELEKOM AG, GERMANY
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:T-MOBILE USA, INC.;REEL/FRAME:041225/0910
Effective date: 20161229
|
AS | Assignment |
Owner name: T-MOBILE USA, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: IBSV LLC, WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE TELEKOM AG;REEL/FRAME:052969/0381
Effective date: 20200401

Owner name: PUSHSPRING, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: METROPCS WIRELESS, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: LAYER3 TV, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: T-MOBILE USA, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE TELEKOM AG;REEL/FRAME:052969/0381
Effective date: 20200401

Owner name: METROPCS COMMUNICATIONS, INC., WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: IBSV LLC, WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401

Owner name: T-MOBILE SUBSIDIARY IV CORPORATION, WASHINGTON
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH;REEL/FRAME:052969/0314
Effective date: 20200401