US20160041729A1 - System and method for generating a user interface for text and item selection - Google Patents

System and method for generating a user interface for text and item selection

Info

Publication number
US20160041729A1
Authority
US
United States
Prior art keywords
selection
pointing device
item
selection set
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/887,161
Inventor
Nicholas Daniel Doerring
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OpenTV Inc
Original Assignee
OpenTV Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OpenTV Inc filed Critical OpenTV Inc
Priority to US14/887,161
Assigned to OPENTV, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Doerring, Nicholas D.
Publication of US20160041729A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 - Character input methods
    • G06F3/0236 - Character input methods using selection techniques to select from displayed items
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 - Indexing scheme relating to G06F3/048
    • G06F2203/04806 - Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • This disclosure relates to methods and systems supporting computing and data processing systems. More particularly, a system and method for generating a user interface for text and item selection is described.
  • Conventional systems can display a two-dimensional grid of alphanumeric characters from which a user can make selections. These conventional systems allow a user to manipulate a joystick or game controller to navigate right, left, up, and down within the grid to identify and select a desired character. Other conventional systems provide a displayed on-screen replica of a standard two-dimensional keyboard that can be navigated in a similar two-dimensional manner. However, these conventional user interfaces for text input can be slow and awkward to use.
  • U.S. Pat. No. 6,593,913 describes a method and system for selecting a character with a user input device comprising a plurality of buttons.
  • a first plurality of characters is displayed on a display device in a pattern corresponding to a pattern of a plurality of buttons of a user input device, and a character from the first plurality of characters is selected in response to actuation of one of the plurality of buttons.
  • the number of characters displayed on the display device for selection by the user input device is less than or equal to the number of buttons in the plurality of buttons. In this way, any of the characters displayed on the display device for selection by the user input device can be selected by actuation of a single one of the plurality of buttons.
  • U.S. Pat. No. 5,543,818 describes a method and apparatus for entering alphanumeric or other text to a computer system using an input device having a small number of keys.
  • the computer system includes a processor programmed to display a character selection menu (including displayed groups of characters), to move a displayed cursor from one group to another in response to actuation of at least one cursor movement key on the input device, and to select a character within a group in response to actuation of one of at least two selection keys on the input device.
  • the system reduces the maximum number of keystrokes required conventionally to select a character from a character set, and enables character selection from a larger set of displayed characters using no more keystrokes than required conventionally to select the same character from a smaller set.
  • U.S. Pat. No. 6,501,464 describes a graphical user interface in the form of a transparent keyboard that may be positioned over an existing computer display. The user may input textual data through the keyboard by selecting keys in the transparent keyboard display. The text entry may then appear on the computer display in non-transparent or conventional format.
  • FIGS. 1-5 illustrate a particular example embodiment of a user interface for computer users, electronic game players, or television (TV) users in which an item selection set is arranged in a linear orientation.
  • FIGS. 6-10 illustrate a particular example embodiment of a user interface for computer users or electronic game players in which an item selection set is arranged in a circular orientation.
  • FIG. 11 illustrates a particular example embodiment of a user interface for computer users, electronic game players or TV users in which a selection set includes items representing a variety of physical or logical entities.
  • FIGS. 12 and 13 illustrate processing flow diagrams for example embodiments.
  • FIG. 14 is a block diagram of a machine in the example form of a computer system within which instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • FIG. 1 illustrates a particular example embodiment of a user interface for computer users, electronic game players, or TV users.
  • a user interface 100 is shown as may be displayed within a conventional display screen window on a computer display, electronic game system, or TV screen of a user.
  • the user system may be connected or connectable to a pointing device, thumbstick device, mouse, TV remote device, a game controller device, spatial movement detection devices, such as the Wii system sold by Nintendo of America, Inc., or any other hardware or software device/system, which can signal movement in a two-dimensional space via a conventional hardware interface (hereinafter called a pointing device).
  • the user system can use conventional hardware and operating system software to support the novel user interface 100 described herein.
  • the user interface 100 is created using the functionality of various embodiments described herein.
  • the user interface 100 in an example embodiment is comprised of a display region 102 (e.g., a window, a data entry box, etc.) in which a selection set 104 is displayed.
  • the selection set 104 is comprised of a set of items, each of which may represent an individually selectable option.
  • the items in selection set 104 are letters of the English language alphabet.
  • the items can be a set of selectable alphanumeric characters, alphanumeric characters plus special characters, alphanumeric characters in a language other than English, mathematical symbols, geometric shapes, icons, logos, drawing primitives, objects, images, device objects, or any of a variety of other types of selectable items.
  • the selection set 104 is configured in a linear pattern of evenly-spaced items extending to the borders of display region 102 .
  • a selection vector 106 is shown in the center of the display region 102 . The selection vector 106 is used to mark a location in display region 102 at which one of the items in selection set 104 is selectable.
  • if an item in selection set 104 overlays (or is in proximity to) selection vector 106, that item can be selected using selection button 112.
  • the selectable item overlaying (or in proximity to) selection vector 106 is reproduced as selectable item 108 as shown in FIG. 1 .
  • the letter, ‘N’ of selection set 104 is in proximity to selection vector 106 . Therefore, the letter, ‘N’ is reproduced as selectable item 108 as shown in FIG. 1 .
  • the user can activate selection button 112 .
  • selection button 112 can be any form of signal generation device, such as a hardware button or softkey located anywhere on a remote device, a gesture detection device, audible command detection device, or the like.
  • the selected item is saved in an item string created and saved in display area 116 .
  • the letter, ‘N’ of selection set 104 has been selected by user activation of selection button 112 and the selected item (i.e., letter, ‘N’) has been saved in the current position of the item string in display area 116.
  • the next position in the item string of display area 116 is represented by the underscore character shown in display region 116 of FIG. 1 .
  • a backspace button or item selection can be provided to clear a previous entry and move the string insertion point (i.e., an underscore character) backward one position.
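  • The item-string assembly just described (copy the selectable item to the insertion point, advance the insertion point, and let a backspace step back one position) can be summarized in a short sketch. The class and method names below are illustrative only and are not taken from the disclosure.

```typescript
// Minimal sketch of the item-string assembly described above.
// The class and method names (ItemString, select, backspace) are illustrative only.
class ItemString {
  private items: string[] = [];

  // Copy the currently selectable item into the insertion point and advance
  // the insertion point one position.
  select(selectableItem: string): void {
    this.items.push(selectableItem);
  }

  // Clear the previous entry and move the insertion point back one position.
  backspace(): void {
    this.items.pop();
  }

  // Render the assembled string with an underscore marking the insertion point.
  display(): string {
    return this.items.join("") + "_";
  }
}

const entry = new ItemString();
entry.select("N");            // user pressed the selection button while 'N' was selectable
console.log(entry.display()); // "N_"
```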
  • FIG. 1 also illustrates a pointing device 111 , which can be deflected (or used to point) in two dimensions about a center point of a motion area 110 of the pointing device 111 as shown in FIG. 1 .
  • the pointing device 111 is shown in FIG. 1 at its neutral position (e.g., undeflected position, center position, or home position) at the center point of the pointing device motion area 110 .
  • while the pointing device 111 remains at its neutral position, the selection set 104 remains static at its current position.
  • the current position is as shown with the letter, ‘N’ of selection set 104 in proximity to selection vector 106 .
  • the deflection of the pointing device 111 causes a corresponding linear motion of the selection set 104 in relation to the selection vector 106 .
  • the pointing device 111 can signal movement in two dimensions simultaneously or may indicate separate movements in just one direction.
  • other types of pointing devices may indicate a vector movement relative to a point (e.g., the center of the motion area 110 ). Any of these pointing devices can be used with the various embodiments described herein.
  • the user interface 100 is shown after a user has selected an item from the selection set 104 by activation of selection button 112 while the desired item is positioned in proximity to selection vector 106 .
  • the user has selected the letter, ‘N’ of selection set 104 and the selected item has been reproduced in the display region 116 .
  • the user has deflected the pointing device 111 to the left (the 270° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 2 .
  • the selection set 104 is set in a relatively slow fluid motion to the left as shown by the arrow 113 in FIG. 2 .
  • the selection vector 106 remains stationary.
  • the items in selection set 104 pass over the selection vector 106 as long as the pointing device remains deflected to the left.
  • as each item of selection set 104 passes over the selection vector 106, the corresponding item is reproduced as selectable item 108.
  • the appearance of selectable item 108 indicates to the user that the selectable item 108 can be selected by the user and copied to the display area 116 , if the selection button 112 is activated at the moment the desired item is over the selection vector 106 and displayed as selectable item 108 .
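  • One way to realize the "in proximity to the selection vector" test described above is to track how far the selection set has been shifted and round to the nearest item position. The following is a minimal sketch under assumed names and units (pixels); the disclosure does not prescribe a particular formula.

```typescript
// Sketch: given how far the selection set has been shifted and the spacing
// between items, find the item nearest the fixed selection vector at the
// center of the display region. Names and units (pixels) are illustrative.
function selectableIndex(
  items: string[],
  offsetPx: number,      // net shift of the selection set toward the left (+) or right (-)
  itemSpacingPx: number  // distance between adjacent items
): number {
  // The item whose current position is nearest the stationary selection vector,
  // assuming the item at index 0 started under the vector.
  const nearest = Math.round(offsetPx / itemSpacingPx);
  return Math.min(items.length - 1, Math.max(0, nearest)); // clamp to the set
}

const alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("");
// After the set has drifted 13 item-widths to the left, 'N' is the selectable item.
console.log(alphabet[selectableIndex(alphabet, 13 * 40, 40)]); // "N"
```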
  • the speed at which the selection set moves can be controlled by the degree of deflection of the pointing device 111 to the left (the 270° position) or to the right (the 90° position). For example, if the pointing device 111 is only slightly deflected to the left or right, the selection set 104 can move at a slow rate in the corresponding direction. If the pointing device 111 is fully deflected to the left or right, the selection set 104 can move at a fast rate in the corresponding direction.
  • the speed of movement of the selection set for a particular level of deflection of the pointing device 111 can be pre-configured in a particular embodiment. Thus, in the manner shown in the example of FIG. 2, a user can move the selection set 104 to the left with left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106.
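  • A minimal sketch of this deflection-to-speed mapping, assuming the horizontal deflection is normalized to [-1, 1]; the speed constant and frame interval are illustrative tuning values, not values from the disclosure.

```typescript
// Sketch: horizontal deflection of the pointing device sets the scroll speed of
// the linear selection set; the offset is integrated every frame.
// maxSpeedPxPerSec is an assumed tuning constant, not a value from the disclosure.
const maxSpeedPxPerSec = 400;

function scrollStep(
  currentOffsetPx: number,
  deflectionX: number,  // -1 (full left) .. 0 (neutral) .. +1 (full right)
  dtSeconds: number     // time since the previous update
): number {
  // Slight deflection -> slow motion; full deflection -> fast motion,
  // always in the direction of the deflection.
  const velocityPxPerSec = deflectionX * maxSpeedPxPerSec;
  return currentOffsetPx + velocityPxPerSec * dtSeconds;
}

// Example: holding the device half-way to the left for one second shifts the set ~200 px.
let offset = 0;
for (let i = 0; i < 60; i++) offset = scrollStep(offset, -0.5, 1 / 60);
console.log(Math.round(offset)); // -200
```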
  • the user has moved the selection set 104 to the left with left deflection of the pointing device 111 , positioned a desired item (e.g., the letter ‘R’) in the selection set 104 over the selection vector 106 , and activated selection button 112 .
  • as a result, the selected item (e.g., the letter ‘R’) has been copied to the insertion point of display area 116, and the string insertion point (i.e., an underscore character) has been advanced to the next position in the item string.
  • the user has deflected the pointing device 111 to the right (the 90° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 3 .
  • the selection set 104 is set in a relatively slow fluid motion to the right as shown by the arrow 115 in FIG. 3 .
  • the selection vector 106 remains stationary.
  • the items in selection set 104 pass over the selection vector 106 as long as the pointing device remains deflected to the right.
  • the corresponding item is reproduced as selectable item 108 as described above.
  • the speed at which the selection set moves to the right can be controlled by the degree of deflection of the pointing device 111 to the right.
  • a user can move the selection set 104 to the right with right deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106 .
  • the user has moved the selection set 104 to the right with right deflection of the pointing device 111 , positioned a desired item (e.g., the letter ‘H’) in the selection set 104 over the selection vector 106 , and activated selection button 112 .
  • as a result, the selected item (e.g., the letter ‘H’) has been copied to the insertion point of display area 116, and the string insertion point (i.e., an underscore character) has been advanced to the next position in the item string.
  • In FIG. 4, a selection set zooming feature of a particular embodiment is illustrated.
  • the user has deflected the pointing device 111 in a downward (the 180° position) direction as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 4 .
  • the items in selection set 104 have been reduced in size in a relatively slow fluid motion to simulate a zoom out operation in a view of the selection set 104.
  • the spacing between the items in selection set 104 has been selectively reduced based on the amount of downward deflection of the pointing device 111 and the length of time the pointing device 111 is deflected downward.
  • the user can more quickly navigate to a desired item with a left or right deflection of the pointing device 111 .
  • a user can perform a zoom out operation to more quickly navigate to a desired item with a left or right deflection of the pointing device 111 .
  • the user is still given the opportunity to move the selection set 104 to the right or left with right or left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106 and make an item selection.
  • In FIG. 5, the selection set zooming feature of a particular embodiment is further illustrated.
  • the user has deflected the pointing device 111 in an upward (the 0° or 360° position) direction as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 5 .
  • the items in selection set 104 have been enlarged in a relatively slow fluid motion to simulate a zoom in operation in a view of the selection set 104 .
  • the spacing between the items in selection set 104 has been selectively increased based on the amount of upward deflection of the pointing device 111 and the length of time the pointing device 111 is deflected upward.
  • the user can more accurately navigate to a desired item with a left or right deflection of the pointing device 111 .
  • a user can perform a zoom in operation to more accurately navigate to a desired item with a left or right deflection of the pointing device 111 .
  • the user is still given the opportunity to move the selection set 104 to the right or left with right or left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106 and make an item selection.
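  • The zoom behavior can similarly be sketched as integrating the vertical deflection over the time it is held, scaling item size and spacing up or down; the rate constant and clamp range below are assumptions for illustration.

```typescript
// Sketch: vertical deflection controls a zoom factor applied to item size and spacing.
// zoomRatePerSec and the clamp range are illustrative tuning values.
const zoomRatePerSec = 1.0;

function zoomStep(
  currentZoom: number,  // 1.0 = default size and spacing
  deflectionY: number,  // +1 = full upward (zoom in), -1 = full downward (zoom out)
  dtSeconds: number
): number {
  // Exponential scaling keeps the operation symmetric: holding up and then down
  // for the same time returns the view to its starting zoom level.
  const next = currentZoom * Math.exp(deflectionY * zoomRatePerSec * dtSeconds);
  return Math.min(4, Math.max(0.25, next)); // keep the set readable on screen
}

// Item spacing (and glyph size) simply scale with the zoom factor.
const baseSpacingPx = 40;
let zoom = 1.0;
for (let i = 0; i < 30; i++) zoom = zoomStep(zoom, -1, 1 / 30); // hold downward for one second
console.log((baseSpacingPx * zoom).toFixed(1)); // ~14.7 px between items after zooming out
```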
  • the selection set 104 can be held stationary and the selection vector 106 can be moved to the left or right within display area 102 with a corresponding deflection of the pointing device 111 to the left or right. In this manner, a user can move the selection vector 106 to the right with right deflection of the pointing device 111 or left with a left deflection of the pointing device 111 to position the selection vector 106 over a desired item in the selection set 104 .
  • the selection of a desired item in the selection set 104 can be performed in the manner described above.
  • the zoom in and zoom out operations can be performed in the same manner as described above.
  • In FIGS. 6-10, an example embodiment 200 of a circularly arranged selection set 204 is shown.
  • the circular item selection mechanism uses the same underlying principles as the linear embodiment described above.
  • the available items of selection set 204 are arranged in a circular display area 202 as shown in FIG. 6 .
  • the selection set 204 can be arranged in an oval shape, a rectangular shape, or any other arbitrary shape.
  • a radial position can be mapped to a particular location on an arbitrary shape.
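  • The mapping of a radial position to a location on an arbitrary shape can be made concrete by casting a ray from the shape's center at the given angle and intersecting it with the outline. Below is a sketch for an oval and for an axis-aligned rectangle; the function names and coordinate conventions are illustrative, not part of the disclosure.

```typescript
// Sketch: map an angle (radial position) to a point on a shape's outline by
// intersecting a ray from the shape's center with the outline. Screen
// coordinates are assumed (y grows downward); angles are clockwise from 12 o'clock.
type Point = { x: number; y: number };

function onOval(angle: number, radiusX: number, radiusY: number): Point {
  return { x: radiusX * Math.sin(angle), y: -radiusY * Math.cos(angle) };
}

function onRectangle(angle: number, halfWidth: number, halfHeight: number): Point {
  const dx = Math.sin(angle);
  const dy = -Math.cos(angle);
  // Distance along the ray at which it first meets the rectangle's border.
  const t = Math.min(
    halfWidth / Math.max(Math.abs(dx), 1e-9),
    halfHeight / Math.max(Math.abs(dy), 1e-9)
  );
  return { x: dx * t, y: dy * t };
}

// Items can then be laid out at evenly spaced radial positions on either shape.
const ovalPositions = Array.from({ length: 26 }, (_, i) => onOval((2 * Math.PI * i) / 26, 120, 80));
console.log(ovalPositions[0]); // the 12 o'clock position: { x: 0, y: -80 }
```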
  • a small crosshair at the end of a selection vector 206 is also provided to define a particular selection point.
  • the selection vector 206 and its crosshair indicator can be moved and positioned relative to the circularly arranged selection set 204 using two-dimensional movement of the pointing device 211 relative to the motion area 110 of the pointing device 211 as represented in FIGS. 6-10 .
  • the pointing device 211 is at its neutral position (undeflected position) at the center point of the motion area 110 of the pointing device 211 .
  • the selection vector 206 and its crosshair indicator remain positioned at the center of the circularly arranged selection set 204 .
  • a display area 216 is provided in example embodiment 200 to assemble an item string.
  • a string insertion point i.e., an underscore character
  • the user has deflected the pointing device 211 to the left and slightly downward (the approx. 265° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 7 .
  • the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211 .
  • the selection vector 206 and its crosshair indicator has moved to overlay an item in the selection set 204 .
  • the selection set 204 remains stationary.
  • the user can identify a selectable item, which is in proximity to the crosshair indicator of selection vector 206 .
  • the selectable item identified thereby is highlighted or distinctively shaded/colored to indicate to the user that the selectable item can be selected by activation of selection button 112 . If the user activates selection button 112 , the selectable item in selection set 204 is copied to the insertion point position of display area 216 and the string insertion point (i.e., an underscore character) is advanced to the next position in the item string being assembled in display area 216 .
  • the user has deflected the pointing device 211 to the left and slightly downward (the approx. 265° position) thereby causing the selection vector 206 and its crosshair indicator to move in a corresponding direction to overlay an item (e.g., letter ‘K’) in the selection set 204 .
  • the user has then activated selection button 112 thereby causing the selectable item (e.g., letter ‘K’), identified by the movement of the selection vector 206 and its crosshair indicator, to be inserted at the insertion point of display area 216 as shown in FIG. 7 .
  • a user can move the selection vector 206 and its crosshair indicator in two dimensions by a corresponding two-dimensional deflection of the pointing device 211 to position the selection vector 206 and its crosshair indicator over or in proximity to a desired item in the selection set 204 .
  • the selection of the desired item in the selection set 204 can be performed by activation of selection button 112 in the manner described above.
  • the user does not need to position the crosshair indicator over or in proximity to a desired item in the selection set 204 .
  • the selection vector 206 can merely be moved in a direction towards a desired item selection and a radially projected selection vector 206 is used to identify and highlight a selectable item in the selection set 204. In this manner, the user does not have to move the crosshair of selection vector 206 completely out to the position of the desired item of selection set 204.
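  • A sketch of this radially projected selection: once the deflection leaves a small dead zone, only its direction is used, and the highlighted item is the one whose radial position is nearest that direction. The dead-zone value and the ring layout assumed here are illustrative.

```typescript
// Sketch: pick the highlighted item from the direction of deflection alone,
// so the user need not push the crosshair all the way out to the ring.
// The dead-zone value and the ring layout ('A' at 12 o'clock, clockwise) are assumptions.
function projectedItemIndex(
  deflectionX: number,  // pointing-device deflection, each axis in [-1, 1]
  deflectionY: number,  // +1 = upward
  itemCount: number,
  deadZone = 0.15       // minimum deflection before any item is highlighted
): number | null {
  if (Math.hypot(deflectionX, deflectionY) < deadZone) return null; // near neutral: nothing selectable

  // Angle of the deflection, measured clockwise from 12 o'clock.
  let angle = Math.atan2(deflectionX, deflectionY);
  if (angle < 0) angle += 2 * Math.PI;

  // Nearest radial slot around the ring of items.
  return Math.round(angle / ((2 * Math.PI) / itemCount)) % itemCount;
}

const ring = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split("");
const index = projectedItemIndex(0.7, 0.7, ring.length); // deflected toward the 45° position
console.log(index === null ? "none" : ring[index]);      // "D" under this assumed layout
```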
  • the location of the selection point identified by the position of the crosshair of selection vector 206 is continuously updated in response to corresponding movement of the pointing device 211.
  • the same deflection of the pointing device 211 can cause different effects based on the location of the selection point. For example, when the selection point is positioned near the center of the circular selection set 204 , the movement of the selection point becomes more rapid relative to the movement of the pointing device 211 . When the selection point is positioned furthest from the center of the circular selection set 204 , the movement of the selection point becomes less rapid relative to the movement of the pointing device 211 .
  • the user can quickly position the selection point in the selection set 204 while maintaining accurate control as the selection point approaches a desired item in selection set 204 .
  • the motion of the selection point slows as it nears the edge of the circular selection set 204 making it easier for a user to hone in on a target item.
  • when the pointing device 211 is released and returns to its neutral position, the selection point is quickly pulled back to the center of the circular selection set 204.
  • the motion of the crosshair of selection vector 206 is always in the direction of deflection of the pointing device 211, but this motion can be taken from the current location of the crosshair.
  • the magnitude of the motion vector as provided by the pointing device 211 can be relative to the current location of the crosshair.
  • the crosshair may start in the center of the circle and the pointing device 211 is then deflected as far as it can go in the 180 degree position.
  • the selection position is updated by moving the crosshair down so that it is perhaps 1/3 radians from the center (i.e., not all the way down). The user then changes the deflection of the pointing device 211 to the full 270 degree position.
  • the crosshair only moves 1/9 radians (because the crosshair is further from the center) and it moves from its current location to the left in response to the new deflection of the pointing device 211 .
  • the selection vector 206 would be pointing toward items, “F” or “G” as shown in FIG. 9 and would have a similar magnitude.
  • motion for a particular embodiment is computed by just adding deflection vectors to the current location, where the magnitude of the motion vector is adjusted based on the location of the crosshair relative to the center position and the magnitude of the deflection. So even if the user only keeps the pointing device 211 slightly deflected in a given direction, then the crosshair will still eventually reach the edge of the selection ring 202 .
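  • The motion rule described above (deflection vectors are accumulated onto the crosshair location, their magnitude shrinks as the crosshair moves away from the center, and the crosshair snaps back when the device is released) can be sketched as follows; the gain constants are assumptions, not values from the disclosure.

```typescript
// Sketch: update the crosshair (selection point) of the circular embodiment.
// Gain falls off with distance from the center, so motion is fast near the
// middle and fine near the ring of items; releasing the device pulls the
// point back to the center. baseSpeed, the falloff, and returnSpeed are
// illustrative tuning constants.
type Vec = { x: number; y: number };

function crosshairStep(
  pos: Vec,           // current crosshair location, center of the ring = (0, 0)
  deflection: Vec,    // pointing-device deflection, each axis in [-1, 1]
  ringRadius: number, // radius of the circular selection set
  dtSeconds: number
): Vec {
  const deflMag = Math.hypot(deflection.x, deflection.y);
  const dist = Math.hypot(pos.x, pos.y);

  if (deflMag < 0.05) {
    // Device released: pull the selection point quickly back toward the center.
    const returnSpeed = 6 * ringRadius; // per second
    const shrink = Math.max(0, dist - returnSpeed * dtSeconds) / (dist || 1);
    return { x: pos.x * shrink, y: pos.y * shrink };
  }

  // Gain shrinks as the crosshair nears the ring but never reaches zero, so
  // even a slight sustained deflection eventually reaches the edge of the ring.
  const baseSpeed = 3 * ringRadius; // per second when the crosshair is at the center
  const gain = 1 - 0.8 * Math.min(1, dist / ringRadius);
  const next = {
    x: pos.x + deflection.x * baseSpeed * gain * dtSeconds,
    y: pos.y + deflection.y * baseSpeed * gain * dtSeconds,
  };

  // Clamp to the ring so the crosshair cannot overshoot the selection set.
  const nextDist = Math.hypot(next.x, next.y);
  if (nextDist > ringRadius) {
    const scale = ringRadius / nextDist;
    return { x: next.x * scale, y: next.y * scale };
  }
  return next;
}

// Even a slight, sustained downward deflection eventually reaches the ring.
let point: Vec = { x: 0, y: 0 };
for (let i = 0; i < 300; i++) point = crosshairStep(point, { x: 0, y: -0.2 }, 100, 1 / 60);
console.log(Math.round(Math.hypot(point.x, point.y))); // 100
```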
  • the selectable item is indicated by a white crosshair, and the item turns red.
  • a few special symbols can be used to indicate special functions.
  • the user has deflected the pointing device 211 to the right and upward (the approx. 45° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 8 .
  • the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211 .
  • the selection vector 206 and its crosshair indicator has moved to overlay an item in the circular selection set 204 .
  • the selection set 204 remains stationary.
  • the selectable item identified thereby can be highlighted or distinctively shaded/colored to indicate to the user that the selectable item can be selected by activation of selection button 112 .
  • the user has deflected the pointing device 211 to the right and upward (the approx. 45° position) thereby causing the selection vector 206 and its crosshair indicator (the selection point) to move in a corresponding direction to overlay an item (e.g., letter ‘U’) in the selection set 204 .
  • the user has then activated selection button 112, thereby causing the selectable item (e.g., letter ‘U’), identified by the selection point, to be inserted at the insertion point of display area 216 as shown in FIG. 8.
  • a user can move the selection point in two dimensions by a corresponding two-dimensional deflection of the pointing device 211 to position the selection point over a desired item in the selection set 204 .
  • the selection of the desired item in the selection set 204 can be performed by activation of selection button 112 in the manner described above.
  • the user has deflected the pointing device 211 downward and slightly to the left (the approx. 182° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 9 .
  • the user has applied only a slight deflection of pointing device 211 .
  • the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211 ; but, the selection point has only slightly moved from the center position of the circular selection set 204 due to the corresponding slight deflection of pointing device 211 .
  • the selection vector 206 and its crosshair indicator has not moved to overlay an item in the circular selection set 204 ; but, the user can apply a larger deflection to the pointing device 211 to select an item from the selection set 204 as described above.
  • This larger deflection of pointing device 211 is shown in the example of FIG. 10 .
  • the user has more fully deflected the pointing device 211 downward and slightly to the left (the approx. 182° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 .
  • This deflection of pointing device 211 causes the selection vector 206 and its crosshair indicator (the selection point) to move in a corresponding direction to overlay an item (e.g., letter ‘E’) in the selection set 204 .
  • the user has then activated selection button 112 thereby causing the selectable item (e.g., letter ‘E’), identified by the selection point, to be inserted at the insertion point of display area 216 as shown in FIG. 10 .
  • the selection set 204 remains stationary while the selection vector 206 and its crosshair indicator (the selection point) moves in a manner corresponding to deflection of the pointing device 211 .
  • the selection vector 206 and its crosshair indicator can be held stationary at a pre-defined position (e.g., at the 12-o'clock position of display area 202 ) and the circular selection set 204 can be rotated clockwise or counter-clockwise within display area 202 with a corresponding deflection of the pointing device 211 to the right or left.
  • a user can move the selection set 204 underneath the stationary selection point with right or left deflection of the pointing device 211 .
  • an item of selection set 204 can be positioned in proximity to the selection point to identify a selectable item.
  • the selection of a desired item in the selection set 204 can be performed in the manner described above.
  • the zoom in and zoom out operations can be performed in an embodiment of the circularly arranged selection set 204 in the same manner as described above.
  • FIG. 11 illustrates a particular example embodiment of a user interface for computer users, electronic game players, or TV users in which a selection set includes items representing a variety of physical or logical entities.
  • a user interface 1100 of an example embodiment is comprised of a display region 1102 (e.g., a window, a data entry box, etc.) in which a selection set 1104 is displayed.
  • the selection set 1104 is comprised of a set of items, each of which may represent an individually selectable option.
  • the items in selection set 1104 are geometric shapes, icons, logos, drawing primitives, objects, images, device objects, or any of a variety of other types of selectable items that each represent a corresponding physical device, a file folder, a file or document, an image, a video or audio stream, a communication device, mode, network, or protocol, or any of a variety of other types of objects or entities.
  • the selection set 1104 is configured in a linear pattern of evenly-spaced items extending to the borders of display region 1102 .
  • the selection set 1104 may similarly be arranged in a circular selection set as described above.
  • a selection vector 106 can be provided in the center of the display region 1102 to mark a location in display region 1102 at which one of the items in selection set 1104 is selectable.
  • a selection vector 206 and its crosshair indicator can be provided to identify a selectable item in selection set 1104 as described above. In this manner, a wide variety of selectable items can be made available for selection by a user using the various embodiments of the user interface described herein.
  • the granularity control can be a zooming mechanism to modify the size and/or position of items in a selection set.
  • the granularity control can be a modification of the motion vector based on a distance from a reference point and the speed or quantity of deflection of a pointing device.
  • the method of an example embodiment 900 includes: receiving an input from a pointing device indicating movement right or left (processing block 910 ); receiving an input from a pointing device indicating movement upward or downward (processing block 912 ); moving a selection set to the right in response to movement of the pointing device to the right and moving the selection set to the left in response to movement of the pointing device to the left (processing block 914 ); and zooming a selection set outward in response to movement of the pointing device downward, and zooming the selection set inward in response to movement of the pointing device upward (processing block 916 ).
  • a processing flow 1300 for generating a user interface in another example embodiment includes: displaying a selection set in a circular orientation, the selection set including a plurality of items (processing block 1310 ); receiving an input from a pointing device indicating movement in a two-dimensional direction (processing block 1312 ); moving an endpoint of a selection pointer relative to the selection set in response to movement of the pointing device in the two-dimensional direction (processing block 1314 ); and enabling selection of an item in the selection set corresponding to the position of the endpoint of the selection pointer relative to the selection set (processing block 1316 ).
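  • A compact sketch of how the steps of processing flow 1300 might fit together in a single per-frame update; the structure follows the flow above, while the names, ring layout, and highlight threshold are illustrative assumptions.

```typescript
// Sketch of processing flow 1300: display a circular selection set, read a
// two-dimensional deflection, move the selection pointer's endpoint, and mark
// the item nearest that endpoint as selectable. Names, layout, and the
// highlight threshold are illustrative.
type Deflection = { x: number; y: number }; // each axis in [-1, 1], +y = upward

function updateCircularUI(
  items: string[],
  deflection: Deflection,
  ringRadius: number
): { endpoint: { x: number; y: number }; selectable: string | null } {
  // Blocks 1312/1314: the pointer's endpoint follows the deflection toward the ring.
  const endpoint = { x: deflection.x * ringRadius, y: deflection.y * ringRadius };

  // Block 1316: enable selection of the item whose radial position is nearest
  // the endpoint, once the endpoint is far enough from the center.
  if (Math.hypot(endpoint.x, endpoint.y) < 0.3 * ringRadius) {
    return { endpoint, selectable: null };
  }
  let angle = Math.atan2(endpoint.x, endpoint.y); // clockwise from 12 o'clock
  if (angle < 0) angle += 2 * Math.PI;
  const index = Math.round(angle / ((2 * Math.PI) / items.length)) % items.length;
  return { endpoint, selectable: items[index] };
}

const ring = "ABCDEFGHIJKLMNOPQRSTUVWXYZ".split(""); // 'A' assumed at 12 o'clock, clockwise
console.log(updateCircularUI(ring, { x: 0.7, y: 0.7 }, 100).selectable); // the 45° item ("D" here)
```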
  • a component can be a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
  • in example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a component that operates to perform certain operations as described herein.
  • a component may be implemented mechanically or electronically.
  • a component may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor) to perform certain operations.
  • a component may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a component mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the term “component” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein.
  • components are temporarily configured (e.g., programmed)
  • each of the components need not be configured or instantiated at any one instance in time.
  • the components comprise a general-purpose processor configured using software
  • the general-purpose processor may be configured as respective different components at different times.
  • Software may accordingly configure a processor, for example, to constitute a particular component at one instance of time and to constitute a different component at a different instance of time.
  • Components can provide information to, and receive information from, other components. Accordingly, the described components may be regarded as being communicatively coupled. Where multiple of such components exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the components. In embodiments in which multiple components are configured or instantiated at different times, communications between such components may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple components have access. For example, one component may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further component may then, at a later time, access the memory device to retrieve and process the stored output. Components may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
  • Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
  • a computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment.
  • a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
  • operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output.
  • Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • both hardware and software architectures require consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice.
  • below are set out hardware (e.g., machine) and software architectures that may be deployed in various example embodiments.
  • FIG. 14 is a block diagram of a machine in the example form of a computer system 700 within which instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the example computer system 700 includes a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708.
  • the computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 700 may also include an alphanumeric input device 712 (e.g., a keyboard), a user interface (UI) pointing device 714 (e.g., pointing device, thumbstick device, mouse, TV remote device, a game controller device, spatial movement detection devices, such as the Wii system sold by Nintendo of America, Inc., or any other hardware or software device/system, which can signal movement in a two-dimensional space, herein a pointing device), a disk drive unit 716 , a signal generation device 718 (e.g., a hardware button or softkey located anywhere on a remote device, a gesture detection device, audible command detection device, or the like) and a network interface device 720 .
  • the disk drive unit 716 includes a machine-readable medium 722 on which is stored one or more sets of instructions and data structures (e.g., software 724 ) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the software 724 may also reside, completely or at least partially, within the main memory 704 and/or within the processor 702 during execution thereof by the computer system 700 , the main memory 704 and the processor 702 also constituting machine-readable media.
  • while the machine-readable medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions or data structures.
  • the term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
  • machine-readable media include non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the software 724 may further be transmitted or received over a communications network 726 using a transmission medium.
  • the software 724 may be transmitted using the network interface device 720 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), the Internet, mobile telephone networks, Plain Old Telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMax networks).
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • the described methods may be implemented using a distributed or non-distributed software application designed under a three-tier architecture paradigm. Under this paradigm, various parts of computer code (or software) that instantiate or configure components or modules may be categorized as belonging to one or more of these three tiers. Some embodiments may include a first tier as an interface (e.g., an interface tier). Further, a second tier may be a logic (or application) tier that performs application processing of data inputted through the interface level. The logic tier may communicate the results of such processing to the interface tier, and/or to a backend, or storage tier. The processing performed by the logic tier may relate to certain rules, or processes that govern the software as a whole.
  • a third, storage tier may be a persistent storage medium, or a non-persistent storage medium. In some cases, one or more of these tiers may be collapsed into another, resulting in a two-tier architecture, or even a one-tier architecture.
  • the interface and logic tiers may be consolidated, or the logic and storage tiers may be consolidated, as in the case of a software application with an embedded database.
  • the three-tier architecture may be implemented using one technology, or, a variety of technologies.
  • the example three-tier architecture, and the technologies through which it is implemented, may be realized on one or more computer systems operating, for example, as a standalone system, or organized in a server-client, peer-to-peer, distributed, or some other suitable configuration. Further, these three tiers may be distributed across more than one computer system as various components.
  • Example embodiments may include the above described tiers, and processes or operations about constituting these tiers may be implemented as components. Common to many of these components is the ability to generate, use, and manipulate data. The components, and the functionality associated with each, may form part of standalone, client, server, or peer computer systems. The various components may be implemented by a computer system on an as-needed basis. These components may include software written in an object-oriented computer language such that a component oriented, or object-oriented programming technique can be implemented using a Visual Component Library (VCL), Component Library for Cross Platform (CLX), Java Beans (JB), Java Enterprise Beans (EJB), Component Object Model (COM), Distributed Component Object Model (DCOM), or other suitable technique.
  • Software for these components may further enable communicative coupling to other components (e.g., via various Application Programming Interfaces (APIs)), and may be compiled into one complete server, client, and/or peer software application. Further, these APIs may be able to communicate through various distributed programming protocols as distributed computing components.
  • Some example embodiments may include remote procedure calls being used to implement one or more of the above described components across a distributed programming environment as distributed computing components.
  • for example, an interface component (e.g., an interface tier) may reside on a first computer system that is remote from a second computer system containing a logic component (e.g., a logic tier); these first and second computer systems may be configured in a standalone, server-client, peer-to-peer, or some other suitable configuration.
  • Software for the components may be written using the above described object-oriented programming techniques, and can be written in the same programming language, or a different programming language.
  • Various protocols may be implemented to enable these various components to communicate regardless of the programming language used to write these components.
  • a component written in C++ may be able to communicate with another component written in the Java programming language through utilizing a distributed computing protocol such as a Common Object Request Broker Architecture (CORBA), a Simple Object Access Protocol (SOAP), or some other suitable protocol.
  • Some embodiments may include the use of one or more of these protocols with the various protocols outlined in the Open Systems Interconnection (OSI) model, or Transmission Control Protocol/Internet Protocol (TCP/IP) protocol stack model for defining the protocols used by a network to transmit data.
  • Example embodiments may use the OSI model or TCP/IP protocol stack model for defining the protocols used by a network to transmit data.
  • a system of data transmission between a server and client, or between peer computer systems may for example include five layers comprising: an application layer, a transport layer, a network layer, a data link layer, and a physical layer.
  • in an example implementation, the various tiers (e.g., the interface, logic, and storage tiers) reside on the application layer of the TCP/IP protocol stack.
  • data from an application residing at the application layer is loaded into the data load field of a TCP segment residing at the transport layer.
  • This TCP segment also contains port information for a recipient software application residing remotely.
  • This TCP segment is loaded into the data load field of an IP datagram residing at the network layer.
  • this IP datagram is loaded into a frame residing at the data link layer.
  • This frame is then encoded at the physical layer, and the data transmitted over a network such as an internet, Local Area Network (LAN), Wide Area Network (WAN), or some other suitable network.
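  • The encapsulation steps described above can be illustrated with a purely conceptual sketch using plain nested records; all field names are illustrative and this is not a real protocol implementation.

```typescript
// Conceptual sketch of the encapsulation steps described above, using plain
// nested records; field names are illustrative and this is not a real
// protocol implementation.
type TcpSegment = { destinationPort: number; payload: string };        // transport layer
type IpDatagram = { destinationAddress: string; payload: TcpSegment }; // network layer
type Frame = { linkEncoding: string; payload: IpDatagram };            // data link layer

function encapsulate(appData: string, address: string, port: number): Frame {
  const segment: TcpSegment = { destinationPort: port, payload: appData };
  const datagram: IpDatagram = { destinationAddress: address, payload: segment };
  return { linkEncoding: "ethernet", payload: datagram };
}

console.log(JSON.stringify(encapsulate("selected item string", "192.0.2.1", 80)));
```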
  • the term “internet” refers to a network of networks. These networks may use a variety of protocols for the exchange of data, including the aforementioned TCP/IP, and additionally ATM, SNA, SDI, or some other suitable protocol. These networks may be organized within a variety of topologies (e.g., a star topology), or structures.
  • such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.

Abstract

A system and method for generating a user interface for text and item selection is disclosed. As described for various embodiments, a system and process is disclosed for providing an arrangement of selectable items, a mechanism for selection from the arrangement of selectable items, and a mechanism for adjusting the granularity of control of the selector. In one embodiment, the granularity control can be a zooming mechanism to modify the size and/or position of items in a selection set. In another embodiment, the granularity control can be a modification of the motion vector based on a distance from a reference point and the speed or quantity of deflection of a pointing device. Thus, as a selection point approaches the selection set, the motion of the selection point becomes less responsive to movement of the pointing device, so the user has more control over the positioning of the selection point relative to an item in the selection set.

Description

    COPYRIGHT NOTICE
  • A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings that form a part of this document: Copyright 2007-2009, OpenTV Inc., All Rights Reserved.
  • BACKGROUND
  • 1. Technical Field
  • This disclosure relates to methods and systems supporting computing and data processing systems. More particularly, a system and method for generating a user interface for text and item selection is described.
  • 2. Related Art
  • Conventional systems, like www.twostick.org, can display a two-dimensional grid of alphanumeric characters from which a user can make selections. These conventional systems allow a user to manipulate a joystick or game controller to navigate right, left, up, and down within the grid to identify and select a desired character. Other conventional systems provide a displayed on-screen replica of a standard two-dimensional keyboard that can be navigated in a similar two-dimensional manner. However, these conventional user interfaces for text input can be slow and awkward to use.
  • U.S. Pat. No. 6,593,913 describes a method and system for selecting a character with a user input device comprising a plurality of buttons. In one preferred embodiment, a first plurality of characters is displayed on a display device in a pattern corresponding to a pattern of a plurality of buttons of a user input device, and a character from the first plurality of characters is selected in response to actuation of one of the plurality of buttons. In this embodiment, the number of characters displayed on the display device for selection by the user input device is less than or equal to the number of buttons in the plurality of buttons. In this way, any of the characters displayed on the display device for selection by the user input device can be selected by actuation of a single one of the plurality of buttons.
  • U.S. Pat. No. 5,543,818 describes a method and apparatus for entering alphanumeric or other text to a computer system using an input device having a small number of keys. The computer system includes a processor programmed to display a character selection menu (including displayed groups of characters), to move a displayed cursor from one group to another in response to actuation of at least one cursor movement key on the input device, and to select a character within a group in response to actuation of one of at least two selection keys on the input device. The system reduces the maximum number of keystrokes required conventionally to select a character from a character set, and enables character selection from a larger set of displayed characters using no more keystrokes than required conventionally to select the same character from a smaller set.
  • U.S. Pat. No. 6,501,464 describes a graphical user interface in the form of a transparent keyboard that may be positioned over an existing computer display. The user may input textual data through the keyboard by selecting keys in the transparent keyboard display. The text entry may then appear on the computer display in non-transparent or conventional format.
  • Thus, a system and method for generating a user interface for text and item selection are needed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which:
  • FIGS. 1-5 illustrate a particular example embodiment of a user interface for computer users, electronic game players, or television (TV) users in which an item selection set is arranged in a linear orientation.
  • FIGS. 6-10 illustrate a particular example embodiment of a user interface for computer users or electronic game players in which an item selection set is arranged in a circular orientation.
  • FIG. 11 illustrates a particular example embodiment of a user interface for computer users, electronic game players or TV users in which a selection set includes items representing a variety of physical or logical entities.
  • FIGS. 12 and 13 illustrate processing flow diagrams for example embodiments.
  • FIG. 14 is a block diagram of a machine in the example form of a computer system within which instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • DETAILED DESCRIPTION
  • In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of some example embodiments. It will be evident, however, to one of ordinary skill in the art that the various embodiments described herein may be practiced without these specific details. According to various example embodiments described herein, there is provided a system and method for generating a user interface for text and item selection.
  • FIG. 1 illustrates a particular example embodiment of a user interface for computer users, electronic game players, or TV users. As shown in FIG. 1, a user interface 100 is shown as may be displayed within a conventional display screen window on a computer display, electronic game system, or TV screen of a user. As described in more detail below, the user system may be connected or connectable to a pointing device, thumbstick device, mouse, TV remote device, a game controller device, spatial movement detection devices, such as the Wii system sold by Nintendo of America, Inc., or any other hardware or software device/system, which can signal movement in a two-dimensional space via a conventional hardware interface (hereinafter called a pointing device). The user system can use conventional hardware and operating system software to support the novel user interface 100 described herein.
  • The user interface 100 is created using the functionality of various embodiments described herein. Referring to FIG. 1, the user interface 100 in an example embodiment is comprised of a display region 102 (e.g., a window, a data entry box, etc.) in which a selection set 104 is displayed. In the embodiment shown, the selection set 104 is comprised of a set of items, each of which may represent an individually selectable option. In this case, the items in selection set 104 are letters of the English language alphabet. In other embodiments, the items can be a set of selectable alphanumeric characters, alphanumeric characters plus special characters, alphanumeric characters in a language other than English, mathematical symbols, geometric shapes, icons, logos, drawing primitives, objects, images, device objects, or any of a variety of other types of selectable items. In the embodiment illustrated in FIG. 1, the selection set 104 is configured in a linear pattern of evenly-spaced items extending to the borders of display region 102. In the embodiment shown in FIG. 1, a selection vector 106 is shown in the center of the display region 102. The selection vector 106 is used to mark a location in display region 102 at which one of the items in selection set 104 is selectable. In other words, if an item in selection set 104 overlays (or is in proximity to) selection vector 106, that item can be selected using selection button 112. For clarity, the selectable item overlaying (or in proximity to) selection vector 106 is reproduced as selectable item 108 as shown in FIG. 1. Thus, as shown in the example embodiment of FIG. 1, the letter, ‘N’ of selection set 104 is in proximity to selection vector 106. Therefore, the letter, ‘N’ is reproduced as selectable item 108 as shown in FIG. 1. If a user of user interface 100 shown in FIG. 1 wishes to select the selectable item 108, the user can activate selection button 112. It will be apparent to those of ordinary skill in the art that selection button 112 can be any form of signal generation device, such as a hardware button or softkey located anywhere on a remote device, a gesture detection device, audible command detection device, or the like. As a result of this activation of selection button 112, the selected item is saved in an item string created and saved in display area 116. Thus, as shown in the example embodiment of FIG. 1, the letter, ‘N’ of selection set 104 has been selected by a user by user activation of selection button 112 and the selected item (i.e., letter, ‘N’) has been saved in the current position of the item string in display area 116. The next position in the item string of display area 116 is represented by the underscore character shown in display region 116 of FIG. 1. In a particular embodiment, a backspace button or item selection can be provided to clear a previous entry and move the string insertion point (i.e., an underscore character) backward one position.
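  • As a rough illustration of the mechanism just described, the following Python sketch models a linearly arranged selection set, the item currently at the selection vector, and the item string assembled in the display area. The names (SelectionState, select_current, and so on), the centering formula, and the backspace handling are hypothetical choices for illustration only; they are not taken from the disclosed embodiments.

```python
# Minimal sketch of the selection model described above (all names are hypothetical).
from dataclasses import dataclass, field
from typing import List

@dataclass
class SelectionState:
    items: List[str]                 # the selection set, e.g. the letters A-Z
    offset: float = 0.0              # horizontal offset of the set relative to the selection vector
    spacing: float = 1.0             # distance between adjacent items (changes when zooming)
    entered: List[str] = field(default_factory=list)  # the item string shown in the display area

    def item_at_selection_vector(self) -> str:
        """Return the item currently overlaying (or nearest to) the stationary selection vector."""
        index = round(-self.offset / self.spacing) + len(self.items) // 2
        index = max(0, min(len(self.items) - 1, index))
        return self.items[index]

    def select_current(self) -> None:
        """Selection button activated: append the selectable item to the item string."""
        self.entered.append(self.item_at_selection_vector())

    def backspace(self) -> None:
        """Clear the previous entry and move the string insertion point back one position."""
        if self.entered:
            self.entered.pop()
```

  • With an undeflected pointing device and the 26-letter set shown in FIG. 1, SelectionState(items=list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")).item_at_selection_vector() returns ‘N’, which, under this sketch's centering convention, matches the item shown at the selection vector in FIG. 1.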
  • The example of FIG. 1 also illustrates a pointing device 111, which can be deflected (or used to point) in two dimensions about a center point of a motion area 110 of the pointing device 111 as shown in FIG. 1. The pointing device 111 is shown in FIG. 1 at its neutral position (e.g., undeflected position, center position, or home position) at the center point of the pointing device motion area 110. In the neutral position, the selection set 104 remains static at its current position. In the example of FIG. 1, the current position is as shown with the letter, ‘N’ of selection set 104 in proximity to selection vector 106. As described in more detail below, the deflection of the pointing device 111 causes a corresponding linear motion of the selection set 104 in relation to the selection vector 106. In various embodiments, the pointing device 111 can signal movement in two dimensions simultaneously or may indicate separate movements in just one direction. Additionally, other types of pointing devices may indicate a vector movement relative to a point (e.g., the center of the motion area 110). Any of these pointing devices can be used with the various embodiments described herein.
  • Referring now to the example embodiment shown in FIG. 2, the user interface 100 is shown after a user has selected an item from the selection set 104 by activation of selection button 112 while the desired item is positioned in proximity to selection vector 106. In this particular case, the user has selected the letter, ‘N’ of selection set 104 and the selected item has been reproduced in the display region 116. As shown in the example of FIG. 2, the user has deflected the pointing device 111 to the left (the 270° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 2. In response to this pointing device 111 deflection, the selection set 104 is set in a relatively slow fluid motion to the left as shown by the arrow 113 in FIG. 2. The selection vector 106 remains stationary. As a result, the items in selection set 104 pass over the selection vector 106 as long as the pointing device remains deflected to the left. As each item of selection set 104 passes over the selection vector 106, the corresponding item is reproduced as selectable item 108. The appearance of selectable item 108 indicates to the user that the selectable item 108 can be selected by the user and copied to the display area 116, if the selection button 112 is activated at the moment the desired item is over the selection vector 106 and displayed as selectable item 108. In a particular embodiment, the speed at which the selection set moves can be controlled by the degree of deflection of the pointing device 111 to the left or right (the 270° or 90° positions). For example, if the pointing device 111 is only slightly deflected to the left or right, the selection set 104 can move at a slow rate in the corresponding direction, left or right. If the pointing device 111 is fully deflected to the left or right, the selection set 104 can move at a fast rate in the corresponding direction, left or right. The speed of movement of the selection set for a particular level of deflection of the pointing device 111 can be pre-configured in a particular embodiment. Thus, in the manner as shown in the example of FIG. 2, a user can move the selection set 104 to the left with left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106. In the example of FIG. 2, the user has moved the selection set 104 to the left with left deflection of the pointing device 111, positioned a desired item (e.g., the letter ‘R’) in the selection set 104 over the selection vector 106, and activated selection button 112. As a result, the selected item (e.g., the letter ‘R’) has been copied to the display area 116 and the string insertion point (i.e., an underscore character) has been advanced one position to the right to mark the point in the string at which the next selected item will be inserted in the item string being assembled in display area 116.
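  • The speed control described above might be modeled as a simple proportional mapping from horizontal deflection to scroll rate. In this sketch the normalized deflection range and the max_speed value are assumptions; the disclosure only states that the speed for a given level of deflection can be pre-configured.

```python
def scroll_selection_set(offset: float, deflection_x: float, dt: float,
                         max_speed: float = 8.0) -> float:
    """Return the new horizontal offset of the selection set after dt seconds.

    deflection_x is assumed to be normalized to [-1.0, 1.0] (negative = left, positive = right),
    and max_speed (item widths per second) is a hypothetical pre-configured value.
    A slight deflection scrolls the set slowly; a full deflection scrolls it at max_speed.
    """
    return offset + max_speed * deflection_x * dt
```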
  • Referring now to the example embodiment shown in FIG. 3, the user has deflected the pointing device 111 to the right (the 90° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 3. In response to this pointing device 111 deflection, the selection set 104 is set in a relatively slow fluid motion to the right as shown by the arrow 115 in FIG. 3. Again, the selection vector 106 remains stationary. As a result, the items in selection set 104 pass over the selection vector 106 as long as the pointing device remains deflected to the right. As each item of selection set 104 passes over the selection vector 106, the corresponding item is reproduced as selectable item 108 as described above. Again, the speed at which the selection set moves to the right can be controlled by the degree of deflection of the pointing device 111 to the right. Thus, in the manner as shown in the example of FIG. 3, a user can move the selection set 104 to the right with right deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106. In the example of FIG. 3, the user has moved the selection set 104 to the right with right deflection of the pointing device 111, positioned a desired item (e.g., the letter ‘H’) in the selection set 104 over the selection vector 106, and activated selection button 112. As a result, the selected item (e.g., the letter ‘H’) has been copied to the display area 116 and the string insertion point (i.e., an underscore character) has been advanced one position to the right to mark the point in the string at which the next selected item will be inserted in the item string being assembled in display area 116.
  • Referring now to the example embodiment shown in FIG. 4, a selection set zooming feature of a particular embodiment is illustrated. As shown in FIG. 4, the user has deflected the pointing device 111 in a downward (the 180° position) direction as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 4. In response to this pointing device 111 deflection, the items in selection set 104 have been reduced in size in a relatively slow fluid motion to simulate a zoom out operation in a view of the selection set 104. One effect of the zoom out operation as shown in the example of FIG. 4 is that the spacing between the items in selection set 104 has been selectively reduced based on the amount of downward deflection of the pointing device 111 and the length of time the pointing device 111 is deflected downward. With the resulting reduced spacing between items in selection set 104, the user can more quickly navigate to a desired item with a left or right deflection of the pointing device 111. In cases where there may be many items in selection set 104, it may be desirable to use the zoom out operation provided in a particular embodiment to more quickly reach an item in selection set 104 that is located a relatively large distance away from the selection vector 106. Thus, in the manner as shown in the example of FIG. 4, a user can perform a zoom out operation to more quickly navigate to a desired item with a left or right deflection of the pointing device 111. After a zoom out operation has been performed as described above, the user is still given the opportunity to move the selection set 104 to the right or left with right or left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106 and make an item selection.
  • Referring now to the example embodiment shown in FIG. 5, the selection set zooming feature of a particular embodiment is further illustrated. As shown in FIG. 5, the user has deflected the pointing device 111 in an upward (the 0° or 360° position) direction as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 5. In response to this pointing device 111 upward deflection, the items in selection set 104 have been enlarged in a relatively slow fluid motion to simulate a zoom in operation in a view of the selection set 104. One effect of the zoom in operation as shown in the example of FIG. 5 is that the spacing between the items in selection set 104 has been selectively increased based on the amount of upward deflection of the pointing device 111 and the length of time the pointing device 111 is deflected upward. With the resulting increased spacing between items in selection set 104, the user can more accurately navigate to a desired item with a left or right deflection of the pointing device 111. In cases where there may be only a few items in selection set 104 or the items are densely packed together because of display area space limitations, it may be desirable to use the zoom in operation provided in a particular embodiment to more accurately reach an item in selection set 104. Thus, in the manner as shown in the example of FIG. 5, a user can perform a zoom in operation to more accurately navigate to a desired item with a left or right deflection of the pointing device 111. After a zoom in operation has been performed as described above, the user is still given the opportunity to move the selection set 104 to the right or left with right or left deflection of the pointing device 111 to position a desired item in the selection set 104 over the selection vector 106 and make an item selection.
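  • The zoom behavior of FIGS. 4 and 5 amounts to scaling the item size and spacing for as long as the pointing device is held deflected up or down. The exponential rate and the spacing limits below are assumptions; the disclosure only ties the amount of zoom to the amount and duration of vertical deflection.

```python
import math

def zoom_selection_set(spacing: float, deflection_y: float, dt: float,
                       zoom_rate: float = 1.5,
                       min_spacing: float = 0.25, max_spacing: float = 4.0) -> float:
    """Return the new item spacing (and, implicitly, item size) of the selection set.

    deflection_y is assumed normalized to [-1.0, 1.0] (negative = downward, positive = upward).
    Downward deflection zooms out (smaller spacing); upward deflection zooms in (larger spacing).
    zoom_rate, min_spacing, and max_spacing are hypothetical configuration values.
    """
    spacing *= math.exp(zoom_rate * deflection_y * dt)   # longer or stronger deflection -> more zoom
    return max(min_spacing, min(max_spacing, spacing))
```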
  • In an alternative embodiment of the linearly arranged selection set 104 embodiment 100 as shown in FIGS. 1-5, the selection set 104 can be held stationary and the selection vector 106 can be moved to the left or right within display area 102 with a corresponding deflection of the pointing device 111 to the left or right. In this manner, a user can move the selection vector 106 to the right with right deflection of the pointing device 111 or left with a left deflection of the pointing device 111 to position the selection vector 106 over a desired item in the selection set 104. The selection of a desired item in the selection set 104 can be performed in the manner described above. Similarly, the zoom in and zoom out operations can be performed in the same manner as described above.
  • Referring now to FIGS. 6-10, an example embodiment 200 of a circularly arranged selection set 204 embodiment is shown. In the particular embodiment shown, the circular item selection mechanism uses the same underlying principles as the linear embodiment described above. In the circular embodiment, the available items of selection set 204 are arranged in a circular display area 202 as shown in FIG. 6. In related embodiments, the selection set 204 can be arranged in an oval shape, a rectangular shape, or any other arbitrary shape. In these alternate embodiments, a radial position can be mapped to a particular location on an arbitrary shape. A small crosshair at the end of a selection vector 206 is also provided to define a particular selection point. As described in more detail below, the selection vector 206 and its crosshair indicator can be moved and positioned relative to the circularly arranged selection set 204 using two-dimensional movement of the pointing device 211 relative to the motion area 110 of the pointing device 211 as represented in FIGS. 6-10. As shown in FIG. 6, the pointing device 211 is at its neutral position (undeflected position) at the center point of the motion area 110 of the pointing device 211. In the neutral position, the selection vector 206 and its crosshair indicator remain positioned at the center of the circularly arranged selection set 204. As described above for display area 116, a display area 216 is provided in example embodiment 200 to assemble an item string. A string insertion point (i.e., an underscore character) is provided to mark the point in the string at which the next selected item will be inserted in the item string being assembled in display area 216.
  • Referring now to the example embodiment shown in FIG. 7, the user has deflected the pointing device 211 to the left and slightly downward (the approx. 265° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 7. In response to this pointing device 211 deflection, the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211. In the example shown in FIG. 7, the selection vector 206 and its crosshair indicator has moved to overlay an item in the selection set 204. In this embodiment, the selection set 204 remains stationary. By using the pointing device 211 to position the selection vector 206 and its crosshair indicator over an item in the selection set 204, the user can identify a selectable item, which is in proximity to the crosshair indicator of selection vector 206. As the crosshair indicator of selection vector 206 is moved in proximity to an item of selection set 204, the selectable item identified thereby is highlighted or distinctively shaded/colored to indicate to the user that the selectable item can be selected by activation of selection button 112. If the user activates selection button 112, the selectable item in selection set 204 is copied to the insertion point position of display area 216 and the string insertion point (i.e., an underscore character) is advanced to the next position in the item string being assembled in display area 216. In the example shown in FIG. 7, the user has deflected the pointing device 211 to the left and slightly downward (the approx. 265° position) thereby causing the selection vector 206 and its crosshair indicator to move in a corresponding direction to overlay an item (e.g., letter ‘K’) in the selection set 204. The user has then activated selection button 112 thereby causing the selectable item (e.g., letter ‘K’), identified by the movement of the selection vector 206 and its crosshair indicator, to be inserted at the insertion point of display area 216 as shown in FIG. 7. In this manner, a user can move the selection vector 206 and its crosshair indicator in two dimensions by a corresponding two-dimensional deflection of the pointing device 211 to position the selection vector 206 and its crosshair indicator over or in proximity to a desired item in the selection set 204. The selection of the desired item in the selection set 204 can be performed by activation of selection button 112 in the manner described above. In an alternative embodiment, the user does not need to position the crosshair indicator over or in proximity to a desired item in the selection set 204. Instead, the selection vector 206 can merely be moved in a direction towards a desired item selection and a radially projected selection vector 206 is used to identify and highlight a selectable item in the selection set 204. In this manner, the user does not have to move the crosshair of selection vector 206 completely out to the position of the desired item of selection set 204.
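  • The radially projected variant mentioned at the end of the preceding paragraph can be reduced to an angular lookup: the direction of the selection vector (or of the deflection itself) picks the nearest item on the ring. The even spacing, the clockwise ordering, and the 12-o'clock starting position in this sketch are assumptions.

```python
import math
from typing import Sequence

def item_from_direction(items: Sequence[str], dx: float, dy: float) -> str:
    """Map the direction of the selection vector onto the nearest item of a circular selection set.

    Items are assumed to be evenly spaced clockwise around the ring, with the first item at the
    0° (12-o'clock) position; (dx, dy) is the selection vector or pointing-device deflection,
    with positive dy meaning upward.
    """
    angle = math.degrees(math.atan2(dx, dy)) % 360.0          # 0° = up, 90° = right, 180° = down
    slot = round(angle / (360.0 / len(items))) % len(items)   # nearest item position on the ring
    return items[slot]
```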
  • In a particular embodiment, the location of the selection point identified by the position of the crosshair of selection vector 206 is continuously updated in response to corresponding movement of the pointing device 211. The same deflection of the pointing device 211 can cause different effects based on the location of the selection point. For example, when the selection point is positioned near the center of the circular selection set 204, the movement of the selection point becomes more rapid relative to the movement of the pointing device 211. When the selection point is positioned furthest from the center of the circular selection set 204, the movement of the selection point becomes less rapid relative to the movement of the pointing device 211. In this manner, the user can quickly position the selection point in the selection set 204 while maintaining accurate control as the selection point approaches a desired item in selection set 204. In other words, the motion of the selection point slows as it nears the edge of the circular selection set 204, making it easier for a user to hone in on a target item. When the pointing device 211 is not deflected, the selection point is quickly pulled back to the center of the circular selection set 204.
  • In a particular embodiment, the motion of the crosshair of selection vector 206 is always in the direction of deflection of the pointing device 211, but this motion can be taken from the current location of the crosshair. The magnitude of the motion vector as provided by the pointing device 211 can be relative to the current location of the crosshair. For example, the crosshair may start in the center of the circle and the pointing device 211 is then deflected as far as it can go in the 180 degree position. The selection position is updated by moving the crosshair down so that it is perhaps ⅓ of the radius from the center (i.e., not all the way down). The user then changes the deflection of the pointing device 211 to the full 270 degree position. This time, the crosshair only moves 1/9 of the radius (because the crosshair is further from the center) and it moves from its current location to the left in response to the new deflection of the pointing device 211. In this case, the selection vector 206 would be pointing toward items, “F” or “G” as shown in FIG. 9 and would have a similar magnitude. In general, motion for a particular embodiment is computed by just adding deflection vectors to the current location, where the magnitude of the motion vector is adjusted based on the location of the crosshair relative to the center position and the magnitude of the deflection. So even if the user keeps the pointing device 211 only slightly deflected in a given direction, the crosshair will still eventually reach the edge of the selection ring 202.
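  • A possible reading of this update rule in code: each frame, the deflection vector is scaled by a gain that shrinks as the crosshair moves away from the center, added to the current location, and clamped to the ring; with no deflection, the crosshair is pulled back to the center. The quadratic falloff, the small floor on the gain, and the snap-back rate are assumptions chosen only to illustrate the behavior described above, not formulas given in the disclosure.

```python
import math
from typing import Tuple

def update_selection_point(px: float, py: float, dx: float, dy: float, dt: float,
                           base_gain: float = 1.0, snap_back: float = 4.0) -> Tuple[float, float]:
    """Advance the crosshair (selection point) inside a unit-radius selection ring.

    (px, py) is the current crosshair position with the ring center at (0, 0); (dx, dy) is the
    pointing-device deflection, assumed normalized to [-1, 1] per axis.  Motion is always in the
    direction of deflection, taken from the current location, and its magnitude shrinks as the
    crosshair nears the edge of the ring.
    """
    if dx == 0.0 and dy == 0.0:
        # No deflection: quickly pull the selection point back toward the center of the ring.
        px -= px * snap_back * dt
        py -= py * snap_back * dt
        return px, py
    dist = min(1.0, math.hypot(px, py))
    gain = base_gain * (1.0 - dist) ** 2 + 0.05   # less responsive near the edge, but never zero,
    px += dx * gain * dt                          # so even a slight sustained deflection eventually
    py += dy * gain * dt                          # carries the crosshair out to the selection ring
    norm = math.hypot(px, py)
    if norm > 1.0:                                # clamp to the ring boundary
        px, py = px / norm, py / norm
    return px, py
```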
  • In a particular embodiment, the selectable item is indicated by a white crosshair and the item turns red. In a particular embodiment, a few special symbols can be used to indicate special functions (a minimal sketch of handling these symbols appears after the list below). For example:
    • ‘>’ can be used for space;
    • ‘<’ can be used for backspace;
    • ‘+’ can select a next alphabet arrangement as a selection set;
    • ‘−’ can select a previous alphabet arrangement as a selection set.
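  • One way to wire the special symbols listed above into the selection handling is sketched below. The symbols and their meanings come from the list above; the function name, the alphabets list of alternative selection-set arrangements, and the insertion-point handling are hypothetical.

```python
from typing import List, Tuple

def apply_selection(entered: List[str], selected: str,
                    alphabet_index: int, alphabets: List[List[str]]) -> Tuple[List[str], int]:
    """Apply one selected item, giving the special symbols from the list above their special meaning."""
    if selected == '>':                    # '>' inserts a space
        entered.append(' ')
    elif selected == '<':                  # '<' is backspace: clear the previous entry
        if entered:
            entered.pop()
    elif selected == '+':                  # '+' switches to the next alphabet arrangement
        alphabet_index = (alphabet_index + 1) % len(alphabets)
    elif selected == '-':                  # '-' switches to the previous alphabet arrangement
        alphabet_index = (alphabet_index - 1) % len(alphabets)
    else:                                  # ordinary item: insert it at the insertion point
        entered.append(selected)
    return entered, alphabet_index
```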
  • Referring now to the example embodiment shown in FIG. 8, the user has deflected the pointing device 211 to the right and upward (the approx. 45° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 8. In response to this pointing device 211 deflection, the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211. In the example shown in FIG. 8, the selection vector 206 and its crosshair indicator has moved to overlay an item in the circular selection set 204. In this embodiment, the selection set 204 remains stationary. As the crosshair indicator of selection vector 206 is moved in proximity to an item of selection set 204, the selectable item identified thereby can be highlighted or distinctively shaded/colored to indicate to the user that the selectable item can be selected by activation of selection button 112. In the example shown in FIG. 8, the user has deflected the pointing device 211 to the right and upward (the approx. 45° position) thereby causing the selection vector 206 and its crosshair indicator (the selection point) to move in a corresponding direction to overlay an item (e.g., letter ‘U’) in the selection set 204. The user has then activated selection button 112 thereby causing the selectable item (e.g., letter ‘U’), identified by the selection point, to be inserted at the insertion point of display area 216 as shown in FIG. 8. In this manner, a user can move the selection point in two dimensions by a corresponding two-dimensional deflection of the pointing device 211 to position the selection point over a desired item in the selection set 204. The selection of the desired item in the selection set 204 can be performed by activation of selection button 112 in the manner described above.
  • Referring now to the example embodiment shown in FIG. 9, the user has deflected the pointing device 211 downward and slightly to the left (the approx. 182° position) as illustrated by the vector shown in the representation of the pointing device motion area 110 illustrated in FIG. 9. In this example, the user has applied only a slight deflection of pointing device 211. In response to this pointing device 211 deflection, the selection vector 206 and its crosshair indicator (the selection point) has moved in a direction corresponding to the movement of the pointing device 211; but, the selection point has only slightly moved from the center position of the circular selection set 204 due to the corresponding slight deflection of pointing device 211. In the example shown in FIG. 9, the selection vector 206 and its crosshair indicator has not moved to overlay an item in the circular selection set 204; but, the user can apply a larger deflection to the pointing device 211 to select an item from the selection set 204 as described above. This larger deflection of pointing device 211 is shown in the example of FIG. 10. As illustrated in FIG. 10, the user has more fully deflected the pointing device 211 downward and slightly to the left (the approx. 182° position) as illustrated by the vector shown in the representation of the pointing device motion area 110. This deflection of pointing device 211 causes the selection vector 206 and its crosshair indicator (the selection point) to move in a corresponding direction to overlay an item (e.g., letter ‘E’) in the selection set 204. The user has then activated selection button 112 thereby causing the selectable item (e.g., letter ‘E’), identified by the selection point, to be inserted at the insertion point of display area 216 as shown in FIG. 10. Again in this embodiment, the selection set 204 remains stationary while the selection vector 206 and its crosshair indicator (the selection point) moves in a manner corresponding to deflection of the pointing device 211.
  • In an alternative embodiment of the circularly arranged selection set 204 embodiment 200 as shown in FIGS. 6-10, the selection vector 206 and its crosshair indicator (the selection point) can be held stationary at a pre-defined position (e.g., at the 12-o'clock position of display area 202) and the circular selection set 204 can be rotated clockwise or counter-clockwise within display area 202 with a corresponding deflection of the pointing device 211 to the right or left. In this manner, a user can move the selection set 204 underneath the stationary selection point with right or left deflection of the pointing device 211. Thus, an item of selection set 204 can be positioned in proximity to the selection point to identify a selectable item. The selection of a desired item in the selection set 204 can be performed in the manner described above. Similarly, the zoom in and zoom out operations can be performed in an embodiment of the circularly arranged selection set 204 in the same manner as described above.
  • FIG. 11 illustrates a particular example embodiment of a user interface for computer users, electronic game players, or TV users in which a selection set includes items representing a variety of physical or logical entities. Referring now to an example embodiment illustrated in FIG. 11, a user interface 1100 of an example embodiment is comprised of a display region 1102 (e.g., a window, a data entry box, etc.) in which a selection set 1104 is displayed. In the embodiment shown, the selection set 1104 is comprised of a set of items, each of which may represent an individually selectable option. In this case, the items in selection set 1104 are geometric shapes, icons, logos, drawing primitives, objects, images, device objects, or any of a variety of other types of selectable items that each represent a corresponding physical device, a file folder, a file or document, an image, a video or audio stream, a communication device, mode, network, or protocol, or any of a variety of other types of objects or entities. In the embodiment illustrated in FIG. 11, the selection set 1104 is configured in a linear pattern of evenly-spaced items extending to the borders of display region 1102. In an alternative embodiment, the selection set 1104 may similarly be arranged in a circular selection set as described above. As described above, a selection vector 106 can be provided in the center of the display region 1102 to mark a location in display region 1102 at which one of the items in selection set 1104 is selectable. Similarly, in a circular selection set embodiment, a selection vector 206 and its crosshair indicator (the selection point) can be provided to identify a selectable item in selection set 1104 as described above. In this manner, a wide variety of selectable items can be made available for selection by a user using the various embodiments of the user interface described herein.
  • Thus, as described for various embodiments herein, a system and process is disclosed for providing an arrangement of selectable items, a mechanism for selection from the arrangement of selectable items, and a mechanism for adjusting the granularity of control of the selector. In one embodiment, the granularity control can be a zooming mechanism to modify the size and/or position of items in a selection set. In another embodiment, the granularity control can be a modification of the motion vector based on a distance from a reference point and the speed or quantity of deflection of a pointing device. Thus, as a selection point approaches the selection set, the motion of the selection point becomes less responsive to movement of the pointing device, so the user has more control over the positioning of the selection point relative to an item in the selection set.
  • Example Process Flow
  • Referring to FIG. 12, a processing flow 900 for generating a user interface in an example embodiment is shown. The method of an example embodiment 900 includes: receiving an input from a pointing device indicating movement right or left (processing block 910); receiving an input from a pointing device indicating movement upward or downward (processing block 912); moving a selection set to the right in response to movement of the pointing device to the right and moving the selection set to the left in response to movement of the pointing device to the left (processing block 914); and zooming a selection set outward in response to movement of the pointing device downward, and zooming the selection set inward in response to movement of the pointing device upward (processing block 916).
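  • Read as code, the flow of FIG. 12 is essentially a per-frame handler that consumes the two deflection axes and applies the scrolling and zooming behaviors. The sketch below reuses the hypothetical SelectionState, scroll_selection_set, and zoom_selection_set helpers introduced in the earlier sketches and is, like them, only an illustration of the described flow.

```python
def process_linear_input(state: "SelectionState", deflection_x: float,
                         deflection_y: float, dt: float) -> "SelectionState":
    """One pass of the FIG. 12 flow for the linearly arranged selection set.

    Blocks 910/912: the right/left and upward/downward inputs arrive as deflection_x and
    deflection_y.  Block 914: move the selection set right or left.  Block 916: zoom the
    selection set outward on downward deflection and inward on upward deflection.
    """
    state.offset = scroll_selection_set(state.offset, deflection_x, dt)
    state.spacing = zoom_selection_set(state.spacing, deflection_y, dt)
    return state
```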
  • Referring to FIG. 13, a processing flow 1300 for generating a user interface in another example embodiment is shown. The method of an example embodiment 1300 includes: displaying a selection set in a circular orientation, the selection set including a plurality of items (processing block 1310); receiving an input from a pointing device indicating movement in a two-dimensional direction (processing block 1312); moving an endpoint of a selection pointer relative to the selection set in response to movement of the pointing device in the two-dimensional direction (processing block 1314); and enabling selection of an item in the selection set corresponding to the position of the endpoint of the selection pointer relative to the selection set (processing block 1316).
  • Modules, Components and Logic
  • Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. A component can be a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a component that operates to perform certain operations as described herein.
  • In various embodiments, a component may be implemented mechanically or electronically. For example, a component may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor) to perform certain operations. A component may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a component mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • Accordingly, the term “component” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which components are temporarily configured (e.g., programmed), each of the components need not be configured or instantiated at any one instance in time. For example, where the components comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different components at different times. Software may accordingly configure a processor, for example, to constitute a particular component at one instance of time and to constitute a different component at a different instance of time.
  • Components can provide information to, and receive information from, other components. Accordingly, the described components may be regarded as being communicatively coupled. Where multiple of such components exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the components. In embodiments in which multiple components are configured or instantiated at different times, communications between such components may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple components have access. For example, one component may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further component may then, at a later time, access the memory device to retrieve and process the stored output. Components may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • Electronic Apparatus and System
  • Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
  • A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
  • In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures require consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed in various example embodiments.
  • Example Machine Architecture and Machine-Readable Medium
  • FIG. 14 is a block diagram of a machine in the example form of a computer system 700 within which instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The example computer system 700 includes a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708. The computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 700 may also include an alphanumeric input device 712 (e.g., a keyboard), a user interface (UI) pointing device 714 (e.g., pointing device, thumbstick device, mouse, TV remote device, a game controller device, spatial movement detection devices, such as the Wii system sold by Nintendo of America, Inc., or any other hardware or software device/system, which can signal movement in a two-dimensional space, herein a pointing device), a disk drive unit 716, a signal generation device 718 (e.g., a hardware button or softkey located anywhere on a remote device, a gesture detection device, audible command detection device, or the like) and a network interface device 720.
  • Machine-Readable Medium
  • The disk drive unit 716 includes a machine-readable medium 722 on which is stored one or more sets of instructions and data structures (e.g., software 724) embodying or utilized by any one or more of the methodologies or functions described herein. The software 724 may also reside, completely or at least partially, within the main memory 704 and/or within the processor 702 during execution thereof by the computer system 700, the main memory 704 and the processor 702 also constituting machine-readable media.
  • While the machine-readable medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions or data structures. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • Transmission Medium
  • The software 724 may further be transmitted or received over a communications network 726 using a transmission medium. The software 724 may be transmitted using the network interface device 720 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), the Internet, mobile telephone networks, Plain Old Telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • Example Three-Tier Software Architecture
  • In some embodiments, the described methods may be implemented using a distributed or non-distributed software application designed under a three-tier architecture paradigm. Under this paradigm, various parts of computer code (or software) that instantiate or configure components or modules may be categorized as belonging to one or more of these three tiers. Some embodiments may include a first tier as an interface (e.g., an interface tier). Further, a second tier may be a logic (or application) tier that performs application processing of data inputted through the interface level. The logic tier may communicate the results of such processing to the interface tier, and/or to a backend, or storage tier. The processing performed by the logic tier may relate to certain rules, or processes that govern the software as a whole. A third, storage tier, may be a persistent storage medium, or a non-persistent storage medium. In some cases, one or more of these tiers may be collapsed into another, resulting in a two-tier architecture, or even a one-tier architecture. For example, the interface and logic tiers may be consolidated, or the logic and storage tiers may be consolidated, as in the case of a software application with an embedded database. The three-tier architecture may be implemented using one technology, or, a variety of technologies. The example three-tier architecture, and the technologies through which it is implemented, may be realized on one or more computer systems operating, for example, as a standalone system, or organized in a server-client, peer-to-peer, distributed, or some other suitable configuration. Further, these three tiers may be distributed across more than one computer system as various components.
  • Components
  • Example embodiments may include the above described tiers, and the processes or operations that constitute these tiers may be implemented as components. Common to many of these components is the ability to generate, use, and manipulate data. The components, and the functionality associated with each, may form part of standalone, client, server, or peer computer systems. The various components may be implemented by a computer system on an as-needed basis. These components may include software written in an object-oriented computer language such that a component-oriented or object-oriented programming technique can be implemented using a Visual Component Library (VCL), Component Library for Cross Platform (CLX), Java Beans (JB), Java Enterprise Beans (EJB), Component Object Model (COM), Distributed Component Object Model (DCOM), or other suitable technique.
  • Software for these components may further enable communicative coupling to other components (e.g., via various Application Programming Interfaces (APIs)), and may be compiled into one complete server, client, and/or peer software application. Further, these APIs may be able to communicate through various distributed programming protocols as distributed computing components.
  • Distributed Computing Components and Protocols
  • Some example embodiments may include remote procedure calls being used to implement one or more of the above described components across a distributed programming environment as distributed computing components. For example, an interface component (e.g., an interface tier) may form part of a first computer system that is remotely located from a second computer system containing a logic component (e.g., a logic tier). These first and second computer systems may be configured in a standalone, server-client, peer-to-peer, or some other suitable configuration. Software for the components may be written using the above described object-oriented programming techniques, and can be written in the same programming language, or a different programming language. Various protocols may be implemented to enable these various components to communicate regardless of the programming language used to write these components. For example, a component written in C++ may be able to communicate with another component written in the Java programming language through utilizing a distributed computing protocol such as a Common Object Request Broker Architecture (CORBA), a Simple Object Access Protocol (SOAP), or some other suitable protocol. Some embodiments may include the use of one or more of these protocols with the various protocols outlined in the Open Systems Interconnection (OSI) model, or Transmission Control Protocol/Internet Protocol (TCP/IP) protocol stack model for defining the protocols used by a network to transmit data.
  • A System of Transmission Between a Server and Client
  • Example embodiments may use the OSI model or TCP/IP protocol stack model for defining the protocols used by a network to transmit data. In applying these models, a system of data transmission between a server and client, or between peer computer systems may for example include five layers comprising: an application layer, a transport layer, a network layer, a data link layer, and a physical layer. In the case of software, for instantiating or configuring components, having a three tier architecture, the various tiers (e.g., the interface, logic, and storage tiers) reside on the application layer of the TCP/IP protocol stack. In an example implementation using the TCP/IP protocol stack model, data from an application residing at the application layer is loaded into the data load field of a TCP segment residing at the transport layer. This TCP segment also contains port information for a recipient software application residing remotely. This TCP segment is loaded into the data load field of an IP datagram residing at the network layer. Next, this IP datagram is loaded into a frame residing at the data link layer. This frame is then encoded at the physical layer, and the data transmitted over a network such as an internet, Local Area Network (LAN), Wide Area Network (WAN), or some other suitable network. In some cases, internet refers to a network of networks. These networks may use a variety of protocols for the exchange of data, including the aforementioned TCP/IP, and additionally ATM, SNA, SDI, or some other suitable protocol. These networks may be organized within a variety of topologies (e.g., a star topology), or structures.
  • Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
  • Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
  • The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (1)

What is claimed is:
1. A method comprising:
receiving an input from a pointing device indicating movement right or left;
receiving an input from a pointing device indicating movement upward or downward;
moving a selection set right in response to movement of the pointing device to the right, moving the selection set left in response to movement of the pointing device to the left; and
zooming the selection set outward in response to movement of the pointing device downward, and zooming the selection set inward in response to movement of the pointing device upward.
US14/887,161 2009-02-05 2015-10-19 System and method for generating a user interface for text and item selection Abandoned US20160041729A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/887,161 US20160041729A1 (en) 2009-02-05 2015-10-19 System and method for generating a user interface for text and item selection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/366,293 US9195317B2 (en) 2009-02-05 2009-02-05 System and method for generating a user interface for text and item selection
US14/887,161 US20160041729A1 (en) 2009-02-05 2015-10-19 System and method for generating a user interface for text and item selection

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/366,293 Continuation US9195317B2 (en) 2009-02-05 2009-02-05 System and method for generating a user interface for text and item selection

Publications (1)

Publication Number Publication Date
US20160041729A1 true US20160041729A1 (en) 2016-02-11

Family

ID=41853775

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/366,293 Active 2029-12-22 US9195317B2 (en) 2009-02-05 2009-02-05 System and method for generating a user interface for text and item selection
US14/887,161 Abandoned US20160041729A1 (en) 2009-02-05 2015-10-19 System and method for generating a user interface for text and item selection

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/366,293 Active 2029-12-22 US9195317B2 (en) 2009-02-05 2009-02-05 System and method for generating a user interface for text and item selection

Country Status (2)

Country Link
US (2) US9195317B2 (en)
EP (1) EP2216709A3 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110035700A1 (en) * 2009-08-05 2011-02-10 Brian Meaney Multi-Operation User Interface Tool
KR101636705B1 (en) * 2009-08-06 2016-07-06 삼성전자주식회사 Method and apparatus for inputting letter in portable terminal having a touch screen
US8170372B2 (en) * 2010-08-06 2012-05-01 Kennedy Michael B System and method to find the precise location of objects of interest in digital images
ES2736800T3 (en) * 2010-09-30 2020-01-07 Rakuten Inc Display device, display procedure, non-transient computer readable recording medium in which a program and script program is registered
US9495773B2 (en) * 2011-10-24 2016-11-15 Nokia Technologies Oy Location map submission framework
US9354805B2 (en) * 2012-04-30 2016-05-31 Blackberry Limited Method and apparatus for text selection
DE102013004245A1 (en) * 2013-03-12 2014-09-18 Audi Ag A device associated with a vehicle with spelling means - selection mark
KR102194262B1 (en) 2013-12-02 2020-12-23 삼성전자주식회사 Method for displaying pointing information and device thereof
USD757038S1 (en) * 2014-04-18 2016-05-24 Nutonian, Inc. Display screen with graphical user interface
CN104035676B (en) * 2014-06-25 2019-01-18 百度在线网络技术(北京)有限公司 A kind of switching method and device of the page
KR20160097867A (en) * 2015-02-10 2016-08-18 삼성전자주식회사 Image display apparatus and method for displaying image
EP3575944B1 (en) * 2018-05-29 2021-04-07 Advanced Digital Broadcast S.A. System and method for inputting alphanumeric characters in a computer system
EP3575943B1 (en) * 2018-05-29 2021-04-07 Advanced Digital Broadcast S.A. System and method for a virtual keyboard
KR20210016752A (en) * 2019-08-05 2021-02-17 윤현진 English input keyboard for critically ill patients
IT202100013235A1 (en) * 2021-05-21 2022-11-21 Dico Tech S R L SYSTEM AND METHOD FOR NON-VERBAL COMMUNICATION

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794388A (en) * 1980-02-20 1988-12-27 Summagraphics Corporation Method of and apparatus for controlling a display
US5185628A (en) * 1991-11-18 1993-02-09 Eastman Kodak Company Reproduction apparatus with improved operator interactive display for use in job set-up
US6061062A (en) * 1991-12-20 2000-05-09 Apple Computer, Inc. Zooming controller
US5543818A (en) 1994-05-13 1996-08-06 Sony Corporation Method and apparatus for entering text using an input device having a small number of keys
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6147674A (en) * 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US5828575A (en) * 1996-05-06 1998-10-27 Amadasoft America, Inc. Apparatus and method for managing and distributing design and manufacturing information throughout a sheet metal production facility
US5748185A (en) * 1996-07-03 1998-05-05 Stratos Product Development Group Touchpad with scroll and pan regions
GB0027260D0 (en) * 2000-11-08 2000-12-27 Koninl Philips Electronics Nv An image control system
US6370282B1 (en) * 1999-03-03 2002-04-09 Flashpoint Technology, Inc. Method and system for advanced text editing in a portable digital electronic device using a button interface
US6346938B1 (en) * 1999-04-27 2002-02-12 Harris Corporation Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model
US6593913B1 (en) 2000-03-14 2003-07-15 Jellyvision, Inc Method and system for selecting a displayed character using an input device
DE10022970A1 (en) * 2000-05-11 2001-11-22 Bosch Gmbh Robert Inputting a character sequence into a computer device, e.g. phone or vehicle navigation system, where an input device allows simple selection of characters from a character array, saving repetitive pushing of individual keys
DE10025126A1 (en) * 2000-05-20 2001-11-22 Bayerische Motoren Werke Ag Character-wise entry of search term for information and/or communications system in motor vehicle involves displaying selected characters and last dialed character offset from list
US8287374B2 (en) * 2000-07-07 2012-10-16 Pryor Timothy R Reconfigurable control displays for games, toys, and other applications
US6501464B1 (en) 2000-10-31 2002-12-31 Intel Corporation On-screen transparent keyboard interface
US20020084984A1 (en) * 2000-12-29 2002-07-04 Beinor Stephen E. Linear joystick
US6972776B2 (en) * 2001-03-20 2005-12-06 Agilent Technologies, Inc. Scrolling method using screen pointing device
US6640185B2 (en) * 2001-07-21 2003-10-28 Alpine Electronics, Inc. Display method and apparatus for navigation system
US7075513B2 (en) * 2001-09-04 2006-07-11 Nokia Corporation Zooming and panning content on a display screen
US20040017355A1 (en) * 2002-07-24 2004-01-29 Youngtack Shim Cursor control systems and methods
JP2004070654A (en) * 2002-08-06 2004-03-04 Matsushita Electric Ind Co Ltd Portable electronic equipment
JP2004157866A (en) * 2002-11-07 2004-06-03 Fujitsu Component Ltd Pointing device, its control method and mobile telephone
JP4043965B2 (en) * 2003-02-05 2008-02-06 カルソニックカンセイ株式会社 List display device
US8046705B2 (en) * 2003-05-08 2011-10-25 Hillcrest Laboratories, Inc. Systems and methods for resolution consistent semantic zooming
US7154526B2 (en) * 2003-07-11 2006-12-26 Fuji Xerox Co., Ltd. Telepresence system and method for video teleconferencing
US8274534B2 (en) * 2005-01-31 2012-09-25 Roland Wescott Montague Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag
WO2006099395A2 (en) * 2005-03-11 2006-09-21 Adobe Systems, Inc. System and method for displaying information using a compass
JP4170314B2 (en) * 2005-05-25 2008-10-22 株式会社スクウェア・エニックス Scroll display control device, program, and recording medium
US7797641B2 (en) * 2005-05-27 2010-09-14 Nokia Corporation Mobile communications terminal and method therefore
JP2009514106A (en) * 2005-10-26 2009-04-02 株式会社ソニー・コンピュータエンタテインメント System and method for interfacing with a computer program
US7934169B2 (en) * 2006-01-25 2011-04-26 Nokia Corporation Graphical user interface, electronic device, method and computer program that uses sliders for user input
US7750911B2 (en) * 2006-02-21 2010-07-06 Chrysler Group Llc Pen-based 3D drawing system with 3D mirror symmetric curve drawing
US20080007753A1 (en) * 2006-07-06 2008-01-10 Mark Van Regenmorter User interface for a multi-function peripheral device
US8277316B2 (en) * 2006-09-14 2012-10-02 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
AU2006252194B2 (en) * 2006-12-21 2010-02-11 Canon Kabushiki Kaisha Scrolling Interface
US8373655B2 (en) * 2007-01-05 2013-02-12 Apple Inc. Adaptive acceleration of mouse cursor
US20080222530A1 (en) * 2007-03-06 2008-09-11 Microsoft Corporation Navigating user interface controls on a two-dimensional canvas
US8686991B2 (en) * 2007-09-26 2014-04-01 Autodesk, Inc. Navigation system for a 3D virtual scene
JP4691536B2 (en) * 2007-09-27 2011-06-01 株式会社日立製作所 Display method of information display device
US8341544B2 (en) * 2007-12-14 2012-12-25 Apple Inc. Scroll bar with video region in a media system
US8194037B2 (en) * 2007-12-14 2012-06-05 Apple Inc. Centering a 3D remote controller in a media system
US7630148B1 (en) * 2008-06-11 2009-12-08 Ge Inspection Technologies, Lp System for providing zoom, focus and aperture control in a video inspection device
US8754910B2 (en) * 2008-10-01 2014-06-17 Logitech Europe S.A. Mouse having pan, zoom, and scroll controls

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5287812A (en) * 1989-09-28 1994-02-22 Svedala Industries, Inc. Railroad car positioning apparatus
US5844559A (en) * 1996-10-07 1998-12-01 Apple Computer, Inc. Method, apparatus and computer program products to display objects using windows
US6341201B1 (en) * 1997-09-30 2002-01-22 Fuji Photo Optical Co., Ltd. Remotely controllable camera system
US6466199B2 (en) * 1998-07-23 2002-10-15 Alps Electric Co., Ltd. Method for moving a pointing cursor
US6963332B1 (en) * 1998-11-10 2005-11-08 Nec Corporation Letter input method and device using the same
US6621992B2 (en) * 1999-10-13 2003-09-16 Sharp Kabushiki Kaisha Copier operation control and input device
US6833848B1 (en) * 1999-12-16 2004-12-21 Ricoh Co., Ltd. Game console based digital photo album
US8341553B2 (en) * 2000-02-17 2012-12-25 George William Reed Selection interface systems and methods
US7254785B2 (en) * 2000-02-17 2007-08-07 George Reed Selection interface system
US20080015841A1 (en) * 2000-05-26 2008-01-17 Longe Michael R Directional Input System with Automatic Correction
US20050052345A1 (en) * 2001-10-31 2005-03-10 Siemens Aktiengesellschaft Control device
US20030117419A1 (en) * 2001-12-12 2003-06-26 Stmicroelectronics, Inc. Method and system of continuously scaling video images
US7106300B2 (en) * 2002-07-12 2006-09-12 Mitutoyo Corporation Method for converting joystick deflection into motion in a computer vision system
US7739604B1 (en) * 2002-09-25 2010-06-15 Apple Inc. Method and apparatus for managing windows
US20040183817A1 (en) * 2002-12-03 2004-09-23 Bitstream Inc. Methods, systems, and programming for scaled display of web pages
US9039533B2 (en) * 2003-03-25 2015-05-26 Creative Kingdoms, Llc Wireless interactive game having both physical and virtual elements
US20040194655A1 (en) * 2003-04-04 2004-10-07 Insana Samuel P. Carriage assembly for positioning railroad cars
US20080281442A1 (en) * 2004-04-01 2008-11-13 Siemens Aktiengesellschaft Control Device For Displacing At Least One Machine Axis Of A Machine Tool Or Production Machine
US20060146019A1 (en) * 2005-01-03 2006-07-06 Nokia Corporation Control device for browsing and selecting an item in a list
US20060288314A1 (en) * 2005-06-15 2006-12-21 Microsoft Corporation Facilitating cursor interaction with display objects
US20100162150A1 (en) * 2006-05-26 2010-06-24 Google Inc. Embedded Navigation Interface
US20080074389A1 (en) * 2006-09-27 2008-03-27 Beale Marc Ivor J Cursor control method
US20090031240A1 (en) * 2007-07-27 2009-01-29 Gesturetek, Inc. Item selection using enhanced control
US20090262086A1 (en) * 2007-12-26 2009-10-22 E-Lead Electronic Co., Ltd. Touch-pad cursor control method
US20090187860A1 (en) * 2008-01-23 2009-07-23 David Fleck Radial control menu, graphical user interface, method of controlling variables using a radial control menu, and computer readable medium for performing the method
US20090262072A1 (en) * 2008-02-04 2009-10-22 E-Lead Electronic Co., Ltd. Cursor control system and method thereof
US20090249257A1 (en) * 2008-03-31 2009-10-01 Nokia Corporation Cursor navigation assistance
US20100192102A1 (en) * 2009-01-29 2010-07-29 International Business Machines Corporation Displaying radial menus near edges of a display area

Also Published As

Publication number Publication date
EP2216709A2 (en) 2010-08-11
US9195317B2 (en) 2015-11-24
US20100199224A1 (en) 2010-08-05
EP2216709A3 (en) 2012-11-07

Similar Documents

Publication Publication Date Title
US9195317B2 (en) System and method for generating a user interface for text and item selection
CN1327328C (en) Three-dimensional motion graphic user interface and method and apparatus for providing the same
US7917868B2 (en) Three-dimensional motion graphic user interface and method and apparatus for providing the same
US8510680B2 (en) Three-dimensional motion graphic user interface and method and apparatus for providing the same
TWI596538B (en) Provision of an open instance of an application
USRE47592E1 (en) Managing user interface control panels
CN103703437B (en) For the method and apparatus of image viewing application
US8769424B2 (en) Simplified user interface navigation in at least first and second cursor navigation directions
US20130127738A1 (en) Dynamic scaling of touch sensor
US20140223381A1 (en) Invisible control
KR102218490B1 (en) Graphical input display having a carousel of characters to facilitate character input
JP2017523515A (en) Change icon size
KR20190059995A (en) User interface for a computing device
US20160196006A1 (en) Customizable Bladed Applications
US20140329593A1 (en) Text entry using game controller
JP2006164261A (en) Data processor having advanced user interface, and system
JP2006164260A (en) Data processor and user interface for system
US20160085388A1 (en) Desktop Environment Differentiation in Virtual Desktops
KR20140073381A (en) Display apparatus and method for controlling thereof
US11023070B2 (en) Touch input hover
EP3238019B1 (en) Least disruptive icon displacement
EP3575943B1 (en) System and method for a virtual keyboard
EP3575944B1 (en) System and method for inputting aplhanumeric characters in a computer system
KR20060014874A (en) Three dimensional motion graphic user interface and method and apparutus for providing this user interface
CN105260111A (en) Touch control equipment and function navigation system therefor

Legal Events

Date Code Title Description
AS Assignment

Owner name: OPENTV, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DOERRING, NICHOLAS D.;REEL/FRAME:036825/0972

Effective date: 20090205

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION