US20110238612A1 - Multi-factor probabilistic model for evaluating user input - Google Patents
- Publication number: US20110238612A1 (application US 12/732,190)
- Authority: United States (US)
- Prior art keywords
- user input
- probability
- user interface
- computer
- interface control
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
Definitions
- Graphical user interfaces (“GUIs”) typically allow a user to interact with a computer system by providing a variety of on-screen user interface controls (“controls”). For instance, a GUI might present menu controls, scroll bar controls, button controls, and other types of controls to a user. The user can then utilize a user input device, such as a touch screen or a mouse, to select and otherwise interact with the controls.
- Point-in-rectangle hit testing is suitable for use with user input devices that have a high degree of precision, such as a mouse. It can be problematic, however, when utilized with touch screens and other types of user input devices that do not have the precision of a mouse. Moreover, the high degree of precision typically required by point-in-rectangle hit testing may also cause frustration for users who do not possess the fine motor skills necessary to precisely select on-screen user interface controls.
- A multi-factor probabilistic model for evaluating user input is presented herein.
- User input can be evaluated in a manner that reduces the precision typically required by point-in-rectangle hit testing. This can be beneficial when used in conjunction with touch screens and other user input devices that do not have the precision of a mouse, and when used by users who do not possess the fine motor skills necessary to precisely select on-screen user interface controls.
- Because the model takes multiple factors into account, a user may approach interactions with a computer that utilizes it in a variety of ways. For instance, the user might interact with user interface elements by precise positioning as done previously, by a unique gesture, by hand pose or shape, or in another manner.
- A multi-factor probabilistic model is utilized to evaluate user input and determine whether the input is intended for an on-screen user interface control.
- When user input is received, the probability that the input was intended for each on-screen user interface control is computed.
- The user input is then associated with the user interface control that has the highest computed probability.
- In one implementation, the highest probability must also exceed a threshold probability in order for the user input to be associated with the user interface control.
- The probability that the user input was intended for each user interface control may also be computed over time.
- The probability is computed utilizing the multi-factor probabilistic model.
- The multi-factor probabilistic model computes the probability that user input was intended for each user interface control utilizing a multitude of factors.
- The factors utilized by the model might include, but are not limited to: the probability that the user input is near each user interface control; the probability that the motion, or path, of the user input is consistent with motion typically utilized to control the user interface control; the probability that the shape of the user input is consistent with the shape of input for controlling the user interface control; and the probability that the size of the user input is consistent with the size of input for controlling the user interface control.
- Each factor may be assigned a weight.
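The weighted combination of factor probabilities described above can be written as a normalized weighted sum. The normalization by the total weight is an illustrative assumption, not something stated in the source:

```latex
P(\text{control}) = \frac{\sum_{i} W_i \, p_i(\text{control})}{\sum_{i} W_i}
```

Here each $p_i$ is one of the per-factor probabilities (proximity, motion, shape, size) and each $W_i$ is the weight assigned to that factor.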
- A high probability might be calculated for several user interface controls.
- In that case, a suitable user interface may be provided through which the user can specify which of the user interface controls the input was intended for.
- The user interface controls having the highest probabilities might be visually emphasized in order to indicate to the user that an ambiguous user input was received. The user might then more particularly select one of the emphasized controls in order to complete the user input operation.
- FIG. 1 is a block diagram showing aspects of an illustrative operating environment and aspects of a probabilistic multi-factor model for evaluating user input disclosed herein;
- FIG. 2 is a flow diagram showing aspects of the operation of a probabilistic multi-factor model for evaluating user input in one embodiment presented herein;
- FIGS. 3-5 are screen diagrams illustrating the application of the probabilistic multi-factor model provided herein to several sample user inputs.
- FIG. 6 is a computer architecture diagram showing an illustrative computer hardware and software architecture for a computing system capable of implementing the embodiments presented herein.
- FIG. 1 illustrates aspects of an illustrative operating environment 100 along with aspects of a probabilistic multi-factor model 110 for evaluating user input 108.
- The operating environment 100 includes an application program 102 and an operating system 104 capable of generating graphical user interface (“UI”) controls 106.
- UI controls 106 are elements of a graphical user interface that display information to a user and/or receive input from the user.
- The UI controls 106 might include, but are not limited to, scroll bars, check boxes, radio buttons, buttons, selectable icons, menu items, dialog boxes, sliders, list boxes, drop-down lists, toolbars, ribbon controls, combo boxes, tabs, and windows.
- The term UI control as utilized herein encompasses any type of UI element displayed by a program that receives user input 108.
- Although FIG. 1 illustrates only an operating system 104 and an application program 102 generating the UI controls 106, other types of programs might also generate the UI controls 106.
- A user input device 112 receives user input 108 and provides it to a computer system executing the operating system 104 and the application program 102.
- In one embodiment, the user input device 112 is a touch screen, such as a capacitive touch screen.
- The user input device 112 may, however, comprise any type of input device including, but not limited to, a keyboard, mouse, trackball, joystick, light pen, or game controller.
- The user input device 112 might also comprise a free space motion capture system. For instance, some free space motion capture systems utilize an infrared emitter and sensor to detect motion in free space; the detected motion can then be provided to a computer system as the user input 108.
- The term user input device 112 refers to any type of device through which a user might provide input to a computer system.
- The user input 108 will vary depending upon the type of user input device 112 being utilized.
- When the user input device 112 is a touch screen, the user input 108 will comprise a user's interaction with the touch screen.
- The user input device 112 can generate data indicating the two-dimensional coordinates of the user's interaction with the touch screen.
- The data might vary over time, so that user input 108 that has a time component can also be described.
- For instance, the data might indicate that a gesture was made on the touch screen, such as a swipe across the touch screen with a single finger or multiple fingers.
- Other types of user input devices 112 will similarly generate appropriate data describing the user input 108.
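The time-varying touch data described above might be represented as a sequence of timestamped two-dimensional samples. The data structure and helper below are hypothetical illustrations, not an API defined by the patent:

```python
# Hypothetical representation of time-varying touch input: a sequence
# of timestamped 2-D samples, plus a crude gesture check.
from dataclasses import dataclass


@dataclass
class TouchSample:
    t: float  # seconds since the input began
    x: float  # screen x coordinate
    y: float  # screen y coordinate


def is_horizontal_swipe(samples, min_dx=50.0):
    """Return True if the sampled path moved mostly horizontally."""
    if len(samples) < 2:
        return False
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    # Require enough horizontal travel, dominating vertical travel.
    return abs(dx) >= min_dx and abs(dx) > abs(dy)


swipe = [TouchSample(0.0, 10, 100), TouchSample(0.1, 60, 102),
         TouchSample(0.2, 140, 105)]
```

A single tap would produce one sample (or several nearly identical samples) and fail the swipe check, while the `swipe` path above passes it.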
- The user input device 112 receives the user input 108.
- The user input 108 is then provided to the probabilistic multi-factor model 110.
- The probabilistic multi-factor model 110 is a software component configured to evaluate the user input 108 to determine which of one or more UI controls 106 the user input 108 should be provided to.
- The probabilistic multi-factor model 110 might be integrated with the operating system 104 or the application program 102.
- The probabilistic multi-factor model 110 might also execute as part of another type of program.
- The probabilistic multi-factor model 110 computes the probability that the user input 108 was intended for each on-screen UI control 106.
- The probabilistic multi-factor model 110 then associates the received user input 108 with the UI control 106 that has the highest computed probability.
- In one implementation, the highest probability must also exceed a threshold probability in order for the user input 108 to be associated with a UI control 106.
- The probability that the user input 108 was intended for each UI control 106 might also be computed over time. In this way, user input 108 that varies with time, such as gestures, can be evaluated over the duration of the input to determine the intended UI control 106.
- Because the multi-factor probabilistic model 110 takes multiple factors into account, as more fully described herein, a user may approach interactions with a computer that utilizes the model in a variety of ways.
- The user might interact with the user interface controls 106 by precise positioning as done previously, by a unique gesture, or by hand pose or shape, for instance.
- The multi-factor probabilistic model 110 computes the probability that the user input 108 was intended for each UI control 106 utilizing a multitude of factors.
- The model 110 might utilize factors including, but not limited to: the probability that the user input 108 is near each UI control 106; the probability that the motion, or path, of the user input 108 is consistent with motion typically utilized to control the UI control 106; the probability that the shape of the user input 108 is consistent with the shape of input for controlling the UI control 106; and the probability that the size of the user input 108 is consistent with the size of input for controlling the UI control 106.
- Each factor might also be weighted. Facilities might also be provided for allowing a user to specify or modify the weight assigned to each factor.
- The probabilistic multi-factor model 110 provides a user interface for allowing a user to disambiguate the user input 108.
- For instance, the model 110 might compute a high probability for several user interface controls 106.
- In that case, the model 110 might provide a suitable UI through which the user can specify the UI control 106 that the user input 108 was intended for.
- The UI controls 106 having the highest probabilities might be visually emphasized in order to indicate to the user that ambiguous user input 108 was received. The user might then more particularly select one of the emphasized UI controls 106 in order to specify that the selected UI control 106 is the intended control. Additional details regarding the operation of the probabilistic multi-factor model 110 in this regard are provided below with respect to FIGS. 2-5.
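The disambiguation step described above might be sketched as follows: when several controls score close to the best, all of them are returned as candidates for visual emphasis rather than one being chosen outright. The function name, threshold, and margin are illustrative assumptions, not values from the patent:

```python
# Sketch of ambiguity detection: report every control whose computed
# probability is both high enough to act on and close to the maximum.


def ambiguous_candidates(scores, threshold=0.5, margin=0.1):
    """Return the controls to visually emphasize, sorted by name.

    scores: dict mapping control name -> computed probability in [0, 1]
    """
    if not scores:
        return []
    best = max(scores.values())
    if best < threshold:
        return []  # nothing likely enough to act on
    # Keep every control within `margin` of the best score.
    return sorted(c for c, p in scores.items() if best - p <= margin)


# Two buttons scoring nearly the same produce an ambiguous result:
print(ambiguous_candidates({"ok": 0.82, "cancel": 0.78, "help": 0.2}))
# → ['cancel', 'ok']
```

A result with more than one entry would trigger the emphasis UI; a single entry means the input can be dispatched directly.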
- FIG. 2 is a flow diagram that includes a routine 200 showing aspects of an illustrative process performed by the probabilistic multi-factor model 110 when evaluating user input 108 .
- The logical operations described herein with respect to FIG. 2 are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system.
- The implementation is a matter of choice dependent on the performance and other requirements of the computing system.
- Accordingly, the logical operations described herein are referred to variously as operations, structural devices, acts, or modules. These operations, structural devices, acts, and modules may be implemented in software, in firmware, in special purpose digital logic, or in any combination thereof. More or fewer operations may be performed than shown in the figures and described herein, and the operations may be performed in a different order than described.
- The routine 200 begins at operation 202, where the probabilistic multi-factor model 110 determines whether user input 108 has been received. If user input 108 has been received, the routine 200 proceeds to operation 204, where a variable (referred to herein as the “current UI control variable”) is initialized to identify one of the on-screen UI controls 106.
- The current UI control variable identifies the UI control 106 for which the probabilistic multi-factor model 110 is currently computing a probability (the “current UI control”). Once the current UI control variable has been initialized, the routine 200 proceeds from operation 204 to operation 206.
- At operation 206, the probabilistic multi-factor model 110 computes the probability that the user input 108 was intended for the current UI control 106.
- As discussed above, the multi-factor probabilistic model 110 computes the probability that the user input 108 was intended for the current UI control 106 utilizing a multitude of factors.
- In this computation, W1-W3 represent weights assigned to each of the factors. As discussed briefly above, facilities might also be provided for allowing a user to specify or modify the weight assigned to each factor.
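A minimal sketch of this weighted computation, assuming the three weights W1-W3 are matched to three of the factors named earlier (proximity, motion, shape). The function name, example values, and normalization by the total weight are illustrative assumptions:

```python
# Sketch of the weighted factor combination: each factor probability is
# scaled by its weight, and the sum is normalized back into [0, 1].


def weighted_probability(p_near, p_motion, p_shape, w1, w2, w3):
    """P = (W1*p_near + W2*p_motion + W3*p_shape) / (W1 + W2 + W3)."""
    total = w1 + w2 + w3
    if total == 0:
        return 0.0
    return (w1 * p_near + w2 * p_motion + w3 * p_shape) / total


# Hypothetical scores for one control, with proximity weighted double:
p = weighted_probability(0.9, 0.6, 0.8, w1=2.0, w2=1.0, w3=1.0)  # ≈ 0.8
```

Raising `w1` relative to the other weights makes proximity dominate the result, which is how user-adjustable weights would change the model's behavior.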
- The probabilistic multi-factor model 110 may also compute the probability that the user input 108 was intended for the current UI control 106 using other factors. For instance, the factors may take into consideration the motion, gesture, shape, context, and history of the user input 108.
- From operation 206, the routine 200 proceeds to operation 208.
- At operation 208, the probabilistic multi-factor model 110 determines whether there are other on-screen UI controls 106 for which the probability should be calculated. If so, the routine 200 proceeds from operation 208 to operation 210, where the current UI control variable is incremented to identify the next UI control 106. The routine 200 then proceeds from operation 210 to operation 206, where the probability that the user input 108 was intended for the current UI control 106 is computed in the manner described above.
- If not, the routine 200 proceeds to operation 212.
- At operation 212, the probabilistic multi-factor model 110 determines whether additional user input 108 has been received. For instance, additional user input 108 may be received where the user input 108 is a gesture that varies over time. In this way, the model 110 will continually compute the probability that the user input 108 was intended for each of the UI controls 106 over the time that the input is made. If more user input 108 has been received, the routine 200 returns to operation 204, described above, where the probability that the user input 108 was intended for each of the UI controls 106 is again computed. If no additional user input 108 has been received, the routine 200 proceeds from operation 212 to operation 214.
- At operation 214, the probabilistic multi-factor model 110 determines whether the probability calculations described above resulted in a high probability that a single UI control 106 was the intended recipient of the user input 108. In one implementation, the highest probability must also exceed a threshold probability in order for the user input 108 to be associated with a UI control 106. If one UI control 106 had a high probability, the routine 200 proceeds from operation 214 to operation 216.
- At operation 216, the probabilistic multi-factor model 110 associates the user input 108 with the UI control 106 having the highest computed probability. Associating the user input 108 with the UI control 106 might include passing the user input 108 directly to the UI control 106, notifying the operating system 104 or application 102 that user input 108 was received for the UI control 106, or causing the UI control 106 to be notified of the receipt of the user input 108 in another manner. From operation 216, the routine 200 returns to operation 202, described above.
- Otherwise, the routine 200 proceeds from operation 214 to operation 218.
- At operation 218, the probabilistic multi-factor model 110 determines whether the probability calculations resulted in a high probability that multiple UI controls 106 were the intended recipients of the user input 108. If not, the routine 200 returns to operation 202. If so, the routine 200 proceeds to operation 220, where the model 110 causes a user interface to be provided that allows the user to disambiguate the user input 108.
- At operation 220, the probabilistic multi-factor model 110 provides a suitable UI through which the user can specify the UI control 106 that the user input 108 was intended for.
- The UI controls 106 having the highest probabilities might be visually emphasized in order to indicate to the user that ambiguous user input 108 was received. The user might then more particularly select one of the emphasized UI controls 106 in order to specify that the selected UI control 106 is the intended UI control 106. From operation 220, the routine 200 returns to operation 202, described above.
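The flow of routine 200 described above (operations 202-220) might be sketched as a single dispatch function. The function name, thresholds, and return convention are illustrative assumptions, not from the patent:

```python
# Sketch of routine 200: score every on-screen control for the received
# input, then dispatch to a clear winner, ask the user to disambiguate,
# or ignore the input entirely.


def evaluate_input(user_input, controls, score_fn,
                   threshold=0.5, margin=0.1):
    """Return ('dispatch', control), ('disambiguate', [controls]),
    or ('ignore', None)."""
    # Operations 204-210: compute a probability for each control.
    scores = {c: score_fn(user_input, c) for c in controls}
    if not scores:
        return ("ignore", None)
    best = max(scores, key=scores.get)
    # Operation 214: the best score must exceed a threshold.
    if scores[best] < threshold:
        return ("ignore", None)
    # Operation 218: several near-best scores mean ambiguous input.
    contenders = [c for c, p in scores.items()
                  if scores[best] - p <= margin]
    if len(contenders) > 1:
        return ("disambiguate", sorted(contenders))
    # Operation 216: associate the input with the winning control.
    return ("dispatch", best)


# One clearly likely control results in a direct dispatch:
action, target = evaluate_input(
    "tap", ["ok", "cancel"],
    lambda inp, c: 0.9 if c == "ok" else 0.2)
```

`score_fn` stands in for the multi-factor probability computation of operation 206; the loop over controls in operations 208-210 becomes the dictionary comprehension. Repeating this call as new samples of a gesture arrive corresponds to the time-varying evaluation of operation 212.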
- FIG. 3 shows a screen display 300A provided by a user input device 112 equipped with a touch screen.
- The screen display 300A in this example includes a single UI control 106A, which is a button.
- In the example shown, user input 108 has been received on the touch screen in the form of a single touch 304A.
- Under point-in-rectangle hit testing, the single touch 304A would not register as a selection of the UI control 106A because the touch 304A is not within the boundaries of the UI control 106A.
- Utilizing the probabilistic multi-factor model 110, however, the single touch 304A may be registered as a selection of the UI control 106A even though the touch 304A is not within the boundaries of the UI control 106A.
- In making this determination, the model 110 may take into account the type of user input (e.g. a single touch 304A), the distance from the touch 304A to the UI control 106A (e.g. relatively close), the consistency of the touch 304A with the type of input expected by the UI control 106A (e.g. very consistent), and the consistency of the shape and size of the touch 304A with the shape and size of input expected by the UI control 106A.
- Based upon these factors, the probabilistic multi-factor model 110 may compute a high probability that the touch 304A was intended for the UI control 106A and associate the touch 304A with the UI control 106A. In this way, the touch 304A might select the UI control 106A even though the touch 304A is not within the boundaries of the actual control.
- FIG. 4 shows a screen display 300B provided by a user input device 112 equipped with a touch screen.
- The screen display 300B in this example includes two UI controls 106A and 106B, both of which are buttons.
- In the example shown, user input 108 has been received on the touch screen in the form of a single touch 304B.
- Under point-in-rectangle hit testing, the single touch 304B would not register as a selection of either of the UI controls 106A and 106B, because the touch 304B is not within the boundaries of either of these controls.
- Utilizing the probabilistic multi-factor model 110, however, the single touch 304B may be registered as a selection of the UI control 106B even though the touch 304B is not within the boundaries of the UI control 106B.
- In making this determination, the model 110 may take into account the type of user input (e.g. a single touch 304B), the distance from the touch 304B to the UI controls 106A and 106B (e.g. close to the UI control 106B but farther from the UI control 106A), and the consistency of the touch 304B with the type of input expected by the UI controls 106A and 106B.
- Based upon these factors, the probabilistic multi-factor model 110 may compute a high probability that the touch 304B was intended for the UI control 106B and associate the touch 304B with the UI control 106B, since the touch 304B is closer to the UI control 106B than to the UI control 106A. In this way, the touch 304B might select the UI control 106B even though the touch 304B is not within the boundaries of this control.
- FIG. 5 shows a screen display 300C provided by a user input device 112 equipped with a touch screen.
- The screen display 300C in this example includes two UI controls 106D and 106E.
- The UI control 106D is a vertically oriented scroll bar and the UI control 106E is a horizontally oriented scroll bar.
- In the example shown, user input 108 has been received on the touch screen in the form of a horizontal swiping gesture 304C across the touch screen.
- Under point-in-rectangle hit testing, the gesture 304C would not register as a selection of either of the UI controls 106D and 106E, because the gesture 304C is not within the boundaries of either of these controls.
- Utilizing the probabilistic multi-factor model 110, however, the gesture 304C may be registered as a selection of the UI control 106E even though the gesture 304C is not within the boundaries of the UI control 106E.
- In making this determination, the model 110 may take into account the type of user input (e.g. a horizontal swipe), the distance from the gesture 304C to the UI controls 106D and 106E (e.g. about the same distance to each), and the consistency of the gesture 304C with the type of input expected by the UI controls 106D and 106E.
- Based upon these factors, the probabilistic multi-factor model 110 may compute a high probability that the gesture 304C was intended for the UI control 106E and associate the gesture 304C with the UI control 106E, since the gesture 304C is a horizontal motion similar to what would be expected by the horizontally oriented UI control 106E. In this way, the gesture 304C might control the UI control 106E even though the gesture 304C is not within the boundaries of this control.
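The motion-consistency factor illustrated by FIG. 5 might be sketched as follows: a mostly horizontal path scores high against a horizontally oriented scroll bar and low against a vertical one. The scoring formula is an assumed illustration, not taken from the patent:

```python
# Sketch of a motion-consistency factor: the fraction of a path's total
# displacement that lies along a scroll bar's axis of orientation.
import math


def motion_consistency(path, orientation):
    """Score in [0, 1] for how well a path's direction matches a
    scroll bar's orientation ('horizontal' or 'vertical')."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    if length == 0:
        return 0.0  # no motion, no directional evidence
    along = abs(dx) if orientation == "horizontal" else abs(dy)
    return along / length


# A mostly horizontal swipe, like gesture 304C in FIG. 5:
swipe = [(10, 100), (80, 102), (150, 103)]
h = motion_consistency(swipe, "horizontal")  # close to 1.0
v = motion_consistency(swipe, "vertical")    # close to 0.0
```

Feeding these scores into the weighted combination of factors would make the horizontal scroll bar the far more probable target, matching the FIG. 5 outcome.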
- FIG. 6 shows an illustrative computer architecture for a computer 600 capable of executing the software components described herein for providing a multi-factor probabilistic model for evaluating user input.
- The computer architecture shown in FIG. 6 illustrates a conventional desktop computer, laptop computer, or server computer.
- The architecture includes a central processing unit 602 (“CPU”), a system memory 608, including a random access memory 614 (“RAM”) and a read-only memory (“ROM”) 616, and a system bus 604 that couples the memory to the CPU 602.
- The computer 600 further includes a mass storage device 610 for storing an operating system 618, application programs, and other program modules, which are described in greater detail below.
- The mass storage device 610 is connected to the CPU 602 through a mass storage controller (not shown) connected to the bus 604.
- The mass storage device 610 and its associated computer-readable media provide non-volatile storage for the computer 600.
- Computer-readable media can be any available computer storage media that can be accessed by the computer 600.
- Computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
- By way of example, computer-readable media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by the computer 600.
- The computer 600 may operate in a networked environment using logical connections to remote computers through a network such as the network 620.
- The computer 600 may connect to the network 620 through a network interface unit 606 connected to the bus 604. The network interface unit 606 may also be utilized to connect to other types of networks and remote computer systems.
- The computer 600 may also include an input/output controller 612 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in FIG. 6). Similarly, an input/output controller may provide output to a display screen, a printer, or another type of output device (also not shown in FIG. 6).
- A number of program modules and data files may be stored in the mass storage device 610 and RAM 614 of the computer 600, including an operating system 618 suitable for controlling the operation of a networked desktop, laptop, or server computer.
- The mass storage device 610 and RAM 614 may also store one or more program modules.
- In particular, the mass storage device 610 and the RAM 614 may store an application program 102 or operating system 104 that provides the functionality described herein for evaluating user input using a multi-factor probabilistic model 110.
- Software applications or modules may, when loaded into the CPU 602 and executed, transform the CPU 602 and the overall computer 600 from a general-purpose computing system into a special-purpose computing system customized to perform the functionality presented herein.
- The CPU 602 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 602 may operate as one or more finite-state machines in response to executable instructions contained within the software or modules. These computer-executable instructions may transform the CPU 602 by specifying how the CPU 602 transitions between states, thereby physically transforming the transistors or other discrete hardware elements constituting the CPU 602.
- Encoding the software or modules onto a mass storage device may also transform the physical structure of the mass storage device or associated computer-readable storage media.
- The specific transformation of physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable storage media and whether the computer-readable storage media are characterized as primary or secondary storage.
- When the computer-readable storage media are implemented as semiconductor-based memory, the software or modules may transform the physical state of the semiconductor memory when the software is encoded therein.
- For example, the software may transform the states of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
- As another example, the computer-readable storage media may be implemented using magnetic or optical technology.
- In such implementations, the software or modules may transform the physical state of magnetic or optical media when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. They may also include altering the physical features or characteristics of particular locations within given optical media to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
Description
- In order to determine whether a user has provided input intended for a particular control, current computing systems typically compare the two-dimensional coordinates of the user input to the bounding coordinates of all on-screen controls. If the coordinates of the received user input lie within the bounding coordinates of an on-screen control, then the user input is considered to have been intended for that control. This mechanism is commonly referred to as “point-in-rectangle hit testing.”
- It is with respect to these considerations and others that the disclosure made herein is presented.
- A multi-factor probabilistic model for evaluating user input is presented herein. Through the utilization of the multi-factor probabilistic model, several of the limitations of point-in-rectangle hit testing can be minimized or eliminated. For instance, using the multi-factor probabilistic model presented herein, user input can be evaluated in a manner that reduces the precision typically required by point-in-rectangle hit testing. This can be beneficial when used in conjunction with touch screen user input devices and other types of user input devices that do not have the precision of a mouse user input device. This may also be beneficial when used by users that do not possess the fine motor skills necessary to precisely select on-screen user interface controls. Moreover, because the multi-factor probabilistic model presented herein takes multiple factors into account, a user may approach interactions with a computer that utilizes the model in a variety of ways. For instance, the user might interact with user interface elements by precise positioning as done previously, by unique gesture, by hand pose or shape, or in another manner.
- According to one aspect presented herein, a multi-factor probabilistic model is utilized to evaluate user input to determine if the user input is intended for an on-screen user interface control. In particular, when user input is received, the probability that the user input was intended for each on-screen user interface control is computed. The user input is then associated with the user interface control that has the highest computed probability. In one implementation, the highest probability must also exceed a threshold probability in order for the user input to be associated with the user interface control. The probability that the user input was intended for each user interface control may also be computed over time.
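The selection rule described above may be illustrated with a brief sketch. It should be appreciated that the following Python fragment is illustrative only; the function name and the example threshold value of 0.5 are assumptions and do not appear in this disclosure.

```python
def select_control(probabilities, threshold=0.5):
    """Given a mapping from user interface control identifiers to the
    computed probability that the user input was intended for each one,
    return the control with the highest probability, or None when even
    the best candidate falls below the threshold."""
    if not probabilities:
        return None
    best = max(probabilities, key=probabilities.get)
    if probabilities[best] < threshold:
        return None
    return best
```

For time-varying input such as a gesture, such a selection could simply be re-run each time the per-control probabilities are recomputed.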
- According to another aspect, the probability is computed utilizing the multi-factor probabilistic model. The multi-factor probabilistic model computes the probability that user input was intended for each user interface control utilizing a multitude of factors. For instance, the factors utilized by the model might include, but are not limited to, the probability that the user input is near each user interface control, the probability that the motion, or path, of the user input is consistent with motion typically utilized to control the user interface control, the probability that the shape of the user input is consistent with the shape of user input for controlling the user interface control, and the probability that the size of the user input is consistent with the size of user input for controlling the user interface control. Each factor may be assigned a weight.
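The weighted combination of factors described above might be sketched as follows. This fragment is illustrative only and is not a description of any claimed implementation; the choice to combine weighted factors by multiplication mirrors the formula given later in this description, and the example factor values are hypothetical.

```python
def combined_probability(factor_probs, weights):
    """Combine per-factor probabilities for one control into a single
    score by multiplying each factor's probability by its weight.
    factor_probs and weights are parallel sequences, one entry per
    factor (e.g. proximity, motion consistency, shape/size
    consistency)."""
    score = 1.0
    for p, w in zip(factor_probs, weights):
        score *= w * p
    return score

# Example: three factors (proximity, motion, shape/size), equal weights.
score = combined_probability([0.9, 0.8, 0.95], [1.0, 1.0, 1.0])
```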
- According to another aspect, a high probability might be calculated for several user interface controls. In this case, a suitable user interface may be provided through which a user can specify which of the user interface controls the user input was intended for. For instance, the user interface controls having the highest probabilities might be visually emphasized in order to indicate to a user that an ambiguous user input was received. The user might then more particularly select one of the emphasized user interface controls in order to complete the user input operation.
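Detecting that several controls are plausible candidates, so that the user may be asked to disambiguate, might be sketched as follows. The threshold and margin values below are hypothetical assumptions introduced for illustration and are not taken from this disclosure.

```python
def candidate_controls(probabilities, threshold=0.5, margin=0.1):
    """Return the controls whose computed probability is both above the
    threshold and within `margin` of the best score. More than one
    result signals ambiguous input, for which the candidates could be
    visually emphasized so the user can pick the intended control."""
    if not probabilities:
        return []
    best = max(probabilities.values())
    if best < threshold:
        return []
    return [c for c, p in probabilities.items() if best - p <= margin]
```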
- The above-described subject matter may also be implemented as a computer-controlled apparatus, a computer process, a computing system, or as an article of manufacture such as a computer-readable medium. These and various other features will be apparent from a reading of the following Detailed Description and a review of the associated drawings.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this Summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
- FIG. 1 is a block diagram showing aspects of an illustrative operating environment and aspects of a probabilistic multi-factor model for evaluating user input disclosed herein;
- FIG. 2 is a flow diagram showing aspects of the operation of a probabilistic multi-factor model for evaluating user input in one embodiment presented herein;
- FIGS. 3-5 are screen diagrams illustrating the application of the probabilistic multi-factor model provided herein to several sample user inputs; and
- FIG. 6 is a computer architecture diagram showing an illustrative computer hardware and software architecture for a computing system capable of implementing the embodiments presented herein.
- The following detailed description is directed to a multi-factor probabilistic model for evaluating user input. While the subject matter described herein is presented in the general context of program modules that execute in conjunction with the execution of an operating system and application programs on a computer system, those skilled in the art will recognize that other implementations may be performed in combination with other types of program modules. Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the subject matter described herein may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
- In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments or examples. Referring now to the drawings, in which like numerals represent like elements throughout the several figures, aspects of a multi-factor probabilistic model for evaluating user input will be described.
- Turning now to
FIG. 1, details will be provided regarding one embodiment presented herein for evaluating user input using a probabilistic multi-factor model. In particular, FIG. 1 illustrates aspects of an illustrative operating environment 100 along with aspects of a probabilistic multi-factor model 110 for evaluating user input 108. - As shown in
FIG. 1, the operating environment 100 for the embodiments presented herein includes an application program 102 and an operating system 104 capable of generating graphical user interface (“UI”) controls 106. As known in the art, UI controls 106 are elements of a graphical user interface that display information to a user and/or that receive input from the user. For instance, the UI controls 106 might include, but are not limited to, scroll bars, check boxes, radio buttons, buttons, selectable icons, menu items, dialog boxes, sliders, list boxes, drop-down lists, toolbars, ribbon controls, combo boxes, tabs, and windows. It should be appreciated that the term UI control as utilized herein encompasses any type of UI element displayed by a program that receives user input 108. It should also be appreciated that although FIG. 1 illustrates only an operating system 104 and an application program 102 generating the UI controls 106, other types of programs might also generate the UI controls 106. - As also illustrated in
FIG. 1, a user input device 112 receives user input 108 and provides the user input 108 to a computer system executing the operating system 104 and the application program 102. According to one embodiment presented herein, the user input device 112 is a touch screen user input device, such as a capacitive touch screen. It should be appreciated, however, that the user input device 112 may comprise any type of user input device including, but not limited to, a keyboard, mouse, trackball, joystick, light pen, and game controller. The user input device 112 might also comprise a free space motion capture system. For instance, some free space motion capture systems utilize an infrared emitter and sensor to detect motion in free space. The detected motion can then be provided to a computer system as the user input 108. It should be appreciated that, as used herein, the term user input device 112 refers to any type of device through which a user might provide input to a computer system. - It should also be appreciated that the
user input 108 will vary depending upon the type of user input device 112 being utilized. For instance, when the user input device 112 is a touch screen, the user input 108 will comprise a user's interaction with the touch screen. The user input device 112 can generate data that indicates the two-dimensional coordinates of the user's interaction with the touch screen. The data might vary over time, so that user input 108 that has a time component can also be described. For instance, the data might indicate that a gesture was made on the touch screen, such as a swipe across the touch screen with a single or multiple fingers. Other types of user input devices 112 will similarly generate appropriate data that describes the user input 108. - As described briefly above, the
user input device 112 receives the user input 108. According to one embodiment presented herein, the user input 108 is then provided to the probabilistic multi-factor model 110. The probabilistic multi-factor model 110 is a software component configured to evaluate the user input 108 to determine which of one or more UI controls 106 the user input 108 should be provided to. Although illustrated in FIG. 1 as being separate from the operating system 104 and the application program 102, the probabilistic multi-factor model 110 might be integrated with the operating system 104 or the application program 102. The probabilistic multi-factor model 110 might also execute as part of another type of program. - As will be described in greater detail below, when
user input 108 is received, the probabilistic multi-factor model 110 computes the probability that the user input 108 was intended for each on-screen UI control 106. The probabilistic multi-factor model 110 then associates the received user input 108 with the UI control 106 that has the highest computed probability. In one implementation, the highest probability must also exceed a threshold probability in order for the user input 108 to be associated with a UI control 106. The probability that the user input 108 was intended for each UI control 106 might also be computed over time. In this way, user input 108 that varies with time, such as gestures, can be evaluated over the duration of the user input 108 to determine the intended UI control 106. As discussed briefly above, because the multi-factor probabilistic model 110 takes multiple factors into account, as more fully described herein, a user may approach interactions with a computer that utilizes the model in a variety of ways. The user might interact with the user interface controls 106 by precise positioning as done previously, by unique gesture, or by hand pose or shape, for instance. - According to another aspect, the multi-factor
probabilistic model 110 computes the probability that the user input 108 was intended for each UI control 106 utilizing a multitude of factors. For instance, the model 110 might utilize factors including, but not limited to, the probability that the user input 108 is near each UI control 106, the probability that the motion, or path, of the user input 108 is consistent with motion typically utilized to control the UI control 106, the probability that the shape of the user input 108 is consistent with the shape of user input for controlling the UI control 106, and the probability that the size of the user input 108 is consistent with the size of user input for controlling the UI control 106. Each factor might also be weighted. Facilities might also be provided for allowing a user to specify or modify the weight assigned to each factor. - According to another aspect, the probabilistic
multi-factor model 110 provides a user interface for allowing a user to disambiguate the user input 108. For instance, in some scenarios the probabilistic multi-factor model 110 might compute a high probability for several user interface controls 106. In this case, the probabilistic multi-factor model 110 might provide a suitable UI through which a user can specify the UI control 106 that the user input 108 was intended for. For instance, the UI controls 106 having the highest probabilities might be visually emphasized in order to indicate to a user that ambiguous user input 108 was received. The user might then more particularly select one of the emphasized UI controls 106 in order to specify that the selected UI control 106 is the intended control. Additional details regarding the operation of the probabilistic multi-factor model 110 in this regard will be provided below with respect to FIGS. 2-5. - Turning now to
FIG. 2, additional details will be provided regarding the embodiments presented herein for evaluating user input 108 using a probabilistic multi-factor model 110. In particular, FIG. 2 is a flow diagram that includes a routine 200 showing aspects of an illustrative process performed by the probabilistic multi-factor model 110 when evaluating user input 108. - It should be appreciated that the logical operations described herein with respect to
FIG. 2 are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance and other requirements of the computing system. Accordingly, the logical operations described herein are referred to variously as operations, structural devices, acts, or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof. It should also be appreciated that more or fewer operations may be performed than shown in the figures and described herein. These operations may also be performed in a different order than those described herein. - The routine 200 begins at
operation 202, where the probabilistic multi-factor model 110 determines whether user input 108 has been received. If user input 108 has been received, the routine 200 proceeds to operation 204, where a variable (referred to herein as the “current UI control variable”) is initialized to identify one of the on-screen UI controls 106. The current UI control variable identifies the UI control 106 for which the probabilistic multi-factor model 110 is currently computing a probability (the “current UI control”). Once the current UI control variable has been initialized, the routine 200 proceeds from operation 204 to operation 206. - At
operation 206, the probabilistic multi-factor model 110 computes the probability that the user input 108 was intended for the current UI control 106. As described briefly above, the multi-factor probabilistic model 110 computes the probability that the user input 108 was intended for the current UI control 106 utilizing a multitude of factors. According to one embodiment, the probability that the user input 108 was intended for manipulating the current UI control 106 is computed as Pinput(C)=(W1*Pinput(near C))*(W2*Pinput(motion consistent with C))*(W3*Pinput(shape/size consistent with C))* . . . , where C represents the UI control 106 being evaluated, Pinput(near C) is the probability that the user input 108 is near the current UI control 106, Pinput(motion consistent with C) is the probability that the motion, or path, of the user input 108 is consistent with motion typically utilized to control the current UI control 106, and Pinput(shape/size consistent with C) represents the probability that the shape and/or size of the user input 108 is consistent with the shape and/or size of user input for controlling the current UI control 106. W1-W3 represent weights assigned to each of the factors. As discussed briefly above, facilities might also be provided for allowing a user to specify or modify the weight assigned to each factor. - It should be appreciated that the factors identified above are merely illustrative and that the probabilistic
multi-factor model 110 may compute the probability that the user input 108 was intended for the current UI control 106 using other factors. For instance, the factors may take into consideration the motion, gesture, shape, context, and history of the user input 108. The probabilistic multi-factor model 110 might also utilize other factors when determining the probability. - Once the probability that the
user input 108 was intended for the current UI control 106 has been computed at operation 206, the routine 200 proceeds to operation 208. At operation 208, the probabilistic multi-factor model 110 determines whether there are other on-screen UI controls 106 for which the probability should be calculated. If so, the routine 200 proceeds from operation 208 to operation 210 where the current UI control variable is incremented to identify the next UI control 106. The routine 200 then proceeds from operation 210 to operation 206 where the probability that the user input 108 was intended for manipulating the current UI control 106 is computed in the manner described above. - If, at
operation 208, the probabilistic multi-factor model 110 determines that there are no additional UI controls 106 for which the probability should be calculated, the routine 200 proceeds to operation 212. At operation 212, the probabilistic multi-factor model 110 determines whether additional user input 108 has been received. For instance, additional user input 108 may be received where the user input 108 is a gesture that varies over time. In this way, the probabilistic multi-factor model 110 will continually compute the probability that the user input 108 was intended for each of the UI controls 106 over the time that the user input 108 is made. If more user input 108 has been received, the routine 200 returns to operation 204 described above, where the probability that the user input 108 was intended for each of the UI controls 106 is again computed. If no additional user input 108 has been received, the routine 200 proceeds from operation 212 to operation 214. - At
operation 214, the probabilistic multi-factor model 110 determines whether the probability calculations described above resulted in a high probability that a single UI control 106 was the intended recipient of the user input 108. In one implementation, the highest probability must also exceed a threshold probability in order for the user input 108 to be associated with a UI control 106. If one UI control 106 had a high probability, the routine 200 proceeds from operation 214 to operation 216. - At
operation 216, the probabilistic multi-factor model 110 associates the user input 108 with the UI control 106 having the highest computed probability. Associating the user input 108 with the UI control 106 might include passing the user input 108 directly to the UI control 106, notifying the operating system 104 or application 102 that user input 108 was received for the UI control 106, or causing the UI control 106 to be notified of the receipt of the user input 108 in another manner. From operation 216, the routine 200 proceeds to operation 202, described above. - If,
at operation 214, the probabilistic multi-factor model 110 does not conclude that the probability calculations described above resulted in a high probability for a single UI control 106, the routine 200 proceeds from operation 214 to operation 218. At operation 218, the probabilistic multi-factor model 110 determines whether the probability calculations described above resulted in a high probability that multiple UI controls 106 were the intended recipients of the user input 108. If not, the routine 200 proceeds to operation 202 described above. If so, the routine 200 proceeds to operation 220, where the probabilistic multi-factor model 110 causes a user interface to be provided that allows the user to disambiguate the user input 108. In one embodiment, for example, the probabilistic multi-factor model 110 provides a suitable UI through which the user can specify the UI control 106 that the user input 108 was intended for. For instance, the UI controls 106 having the highest probabilities might be visually emphasized in order to indicate to a user that ambiguous user input 108 was received. The user might then more particularly select one of the emphasized UI controls 106 in order to specify that the selected UI control 106 is the intended UI control 106. From operation 220, the routine 200 returns to operation 202, described above. - Referring now to
FIGS. 3-5, several screen diagrams illustrating the application of the probabilistic multi-factor model 110 provided herein to several sample user inputs will be described. In particular, FIG. 3 shows a screen display 300A provided by a user input device 112 equipped with a touch screen. The screen display 300A in this example includes a single UI control 106A, which is a button. In this example, user input 108 has been received on the touch screen in the form of a single touch 304A. Using previous point-in-rectangle hit testing mechanisms, the single touch 304A would not register as a selection of the UI control 106A because the touch 304A is not within the boundaries of the UI control 106A. - Using the probabilistic
multi-factor model 110 provided herein, however, the single touch 304A may be registered as a selection of the UI control 106A even though the touch 304A is not within the boundaries of the UI control 106A. This is because the probabilistic multi-factor model 110 may take into account the type of user input (e.g. a single touch 304A), the distance from the touch 304A to the UI control 106A (e.g. relatively close), the consistency of the touch 304A with the type of input expected by the UI control 106A (e.g. very consistent), the consistency of the shape and size of the touch 304A with the shape and size of input expected by the UI control 106A (e.g. very consistent), other on-screen UI controls (e.g. none), and other factors. In light of these factors, the probabilistic multi-factor model 110 may compute a high probability that the touch 304A was intended for the UI control 106A and associate the touch 304A with the UI control 106A. In this way, the touch 304A might select the UI control 106A even though the touch 304A is not within the boundaries of the actual control.
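The distance factor discussed in connection with FIG. 3 might, for example, be modeled as a probability that decays smoothly with the distance between the touch point and a control. The following sketch is illustrative only; the Gaussian falloff and the 50-pixel scale are assumptions introduced for this example and are not part of this disclosure.

```python
import math

def proximity_probability(touch, control_center, falloff=50.0):
    """Hypothetical proximity factor: probability decays with the
    distance (in pixels) from the touch point to the control's center,
    using a Gaussian falloff. A touch just outside a control's bounds
    can therefore still score highly, as in the FIG. 3 example."""
    dx = touch[0] - control_center[0]
    dy = touch[1] - control_center[1]
    dist = math.hypot(dx, dy)
    return math.exp(-(dist * dist) / (2 * falloff * falloff))
```

Under such a model, a touch that point-in-rectangle hit testing would reject outright still contributes a high proximity probability when it lands near the control.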
FIG. 4 shows a screen display 300B provided by a user input device 112 equipped with a touch screen. The screen display 300B in this example includes two UI controls 106B and 106C. In this example, user input 108 has been received on the touch screen in the form of a single touch 304B. Using previous point-in-rectangle hit testing mechanisms, the single touch 304B would not register as a selection of either of the UI controls 106B and 106C, because the touch 304B is not within the boundaries of either of these controls. - Using the probabilistic
multi-factor model 110 provided herein, however, the single touch 304B may be registered as a selection of the UI control 106B even though the touch 304B is not within the boundaries of the UI control 106B. This is because the probabilistic multi-factor model 110 may take into account the type of user input (e.g. a single touch 304B), the distance from the touch 304B to the UI controls 106B and 106C (e.g. close to the UI control 106B but further from the UI control 106C), the consistency of the touch 304B with the type of input expected by the UI controls 106B and 106C (e.g. very consistent), the consistency of the shape and size of the touch 304B with the shape and size of input expected by the UI controls 106B and 106C (e.g. very consistent), and other factors. In light of these factors, the probabilistic multi-factor model 110 may compute a high probability that the touch 304B was intended for the UI control 106B and associate the touch 304B with the UI control 106B since the touch 304B is closer to the UI control 106B than the UI control 106C. In this way, the touch 304B might select the UI control 106B even though the touch 304B is not within the boundaries of this control.
FIG. 5 shows a screen display 300C provided by a user input device 112 equipped with a touch screen. The screen display 300C in this example includes two UI controls 106D and 106E. The UI control 106D is a vertically oriented scroll bar and the UI control 106E is a horizontally oriented scroll bar. In this example, user input 108 has been received on the touch screen in the form of a horizontal swiping gesture 304C across the touch screen. Using previous point-in-rectangle hit testing mechanisms, the gesture 304C would not register as a selection of either of the UI controls 106D and 106E, because the gesture 304C is not within the boundaries of either of these controls. - Using the probabilistic
multi-factor model 110 provided herein, however, the gesture 304C may be registered as a selection of the UI control 106E even though the gesture 304C is not within the boundaries of the UI control 106E. This is because the probabilistic multi-factor model 110 may take into account the type of user input (e.g. a horizontal swipe), the distance from the gesture 304C to the UI controls 106D and 106E (e.g. about the same distance to each), the consistency of the gesture 304C with the type of input expected by the UI controls 106D and 106E (e.g. consistent with the UI control 106E but not consistent with the UI control 106D), the consistency of the shape and size of the gesture 304C with the shape and size of input expected by the UI controls 106D and 106E (e.g. consistent with the UI control 106E but not consistent with the UI control 106D), and other factors. In light of these factors, the probabilistic multi-factor model 110 may compute a high probability that the gesture 304C was intended for the UI control 106E and associate the gesture 304C with the UI control 106E since the gesture 304C is a horizontal motion similar to what would be expected by the horizontally oriented UI control 106E. In this way, the gesture 304C might control the UI control 106E even though the gesture 304C is not within the boundaries of this control.
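The motion-consistency factor underlying the FIG. 5 scenario might, purely by way of illustration, compare the dominant direction of a swipe with a scroll bar's orientation. The function below is a hypothetical sketch, not a description of any claimed implementation; it measures only the net horizontal versus vertical displacement of the gesture path.

```python
def motion_consistency(path, control_orientation):
    """Hypothetical motion factor: a swipe whose displacement is mostly
    horizontal is consistent with a horizontal scroll bar, and mostly
    vertical with a vertical one. `path` is a list of (x, y) points;
    returns a probability in [0, 1]."""
    dx = abs(path[-1][0] - path[0][0])
    dy = abs(path[-1][1] - path[0][1])
    total = dx + dy
    if total == 0:
        return 0.0  # no motion: not consistent with either scroll bar
    horizontal_share = dx / total
    if control_orientation == 'horizontal':
        return horizontal_share
    return 1.0 - horizontal_share
```

For a largely horizontal swipe such as the gesture 304C, this factor would score the horizontally oriented scroll bar far above the vertically oriented one, breaking the tie that the distance factor alone cannot resolve.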
FIG. 6 shows an illustrative computer architecture for a computer 600 capable of executing the software components described herein for providing a multi-factor probabilistic model for evaluating user input. The computer architecture shown in FIG. 6 illustrates a conventional desktop, laptop, or server computer and may be utilized to execute the software components described herein for providing a multi-factor probabilistic model for evaluating user input. - The computer architecture shown in
FIG. 6 includes a central processing unit 602 (“CPU”), a system memory 608, including a random access memory 614 (“RAM”) and a read-only memory (“ROM”) 616, and a system bus 604 that couples the memory to the CPU 602. A basic input/output system containing the basic routines that help to transfer information between elements within the computer 600, such as during startup, is stored in the ROM 616. The computer 600 further includes a mass storage device 610 for storing an operating system 618, application programs, and other program modules, which will be described in greater detail below. - The
mass storage device 610 is connected to the CPU 602 through a mass storage controller (not shown) connected to the bus 604. The mass storage device 610 and its associated computer-readable media provide non-volatile storage for the computer 600. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available computer storage media that can be accessed by the computer 600. - By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the
computer 600. - According to various embodiments, the
computer 600 may operate in a networked environment using logical connections to remote computers through a network such as the network 620. The computer 600 may connect to the network 620 through a network interface unit 606 connected to the bus 604. It should be appreciated that the network interface unit 606 may also be utilized to connect to other types of networks and remote computer systems. The computer 600 may also include an input/output controller 612 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in FIG. 6). Similarly, an input/output controller may provide output to a display screen, a printer, or other type of output device (also not shown in FIG. 6). - As mentioned briefly above, a number of program modules and data files may be stored in the
mass storage device 610 and RAM 614 of the computer 600, including an operating system 618 suitable for controlling the operation of a networked desktop, laptop, or server computer. The mass storage device 610 and RAM 614 may also store one or more program modules. In particular, the mass storage device 610 and the RAM 614 may store an application program 102 or operating system 104 that provide the functionality described herein for evaluating user input using a multi-factor probabilistic model 110. - In general, software applications or modules may, when loaded into the
CPU 602 and executed, transform the CPU 602 and the overall computer 600 from a general-purpose computing system into a special-purpose computing system customized to perform the functionality presented herein. The CPU 602 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 602 may operate as one or more finite-state machines, in response to executable instructions contained within the software or modules. These computer-executable instructions may transform the CPU 602 by specifying how the CPU 602 transitions between states, thereby physically transforming the transistors or other discrete hardware elements constituting the CPU 602. - Encoding the software or modules onto a mass storage device may also transform the physical structure of the mass storage device or associated computer readable storage media. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to: the technology used to implement the computer readable storage media, whether the computer readable storage media are characterized as primary or secondary storage, and the like. For example, if the computer readable storage media is implemented as semiconductor-based memory, the software or modules may transform the physical state of the semiconductor memory, when the software is encoded therein. For example, the software may transform the states of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
- As another example, the computer readable storage media may be implemented using magnetic or optical technology. In such implementations, the software or modules may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations may also include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
- Based on the foregoing, it should be appreciated that a multi-factor probabilistic model for evaluating user input has been presented herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological acts, and computer readable media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts, and media are disclosed as example forms of implementing the claims.
- The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.
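The closing summary above names a multi-factor probabilistic model for evaluating user input without restating its mechanics. As an illustrative sketch only (the factor set, the Gaussian proximity likelihood, the usage prior, and all names below are assumptions for illustration, not taken from the specification or claims), such a model might combine independent per-factor probabilities for each user interface control and normalize across the candidates:

```python
import math

def evaluate_input(touch, controls, sigma=20.0):
    """Score each user interface control against a touch point.

    Hypothetical factors combined here:
      - proximity: Gaussian likelihood of the touch given the control center
      - prior: a per-control usage-frequency prior
    Each control's score is the product of its factors, normalized so
    the scores across all controls sum to 1.
    """
    scores = {}
    for name, (cx, cy, prior) in controls.items():
        d2 = (touch[0] - cx) ** 2 + (touch[1] - cy) ** 2
        proximity = math.exp(-d2 / (2 * sigma ** 2))
        scores[name] = proximity * prior
    total = sum(scores.values()) or 1.0
    return {name: s / total for name, s in scores.items()}

# Hypothetical controls: (center_x, center_y, usage prior)
controls = {
    "ok":     (100, 100, 0.7),
    "cancel": (160, 100, 0.3),
}
probs = evaluate_input((110, 102), controls)
best = max(probs, key=probs.get)  # control most likely targeted
```

A touch near the "ok" control's center yields the highest normalized probability for that control, so the input would be routed to it; additional factors (contact size, input history, and so on) could be multiplied in the same way.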
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/732,190 US11429272B2 (en) | 2010-03-26 | 2010-03-26 | Multi-factor probabilistic model for evaluating user input |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110238612A1 (en) | 2011-09-29 |
US11429272B2 US11429272B2 (en) | 2022-08-30 |
Family
ID=44657500
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/732,190 Active 2035-01-12 US11429272B2 (en) | 2010-03-26 | 2010-03-26 | Multi-factor probabilistic model for evaluating user input |
Country Status (1)
Country | Link |
---|---|
US (1) | US11429272B2 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
HU0100663D0 (en) | 2001-02-09 | 2001-04-28 | Pali Jenoe | Apparatus for detecting local surface drift |
ATE489672T1 (en) | 2006-09-13 | 2010-12-15 | Koninkl Philips Electronics Nv | DETERMINING THE ALIGNMENT OF AN OBJECT |
US9477342B2 (en) * | 2008-08-26 | 2016-10-25 | Google Technology Holdings LLC | Multi-touch force sensing touch-screen devices and methods |
- 2010-03-26 US US12/732,190 patent/US11429272B2/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4332464A (en) * | 1980-09-22 | 1982-06-01 | Xerox Corporation | Interactive user-machine interface method and apparatus for copier/duplicator |
US5231510A (en) * | 1991-04-22 | 1993-07-27 | Worthington Cristian A | Information retrieval system utilizing facsimile communication and paper forms with preset format |
US6323846B1 (en) * | 1998-01-26 | 2001-11-27 | University Of Delaware | Method and apparatus for integrating manual input |
US20030193572A1 (en) * | 2002-02-07 | 2003-10-16 | Andrew Wilson | System and process for selecting objects in a ubiquitous computing environment |
US20030156145A1 (en) * | 2002-02-08 | 2003-08-21 | Microsoft Corporation | Ink gestures |
US20030193481A1 (en) * | 2002-04-12 | 2003-10-16 | Alexander Sokolsky | Touch-sensitive input overlay for graphical user interface |
US6954197B2 (en) * | 2002-11-15 | 2005-10-11 | Smart Technologies Inc. | Size/scale and orientation determination of a pointer in a camera-based touch system |
US7453439B1 (en) * | 2003-01-16 | 2008-11-18 | Forward Input Inc. | System and method for continuous stroke word-based text input |
US20050146508A1 (en) * | 2004-01-06 | 2005-07-07 | International Business Machines Corporation | System and method for improved user input on personal computing devices |
US20060036944A1 (en) * | 2004-08-10 | 2006-02-16 | Microsoft Corporation | Surface UI for gesture-based interaction |
US20090135162A1 (en) * | 2005-03-10 | 2009-05-28 | Koninklijke Philips Electronics, N.V. | System and Method For Detecting the Location, Size and Shape of Multiple Objects That Interact With a Touch Screen Display |
US20090006958A1 (en) * | 2007-06-29 | 2009-01-01 | Nokia Corporation | Method, Apparatus and Computer Program Product for Providing an Object Selection Mechanism for Display Devices |
US20090100384A1 (en) * | 2007-10-10 | 2009-04-16 | Apple Inc. | Variable device graphical user interface |
Non-Patent Citations (1)
Title |
---|
Rubine, Dean Harris. "The Automatic Recognition of Gestures," December 1991. [ONLINE] Downloaded 5/1/2018 http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.116.1350&rep=rep1&type=pdf * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160018913A1 (en) * | 2013-03-06 | 2016-01-21 | Nokia Technologies Oy | Apparatus and associated methods |
US10222881B2 (en) * | 2013-03-06 | 2019-03-05 | Nokia Technologies Oy | Apparatus and associated methods |
CN105210023A (en) * | 2013-03-06 | 2015-12-30 | 诺基亚技术有限公司 | Apparatus and associated methods |
EP2827257A1 (en) * | 2013-07-15 | 2015-01-21 | BlackBerry Limited | Methods and devices for providing a text prediction |
US20150019539A1 (en) * | 2013-07-15 | 2015-01-15 | Blackberry Limited | Methods and devices for providing a text prediction |
CN106066756A (en) * | 2013-11-27 | 2016-11-02 | 青岛海信电器股份有限公司 | The interface creating method of terminal and device |
US9405399B2 (en) | 2014-06-04 | 2016-08-02 | International Business Machines Corporation | Touch prediction for visual displays |
US9406025B2 (en) * | 2014-06-04 | 2016-08-02 | International Business Machines Corporation | Touch prediction for visual displays |
US20160306493A1 (en) * | 2014-06-04 | 2016-10-20 | International Business Machines Corporation | Touch prediction for visual displays |
US10067596B2 (en) | 2014-06-04 | 2018-09-04 | International Business Machines Corporation | Touch prediction for visual displays |
US10203796B2 (en) * | 2014-06-04 | 2019-02-12 | International Business Machines Corporation | Touch prediction for visual displays |
US10162456B2 (en) * | 2014-06-04 | 2018-12-25 | International Business Machines Corporation | Touch prediction for visual displays |
US20150355772A1 (en) * | 2014-06-04 | 2015-12-10 | International Business Machines Corporation | Touch prediction for visual displays |
US10025427B2 (en) | 2014-06-27 | 2018-07-17 | Microsoft Technology Licensing, Llc | Probabilistic touch sensing |
CN106662974A (en) * | 2014-06-27 | 2017-05-10 | 微软技术许可有限责任公司 | Probabilistic touch sensing |
WO2015200412A1 (en) * | 2014-06-27 | 2015-12-30 | Microsoft Technology Licensing, Llc | Probabilistic touch sensing |
RU2683171C2 (en) * | 2014-06-27 | 2019-03-26 | МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи | Probabilistic touch detection |
USD844018S1 (en) * | 2014-09-09 | 2019-03-26 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US10579781B2 (en) | 2015-01-19 | 2020-03-03 | Nec Corporation | Authentication apparatus, method, system and program, and server apparatus |
JPWO2016117500A1 (en) * | 2015-01-19 | 2017-11-24 | 日本電気株式会社 | Authentication apparatus, method, system and program, and server apparatus |
US11030286B2 (en) | 2015-01-19 | 2021-06-08 | Nec Corporation | Authentication apparatus, method, system and program, and server apparatus |
USD851121S1 (en) | 2016-10-26 | 2019-06-11 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD888086S1 (en) | 2016-10-26 | 2020-06-23 | Apple Inc. | Display screen or portion thereof with graphical user interface |
USD910690S1 (en) | 2016-10-26 | 2021-02-16 | Apple Inc. | Display screen or portion thereof with graphical user interface |
WO2018205745A1 (en) * | 2017-05-12 | 2018-11-15 | 北京点石经纬科技有限公司 | Input interface display system and method for portable device |
US20200210518A1 (en) * | 2018-12-26 | 2020-07-02 | Software Ag | Systems and/or methods for dynamic layout design |
US10984170B2 (en) * | 2018-12-26 | 2021-04-20 | Software Ag | Systems and/or methods for dynamic layout design |
Also Published As
Publication number | Publication date |
---|---|
US11429272B2 (en) | 2022-08-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11429272B2 (en) | Multi-factor probabilistic model for evaluating user input | |
US9927969B2 (en) | Rendering object icons associated with an object icon | |
US11200072B2 (en) | User interface adaptations based on inferred content occlusion and user intent | |
US10168855B2 (en) | Automatic detection of user preferences for alternate user interface model | |
US20110199386A1 (en) | Overlay feature to provide user assistance in a multi-touch interactive display environment | |
US10108330B2 (en) | Automatic highlighting of formula parameters for limited display devices | |
US20150331590A1 (en) | User interface application launcher and method thereof | |
JP6557213B2 (en) | Page back | |
KR20140042750A (en) | Touch-enabled complex data entry | |
US9898543B2 (en) | Browser interaction for lazy loading operations | |
US20160070467A1 (en) | Electronic device and method for displaying virtual keyboard | |
WO2018098960A1 (en) | Method for operating touchscreen device, and touchscreen device | |
US10366518B2 (en) | Extension of text on a path | |
US9904402B2 (en) | Mobile terminal and method for input control | |
US20160378327A1 (en) | Path gestures | |
US20200026400A1 (en) | Adjusting user interface for touchscreen and mouse/keyboard environments | |
EP3210101B1 (en) | Hit-test to determine enablement of direct manipulations in response to user actions | |
US9635170B2 (en) | Apparatus and method for controlling terminal to expand available display region to a virtual display space | |
US20130155072A1 (en) | Electronic device and method for managing files using the electronic device | |
KR101366170B1 (en) | User Interface for controlling state of menu | |
US20190187868A1 (en) | Operation of a data processing system during graphical user interface transitions | |
US20160224202A1 (en) | System, method and user interface for gesture-based scheduling of computer tasks | |
US9639257B2 (en) | System and method for selecting interface elements within a scrolling frame | |
KR102205235B1 (en) | Control method of favorites mode and device including touch screen performing the same | |
US20140304652A1 (en) | Electronic device and method for unlocking the electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WILSON, ANDREW DAVID;REEL/FRAME:024141/0626 Effective date: 20100323 |
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001 Effective date: 20141014 |
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
STCV | Information on status: appeal procedure |
Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
STCV | Information on status: appeal procedure |
Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
STCV | Information on status: appeal procedure |
Free format text: APPEAL READY FOR REVIEW |
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |