WO2007082290A2 - User interface for a touch-screen based computing device and method therefor - Google Patents

User interface for a touch-screen based computing device and method therefor

Info

Publication number
WO2007082290A2
Authority
WO
WIPO (PCT)
Prior art keywords
screen
icons
impact zone
touch
determining
Prior art date
Application number
PCT/US2007/060435
Other languages
French (fr)
Other versions
WO2007082290A3 (en)
Inventor
Yaroslav Novak
Original Assignee
Motorola, Inc.
Priority date
Filing date
Publication date
Application filed by Motorola, Inc. filed Critical Motorola, Inc.
Publication of WO2007082290A2 publication Critical patent/WO2007082290A2/en
Publication of WO2007082290A3 publication Critical patent/WO2007082290A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G06F 9/453 - Help systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

A method for determining a command given via a touch-screen input device of a computing device comprising detecting (300) an activation of an on-screen icon; and determining (310) an impact zone associated with said activation. The method further comprises the steps of determining (320) whether a single on-screen icon or multiple on-screen icons (112, 114, 116, 118) is/are associated with said impact zone (110); and displaying, in response to determining multiple on-screen icons (112, 114, 116, 118), a helper-screen (230) comprising multiple icons (220) associated with said multiple on-screen icons (112, 114, 116, 118) of said impact zone.

Description

USER INTERFACE FOR A TOUCH-SCREEN BASED COMPUTING DEVICE
AND METHOD THEREFOR
Field of the Invention
The present invention relates to a method for improving the usability of a user interface (UI) of a touch-screen based computing device. The invention is applicable to, but not limited to, user interfaces for mobile computing devices utilising a touch-screen.
Background of the Invention
Touch-screen (sometimes referred to as touch-panel) displays are known for use in electronic devices for various applications. Such displays show an image of a number of character buttons or functional buttons. If a user touches the panel where one of the buttons is shown, an internal sensor detects that the user has selected that particular button and indicates the selection to an internal electronic controller that executes a corresponding function.
Touch-screen displays have been used mainly in applications such as automatic teller machines, in which the users are the general public who may not be used to computer operations. However, applications are emerging in which touch panel displays are used to provide operations and functions equivalent to those obtained with a personal computer keyboard or mouse pointer. For example, touch screen displays are known which display icons, option keys or the like that are used, for example, to indicate software programs or other functional applications which may be selected to run on a digital processor of the device.
Touch-screen displays are predominantly used in computing devices or microprocessor control devices comprising a user interface. Smaller versions of these computing devices have found particular use in applications where the use of a full- sized keypad or keyboard may be impractical or impossible. Touch-screen based computing devices have also been used to improve the User Interface (UI) of devices in which the keypad is small and/or has a limited functionality. Touch-screen based computing devices have also found great use in applications that require user input from a menu-based format, as the user is able to quickly select menu options from a range of displayed icons.
The screens of mobile/hand-portable computing devices are typically limited in size by the nature of the device, e.g. a Personal Digital Assistant (PDA). Alternatively, the screen size may be relatively large, as is found in some laptop personal computer (PC) or PC tablet devices. In each case, the touch-screen is used to display a number of icons, control elements or text fields that are associated with some device functionality, such as a text editor or a calendar application, for example. Hereafter, the term 'icon' will be used to encompass icons, control elements, text fields, etc. of the image displayed to represent a particular functionality of the touch-screen based device.
An icon is thus a "shortcut" to an application or function or further screen displaying further options. For example, a user may activate the icon by touching an area of the screen displaying the icon, which causes the associated application to execute. Icons may also be associated with some functionality of the computing device itself.
It is also known that text fields may also be associated with a specific application, such as a text editor or browser. Alternatively, the text fields may activate a specific functionality of an application that is already executing on the device. In this latter case, the text field could be an item in a drop down menu, for example.
One problem with known touch-screen based computing devices is that the icons, control elements or text fields tend to be 'small' with respect to the screen size and, thus, difficult to activate by a user using a finger. In this case, it is quite common that an attempt by the user to activate a specific function, or to select a specific menu point, results in the wrong function being activated. The user must then de-activate or disable this function before re-attempting to activate the desired operation. Such a process is always time consuming and irritating to the user, and sometimes extremely so if the function activated by accident happens to be one that deletes a document or closes a running application unexpectedly.
Furthermore, the touch screen display may be provided on a handset, e.g. for wireless communications. This can cause difficulties in selecting buttons or icons on the touch panel display since generally the buttons or icons have to be smaller and closer together. Thus, in such applications, a stylus or like device is often employed to assist the user to navigate through the program applications or options available, and to enter data and provide control instructions via the touch panel. However, working with a stylus is not ideal. The stylus first has to be retrieved before operation can begin, and it may easily be mislaid or lost. Operation of the stylus on the touch screen of the touch panel display has to be gentle to avoid damage to the touch screen.
Desirably, use of a stylus is avoided and displayed items for selection on the touch panel display, e.g. buttons or icons, are selected by a user's finger in at least some applications of the device. However, since displayed items for selection may need to be close together, selection of one item may overlap with selection of another, unwanted item. This problem may be noticed particularly if the user is wearing gloves, e.g. because the user is performing a specialist job such as in police work or in fire-fighting. This problem of overlap in the selected items has been recognised in the prior art.
The solution which has been proposed is based on a predictive algorithm. The algorithm detects what selections have already been made by the user and predicts from such selections what is the selection currently likely to be intended by the user. Unfortunately, such an algorithm is of little use if no prior selections have already been made by the user.
Thus, a need exists for a UI for a touch-screen based device that enhances usability of such a device by reducing the risk of false activation of functions.
Summary of the Invention
In accordance with a first aspect of the present invention, there is provided a method, as claimed in Claim 1.
In accordance with a second aspect of the present invention, there is provided a touchscreen based computing device, as claimed in Claim 10.
Further aspects and advantageous features of the present invention are as described in the appended Claims.
In summary, a method for determining a command provided via activation of an on-screen icon of a touch-screen based computing device is described. The method comprises detecting activation by a user of the touch-screen, determining a position and/or dimension of an impact zone, and determining whether multiple on-screen icons are associated with the impact zone. If only one on-screen icon is associated with the area, that particular icon is activated. However, if multiple on-screen icons are associated with the impact zone, the process displays a helper-screen comprising the multiple icons, and a step of determining a position and/or dimension of a subsequent impact zone relating to the icons displayed on the helper screen is performed.
In one embodiment, the process of displaying helper-screens is subsequently repeated until a single desired icon is determined as being activated, or the process cancelled.
In this manner, the performance and usability of a user interface (UI) is substantially and advantageously enhanced. In particular, the accuracy with which on-screen icons can be selected, and thus with which applications can be started or commands executed, is greatly improved.
In a further advantageous step, a dimension of each icon displayed in the helper-screen is scaled, to improve the ability of a microprocessor within the computing device to determine a single icon that the user desires to activate. For example, the multiple icons may be scaled to one or more similar dimension(s) of the initial impact zone. In this way, the size of the icons being displayed on the helper screen can be scaled such that the icons are substantially the same size as the tool being used to activate them on the screen, e.g. a finger end or a pen. Thus, the user may readily select a desired icon from the helper screen, with a reduced risk of selecting multiple icons, as the icon size is similar to the impact caused by the activating mechanism.
In one embodiment, the scaling of one or more dimension(s) of each of the multiple icons displayed in the helper-screen is performed such that one or more dimension(s) of the multiple icons is/are larger than one or more of the dimension(s) of the impact zone. Thus, in this manner, the multiple icons are larger than the tool being used to activate the screen, such as a finger end or pen. Hence, the user has little difficulty selecting the desired icon, even when, say, using a gloved finger. This beneficial feature may also be automatically activated when the impact zone increases above a certain size, implying that a large object is being used to activate applications via the touch-screen.
In a yet further beneficial step, a degree of overlap between an impact zone and areas associated with each of the on-screen icons is determined. Any icon for which the determined degree of overlap is less than a threshold level is ignored. This 'filter' mechanism advantageously reduces the number of icons displayed on the helper screen, by determining and eliminating applications that it identifies were most probably activated in error by the user.
In accordance with a second aspect of the present invention, there is provided a touchscreen based computing device comprising a user interface having a touch-screen input device and a microprocessor, operably coupled to the touch screen input device, arranged to detect an activation of an on-screen icon; and determine an impact zone associated with the activation. The microprocessor determines whether a single on-screen icon or multiple on-screen icons is/are associated with the impact zone; and displays on the touch-screen, in response to determining multiple on-screen icons, a helper-screen comprising multiple icons associated with the multiple on-screen icons of the impact zone.
In a further advantageous embodiment, the helper-screen occupies only a fraction of an entire touch-screen area, thus allowing other information relevant to the user to continue to be displayed on the main screen. In this manner, the user is able to select the required application from the helper-screen, whilst simultaneously monitoring the main screen.
The computing device advantageously supports at least two modes of operation, a regular mode in which the command determination feature is disabled, and an overlap mode in which the feature is enabled. Thus, the user is able to enable the feature when the screen activation device is known to be large and inaccurate, such as a gloved finger, but can disable the feature when it is not needed, i.e. when the activation mechanism is expected to allow more accurate positioning on the screen.
In this manner, the aforementioned problems associated with the activation of functions via a touch screen interface have been alleviated.
This has been achieved, in one aspect, by identifying whether an impact on a touch screen selects multiple icons. If it is determined that multiple icons are selected, the method comprises determining those icons that are selected and subsequently displaying these icons to the user in a further helper screen. The user is then able to start the application or command via pressing the relevant icon on the helper-screen.
The teachings of the current invention are applicable to any type of computing device incorporating a touch screen.
Brief Description of the Drawings
Exemplary embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
FIG. 1 illustrates a computing device adapted in accordance with one embodiment of the present invention showing a number of icons and impact zones;
FIG. 2 illustrates a computing device adapted in accordance with one embodiment of the present invention, showing the helper-screen and four icons corresponding to elements selected by one of the impact zones of FIG. 1; and
FIG. 3 illustrates a flow chart describing a method in accordance with one embodiment of the present invention.
Description of Embodiments of the Invention
The present invention will be described in terms of a touch screen computer. However, it will be appreciated that the inventive concept may be embodied in any touch screen device such as a personal digital assistant (PDA), MP-3 player or mobile telephone. Thus, hereafter, any reference to computing device is meant to encompass any user-interface equipment that is capable of using a touch-screen as a means of user interaction. Furthermore, within the context of the present invention, the selection of an 'icon' is meant to encompass any selection where a choice exists on a user interface, irrespective of whether a 'button' or 'menu' presentation mechanism is used, e.g. the selection of an active document may encompass a choice between two overlapping displayed documents.
FIG. 1 shows a computing device 100 with touch screen display 160, and icons 122, 124, 126, etc. The icons are graphical icons representing software applications or functions stored in memory on the computing device 100. Alternatively, it is envisaged that the graphical icons may represent software applications or functions accessible by the computing device 100 via a wireless network, for example. A microprocessor 170 on the touch-screen based computing device 100 interprets the inputs entered by the touch screen and determines how to proceed, e.g. activating a function, operation or element or generating and displaying a helper screen.
In one embodiment, the touch screen and the microprocessor 170 may, for example, be operably coupled to a sensor (not shown), such that the sensor senses a local change or variation in an electrical property caused by touch by a user's finger. For example, the property changed may be electrical capacitance, conductivity or resistivity. Alternatively, or in addition, the touch screen may for example be a screen which senses a local change or variation in an electrical property, such as conductivity or resistivity, due to a pressure change when a selected area corresponding to a displayed icon is touched by a user, to indicate selection of the corresponding function. Any other known form of touch screen may alternatively be used.
Thus, the applications or functions are executed by a user physically touching an area of the screen or display occupied by a respective icon. The display hardware is configured to detect any impact (touch), determine a position of the touch (impact zone) on the screen, and make this information available to applications or firmware running on, say, the microprocessor 170 of the computing device 100. The icons themselves are generated, or read out of memory, by the microprocessor 170 of the computing device 100 as required, and displayed on the touch screen 160. The whole screen and its contents define the user interface (UI) of the computing device 100. Touch-sensitive elements and circuitry are known in the art. A skilled artisan appreciates how such elements and circuitry can be applied in implementing the inventive concept herein described and therefore the structure of touch-sensitive elements and circuitry will not be described further here.
The circular areas 110, 130, 150 shown in FIG. 1 represent somewhat idealised impact zones on the touch-screen 160 of the computing device 100. Each of these impact zones corresponds to an example of a physical touch on the screen by an object such as a user's finger end. In reality these areas will have a more irregular form and are shown as circular for clarity purposes only.
To consider the embodiments of the present invention, let us first consider the effects of different impact zones. The impact zone represents an attempt by the user of the computing device 100 to execute a command, or start an application, or otherwise access a functionality of the computing device 100 by means of an icon on the screen. As shown, the mechanism used to touch the screen may well have been larger than the icon of the required application or function.
Thus, in a first example, a user's impact zone 150 has wholly overlapped icon 154, substantially overlapped icon 152 and partially overlapped icon 156. In this case, it is possible that an incorrect application may be started unless the required application can be identified.
In a second example, the user's impact zone 110 has substantially overlapped icons 112, 114 and partially overlapped icons 116, 118. Again, in this case, it is possible that an incorrect application may be started unless the required application can be identified.
In a third example, the user's impact zone 130 has wholly overlapped a single icon 132. In this case, it is likely that the correct application corresponding to icon 132 will be started.
According to the teachings of the current invention, firmware, say executing on a microprocessor 170 of the computing device 100, receives data identifying an impact zone (i.e. a user activation area on the touch screen). The indication may be in the form of one or more dimension(s) of the impact zone, for example a list of pixels on the touch screen that have been activated, or length and width data of the impact zone, or a central location of the impact zone together with a radius of the impact area (for a circular form). The microprocessor 170 then compares a position of the impact zone, say impact zone 110, with the known positions of the icons 112, 114, 116 and 118. For example, in the context of the present invention, a location area of the impact zone may be identified and compared with the known location areas of the icons in the vicinity of the impact zone. The microprocessor 170 then ascertains whether more than one icon has been selected by the impact.
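By way of illustration only, this comparison can be thought of as a simple geometric hit-test. The following Python sketch is not taken from the patent; the Icon and ImpactZone classes, their fields and the circular-zone shape are illustrative assumptions that mirror the idealised circular impact zones of FIG. 1.

```python
# Illustrative sketch only, not the patent's implementation. An idealised
# circular impact zone (cf. FIG. 1) is tested against rectangular icon areas.
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float        # left edge of the icon's screen area, in pixels
    y: float        # top edge, in pixels
    width: float
    height: float

@dataclass
class ImpactZone:
    cx: float       # centre of the (idealised, circular) touch
    cy: float
    radius: float

def icons_hit(zone: ImpactZone, icons: list[Icon]) -> list[Icon]:
    """Return every icon whose rectangle intersects the circular impact zone."""
    hits = []
    for icon in icons:
        # Closest point of the rectangle to the circle centre.
        nearest_x = max(icon.x, min(zone.cx, icon.x + icon.width))
        nearest_y = max(icon.y, min(zone.cy, icon.y + icon.height))
        if (nearest_x - zone.cx) ** 2 + (nearest_y - zone.cy) ** 2 <= zone.radius ** 2:
            hits.append(icon)
    return hits
```

If such a hit-test returns more than one icon, the helper-screen path described below would be taken.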
If only one icon is selected by an impact, e.g. in the third example where the impact zone 130 only overlaps with a single icon 132, then the function or command associated with this icon 132 can be executed. The operation of the computing device 100 then proceeds as normal.
If, however, multiple icons 112, 114, 116, 118 have been selected, i.e. multiple icons overlap with the impact zone, then these icons 112, 114, 116, 118 are identified as potentially desired applications or functions. In response, a helper-screen is generated on screen 160 wherein only these icons 112, 114, 116, 118 are displayed.
It is envisaged that other icons may also be displayed, such as soft function keys, allowing the user to cancel a command or access some other function of the computing device 100, according to the applications or functions being supported.
Referring now to FIG. 2, the computing device 100 is illustrated showing one example of the activation of the helper-screen. Here, FIG. 2 corresponds to the second example of FIG. 1, whereby four icons 112, 114, 116, 118 corresponding to impact zone 110 were selected. The helper-screen 230 is generated by the microprocessor 170 of the computing device 100 and is displayed on either a subsection of the touch screen 160 or the whole screen.
In one embodiment of the present invention, the area of the screen used to display the helper-screen 230 is user-selectable/user-definable. For example, a user may be able to define the area to be made proportional to the number of overlapping elements. In another embodiment, the respective size of the icons displayed on the helper screen may be configured as dependent upon the level of overlap identified when comparing the impact zone with the icon area. Alternatively, as would be appreciated by a skilled artisan when taught the inventive concept herein described, some other criterion could be used.
In one embodiment, the size of the icons to be displayed on the helper screen 230 may be user-selectable/user-definable. In this manner, the user is able to better manage his or her UI. For example, a user whose hands tremble may decide to utilise a large area or the whole area of the screen 160 to display the helper screen 230, to more readily select the desired icon. In contrast, a user with better hand control may desire to limit the display area of the helper screen 230 so that (s)he can view other functions/applications that are running on other sections of the touch screen 160.
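As a sketch of how such user-definable options might be grouped, the following settings object is purely illustrative; none of the field names or default values appear in the patent.

```python
# Hypothetical settings object for the user-definable behaviour described
# above. Field names and defaults are assumptions, not from the patent.
from dataclasses import dataclass

@dataclass
class HelperScreenConfig:
    full_screen: bool = False        # use the whole display for the helper screen
    area_fraction: float = 0.5       # fraction of the display used otherwise
    icon_scale: float = 1.5          # icon size relative to the impact-zone diameter
    overlap_threshold: float = 0.25  # minimum overlap fraction before an icon is shown
```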
In FIG. 2 a helper screen 230 is shown that covers the whole of the screen area of the computing device 100. This helper screen 230 displays four icons, corresponding to the icons 112, 114, 116, 118 that were determined as overlapping with the original impact zone 110.
Advantageously, the user can now confirm the selection of the required icon, say icon 114, by touching the helper-screen display in the area of that element. Once the required icon 114 has been selected, the function associated with that icon 114 is executed or activated. In one embodiment, the helper screen 230 is then configured to disappear and the device then returns to its normal operation. For example, depending upon the icon selected, the computing device 100 may display information associated with the function activated by the icon, if any.
As shown in FIG. 1, the impact zone overlaps with a number of icons 112, 114, 116 and 118, with each overlapping to varying degrees. In one embodiment, and before generating a helper-screen, the microprocessor 170, say under control of firmware of the computing device 100, calculates a degree of overlap between the impact zone 110 and the individual icons 112, 114, 116 and 118. The microprocessor 170 may do this by interfacing directly with the hardware of the touch screen. In one embodiment, the interfacing is performed via a device driver. In another embodiment, the microprocessor 170 reads data from a memory element within the computing device 100, where the data corresponds to the impact zone and/or the icon positions and sizes (e.g. area of screen occupied).
In the example shown in FIG. 1, it is clear that icon 114 has a far greater degree of overlap with the impact zone 110 than do icons 112, 116 and 118. In this case, the microprocessor 170 determines a degree of overlap of each icon 112, 114, 116 and 118 with the impact zone 110, and then compares each degree of overlap with a threshold level. The microprocessor 170 is then able to determine those icons to display in the helper-screen 230 where the degree of overlap exceeded the threshold level.
In one embodiment, the threshold level is set such that icon 118 is deemed to have been activated by accident, i.e. only a small fraction of the total area occupied by icon 118 lies within the impact zone 110.
Thus, in response to the comparison, the microprocessor 170 determines that icon 118 is not to be displayed on the helper screen 230, further simplifying the selection process by allowing more space to display the remaining three icons 112, 114 and 116.
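The patent does not specify how the degree of overlap is computed. The sketch below, reusing the hypothetical Icon and ImpactZone classes from the earlier sketch, simply estimates the fraction of each icon's area that falls inside the circular impact zone by sampling a grid of points; the metric and the threshold value are both assumptions.

```python
# Illustrative overlap 'filter': estimate, by grid sampling, the fraction of
# each icon's area covered by the impact zone, and drop icons below a
# threshold (e.g. icon 118 above). The metric and threshold are assumptions.
def overlap_fraction(zone: ImpactZone, icon: Icon, samples: int = 20) -> float:
    """Approximate fraction of the icon's area lying inside the impact zone."""
    inside = 0
    for i in range(samples):
        for j in range(samples):
            px = icon.x + (i + 0.5) * icon.width / samples
            py = icon.y + (j + 0.5) * icon.height / samples
            if (px - zone.cx) ** 2 + (py - zone.cy) ** 2 <= zone.radius ** 2:
                inside += 1
    return inside / (samples * samples)

def filter_accidental(zone: ImpactZone, icons: list[Icon],
                      threshold: float = 0.25) -> list[Icon]:
    """Keep only icons whose overlap with the impact zone exceeds the threshold."""
    return [icon for icon in icons if overlap_fraction(zone, icon) > threshold]
```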
In one embodiment, the sizes of the icons 112, 114 and 116 to be displayed on the helper screen 230 are also scaled so as to be proportional to the original impact zone 110. In one embodiment, the user of the computing device 100 may set the scaling factor. In another embodiment, the scaling factor may be generated automatically by the computing device itself, for example according to a pre-defined function or rule. The icons displayed in the helper screen 230 can thus be made significantly larger than the original icons 112, 114 and 116, thereby allowing a single icon, i.e. desired icon 114, to be more readily selected.
In one embodiment, the icons 112, 114 and 116 may also be scaled such that they occupy a similar area of screen as did the initial impact zone 110. Alternatively, the icons 112, 114 and 116 may be scaled such that they occupy a larger area than did the initial impact zone 110. Thus, if a gloved finger is being used to activate the screen, then the icons of the helper screen 230 may be scaled such that they are as large as, or larger than, the initial impact zone 110 made by the gloved finger.
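A minimal sketch of this scaling, again reusing the illustrative classes above: each icon's smaller dimension is scaled to match (factor 1.0) or exceed (factor greater than 1.0) the impact-zone diameter, where the factor stands in for the user-set or automatically generated scaling factor mentioned above.

```python
# Illustrative scaling of helper-screen icons relative to the impact zone.
def scale_for_helper(icons: list[Icon], zone: ImpactZone,
                     factor: float = 1.0) -> list[Icon]:
    """Scale each icon so its smaller dimension matches (factor == 1.0) or
    exceeds (factor > 1.0) the impact-zone diameter."""
    target = 2 * zone.radius * factor
    scaled = []
    for icon in icons:
        s = target / min(icon.width, icon.height)
        scaled.append(Icon(icon.name, icon.x, icon.y,
                           icon.width * s, icon.height * s))
    return scaled
```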
In one embodiment, the microprocessor 170 retains the size of the icons 112, 114 and 116 from the first activation. However, in order to improve the likelihood of the user activating the desired icon (e.g. icon 114) upon the next activation, the microprocessor 170 enlarges the spacing between the icons. For example, the microprocessor 170 may, in one embodiment, arrange the spacing between the icons such that no two icons are displayed within an area as small as, or smaller than, the initial impact zone 110. Of course, the number of icons selected for subsequent selection via the helper screen, and the absolute size of the helper screen, limit the size and spacing of the icons.
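The spacing embodiment might be sketched as follows, keeping the original icon sizes but inserting gaps of at least the impact-zone diameter between neighbours; the left-to-right layout is an assumption, since the patent does not prescribe one.

```python
# Illustrative spacing: icons keep their first-activation sizes, but no two
# neighbours sit closer together than the impact-zone diameter.
def layout_with_spacing(icons: list[Icon], zone: ImpactZone,
                        origin: tuple[float, float] = (0.0, 0.0)) -> list[Icon]:
    """Place icons left-to-right with gaps of at least the impact-zone diameter."""
    gap = 2 * zone.radius
    x, y = origin
    placed = []
    for icon in icons:
        placed.append(Icon(icon.name, x, y, icon.width, icon.height))
        x += icon.width + gap
    return placed
```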
If it is determined that more than one icon 112, 114, 116 or 118 is selected within a helper screen 230, i.e. upon a second or further activation, then the aforementioned method of the invention is carried out again, in order to further reduce the number of icons displayed. In this case, a further helper-screen is generated, with a further reduction in the number of icons displayed. The user then selects the desired icon 114 from this reduced number of icons.
Referring now to FIG. 3, a flow chart 300 describes a method in accordance with one embodiment of the present invention. The method commences when the screen is activated, in step 305. A microprocessor within the computing device then determines a position and dimensions of the activated impact zone, as shown in step 310. The microprocessor then determines whether one or more on-screen icon(s) is/are located within the impact zone, as in step 315.
If no on-screen icon is located within the impact zone, in step 315, the process returns to step 305 to determine whether the screen is activated. If the microprocessor determines that one or more on-screen icon(s) is/are located within the impact zone, in step 315, the microprocessor then determines whether multiple on-screen icon(s) is/are located within the impact zone, as in step 320. If multiple on-screen icons are not located within the impact zone, in step 320, the single selected icon is activated, as shown in step 330.
However, if multiple on-screen icons are located within the impact zone, in step 320, the microprocessor generates and displays a number of the multiple on-screen icons on a helper screen, as shown in step 325. In one embodiment, the microprocessor uses larger icons. In another embodiment, the microprocessor uses a greater spacing between the icons used to represent the multiple icons, as in step 325. The process then returns to step 305, waiting for a further user input on the helper screen to select the desired icon or further narrow down the selection.
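Tying the sketches together, the following loop body is one hedged reading of steps 310 to 330 of FIG. 3; it assumes the helper functions defined in the earlier sketches plus caller-supplied activate and show_helper callbacks, none of which are defined by the patent.

```python
# Hedged sketch of one pass through FIG. 3 (steps 310-330), assuming the
# helper functions from the earlier sketches. 'activate' launches the
# function behind an icon; 'show_helper' redraws the helper screen.
def handle_touch(zone, icons, activate, show_helper):
    hits = filter_accidental(zone, icons_hit(zone, icons))   # steps 310-315
    if not hits:
        return None            # step 315: no icon hit; wait for the next touch
    if len(hits) == 1:
        activate(hits[0])      # step 330: a single icon was selected
        return hits[0]
    # Step 325: several candidates; show them enlarged and spaced apart on a
    # helper screen, then wait for the next activation (back to step 305).
    show_helper(layout_with_spacing(scale_for_helper(hits, zone), zone))
    return None
```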
The present invention is described in terms of a UI for a computing device. However, it will be appreciated by a skilled artisan that the inventive concept herein described may be embodied in any type of UI for a touch-screen device.
It will be appreciated that any suitable distribution of functionality between different functional units or signal processing elements such as touch sensitive devices, signal processing units, etc. may be used without detracting from the inventive concept herein described. Hence, references to specific functional devices or elements are only to be seen as references to suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organization. Aspects of the invention may be implemented in any suitable form including hardware, software, firmware or any combination of these. The elements and components of an embodiment of the invention may be physically, functionally and logically implemented in any suitable way. Indeed, the functionality may be implemented in a single unit or IC, in a plurality of units or ICs or as part of other functional units.
In particular, it is envisaged that the aforementioned inventive concept can be applied by a semiconductor manufacturer to any user interface. It is further envisaged that, for example, a semiconductor manufacturer may employ the inventive concept in a design of a stand-alone user interface for a computing device or application-specific integrated circuit (ASIC) and/or any other sub-system element.
Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the accompanying claims. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in accordance with the invention. In the claims, the term 'comprising' does not exclude the presence of other elements or steps.
Furthermore, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather indicates that the feature is equally applicable to other claim categories, as appropriate. Furthermore, the order of features in the claims does not imply any specific order in which the features must be performed and in particular the order of individual steps in a method claim does not imply that the steps must be performed in this order. Rather, the steps may be performed in any suitable order. In addition, singular references do not exclude a plurality. Thus, references to "a", "an", "first", "second" etc. do not preclude a plurality.
Thus, in summary, a computing device having a touch screen is configured such that an impact on the touch screen that overlaps with a number of icons is detected by a microprocessor 170 in the device. In response to detecting multiple icons, the microprocessor 170 arranges for a number of the multiple icons to be subsequently displayed on a helper screen, from which the user can then select the desired icon by touching the helper screen. Hence, an improved user interface and method of operation have been described, in which the aforementioned disadvantages of prior art arrangements have been substantially alleviated.
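As a companion to the sketch above, the helper-screen arrangement of step 325 might look like the following; the single-row layout and the scale and gap defaults are invented for illustration (the patent leaves the layout open), and the Icon dataclass is reused from the earlier sketch.

```python
def layout_helper_screen(candidates: list[Icon], scale: float = 2.0,
                         gap: float = 24.0,
                         origin: tuple[float, float] = (0.0, 0.0)) -> list[Icon]:
    """Lay the candidate icons out in a row, enlarged by `scale` (one
    embodiment) and separated by `gap` pixels (another embodiment)."""
    x, y = origin
    row = []
    for icon in candidates:
        w, h = icon.w * scale, icon.h * scale   # larger icons
        row.append(Icon(icon.name, x, y, w, h))
        x += w + gap                            # greater spacing between icons
    return row
```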

Claims

1. A method (300) for determining a command given via a touch-screen input device of a computing device, comprising the steps of: detecting (300) an activation of an on-screen icon; and determining (310) an impact zone associated with said activation; wherein the method is characterised by the steps of: determining (320) whether a single on-screen icon or multiple on-screen icons (112, 114, 116, 118) is/are associated with said impact zone (110); and displaying, in response to determining multiple on-screen icons (112, 114, 116, 118), a helper-screen (230) comprising multiple icons (220) associated with said multiple on-screen icons (112, 114, 116, 118) of said impact zone.
2. The method (300) according to Claim 1 further characterised by the step of: repeating, in response to determining multiple on-screen icons (112, 114, 116, 118), the steps of determining (310), determining (320) and displaying until a single on-screen icon (122) is activated.
3. The method (300) according to Claim 1 further characterised in that determining (310) an impact zone associated with said activation comprises determining at least one of:
(i) a position of an activated screen area (110, 130);
(ii) at least one dimension of an activated screen area (110, 130).
4. The method (300) according to Claim 1 further characterised by the step of scaling one or more dimension(s) of the multiple icons (220) displayed in the helper-screen (230), wherein the step of scaling comprises: scaling a number of the one or more dimension(s) of the multiple icons to a dimension similar to the determined dimension of the impact zone (110), or scaling a number of the one or more dimension(s) of the multiple icons to a dimension larger than the determined dimension of the impact zone (110).
5. The method (300) according to Claim 1 further characterised by the steps of: determining a degree of overlap between the impact zone (110) and a respective area of each of the multiple on-screen icons (112, 114, 116, 118); and displaying the multiple icons representing the multiple on-screen icons for which the determined degree of overlap is greater than a threshold level.
6. The method (300) according to Claim 4 further characterised in that a scaling factor applied to the multiple icons displayed on the helper screen is user-selectable.
7. The method (300) according to Claim 1 further characterised by the step of: displaying the multiple icons in a helper screen with increased spacing between the icons.
8. A touch-screen based computing device (100, 200) comprising a user interface having a touch-screen input device and a microprocessor (170), operably coupled to the touch-screen input device, and arranged to detect an activation of an on-screen icon and to determine an impact zone associated with the activation; wherein the touch-screen based computing device (100, 200) is characterised in that the microprocessor (170) determines whether a single on-screen icon or multiple on-screen icons (112, 114, 116, 118) is/are associated with the impact zone (110), and displays on the touch-screen, in response to determining multiple on-screen icons (112, 114, 116, 118), a helper-screen (230) comprising multiple icons (220) associated with the multiple on-screen icons (112, 114, 116, 118) of the impact zone.
9. A touch-screen based computing device (100, 200) according to Claim 8 further characterised in that the microprocessor (170) repeats, in response to determining multiple on-screen icons (112, 114, 116, 118), the process of determining an impact zone associated with the activation, determining whether a single on-screen icon or multiple on-screen icons (112, 114, 116, 118) is/are associated with the impact zone, and displaying the multiple icons until a single on-screen icon (122) is activated.
10. A touch-screen based computing device (100, 200) according to Claim 8 or Claim 9 further characterised in that the microprocessor (170) determines at least one of:
(i) a position;
(ii) at least one dimension of an impact zone (110, 130).
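For concreteness, the overlap filter of Claim 5 and the scaling of Claims 4 and 6 could be expressed as follows, reusing the Icon and ImpactZone classes from the earlier sketch; the 0.25 threshold and the default user factor are illustrative values only, not taken from the claims.

```python
def overlap_fraction(icon: Icon, zone: ImpactZone) -> float:
    """Fraction of the icon's area covered by the impact zone."""
    dx = min(icon.x + icon.w, zone.x + zone.w) - max(icon.x, zone.x)
    dy = min(icon.y + icon.h, zone.y + zone.h) - max(icon.y, zone.y)
    if dx <= 0 or dy <= 0:
        return 0.0                      # no intersection at all
    return (dx * dy) / (icon.w * icon.h)

def helper_candidates(icons: list[Icon], zone: ImpactZone,
                      threshold: float = 0.25) -> list[Icon]:
    """Claim 5: only icons whose overlap with the impact zone exceeds a
    threshold level are carried onto the helper screen."""
    return [i for i in icons if overlap_fraction(i, zone) > threshold]

def scale_to_zone(icon: Icon, zone: ImpactZone, user_factor: float = 1.0) -> Icon:
    """Claims 4 and 6: scale the icon so each dimension is similar to or
    larger than the corresponding impact-zone dimension, with a
    user-selectable factor applied on top."""
    s = max(zone.w / icon.w, zone.h / icon.h) * user_factor
    return Icon(icon.name, icon.x, icon.y, icon.w * s, icon.h * s)
```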
PCT/US2007/060435 2006-01-12 2007-01-12 User interface for a touch-screen based computing device and method therefor WO2007082290A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0600548.2 2006-01-12
GB0600548A GB2434286B (en) 2006-01-12 2006-01-12 User interface for a touch-screen based computing device and method therefor

Publications (2)

Publication Number Publication Date
WO2007082290A2 true WO2007082290A2 (en) 2007-07-19
WO2007082290A3 WO2007082290A3 (en) 2008-04-10

Family

ID=35997889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/060435 WO2007082290A2 (en) 2006-01-12 2007-01-12 User interface for a touch-screen based computing device and method therefor

Country Status (2)

Country Link
GB (1) GB2434286B (en)
WO (1) WO2007082290A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8405627B2 (en) * 2010-12-07 2013-03-26 Sony Mobile Communications Ab Touch input disambiguation
KR20150073354A (en) * 2013-12-23 2015-07-01 삼성전자주식회사 method and apparatus processing an object provided via a display

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US6073036A (en) * 1997-04-28 2000-06-06 Nokia Mobile Phones Limited Mobile station with touch input having automatic symbol magnification function

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6211856B1 (en) * 1998-04-17 2001-04-03 Sung M. Choi Graphical user interface touch screen with an auto zoom feature
US6259436B1 (en) * 1998-12-22 2001-07-10 Ericsson Inc. Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US20020122029A1 (en) * 1999-05-20 2002-09-05 Murphy Stephen C. Computer touch screen adapted to facilitate selection of features at edge of screen
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9274698B2 (en) 2007-10-26 2016-03-01 Blackberry Limited Electronic device and method of controlling same
US10423311B2 (en) 2007-10-26 2019-09-24 Blackberry Limited Text selection using a touch sensitive screen of a handheld mobile communication device
US11029827B2 (en) 2007-10-26 2021-06-08 Blackberry Limited Text selection using a touch sensitive screen of a handheld mobile communication device
KR101472591B1 (en) * 2008-11-14 2014-12-17 삼성전자주식회사 Method for selection of portion of contents magnified with a zoom function, apparatus for serveing the contents, and system for the same
US8930848B2 (en) 2008-11-14 2015-01-06 Samsung Electronics Co., Ltd. Method for selecting area of content for enlargement, and apparatus and system for providing content
US9501168B2 (en) 2011-08-10 2016-11-22 Cypress Semiconductor Corporation Methods and apparatus to detect a presence of a conductive object
US10338739B1 (en) 2011-08-10 2019-07-02 Cypress Semiconductor Corporation Methods and apparatus to detect a presence of a conductive object
US9141280B2 (en) 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
US9383921B2 (en) 2011-11-09 2016-07-05 Blackberry Limited Touch-sensitive display method and apparatus
US9588680B2 (en) 2011-11-09 2017-03-07 Blackberry Limited Touch-sensitive display method and apparatus

Also Published As

Publication number Publication date
GB0600548D0 (en) 2006-02-22
WO2007082290A3 (en) 2008-04-10
GB2434286A (en) 2007-07-18
GB2434286B (en) 2008-05-28

Similar Documents

Publication Publication Date Title
US10866724B2 (en) Input and output method in touch screen terminal and apparatus therefor
EP2502136B1 (en) Method and apparatus for replicating physical key function with soft keys in an electronic device
US9740321B2 (en) Method for operating application program and mobile electronic device using the same
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
KR101012598B1 (en) Method and computer readable medium for generating display on touch screen of computer
TWI428812B (en) Method for controlling application program, electronic device thereof, recording medium thereof, and computer program product using the method
US9875005B2 (en) Method of unlocking electronic device by displaying unlocking objects at randomized/user-defined locations and related computer readable medium thereof
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
TW201329835A (en) Display control device, display control method, and computer program
JP5556398B2 (en) Information processing apparatus, information processing method, and program
US8558806B2 (en) Information processing apparatus, information processing method, and program
GB2516029A (en) Touchscreen keyboard
WO2007082290A2 (en) User interface for a touch-screen based computing device and method therefor
US20110148776A1 (en) Overlay Handling
WO2011152335A1 (en) Electronic device using touch panel input and method for receiving operation thereby
JP6217633B2 (en) Mobile terminal device, control method for mobile terminal device, and program
KR20090056469A (en) Apparatus and method for reacting to touch on a touch screen
KR101678213B1 (en) An apparatus for user interface by detecting increase or decrease of touch area and method thereof
KR102296968B1 (en) Control method of favorites mode and device including touch screen performing the same
KR100859882B1 (en) Method and device for recognizing a dual point user input on a touch based user input device
JP5165624B2 (en) Information input device, object display method, and computer-executable program
US11893229B2 (en) Portable electronic device and one-hand touch operation method thereof
EP2743812B1 (en) Method for selecting a plurality of entries on a user interface
JP7019992B2 (en) Display input device and image forming device equipped with it
USRE46020E1 (en) Method of controlling pointer in mobile terminal having pointing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07710082

Country of ref document: EP

Kind code of ref document: A2