US20130019192A1 - Pickup hand detection and its application for mobile devices - Google Patents
- Publication number
- US20130019192A1 (application US13/182,355)
- Authority
- US
- United States
- Prior art keywords
- screen display
- screen
- user
- program code
- computer program
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
Definitions
- FIG. 1A is an exemplary embodiment of a device 100 with a speaker 106 and a microphone 108 .
- the device 100 may be, for example, a handheld computer, a server, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, a network base station, a media player, a navigation device, an e-mail device, a game console, a television receiver (e.g., a satellite or cable television set-top box), a digital-video-recorder (DVR), an automatic teller machine (ATM), a security system (e.g., a door or gate access system), or a combination of any two or more of these data processing devices or other data processing devices.
- the device 100 may comprise any type of electronic device, general purpose computing device, or special purpose computing device that includes a processor, other circuitry, or logic operable to perform the screen switch process described herein to facilitate a user's data input by an object, such as a thumb of the user, for example.
- the device 100 may include a display device, such as a touch screen 102 , which may be operable to present a first screen display 202 (shown in FIG. 2A ) and a second screen display 204 (shown in FIG. 2B ).
- the touch screen 102 may be implemented with liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology.
- the touch screen 102 may be sensitive to haptic and/or tactile contact by a user.
- the touch screen 102 may comprise a multi-touch-sensitive display.
- a multi-touch-sensitive display may, for example, process multiple simultaneous touch points, including processing data related to the pressure, degree, and/or position of each touch point.
- the device 100 may comprise a battery 114 , a memory 118 , a processing unit 120 (e.g., a central processing unit (CPU)), and an orientation sensor 112 (e.g., an accelerometer (g-sensor) and/or a rotational sensor (gyroscope)).
- the orientation sensor 112 may be connected to the processing unit 120 and may be controlled by one or a combination of a monitoring circuit and operating software. The sensor may detect the orientation of the device 100 or information from which the orientation of the device 100 may be determined, such as acceleration.
- the orientation sensor 112 is a two axis accelerometer.
- an orientation sensor other than an accelerometer may be used, such as a gravity sensor, a gyroscope, a tilt sensor, an electronic compass or other suitable sensors, or combinations thereof.
- the device 100 may comprise two or more sensors, such as an accelerometer and an electronic compass.
- an accelerometer is a sensor which converts acceleration from motion (e.g., movement of the mobile communication device 100 or a portion thereof due to a strike force) and gravity, which are detected by a sensing element, into an electrical signal (producing a corresponding change in output), and is available in one, two, or three axis configurations. Accelerometers may produce digital or analog output signals depending on the type of accelerometer: an analog output requires buffering and analog-to-digital (A/D) conversion, whereas a digital output is typically available over an industry standard interface, such as an SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit) interface.
- the output of an accelerometer is typically measured in terms of the gravitational acceleration constant at the Earth's surface, denoted g, which is approximately 9.81 m/s² (32.2 ft/s²) as the standard average.
- the accelerometer may be various types including, but not limited to, a capacitive, piezoelectric, piezoresistive, or gas-based accelerometer.
- the measurement ranges of accelerometers vary up to the thousands of g's; however, for portable electronic devices, “low-g” accelerometers may be used. Examples of low-g accelerometers which may be used are micro electro-mechanical systems (MEMS) digital accelerometers from Analog Devices, Inc. (ADI), Freescale Semiconductor, Inc. (Freescale), and STMicroelectronics N.V. of Geneva, Switzerland.
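To make the output scaling above concrete, here is a minimal sketch of converting a raw digital count from a low-g accelerometer into m/s²; the ±2 g range, 12-bit resolution, and function name are illustrative assumptions, not values from any particular part (a real device's scale factor comes from its datasheet).

```python
# Convert a raw digital accelerometer reading into physical units.
# Assumes a hypothetical 12-bit, +/-2 g part.

G = 9.81            # standard gravity at the Earth's surface, m/s^2
FULL_SCALE_G = 2.0  # assumed +/-2 g measurement range
RESOLUTION_BITS = 12

def counts_to_ms2(raw_counts: int) -> float:
    """Map a signed raw count onto acceleration in m/s^2."""
    counts_per_g = (2 ** (RESOLUTION_BITS - 1)) / FULL_SCALE_G  # 1024 counts per g
    return (raw_counts / counts_per_g) * G

# A device resting flat should read about 1 g on one axis:
# counts_to_ms2(1024) -> 9.81
```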
- the device 100 may have one or more graphical user interfaces 109 on the touch-sensitive display 102 for providing the user access to various system objects and for conveying information to the user.
- the first screen display 202 may include a plurality of screen icons 201 , e.g., an Evernote® icon 226 , a calendar icon 224 , a photo icon 220 , a camera icon 206 , a settings icon 230 , a news icon 228 , a map icon 222 , a weather icon 208 , a memo icon 232 , a clock icon 234 , a TeamViewer® icon 210 , a phone icon 218 , a mail icon 216 , a note icon 214 , and a media icon 212 .
- the icons may be arranged in a grid pattern comprising a plurality of columns and rows. Rows and/or columns may be straight, curved, or otherwise. In other exemplary embodiments, icons may be arranged in various other patterns and layouts.
- theta may be defined as an angle of the y-axis relative to the gravity vector 260 for a two-axis accelerometer in accordance with one exemplary embodiment of the present invention.
- the measurement axis 280 may be aligned with an axis 270 of the device 100 .
- the x-axis and y-axis are typically aligned with the input plane of the touch screen 102 .
- the z-axis (not shown) is perpendicular to the horizontal plane and detects orientation information when the device 100 is moved vertically.
- the angle θ may be calculated using equation (1), where x_sensor and y_sensor are the measurements from the x-axis and y-axis of the two-axis accelerometer. It will be appreciated that θ can also be determined by other means.
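Equation (1) is not reproduced here, but a common way to obtain such a tilt angle from two-axis accelerometer readings is via `atan2`. The sketch below is illustrative only; the sign convention (positive θ for a left-hand tilt, matching the surrounding description) is an assumption, not the patent's exact formula.

```python
import math

def tilt_angle_deg(x_sensor: float, y_sensor: float) -> float:
    """Estimate the angle of the device y-axis relative to the
    gravity vector from two-axis accelerometer readings.

    atan2 handles all four quadrants, so the result is signed;
    here positive values are assumed to correspond to a left-hand
    tilt and negative values to a right-hand tilt.
    """
    return math.degrees(math.atan2(x_sensor, y_sensor))

# Gravity entirely along +y (device held upright): theta = 0
# Equal x and y components: theta = 45 degrees
```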
- the device orientation may be defined by which one of the top 250 , bottom 256 , left-hand side 252 , right-hand side 254 of the device 100 is directed generally upward.
- the device orientation may be measured by the θ angle, x_sensor , or y_sensor .
- a switching system 310 (shown in FIG. 3 ) may determine that it is the user's left hand holding the device 100 .
- the switching system 310 may then switch to a first screen display where icons, communication bars, and dialog buttons may be arranged and facilitated for left-hand thumb use, as shown in FIG. 2A .
- the switching system 310 may determine it is the user's right hand holding the device 100 and may switch to a second screen display where icons, communication bars, or dialog buttons may be arranged and facilitated for right-hand thumb use, as shown in FIG. 2B .
- the switching system 310 may determine that a user holding the device is lying down and the top 250 of the device 100 is downward. In that case, if the θ angle or x_sensor is positive, the switching system 310 may determine it is the user's left hand holding the device 100 and may switch to a first screen display where icons, communication bars, and dialog buttons may be arranged and facilitated for left-hand thumb use. When the θ angle or x_sensor is negative, the switching system 310 may determine it is the user's right hand holding the device 100 and may switch to a second screen display where icons, communication bars, or dialog buttons may be arranged and facilitated for right-hand thumb use.
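The sign-based decision rules above can be collected into a short sketch; the function name and display labels are illustrative assumptions, not terms from the claims.

```python
def choose_screen_display(reading: float) -> str:
    """Map a signed orientation reading (the theta angle, x_sensor,
    or y_sensor, per the description above) onto a screen display:
      positive -> left hand  -> first display,
      negative -> right hand -> second display,
      zero     -> ambiguous  -> keep the current display.
    """
    if reading > 0:
        return "first"
    if reading < 0:
        return "second"
    return "unchanged"
```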
- the switching system 310 may determine that the user, who is holding the device close to his/her ear, is lying on his/her side, or the device 100 is placed on a table or chair.
- the switching system 310 may have memory 118 , which stores a screen display for a pre-determined time period, such as a few seconds, for example.
- the switching system 310 may present a previously stored screen display on the touch screen 102 .
- a combination of icons on the second screen display 204 may be the same as, but shuffled from, the icons on the first screen display 202 .
- the icons on the first screen display 202 may be different from the icons in the second screen display 204 in FIG. 2B .
- the icons on the second screen display 204 may include some of the icons found on the first screen display 202 .
- the Evernote® icon 226 in FIG. 2A may be switched and moved to the position that the camera icon 206 used to occupy, for example.
- the phone icon 218 may be shuffled and moved to the area that the media icon 212 used to occupy, for example.
- the switching system 310 may arrange the most useful icons, menus, buttons, sliding bars, and the like within the reach of the thumb of a user's hand that is holding the device 100 .
- the positions of icons may be rearranged in a symmetrical manner with respect to the y-axis when switching between the first and second screen displays.
- positions of all icons may be shifted, i.e., when a user holds the device with a left hand, icons may be shifted to the left area of the screen display within the reach of the left thumb, and when a user holds the device with a right hand, icons may be shifted to the right area of the screen display within the reach of the right thumb.
- where a screen display has a user interface image which includes button icons for selecting a function, the positions of the button icons may be shifted according to which hand is holding the device so that they remain within easy reach of the thumb.
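For the symmetric case described above, the rearrangement can be sketched as a mirror of each row of the icon grid about the vertical axis; the grid contents and function name are illustrative assumptions.

```python
def mirror_layout(grid):
    """Mirror an icon grid about the vertical (y) axis by reversing
    each row, so icons near the left edge move near the right edge
    and vice versa. `grid` is a list of rows of icon names."""
    return [list(reversed(row)) for row in grid]

left_hand = [["phone", "mail", "camera"],
             ["maps", "clock", "notes"]]
right_hand = mirror_layout(left_hand)
# right_hand[0] == ["camera", "mail", "phone"]
```

Mirroring twice restores the original layout, which is what lets the switching system flip back and forth between the two displays without storing both.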
- the device 100 may include a switching system 310 , the first screen display 202 shown in FIG. 2A , and the second screen display 204 shown in FIG. 2B .
- the switching system 310 may have one or more computer software and/or hardware systems which control switching between the first screen display 202 and the second screen display 204 .
- the switching system 310 may switch a screen item between the first screen display and the second screen display.
- screen items may include a plurality of icons 201 , for example.
- screen items may further include menus, buttons, sliding bars, interface controls, or a plurality of images, such as composite images.
- the switching system 310 may receive orientation information, such as an angle of the apparatus with the gravity vector, from an orientation sensor.
- the switching system 310 may determine which hand of a user holds the apparatus, and rearrange a user interface in accordance with which hand of the user holds the apparatus. If the user uses his/her left hand to hold the apparatus, the switching system 310 may present the first screen display 202 . If the user uses his/her right hand to hold the apparatus, the switching system 310 may present the second screen display 204 .
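Putting the pieces together, the process that FIG. 3 diagrams might be sketched as follows; the class name, method names, and the convention that a positive tilt indicates the left hand are illustrative assumptions.

```python
import math

class SwitchingSystem:
    """Minimal sketch of the switching system: read a two-axis
    accelerometer, infer the holding hand from the sign of the tilt
    angle, and present the matching screen display."""

    def __init__(self):
        self.current_display = "first"

    def update(self, x_sensor: float, y_sensor: float) -> str:
        theta = math.degrees(math.atan2(x_sensor, y_sensor))
        if theta > 0:        # assumed: left-hand tilt
            self.current_display = "first"
        elif theta < 0:      # assumed: right-hand tilt
            self.current_display = "second"
        # theta == 0 is ambiguous: keep the current display
        return self.current_display
```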
Abstract
A method and apparatus are provided for switching from a first screen display to a second screen display for a user to input data. An apparatus may comprise a first screen display for facilitating a thumb of a user to input on a touch screen of the apparatus, a second screen display for facilitating the other thumb of the user to input on the touch screen of the apparatus, and a switching system. The switching system may be configured to switch a screen item position between the first screen display and the second screen display. The switching system may receive orientation information of the apparatus from an orientation sensor, determine which hand of the user holds the apparatus, and may switch operation of the apparatus between the first screen display and the second screen display.
Description
- The present invention relates generally to methods and systems for controlling an interface of a computing system and, more specifically, to methods and systems for controlling a graphical user interface of a handheld electronic device.
- Electronic devices, including portable electronic devices, have gained widespread use and may provide a variety of functions including, for example, telephonic, electronic messaging and other personal information manager application functions. Portable electronic devices include, for example, several types of mobile stations, such as simple cellular telephones, smart phones, wireless personal digital assistants (PDAs), tablets, and laptop computers with wireless or Bluetooth® capabilities.
- Portable electronic devices, such as PDAs or smart telephones, are generally intended for handheld use and ease of portability. A touch screen display for input and output is particularly useful on such handheld devices, as such handheld devices are small. However, these devices have a limited area for rendering content on the touch screen display and for rendering features or icons, for example, for user interaction. The problem may be further exacerbated when users use the handheld electronic device with one hand while touching the screen with the thumb of the hand since thumbs cannot reach screen items far away on the touch screen.
- Therefore, it can be seen that there is a need for a method and system of controlling a graphic interface of a hand held device.
- In one aspect, an apparatus comprises a switching system that is configured to receive orientation information of the apparatus; determine which hand of a user holds the apparatus; and switch operation of the apparatus between a first screen display and a second screen display of the apparatus depending on which hand of the user holds the apparatus.
- In another aspect, a method comprises detecting an orientation of an apparatus; determining which hand of a user holds the apparatus; and rearranging a user interface in accordance with which hand of the user holds the apparatus.
- In a further aspect, a computer readable medium having computer usable program code embodied therewith, the computer program code comprises computer program code configured to switch operation between a first screen display and a second screen display, wherein the first screen display and the second screen display have a plurality of screen items; and computer program code configured to determine which hand of a user holds an apparatus before switching operation from the first screen display to the second screen display.
- These and other features, aspects and advantages of the present invention will become better understood with reference to the following drawings, description and claims.
- FIG. 1A is a plan view of an exemplary embodiment of a device with a touch screen;
- FIG. 1B is a schematic view of an exemplary embodiment of a device with an orientation sensor;
- FIG. 2A is a screenshot of a first screen display when a user holds the device with a left hand according to an exemplary embodiment;
- FIG. 2B is a screenshot of a second screen display when a user holds the device with a right hand according to an exemplary embodiment; and
- FIG. 3 is a flow diagram of an exemplary process for user switching between the first screen display and the second screen display.
- The following detailed description is of the best currently contemplated modes of carrying out exemplary embodiments. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles, since the scope of the embodiments is best defined by the appended claims.
- Various inventive features are described below that can each be used independently of one another or in combination with other features.
- Broadly, exemplary embodiments provide methods and systems for controlling an interface of a communications device using an orientation sensor that detects which hand of a user is holding the communications device. This allows for different input configurations based upon which hand is holding the device. Exemplary embodiments optimize the user-friendliness of communications devices from a tactile input perspective. Additional input points and options enable complex applications of functions otherwise impractical for handheld devices. In embodiments of the invention, the flexibility of the input approach may be used to support adaptation to user limitation, for instance, for the disabled. A memory on the communications device may store one or more user profiles which include input combinations for specific functions for specific users.
- Exemplary embodiments may include an orientation sensor, such as an accelerometer (gravity sensor, or g-sensor), which measures the orientation of the communications device, such as an angle with the gravity vector. Exemplary embodiments may further include a switching system having one or more computer hardware and/or software systems which control switching between a first screen display for the left hand of a user and a second screen display for the right hand of the user. The orientation information may be used by the switching system to determine an arrangement of icons, menus, buttons, sliding bars on a touch screen, or interface controls, for example. More specifically, the switching system receives the orientation information of the communications device, such as the angle between the communications device and the gravity vector. If the angle is positive, as shown in FIG. 2A , the switching system may determine it is the user's left hand holding the communications device. The switching system may switch to a first screen display where the most useful icons, communication bars, and dialog buttons may be arranged and facilitated within the reach of the left-hand thumb. If the angle is negative, as shown in FIG. 2B , the switching system may determine it is the user's right hand holding the communications device. The switching system may switch to a second screen display where the most useful icons, communication bars, or dialog buttons may be arranged and facilitated within the reach of the right-hand thumb. - Exemplary embodiments may take the form of an entire hardware embodiment, an entire software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, exemplary embodiments may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium.
- Any combination of one or more computer usable or computer readable medium(s) may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction performance system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wired, wire line, optical fiber cable, RF, etc.
- Computer program code for carrying out operations of exemplary embodiments may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Smalltalk™, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- Exemplary embodiments are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions.
- These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
-
FIG. 1A is an exemplary embodiment of a device 100 with a speaker 106 and a microphone 108. The device 100 may be, for example, a handheld computer, a server, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, a network base station, a media player, a navigation device, an e-mail device, a game console, a television receiver (e.g., a satellite or cable television set-top box), a digital video recorder (DVR), an automatic teller machine (ATM), a security system (e.g., a door or gate access system), or a combination of any two or more of these data processing devices or other data processing devices. In other words, the device 100 may comprise any type of electronic device, general purpose computing device, or special purpose computing device that includes a processor, other circuitry, or logic operable to perform the screen switch process described herein to facilitate a user's data input by an object, such as a thumb of the user, for example. - In some embodiments, the
device 100 may include a display device, such as a touch screen 102, which may be operable to present a first screen display 202 (shown in FIG. 2A) and a second screen display 204 (shown in FIG. 2B). The touch screen 102 may be implemented with liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch screen 102 may be sensitive to haptic and/or tactile contact by a user. - In some implementations, the
touch screen 102 may comprise a multi-touch-sensitive display. A multi-touch-sensitive display may, for example, process multiple simultaneous touch points, including processing data related to the pressure, degree, and/or position of each touch point. - Referring to
FIG. 1B, the device 100 may comprise a battery 114, a memory 118, a processing unit 120 (e.g., a central processing unit (CPU)), and an orientation sensor 112 (e.g., an accelerometer (G-sensor) or a rotational sensor (gyroscope)). The orientation sensor 112 may be connected to the processing unit 120 and may be controlled by one or a combination of a monitoring circuit and operating software. The sensor may detect the orientation of the device 100 or information from which the orientation of the device 100 may be determined, such as acceleration. - In some embodiments, the
orientation sensor 112 is a two-axis accelerometer. In other embodiments, an orientation sensor other than an accelerometer may be used, such as a gravity sensor, a gyroscope, a tilt sensor, an electronic compass, other suitable sensors, or combinations thereof. In some embodiments, the device 100 may comprise two or more sensors, such as an accelerometer and an electronic compass. - As will be appreciated by persons skilled in the art, an accelerometer is a sensor which converts acceleration from motion (e.g., movement of the
mobile communication device 100 or a portion thereof due to a strike force) and gravity, which are detected by a sensing element, into an electrical signal (producing a corresponding change in output), and is available in one-, two-, or three-axis configurations. Accelerometers may produce digital or analog output signals depending on the type of accelerometer. Generally, two types of outputs are available depending on whether an analog or digital accelerometer is used: (1) an analog output requiring buffering and analog-to-digital (A/D) conversion; and (2) a digital output, which is typically available in an industry standard interface, such as an SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit) interface. - The output of an accelerometer is typically measured in terms of the gravitational acceleration constant at the Earth's surface, denoted g, which is approximately 9.81 m/s² (32.2 ft/s²) as the standard average. The accelerometer may be of various types including, but not limited to, capacitive, piezoelectric, piezoresistive, or gas-based accelerometers. Accelerometer ranges vary up to thousands of g's; however, for portable electronic devices, "low-g" accelerometers may be used. Examples of low-g accelerometers which may be used are micro-electro-mechanical systems (MEMS) digital accelerometers from Analog Devices, Inc. (ADI), Freescale Semiconductor, Inc. (Freescale), and STMicroelectronics N.V. of Geneva, Switzerland.
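As a concrete illustration of the digital-output case described above, a raw accelerometer reading can be scaled to g units and then to m/s². This is a minimal sketch, not code from the patent; the sensitivity of 256 counts per g is an assumed example value, and a real part's datasheet gives its actual scale factor.

```python
def counts_to_g(raw_counts, counts_per_g=256):
    # Scale a raw digital accelerometer reading to g units.
    # counts_per_g (the part's sensitivity) is an assumed example value;
    # consult the specific accelerometer's datasheet for the real figure.
    return raw_counts / counts_per_g

def g_to_ms2(g_value, standard_gravity=9.81):
    # Convert g units to m/s^2 using the standard average value of g
    # at the Earth's surface cited above.
    return g_value * standard_gravity
```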
- Referring to FIG. 2A, in some implementations, the
device 100 may have one or more graphical user interfaces 109 on the touch-sensitive display 102 for providing the user access to various system objects and for conveying information to the user. - In some implementations, the
first screen display 202 may include a plurality of screen icons 201, e.g., an Evernote® icon 226, a calendar icon 224, a photo icon 220, a camera icon 206, a settings icon 230, a news icon 228, a map icon 222, a weather icon 208, a memo icon 232, a clock icon 234, a TeamViewer® icon 210, a phone icon 218, a mail icon 216, a note icon 214, and a media icon 212. The icons may be arranged in a grid pattern comprising a plurality of columns and rows. Rows and/or columns may be straight, curved, or otherwise. In other exemplary embodiments, icons may be arranged in various other patterns and layouts. - When the
device 100 is held by a left hand, the device 100 may be inclined slightly toward the left, since the weight of the device 100 can be supported by the base of the thumb. As shown in FIG. 2A, theta (θ) may be defined as the angle of the y-axis relative to the gravity vector 260 for a two-axis accelerometer in accordance with one exemplary embodiment of the present invention. The measurement axis 280 may be aligned with an axis 270 of the device 100. The x-axis and y-axis are typically aligned with the input plane of the touch screen 102. The z-axis (not shown) is perpendicular to the horizontal plane and detects orientation information when the device 100 is moved vertically. Theta may be calculated using equation (1). -
θ = arctan(xsensor/ysensor) (1) - where xsensor and ysensor are the measurements from the x-axis and y-axis of the two-axis accelerometer. It will be appreciated that θ can also be determined by other means.
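Equation (1) can be sketched in code as follows; this is an illustrative fragment, not code from the patent. `math.atan2` is used rather than a bare arctangent so that ysensor = 0 does not divide by zero; for ysensor > 0 it reduces to arctan(xsensor/ysensor).

```python
import math

def pickup_angle_degrees(x_sensor, y_sensor):
    # Equation (1): theta = arctan(xsensor / ysensor).
    # atan2 preserves the sign conventions used in the text: a positive
    # theta suggests a left-hand hold, a negative theta a right-hand hold.
    return math.degrees(math.atan2(x_sensor, y_sensor))
```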
- It will be appreciated that the device orientation may be defined by which one of the top 250, bottom 256, left-hand side 252, or right-hand side 254 of the device 100 is directed generally upward. The device orientation may be measured by the θ angle, xsensor, or ysensor. When ysensor is positive, a switching system 310 (shown in FIG. 3) may determine that a user holding the device is not lying down and the top 250 of the device 100 is upward. In addition, if the θ angle or xsensor (also called the slope) is positive, the switching system 310 may determine it is the user's left hand holding the device 100. The switching system 310 may switch to a first screen display where icons, communication bars, and dialog buttons may be arranged for left-thumb use, as shown in FIG. 2A. When the θ angle or xsensor is negative, the switching system 310 may determine it is the user's right hand holding the device 100, and the switching system 310 may switch to a second screen display where icons, communication bars, or dialog buttons may be arranged for right-thumb use, as shown in FIG. 2B. - When ysensor is negative, the
switching system 310 may determine that a user holding the device is lying down and the top 250 of the device 100 is downward. In addition, if the θ angle or xsensor is positive, the switching system 310 may determine it is the user's left hand holding the device 100, and the switching system may switch to a first screen display where icons, communication bars, and dialog buttons may be arranged for left-thumb use. When the θ angle or xsensor is negative, the switching system 310 may determine it is the user's right hand holding the device 100, and the switching system 310 may switch to a second screen display where icons, communication bars, or dialog buttons may be arranged for right-thumb use. - If xsensor and ysensor are zero (the
device 100 is parallel to the ground), the switching system 310 may determine that the user, who is holding the device close to his/her ear, is lying on his/her side, or that the device 100 is placed on a table or chair. The switching system 310 may have memory 118, which stores a screen display for a pre-determined time period, such as a few seconds, for example. When the switching system 310 receives orientation information in which xsensor and ysensor are zero from the orientation sensor 112, the switching system 310 may present a previously stored screen display on the touch screen 102. - Referring to
FIG. 2B, in one exemplary embodiment, the combination of icons on the second screen display 204 may be the same as, but shuffled from, the icons on the first screen display 202. In another exemplary embodiment, the icons on the first screen display 202 may be different from the icons on the second screen display 204 in FIG. 2B. In still other exemplary embodiments, the icons on the second screen display 204 may include some of the icons found on the first screen display 202. For example, when the first screen display 202 changes to the second screen display 204, the Evernote® icon 226 in FIG. 2A may be moved to the position the camera icon 206 used to occupy. Similarly, the phone icon 218 may be moved to the area the media icon 212 used to occupy. In this way, the switching system 310 may arrange the most useful icons, menus, buttons, sliding bars, and the like within reach of the thumb of the hand that is holding the device 100. - In an exemplary embodiment, the positions of icons may be rearranged symmetrically with respect to the y-axis when switching between the first and second screen displays. In another exemplary embodiment, when a screen display has relatively few icons and enough blank space, the positions of all icons may be shifted: when a user holds the device with the left hand, icons may be shifted to the left area of the screen display within reach of the left thumb, and when a user holds the device with the right hand, icons may be shifted to the right area of the screen display within reach of the right thumb. In yet another exemplary embodiment, when a screen display has a user interface image which includes button icons for selecting a function, the positions of the button icons may be shifted according to which hand is holding the device for ease of thumb reach.
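The symmetric rearrangement described above can be sketched as mirroring each icon's column about the vertical center line of the grid. This is an illustrative fragment under assumed data structures (a dict mapping icon name to a (row, column) position), not code from the patent.

```python
def mirror_about_y_axis(layout, num_columns):
    # Reflect each icon's column index about the grid's vertical center
    # line, keeping its row, so that a layout tuned for the left thumb
    # becomes one tuned for the right thumb (and back again).
    return {icon: (row, num_columns - 1 - col)
            for icon, (row, col) in layout.items()}
```

Applying the function twice returns the original layout, which matches the idea of toggling between the first and second screen displays.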
- Referring to
FIG. 3, the device 100 may include a switching system 310, the first screen display 202 shown in FIG. 2A, and the second screen display 204 shown in FIG. 2B. The switching system 310 may have one or more computer software and/or hardware systems which control switching between the first screen display 202 and the second screen display 204. The switching system 310 may switch a screen item between the first screen display and the second screen display. In one embodiment, screen items may include a plurality of icons 201, for example. In another exemplary embodiment, screen items may further include menus, buttons, sliding bars, interface controls, or a plurality of images, such as composite images. - The
switching system 310 may receive orientation information, such as the angle of the apparatus relative to the gravity vector, from an orientation sensor. The switching system 310 may determine which hand of the user holds the apparatus, and rearrange the user interface in accordance with which hand of the user holds the apparatus. If the user uses his/her left hand to hold the apparatus, the switching system 310 may present the first screen display 202. If the user uses his/her right hand to hold the apparatus, the switching system 310 may present the second screen display 204. - It should be understood, of course, that the foregoing relates to exemplary embodiments of the invention and that modifications may be made without departing from the spirit and scope of the invention as set forth in the following claims.
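The sign rules described above (ysensor distinguishing top-up from top-down, xsensor determining the holding hand, and the stored display when both axes read zero) can be collected into a single decision routine. This is a hedged sketch with assumed names and string labels, not an implementation from the patent.

```python
def select_display(x_sensor, y_sensor, stored_display="first"):
    # Device parallel to the ground: both axes read zero, so restore
    # the previously stored screen display (memory 118 in the text).
    if x_sensor == 0 and y_sensor == 0:
        return stored_display
    # Positive slope (xsensor > 0): left-hand hold -> first display.
    # Negative slope: right-hand hold -> second display. The sign of
    # ysensor only indicates top-up vs. top-down; the hand rule is the
    # same in both cases.
    return "first" if x_sensor > 0 else "second"
```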
Claims (20)
1. An apparatus, comprising:
a switching system that is configured to:
receive orientation information of the apparatus;
determine which hand of a user holds the apparatus; and
switch operation of the apparatus between a first screen display and a second screen display of the apparatus depending on which hand of the user holds the apparatus.
2. The apparatus of claim 1 further comprising an orientation sensor, wherein the orientation sensor sends out the orientation information of the apparatus.
3. The apparatus of claim 1, wherein the first screen display and the second screen display comprise a plurality of screen items.
4. The apparatus of claim 3, wherein the switching system switches a position of screen items when switching from the first screen display to the second screen display.
5. The apparatus of claim 3, wherein the screen items on the first screen display are displayed in accordance with a contact position of a thumb of the user on a touch screen of the apparatus.
6. The apparatus of claim 3, wherein the screen items in the second screen display are displayed in accordance with a contact position of a thumb of the user.
7. The apparatus of claim 3, wherein screen items are one of an icon and an image.
8. The apparatus of claim 1, wherein the switching system has a memory which stores a screen display.
9. The apparatus of claim 1, wherein the switching system displays the stored screen display.
10. A method, comprising:
detecting an orientation of an apparatus;
determining which hand of a user holds the apparatus; and
rearranging a user interface in accordance with which hand of the user holds the apparatus.
11. The method of claim 10 further comprising presenting a first screen display when the user holds the apparatus with the left hand.
12. The method of claim 10 further comprising presenting a second screen display when the user holds the apparatus with the right hand.
13. The method of claim 12 further comprising shuffling a combination of screen items when switching between the first screen display and the second screen display.
14. The method of claim 10 further comprising arranging screen items in accordance with a contact position of a thumb of the user on a touch screen of the apparatus.
15. A computer readable medium having computer usable program code embodied therewith, the computer program code comprising:
computer program code configured to switch operation between a first screen display and a second screen display, wherein the first screen display and the second screen display have a plurality of screen items; and
computer program code configured to determine which hand of a user holds an apparatus before switching operation between the first screen display and the second screen display.
16. The computer program code of claim 15 further comprising computer program code configured to receive orientation information from an orientation sensor.
17. The computer program code of claim 15 further comprising computer program code configured to display the first screen display in accordance with a contact position of a thumb of the user on a touch screen of the apparatus.
18. The computer program code of claim 15 further comprising computer program code configured to display the second screen display in accordance with a contact position of a thumb of the user on a touch screen of the apparatus.
19. The computer program code of claim 15 further comprising computer program code configured to store a screen display in a memory.
20. The computer program code of claim 15 further comprising computer program code configured to display the stored screen display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/182,355 US20130019192A1 (en) | 2011-07-13 | 2011-07-13 | Pickup hand detection and its application for mobile devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130019192A1 true US20130019192A1 (en) | 2013-01-17 |
Family
ID=47519686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/182,355 Abandoned US20130019192A1 (en) | 2011-07-13 | 2011-07-13 | Pickup hand detection and its application for mobile devices |
Country Status (1)
Country | Link |
---|---|
US (1) | US20130019192A1 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080119237A1 (en) * | 2006-11-16 | 2008-05-22 | Lg Electronics Inc. | Mobile terminal and screen display method thereof |
US20090109187A1 (en) * | 2007-10-30 | 2009-04-30 | Kabushiki Kaisha Toshiba | Information processing apparatus, launcher, activation control method and computer program product |
US20100085317A1 (en) * | 2008-10-06 | 2010-04-08 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
US20100222046A1 (en) * | 2009-02-27 | 2010-09-02 | Research In Motion Limited | Method and handheld electronic device for triggering advertising on a display screen |
US20110234487A1 (en) * | 2008-12-16 | 2011-09-29 | Tomohiro Hiramoto | Portable terminal device and key arrangement control method |
US20120084692A1 (en) * | 2010-09-30 | 2012-04-05 | Lg Electronics Inc. | Mobile terminal and control method of the mobile terminal |
US20120324381A1 (en) * | 2011-06-17 | 2012-12-20 | Google Inc. | Graphical icon presentation |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120192113A1 (en) * | 2011-01-24 | 2012-07-26 | Kyocera Corporation | Portable electronic device |
US20130052954A1 (en) * | 2011-08-23 | 2013-02-28 | Qualcomm Innovation Center, Inc. | Data transfer between mobile computing devices |
US10963147B2 (en) | 2012-06-01 | 2021-03-30 | Microsoft Technology Licensing, Llc | Media-aware interface |
US11875027B2 (en) * | 2012-06-01 | 2024-01-16 | Microsoft Technology Licensing, Llc | Contextual user interface |
US20140013844A1 (en) * | 2012-07-16 | 2014-01-16 | Lenovo (Beijing) Co., Ltd. | Terminal Device |
US9574878B2 (en) * | 2012-07-16 | 2017-02-21 | Lenovo (Beijing) Co., Ltd. | Terminal device having hand shaking sensing units to determine the manner that a user holds the terminal device |
US20140146007A1 (en) * | 2012-11-26 | 2014-05-29 | Samsung Electronics Co., Ltd. | Touch-sensing display device and driving method thereof |
US20140292818A1 (en) * | 2013-03-26 | 2014-10-02 | Samsung Electronics Co. Ltd. | Display apparatus and control method thereof |
US9886167B2 (en) * | 2013-03-26 | 2018-02-06 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
KR20140122076A (en) * | 2013-04-09 | 2014-10-17 | 삼성전자주식회사 | Method and apparatus for displaying an object of portable electronic device |
KR102161450B1 (en) | 2013-04-09 | 2020-10-05 | 삼성전자 주식회사 | Method and apparatus for displaying an object of portable electronic device |
EP2846238A4 (en) * | 2013-05-29 | 2015-06-17 | Huawei Tech Co Ltd | Method for switching and presentation of operation mode of terminal, and terminal |
US20160345130A1 (en) * | 2013-08-28 | 2016-11-24 | At&T Mobility Ii Llc | Autonomous pull and display of location based service applications by a mobile device based on context of the mobile device |
US11350240B2 (en) | 2013-08-28 | 2022-05-31 | At&T Mobility Ii Llc | Autonomous pull and display of location based service applications by a mobile device based on context of the mobile device |
US10764714B2 (en) * | 2013-08-28 | 2020-09-01 | At&T Mobility Ii Llc | Autonomous pull and display of location based service applications by a mobile device based on context of the mobile device |
US20150089360A1 (en) * | 2013-09-25 | 2015-03-26 | At&T Mobility Ii Llc | Intelligent Adaptation of User Interfaces |
US20150089386A1 (en) * | 2013-09-25 | 2015-03-26 | At&T Mobility Ii Llc | Intelligent Adaptation of Home Screens According to Handedness |
US20150089359A1 (en) * | 2013-09-25 | 2015-03-26 | At&T Mobility Ii Llc | Intelligent Adaptation of Home Screens |
US9204288B2 (en) | 2013-09-25 | 2015-12-01 | At&T Mobility Ii Llc | Intelligent adaptation of address books |
US20150149941A1 (en) * | 2013-11-22 | 2015-05-28 | Fujitsu Limited | Mobile terminal and display control method |
US11714542B2 (en) | 2013-11-29 | 2023-08-01 | Semiconductor Energy Laboratory Co., Ltd. | Data processing device and driving method thereof for a flexible touchscreen device accepting input on the front, rear and sides |
US11294561B2 (en) * | 2013-11-29 | 2022-04-05 | Semiconductor Energy Laboratory Co., Ltd. | Data processing device having flexible position input portion and driving method thereof |
CN103744586A (en) * | 2014-01-07 | 2014-04-23 | 惠州Tcl移动通信有限公司 | Mobile terminal and mobile terminal menu item setting method and device |
US10416856B2 (en) * | 2014-01-27 | 2019-09-17 | Lenovo (Singapore) Pte. Ltd. | Handedness for hand-held devices |
US20150212699A1 (en) * | 2014-01-27 | 2015-07-30 | Lenovo (Singapore) Pte. Ltd. | Handedness for hand-held devices |
DE102015120864B4 (en) | 2014-12-05 | 2022-12-29 | Htc Corporation | Mobile electronic device, user interface display method and recording medium therefor |
US9588643B2 (en) | 2014-12-18 | 2017-03-07 | Apple Inc. | Electronic devices with hand detection circuitry |
US10949077B2 (en) * | 2015-06-19 | 2021-03-16 | Beijing Zhigu Rui Tuo Tech Co., Ltd | Information processing method and device |
US10387017B2 (en) | 2015-11-06 | 2019-08-20 | Samsung Electronics Co., Ltd | Electronic device for displaying multiple screens and control method therefor |
WO2017078314A1 (en) * | 2015-11-06 | 2017-05-11 | Samsung Electronics Co., Ltd. | Electronic device for displaying multiple screens and control method therefor |
US10627948B2 (en) | 2016-05-25 | 2020-04-21 | Microsoft Technology Licensing, Llc | Sequential two-handed touch typing on a mobile device |
CN107357487A (en) * | 2017-07-26 | 2017-11-17 | 掌阅科技股份有限公司 | Application control method, electronic equipment and computer-readable storage medium |
US20190272093A1 (en) * | 2018-03-05 | 2019-09-05 | Omron Corporation | Character input device, character input method, and character input program |
CN109814824A (en) * | 2018-12-29 | 2019-05-28 | 努比亚技术有限公司 | Luminance regulating method, terminal and the storage medium of two-sided screen |
US11847293B2 (en) * | 2021-08-05 | 2023-12-19 | Rolland & Hamann Innovations, LLC | Selectable input alterations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LENOVO (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ITOH, HIROSHI;SHIMOTONO, SUSUMU;REEL/FRAME:026587/0065 Effective date: 20110712 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |