US20150185989A1 - Interactive user interface - Google Patents
- Publication number
- US20150185989A1 (U.S. Application No. 14/645,686)
- Authority
- US
- United States
- Prior art keywords
- touch screen
- displaying
- electronic document
- touch
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B21/00—Exercising apparatus for developing or strengthening the muscles or joints of the body by working against a counterforce, with or without measuring devices
- G06F17/212—
- G06F17/2235—
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/103—Formatting, i.e. changing of presentation of documents
- G06F40/106—Display of layout of documents; Previewing
- G06F40/12—Use of codes for handling textual entities
- G06F40/134—Hyperlinking
Description
- Handheld devices, such as cellular phones, smart phones, personal digital assistants (PDAs), and portable media players, among others, have proliferated rapidly throughout the general public and the business community.
- The computing power of these devices has increased, and the networks on which handheld devices operate have seen dramatic growth.
- Wireless networks now provide handheld device users with access to email, media, and documents, such as web pages, articles, blogs, and electronic books.
- Selecting icons or entering text, while relatively easy to accomplish through a traditional input device (e.g., a keyboard or mouse), presents unique challenges for handheld device users.
- Handheld devices with touch screens typically require users to touch or tap an icon or graphical button to select a command or enter text. Users often need to tap these small icons or graphical buttons in a correct sequence to perform a command. The more taps the user must undertake, the more likely an error will occur, often to the frustration of the user.
- FIG. 1 is a diagram of an example of a system for providing content
- FIG. 3 is a diagram of examples of components of a terminal
- FIG. 4 is a diagram of an example of a software architecture providing functionality for a terminal
- FIGS. 5A, 5B, and 5C are diagrams of examples of touch screen gestures
- FIG. 6 is an example of a screen in a portrait orientation including regions
- FIG. 7 is an example of a screen in a landscape orientation including regions
- FIG. 8A is an example of a screen displaying an electronic document
- FIG. 8B is an example of a screen displaying an electronic document and control objects
- FIG. 9 is a flow diagram of an example of a routine for executing a command based on a selected region
- FIG. 10A is an example of a screen displaying an electronic document
- FIG. 10B is an example of a screen displaying an electronic document and a highlighted element
- FIG. 11 is an example of a simple view of an element
- FIG. 12A is an example of a screen including a simple view of an element displayed in a small size
- FIG. 12B is an example of a screen including a simple view of an element displayed in a large size
- FIG. 13 is a flow diagram of an example of a routine for activating an element
- FIG. 14 is an example of a slide bar
- FIG. 15 is an example of a screen including a slide bar and a slider
- FIG. 16 is an example of a screen including a slide bar and a slider
- FIG. 17 is an example of a screen including a slide bar and a slider
- FIG. 18 is a flow diagram of an example of a routine for navigating an electronic document using a slide bar and a slider
- FIG. 19A is an example of a screen including an electronic document
- FIG. 19B is an example of a screen including an electronic document and a graphical object in front of the electronic document
- FIG. 20 is a flow diagram of an example of a routine for controlling the perceived brightness of an electronic document
- Disclosed embodiments provide apparatuses and computer-implemented methods for providing interactive user interfaces.
- the apparatuses and methods may provide a touch screen capable of detecting touch gestures executed by a user while viewing an electronic document.
- a computer-implemented method for navigating an electronic document on a portable device.
- the method may include displaying the electronic document on a touch screen of the portable device.
- the method may further include defining a plurality of regions on the touch screen.
- the method may further include detecting a tap gesture on the touch screen.
- the method may further include determining a selection of one of the regions based on a location of the tap gesture on the touch screen.
- the method may further include executing, by a processor of the portable device, a command associated with the selected region.
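The tap-to-command flow above (define regions, detect a tap, determine the selected region, execute its command) can be sketched as follows. This is a minimal illustration only; the screen dimensions, region layout, and command names are assumptions and are not taken from the disclosure.

```python
# Sketch of region-based tap handling (illustrative assumptions:
# a 480x800 portrait screen split into three vertical bands, and
# placeholder commands returned as strings).

class Region:
    def __init__(self, name, x0, y0, x1, y1, command):
        self.name = name
        self.bounds = (x0, y0, x1, y1)
        self.command = command  # callable executed on selection

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x < x1 and y0 <= y < y1

def handle_tap(regions, x, y):
    """Determine which region a tap landed in and run its command."""
    for region in regions:
        if region.contains(x, y):
            return region.command()
    return None  # tap fell outside all defined regions

regions = [
    Region("left",   0,   0, 160, 800, lambda: "previous_page"),
    Region("center", 160, 0, 320, 800, lambda: "show_controls"),
    Region("right",  320, 0, 480, 800, lambda: "next_page"),
]
```

In this sketch a tap anywhere in the left band turns the page back, the center band reveals controls, and the right band turns the page forward, mirroring the region-to-command association described above.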
- a computer-implemented method for navigating an electronic document on a terminal.
- the method may include defining a plurality of regions on a touch screen of the terminal.
- the method may further include detecting a tap gesture on the touch screen.
- the method may further include executing, by a processor of the terminal, in response to the detected tap gesture, a command associated with the region selected by the tap gesture.
- a computer-implemented method for navigating an electronic document on a terminal.
- the method may include defining a plurality of regions of a touch screen of the terminal. At least one of the regions may be associated with a command that displays another page of the electronic document on the touch screen.
- the method may further include detecting, by a processor of the terminal, a touch selection of the at least one of the regions.
- the method may further include executing, in response to the touch selection, the command associated with the selected at least one of the regions.
- an apparatus for navigating an electronic document.
- the apparatus may include a touch screen including a plurality of regions.
- the apparatus may further include a storage comprising a detection module for detecting a tap gesture on the touch screen and for determining a selection of one of the regions based on a location of the tap gesture on the touch screen.
- the apparatus may further include a processor for executing, in response to the detected tap gesture, a command associated with the selected one of the regions.
- a computer-implemented method for activating an element of an electronic document.
- the method may include displaying the electronic document on a touch screen of a terminal.
- the electronic document may include the element.
- the method may further include detecting a touch-and-hold gesture on a location of the touch screen.
- the method may further include highlighting the element on the touch screen.
- the element may be displayed at the location of the detected touch-and-hold gesture.
- the method may further include detecting, subsequent to the highlighting, a release of the touch-and-hold gesture from the element.
- the method may further include activating, by a processor of the terminal, the element based on the release of the touch-and-hold gesture.
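The touch-and-hold activation sequence above (highlight the element once the touch becomes a hold, then activate it on release) can be sketched as a small state holder. The hold-duration threshold and the element-lookup callback are illustrative assumptions, not details from the disclosure.

```python
# Sketch of touch-and-hold activation of a document element such as a
# hyperlink. HOLD_THRESHOLD and the element_at callback are assumed.

HOLD_THRESHOLD = 0.5  # seconds a touch must persist to count as a hold

class ElementActivator:
    def __init__(self, element_at):
        self.element_at = element_at  # maps (x, y) -> element or None
        self.highlighted = None

    def on_touch(self, x, y, duration):
        """Highlight the element under the touch once it becomes a hold."""
        if duration >= HOLD_THRESHOLD:
            self.highlighted = self.element_at(x, y)
        return self.highlighted

    def on_release(self):
        """Activate (return) the highlighted element when released."""
        element, self.highlighted = self.highlighted, None
        return element  # caller activates it, e.g., follows the link
```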
- an apparatus for activating an element.
- the apparatus may include a touch screen for displaying an electronic document including the element.
- the apparatus may further include a storage.
- the storage may comprise a detection module for detecting a touch-and-hold gesture on a location of the touch screen.
- the element may be displayed at the location of the detected touch-and-hold gesture.
- the detection module may be for detecting a release of the touch-and-hold gesture from the element.
- the storage may further comprise a highlighting module for highlighting the element on the touch screen based on the detected touch-and-hold gesture.
- the apparatus may further include a processor for activating, based on the release of the touch-and-hold gesture, the element.
- a computer-implemented method for navigating an electronic document on a terminal.
- the method may include displaying, by a processor of the terminal, a page of a section of the electronic document on a touch screen of the terminal.
- the electronic document may include one or more sections. Each section may include one or more pages.
- the method may further include displaying a slider on a slide bar on the touch screen.
- the slide bar may include an expanded part corresponding to the displayed section of the electronic document. The expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document.
- the method may further include detecting a drag gesture on the touch screen.
- the drag gesture may include a movement of the slider on the slide bar.
- the method may further include detecting a release gesture of the slider on the touch screen.
- the method may further include displaying, on the touch screen, a target page of the electronic document based on the release gesture.
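The expanded slide bar described above can be modeled by giving the currently displayed section a fixed, disproportionately large share of the bar and mapping the slider's release position back to a target page. The 50% share for the current section and the proportional split of the remainder are assumed parameters for illustration.

```python
# Sketch of a slide bar whose part for the current section is
# disproportionately expanded. expanded_share is an assumed parameter.

def section_widths(sections, current, expanded_share=0.5):
    """Width of each section's part of the bar, summing to 1.0.

    sections: list of page counts per section.
    The current section gets expanded_share; the rest of the bar is
    split among the other sections in proportion to their page counts.
    """
    rest = 1.0 - expanded_share
    other_pages = sum(n for i, n in enumerate(sections) if i != current)
    return [
        expanded_share if i == current else rest * n / other_pages
        for i, n in enumerate(sections)
    ]

def slider_to_page(pos, sections, current, expanded_share=0.5):
    """Map a slider release position pos in [0, 1] to (section, page)."""
    widths = section_widths(sections, current, expanded_share)
    start = 0.0
    for i, w in enumerate(widths):
        if pos < start + w or i == len(widths) - 1:
            frac = (pos - start) / w
            page = min(int(frac * sections[i]), sections[i] - 1)
            return i, max(page, 0)
        start += w
```

With three sections of 10, 20, and 10 pages and section 1 displayed, section 1 occupies the middle half of the bar, so slider movement within it selects among its 20 pages at finer granularity than in the other sections.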
- a computer-implemented method for navigating an electronic document on a terminal.
- the method may include displaying a page of a section of the electronic document on a touch screen of the terminal.
- the method may further include displaying a slider on a slide bar on the touch screen.
- the slide bar may include an expanded part corresponding to the displayed section of the electronic document.
- the expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document.
- the method may further include detecting a drag gesture to move the slider on the slide bar.
- the method may further include displaying, by a processor of the terminal, another page of the electronic document on the touch screen based on the drag gesture.
- an apparatus for navigating an electronic document.
- the apparatus may include a touch screen for displaying a page of a section of the electronic document and for displaying a slider on a slide bar on the touch screen.
- the slide bar may include an expanded part corresponding to the displayed section of the electronic document. The expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document.
- the apparatus may further include a storage comprising a detection module for detecting a drag gesture to move the slider on the slide bar.
- the apparatus may further include a processor for displaying another page of the electronic document on the touch screen based on the drag gesture.
- a computer-implemented method for controlling a perceived brightness of an electronic document.
- the method may include displaying the electronic document on a touch screen of a terminal.
- the method may further include displaying a partially opaque graphical object in front of the electronic document.
- the perceived brightness of the electronic document through the partially opaque graphical object may be based on an opacity of the partially opaque graphical object.
- the opacity of the partially opaque graphical object may be adjustable from completely transparent to completely opaque.
- the method may further include detecting a swipe gesture on the touch screen.
- the method may further include controlling, by a processor of the terminal, the perceived brightness of the electronic document by adjusting the opacity of the partially opaque graphical object based on the swipe gesture.
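One way to realize the opacity adjustment above is to translate the vertical swipe displacement into an opacity delta and clamp the result to the fully-transparent-to-fully-opaque range. The sensitivity constant and the sign convention (upward swipe brightens, i.e., lowers overlay opacity) are assumptions for illustration.

```python
# Sketch of controlling perceived brightness via a partially opaque
# overlay in front of the document. sensitivity is an assumed tuning
# value; swipe_dy < 0 means an upward swipe (brighter).

def adjust_opacity(opacity, swipe_dy, screen_height, sensitivity=1.0):
    """Return the new overlay opacity, clamped to [0.0, 1.0].

    opacity: current overlay opacity (0.0 transparent .. 1.0 opaque).
    swipe_dy: vertical swipe displacement in pixels.
    """
    delta = sensitivity * swipe_dy / screen_height
    return min(1.0, max(0.0, opacity + delta))
```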
- a computer-implemented method for controlling a perceived brightness of an electronic document.
- the method may include displaying the electronic document on a touch screen of a terminal.
- the method may further include displaying a graphical object in front of the electronic document.
- the method may further include detecting a gesture on the touch screen.
- the method may further include controlling, by a processor of the terminal, the perceived brightness of the electronic document by adjusting an opacity of the graphical object based on the gesture.
- an apparatus for controlling a perceived brightness of an electronic document.
- the apparatus may include a touch screen for displaying the electronic document and for displaying a graphical object in front of the electronic document.
- the apparatus may further include a storage comprising a detection module for detecting a gesture on the touch screen.
- the apparatus may further include a processor for controlling the perceived brightness of the electronic document by adjusting an opacity of the graphical object based on the gesture.
- computer-readable media that store program instructions executable by a processor for performing any of the above-described methods.
- a computer-implemented method for controlling a brightness of an electronic document.
- the method may include displaying the electronic document on a touch screen of a terminal.
- the method may further include detecting a vertical swipe gesture on the touch screen.
- the method may further include controlling, by a processor of the terminal, the brightness of the electronic document based on the vertical swipe gesture.
- FIG. 1 is a diagram of an example of a system 100 for providing content, consistent with a disclosed embodiment.
- System 100 may provide functionality for providing an online catalog.
- the online catalog may provide access to catalogs that include electronic documents.
- a host server 110, catalog servers 120-140, and terminals 150-170 are connected to a network 180.
- One of ordinary skill in the art will appreciate that although one host server, three catalog servers, three terminals, and one network are depicted in FIG. 1, any number of these components may be provided.
- one or more components of system 100 may be combined and/or divided into subcomponents. For example, functionality provided by host server 110 and catalog server 120 may be combined or functionality provided by host server 110 may be subdivided across multiple components.
- Network 180 may enable communications between the various components in system 100 , such as host server 110 , catalog servers 120 - 140 , and terminals 150 - 170 .
- terminals 150 - 170 may access legacy systems (not shown) via network 180 , or may directly access legacy systems, data stores, or other network applications.
- Network 180 may include a shared, public, or private network; may encompass a wide area network or a local area network; and may be implemented through any suitable combination of wired and wireless networks.
- Network 180 may further comprise an intranet or the Internet.
- catalog server 120 may transmit catalogs and catalog information to host server 110 via network 180 .
- Host server 110 may transmit catalog information to one or more of terminals 150 - 170 via network 180 .
- catalog server 120 may transmit electronic catalogs and electronic documents to one or more of terminals 150 - 170 via network 180 .
- host server 110 may comprise a general purpose computer (e.g., a personal computer, network computer, server, or mainframe computer) having one or more processors that may be selectively activated or reconfigured by a computer program.
- Host server 110 may also be implemented in a distributed network.
- host server 110 may communicate via network 180 with one or more additional host servers (not shown), which may enable host server 110 to distribute processes for parallel execution by a plurality of host servers.
- host server 110 may be specially constructed for carrying out methods consistent with disclosed embodiments.
- host server 110 may constitute, for example, a web server accessible via the Internet.
- host server 110 may include a wireless application protocol (WAP) server accessible by wireless terminals.
- host server 110 may provide access to a web site and provide access to content.
- Host server 110 may include a processor 112 , such as a central processing unit (CPU). Host server 110 may include a memory 114 . Memory 114 may be, for example, a random access memory (RAM). Processor 112 may load active data and program modules into memory 114 for performing one or more processes for providing content.
- Host server 110 may include a data store 116 .
- Data store 116 may be a persistent storage including, for example, hard drives, flash drives, etc.
- Data store 116 may include one or more databases.
- data store 116 may constitute an online catalog data store and may store, for example, data regarding electronic catalogs and data regarding catalog servers 120 - 140 , which may host the electronic catalogs.
- Data store 116 may store unique identifiers (IDs) associated with the electronic catalogs.
- data store 116 may store catalog IDs associated with catalogs and catalog server IDs associated with catalog servers 120 - 140 .
- Data store 116 may continually receive and store new data regarding additional catalogs such that data store 116 is continually expanding and/or shrinking in scope and breadth.
- host server 110 may receive new catalog information from one or more of catalog servers 120 - 140 or from other catalog servers (not shown) over network 180 .
- information may be removed from data store 116 , such as when a catalog is no longer available or when one or more of catalog servers 120 - 140 become unavailable.
- catalog servers 120 - 140 may be any type of a general purpose computer (e.g., a personal computer, network computer, server, or mainframe computer) having one or more processors that may be selectively activated or reconfigured by a computer program.
- Catalog servers 120 - 140 may also be implemented in a distributed network.
- catalog server 120 may communicate via network 180 with catalog servers 130 - 140 , which may enable catalog server 120 to distribute processes for parallel execution by a plurality of catalog servers 130 - 140 .
- catalog server 120 may be specially constructed for carrying out methods consistent with disclosed embodiments.
- catalog server 120 may constitute, for example, a web server accessible via the Internet.
- catalog server 120 may include a wireless application protocol (WAP) server accessible by wireless terminals.
- catalog server 120 may provide access to a web site and provide access to content.
- Catalog server 120 may include a processor 122 , such as a central processing unit (CPU).
- Catalog server 120 may include a memory 124 .
- Memory 124 may be, for example, a random access memory (RAM).
- Processor 122 may load active data and program modules into memory 124 for performing one or more processes for providing content.
- Catalog server 120 may include a data store 126 .
- Data store 126 may be a persistent storage including, for example, hard drives, flash drives, etc.
- Data store 126 may include one or more databases.
- data store 126 may constitute a catalog data store and may store an electronic catalog and various electronic media available through the electronic catalog.
- the electronic media may include electronic documents, such as books for sale, rent, download, or viewing from an Internet site provided by catalog server 120 .
- data store 126 may store titles, authors, images of covers, images of the pages of the books, descriptions, prices, and genres, among others, associated with the electronic books.
- the electronic media may further include music and videos.
- Electronic documents stored in data store 126 may be associated, for example, with unique identifiers (IDs), such as a product ID or ISBN (International Standard Book Number), or any other identifier.
- data store 126 may store metadata in association with product IDs.
- the metadata may provide information describing and/or classifying an electronic document (e.g., classifying a novel by genre by associating metadata of “horror” with a product ID for a horror novel).
- catalog server 120 may include a web server for providing access to data store 126 .
- catalog server 120 may provide access to content (e.g., articles, media, etc.) stored in data store 126 that is accessible for free or that is accessible only to subscribers or other authorized users.
- Data store 126 may continually expand and/or shrink in scope and breadth. For example, information regarding a new electronic document (e.g., a book that was recently published and became available) may be added to data store 126 . Likewise, information regarding an existing electronic document may be removed from data store 126 , for example, if the electronic document is no longer available. In addition, the electronic catalog may be updated to reflect newly available electronic documents as well as unavailable electronic documents.
- Terminals 150 - 170 may be any type of device for communicating with host server 110 and catalog servers 120 - 140 over network 180 .
- terminals 150 - 170 may comprise a server, a personal computer, a laptop, a notebook, a portable device (e.g., a handheld device, a personal digital assistant, a phone, an electronic book reader, etc.), a set top box, or any other appropriate device capable of exchanging data with network 180 .
- FIG. 2 is a view of an example of terminal 150 .
- terminal 150 may be a computer including, for example, a server, a desktop, a laptop, a notebook, an electronic book reader, a phone, etc.
- Terminal 150 may comprise a body 202 and a display screen 204 .
- Display screen 204 may be any screen for displaying text and/or graphics including, for example, a liquid crystal display (LCD).
- Display screen 204 may include a touch screen such that terminal 150 is able to detect touches on the touch screen as user inputs.
- Terminal 150 may comprise one or more buttons 206 A and 206 B. Buttons 206 A and 206 B may have fixed commands. Alternatively, the commands associated with buttons 206 A and 206 B may be dynamically changed by terminal 150 . Furthermore, buttons 206 A and 206 B may be configurable by a user to perform specific functions.
- Terminal 150 may further comprise a port 208 for receiving electrical power including, for example, an alternating current (AC) port, a direct current (DC) port, and a universal serial bus (USB) port.
- port 208 may enable data communication.
- port 208 may constitute a USB port, IEEE 1394 port, etc.
- Port 208 may serve a dual purpose and be an electrical power port and a data communication port.
- Terminal 150 may comprise an audio port 210 for connecting, for example, a headphone and/or a microphone.
- Terminal 150 may comprise volume control buttons 212 A and 212 B.
- Terminal 150 may comprise a scroll button 214 for providing user input.
- Terminal 150 may further comprise a battery (not shown), a transmitter (not shown) and a receiver (not shown), such as an antenna, a speaker (not shown), and a removable storage slot (not shown).
- FIG. 3 is a diagram of examples of components of terminal 150 .
- Terminal 150 may include a processor 302, such as a CPU.
- Terminal 150 may include a memory 304 .
- Memory 304 may be, for example, a RAM.
- Terminal 150 may comprise a storage 306 .
- Storage 306 may be a persistent storage including, for example, a hard drive, a flash drive, etc.
- Storage 306 may store, for example, an electronic document received from catalog server 120 .
- Terminal 150 may comprise a removable storage 308 including, for example, a compact disk read only memory (CD-ROM), a flash memory card, etc.
- Terminal 150 may comprise a communication controller 310 that operates the transmitter and receiver to enable terminal 150 to communicate with, for example, host server 110 and catalog servers 120 - 140 via network 180 .
- Terminal 150 may comprise an input/output (I/O) controller 312 that detects user input from, for example, buttons 206A and 206B, volume control buttons 212A and 212B, and scroll button 214, and controls output to display screen 204 and the speaker of terminal 150.
- I/O controller 312 may detect touch screen gestures on display screen 204 in disclosed embodiments where display screen 204 is a touch screen.
- Terminal 150 may further comprise a video controller 314 for outputting graphics on display screen 204 to the user.
- Terminal 150 may comprise an audio controller 316 that operates audio port 210 and the speaker to input and output audio.
- These components (processor 302, memory 304, storage 306, removable storage 308, communication controller 310, I/O controller 312, video controller 314, and audio controller 316) and other components (not shown) of terminal 150 may be connected via a bus.
- FIG. 4 is a diagram of an example of a software architecture providing functionality for terminal 150, consistent with a disclosed embodiment.
- the software architecture may be stored in, for example, storage 306 or removable storage 308 .
- storage 306 may store instructions of program 402 , which when executed by processor 302 , perform one or more data processes.
- Program 402 may include a rendering module 402 A, a detection module 402 B, and a communication module 402 C.
- program 402 is shown as being stored in storage 306 , one of ordinary skill in the art will recognize that program 402 may be distributed across multiple components including removable storage 308 and components available over network 180 .
- program 402 may be loaded into memory 304 for execution by processor 302 .
- Rendering module 402 A may include instructions for rendering graphics to be displayed on display screen 204 .
- video controller 314 may execute the instructions of rendering module 402 A to control the graphics displayed on display screen 204 .
- Detection module 402 B may include instructions for receiving touch gestures on the touch screen of the display screen 204 .
- Various touch screen gestures are described in detail below in connection with FIGS. 5A-5C.
- Communication module 402 C may include instructions for enabling communications on terminal 150 .
- communication module 402 C may include instructions for operating the transmitter, the receiver, and port 208 (when used as a data port).
- communication module 402 C may comprise a receiver module and a transmitter module.
- FIGS. 5A, 5B, and 5C are diagrams of examples of touch screen gestures.
- the touch screen of terminal 150 may be capable of detecting touches on display screen 204 . Touches on display screen 204 may be performed by, for example, a human finger or a physical object such as a stylus.
- Terminal 150, through display screen 204, I/O controller 312, and detection module 402 B, may be capable of determining, for example, the location of a touch, a release of the touch, the duration of the touch, the strength of the touch, a movement of the touch (e.g., a drag), etc.
- the touch screen may be capable of being depressed, in which terminal 150 may be capable of detecting a press gesture on the display screen 204 .
- FIG. 5A shows an example of a tap gesture.
- a tap gesture includes a touch and a release, or alternatively, a press and a release.
- a user may perform a tap gesture by tapping on display screen 204 of terminal 150 .
- the user may touch the tip of a finger on location A on display screen 204 .
- a tap gesture typically involves a finger touching display screen 204 for a short duration.
- FIG. 5A also shows an example of a long tap gesture.
- a long tap gesture includes a touch, and then a hold, followed by a release.
- a long tap gesture is similar to a tap gesture except that a finger touches display screen 204 for a longer duration, i.e., the duration of the hold gesture.
- FIG. 5B shows an example of an upward swipe gesture.
- a swipe gesture involves a touch gesture, a drag gesture, and then a release gesture.
- the user may touch the tip of a finger at location B on display screen 204 , then drag the finger upward towards location C while touching display screen 204 , and then release at location C by lifting the finger off display screen 204 .
- Terminal 150 may detect the touch gesture at location B, the drag gesture upward towards location C, and the release gesture at location C, and thus register an upward swipe gesture.
- the upward swipe gesture need not be perfectly upright or even perfectly straight. Some variance may be allowed for such a gesture.
- FIG. 5C shows an example of a downward swipe gesture.
- the user may swipe down from location D to location E on display screen 204 , such that terminal 150 will register a downward swipe gesture.
- terminal 150 may detect swipe gestures in various directions, including horizontal (rightward and leftward) gestures and diagonal gestures.
- a swipe gesture may also be called a drag gesture.
- a drag gesture may be executed slower than a swipe gesture.
- a pinch gesture or a spread gesture may be executed by the user using two fingers (e.g., a thumb and an index finger). The user may execute a pinch gesture by executing two swipe gestures concurrently such that the directions of the two swipe gestures are directed towards each other. The user may execute a spread gesture by executing two swipe gestures concurrently such that the directions of the two swipe gestures are directed away from each other.
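- The single-touch gestures above can be sketched as a simple classifier over a touch's start and release points and times. This is an illustrative sketch only; the function name and thresholds are hypothetical and not part of the disclosure, and multi-touch pinch/spread detection is omitted:

```python
import math

# Hypothetical thresholds; the disclosure does not specify exact values.
LONG_TAP_SECONDS = 0.5   # hold duration distinguishing a long tap from a tap
SWIPE_MIN_PIXELS = 30    # displacement distinguishing a swipe from a tap

def classify_gesture(x0, y0, t0, x1, y1, t1):
    """Classify one touch-to-release sequence into a gesture name.

    (x0, y0, t0) is the touch location/time; (x1, y1, t1) is the
    release. Screen coordinates assume y increases downward.
    """
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < SWIPE_MIN_PIXELS:
        # Little movement: a tap or, if held long enough, a long tap.
        return "long_tap" if (t1 - t0) >= LONG_TAP_SECONDS else "tap"
    # A swipe need not be perfectly straight; the dominant axis decides.
    if abs(dy) >= abs(dx):
        return "swipe_up" if dy < 0 else "swipe_down"
    return "swipe_right" if dx > 0 else "swipe_left"
```

- A drag gesture could be distinguished from a swipe by additionally comparing the gesture's speed against a threshold, per the note above that a drag may be executed slower than a swipe.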
- FIG. 6 is an example of a screen 600 in portrait orientation including regions 602 A- 602 C.
- Screen 600 may be displayed on display screen 204 of terminal 150 in portrait orientation.
- a left region 602 A, a center region 602 B, and a right region 602 C may be defined within screen 600 .
- although three regions are shown in screen 600 , any number of regions may be defined, and each region may be any shape or size.
- Screen 600 may display an electronic document, which has been overlaid by regions 602 A- 602 C.
- Screen 600 may also include boundaries of regions 602 A- 602 C as well as commands assigned to regions 602 A- 602 C.
- left region 602 A may be assigned a command to display the previous page of the electronic document
- center region 602 B may be assigned a command to display or hide control objects
- right region 602 C may be assigned a command to display the next page of the electronic document.
- Any command that may be executed on terminal 150 may be assigned to any of regions 602 A- 602 C.
- One or more of regions 602 A- 602 C may be assigned no command.
- terminal 150 may provide default command assignments for regions 602 A- 602 C and/or the command assignments may be configurable by a user.
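- The region definitions and default command assignments described for screen 600 can be sketched as follows. The one-third split, region names, and command names are illustrative assumptions; the disclosure allows any number, shape, and size of regions and any command assignment:

```python
def define_regions(width):
    """Divide screen width into three side-by-side regions, as in FIG. 6.

    Returns {name: (x_min, x_max)} horizontal spans; vertical regions
    would divide the height instead.
    """
    third = width / 3
    return {
        "left": (0, third),
        "center": (third, 2 * third),
        "right": (2 * third, width),
    }

# Hypothetical default assignments; the description makes these
# user-configurable, and a region may also be assigned no command.
DEFAULT_COMMANDS = {
    "left": "previous_page",
    "center": "toggle_controls",
    "right": "next_page",
}

def command_for_tap(x, regions, commands):
    """Return the command assigned to the region containing x, if any."""
    for name, (x_min, x_max) in regions.items():
        if x_min <= x < x_max:
            return commands.get(name)  # may be None for an unassigned region
    return None
```

- A tap anywhere inside a region resolves to that region's command, which is why the regions can remain invisible while the document fills the screen.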
- FIG. 7 is an example of a screen 700 in landscape orientation including regions 702 A- 702 C.
- Screen 700 may be displayed on display screen 204 of terminal 150 in landscape orientation. Similar to screen 600 , screen 700 includes a left region 702 A, a center region 702 B, and a right region 702 C. As illustrated, the size and shape of regions 702 A- 702 C are different from regions 602 A- 602 C. The boundaries of regions 602 A- 602 C and regions 702 A- 702 C may be defined based on the size and shape of screen 600 and screen 700 , respectively.
- FIG. 8A is an example of a screen 810 displaying an electronic document.
- Screen 810 may be displayed on display screen 204 of terminal 150 .
- screen 810 includes the electronic document, but does not include any control objects, such as menus, buttons, scroll bars, etc.
- FIG. 8B is an example of a screen 820 displaying an electronic document and control objects.
- Screen 820 may be displayed on display screen 204 of terminal 150 .
- control objects may be overlaid in front of the displayed electronic document such that the electronic document is partially visible through transparent portions of the control objects.
- Control objects may include, for example, menus, buttons, scroll bars, informational displays, among others.
- FIG. 9 is a flow diagram of an example of routine 900 for executing a command based on a selected region, consistent with a disclosed embodiment.
- Routine 900 may implement processes according to one or more of program modules 402 A- 402 C.
- rendering module 402 A may display an electronic document on display screen 204 .
- the electronic document may be stored in, for example, storage 306 or removable storage 308 .
- Terminal 150 may have received the electronic document from catalog server 120 via network 180 .
- display screen 204 may display the electronic document with no control objects, as illustrated in FIG. 8A , to fully utilize the limited screen real estate for electronic document viewing.
- one or more of regions 602 A- 602 C may be defined within display screen 204 by processor 302 .
- Although routine 900 is described in connection with regions 602 A- 602 C of FIG. 6 , routine 900 may be implemented in a similar manner with respect to regions 702 A- 702 C of FIG. 7 .
- Regions 602 A- 602 C may be defined based on the size and shape of display screen 204 .
- regions 602 A- 602 C may be defined based on the size and shape of the window.
- the regions defined in screen 600 and the regions defined in screen 700 may vary in size and shape because the size and shape of screen 600 and screen 700 may vary.
- display screen 204 is divided into three horizontal regions 602 A- 602 C, as illustrated in FIGS. 6 and 7 .
- Horizontal regions 602 A- 602 C may be positioned side-by-side, whereas vertical regions may be positioned top-to-bottom.
- Defined regions 602 A- 602 C may be of same or similar shape and size, or they may be of different shapes and sizes. Other numbers of regions and other sizes and shapes of regions are possible (e.g., five vertical regions).
- terminal 150 may assign commands to each of regions 602 A- 602 C. Any command that may be executed by terminal 150 may be assigned to defined regions 602 A- 602 C. Terminal 150 may include default assignments of commands to regions 602 A- 602 C. In addition, the assignments of commands to regions 602 A- 602 C may be configurable by a user. The assignments of commands to the regions (including current assignments, default assignments, and user-configured assignments) may be stored in storage 306 .
- rendering module 402 A may display the boundaries of regions 602 A- 602 C on display screen 204 , as illustrated in FIG. 6 and similarly illustrated in FIG. 7 with respect to regions 702 A- 702 C.
- the boundaries of regions 602 A- 602 C may be displayed when the electronic document is first displayed on display screen 204 .
- the boundaries of regions 602 A- 602 C may be automatically hidden after a predefined delay or after a user taps on display screen 204 .
- the commands assigned to regions 602 A- 602 C may be displayed on display screen 204 along with the boundaries, as illustrated in FIG. 6 and similarly illustrated in FIG. 7 with respect to regions 702 A- 702 C.
- instructions may be displayed on display screen 204 that instruct the user how to input commands and control terminal 150 using regions 602 A- 602 C.
- the instructions may state, “Tap on a region to execute a command.”
- the instructions may not need to be displayed because the displaying of the regions and their assigned commands is self-explanatory.
- a user who wishes to execute one of the commands assigned to regions 602 A- 602 C may, for example, tap a finger or a stylus on display screen 204 within one of regions 602 A- 602 C associated with the desired command.
- the user may tap on the desired one of regions 602 A- 602 C even when the boundaries of regions 602 A- 602 C are not displayed. That is, even when only the electronic document is displayed on display screen 204 without any visible control objects and without any visible boundaries of regions 602 A- 602 C, as illustrated in FIG. 8A , the user may still tap on a portion of the displayed electronic document corresponding to a location within defined regions 602 A- 602 C, and terminal 150 will execute the command associated with the selected one of regions 602 A- 602 C.
- detection module 402 B may detect a tap gesture on display screen 204 .
- a user may tap on display screen 204 , which may be a touch screen, using, for example, a finger or a stylus.
- although routine 900 is described with respect to a tap gesture, detection module 402 B may detect any other types of touch screen gestures.
- although routine 900 is described with respect to display screen 204 being a touch screen, disclosed embodiments may be applicable to a conventional user input device (such as a mouse) and conventional display devices, including non-touch screen type displays.
- Detection module 402 B may further determine the location of the tap gesture on display screen 204 in relation to defined regions 602 A- 602 C.
- processor 302 may determine which of regions 602 A- 602 C was selected by the user based on the location of the detected tap gesture determined by detection module 402 B in relation to the boundaries of regions 602 A- 602 C. For example, if the location of the tap gesture falls within right region 602 C, then terminal 150 would determine that right region 602 C was selected by the user.
- processor 302 may execute the command assigned to the selected region. For example, if the user selected right region 602 C and a command to display the next page of the electronic document is assigned to right region 602 C, then processor 302 may execute that command to cause rendering module 402 A to display the next page of the electronic document on display screen 204 . Therefore, in one embodiment, a user may view the electronic document by utilizing the entire display screen 204 without any visible control objects and still navigate forward through the electronic document page by page simply by tapping within right region 602 C on display screen 204 .
- if the user selected center region 602 B and a command to display control objects is assigned to center region 602 B, then processor 302 may execute that command to cause rendering module 402 A to display control objects on display screen 204 , as illustrated in FIG. 8B .
- the displayed control objects may provide the user access to more commands than the set of commands assigned to defined regions 602 A- 602 C.
- the user may tap on center region 602 B while control objects are displayed, which will cause terminal 150 to hide the displayed control objects, returning display screen 204 to FIG. 8A .
- blocks 902 - 910 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 902 - 910 may be subdivided into multiple blocks.
- FIG. 10A is an example of a screen 1010 displaying an electronic document including a plurality of elements.
- Screen 1010 may be displayed on display screen 204 of terminal 150 .
- Elements may include any graphical object that may be displayed on display screen 204 .
- elements may include a word or a group of words (such as sentences, paragraphs, etc.), an image, a video, a hyperlink, an icon, etc.
- an element may include one or more subelements.
- a paragraph constituting an element may include multiple words, each word constituting a subelement.
- FIG. 10B is an example of a screen 1020 displaying an electronic document and a highlighted element.
- Screen 1020 may be displayed on display screen 204 of terminal 150 .
- Screen 1020 displays the same elements (i.e., same paragraphs) as screen 1010 , but in screen 1020 one of the displayed elements is highlighted (i.e., element 1022 is highlighted).
- FIG. 11 is an example of a screen 1100 including a simple view of element 1022 .
- a simple view may include one activated element.
- the simple view illustrated in screen 1100 includes element 1022 that is highlighted in screen 1020 , but the simple view does not include other paragraphs that are not highlighted in screen 1020 .
- Element 1022 may be displayed larger in the simple view compared to screen 1020 .
- the simple view may enable the user to select a subelement within element 1022 .
- the user may select subelement 1102 (i.e., the word "towards") within element 1022 .
- the simple view may include one or more action buttons related to element 1022 and subelement 1102 .
- the simple view may include a definition button 1104 for providing a definition of the selected subelement 1102 .
- the simple view may further include a close button 1108 , the selection of which will exit the simple view and return display screen 204 back to screen 1010 for viewing the electronic document.
- the simple view may include instructions 1106 informing the user how to select a subelement of element 1022 and how to pan and zoom element 1022 .
- the simple view may enable the user to pan and zoom element 1022 .
- the user may pan element 1022 by a drag gesture.
- the user may zoom element 1022 by using a pinch gesture and a spread gesture.
- a pinch gesture may cause terminal 150 to shrink the size of element 1022 displayed in the simple view, as illustrated in FIG. 12A .
- FIG. 12A is an example of a screen 1210 including a simple view of element 1022 displayed in a small size. Furthermore, a spread gesture may cause terminal 150 to enlarge the size of element 1022 displayed in the simple view, as illustrated in FIG. 12B .
- FIG. 12B is an example of a screen 1220 including a simple view of element 1022 displayed in a large size.
- FIG. 13 is a flow diagram of an example of a routine 1300 for activating an element, consistent with a disclosed embodiment. Routine 1300 may implement processes according to one or more of program modules 402 A- 402 C.
- rendering module 402 A may display an electronic document on display screen 204 , as illustrated in FIG. 10A .
- the displayed electronic document may include one or more elements—in this example, several paragraphs.
- detection module 402 B may detect a touch-and-hold gesture on display screen 204 . Detection module 402 B may also determine the location of the touch-and-hold gesture. Furthermore, processor 302 may determine an element that corresponds to the location of the touch-and-hold gesture based on the location of the element on display screen 204 , the corresponding element being a target element. For example, if the user touches a finger on the third paragraph displayed on display screen 204 and holds the finger on display screen 204 , then terminal 150 may determine that the target element is element 1022 , illustrated in FIG. 10B .
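- The determination of a target element from the touch-and-hold location can be sketched as a bounding-box hit test. The dictionary layout and function name are hypothetical stand-ins for whatever layout data rendering module 402 A maintains:

```python
def element_at(elements, x, y):
    """Return the element whose bounding box contains (x, y), if any.

    elements is a list of dicts with a "bbox" of (left, top, right,
    bottom) in screen coordinates (y increasing downward) -- an
    illustrative layout representation.
    """
    for element in elements:
        left, top, right, bottom = element["bbox"]
        if left <= x < right and top <= y < bottom:
            return element
    return None  # the touch did not land on any element
```

- In the example above, a touch held on the third paragraph's on-screen extent would resolve to that paragraph as the target element.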
- rendering module 402 A may highlight the target element by, for example, displaying a box or other shape around the target element.
- the box may be filled with a color.
- target element 1022 (i.e., the third paragraph) may be highlighted, as illustrated in FIG. 10B .
- the highlighting may inform the user which of the displayed elements terminal 150 has determined to be selected by the touch-and-hold gesture, and which will subsequently be activated upon release of the touch-and-hold gesture.
- the user may cancel the selection of the target element. For example, if the highlighted element is not the element that the user intended to select and activate, the user may drag his finger away from the highlighted element. If such gesture is executed by the user, then detection module 402 B may detect the drag gesture and rendering module 402 A may remove the highlighting to indicate cancellation of the element selection. The user may then release his finger off display screen 204 .
- detection module 402 B may detect the release gesture. Detection module 402 B may further determine the location of the release gesture, and processor 302 may confirm that the location corresponds to the target element.
- processor 302 may activate the target element.
- the activation may entail various actions. For example, if the target element is an image, then rendering module 402 A may display an enlarged view of the image. If the target element is a video, then rendering module 402 A may play the video. If the target element is a word, then rendering module 402 A may display a definition of the word. If the target element is a link to a section of the electronic document, then rendering module 402 A may display the linked section of the electronic document. If the target element is a hyperlink to a web site, then rendering module 402 A may display the web site associated with the hyperlink. If the target element is a paragraph, then rendering module 402 A may display a simple view of the paragraph. Various other forms of activation are possible. Also, terminal 150 may enable the user to configure how various types of target elements may be activated.
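- The activation dispatch described above can be sketched as a lookup table, including the user-configurable behavior mentioned at the end. All type and action names here are illustrative assumptions:

```python
# Hypothetical mapping from element type to the activation action named
# in the description; rendering module 402A would carry out the action.
ACTIVATION_ACTIONS = {
    "image": "display enlarged view",
    "video": "play video",
    "word": "display definition",
    "document_link": "display linked section",
    "hyperlink": "display web site",
    "paragraph": "display simple view",
}

def activation_for(element_type, overrides=None):
    """Return the activation action for an element type.

    overrides models the user-configurable assignments mentioned above;
    the contents of both dictionaries are illustrative only.
    """
    if overrides and element_type in overrides:
        return overrides[element_type]
    return ACTIVATION_ACTIONS.get(element_type)  # None if unconfigured
```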
- terminal 150 may display a simple view of target element 1022 , as illustrated in FIG. 11 .
- blocks 1302 - 1310 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 1302 - 1310 may be subdivided into multiple blocks.
- FIG. 14 is an example of a slide bar 1402 , consistent with a disclosed embodiment.
- Slide bar 1402 may include a slider 1404 , which may be moved from one end of slide bar 1402 to the other end of slide bar 1402 .
- Slide bar 1402 and slider 1404 may be displayed on display screen 204 of terminal 150 .
- Terminal 150 may enable a user to slide slider 1404 along slide bar 1402 using drag gestures.
- Slide bar 1402 may be used to enable a user to navigate through an electronic document.
- slide bar 1402 may be used to navigate through an electronic book comprising 11 chapters, wherein chapter 7 comprises 13 pages and the other chapters (1-6 and 8-11) comprise varying numbers of pages.
- slide bar 1402 may be disproportionately divided such that slide bar 1402 is not evenly divided in proportion to the division of the electronic document. For example, if the electronic book comprises 100 pages, slide bar 1402 would not be divided into 100 equal parts, each part corresponding to one page.
- slide bar 1402 may include an expanded part 1406 .
- Expanded part 1406 may constitute about half of slide bar 1402 .
- Expanded part 1406 may correspond to the currently-displayed chapter of the electronic book. In this example, if chapter 7 of the electronic document is being displayed on display screen 204 , then expanded part 1406 may correspond to chapter 7. As depicted in FIG. 14 , expanded part 1406 , which corresponds to chapter 7, occupies a disproportionately larger part of slide bar 1402 , while other chapters (i.e., chapters 1-6 and 8-11) occupy a disproportionately smaller part of slide bar 1402 . This may be so even if chapter 7 is not necessarily the longest chapter of the electronic document.
- expanded part 1406 is further divided into multiple parts, each part corresponding to a page in chapter 7. Expanded part 1406 may be evenly divided in proportion to the number of pages in chapter 7.
- the divisions of slide bar 1402 may be visible when slide bar 1402 is displayed on display screen 204 .
- these divisions may not be visible when slide bar 1402 is displayed on display screen 204 .
- only the boundaries of expanded part 1406 may be displayed, while chapter divisions are not displayed.
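- The disproportionate division of slide bar 1402 can be sketched as follows. This sketch assumes the expanded part takes half the bar and the remaining chapters split the rest equally; the disclosure leaves the exact proportions open:

```python
def bar_layout(num_chapters, current_chapter, expanded_fraction=0.5):
    """Divide slide bar 1402's length (normalized to [0, 1]) among chapters.

    The current chapter's expanded part takes expanded_fraction of the
    bar (about half, per the description); the remaining chapters split
    the rest equally -- a simplifying assumption.
    """
    other_width = (1.0 - expanded_fraction) / (num_chapters - 1)
    spans, pos = {}, 0.0
    for chapter in range(1, num_chapters + 1):
        width = expanded_fraction if chapter == current_chapter else other_width
        spans[chapter] = (pos, pos + width)
        pos += width
    return spans

def target_for_position(p, spans, current_chapter, pages_in_current):
    """Map a slider position p in [0, 1] to a target section.

    Inside the expanded part, the part is evenly divided by the current
    chapter's page count and the target is a page; outside it, the
    target is a chapter.
    """
    for chapter, (lo, hi) in spans.items():
        if lo <= p < hi or (chapter == len(spans) and p >= hi):
            if chapter != current_chapter:
                return ("chapter", chapter)
            page = min(pages_in_current,
                       int((p - lo) / (hi - lo) * pages_in_current) + 1)
            return ("page", chapter, page)
```

- With 11 chapters and chapter 7 current, chapters 1-6 and 8-11 each occupy a small fixed span while chapter 7's expanded part spans the middle half of the bar, matching the FIG. 14 example.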
- FIG. 15 is an example of a screen 1500 including slide bar 1402 and slider 1404 .
- Screen 1500 may be displayed on display screen 204 of terminal 150 .
- slider 1404 is located within expanded part 1406 of slide bar 1402 at a position corresponding to page 5 of chapter 7, which is being displayed on display screen 204 .
- Screen 1500 may further include a display of the chapter number and the page number corresponding to the displayed chapter and page.
- FIG. 16 is an example of a screen 1600 including slide bar 1402 and slider 1404 .
- Screen 1600 may be displayed on display screen 204 of terminal 150 .
- slider 1404 has been moved slightly to the right compared to FIG. 15 , such that slider 1404 is located within expanded part 1406 of slide bar 1402 at a position corresponding to page 11 of chapter 7.
- Slider 1404 is still within expanded part 1406 of slide bar 1402 .
- Screen 1600 may further include a display of a target section (i.e., page 11 of chapter 7) corresponding to the location of slider 1404 .
- FIG. 17 is an example of a screen 1700 including slide bar 1402 and slider 1404 .
- Screen 1700 may be displayed on display screen 204 of terminal 150 .
- slider 1404 has been moved even further to the right compared to FIGS. 15 and 16 , such that slider 1404 is located outside expanded part 1406 on slide bar 1402 at a position corresponding to chapter 9.
- Screen 1700 may further include a display of a target section (i.e., chapter 9) corresponding to the location of slider 1404 .
- FIG. 18 is a flow diagram of a routine 1800 for navigating an electronic document using slide bar 1402 and slider 1404 , consistent with a disclosed embodiment.
- Routine 1800 may implement processes according to one or more of program modules 402 A- 402 C.
- rendering module 402 A may display an electronic document on display screen 204 .
- page 5 of chapter 7 of the electronic document is displayed, and chapter 7 includes 13 pages.
- rendering module 402 A may display slide bar 1402 and slider 1404 on display screen 204 .
- the user may have executed a tap gesture on center region 602 B, which was detected by detection module 402 B and caused rendering module 402 A to display control objects including slide bar 1402 and slider 1404 overlaid in front of the electronic document.
- slider 1404 may be located on slide bar 1402 at a position corresponding to the currently displayed page and chapter of the electronic document, which is page 5 of chapter 7 in this example.
- rendering module 402 A may also display the page number and the chapter number of the currently displayed section of the electronic document. In this example, “Chapter 7, Page 5/13” is displayed on display screen 204 .
- the user may move slider 1404 displayed on display screen 204 using a drag gesture to navigate to a different section of the electronic document. For example, the user may drag slider 1404 to the right within expanded part 1406 to navigate to a different page of chapter 7.
- detection module 402 B may detect the drag gesture corresponding to a movement of slider 1404 along slide bar 1402 .
- rendering module 402 A may move the displayed slider 1404 on display screen 204 along with the detected drag gesture.
- rendering module 402 A may display a target page number based on the location of slider 1404 as slider 1404 is being dragged along slide bar 1402 within expanded part 1406 .
- in FIG. 16 , slider 1404 has been moved slightly to the right compared to FIG. 15 , and display screen 204 displays the target page number "Page 11/13."
- similar steps may be performed when slider 1404 is moved in a leftward direction.
- the user may drag slider 1404 even further outside of expanded part 1406 to navigate to a different chapter. If the user drags slider 1404 even further along slide bar 1402 such that slider 1404 is located outside expanded part 1406 corresponding to chapter 7, detection module 402 B may detect the drag gesture, and rendering module 402 A may update the target chapter number displayed on display screen 204 as slider 1404 is being dragged along slide bar 1402 outside expanded part 1406 and hide the target page number from display screen 204 . For example, in FIG. 17 , slider 1404 has been moved even further to the right compared to FIGS. 15 and 16 , outside of the boundaries of expanded part 1406 .
- rendering module 402 A has stopped displaying the target page number and updated the target chapter number from "Chapter 7" to "Chapter 9."
- similar steps may be performed when slider 1404 is moved leftward, outside of the left boundary of expanded part 1406 .
- rendering module 402 A may display the target page of the electronic document on display screen 204 based on the position of slider 1404 as the user is dragging slider 1404 along slide bar 1402 , even before slider 1404 is released by the user.
- the displayed target page corresponding to the current position of slider 1404 may be partially visible through the displayed control objects, as illustrated in FIGS. 16 and 17 .
- detection module 402 B may detect a release gesture over slider 1404 . For example, when the user has moved slider 1404 to a desired target section of the electronic document, the user may release slider 1404 .
- rendering module 402 A may display the target section of the electronic document in response to the release gesture based on the location of slider 1404 when it was released.
- if slider 1404 was released within expanded part 1406 at a position corresponding to page 11 of chapter 7, as in FIG. 16 , rendering module 402 A may display page 11 of chapter 7 of the electronic document on display screen 204 .
- if slider 1404 was instead released outside expanded part 1406 at a position corresponding to chapter 9, as in FIG. 17 , rendering module 402 A may display the first page of chapter 9 of the electronic document on display screen 204 .
- upon navigating to chapter 9, terminal 150 may associate expanded part 1406 with chapter 9 and calculate the proper divisions of slide bar 1402 such that the other chapters (1-8, 10, and 11) constitute the parts of slide bar 1402 not taken up by expanded part 1406 .
- blocks 1802 - 1812 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 1802 - 1812 may be subdivided into multiple blocks.
- FIG. 19A is an example of a screen 1910 including an electronic document.
- Screen 1910 may be displayed on display screen 204 of terminal 150 .
- the text of the electronic document is displayed in black color on a white background.
- FIG. 19B is an example of a screen 1920 including an electronic document and a partially opaque graphical object in front of the electronic document.
- the text of the electronic document is displayed in black color against a white background.
- a graphical object whose size is equal to the size of screen 1920 is overlaid in front of the displayed electronic document.
- the perceived brightness of the electronic document visible through the partially opaque graphical object in front of the electronic document may vary depending on the opacity of the graphical object.
- the opacity of the graphical object may be adjusted by a user.
- the opacity of the graphical object may be adjusted from completely transparent to completely opaque.
- the black text of the electronic document cannot appear any darker or dimmer because black is the darkest color.
- the white background of the electronic document may appear dimmer when viewed through the partially opaque graphical object. For example, as illustrated in FIG. 19B , the white background may appear gray in color.
- the text of the electronic document may be displayed in a white color on a black background, which is desirable to some users when reading at night.
- the black background may not appear any dimmer through the partially opaque graphical object, but the white text may appear dimmer through the partially opaque graphical object depending on the opacity of the graphical object.
- the white text may appear gray in color.
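- The dimming behavior described above is consistent with standard source-over alpha compositing, sketched below for a single color channel (the function name is hypothetical):

```python
def composite_over(background, overlay_alpha, overlay_color=0):
    """Perceived channel value of a document pixel seen through a
    partially opaque overlay (standard source-over compositing).

    background and overlay_color are channel values in 0-255;
    overlay_alpha is the overlay opacity in [0, 1]. With a black
    overlay, white (255) dims toward gray, while black (0) cannot get
    any darker -- matching the behavior described above.
    """
    return round(overlay_alpha * overlay_color
                 + (1 - overlay_alpha) * background)
```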
- FIG. 20 is a flow diagram of an example of a routine 2000 for controlling the perceived brightness of an electronic document, consistent with a disclosed embodiment.
- Routine 2000 may implement processes according to one or more of program modules 402 A- 402 C.
- rendering module 402 A may display an electronic document on display screen 204 .
- rendering module 402 A may display the text of the electronic document in black color against a white background, as illustrated in FIG. 19A .
- rendering module 402 A may also display a graphical object in front of the displayed electronic document on display screen 204 .
- the graphical object may be partially opaque such that the electronic document behind the graphical object is partially visible to a user through the graphical object.
- the opacity of the graphical object may initially be set to the lowest level (i.e., completely transparent) such that the perceived brightness of the displayed electronic document is the highest brightness level.
- storage 306 may store the last opacity level of the graphical object and rendering module 402 A may display the graphical object using the last opacity level stored in storage 306 .
- the user may control the opacity of the displayed graphical object, and thereby control the perceived brightness of the electronic document, through touch gestures.
- a vertical upward swipe gesture may be configured to increase the perceived brightness of the electronic document and, conversely, a vertical downward swipe gesture may be configured to decrease the perceived brightness of the electronic document.
- different touch gestures, buttons on display screen 204 , buttons 206 A and 206 B, scroll button 214 , or the like, may be used to control the perceived brightness of the electronic document.
- detection module 402 B may detect a swipe gesture on display screen 204 . Detection module 402 B may further detect, for example, the direction of the swipe gesture, the length of the swipe gesture, the location of the swipe gesture, and the speed of the swipe gesture.
- rendering module 402 A may adjust the opacity of the graphical object in response to the detected swipe gesture.
- the opacity of the graphical object may be adjusted based on one or more of the direction of the swipe gesture, the length of the swipe gesture, the location of the swipe gesture, the speed of the swipe gesture, the current opacity level of the graphical object, and the maximum and minimum opacity levels of the graphical object.
- in response to a downward swipe gesture on display screen 204 detected by detection module 402 B, rendering module 402 A may increase the opacity of the graphical object, thereby decreasing the perceived brightness of the electronic document. Conversely, rendering module 402 A may decrease the opacity of the graphical object, thereby increasing the perceived brightness of the electronic document, in response to an upward swipe gesture on display screen 204 detected by detection module 402 B.
- the opacity of the graphical object may be adjusted based on the length of the swipe gesture; that is, a long swipe may cause a large adjustment in opacity level, while a short swipe may cause a small adjustment in opacity level.
- the opacity of the graphical object may be adjusted based on the speed of the swipe gesture; that is, a fast swipe may cause a large adjustment in opacity level, while a slow swipe may cause a small adjustment in opacity level.
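- The swipe-to-opacity mapping can be sketched as follows, using direction and length per the examples above. The scaling by screen height and the 0.9 ceiling (which keeps the document partially visible) are illustrative assumptions:

```python
def adjust_opacity(current, direction, length, screen_height,
                   min_opacity=0.0, max_opacity=0.9):
    """Adjust overlay opacity from a swipe gesture (illustrative scaling).

    A downward swipe increases opacity (dimming the document); an
    upward swipe decreases it. The adjustment is proportional to the
    swipe length relative to the screen height, and the result is
    clamped to the allowed range.
    """
    delta = length / screen_height * (max_opacity - min_opacity)
    if direction == "up":
        delta = -delta  # upward swipe brightens by reducing opacity
    return max(min_opacity, min(max_opacity, current + delta))
```

- Speed-based scaling, as mentioned above, could be layered on by multiplying delta by a speed factor; this sketch uses length alone.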
- the amount of adjustment of the opacity of the graphical object may be based on other factors or a combination of many factors.
- terminal 150 may adjust the opacity of the graphical object after the swipe gesture has been fully executed and completed by the user.
- terminal 150 may adjust the opacity of the graphical object while the swipe gesture is being executed such that the user can stop the swipe gesture when the desired perceived brightness of the electronic document is achieved.
- blocks 2002 - 2008 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 2002 - 2008 may be subdivided into multiple blocks.
- although aspects of the disclosed embodiments are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer-readable media, such as secondary storage devices (for example, hard disks, floppy disks, or CD-ROM), other forms of RAM or ROM, USB media, DVD, or other optical drive media.
- Programs based on the written description and disclosed methods are within the skill of an experienced developer.
- the various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software.
- program sections or program modules can be designed in or by means of .Net Framework, .Net Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, HTML, HTML/AJAX combinations, XML, or HTML with included Java applets.
- One or more of such software sections or modules can be integrated into a computer system or existing e-mail or browser software.
Abstract
Description
- In today's information-centric world, handheld devices, such as cellular phones, smart phones, personal digital assistants (PDAs), and portable media players, among others, have proliferated rapidly throughout the general public and the business community. The computing power of these devices has increased and the networks on which handheld devices operate have seen dramatic growth. As a result of these advancements, these devices now provide access to a wide range of applications that were traditionally available via desktop computers and laptop computers. For example, wireless networks now provide handheld device users with access to email, media, and documents, such as web pages, articles, blogs, and electronic books.
- Selecting icons or entering text, while relatively easy to accomplish through a traditional input device (e.g., a keyboard or mouse), presents unique challenges for handheld device users. For example, handheld devices with touch screens typically require users to touch or tap an icon or graphical button to select a command or enter text. Users often need to tap these small icons or graphical buttons in a correct sequence to perform a command. The more taps that the user must undertake, the more likely an error will occur, often to the frustration of the user.
- Gestures often substitute for hardware controls on handheld devices having touch screens. The gestures should feel natural so they are easy to remember without physical labels to remind users what to do. The gestures must also not require too much dexterity, as fingertips are large and imprecise compared to a traditional mouse pointer. Furthermore, because screen size is limited, any graphical controls that remain visible on a touch screen, particularly when reading a document, unnecessarily consume screen real estate. The small screen also constrains the size of images and of formatted text that cannot be reflowed to fit the screen.
- In addition, when a user wishes to navigate within a document, such as an electronic book, the user may wish to jump to an arbitrary page or to the beginning of a chapter. Both navigational methods suffer when confined to traditional controls on a small screen. For example, manually entering a page number to jump to is cumbersome, and a continuous control, such as a slider, must be divided into many increments. Hitting a precise page of a book comprising many pages is virtually impossible when operating a slider control on a small screen. Still further, backlighting can appear glaring in the dark, ruin night vision, and cause eye strain. Handheld devices typically allow a user to control the brightness of the backlight, but only through device settings. To access these device settings, users must exit their applications, which is inconvenient. Therefore, there is a need for improved systems and methods that overcome the above problems.
- The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings:
-
FIG. 1 is a diagram of an example of a system for providing content; -
FIG. 2 is a view of an example of a terminal; -
FIG. 3 is a diagram of examples of components of a terminal; -
FIG. 4 is a diagram of an example of a software architecture providing functionality for a terminal; -
FIGS. 5A, 5B, and 5C are diagrams of examples of touch screen gestures; -
FIG. 6 is an example of a screen in a portrait orientation including regions; -
FIG. 7 is an example of a screen in a landscape orientation including regions; -
FIG. 8A is an example of a screen displaying an electronic document; -
FIG. 8B is an example of a screen displaying an electronic document and control objects; -
FIG. 9 is a flow diagram of an example of a routine for executing a command based on a selected region; -
FIG. 10A is an example of a screen displaying an electronic document; -
FIG. 10B is an example of a screen displaying an electronic document and a highlighted element; -
FIG. 11 is an example of a simple view of an element; -
FIG. 12A is an example of a screen including a simple view of an element displayed in a small size; -
FIG. 12B is an example of a screen including a simple view of an element displayed in a large size; -
FIG. 13 is a flow diagram of an example of a routine for activating an element; -
FIG. 14 is an example of a slide bar; -
FIG. 15 is an example of a screen including a slide bar and a slider; -
FIG. 16 is an example of a screen including a slide bar and a slider; -
FIG. 17 is an example of a screen including a slide bar and a slider; -
FIG. 18 is a flow diagram of an example of a routine for navigating an electronic document using a slide bar and a slider; -
FIG. 19A is an example of a screen including an electronic document; -
FIG. 19B is an example of a screen including an electronic document and a graphical object in front of the electronic document; and -
FIG. 20 is a flow diagram of an example of a routine for controlling the perceived brightness of an electronic document.
- The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar parts. While several exemplary embodiments are described herein, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the components illustrated in the drawings, and the exemplary methods described herein may be modified by substituting, reordering, removing, or adding steps to the disclosed methods. Accordingly, the following detailed description is not limiting of the disclosed embodiments. Instead, the proper scope is defined by the appended claims.
- Disclosed embodiments provide apparatuses and computer-implemented methods for providing interactive user interfaces. The apparatuses and methods may provide a touch screen capable of detecting touch gestures executed by a user while viewing an electronic document.
- Consistent with a disclosed embodiment, a computer-implemented method is provided for navigating an electronic document on a portable device. The method may include displaying the electronic document on a touch screen of the portable device. The method may further include defining a plurality of regions on the touch screen. The method may further include detecting a tap gesture on the touch screen. The method may further include determining a selection of one of the regions based on a location of the tap gesture on the touch screen. The method may further include executing, by a processor of the portable device, a command associated with the selected region.
- Consistent with another disclosed embodiment, a computer-implemented method is provided for navigating an electronic document on a terminal. The method may include defining a plurality of regions on a touch screen of the terminal. The method may further include detecting a tap gesture on the touch screen. The method may further include determining a selection of one of the regions based on a location of the tap gesture on the touch screen. The method may further include executing, by a processor of the terminal, in response to the detected tap gesture, a command associated with the selected region.
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for navigating an electronic document on a terminal. The method may include defining a plurality of regions of a touch screen of the terminal. At least one of the regions may be associated with a command that displays another page of the electronic document on the touch screen. The method may further include detecting, by a processor of the terminal, a touch selection of the at least one of the regions. The method may further include executing, in response to the touch selection, the command associated with the selected at least one of the regions.
- Consistent with yet another disclosed embodiment, an apparatus is provided for navigating an electronic document. The apparatus may include a touch screen including a plurality of regions. The apparatus may further include a storage comprising a detection module for detecting a tap gesture on the touch screen and for determining a selection of one of the regions based on a location of the tap gesture on the touch screen. The apparatus may further include a processor for executing, in response to the detected tap gesture, a command associated with the selected one of the regions.
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for activating an element of an electronic document. The method may include displaying the electronic document on a touch screen of a terminal. The electronic document may include the element. The method may further include detecting a touch-and-hold gesture on a location of the touch screen. The method may further include highlighting the element on the touch screen. The element may be displayed at the location of the detected touch-and-hold gesture. The method may further include detecting, subsequent to the highlighting, a release of the touch-and-hold gesture from the element. The method may further include activating, by a processor of the terminal, the element based on the release of the touch-and-hold gesture.
- Consistent with yet another disclosed embodiment, an apparatus is provided for activating an element. The apparatus may include a touch screen for displaying an electronic document including the element. The apparatus may further include a storage. The storage may comprise a detection module for detecting a touch-and-hold gesture on a location of the touch screen. The element may be displayed at the location of the detected touch-and-hold gesture. The detection module may further be for detecting a release of the touch-and-hold gesture from the element. The storage may further comprise a highlighting module for highlighting the element on the touch screen based on the detected touch-and-hold gesture. The apparatus may further include a processor for activating, based on the release of the touch-and-hold gesture, the element.
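The touch-and-hold activation flow above can be sketched as a small state machine. This is an illustrative sketch only: the class name, the callback shape, and the rule that releasing away from the highlighted element cancels activation are assumptions made for the example.

```python
class ElementActivator:
    """Sketch of touch-and-hold activation: a hold highlights the
    element under the finger, and a release while that element is still
    under the finger activates it (e.g., follows a hyperlink)."""

    def __init__(self, element_at_location):
        # element_at_location: callable mapping (x, y) -> element id or None
        self.element_at_location = element_at_location
        self.highlighted = None

    def on_hold(self, x, y):
        # Detected touch-and-hold: highlight whatever element is here.
        self.highlighted = self.element_at_location(x, y)
        return self.highlighted  # caller renders the highlight

    def on_release(self, x, y):
        # Activate only if the release occurs on the highlighted element.
        element = self.element_at_location(x, y)
        activated = element if element is not None and element == self.highlighted else None
        self.highlighted = None  # clear the highlight either way
        return activated
```

A release at a different location returns `None`, modeling the user sliding off the element to abort the activation.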
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for navigating an electronic document on a terminal. The method may include displaying, by a processor of the terminal, a page of a section of the electronic document on a touch screen of the terminal. The electronic document may include one or more sections. Each section may include one or more pages. The method may further include displaying a slider on a slide bar on the touch screen. The slide bar may include an expanded part corresponding to the displayed section of the electronic document. The expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document. The method may further include detecting a drag gesture on the touch screen. The drag gesture may include a movement of the slider on the slide bar. The method may further include detecting a release gesture of the slider on the touch screen. The method may further include displaying, on the touch screen, a target page of the electronic document based on the release gesture.
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for navigating an electronic document on a terminal. The method may include displaying a page of a section of the electronic document on a touch screen of the terminal. The method may further include displaying a slider on a slide bar on the touch screen. The slide bar may include an expanded part corresponding to the displayed section of the electronic document. The expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document. The method may further include detecting a drag gesture to move the slider on the slide bar. The method may further include displaying, by a processor of the terminal, another page of the electronic document on the touch screen based on the drag gesture.
- Consistent with yet another disclosed embodiment, an apparatus is provided for navigating an electronic document. The apparatus may include a touch screen for displaying a page of a section of the electronic document and for displaying a slider on a slide bar on the touch screen. The slide bar may include an expanded part corresponding to the displayed section of the electronic document. The expanded part may be disproportionately larger than other parts of the slide bar corresponding to other sections of the electronic document. The apparatus may further include a storage comprising a detection module for detecting a drag gesture to move the slider on the slide bar. The apparatus may further include a processor for displaying another page of the electronic document on the touch screen based on the drag gesture.
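One way the disproportionately expanded slide bar could map slider positions to pages is sketched below. The function names, the choice of a fixed expanded fraction, and the proportional split of the remaining bar among the other sections are assumptions made for illustration, not details of the disclosed embodiments.

```python
def section_extents(page_counts, current, expanded_frac=0.5):
    """Each section's share of the slide bar, as (start, end) fractions.

    The currently displayed section is expanded to expanded_frac of the
    bar regardless of its page count, giving it finer page-level
    resolution; the remaining sections share the rest of the bar in
    proportion to their page counts.
    """
    other_pages = sum(n for i, n in enumerate(page_counts) if i != current)
    extents, start = [], 0.0
    for i, n in enumerate(page_counts):
        if i == current:
            width = expanded_frac if other_pages else 1.0
        else:
            width = (1 - expanded_frac) * n / other_pages
        extents.append((start, start + width))
        start += width
    return extents

def page_at(slider_frac, page_counts, current, expanded_frac=0.5):
    """Map a slider position in [0, 1] to a (section, page) pair."""
    for i, (lo, hi) in enumerate(
            section_extents(page_counts, current, expanded_frac)):
        if lo <= slider_frac < hi:
            within = (slider_frac - lo) / (hi - lo)
            return i, int(within * page_counts[i])
    last = len(page_counts) - 1
    return last, page_counts[last] - 1  # slider released at the far end
```

For a three-section book with 10, 20, and 10 pages whose middle section is displayed, the middle section occupies half the bar, so each slider increment inside it corresponds to fewer pages than in the compressed outer sections.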
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for controlling a perceived brightness of an electronic document. The method may include displaying the electronic document on a touch screen of a terminal. The method may further include displaying a partially opaque graphical object in front of the electronic document. The perceived brightness of the electronic document through the partially opaque graphical object may be based on an opacity of the partially opaque graphical object. The opacity of the partially opaque graphical object may be adjustable from completely transparent to completely opaque. The method may further include detecting a swipe gesture on the touch screen. The method may further include controlling, by a processor of the terminal, the perceived brightness of the electronic document by adjusting the opacity of the partially opaque graphical object based on the swipe gesture.
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for controlling a perceived brightness of an electronic document. The method may include displaying the electronic document on a touch screen of a terminal. The method may further include displaying a graphical object in front of the electronic document. The method may further include detecting a gesture on the touch screen. The method may further include controlling, by a processor of the terminal, the perceived brightness of the electronic document by adjusting an opacity of the graphical object based on the gesture.
- Consistent with yet another disclosed embodiment, an apparatus is provided for controlling a perceived brightness of an electronic document. The apparatus may include a touch screen for displaying the electronic document and for displaying a graphical object in front of the electronic document. The apparatus may further include a storage comprising a detection module for detecting a gesture on the touch screen. The apparatus may further include a processor for controlling the perceived brightness of the electronic document by adjusting an opacity of the graphical object based on the gesture.
- Consistent with other disclosed embodiments, computer-readable media are provided that store program instructions executable by a processor for performing any of the above-described methods.
- Consistent with yet another disclosed embodiment, a computer-implemented method is provided for controlling a brightness of an electronic document. The method may include displaying the electronic document on a touch screen of a terminal. The method may further include detecting a vertical swipe gesture on the touch screen. The method may further include controlling, by a processor of the terminal, the brightness of the electronic document based on the vertical swipe gesture.
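The perceived brightness through the partially opaque graphical object follows standard alpha compositing, which can be sketched as follows. The function name and parameter names are illustrative assumptions; the formula is the usual "over" compositing rule for an overlay of opacity a in front of the document.

```python
def perceived_luminance(document_luminance, overlay_opacity,
                        overlay_luminance=0.0):
    """Luminance the user perceives through a partially opaque overlay.

    Standard alpha compositing: result = overlay*a + document*(1 - a).
    With a black overlay (luminance 0), opacity 0 leaves the document
    unchanged and opacity 1 blacks it out completely.
    """
    a = overlay_opacity
    return overlay_luminance * a + document_luminance * (1 - a)
```

This is why adjusting only the overlay's opacity, as in the swipe-based embodiments above, is sufficient to sweep the perceived brightness from the document's full brightness down to black without touching the device's backlight settings.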
-
FIG. 1 is a diagram of an example of a system 100 for providing content, consistent with a disclosed embodiment. System 100 may provide functionality for providing an online catalog. The online catalog may provide access to catalogs that include electronic documents. As shown in system 100, a host server 110, catalog servers 120-140, and terminals 150-170 are connected to a network 180. One of ordinary skill in the art will appreciate that although one host server, three catalog servers, three terminals, and one network are depicted in FIG. 1, any number of these components may be provided. Furthermore, one of ordinary skill in the art will recognize that one or more components of system 100 may be combined and/or divided into subcomponents. For example, functionality provided by host server 110 and catalog server 120 may be combined, or functionality provided by host server 110 may be subdivided across multiple components. -
Network 180 may enable communications between the various components in system 100, such as host server 110, catalog servers 120-140, and terminals 150-170. In addition, terminals 150-170 may access legacy systems (not shown) via network 180, or may directly access legacy systems, data stores, or other network applications. Network 180 may include a shared, public, or private network; may encompass a wide area network or a local area network; and may be implemented through any suitable combination of wired and wireless networks. Network 180 may further comprise an intranet or the Internet. For example, catalog server 120 may transmit catalogs and catalog information to host server 110 via network 180. Host server 110 may transmit catalog information to one or more of terminals 150-170 via network 180. Furthermore, catalog server 120 may transmit electronic catalogs and electronic documents to one or more of terminals 150-170 via network 180. - In one embodiment,
host server 110 may comprise a general purpose computer (e.g., a personal computer, network computer, server, or mainframe computer) having one or more processors that may be selectively activated or reconfigured by a computer program. Host server 110 may also be implemented in a distributed network. For example, host server 110 may communicate via network 180 with one or more additional host servers (not shown), which may enable host server 110 to distribute processes for parallel execution by a plurality of host servers. Alternatively, host server 110 may be specially constructed for carrying out methods consistent with disclosed embodiments. - In one embodiment,
host server 110 may constitute, for example, a web server accessible via the Internet. In another embodiment, host server 110 may include a wireless application protocol (WAP) server accessible by wireless terminals. For example, host server 110 may provide access to a web site and provide access to content. -
Host server 110 may include a processor 112, such as a central processing unit (CPU). Host server 110 may include a memory 114. Memory 114 may be, for example, a random access memory (RAM). Processor 112 may load active data and program modules into memory 114 for performing one or more processes for providing content. -
Host server 110 may include a data store 116. Data store 116 may be a persistent storage including, for example, hard drives, flash drives, etc. Data store 116 may include one or more databases. In one embodiment, data store 116 may constitute an online catalog data store and may store, for example, data regarding electronic catalogs and data regarding catalog servers 120-140, which may host the electronic catalogs. Data store 116 may store unique identifiers (IDs) associated with the electronic catalogs. For example, data store 116 may store catalog IDs associated with catalogs and catalog server IDs associated with catalog servers 120-140. -
Data store 116 may continually receive and store new data regarding additional catalogs such that data store 116 is continually expanding and/or shrinking in scope and breadth. For example, host server 110 may receive new catalog information from one or more of catalog servers 120-140 or from other catalog servers (not shown) over network 180. Likewise, information may be removed from data store 116, such as when a catalog is no longer available or when one or more of catalog servers 120-140 become unavailable. - In one embodiment, catalog servers 120-140 may be any type of a general purpose computer (e.g., a personal computer, network computer, server, or mainframe computer) having one or more processors that may be selectively activated or reconfigured by a computer program. Catalog servers 120-140 may also be implemented in a distributed network. For example,
catalog server 120 may communicate via network 180 with catalog servers 130-140, which may enable catalog server 120 to distribute processes for parallel execution by a plurality of catalog servers 130-140. Alternatively, catalog server 120 may be specially constructed for carrying out methods consistent with disclosed embodiments. - The following discussion with respect to
catalog server 120 may apply to catalog servers 130-140. In one embodiment, catalog server 120 may constitute, for example, a web server accessible via the Internet. In another embodiment, catalog server 120 may include a wireless application protocol (WAP) server accessible by wireless terminals. For example, catalog server 120 may provide access to a web site and provide access to content. -
Catalog server 120 may include a processor 122, such as a central processing unit (CPU). Catalog server 120 may include a memory 124. Memory 124 may be, for example, a random access memory (RAM). Processor 122 may load active data and program modules into memory 124 for performing one or more processes for providing content. -
Catalog server 120 may include a data store 126. Data store 126 may be a persistent storage including, for example, hard drives, flash drives, etc. Data store 126 may include one or more databases. In one embodiment, data store 126 may constitute a catalog data store and may store an electronic catalog and various electronic media available through the electronic catalog. For example, the electronic media may include electronic documents, such as books for sale, rent, download, or viewing from an Internet site provided by catalog server 120. For example, data store 126 may store titles, authors, images of covers, images of the pages of the books, descriptions, prices, and genres, among others, associated with the electronic books. The electronic media may further include music and videos. - Electronic documents stored in
data store 126 may be associated, for example, with unique identifiers (IDs), such as a product ID or ISBN (International Standard Book Number), or any other identifier. Additionally, data store 126 may store metadata in association with product IDs. For example, the metadata may provide information describing and/or classifying an electronic document (e.g., classifying a novel by genre by associating metadata of "horror" with a product ID for a horror novel). - Consistent with other embodiments,
catalog server 120 may include a web server for providing access to data store 126. For example, in one embodiment, catalog server 120 may provide access to content (e.g., articles, media, etc.) stored in data store 126 that is accessible for free or that is accessible to subscribers or other authorized users. -
Data store 126 may continually expand and/or shrink in scope and breadth. For example, information regarding a new electronic document (e.g., a book that was recently published and became available) may be added to data store 126. Likewise, information regarding an existing electronic document may be removed from data store 126, for example, if the electronic document is no longer available. In addition, the electronic catalog may be updated to reflect newly available electronic documents as well as unavailable electronic documents. - Terminals 150-170 may be any type of device for communicating with
host server 110 and catalog servers 120-140 over network 180. For example, terminals 150-170 may comprise a server, a personal computer, a laptop, a notebook, a portable device (e.g., a handheld device, a personal digital assistant, a phone, an electronic book reader, etc.), a set top box, or any other appropriate device capable of exchanging data with network 180. -
FIG. 2 is a view of an example of terminal 150. The following description of terminal 150 may apply to terminals 160-170. As mentioned above, although terminal 150 is depicted as a small and portable device, terminal 150 may be a computer including, for example, a server, a desktop, a laptop, a notebook, an electronic book reader, a phone, etc. Terminal 150 may comprise a body 202 and a display screen 204. Display screen 204 may be any screen for displaying text and/or graphics including, for example, a liquid crystal display (LCD). Display screen 204 may include a touch screen such that terminal 150 is able to detect touches on the touch screen as user inputs. Terminal 150 may comprise one or more buttons 203A and 203B for providing user inputs to terminal 150. -
Terminal 150 may further comprise a port 208 for receiving electrical power including, for example, an alternating current (AC) port, a direct current (DC) port, and a universal serial bus (USB) port. Alternatively, port 208 may enable data communication. For example, port 208 may constitute a USB port, IEEE 1394 port, etc. Port 208 may serve a dual purpose and be both an electrical power port and a data communication port. Terminal 150 may comprise an audio port 210 for connecting, for example, a headphone and/or a microphone. Terminal 150 may comprise volume control buttons. Terminal 150 may comprise a scroll button 214 for providing user input. Terminal 150 may further comprise a battery (not shown), a transmitter (not shown), a receiver (not shown), such as an antenna, a speaker (not shown), and a removable storage slot (not shown). -
FIG. 3 is a diagram of examples of components of terminal 150. Terminal 150 may include a processor 302, such as a CPU. Terminal 150 may include a memory 304. Memory 304 may be, for example, a RAM. Terminal 150 may comprise a storage 306. Storage 306 may be a persistent storage including, for example, a hard drive, a flash drive, etc. Storage 306 may store, for example, an electronic document received from catalog server 120. Terminal 150 may comprise a removable storage 308 including, for example, a compact disk read only memory (CD-ROM), a flash memory card, etc. -
Terminal 150 may comprise a communication controller 310 that operates the transmitter and receiver to enable terminal 150 to communicate with, for example, host server 110 and catalog servers 120-140 via network 180. Terminal 150 may comprise an input/output (I/O) controller 312 that detects user inputs from, for example, buttons 203A and 203B and the volume control buttons, and that provides output to the user via display screen 204 and the speaker of terminal 150. I/O controller 312 may detect touch screen gestures on display screen 204 in disclosed embodiments where display screen 204 is a touch screen. -
Terminal 150 may further comprise a video controller 314 for outputting graphics on display screen 204 to the user. Terminal 150 may comprise an audio controller 316 that operates audio port 210 and the speaker to input and output audio. These components (processor 302, memory 304, storage 306, removable storage 308, communication controller 310, I/O controller 312, video controller 314, and audio controller 316) and other components (not shown) of terminal 150 may be connected via a bus 308. -
FIG. 4 is a diagram of an example of a software architecture providing functionality for terminal 150, consistent with a disclosed embodiment. The software architecture may be stored in, for example, storage 306 or removable storage 308. - In one embodiment,
storage 306 may store instructions of program 402, which, when executed by processor 302, perform one or more data processes. Program 402 may include a rendering module 402A, a detection module 402B, and a communication module 402C. Although program 402 is shown as being stored in storage 306, one of ordinary skill in the art will recognize that program 402 may be distributed across multiple components including removable storage 308 and components available over network 180. In one embodiment, program 402 may be loaded into memory 304 for execution by processor 302. -
Rendering module 402A may include instructions for rendering graphics to be displayed on display screen 204. For example, video controller 314 may execute the instructions of rendering module 402A to control the graphics displayed on display screen 204. Detection module 402B may include instructions for receiving touch gestures on the touch screen of display screen 204. Various touch screen gestures are described in detail below in connection with FIG. 5. Communication module 402C may include instructions for enabling communications on terminal 150. For example, communication module 402C may include instructions for operating the transmitter, the receiver, and port 208 (when used as a data port). - Although program modules 402A-402C have been described above as being separate modules, functionalities provided by one or more modules may be combined in keeping with the spirit and scope of the disclosed embodiments. For example, communication module 402C may comprise a receiver module and a transmitter module. -
FIGS. 5A, 5B, and 5C are diagrams of examples of touch screen gestures. The touch screen of terminal 150 may be capable of detecting touches on display screen 204. Touches on display screen 204 may be performed by, for example, a human finger or a physical object such as a stylus. Terminal 150, through display screen 204, I/O controller 312, and detection module 402B, may be capable of determining, for example, the location of a touch, a release of the touch, the duration of the touch, the strength of the touch, a movement of the touch (e.g., a drag), etc. Furthermore, the touch screen may be capable of being depressed, in which case terminal 150 may be capable of detecting a press gesture on display screen 204. -
FIG. 5A shows an example of a tap gesture. A tap gesture includes a touch and a release, or alternatively, a press and a release. A user may perform a tap gesture by tapping on display screen 204 of terminal 150. For example, as illustrated in FIG. 5A, the user may touch the tip of a finger on location A on display screen 204. A tap gesture typically involves a finger touching display screen 204 for a short duration. FIG. 5A also shows an example of a long tap gesture. A long tap gesture includes a touch, then a hold, followed by a release. A long tap gesture is similar to a tap gesture except that the finger touches display screen 204 for a longer duration, i.e., the duration of the hold gesture. -
FIG. 5B shows an example of an upward swipe gesture. A swipe gesture involves a touch gesture, a drag gesture, and then a release gesture. For example, as illustrated in FIG. 5B, the user may touch the tip of a finger at location B on display screen 204, then drag the finger upward towards location C while touching display screen 204, and then release at location C by lifting the finger off display screen 204. Terminal 150 may detect the touch gesture at location B, the drag gesture upward towards location C, and the release gesture at location C, and thus register an upward swipe gesture. The upward swipe gesture need not be perfectly upright or even perfectly straight. Some variance may be allowed for such a gesture. -
FIG. 5C shows an example of a downward swipe gesture. For example, the user may swipe down from location D to location E on display screen 204, such that terminal 150 will register a downward swipe gesture. - In addition to the above-described upward and downward swipe gestures, the user may execute other swipe gestures in various directions, including horizontal (rightward and leftward) gestures and diagonal gestures. Furthermore, a swipe gesture may also be called a drag gesture. Typically, a drag gesture may be executed more slowly than a swipe gesture. Furthermore, a pinch gesture or a spread gesture may be executed by the user using two fingers (e.g., a thumb and an index finger). The user may execute a pinch gesture by executing two swipe gestures concurrently such that the directions of the two swipe gestures are directed towards each other. The user may execute a spread gesture by executing two swipe gestures concurrently such that the directions of the two swipe gestures are directed away from each other.
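The single-finger gestures described above can be distinguished from a completed touch's duration and total displacement, as in the following sketch. The thresholds, function name, and returned labels are illustrative assumptions, not values specified by the disclosed embodiments.

```python
def classify_gesture(duration_ms, dx, dy,
                     tap_max_ms=300, move_tolerance_px=10):
    """Classify a completed single-finger gesture.

    duration_ms is the time between touch and release; (dx, dy) is the
    total displacement in pixels, with positive dy pointing down the
    screen. A small displacement is a tap or long tap depending on
    duration; a larger one is a swipe along the dominant axis, which
    allows the variance noted above (a swipe need not be perfectly
    straight).
    """
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= move_tolerance_px:
        return "tap" if duration_ms <= tap_max_ms else "long_tap"
    if abs(dy) >= abs(dx):
        return "swipe_down" if dy > 0 else "swipe_up"
    return "swipe_right" if dx > 0 else "swipe_left"
```

A pinch or spread recognizer would apply the same idea to two concurrent touches and compare whether their displacement vectors point towards or away from each other.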
-
FIG. 6 is an example of a screen 600 in portrait orientation including regions 602A-602C. Screen 600 may be displayed on display screen 204 of terminal 150 in portrait orientation. In one embodiment, a left region 602A, a center region 602B, and a right region 602C may be defined within screen 600. Although three regions are shown in screen 600, any number of regions may be defined and each region may be any shape or size. -
Screen 600 may display an electronic document, which has been overlaid by regions 602A-602C. Screen 600 may also include boundaries of regions 602A-602C as well as commands assigned to regions 602A-602C. For example, left region 602A may be assigned a command to display the previous page of the electronic document, center region 602B may be assigned a command to display or hide control objects, and right region 602C may be assigned a command to display the next page of the electronic document. Any command that may be executed on terminal 150 may be assigned to any of regions 602A-602C. One or more of regions 602A-602C may be assigned no command. Furthermore, terminal 150 may provide default command assignments for regions 602A-602C and/or the command assignments may be configurable by a user. -
FIG. 7 is an example of a screen 700 in landscape orientation including regions 702A-702C. Screen 700 may be displayed on display screen 204 of terminal 150 in landscape orientation. Similar to screen 600, screen 700 includes a left region 702A, a center region 702B, and a right region 702C. As illustrated, the size and shape of regions 702A-702C are different from regions 602A-602C. The boundaries of regions 602A-602C and regions 702A-702C may be defined based on the size and shape of screen 600 and screen 700, respectively. -
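The region definitions above can be sketched as a function of the screen (or window) dimensions; recomputing it on rotation yields the differently shaped regions of screen 600 and screen 700. Equal side-by-side thirds are an assumption for this sketch — the disclosure allows any number, size, and shape of regions:

```python
# Hypothetical sketch of defining left/center/right regions from the display
# (or window) dimensions. Equal thirds are an illustrative assumption.
def define_regions(width, height):
    """Return regions as name -> (x, y, w, h) rectangles."""
    third = width / 3.0
    return {
        "left":   (0.0,       0.0, third, float(height)),
        "center": (third,     0.0, third, float(height)),
        "right":  (2 * third, 0.0, third, float(height)),
    }
```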
FIG. 8A is an example of a screen 810 displaying an electronic document. Screen 810 may be displayed on display screen 204 of terminal 150. In one embodiment, screen 810 includes the electronic document, but does not include any control objects, such as menus, buttons, scroll bars, etc. -
FIG. 8B is an example of a screen 820 displaying an electronic document and control objects. Screen 820 may be displayed on display screen 204 of terminal 150. In one embodiment, control objects may be overlaid in front of the displayed electronic document such that the electronic document is partially visible through transparent portions of the control objects. Control objects may include, for example, menus, buttons, scroll bars, and informational displays, among others. -
FIG. 9 is a flow diagram of an example of routine 900 for executing a command based on a selected region, consistent with a disclosed embodiment.Routine 900 may implement processes according to one or more ofprogram modules 402A-402C. - At the start of routine 900, in
block 902, rendering module 402A may display an electronic document on display screen 204. The electronic document may be stored in, for example, storage 306 or removable storage 308. Terminal 150 may have received the electronic document from catalog server 120 via network 180. In one embodiment, display screen 204 may display the electronic document with no control objects, as illustrated in FIG. 8A, to fully utilize the limited screen real estate for electronic document viewing. - In
block 904, one or more of regions 602A-602C may be defined within display screen 204 by processor 302. Although routine 900 is described in connection with regions 602A-602C of FIG. 6, routine 900 may be implemented in a similar manner with respect to regions 702A-702C of FIG. 7. Regions 602A-602C may be defined based on the size and shape of display screen 204. In an alternative embodiment in which not all of the available space on display screen 204 is used for displaying the electronic document, but rather only a portion of display screen 204 is dedicated to a window for displaying the electronic document, regions 602A-602C may be defined based on the size and shape of the window. For example, the regions defined in screen 600 and the regions defined in screen 700 may vary in size and shape because the size and shape of screen 600 and screen 700 may vary. - In one embodiment,
display screen 204 is divided into three horizontal regions 602A-602C, as illustrated in FIGS. 6 and 7. Horizontal regions 602A-602C may be positioned side-by-side, whereas vertical regions may be positioned top-to-bottom. Defined regions 602A-602C may be of the same or similar shape and size, or they may be of different shapes and sizes. Other numbers of regions and other sizes and shapes of regions are possible (e.g., five vertical regions). - Furthermore, terminal 150 may assign commands to each of
regions 602A-602C. Any command that may be executed byterminal 150 may be assigned to definedregions 602A-602C.Terminal 150 may include default assignments of commands toregions 602A-602C. In addition, the assignments of commands toregions 602A-602C may be configurable by a user. The assignments of commands to the regions (including current assignments, default assignment, and user-configured assignments) may be stored instorage 306. - In one embodiment,
rendering module 402A may display the boundaries ofregions 602A-602C ondisplay screen 204, as illustrated inFIG. 6 and similarly illustrated inFIG. 7 with respect toregions 702A-702C. For example, the boundaries ofregions 602A-602C may be displayed when the electronic document is first displayed ondisplay screen 204. The boundaries ofregions 602A-602C may be automatically hidden after a predefined delay or after a user taps ondisplay screen 204. In addition, the commands assigned toregions 602A-602C may be displayed ondisplay screen 204 along with the boundaries, as illustrated inFIG. 6 and similarly illustrated inFIG. 7 with respect toregions 702A-702C. Furthermore, instructions may be displayed ondisplay screen 204 that instruct the user how to input commands and control terminal 150 usingregions 602A-602C. For example, the instructions may state, “Tap on a region to execute a command.” Alternatively, the instructions may not need to be displayed because the displaying of the regions and their assigned commands is self-explanatory. - A user who wishes to execute one of the commands assigned to
regions 602A-602C may, for example, tap a finger or a stylus ondisplay screen 204 within one ofregions 602A-602C associated with the desired command. In one embodiment, the user may tap on the desired one ofregions 602A-602C even when the boundaries ofregions 602A-602C are not displayed. That is, even when only the electronic document is displayed ondisplay screen 204 without any visible control objects and without any visible boundaries ofregions 602A-602C, as illustrated inFIG. 8A , the user may still tap on a portion of the displayed electronic document corresponding to a location within definedregions 602A-602C, and terminal 150 will execute the command associated with the selected one ofregions 602A-602C. - In
block 906, detection module 402B may detect a tap gesture on display screen 204. For example, a user may tap on display screen 204, which may be a touch screen, using, for example, a finger or a stylus. Although routine 900 is described with respect to a tap gesture, detection module 402B may detect any other types of touch screen gestures. Furthermore, although routine 900 is described with respect to display screen 204 being a touch screen, disclosed embodiments may be applicable to a conventional user input device (such as a mouse) and conventional display devices, including non-touch screen type displays. Detection module 402B may further determine the location of the tap gesture on display screen 204 in relation to defined regions 602A-602C. - In
block 908,processor 302 may determine which ofregions 602A-602C was selected by the user based on the location of the detected tap gesture determined bydetection module 402B in relation to the boundaries ofregions 602A-602C. For example, if the location of the tap gesture falls withinright region 602C, then terminal 150 would determine thatright region 602C was selected by the user. - In
block 910, processor 302 may execute the command assigned to the selected region. For example, if the user selected right region 602C and a command to display the next page of the electronic document is assigned to right region 602C, then processor 302 may execute that command to cause rendering module 402A to display the next page of the electronic document on display screen 204. Therefore, in one embodiment, a user may view the electronic document by utilizing the entire display screen 204 without any visible control objects and still navigate forward through the electronic document page by page simply by tapping within right region 602C on display screen 204. - Alternatively, if the user selected
center region 602B and a command to display control objects is assigned to centerregion 602B, thenprocessor 302 may execute that command to causerendering module 402A to display control objects ondisplay screen 204, as illustrated inFIG. 8B . The displayed control objects may provide the user access to more commands than the set of commands assigned to definedregions 602A-602C. - Furthermore, the user may tap on
center region 602B while control objects are displayed, which will cause terminal 150 to hide the displayed control objects, returning display screen 204 to FIG. 8A. - As one of ordinary skill in the art will appreciate, one or more of blocks 902-910 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 902-910 may be subdivided into multiple blocks.
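Blocks 906-910 amount to hit-testing the tap location against the defined regions and dispatching whatever command is assigned to the hit region. A hypothetical sketch, with illustrative region bounds and command names (none of these identifiers come from the disclosure):

```python
# Hypothetical sketch of blocks 906-910: locate the tap within the defined
# regions, then execute the command assigned to that region, if any.
def find_region(regions, x, y):
    """Return the name of the region containing (x, y), or None."""
    for name, (rx, ry, rw, rh) in regions.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None

def handle_tap(regions, commands, x, y):
    """Execute and return the command assigned to the tapped region, if any."""
    command = commands.get(find_region(regions, x, y))
    return command() if command else None
```

Because the hit-test needs only the stored region geometry, it works whether or not the region boundaries are currently drawn, which is what lets the control-free view of FIG. 8A remain navigable.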
-
FIG. 10A is an example of a screen 1010 displaying an electronic document including a plurality of elements. Screen 1010 may be displayed on display screen 204 of terminal 150. Elements may include any graphical object that may be displayed on display screen 204. For example, elements may include a word or a group of words (such as sentences, paragraphs, etc.), an image, a video, a hyperlink, an icon, etc. Furthermore, an element may include one or more subelements. For example, a paragraph constituting an element may include multiple words, each word constituting a subelement. -
FIG. 10B is an example of ascreen 1020 displaying an electronic document and a highlighted element.Screen 1020 may be displayed ondisplay screen 204 ofterminal 150.Screen 1020 displays the same elements (i.e., same paragraphs) asscreen 1010, but inscreen 1020 one of the displayed elements is highlighted (i.e.,element 1022 is highlighted). -
FIG. 11 is an example of a screen 1100 including a simple view of element 1022. In one embodiment, a simple view may include one activated element. For example, the simple view illustrated in screen 1100 includes element 1022, which is highlighted in screen 1020, but the simple view does not include the other paragraphs that are not highlighted in screen 1020. Element 1022 may be displayed larger in the simple view compared to screen 1020. - The simple view may enable the user to select a subelement within
element 1022. For example, as illustrated in FIG. 11, subelement 1102 (i.e., the word "towards") is highlighted. Furthermore, the simple view may include one or more action buttons related to element 1022 and subelement 1102. In one embodiment, the simple view may include a definition button 1104 for providing a definition of the selected subelement 1102. The simple view may further include a close button 1108, the selection of which will exit the simple view and return display screen 204 back to screen 1010 for viewing the electronic document. The simple view may include instructions 1106 informing the user how to select a subelement of element 1022 and how to pan and zoom element 1022. - In addition, the simple view may enable the user to pan and
zoom element 1022. For example, the user may pan element 1022 by a drag gesture. Also, the user may zoom element 1022 by using a pinch gesture and a spread gesture. A pinch gesture may cause terminal 150 to shrink the size of element 1022 displayed in the simple view, as illustrated in FIG. 12A. -
FIG. 12A is an example of ascreen 1210 including a simple view ofelement 1022 displayed in a small size. Furthermore, a spread gesture may cause terminal 150 to enlarge the size ofelement 1022 displayed in the simple view, as illustrated inFIG. 12B .FIG. 12B is an example of ascreen 1220 including a simple view ofelement 1022 displayed in a large size. -
FIG. 13 is a flow diagram of an example of a routine 1300 for activating an element, consistent with a disclosed embodiment.Routine 1300 may implement processes according to one or more ofprogram modules 402A-402C. - At the start of routine 1300, in
block 1302,rendering module 402A may display an electronic document ondisplay screen 204, as illustrated inFIG. 10A . The displayed electronic document may include one or more elements—in this example, several paragraphs. - In
block 1304,detection module 402B may detect a touch-and-hold gesture ondisplay screen 204.Detection module 402B may also determine the location of the touch-and-hold gesture. Furthermore,processor 302 may determine an element that corresponds to the location of the touch-and-hold gesture based on the location of the element ondisplay screen 204, the corresponding element being a target element. For example, if the user touches a finger on the third paragraph displayed ondisplay screen 204 and holds the finger ondisplay screen 204, then terminal 150 may determine that the target element iselement 1022, illustrated inFIG. 10B . - In
block 1306,rendering module 402A may highlight the target element by, for example, displaying a box or other shape around the target element. The box may be filled with a color. In the example illustrated inFIG. 10B , target element 1022 (i.e., the third paragraph) is highlighted. The highlighting may inform the user which of the displayed elements terminal 150 determined as being selected by the touch-and-hold gesture, and will subsequently be activated upon release of the touch-and-hold gesture. - In one embodiment, the user may cancel the selection of the target element. For example, if the highlighted element is not the element that the user intended to select and activate, the user may drag his finger away from the highlighted element. If such gesture is executed by the user, then
detection module 402B may detect the drag gesture andrendering module 402A may remove the highlighting to indicate cancellation of the element selection. The user may then release his finger offdisplay screen 204. - If the highlighted element is the element that the user intended to select, then the user may release his finger over the highlighted element off
display screen 204. In block 1308, detection module 402B may detect the release gesture. Detection module 402B may further determine the location of the release gesture, and processor 302 may confirm that the location corresponds to the target element. - In
block 1310, processor 302 may activate the target element. Depending on the type of the target element, the activation may entail various actions. For example, if the target element is an image, then rendering module 402A may display an enlarged view of the image. If the target element is a video, then rendering module 402A may play the video. If the target element is a word, then rendering module 402A may display a definition of the word. If the target element is a link to a section of the electronic document, then rendering module 402A may display the linked section of the electronic document. If the target element is a hyperlink to a web site, then rendering module 402A may display the web site associated with the hyperlink. If the target element is a paragraph, then rendering module 402A may display a simple view of the paragraph. Various other forms of activation are possible. Also, terminal 150 may enable the user to configure how various types of target elements may be activated. - As an example, if the user selects
target element 1022 in FIG. 10B, then terminal 150 may display a simple view of target element 1022, as illustrated in FIG. 11. - As one of ordinary skill in the art will appreciate, one or more of blocks 1302-1310 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 1302-1310 may be subdivided into multiple blocks.
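The type-dependent activation in block 1310 is essentially a dispatch table keyed on element type. A hypothetical sketch — the dictionary-based element representation and the returned action strings are assumptions for illustration:

```python
# Hypothetical sketch of block 1310 as a dispatch table keyed on element type.
ACTIONS = {
    "image":     lambda e: "enlarge " + e["name"],
    "video":     lambda e: "play " + e["name"],
    "word":      lambda e: "define " + e["name"],
    "hyperlink": lambda e: "open " + e["name"],
    "paragraph": lambda e: "simple view of " + e["name"],
}

def activate_element(element):
    """Return a description of the action taken for the target element."""
    handler = ACTIONS.get(element["type"])
    return handler(element) if handler else "no action"
```

A user-configurable mapping, as the last sentence of block 1310 suggests, would simply swap entries in this table.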
-
FIG. 14 is an example of aslide bar 1402, consistent with a disclosed embodiment.Slide bar 1402 may include aslider 1404, which may be moved from one end ofslide bar 1402 to the other end ofslide bar 1402.Slide bar 1402 andslider 1404 may be displayed ondisplay screen 204 ofterminal 150.Terminal 150 may enable a user to slideslider 1404 alongslide bar 1402 using drag gestures.Slide bar 1402 may be used to enable a user to navigate through an electronic document. - For example,
slide bar 1402 may be used to navigate through an electronic book comprising 11 chapters, wherein chapter 7 comprises 13 pages and the other chapters 1-6 and 8-11 comprise varying numbers of pages. - In one embodiment,
slide bar 1402 may be disproportionately divided such thatslide bar 1402 is not evenly divided in proportion to the division of the electronic document. For example, if the electronic book comprises 100 pages,slide bar 1402 would not be divided into 100 equal parts, each part corresponding to one page. - In one embodiment,
slide bar 1402 may include an expanded part 1406. Expanded part 1406 may constitute about a half of slide bar 1402. Expanded part 1406 may correspond to the currently-displayed chapter of the electronic book. In this example, if chapter 7 of the electronic document is being displayed on display screen 204, then expanded part 1406 may correspond to chapter 7. As depicted in FIG. 14, expanded part 1406, which corresponds to chapter 7, occupies a disproportionately larger part of slide bar 1402, while the other chapters (i.e., chapters 1-6 and 8-11) occupy disproportionately smaller parts of slide bar 1402. This may be so even if chapter 7 is not necessarily the longest chapter of the electronic document. - In one embodiment, expanded
part 1406 is further divided into multiple parts, each part corresponding to a page in chapter 7. Expanded part 1406 may be evenly divided in proportion to the number of pages in chapter 7. - In one embodiment, the divisions of slide bar 1402 (both chapter divisions and page divisions within expanded part 1406) may be visible when
slide bar 1402 is displayed on display screen 204. Alternatively, these divisions may not be visible when slide bar 1402 is displayed on display screen 204. As yet another alternative, only the boundaries of expanded part 1406 may be displayed, while chapter divisions are not displayed. -
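One way to model the disproportionate slide bar is to map a slider position in [0, 1] to a (chapter, page) target, giving the expanded part a fixed share of the bar. This is a hypothetical sketch: the half share, the even split of the remainder among the other chapters, and the coordinate convention are assumptions consistent with the running example (11 chapters, chapter 7 current with 13 pages):

```python
# Hypothetical model of the disproportionate slide bar. The current chapter's
# expanded part takes `expanded_fraction` of the bar; the other chapters
# split the rest evenly (the even split is an assumption).
def expanded_span(num_chapters, current_chapter, expanded_fraction=0.5):
    """Return (start, end) of the expanded part along the bar."""
    per_other = (1.0 - expanded_fraction) / (num_chapters - 1)
    start = (current_chapter - 1) * per_other
    return start, start + expanded_fraction

def target_for_position(pos, num_chapters, current_chapter, pages_in_chapter,
                        expanded_fraction=0.5):
    """Map a slider position in [0, 1] to (chapter, page); page is None
    when the position falls outside the expanded part."""
    start, end = expanded_span(num_chapters, current_chapter, expanded_fraction)
    per_other = (1.0 - expanded_fraction) / (num_chapters - 1)
    if start <= pos <= end:
        # Inside the expanded part: evenly divided by page.
        page = min(int((pos - start) / expanded_fraction * pages_in_chapter) + 1,
                   pages_in_chapter)
        return current_chapter, page
    if pos < start:
        chapter = int(pos / per_other) + 1
    else:
        chapter = current_chapter + int((pos - end) / per_other) + 1
    return min(chapter, num_chapters), None
```

With these assumptions, positions inside the middle half resolve to pages of chapter 7, while positions outside it resolve to other chapters, matching the behavior shown in FIGS. 15-17.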
FIG. 15 is an example of ascreen 1500 includingslide bar 1402 andslider 1404.Screen 1500 may be displayed ondisplay screen 204 ofterminal 150. InFIG. 15 ,slider 1404 is located within expandedpart 1406 ofslide bar 1402 at a position corresponding topage 5 ofchapter 7, which is being displayed ondisplay screen 204.Screen 1500 may further include a display of the chapter number and the page number corresponding to the displayed chapter and page. -
FIG. 16 is an example of a screen 1600 including slide bar 1402 and slider 1404. Screen 1600 may be displayed on display screen 204 of terminal 150. In FIG. 16, slider 1404 has been moved slightly to the right compared to FIG. 15, such that slider 1404 is located within expanded part 1406 of slide bar 1402 at a position corresponding to page 11 of chapter 7. In FIG. 16, slider 1404 is still within expanded part 1406 of slide bar 1402. Screen 1600 may further include a display of a target section (i.e., page 11 of chapter 7) corresponding to the location of slider 1404. -
FIG. 17 is an example of a screen 1700 including slide bar 1402 and slider 1404. Screen 1700 may be displayed on display screen 204 of terminal 150. In FIG. 17, slider 1404 has been moved even further to the right compared to FIGS. 15 and 16, such that slider 1404 is located outside expanded part 1406 on slide bar 1402 at a position corresponding to chapter 9. Screen 1700 may further include a display of a target section (i.e., chapter 9) corresponding to the location of slider 1404. -
FIG. 18 is a flow diagram of a routine 1800 for navigating an electronic document usingslide bar 1402 andslider 1404, consistent with a disclosed embodiment.Routine 1800 may implement processes according to one or more ofprogram modules 402A-402C. - At the start of routine 1800, in
block 1802,rendering module 402A may display an electronic document ondisplay screen 204. In this example, as illustrated inFIG. 15 ,page 5 ofchapter 7 of the electronic document is displayed, andchapter 7 includes 13 pages. - In
block 1804, rendering module 402A may display slide bar 1402 and slider 1404 on display screen 204. For example, the user may have executed a tap gesture on center region 602B, which was detected by detection module 402B and caused rendering module 402A to display control objects including slide bar 1402 and slider 1404 overlaid in front of the electronic document. As illustrated in FIG. 15, slider 1404 may be located on slide bar 1402 at a position corresponding to the currently displayed page and chapter of the electronic document, which is page 5 of chapter 7 in this example. Furthermore, rendering module 402A may also display the page number and the chapter number of the currently displayed section of the electronic document. In this example, "Chapter 7, Page 5/13" is displayed on display screen 204. - The user may move
slider 1404 displayed ondisplay screen 204 using a drag gesture to navigate to a different section of the electronic document. For example, the user may dragslider 1404 to the right within expandedpart 1406 to navigate to a different page ofchapter 7. - In
block 1806,detection module 402B may detect the drag gesture corresponding to a movement ofslider 1404 alongslide bar 1402. In one embodiment,rendering module 402A may move the displayedslider 1404 ondisplay screen 204 along with the detected drag gesture. - In
block 1808, rendering module 402A may display a target page number based on the location of slider 1404 as slider 1404 is being dragged along slide bar 1402 within expanded part 1406. For example, in FIG. 16, slider 1404 has been moved slightly to the right compared to FIG. 15, and display screen 204 displays the target page number "Page 11/13." Although not illustrated, similar steps may be performed when slider 1404 is moved in the leftward direction. - Moreover, the user may drag
slider 1404 even further outside of expanded part 1406 to navigate to a different chapter. If the user drags slider 1404 even further along slide bar 1402 such that slider 1404 is located outside expanded part 1406 corresponding to chapter 7, detection module 402B may detect the drag gesture, and rendering module 402A may update the target chapter number displayed on display screen 204 as slider 1404 is being dragged along slide bar 1402 outside expanded part 1406 and hide the target page number from display screen 204. For example, in FIG. 17, slider 1404 has been moved even further to the right compared to FIGS. 15 and 16, outside of the boundaries of expanded part 1406. In response to the movement of slider 1404, rendering module 402A stopped displaying the target page number and updated the target chapter number from "Chapter 7" to "Chapter 9." Although not illustrated, similar steps may be performed when slider 1404 is moved leftward, outside of the left boundary of expanded part 1406. - In one embodiment,
rendering module 402A may display the target page of the electronic document on display screen 204 based on the position of slider 1404 as the user is dragging slider 1404 along slide bar 1402, even before slider 1404 is released by the user. The displayed target page corresponding to the current position of slider 1404 may be partially visible through the displayed control objects, as illustrated in FIGS. 16 and 17. - In
block 1810,detection module 402B may detect a release gesture overslider 1404. For example, when the user has movedslider 1404 to a desired target section of the electronic document, the user may releaseslider 1404. - In
block 1812,rendering module 402A may display the target section of the electronic document in response to the release gesture based on the location ofslider 1404 when it was released. - As an example, if
slider 1404 is released at a position corresponding topage 11 ofchapter 7, as illustrated inFIG. 16 , then renderingmodule 402A may displaypage 11 ofchapter 7 of the electronic document ondisplay screen 204. As another example, ifslider 1404 is released at a position corresponding tochapter 9, as illustrated inFIG. 17 , then renderingmodule 402A may display the first page ofchapter 9 of the electronic document ondisplay screen 204. - Furthermore, if the displayed chapter is changed (e.g., from
chapter 7 to chapter 9), then terminal 150 may associate expanded part 1406 with chapter 9 and calculate the proper divisions of slide bar 1402 such that the other chapters 1-8, 10, and 11 constitute the parts of slide bar 1402 not taken up by expanded part 1406. - As one of ordinary skill in the art will appreciate, one or more of blocks 1802-1812 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 1802-1812 may be subdivided into multiple blocks.
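The re-division step can be sketched as recomputing each chapter's span along the bar, with the expanded share reassigned to the newly displayed chapter. This is a hypothetical sketch; the equal shares for the non-current chapters are an assumption:

```python
# Hypothetical sketch of re-dividing the slide bar after navigating to a new
# chapter: the expanded share moves to that chapter, and every other chapter
# gets an equal slice of the remainder (the equal slices are an assumption).
def redivide(num_chapters, new_chapter, expanded_fraction=0.5):
    """Return each chapter's (start, end) span along the slide bar."""
    per_other = (1.0 - expanded_fraction) / (num_chapters - 1)
    spans, cursor = {}, 0.0
    for chapter in range(1, num_chapters + 1):
        width = expanded_fraction if chapter == new_chapter else per_other
        spans[chapter] = (cursor, cursor + width)
        cursor += width
    return spans
```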
-
FIG. 19A is an example of ascreen 1910 including an electronic document.Screen 1910 may be displayed ondisplay screen 204 ofterminal 150. In one embodiment, as illustrated inFIG. 19A , the text of the electronic document is displayed in black color on a white background.FIG. 19B is an example of ascreen 1920 including an electronic document and a partially opaque graphical object in front of the electronic document. In one embodiment, as illustrated inFIG. 19B , the text of the electronic document is displayed in black color against a white background. In addition, a graphical object whose size is equal to the size ofscreen 1920 is overlaid in front of the displayed electronic document. - Even if the actual brightness of
screen 1920 remains the same, the perceived brightness of the electronic document visible through the partially opaque graphical object in front of the electronic document may vary depending on the opacity of the graphical object. In one embodiment, the opacity of the graphical object may be adjusted by a user. For example, the opacity of the graphical object may be adjusted from completely transparent to completely opaque. - The black text of the electronic document cannot appear any darker or dimmer because black is the darkest color. However, the white background of the electronic document may appear dimmer when viewed through the partially opaque graphical object. For example, as illustrated in
FIG. 19B , the white background may appear gray in color. - Furthermore, in an alternative embodiment, the text of the electronic document may be displayed in a white color on a black background, which is desirable to some users when reading at night. In this scenario, the black background may not appear any dimmer through the partially opaque graphical object, but the white text may appear dimmer through the partially opaque graphical object depending on the opacity of the graphical object. For example, the white text may appear gray in color.
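The dimming effect described above is ordinary "over" alpha compositing: a black overlay at opacity α leaves black pixels black while pulling white pixels toward gray. The following sketch illustrates the perceived colors; it is standard compositing arithmetic, not code from the disclosure:

```python
# Standard "over" alpha compositing of a partially opaque overlay on a base
# color; used here only to illustrate the perceived-brightness effect.
def composite(overlay_alpha, base_rgb, overlay_rgb=(0, 0, 0)):
    """Blend an overlay of the given opacity (0..1) over a base color."""
    return tuple(round((1 - overlay_alpha) * b + overlay_alpha * o)
                 for b, o in zip(base_rgb, overlay_rgb))
```

White (255, 255, 255) behind a half-opaque black overlay composites to roughly mid-gray, while black stays black — which is why black text cannot appear any darker, but a white background (or white text, in the night-reading variant) dims toward gray.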
- Although disclosed embodiments are described with respect to the colors black and white as well as a perceived gray color, any other color may be used.
-
FIG. 20 is a flow diagram of an example of a routine 2000 for controlling the perceived brightness of an electronic document, consistent with a disclosed embodiment.Routine 2000 may implement processes according to one or more ofprogram modules 402A-402C. - At the start of routine 2000, in
block 2002,rendering module 402A may display an electronic document ondisplay screen 204. For example,rendering module 402A may display the text of the electronic document in black color against a white background, as illustrated inFIG. 19A . - In
block 2004,rendering module 402A may also display a graphical object in front of the displayed electronic document ondisplay screen 204. The graphical object may be partially opaque such that the electronic document behind the graphical object is partially visible to a user through the graphical object. In one embodiment, the opacity of the graphical object may initially be set to the lowest level (i.e., completely transparent) such that the perceived brightness of the displayed electronic document is the highest brightness level. Alternatively,storage 306 may store the last opacity level of the graphical object andrendering module 402A may display the graphical object using the last opacity level stored instorage 306. - In one embodiment, the user may control the opacity of the displayed graphical object, and thereby control the perceived brightness of the electronic document, through touch gestures. For example, a vertical upward swipe gesture may be configured to increase the perceived brightness of the electronic document and, conversely, a vertical downward swipe gesture may be configured to decrease the perceived brightness of the electronic document. In other embodiments, different touch gestures, buttons on
display screen 204, scroll button 214, or the like, may be used to control the perceived brightness of the electronic document. - In
block 2006,detection module 402B may detect a swipe gesture ondisplay screen 204.Detection module 402B may further detect, for example, the direction of the swipe gesture, the length of the swipe gesture, the location of the swipe gesture, and the speed of the swipe gesture. - In
block 2008,rendering module 402A may adjust the opacity of the graphical object in response to the detected swipe gesture. The opacity of the graphical object may be adjusted based on one or more of the direction of the swipe gesture, the length of the swipe gesture, the location of the swipe gesture, the speed of the swipe gesture, the current opacity level of the graphical object, and the maximum and minimum opacity levels of the graphical object. - For example, if the user executed a downward swipe gesture on
display screen 204, then rendering module 402A may increase the opacity of the graphical object, thereby decreasing the perceived brightness of the electronic document. Conversely, rendering module 402A may decrease the opacity of the graphical object, thereby increasing the perceived brightness of the electronic document, in response to an upward swipe gesture on display screen 204 detected by detection module 402B. - Furthermore, the opacity of the graphical object may be adjusted based on the length of the swipe gesture; that is, a long swipe may cause a large adjustment in opacity level, while a short swipe may cause a small adjustment in opacity level. As another example, the opacity of the graphical object may be adjusted based on the speed of the swipe gesture; that is, a fast swipe may cause a large adjustment in opacity level, while a slow swipe may cause a small adjustment in opacity level. The amount of adjustment of the opacity of the graphical object may be based on other factors or a combination of many factors.
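Blocks 2006-2008 can be sketched as scaling the overlay's opacity by the swipe's vertical travel and clamping the result. This is a hypothetical sketch — the proportional sensitivity rule and the 0.9 ceiling are assumptions; the disclosure leaves the exact mapping open:

```python
# Hypothetical sketch of blocks 2006-2008: adjust the overlay's opacity in
# proportion to a vertical swipe's length, clamped to a valid range.
def adjust_opacity(current, swipe_dy, screen_height, max_opacity=0.9):
    """Return the new overlay opacity after a vertical swipe.

    swipe_dy > 0 is a downward swipe (screen y grows downward), which raises
    the opacity and so dims the page; an upward swipe brightens it.
    """
    delta = swipe_dy / float(screen_height) * max_opacity
    return max(0.0, min(max_opacity, current + delta))
```

Applying this continuously during the drag, rather than once on release, gives the live-adjustment alternative the description mentions, letting the user stop swiping once the desired brightness is reached.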
- In one embodiment, terminal 150 may adjust the opacity of the graphical object after the swipe gesture has been fully executed and completed by the user. Alternatively, terminal 150 may adjust the opacity of the graphical object while the swipe gesture is being executed such that the user can stop the swipe gesture when the desired perceived brightness of the electronic document is achieved.
- As one of ordinary skill in the art will appreciate, one or more of blocks 2002-2008 may be optional and may be omitted from implementations in certain embodiments. Furthermore, functionality provided by one or more of blocks 2002-2008 may be subdivided into multiple blocks.
- The foregoing description has been presented for purposes of illustration. It is not exhaustive and does not limit the disclosure to the precise forms or embodiments described. Modifications and adaptations will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include software, but systems and methods consistent with the disclosed embodiments may be implemented as a combination of hardware and software or in hardware alone. Examples of hardware include computing or processing systems, including personal computers, servers, laptops, mainframes, microprocessors, and the like. Additionally, although aspects of the disclosed embodiments are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer-readable media, such as secondary storage devices (for example, hard disks, floppy disks, or CD-ROM), other forms of RAM or ROM, USB media, DVD, or other optical drive media.
- Computer programs based on the written description and disclosed methods are within the skill of an experienced developer. The various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software. For example, program sections or program modules can be designed in or by means of .Net Framework, .Net Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, HTML, HTML/AJAX combinations, XML, or HTML with included Java applets. One or more of such software sections or modules can be integrated into a computer system or existing e-mail or browser software.
- Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, and/or alterations as would be appreciated by those skilled in the art based on the present disclosure. The limitations in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. Further, the blocks of the disclosed routines may be modified in any manner, including by reordering blocks and/or inserting or deleting blocks. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/645,686 US20150185989A1 (en) | 2009-07-10 | 2015-03-12 | Interactive user interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/500,827 US9035887B1 (en) | 2009-07-10 | 2009-07-10 | Interactive user interface |
US14/645,686 US20150185989A1 (en) | 2009-07-10 | 2015-03-12 | Interactive user interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/500,827 Continuation US9035887B1 (en) | 2009-07-10 | 2009-07-10 | Interactive user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150185989A1 true US20150185989A1 (en) | 2015-07-02 |
Family
ID=53054643
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/500,827 Active 2032-06-03 US9035887B1 (en) | 2009-07-10 | 2009-07-10 | Interactive user interface |
US14/645,686 Abandoned US20150185989A1 (en) | 2009-07-10 | 2015-03-12 | Interactive user interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/500,827 Active 2032-06-03 US9035887B1 (en) | 2009-07-10 | 2009-07-10 | Interactive user interface |
Country Status (1)
Country | Link |
---|---|
US (2) | US9035887B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9785327B1 (en) | 2009-07-10 | 2017-10-10 | Lexcycle, Inc. | Interactive user interface |
US10347193B2 (en) | 2017-06-23 | 2019-07-09 | Blackberry Limited | Electronic device including display and method of applying privacy filter |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8487759B2 (en) | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
US10013058B2 (en) | 2010-09-21 | 2018-07-03 | Apple Inc. | Touch-based user interface with haptic feedback |
US10120446B2 (en) | 2010-11-19 | 2018-11-06 | Apple Inc. | Haptic input device |
US9754558B2 (en) * | 2012-06-18 | 2017-09-05 | Apple Inc. | Heads-up scrolling |
US20140195961A1 (en) * | 2013-01-07 | 2014-07-10 | Apple Inc. | Dynamic Index |
WO2015047364A1 (en) | 2013-09-29 | 2015-04-02 | Pearl Capital Developments Llc | Devices and methods for creating haptic effects |
US10236760B2 (en) | 2013-09-30 | 2019-03-19 | Apple Inc. | Magnetic actuators for haptic response |
US9317118B2 (en) | 2013-10-22 | 2016-04-19 | Apple Inc. | Touch surface for simulating materials |
JP2015087911A (en) * | 2013-10-30 | 2015-05-07 | 株式会社東芝 | Electronic device, method, and program |
CN105814510B (en) | 2013-12-10 | 2019-06-07 | 苹果公司 | Band body attachment mechanism with haptic response |
WO2015163842A1 (en) | 2014-04-21 | 2015-10-29 | Yknots Industries Llc | Apportionment of forces for multi-touch input devices of electronic devices |
DE102015209639A1 (en) | 2014-06-03 | 2015-12-03 | Apple Inc. | Linear actuator |
KR102019505B1 (en) | 2014-09-02 | 2019-09-06 | 애플 인크. | Haptic notifications |
US10353467B2 (en) | 2015-03-06 | 2019-07-16 | Apple Inc. | Calibration of haptic devices |
AU2016100399B4 (en) | 2015-04-17 | 2017-02-02 | Apple Inc. | Contracting and elongating materials for providing input and output for an electronic device |
USD768722S1 (en) * | 2015-05-01 | 2016-10-11 | Microsoft Corporation | Display screen with animated graphical user interface |
CN107925333B (en) | 2015-09-08 | 2020-10-23 | 苹果公司 | Linear actuator for use in an electronic device |
US10039080B2 (en) | 2016-03-04 | 2018-07-31 | Apple Inc. | Situationally-aware alerts |
US10268272B2 (en) | 2016-03-31 | 2019-04-23 | Apple Inc. | Dampening mechanical modes of a haptic actuator using a delay |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10691211B2 (en) | 2018-09-28 | 2020-06-23 | Apple Inc. | Button providing force sensing and/or haptic output |
US10599223B1 (en) | 2018-09-28 | 2020-03-24 | Apple Inc. | Button providing force sensing and/or haptic output |
US11380470B2 (en) | 2019-09-24 | 2022-07-05 | Apple Inc. | Methods to control force in reluctance actuators based on flux related parameters |
US11809631B2 (en) | 2021-09-21 | 2023-11-07 | Apple Inc. | Reluctance haptic engine for an electronic device |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040119744A1 (en) * | 2001-12-19 | 2004-06-24 | Sammy Chan | Selecting moving objects on a system |
US20060150087A1 (en) * | 2006-01-20 | 2006-07-06 | Daniel Cronenberger | Ultralink text analysis tool |
US20070024595A1 (en) * | 2005-07-29 | 2007-02-01 | Interlink Electronics, Inc. | System and method for implementing a control function via a sensor having a touch sensitive control input surface |
US20070101292A1 (en) * | 2003-07-28 | 2007-05-03 | Kupka Sig G | Manipulating an On-Screen Object Using Zones Surrounding the Object |
US20070118803A1 (en) * | 2000-05-08 | 2007-05-24 | Walker Jay S | Products and processes for providing one or more links in an electronic file that is presented to a user |
US20080165141A1 (en) * | 2007-01-05 | 2008-07-10 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
US20080168404A1 (en) * | 2007-01-07 | 2008-07-10 | Apple Inc. | List Scrolling and Document Translation, Scaling, and Rotation on a Touch-Screen Display |
US20080231610A1 (en) * | 2004-07-30 | 2008-09-25 | Apple Inc. | Gestures for touch sensitive input devices |
US20080259041A1 (en) * | 2007-01-05 | 2008-10-23 | Chris Blumenberg | Method, system, and graphical user interface for activating hyperlinks |
US20090174680A1 (en) * | 2008-01-06 | 2009-07-09 | Freddy Allen Anzures | Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars |
US20090278806A1 (en) * | 2008-05-06 | 2009-11-12 | Matias Gonzalo Duarte | Extended touch-sensitive control area for electronic device |
US20100085303A1 (en) * | 2008-10-07 | 2010-04-08 | Research In Motion Limited | Portable electronic device and method of controlling same |
US20100085314A1 (en) * | 2008-10-08 | 2010-04-08 | Research In Motion Limited | Portable electronic device and method of controlling same |
US20100097322A1 (en) * | 2008-10-16 | 2010-04-22 | Motorola, Inc. | Apparatus and method for switching touch screen operation |
US20100128994A1 (en) * | 2008-11-24 | 2010-05-27 | Jan Scott Zwolinski | Personal dictionary and translator device |
US20100251112A1 (en) * | 2009-03-24 | 2010-09-30 | Microsoft Corporation | Bimodal touch sensitive digital notebook |
US8155950B1 (en) * | 2008-11-20 | 2012-04-10 | Intuit Inc. | Method and system for providing a personalized electronic dictionary and vocabulary builder |
US20120233545A1 (en) * | 2011-03-11 | 2012-09-13 | Akihiko Ikeda | Detection of a held touch on a touch-sensitive display |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3618032A (en) | 1968-12-09 | 1971-11-02 | Ibm | Automatic data composing, editing and formatting system |
US4574363A (en) | 1982-07-13 | 1986-03-04 | International Business Machines Corporation | Mixed mode enhanced resolution hyphenation function for a text processing system |
US5295069A (en) | 1991-06-05 | 1994-03-15 | International Business Machines Corporation | Computer method for ranked hyphenation of multilingual text |
US6243071B1 (en) | 1993-11-03 | 2001-06-05 | Apple Computer, Inc. | Tool set for navigating through an electronic book |
US6747680B1 (en) | 1999-12-13 | 2004-06-08 | Microsoft Corporation | Speed-dependent automatic zooming interface |
US6157381A (en) | 1997-11-18 | 2000-12-05 | International Business Machines Corporation | Computer system, user interface component and method utilizing non-linear scroll bar |
US7082407B1 (en) | 1999-04-09 | 2006-07-25 | Amazon.Com, Inc. | Purchase notification service for assisting users in selecting items from an electronic catalog |
US20040122731A1 (en) | 1999-09-23 | 2004-06-24 | Mannik Peeter Todd | System and method for using interactive electronic representations of objects |
US6992687B1 (en) | 1999-12-07 | 2006-01-31 | Microsoft Corporation | Bookmarking and placemarking a displayed document in a computer system |
US7437683B1 (en) | 1999-12-16 | 2008-10-14 | Microsoft Corporation | Method and apparatus for fostering immersive reading of electronic documents |
WO2001071630A2 (en) | 2000-03-22 | 2001-09-27 | America To Go Llc | Methods and apparatus for on-line ordering |
US7450114B2 (en) | 2000-04-14 | 2008-11-11 | Picsel (Research) Limited | User interface systems and methods for manipulating and viewing digital documents |
US7305696B2 (en) | 2000-04-17 | 2007-12-04 | Triveni Digital, Inc. | Three part architecture for digital television data broadcasting |
US6816174B2 (en) | 2000-12-18 | 2004-11-09 | International Business Machines Corporation | Method and apparatus for variable density scroll area |
US7139982B2 (en) | 2000-12-21 | 2006-11-21 | Xerox Corporation | Navigation methods, systems, and computer program products for virtual three-dimensional books |
US20020126154A1 (en) | 2001-03-12 | 2002-09-12 | Watson Edwin George | Method for displaying and scrolling data including display status feedback |
US20030018543A1 (en) | 2001-06-25 | 2003-01-23 | Alger Jeffrey H. | Client portal |
US20040205568A1 (en) | 2002-03-01 | 2004-10-14 | Breuel Thomas M. | Method and system for document image layout deconstruction and redisplay system |
US7472082B2 (en) | 2002-09-25 | 2008-12-30 | Wirth Jr John | Method and system for browsing a custom catalog via the internet |
US7100119B2 (en) | 2002-11-01 | 2006-08-29 | Microsoft Corporation | Page bar control |
US7689927B2 (en) | 2002-11-15 | 2010-03-30 | Microsoft Corporation | Viewable document section |
US7370007B2 (en) | 2002-11-18 | 2008-05-06 | Sap Aktiengesellschaft | Catalog search agent |
JP2006209258A (en) * | 2005-01-25 | 2006-08-10 | Kenwood Corp | Av processing apparatus, audio video processing method, and program |
US7516402B2 (en) | 2005-02-28 | 2009-04-07 | Nokia Corporation | Presentation of large objects on small displays |
US20060227153A1 (en) | 2005-04-08 | 2006-10-12 | Picsel Research Limited | System and method for dynamically zooming and rearranging display items |
US20070238077A1 (en) | 2006-04-11 | 2007-10-11 | Edward Strachar | Interactive Reading Teaching Tool System |
US8736557B2 (en) | 2006-09-11 | 2014-05-27 | Apple Inc. | Electronic device with image based browsers |
US20080208715A1 (en) | 2007-02-28 | 2008-08-28 | Ronen Hod | Method, system and apparatus for providing a personalized electronic shopping center |
US20090015568A1 (en) | 2007-07-12 | 2009-01-15 | Koski David A | Method and Apparatus for Implementing Slider Detents |
US8135574B2 (en) * | 2007-11-15 | 2012-03-13 | Weikel Bryan T | Creating and displaying bodies of parallel segmented text |
US20090235201A1 (en) * | 2008-03-11 | 2009-09-17 | Aaron Baalbergen | Methods for controlling display of on-screen menus |
US20100088632A1 (en) | 2008-10-08 | 2010-04-08 | Research In Motion Limited | Method and handheld electronic device having dual mode touchscreen-based navigation |
US20100146435A1 (en) | 2008-12-04 | 2010-06-10 | Jean-Christophe Cros | Method and system for navigating through parts of a document |
US8689128B2 (en) | 2009-03-16 | 2014-04-01 | Apple Inc. | Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate |
- 2009-07-10: US12/500,827 filed in the US, granted as US9035887B1 (status: Active)
- 2015-03-12: US14/645,686 filed in the US, published as US20150185989A1 (status: Abandoned)
Also Published As
Publication number | Publication date |
---|---|
US9035887B1 (en) | 2015-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8347232B1 (en) | Interactive user interface | |
US9035887B1 (en) | Interactive user interface | |
US11893233B2 (en) | Device, method, and graphical user interface for moving user interface objects | |
JP7397881B2 (en) | Systems, methods, and user interfaces for interacting with multiple application windows | |
US11592923B2 (en) | Systems and methods for resizing applications in a multitasking view on an electronic device with a touch-sensitive display | |
DK180837B1 (en) | Devices, methods, and graphical user interfaces for interacting with user interface objects corresponding to applications | |
AU2017208225B2 (en) | Device, method, and graphical user interface for adjusting the appearance of a control | |
US20190018562A1 (en) | Device, Method, and Graphical User Interface for Scrolling Nested Regions | |
JP6138866B2 (en) | Device, method and graphical user interface for document manipulation | |
KR101720849B1 (en) | Touch screen hover input handling | |
US9841887B2 (en) | Anchoring viewport | |
US20150113428A1 (en) | Widget Setting Method and Terminal Device | |
EP2795608A1 (en) | Method, apparatus and computer program product for providing multiple levels of interaction with a program | |
US20220326816A1 (en) | Systems, Methods, and User Interfaces for Interacting with Multiple Application Views | |
CN114766015A (en) | Device, method and graphical user interface for interacting with user interface objects corresponding to an application | |
EP3278203A1 (en) | Enhancement to text selection controls | |
EP2795609A1 (en) | Method, apparatus and computer program product for graphically transitioning between multiple program interface levels of a program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LEXCYCLE, INC., OREGON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PRUD'HOMMEAUX, MARC;WHITE, AUGUSTUS A.;SIGNING DATES FROM 20090918 TO 20091002;REEL/FRAME:038742/0683 |
|
AS | Assignment |
Owner name: AMAZON TECHNOLOGIES, INC., WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEXCYCLE, INC.;REEL/FRAME:043900/0602 Effective date: 20171017 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |