US8542205B1 - Refining search results based on touch gestures - Google Patents

Refining search results based on touch gestures

Info

Publication number
US8542205B1
US8542205B1
Authority
US
United States
Prior art keywords
touch
search results
sensitive display
force
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/823,085
Inventor
Kevin E. Keller
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Amazon Technologies Inc
Original Assignee
Amazon Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Amazon Technologies Inc filed Critical Amazon Technologies Inc
Priority to US12/823,085
Assigned to AMAZON TECHNOLOGIES, INC. (assignment of assignors interest; see document for details). Assignors: KELLER, KEVIN E.
Application granted
Publication of US8542205B1
Legal status: Active
Expiration: Adjusted


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33: Querying
    • G06F 16/332: Query formulation

Definitions

  • a large and growing population of users is enjoying entertainment through the consumption of digital content items (or simply “content items”), such as music, movies, images, electronic books, and so on.
  • the users employ various electronic devices to consume such content items.
  • electronic devices include electronic book (eBook) reader devices, cellular telephones, personal digital assistants (PDAs), portable media players, tablet computers, netbooks, and the like.
  • FIG. 1 illustrates an architecture in which a community of users operates respective client devices to consume content items, such as electronic books (eBooks), songs, videos and the like.
  • the client devices and/or a content item service implement techniques to provide context-sensitive reference works (e.g., dictionaries, thesauruses, atlases, etc.) that provide requested information to the users based on a genre of the content item associated with the request, a characteristic of the user, or the like.
  • FIG. 2 is a block diagram of selected modules of an eBook reader device capable of receiving a request for information from a user experiencing a content item, determining a type of reference work entry appropriate for the content item or the user, and providing the information to the user from the determined type of reference work entry.
  • FIG. 3 illustrates an example user interface rendered by the devices of FIGS. 1 and 2 .
  • the device or the content item service has determined that the eBook currently being read by the user is associated with a “medical” genre. As such, when the user requests a definition for a word within the eBook, the device displays a “medical” definition of the word rather than a standard or other type of definition.
  • FIG. 4 illustrates an example user interface rendered by the device of FIG. 3 after the user has selected to view “more” definitions of the illustrated word “prognosis.” As shown, in response the device displays definitions of “prognosis” in the medical sense, the standard sense, and the legal sense.
  • FIG. 5 illustrates another example user interface rendered by the devices of FIGS. 1 and 2 .
  • the device is displaying an article from a periodical relating to “business.”
  • the device displays synonyms and antonyms from a business-related thesaurus entry.
  • FIG. 6 illustrates another example user interface rendered by the devices of FIGS. 1 and 2 .
  • the device is displaying a sports-related article.
  • the device displays information about that topic from a sports-related encyclopedia entry.
  • FIG. 7 is a flow diagram showing a process of classifying a content item according to, for example, a genre and determining, based on the classification, a type of reference work entry to use for the content item when a user requests information associated with a word, phrase, or topic found within the content item.
  • FIG. 8 is a block diagram of selected modules of an example eBook reader device that may implement a touch-sensitive display and that is capable of outputting different reference work entries based on an amount of force applied to the touch-sensitive display.
  • FIG. 9 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device.
  • the device measures an amount of force of the selection and outputs a particular type of reference work entry based on this measured amount of force.
  • FIG. 10 illustrates an example user interface rendered by the device of FIG. 8 after the user has increased the amount of force on the selected word.
  • the device outputs a second, different type of reference work entry based on this greater amount of force.
  • FIG. 11 illustrates an example user interface rendered by the device of FIG. 8 after the user has yet again increased the amount of force on the selected word.
  • the device outputs a third, different type of reference work entry based on this even greater amount of force.
  • FIG. 12 illustrates another example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device.
  • the device measures an amount of force of the selection and outputs a particular context-sensitive reference work entry based on this measured amount of force.
  • FIG. 13 illustrates another example user interface rendered by the device of FIG. 8 after the user has increased the amount of force on the touch-sensitive display.
  • the device outputs a second, different context-sensitive reference work entry based on this greater amount of force.
  • FIG. 14 is a flow diagram showing a process of selecting which of multiple reference work entries to output based on an amount of measured force associated with a selection.
  • FIG. 15 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device.
  • the device outputs a reference work entry based on this selection.
  • FIG. 16 illustrates an example user interface rendered by the device after the user has provided an additional input.
  • the device outputs a second, different reference work entry.
  • This input may include more force on the display, an additional point of contact on the display, activation of a key on a keypad, an oral command spoken by the user or any other type of input.
  • FIG. 17 is a flow diagram showing a process of outputting a first reference work entry in response to receiving a touch selection on a display and, thereafter, outputting a second, different reference work entry after receiving an additional input.
  • FIG. 18 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device.
  • the device outputs a reference work entry after determining that an amount of force of this selection is less than a threshold force.
  • FIG. 19 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device.
  • the device enables the user to select a reference work entry to output after the device determines that an amount of force of this selection is greater than a threshold force.
  • FIG. 20 is a flow diagram showing a process of determining whether to output a reference work entry or whether to allow a user to select which reference work entry to output based on an amount of force of a selection.
  • FIG. 21 is a block diagram of selected modules of an example eBook reader device that may implement a touch-sensitive display and that is capable of outputting different content based on an amount of force applied to the touch-sensitive display.
  • FIG. 22 illustrates an example user interface rendered by the device of FIG. 21 after a user has made a selection of a word within a content item being rendered on the touch-sensitive display of the device.
  • the device may measure an amount of force of the selection and, in some instances, may output a particular type of reference work entry based on this measured amount of force.
  • FIG. 23 illustrates an example user interface rendered by the device of FIG. 21 after the user has increased the amount of force on the selected word.
  • the device outputs other instances of this word within the illustrated content item based on this greater amount of force.
  • FIG. 24 illustrates an example user interface rendered by the device of FIG. 21 after the user has yet again increased the amount of force on the selected word.
  • the device outputs other instances of this word within other content items based on this even greater amount of force.
  • FIG. 25 illustrates an example user interface rendered by the device of FIG. 21 after the user has yet again increased the amount of force on the selected word.
  • the device outputs search results associated with a query comprising the selected word based on this even greater amount of force.
  • FIG. 26 is a flow diagram showing a process of causing display of other instances of a selected term based on a force associated with a selection.
  • FIG. 27 illustrates an example user interface rendered by the device of FIG. 21 after a user has made a selection of a word within a content item being rendered on the touch-sensitive display of the device.
  • the device requests that the user select whether to view previous or subsequent instances of the selected word within the illustrated content item.
  • FIG. 28 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to view subsequent instances of the selected word within the illustrated content item.
  • FIG. 29 illustrates an example user interface rendered by the device of FIG. 21 after the gesture made by the user in FIG. 28 .
  • FIG. 30 is a flow diagram showing a process of causing display of information associated with a portion of a content item in response to receiving a touch selection of the portion on a display and, thereafter, outputting other instances of the portion after receiving an additional input.
  • FIG. 31 illustrates an example user interface rendered by the device of FIG. 21 after the user selects a word within the illustrated content item.
  • the device outputs search results associated with a query comprising the selected word based on the selection by the user.
  • FIG. 32 illustrates an example user interface rendered by the device of FIG. 21 after the user provides a greater amount of force to the selected word.
  • the device refines (e.g., narrows) the search results of FIG. 31 in response to detecting the greater amount of force.
  • FIG. 33 is a flow diagram showing a process of refining search results on a touch-sensitive display based on the device detecting a greater or lesser amount of force on a selected term.
  • FIG. 34 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to narrow illustrated search results associated with the selected word.
  • FIG. 35 illustrates an example user interface rendered by the device of FIG. 21 after the user performs the gesture of FIG. 34 . As illustrated, the device has narrowed search results in response to detecting the gesture.
  • FIG. 36 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to expand illustrated search results associated with the selected word.
  • FIG. 37 illustrates an example user interface rendered by the device of FIG. 21 after the user performs the gesture of FIG. 36 . As illustrated, the device has expanded or broadened search results in response to detecting the gesture.
  • FIG. 38 is a flow diagram showing a process for refining search results based at least in part on detecting a predefined gesture on a touch-sensitive display.
  • This disclosure describes techniques for outputting different content on a touch-sensitive display of a device based at least in part on an amount of force applied to the touch-sensitive display. For instance, when a user reads an electronic book (eBook) on a device having a touch-sensitive display, the user may make a selection of a word or phrase within the eBook by touching the display at a location of the word or phrase.
  • the techniques may output information associated with the selected word. For instance, the device may output, in response, a dictionary definition of the selected word, a picture associated with the selected word, synonyms of the selected word, or the like. Thereafter, the user may apply a greater or lesser amount of force to the selected word and, in response, the device may output other instances or uses of the selected word. For instance, the device may output other instances of the selected word within the illustrated eBook, other eBooks or content items, or the like.
  • the techniques described herein may refine search results associated with a particular word or phrase based at least in part on a measured amount of force associated with a selection. For instance, a user may request, via a touch on the touch-sensitive display, that the device perform a search based on a query associated with a selected word or phrase. In response, the device may output search results associated with the search (from a search engine or otherwise). Thereafter, the user may provide an amount of force on the touch-sensitive display that is more or less than the original selection. In response, the device may refine (e.g., expand or narrow) the illustrated search results. In one of many examples, the device may identify a term associated with the rendered eBook and may perform a search based on a query that includes this term in addition to the term selected by the user.
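To make the force-driven refinement concrete, the following is a minimal sketch of the behavior this passage describes: a harder press narrows the result set by conjoining a term drawn from the rendered content item, while a lighter press broadens it again. The function names, force units, and the strategy of appending a single context term are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of force-based search refinement: pressing harder
# narrows results by adding a context term from the rendered content
# item; easing off broadens them again. Names and units are invented.
def refine_results(query_terms, context_terms, prior_force, current_force,
                   search_fn, force_step=0.5):
    terms = list(query_terms)
    delta = current_force - prior_force
    if delta >= force_step and context_terms:
        # Harder press: narrow the search by conjoining a context term.
        terms.append(context_terms[0])
    elif delta <= -force_step and len(terms) > 1:
        # Lighter press: broaden the search by dropping the refinement.
        terms.pop()
    return search_fn(terms)
```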
  • This disclosure also describes techniques for outputting different reference work entries based on an amount of force applied to a touch-sensitive display of a device. For instance, when a user reads an electronic book (eBook) on a device having a touch-sensitive display, the user may make a selection of a word or phrase within the eBook by touching the display at a location of the word or phrase. The techniques may then determine which of multiple different reference work entries to output based on a measured amount of force of the selection. For instance, the device may output a dictionary definition of the selected word in response to measuring a first amount of force. Additionally or alternatively, the device may output a thesaurus entry for the word in response to measuring a second, greater amount of force.
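One simple way to picture this behavior is a tiered mapping from measured force to entry type, sketched below. The tier boundaries and the third (encyclopedia) tier are assumptions added for illustration; the disclosure itself only requires that different measured forces select different entry types.

```python
# Illustrative force tiers (threshold values are invented): a light
# touch yields a dictionary definition, a firmer touch a thesaurus
# entry, and a firmer touch still an encyclopedia entry.
FORCE_TIERS = [
    (0.0, "dictionary"),
    (1.0, "thesaurus"),
    (2.0, "encyclopedia"),
]

def entry_type_for_force(force):
    """Return the entry type of the highest tier the force reaches."""
    selected = FORCE_TIERS[0][1]
    for threshold, entry_type in FORCE_TIERS:
        if force >= threshold:
            selected = entry_type
    return selected
```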
  • This disclosure also describes an architecture and techniques for outputting requested information from reference works (e.g., dictionaries, thesauruses, almanacs, atlases, encyclopedias, gazetteers) in a context-sensitive manner. For instance, when a user reads an electronic book (eBook) and requests a definition for a word found within the eBook, the techniques may display a definition for the word that has been selected based on the context of the request. In one example, the techniques may display a definition that corresponds to one or more identified genres of the eBook in which the word appears. In another example, the techniques may display a definition that corresponds to known information about the user, such as a preference of the user or the like.
  • if the user reads a medical-themed eBook and requests a definition of a word, for instance, the techniques will display a medical-related definition of the word. If the user reads a science-fiction (sci-fi) eBook, meanwhile, the techniques may display a sci-fi or science-related definition of the word. In each of these instances, the techniques may display more than one definition, with the order of the displayed definitions being based on the classification of the eBook. For instance, the medical definition may be displayed first in instances where the eBook is determined to be medical-related. As such, the techniques display information from a reference work, such as the dictionary, in a manner that is more likely to be relevant and of interest to the user.
  • the techniques may classify a particular content item as being associated with one or more particular genres (e.g., science, science fiction, medicine, business, law, fiction, a particular foreign language, etc.).
  • a user experiencing the content item may request some information regarding the content item that may be found within a reference work. For instance, the user may request a definition of a word, synonyms or antonyms for a word, information from an encyclopedia regarding an identified word, phrase, or topic, a map for or directions to an identified location, or the like.
  • the techniques select an entry from the appropriate type of reference work and then output (e.g., visually, audibly, etc.) the reference work entry.
  • the techniques may select the corresponding encyclopedia entry based on the genre of the content item. For instance, if the user currently experiences a sports-themed content item and the user requests information regarding the topic “bat,” the techniques may output information regarding “bats” from a sports-themed encyclopedia. This information will likely discuss a round, elongated object for hitting a ball. If, however, the user currently experiences an animal-related content item and the user makes the same request, the techniques may output an encyclopedia entry from an animal-related encyclopedia. This information will likely discuss the nocturnal mammal.
  • the techniques may instead reference a single reference work that includes multiple different types of entries (e.g., sports-related, animal-related, medical, etc.).
  • a single encyclopedia may include an entry for “bat” in the sports sense and an entry for “bat” in the animal sense.
  • the techniques may display one or both of the definitions in a manner based on the identified genre of the content item.
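A single reference work of this kind can be pictured as a table keyed by both term and genre, with a fallback to a standard entry. The sketch below follows the "bat" example from the text; the entry wording and the fallback rule are invented for illustration.

```python
# Hypothetical single reference work holding multiple entry types per
# term, keyed by genre, following the "bat" example above.
ENCYCLOPEDIA = {
    ("bat", "sports"): "A round, elongated object used to hit a ball.",
    ("bat", "animals"): "A nocturnal flying mammal.",
    ("bat", "standard"): "1. Sports equipment. 2. A flying mammal.",
}

def lookup(term, genre):
    # Prefer the genre-specific entry; fall back to the standard one.
    return ENCYCLOPEDIA.get((term, genre),
                            ENCYCLOPEDIA.get((term, "standard")))
```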
  • Example Architecture describes one example architecture and several example components that implement the techniques introduced above.
  • Example eBook Reader Device describes example components of one type of device that may implement context-sensitive reference works.
  • Example User Interfaces follows, describing examples of user interfaces (UIs) that may be served to and rendered at the client devices of FIG. 1 .
  • the discussion then moves on to illustrate and describe an “Example Process” for implementing the described techniques.
  • the discussion also includes a second section, entitled “Surfacing Reference Work Entries on Touch-Sensitive Displays,” that also includes numerous sub-sections.
  • a first sub-section is entitled “Example eBook Reader Device” and describes example components of one type of device that may implement the techniques in this section.
  • a sub-section entitled “Example User Interfaces and Processes” follows.
  • the discussion also includes a third section, entitled “Surfacing Content Based on Touch Gestures” that, like the preceding sections, includes numerous sub-sections.
  • a first sub-section is entitled “Example eBook Reader Device” and describes example components of one type of device that may implement the techniques in this section.
  • a sub-section entitled “Example User Interfaces and Processes for Surfacing Other Instances of a Selected Term” follows. Thereafter, the discussion includes a sub-section entitled “Example User Interfaces and Processes for Refining Search Results,” before the discussion ends with a brief conclusion.
  • FIG. 1 illustrates an example architecture 100 in which a community of users 102 operates respective client devices 104 ( 1 ), 104 ( 2 ), 104 ( 3 ), . . . , 104 (M) to consume content items, such as electronic books (eBooks), songs, videos, still images and the like.
  • client devices 104 and/or a content item service 106 implement techniques to provide context-sensitive reference works (e.g., dictionaries, thesauruses, atlases, etc.) that provide requested information to the users based on a genre of the content item associated with the request, a characteristic of the requesting user, or the like.
  • the client devices 104 are variously configured with different functionality to enable consumption of one or more types of contents items of any type or format including, for example, electronic texts (e.g., documents of any format, electronic periodicals, such as digital magazines and newspapers, etc.), digital audio (e.g., music, audible books, etc.), digital video (e.g., movies, television, short clips, etc.), images (e.g., art, photographs, etc.), and multi-media content.
  • The terms “electronic book” and/or “eBook”, as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, etc.
  • FIG. 1 illustrates that the client devices 104 operated by users of the user community 102 may comprise eBook reader devices (e.g., devices 104 ( 1 ) and 104 ( 2 )), laptop computers (e.g., device 104 ( 3 )), multifunction communication devices (e.g., device 104 (M)), portable digital assistants (PDAs), wireless headsets, entertainment systems, portable media players, tablet computers, cameras, video cameras, netbooks, notebooks, desktop computers, gaming consoles, DVD players, media centers, or any other type of device.
  • the client devices may receive, over a network 108 , one or more content items for presentation on the devices from the content item service 106 .
  • the network 108 is representative of any one or combination of multiple different types of networks, such as the Internet, cable networks, cellular networks, wireless networks, and wired networks.
  • one example of a wireless technology and its associated protocols is the Wireless Fidelity (WiFi) networking technology defined according to IEEE 802.11 standards, while another example is a cellular network.
  • the content item service 106 is embodied as one or more servers that collectively have processing and storage capabilities to receive requests for content items from the devices, such as the eBook reader device 104 ( 1 ).
  • the servers of the content item service 106 may be embodied in any number of ways, including as a single server, a cluster of servers, a server farm or data center, and so forth, although other server architectures (e.g., mainframe) may also be used.
  • the content item service 106 may be embodied as a client device, such as a desktop computer, a laptop computer, an eBook reader device and so forth. In some implementations, for instance, some or all of the elements of content item service 106 illustrated in FIG. 1 may reside on the client devices 104 .
  • the content item service 106 includes a content item distribution system 110 , a content item database 112 , and a content item classifier 114 .
  • the content item distribution system 110 may support distribution of content items (e.g., online retailing via a website) to the client devices 104 .
  • the servers store the content items in the content item database 112 , although in other implementations, the servers merely facilitate purchase and delivery of content items stored in other locations.
  • the content item classifier 114 serves to classify content items by, for example, genre.
  • the classifier 114 may classify content items as relating to fiction, non-fiction, historical, science, science fiction, medicine, business, law, sports, animals, geography, computer science, engineering, chemistry, mathematics, a particular language or any other type of genre, category, or classification.
  • the classifier 114 may reference a prior categorization of the content items within, for example, the content item database 112 .
  • the classifier may classify these content items in other ways, as discussed in detail below.
  • the content item classifier 114 may classify content items as relating to multiple different genres. For instance, an eBook that includes multiple sections may be associated with different genres corresponding to the different sections of the book.
  • a textbook for instance, may include a section classified as relating to mathematics, a section classified as relating to science, and a section classified as relating to medicine.
  • a single section or an entire eBook may also be classified as relating to multiple genres.
  • these genre classifications may be used to determine which category of reference work entry to use when receiving a request for information from a user. For instance, if a user reading the aforementioned textbook requests a definition for a word found within the science section of the book, the device of the user may display a science-related dictionary entry (alone or more prominently than other definitions).
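Per-section classification like the textbook example might be represented as a list of page ranges, each tagged with a genre, so that a lookup can be answered with the entry type of the section being read. The ranges and helper below are invented for illustration.

```python
# Sketch of per-section genre classification (all values invented):
# each section of a single eBook carries its own genre, and a lookup
# uses the genre of the section containing the reading position.
SECTION_GENRES = [
    (1, 120, "mathematics"),   # (start_page, end_page, genre)
    (121, 260, "science"),
    (261, 400, "medical"),
]

def genre_at(page, default="standard"):
    """Return the genre of the section containing the given page."""
    for start, end, genre in SECTION_GENRES:
        if start <= page <= end:
            return genre
    return default
```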
  • the content item classifier 114 may classify content items in a multitude of ways. As illustrated, the content item classifier 114 includes a contents analysis module 116 , a feedback module 118 , and a user analysis module 120 .
  • the contents analysis module 116 may classify content items with reference to the actual contents of the item. Using an eBook as an example, this module 116 may scan the text of the eBook to identify key words and may then compare these identified key words to known, pre-compiled sets of key words associated with different genres.
  • the module 116 may scan contents of an eBook and may identify that the most-frequently used words of the eBook include “medicine,” “doctor,” “Dr.,” “disease,” and “hospital.” As such, the module 116 may compare these key words to sets of key words associated with different genres before concluding that this book should be classified as being within the medical genre. Similarly, the module 116 may analyze images or sounds within a content item and may compare these images or sounds to known sets of images or sounds associated with identified genres.
  • this module 116 may weight certain words more heavily than others. For instance, the module 116 may weight the words of the title more heavily than the words within the chapters of the book. Similarly, the module 116 may assign a larger weight to the name of the author, the identity of the publisher, and the like.
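A minimal sketch of this kind of keyword scan, under the assumption of hand-built keyword sets and a simple weighting that favors title words, might look as follows; the keyword sets, weight value, and scoring rule are all illustrative.

```python
# Hypothetical keyword-based genre scoring: count genre keywords in the
# text, weighting title words more heavily than body words. Keyword
# sets and the weight are invented for illustration.
GENRE_KEYWORDS = {
    "medical": {"medicine", "doctor", "disease", "hospital"},
    "science": {"experiment", "theory", "physics", "chemistry"},
}

def classify(title, body, title_weight=5.0):
    """Return the genre whose keyword set best matches the text."""
    scores = {}
    title_words = title.lower().split()
    body_words = body.lower().split()
    for genre, keywords in GENRE_KEYWORDS.items():
        score = sum(title_weight for w in title_words if w in keywords)
        score += sum(1.0 for w in body_words if w in keywords)
        scores[genre] = score
    return max(scores, key=scores.get)
```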
  • the feedback module 118 serves to classify content items in whole or in part based upon received feedback. For instance, these techniques may include querying human users as to the genre of a content item and using responses from the users as input in determining the genre(s) of the respective content item. Furthermore, this module 118 may also track user actions in order to receive this feedback. For instance, envision that a user requests a definition for the term “boil” found within a particular eBook. Envision also that the classifier 114 has classified this eBook as relating to science and, as such, the user's device displays a science definition of the term, explaining that “boiling” refers to when a liquid changes state to a gas.
  • feedback module 118 may determine (e.g., via an indication received over the network 108 ) that the user requested to see a different definition of the term “boil” (e.g., a medical definition). In this instance, the feedback module 118 may deduce that the eBook should have been classified as being of the “medical” genre rather than the “science” genre.
  • the classifier 114 may assign a confidence level to a particular genre associated with a content item and may alter this genre based on feedback received at the feedback module 118 . For instance, the classifier may determine that the eBook from the example above is 53% likely to be primarily of a “science” genre and 47% likely to be primarily of a “medical” genre. After receiving feedback similar to the feedback from the user discussed immediately above, these percentages may change such that the classifier 114 now judges that the eBook is more likely to relate to medicine than pure science. As such, the classifier 114 may change the assigned genre to “medical” (or may change the primary genre to “medical” while marking “science” as a secondary genre).
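The confidence adjustment described here could be sketched as a small update rule that shifts probability mass toward the genre the feedback favors; the step size and renormalization below are assumptions, not the patent's method.

```python
# Illustrative confidence update: feedback that the user wanted a
# different entry type shifts confidence toward the implied genre.
# The step size and renormalization are assumptions.
def update_confidence(confidence, preferred_genre, step=0.03):
    updated = dict(confidence)
    others = [g for g in updated if g != preferred_genre]
    for genre in updated:
        if genre == preferred_genre:
            updated[genre] = min(1.0, updated[genre] + step)
        elif others:
            updated[genre] = max(0.0, updated[genre] - step / len(others))
    total = sum(updated.values()) or 1.0
    return {g: c / total for g, c in updated.items()}

# e.g. {"science": 0.53, "medical": 0.47} drifts toward "medical"
# after repeated feedback like the "boil" example above.
```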
  • the user analysis module 120 may function to classify content items in whole or in part based upon the identity of the user experiencing the media item. For instance, when the content item distribution system 110 downloads an eBook to the eBook reader device 104 ( 1 ), the module 120 may analyze known information about the user associated with the device by, for instance, referencing a user profile stored in an account of the user at the content item service 106 . The module 120 may then use this known information about the user to help deduce the genre of the eBook.
  • depending on this known information, the module 120 may more heavily weight the chances of this eBook being related to medicine.
  • the user analysis module 120 may similarly use any other known information about the user to help classify content items, including a location of the user, demographic information of the user, an address of the user, and the like.
  • the content item classifier 114 may classify content items as belonging to one or more genres. For instance, individual sections of content items (e.g., chapters, individual songs or tracks, etc.) may each be associated with one or more genres, or an entire content item may be associated with a single or multiple genres. In each instance, the determined genre(s) is helpful to determine the appropriate type or category of reference work entry to use when a user requests information regarding a word, phrase, or topic within the corresponding content item.
  • FIG. 1 illustrates that the eBook reader device 104 ( 1 ) currently displays a fictitious eBook 122 entitled “Secrets to Internal Medicine” by a fictitious author “Dr. Grace Bradley,” which the device 104 ( 1 ) may have downloaded from the content item service 106 .
  • FIG. 1 also illustrates that the content item database 112 stores the same eBook 122 .
  • FIG. 1 illustrates that the content item classifier 114 has classified this eBook 122 as relating to a particular genre 124 .
  • the classifier 114 has determined that the eBook relates to medicine and has classified this book accordingly.
  • the content item database 112 may similarly store multiple other content items along with a notation of the genre(s) of each of these other items.
  • the user of the eBook reader device 104 ( 1 ) has selected (via a highlight 126 ) a particular word (“prognosis”) from the eBook 122 .
  • the eBook reader device 104 ( 1 ) displays a definition 128 of the selected word.
  • the definition 128 of the word comes from a medical dictionary entry, which corresponds to the classification of the eBook 122 as being related to the “medical” genre.
  • this definition 128 states that a “prognosis” is “a forecast of the probable course and/or outcome of a disease.” While this example describes a dictionary, other implementations may employ other types or categories of reference works, a few examples of which are discussed below.
  • FIG. 2 illustrates example components that might be implemented in the eBook reader device 104 ( 1 ) of FIG. 1 that displays information provided by context-sensitive reference works, such as dictionaries or the like.
  • the eBook reader device 104 ( 1 ) is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein.
  • the eBook reader device 104 ( 1 ) includes one or more processing units 202 and memory 204 .
  • the memory 204 (and other memories described throughout this document) is an example of computer storage media and may include volatile and nonvolatile memory.
  • the memory 204 may include, but is not limited to, RAM, ROM, EEPROM, flash memory, or other memory technology, or any other medium which can be used to store media items or applications and data which can be accessed by the eBook reader device 104 ( 1 ).
  • the memory 204 may be used to store any number of functional components that are executable on the processing unit(s) 202 , as well as data and content items that are rendered by the eBook reader device 104 ( 1 ).
  • the memory 204 may store an operating system and an eBook storage database to store one or more content items 206 , such as eBooks, audio books, songs, videos, still images, and the like.
  • the memory 204 may further include a memory portion designated as an immediate page memory to temporarily store one or more pages of an electronic book. Pages are placed in the immediate page memory a short period before a next page request is expected.
  • the term “page,” as used herein, refers to a collection of content that is presented at one time on a display of the eBook reader device 104 ( 1 ).
  • a “page” may be understood as a virtual frame of the content, or a visual display window presenting the content to the user.
  • pages as described herein are not fixed permanently, in contrast to the pages of published “hard” books. Instead, pages described herein may be redefined or repaginated when, for example, the user chooses a different font for displaying the content in the first display.
  • the terms “page views”, “screen views”, and the like are also intended to mean a virtual frame of content.
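Because a page is only a virtual frame, changing the font or display size simply re-runs pagination. A minimal greedy repagination sketch, under the simplifying assumption of a fixed character budget per page rather than real font metrics, is:

```python
# Minimal repagination sketch (assumes a fixed character budget per
# page instead of real font metrics): greedily pack whole words into
# pages, so a new font size just means a new budget.
def paginate(text, chars_per_page):
    pages, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) > chars_per_page and current:
            pages.append(current)   # page full; start a new one
            current = word
        else:
            current = candidate
    if current:
        pages.append(current)
    return pages
```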
  • An interface module 208 may also be provided in memory 204 and may be executed on the processing unit(s) 202 to provide for user operation of the device 104 ( 1 ).
  • One feature of the interface module 208 allows a user to request to receive information from a reference work regarding a word, phrase, or topic found within one of the content items 206 .
  • the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, and the like.
  • the interface module 208 may facilitate textual entry of a request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner.
  • the interface module 208 may provide menus and other navigational tools to facilitate selection and rendering of the content items 206 .
  • the interface module 208 may further include a browser or other application that facilitates access to sites over a network, such as websites or online merchants.
  • a content presentation application 210 renders the content items 206 .
  • the content presentation application 210 may be implemented as various applications depending upon the content items.
  • the application 210 may be an electronic book reader application for rendering electronic books, or an audio player for playing audio books or songs, or a video player for playing video, and so forth.
  • the memory 204 may also store user credentials 212 .
  • the credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206 .
  • the memory 204 also stores one or more reference works 214 , such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like.
  • the memory 204 stores multiple categories of a particular kind of reference work.
  • the memory 204 may store a standard dictionary (e.g., Merriam-Webster® English Dictionary), a medical dictionary, a legal dictionary, a science dictionary, a science-fiction dictionary, an engineering dictionary, a foreign language dictionary, a business dictionary, a chemistry dictionary, a mathematics dictionary, and the like.
  • a single kind of reference work may contain multiple reference work entry types.
  • a single dictionary may store, for one or more of the words therein, a standard dictionary entry, a medical dictionary entry, a legal dictionary entry, a science dictionary entry, and the like.
  • FIG. 2 further illustrates that the memory 204 stores a feedback module 216 that is executable on the processing unit(s) to receive user feedback regarding an outputted reference work entry or a classified genre of a content item. As discussed above, this feedback may be used to help re-classify the genre associated with the content item.
  • the eBook reader device 104 ( 1 ) also stores a reference entry selection module 218 that is executable on the processing unit(s) to select a particular type of reference work entry based on a genre of a content item, a characteristic of a user, or the like. For instance, this module 218 may store or reference a table that maps “content item genres” to “reference work entry types.” Therefore, when the content presentation application 210 outputs a content item of a particular genre and the user requests some reference work information associated with a word, phrase, or topic therein, the module 218 may reference this table to determine the type of entry to output. In some instances, the reference entry selection module 218 may reside on the content item service 106 or in another location, in which case the eBook reader device 104 ( 1 ) may access the module 218 over the network 108 .
  • the module 218 may determine that the application 210 should display a medical definition when receiving a request for a word within an eBook that has been categorized as “medical” in nature.
  • This table may similarly map a “legal” genre to a “legal” reference work entry type, a “sci-fi” genre to a “science” reference work entry type, a “historical fiction,” “British lit” and the like to a “standard” reference work entry type, and so on. In some instances, this table may map combinations of genres to reference work entry types.
  • the table may map an eBook that is associated with both a “medical” genre and a “mystery” genre to a “standard” reference work entry type rather than a “medical” reference work entry type. It is to be appreciated, however, that FIG. 2 simply illustrates several example mappings, and that any type of content item genre may map to any type of reference work entry type in certain implementations.
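The table that module 218 stores or references might be sketched as below, using the example mappings from the text; representing genre combinations as frozen sets and defaulting to a “standard” entry type are assumptions.

```python
# Sketch of the genre-to-entry-type table described above. The mappings
# are the examples from the text; the frozenset representation and the
# "standard" default are assumptions.
GENRE_TO_ENTRY = {
    frozenset({"medical"}): "medical",
    frozenset({"legal"}): "legal",
    frozenset({"sci-fi"}): "science",
    frozenset({"historical fiction"}): "standard",
    frozenset({"British lit"}): "standard",
    frozenset({"medical", "mystery"}): "standard",
}

def entry_type_for(genres):
    """Map a content item's genre(s) to a reference work entry type."""
    return GENRE_TO_ENTRY.get(frozenset(genres), "standard")
```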
  • FIG. 2 further illustrates that the eBook reader device 104 ( 1 ) may include a display 220 , which may be passive, emissive or any other form of display.
  • the display uses electronic paper (ePaper) display technology, which is bi-stable, meaning that it is capable of holding text or other rendered images even when very little or no power is supplied to the display.
  • Some example ePaper-like displays that may be used with the implementations described herein include bi-stable LCDs, MEMS, cholesteric, pigmented electrophoretic, and others.
  • the display may be embodied using other technologies, such as LCDs and OLEDs, and may further include a touch screen interface.
  • a touch sensitive mechanism may be included with the display to form a touch-screen display.
  • the eBook reader device 104 ( 1 ) may further be equipped with various input/output (I/O) components 222 .
  • I/O components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth.
  • a network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth.
  • the network interface 224 may allow a user of the device 104 ( 1 ) to download content items from the content item service 106 , may allow the feedback module 216 to provide received feedback to the service 106 , and the like.
  • the eBook reader device 104 ( 1 ) also includes a battery and power control unit 226 .
  • the battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
  • the eBook reader device 104 ( 1 ) may have additional features or functionality.
  • the eBook reader device 104 ( 1 ) may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
  • the additional data storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • program modules and the like may be executed as native code or may be downloaded and executed, such as in a virtual machine or other just-in-time compilation execution environment.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • An implementation of these modules and techniques may be stored on or transmitted across some form of computer readable media.
  • FIGS. 3-6 illustrate example user interfaces that the eBook reader device 104 ( 1 ) (and the other client devices of the architecture 100 ) may render in accordance with the techniques described above. While these figures illustrate a few example interfaces, it is to be appreciated that numerous other types of interfaces displaying information from numerous other types of reference works may be implemented using the described techniques.
  • FIG. 3 illustrates the example user interface described above with reference to FIG. 1 .
  • the eBook reader device 104 ( 1 ) or the content item service 106 has determined that the eBook 122 currently being read by the user is associated with a “medical” genre.
  • the device displays a “medical” definition of the word rather than a standard or other type of definition.
  • FIG. 3 illustrates that the user has selected (e.g., via a keyboard, cursor, touch screen, etc.) the word “prognosis,” as illustrated by the highlight 126 . While the user selects a word in this example, the user may select a phrase in other embodiments.
  • the device 104 ( 1 ) displays the medical definition 128 of this word. As illustrated, this definition 128 includes an indication 302 that this definition is in fact the medical definition, rather than another type of definition (e.g., a standard definition, a science definition, etc.).
  • the eBook reader device 104 ( 1 ) may display a definition from a dictionary when the user selects a word, although in other implementations the device may display synonyms from a thesaurus, information from an encyclopedia, or information from any other reference work type. In still other implementations, the device 104 ( 1 ) may prompt the user to select the type of the reference work from which the device should display information.
  • FIG. 3 also illustrates that the definition 128 includes an icon 304 (“More”) that, when selected, allows the user to view additional definitions of the word “prognosis.”
  • FIG. 4 illustrates an example user interface rendered by the eBook reader device 104 ( 1 ) after the user has selected to view “more” definitions of the word “prognosis.”
  • the device displays the definition 128 of this word in the medical sense first, followed by a definition 402 of the word in a standard sense, and a definition 404 of the word in a legal sense.
  • the order of the list is also based on the genre of the eBook, with the medical definition appearing first.
  • one or both of the feedback modules 118 and 216 may use the user's selection of the icon 304 as an indication that the eBook or the currently displayed portion of the eBook may need to be re-classified. For instance and as discussed above, this selection may alter the confidence level associated with the currently associated genre.
  • FIG. 5 illustrates another example user interface rendered by the eBook reader device 104 ( 1 ).
  • the device currently displays an eBook 502 comprising a periodical article that has been determined to relate to the genre “business.”
  • the device may display a reference work entry associated with the genre “business.”
  • the user requests (either explicitly or via default settings) to look up the word “bear” in a thesaurus, as indicated by a highlight 504 .
  • the eBook reader device 104 ( 1 ) displays an entry 506 from a thesaurus, the entry comprising synonyms and antonyms.
  • an indication 508 indicates that this entry corresponds to a “business” use of the term “bear,” as the synonyms include “pessimist, cynic, defeatist, misanthrope,” while the antonyms include “bull, optimist.” This is contrasted with the standard use of the term “bear” in the English language, having synonyms such as “stand, stomach, tolerate, abide” and the like.
  • by displaying the business-related entry, the device 104 ( 1 ) is more likely to provide the user with the information that she seeks. Furthermore, the device 104 ( 1 ) also displays the “more” icon 304 to allow the user to view other thesaurus entry types associated with the word “bear” (e.g., the standard use entry, an entry related to animals, etc.).
  • FIG. 6 illustrates another example user interface rendered by the eBook device 104 ( 1 ).
  • the device 104 ( 1 ) is displaying an eBook 602 in the form of an article that has been determined to be associated with a “sports” genre.
  • the user requests to look up the topic “bat” in an encyclopedia, as indicated by a highlight 604 .
  • the device 104 ( 1 ) displays an entry 606 from a sports-related encyclopedia that explains the history and importance of a “baseball bat.”
  • the eBook device 104 ( 1 ) also displays an indication 608 that the entry 606 resides in a sports-related encyclopedia, or that the entry is a sports-related entry in a general encyclopedia.
  • the device also displays the “more” icon that, when selected, causes the device to display other articles associated with the term “bat,” such as an article about the nocturnal mammal.
  • if the eBook 602 had instead been classified as animal-related, the device 104 ( 1 ) may display the animal-related encyclopedia entry first, rather than the illustrated sports-related entry 606 .
  • FIG. 7 illustrates an example process 700 for implementing the techniques described above of providing context-sensitive reference work entries.
  • This process is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the operations represent computer-executable instructions that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
  • process 700 is described with reference to the architecture 100 of FIG. 1 , although other architectures may implement this process.
  • Process 700 includes an operation 702 , which represents classifying a content item as belonging to one or more genres, such as one or more of the genres discussed above.
  • This classifying may include one or a combination of sub-operations 702 ( 1 ), 702 ( 2 ), and 702 ( 3 ).
  • Classifying a content item may include, for instance, analyzing contents of the content item at sub-operation 702 ( 1 ). This may include analyzing a content item for key words and comparing these key words to sets of key words associated with different respective genres.
  • Sub-operation 702 ( 2 ) may include referencing a prior categorization of the content item, such as from an electronic catalog of content items.
  • sub-operation 702 ( 3 ) may include referencing feedback regarding the content item itself, as discussed above.
  • an operation 704 represents determining a reference work entry to use for the content item based at least in part on the classified genre of the item. For instance, if the item has been classified as “legal,” operation 704 may determine that a “legal” reference work entry should be used. Conversely, if the content item is classified as “thriller,” then operation 704 may determine that a “standard” reference work entry should be used.
  • an operation 706 represents receiving a request for information found within a reference work regarding a word, phrase, or topic found within the content item. This may include, for example, receiving a request for a definition of a word from a dictionary, synonyms or antonyms for the word from a thesaurus, information regarding a topic from an encyclopedia, a map from an atlas, or the like.
  • Operation 708 represents selecting a reference work entry from the determined type of reference work entry type. For instance, after the user requests to receive a definition of the word “prognosis” found within a medical-related book, operation 708 may select the medical definition of “prognosis.”
  • an operation 710 represents outputting (visually, audibly, etc.) the selected reference work entry, such as the medical definition of the term “prognosis.” Again, this outputting may comprise outputting multiple definitions of the word in an order based at least in part on the classified genre(s) of the content item. For instance, operation 710 may output multiple definitions of the word “prognosis,” with the medical definition being displayed first or more prominently in the list relative to the other definitions.
  • Operation 712 represents querying whether feedback (e.g., user feedback) has been received in response to the output of the reference work entry. For instance, operation 712 may query whether the user decided to view additional definitions of the word “prognosis.” If so, then this feedback is fed back to the classification block to potentially alter the classification of the content item. If no feedback is received, then the process 700 ends at operation 714 .
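Read end to end, operations 702 through 712 amount to a classify, look up, output, and feed-back loop. The sketch below strings the operations together; every callable is a hypothetical stand-in for the modules described above.

```python
# Hypothetical end-to-end sketch of process 700. Each callable is a
# stand-in: classify_fn (702), entry_type_fn (704), select_entry_fn
# (708), output_fn (710), and feedback_fn (712).
def process_700(item, requested_term, classify_fn, entry_type_fn,
                select_entry_fn, output_fn, feedback_fn):
    genres = classify_fn(item)                           # operation 702
    entry_type = entry_type_fn(genres)                   # operation 704
    # Operation 706: a request arrives for requested_term.
    entry = select_entry_fn(requested_term, entry_type)  # operation 708
    output_fn(entry)                                     # operation 710
    feedback = feedback_fn()                             # operation 712
    if feedback is not None:
        # Feedback loops back into classification for future requests;
        # the keyword argument on the stand-in is an assumption.
        classify_fn(item, feedback=feedback)
```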
  • FIG. 8 is a block diagram of selected modules of an eBook reader device 800 that may implement a touch-sensitive display and that is capable of outputting different reference work entries based on an amount of force applied to the touch-sensitive display.
  • the illustrated eBook reader device 800 may include several similar or identical components as the eBook reader device 104 ( 1 ) described above.
  • the eBook reader device 800 is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein.
  • mobile telephones, tablet computers, laptop computers, desktop computers, personal media players, portable digital assistants (PDAs), or any other type of electronic device may implement the components and techniques described below.
  • the example eBook reader device 800 includes one or more processing units 802 and memory 804 .
  • the eBook reader device 800 may include a touch sensor 806 that enables a user of the device to operate the device via touch inputs.
  • the touch sensor 806 and the display 220 are integral to provide a touch-sensitive display that displays content items (e.g., eBooks) and allows users to navigate the content items via touch inputs on the display.
  • the memory 804 may be used to store any number of functional components that are executable on the processing unit(s) 802 , as well as data and content items that are rendered by the eBook reader device 800 .
  • the memory 804 may store an operating system and an eBook storage database to store the one or more content items 206 described above, such as eBooks, audio books, songs, videos, still images, and the like.
  • a content presentation application 210 renders the content items 206 .
  • the content presentation application 210 may be implemented as various applications depending upon the content items.
  • the application 210 may be an electronic book reader application for rendering electronic books, an audio player for playing audio books or songs, a video player for playing video, and so forth.
  • the memory 804 may also store user credentials 212 .
  • the credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206 .
  • the memory 804 also stores (persistently or temporarily) one or more reference works 214 , such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like.
  • the memory 804 stores multiple categories of a particular kind of reference work.
  • the memory 804 may store a standard dictionary (e.g., Merriam-Webster® English Dictionary), a medical dictionary, a legal dictionary, a science dictionary, a science-fiction dictionary, an engineering dictionary, a foreign language dictionary, a business dictionary, a chemistry dictionary, a mathematics dictionary, and the like.
  • a single kind of reference work may contain multiple reference work entry types.
  • a single dictionary may store, for one or more of the words therein, a standard dictionary entry, a medical dictionary entry, a legal dictionary entry, a science dictionary entry, and the like.
  • the device may store a dictionary that accompanies a particular eBook.
  • the device may store a dictionary that a publisher of a particular eBook creates for that particular eBook or for a particular series of eBooks.
  • the memory 804 may also include the interface module 208 that, as described above, provides for user operation of the device 104 ( 1 ).
  • One feature of the interface module 208 allows a user to request to receive information from a reference work regarding a word, phrase, or topic found within one of the content items 206 .
  • the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, and the like.
  • the interface module 208 may facilitate textual entry of the request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner.
  • the memory 804 also stores a touch-input controller 808 to detect touch inputs received via the touch sensor 806 and, in some instances, to measure a force of the touches.
  • the touch-input controller 808 is configured to detect multiple touches on the touch sensor 806 as well as to measure an amount of force of each of the touches.
  • the eBook reader device 800 also stores a reference entry selection module 810 that is executable on the processing unit(s) to select a particular type of reference work entry in response to receiving an indication of a touch input. For instance, in response to the user selecting a particular portion of a rendered content item via a touch input, the reference entry selection module 810 may select a particular type of reference work entry to output based on a measured force of the touch input. For example, if a user selects a particular word on the touch-sensitive display, the module 810 may map the amount of force of the touch to one of multiple different reference work entries.
  • the module 810 outputs a dictionary definition of the word in response to the user providing a first amount of force, a thesaurus entry for the word in response to the user providing a greater amount of force, and an encyclopedia entry for the word in response to the user providing an even greater amount of force.
  • the module 810 may select and output multiple different reference work entries within a same type of reference work. For instance, the module 810 may output a medical definition of a word in response to a touch input having a first force, a standard definition of the word in response to a touch input having a greater force, and a legal definition of the word in response to a touch input having an even greater force.
  • the reference entry selection module 810 allows a user to toggle through multiple reference work entries for a particular word, phrase, or topic by providing additional force to the touch-sensitive display.
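A minimal sketch of this force-to-entry mapping might bucket the measured force into preconfigured ranges. The force units and thresholds below are assumptions; the patent leaves both unspecified.

```python
# A sketch of module 810's force-range mapping; thresholds are invented.
FORCE_BANDS = [
    (0.0, "dictionary"),    # a first amount of force
    (1.5, "thesaurus"),     # a greater amount of force
    (3.0, "encyclopedia"),  # an even greater amount of force
]

def entry_type_for_force(force):
    """Return the entry type whose force band the measurement falls into."""
    selected = FORCE_BANDS[0][1]
    for lower_bound, entry_type in FORCE_BANDS:
        if force >= lower_bound:
            selected = entry_type
    return selected

assert entry_type_for_force(0.4) == "dictionary"
assert entry_type_for_force(2.0) == "thesaurus"
assert entry_type_for_force(3.2) == "encyclopedia"
```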
  • the order and substance of the outputted reference work entries may be configurable by a user of the device, may be set by a publisher or distributor of a corresponding content item, or the like.
  • multiple other configurations may also be implemented.
  • the reference entry selection module 810 selects a reference work entry to output based on factors in addition to a first touch input. For instance, a user may select a word or other portion of a content item on the touch-sensitive display and, in response, the module 810 may select and output a first reference work entry. Thereafter, the user may provide an additional input and, in response, the module 810 may select and output a different reference work entry.
  • the additional input may comprise an additional touch, the user activating a button on the keypad, the user orally stating a command, or any other type of user input.
  • the user is able to toggle through multiple reference work entries by providing an input to the touch sensor 806 and thereafter providing additional inputs to the touch sensor 806 and/or the interface module 208 .
  • when the touch sensor 806 is capable of detecting and interpreting multiple coincident touches, the user may place a first finger on the touch sensor 806 to cause display of a first reference work entry and then may toggle through other reference work entries by tapping another finger on the touch sensor 806 .
  • the reference entry selection module 810 may select a reference entry to output based on other characteristics. For instance, the module 810 may select a particular entry to output based on a current location of the device, as determined by a GPS system resident on the device, signal triangulation, or any other location-sensing method.
  • FIG. 8 further illustrates that the eBook reader device 800 may include the display 220 , which may be passive, emissive or any other form of display as discussed above. Also as discussed above, the display 220 and the touch sensor 806 may couple to form a touch-sensitive display. That is, the touch sensor 806 may reside underneath or above the display 220 in some instances, or may reside adjacent to the display in other instances.
  • the eBook reader device 800 may further be equipped with various input/output (I/O) components 222 .
  • I/O components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth.
  • a network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth.
  • the network interface 224 may allow a user of the device 800 to download content items from the content item service 106 .
  • the eBook reader device 800 also includes a battery and power control unit 226 .
  • the battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
  • FIG. 9 illustrates an example user interface rendered by the device 800 of FIG. 8 .
  • the device includes a touch-sensitive display 902 that renders an eBook 904 .
  • a user makes a selection of a word on the touch-sensitive display 902 , as illustrated by the highlight 126 .
  • the touch-input controller 808 of the device 800 measures an amount of force 906 of the selection and provides this measured amount to the reference entry selection module 810 .
  • This module 810 may then map this particular amount of force to a particular type of reference work entry to output.
  • the module 810 may output a first type of reference work entry for a touch-input having a measured force within a first range, a second type of reference work entry for a touch-input having a measured force within a second range, and so forth.
  • the controller 808 may measure the amount of force 906 of the initial touch and may use this amount as a baseline for future touches. For instance, the device 800 may select and output a first type of reference work entry in response to detecting a first touch, regardless of the amount of force of the touch. Thereafter, the device 800 may output other reference work entries in response to detecting touches having greater or lesser forces than the initial “baseline” touch.
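The baseline behavior described above can be sketched as a small state machine: the first touch always yields the first entry and records its force, and later touches step forward or backward relative to that baseline. The class name, force units, and entry ordering are illustrative assumptions, not details from the patent.

```python
# State-machine sketch of the "baseline" touch behavior; names invented.
ENTRY_TYPES = ["dictionary", "thesaurus", "encyclopedia"]

class BaselineToggler:
    def __init__(self):
        self.baseline = None  # force of the initial touch, once seen
        self.index = 0        # position in ENTRY_TYPES

    def on_touch(self, force):
        if self.baseline is None:
            self.baseline = force  # initial selection sets the baseline
        elif force > self.baseline:
            self.index = min(self.index + 1, len(ENTRY_TYPES) - 1)
        elif force < self.baseline:
            self.index = max(self.index - 1, 0)
        return ENTRY_TYPES[self.index]

toggler = BaselineToggler()
print(toggler.on_touch(1.0))  # "dictionary" regardless of force
print(toggler.on_touch(2.0))  # "thesaurus" (harder than baseline)
print(toggler.on_touch(2.5))  # "encyclopedia" (harder still)
```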
  • the module 810 may output the entry to the user.
  • the module 810 has overlaid the entry onto the touch-sensitive display 902 , although other implementations may output the entry in other ways.
  • the reference work selection module 810 has selected, based on the amount of force 906 of the touch input, a dictionary entry 908 to output on the display.
  • FIG. 9 illustrates that the device 800 outputs the dictionary entry 908 for the selected word “Mammal.”
  • FIG. 9 also illustrates that the entry 908 provides an indication 910 of the source of the reference work entry.
  • FIG. 10 illustrates an example user interface rendered by the device 800 after the user has increased the amount of force on the selected word.
  • the user now selects the word “mammal” with an amount of force 1002 that is greater than the amount of force 906 .
  • the user may or may not have maintained contact with the touch-sensitive display 902 between the applications of these two forces.
  • the touch-input controller 808 has detected the greater amount of force 1002 and provided this information to the reference entry selection module 810 .
  • the module 810 has mapped this amount of force to a particular reference work entry.
  • the module 810 has selected a thesaurus entry 1004 for output on the touch-sensitive display 902 .
  • the thesaurus entry includes an indication 1006 of the source of the entry, along with both synonyms and antonyms of the word “mammal.” While FIG. 10 illustrates that the device 800 outputs the thesaurus entry 1004 in response to detecting the amount of force 1002 on the location of the display 902 associated with the word “mammal,” other implementations may output any other type of reference work entry. In addition, other implementations may output the thesaurus entry in response to detecting a lesser amount of force.
  • FIG. 11 illustrates yet another example user interface rendered by the device 800 after the user has yet again provided an input having an increased amount of force 1102 on the selected word “mammal.”
  • the touch-input controller 808 measures the increased force 1102 and provides this information to the reference entry selection module 810 , the module 810 selects a third, different type of reference work entry to output on the device.
  • the device 800 outputs an encyclopedia entry 1104 for the word “Mammal,” which again includes an indication 1106 of the source of the entry.
  • the user of the device 800 is able to toggle through multiple different reference work entries associated with a particular word, phrase, image, or other portion of a content item. For instance, the user may toggle through a dictionary entry, a thesaurus entry, an encyclopedia entry, and/or any other type of reference work entry by altering the amount of force applied to the touch-sensitive display 902 .
  • the device 800 may store locally some or all of the underlying reference works, or the device 800 may request and receive the surfaced entries over a network on an on-demand basis.
  • FIG. 12 illustrates another example user interface rendered by the device 800 .
  • the touch-sensitive display 902 of the device 800 renders the eBook 504 discussed with reference to FIG. 5 above.
  • the user selects (via a touch input) the word “bear” on the display 902 , as indicated by the highlight 126 .
  • the touch-input controller 808 measures an amount of force 1202 of the selection.
  • the reference entry selection module 810 outputs a thesaurus entry 1204 for the word “bear,” which includes an indication 1206 of the type of reference work being utilized.
  • the module 810 has additionally determined that the eBook 504 is associated with the genre of “business.”
  • the module 810 may make this determination using any of the techniques described above, such as by referencing a prior categorization of the eBook 504 , by analyzing key words of the eBook 504 , or the like.
  • the reference entry selection module 810 outputs a business-related thesaurus entry 1204 , rather than a standard thesaurus entry.
  • the entry 1204 provides business-based synonyms for the term “bear,” such as pessimist, cynic, and defeatist.
  • the entry 1204 includes an indication 1208 that this entry 1204 corresponds to a “business” use of the term.
  • FIG. 13 illustrates another example user interface rendered by the device 800 after the user has provided additional force on the touch-sensitive display 902 .
  • the touch-input controller 808 measures an amount of force 1302 provided by the additional input and provides this information to the reference entry selection module 810 .
  • the module 810 selects a reference work entry to output on the device 800 .
  • the module 810 again outputs a thesaurus entry 1304 (as shown by indication 1306 ).
  • This thesaurus entry is associated with a standard use of the term “bear” (as shown by indication 1308 ) and, as such, this entry includes synonyms such as carry, convey, and deliver.
  • FIGS. 12-13 illustrate that the user is able to toggle through multiple different context-sensitive reference work entries by providing varying amounts of force on the touch-sensitive display.
  • a user is able to view an initial reference work entry that has been determined to most likely relate to the illustrated eBook, while providing the user the ability to toggle through multiple other entries.
  • the techniques may surface a business-related thesaurus entry for a business-related book, while allowing the user to toggle through other thesaurus entries for a same word by applying an increased or decreased force to the display 902 .
  • FIG. 14 is a flow diagram showing a process 1400 of selecting which of multiple reference work entries to output based on an amount of measured force associated with a selection.
  • This process (as well as the processes described below) is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. While this process illustrates one example order in which the operations of the process may occur, the example order is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
  • process 1400 is described with reference to the device 800 of FIG. 8 , although other devices and architectures may implement this process.
  • the device 800 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 800 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a word or phrase, a location on a map, a portion of an image, or any other portion of any type of content item.
  • the device 800 causes display of a first reference work entry at least partly in response to detecting the initial selection. For instance, the device 800 may display the first reference work entry on the touch-sensitive display of the device, on another display of the device, or on another display of a different device. Further, in some instances, the device 800 selects the first reference work entry based at least in part on the measured force of the initial selection. In other instances, meanwhile, the device 800 selects the first reference work entry regardless of an amount of force. In these latter instances, the measured force of the initial selection may act as a baseline for future touch inputs.
  • the displayed first reference work entry may also be based on a location of the initial selection on the touch sensor. For instance, if a user provides the touch input on a portion of a touch-sensitive display displaying the word “mammal,” the device may output a reference work entry (e.g., a dictionary entry, an encyclopedia entry, etc.) associated with the word “mammal.”
  • the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
  • the device 800 causes display of a second, different reference work entry.
  • the device may display this entry on any display, such as on the touch-sensitive display of the device 800 . Further, this entry may be selected based on an amount of force of the selection and/or based on a difference in force between the initial selection and the greater/lesser force. This second, displayed entry may also be based on the portion of the content item that the user selects.
  • the device 800 may output a thesaurus entry for the word “mammal” in response to the user providing a greater amount of force at the location of the display displaying this word.
  • FIG. 15 illustrates yet another example user interface rendered by the device 800 of FIG. 8 after a user has made a selection of a word on the touch-sensitive display 902 of the device 800 .
  • the device 800 currently outputs the eBook 904 and the user provides a first input 1502 in the form of a touch on the touch-sensitive display 902 .
  • the device 800 outputs a reference work entry associated with the selected word.
  • the device 800 again displays a dictionary entry 908 for the selected word “mammal.”
  • FIG. 16 illustrates an example user interface rendered by the device 800 after the user provides an additional input 1602 .
  • This additional input may comprise additional force on the touch-sensitive display, a touch on the display that is coincident with the initial touch input, activation of a key on a keypad of the device 800 , an oral command spoken by the user and captured by a microphone of the device 800 , or any other type of input.
  • in response to the additional input 1602 , the device 800 outputs a second, different reference work entry 1004 .
  • the device 800 may select this entry 1004 based on actual contents of the additional input 1602 , or the device 800 may select this entry 1004 based on the entry 1004 being “next in line” for display.
  • FIGS. 15-16 illustrate that a user of the device 800 is able to toggle through multiple reference work entries associated with a portion of a content item by providing a touch input and, thereafter, providing other preconfigured inputs.
  • a user of the device may view a dictionary entry of a word by providing an initial touch input on the touch-sensitive display of the device. Thereafter, the user may toggle through multiple other reference work entries associated with the word by providing a second, coincident touch on the display.
  • the user could view a thesaurus entry by maintaining contact with her finger on the display and providing a first tap, an encyclopedia entry by providing a second tap, search engine results by providing a third tap, and so forth.
  • the user could toggle through multiple different reference work entries by providing a touch input and then selecting a particular button on a keypad of the device. For instance, after providing a touch input, the user could select the “T” on the keypad to view the thesaurus entry for the word, the “E” on the keypad to view the encyclopedia entry of the word, and so forth. Alternatively, the user may select a single button that toggles through these reference work entries in a preconfigured order.
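One way to picture this keypad variant is a simple dispatch table. The key bindings and the “NEXT” toggle name below are hypothetical examples, not bindings defined by the patent.

```python
# Dispatch-table sketch of the keypad variant; bindings are invented.
from itertools import cycle

DIRECT_KEYS = {"D": "dictionary", "T": "thesaurus", "E": "encyclopedia"}
toggle_order = cycle(["dictionary", "thesaurus", "encyclopedia"])

def entry_for_key(key):
    """Map a key press (after a touch selection) to an entry to display."""
    if key in DIRECT_KEYS:
        return DIRECT_KEYS[key]     # jump straight to a reference work
    if key == "NEXT":
        return next(toggle_order)   # single button cycles in a fixed order
    return None                     # ignore unrelated keys

print(entry_for_key("T"))     # thesaurus
print(entry_for_key("NEXT"))  # dictionary, then thesaurus on the next press
```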
  • the user could toggle through multiple different reference work entries by providing a touch input and then providing an oral command. For instance, the user could say aloud “thesaurus,” “encyclopedia,” “map,” and so forth.
  • the device 800 may process this request and output the requested reference work entry.
  • the user may provide a single command (e.g., “next”) that, when received, causes the device to toggle through the reference work entries in a preconfigured order.
  • FIG. 17 is a flow diagram showing a process 1700 of outputting a first reference work entry in response to receiving a touch selection on a display and, thereafter, outputting a second, different reference work entry after receiving an additional input.
  • the device 800 receives a first user input on a touch-sensitive display 902 of the device 800 .
  • the user may select a particular portion of a content item that the device outputs, such as a particular word or phrase of an eBook.
  • the device causes display of a first reference work entry associated with the selected portion of the content item. For instance, if the user selects the word “mammal,” the device 800 may display a dictionary entry or other reference work entry for this word. Conversely, if the user selects the word “Los Angeles,” the device 800 may display the location of this city on a map.
  • the user provides and the device receives a second user input, which may or may not be coincident with the first input provided on the touch-sensitive display.
  • the user may provide additional force on the display, may provide an additional point of contact or touch on the display, may activate a button on a keypad of the device, may orally speak a command, and so forth.
  • in response to receiving the second input, the device 800 causes display of a second, different reference work entry.
  • this reference work entry is also associated with the initially selected portion of the content item.
  • the device 800 may display a thesaurus entry for a selected word after displaying a dictionary entry for the word at the operation 1704 .
  • FIG. 18 illustrates an example user interface rendered by the device 800 of FIG. 8 after a user has made a selection of a word on the touch-sensitive display 902 of the device 800 .
  • the device 800 measures a force of the selection and determines whether to output a particular reference work entry in response to the selection, or whether to enable the user of the device 800 to select a reference work entry for output. For instance, the device 800 may determine whether a provided force is greater or less than a threshold force and, in response to this determination, may output the particular reference work entry or may enable the user to make the selection.
  • FIG. 18 illustrates that the device 800 outputs a reference work entry (e.g., the dictionary entry 908 ) after determining that an amount of force 1802 of this selection is less than a threshold force.
  • This threshold force may be configurable by a user of the device 800 in some instances.
  • FIG. 19 illustrates an example user interface rendered by the device 800 after the device determines that an amount of force of this selection is greater than the threshold force.
  • the device 800 enables the user to select a reference work entry to output, each of which may be associated with the word “mammal.”
  • the device 800 displays a menu 1902 with selectable icons, such as “dictionary entry,” “thesaurus entry,” and the like.
  • the user may select one of the icons to cause the device 800 to display the corresponding reference work entry. For instance, the user may select “encyclopedia entry” to view an encyclopedia entry for the word “mammal.”
  • FIGS. 18-19 thus illustrate that the device 800 may determine whether to output an entry or whether to enable a user to select an entry based on an amount of force of a user's selection. While these figures illustrate that the device enables user selection in response to detecting a force that is greater than a threshold force, in other instances the device may operate in an opposite manner. That is, the device 800 may enable the user to select a reference work entry in response to detecting a force that is less than a threshold force. Similarly, in these instances, the device 800 may output a particular reference work entry in response to detecting a force that is greater than the threshold force.
  • FIG. 19 illustrates one example manner that the user may select a reference work entry for output
  • the device 800 may enable this selection in a number of other ways.
  • the device 800 may allow the user to orally state which reference work entry to output (e.g., “thesaurus”) or may allow the user to make this selection in any other manner.
  • the device may output a first entry in response to a first amount of force, a second entry in response to a second, greater amount of a force, and the selectable menu 1902 in response to a third, even greater amount of force.
  • FIG. 20 is a flow diagram showing a process 2000 of determining whether to output a reference work entry or whether to allow a user to select which reference work entry to output based on an amount of force of a selection.
  • the device 800 and/or a component in communication with the device 800 may perform this process in some instances.
  • the device detects a selection on a touch-sensitive display of the device. For instance, the device may detect the user providing a touch input via her finger, a stylus, or the like. This selection may be associated with a particular portion of a content item being rendered by the device, such as a particular word or phrase of an eBook.
  • the device measures an amount of a force of the selection.
  • the device determines whether the measured amount of force is greater than or less than a preconfigured threshold value.
  • if the measured amount of force is less than the threshold value, the device causes display of a reference work entry on the display at operation 2008 .
  • if the measured amount of force is greater than the threshold value, the device 800 enables the user to select an entry at operation 2010 .
  • the device 800 may output the selectable menu 1902 or may otherwise allow the user to select which reference work entry to display on the device.
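Assuming the FIG. 18-19 convention (a sub-threshold force displays an entry directly, while a greater force surfaces the menu 1902), process 2000 might look like the sketch below. The threshold value and helper names are invented for illustration, and the description notes the comparison could equally run the other way.

```python
# Sketch of process 2000 under the FIG. 18-19 convention.
FORCE_THRESHOLD = 2.0  # hypothetical, possibly user-configurable

def default_entry(word):
    return f"Dictionary entry for '{word}'"

def handle_selection(word, force):
    if force < FORCE_THRESHOLD:
        return ("display_entry", default_entry(word))       # operation 2008
    return ("show_menu", ["dictionary entry",                # operation 2010
                          "thesaurus entry",
                          "encyclopedia entry"])

print(handle_selection("mammal", 1.2))  # direct display
print(handle_selection("mammal", 3.1))  # selectable menu like menu 1902
```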
  • FIG. 21 is a block diagram of selected modules of an eBook reader device 2100 that may implement a touch-sensitive display and that is capable of outputting different content based on an amount of force applied to the touch-sensitive display.
  • the illustrated eBook reader device 2100 may include several similar or identical components as the eBook reader devices 104 ( 1 ) and 800 described above.
  • the eBook reader device 2100 is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein.
  • mobile telephones, tablet computers, laptop computers, desktop computers, personal media players, portable digital assistants (PDAs), kiosks, or any other type of electronic device may implement the components and techniques described below.
  • the example eBook reader device 2100 includes one or more processing units 2102 and memory 2104 .
  • the eBook reader device 2100 may include the touch sensor 806 that enables a user of the device to operate the device via touch inputs.
  • the touch sensor 806 and the display 220 are integral to provide a touch-sensitive display that displays content items (e.g., eBooks) and allows users to navigate the content items via touch inputs on the display.
  • the memory 2104 may be used to store any number of functional components that are executable on the processing unit(s) 2102 , as well as data and content items that are rendered by the eBook reader device 2100 .
  • the memory 2104 may store an operating system and an eBook storage database to store the one or more content items 206 described above, such as eBooks, audio books, songs, videos, still images, and the like.
  • a content presentation application 210 renders the content items 206 .
  • the content presentation application 210 may be implemented as various applications depending upon the content items.
  • the application 210 may be an electronic book reader application for rendering electronic books, an audio player for playing audio books or songs, a video player for playing video, and so forth.
  • the memory 2104 may also store user credentials 212 .
  • the credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206 .
  • the memory 2104 also stores (persistently or temporarily) one or more reference works 214 , such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like. In some instances, the memory 2104 stores multiple categories of a particular kind of reference work as described above.
  • the memory 2104 may also include the interface module 208 that, as described above, provides for user operation of the device 2100 .
  • One feature of the interface module 208 allows a user to request to receive information (e.g., from a reference work, from the Web, etc.) regarding a word, phrase, or topic found within one of the content items 206 .
  • the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, search results from the Web, and the like.
  • the interface module 208 may facilitate textual entry of the request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner.
  • the memory 2104 also stores the touch-input controller 808 to detect touch inputs received via the touch sensor 806 and, in some instances, to measure a force of the touches.
  • the touch-input controller 808 is configured to detect multiple touches on the touch sensor 806 as well as to measure an amount of force of each of the touches.
  • the eBook reader device 2100 also stores a term aggregation module 2106 and a search refinement module 2108 .
  • the term aggregation module 2106 is executable on the processing unit(s) to aggregate instances of a selected term and output one or more of these instances, such as on the display 220 . For instance, in response to a predefined user gesture, the module 2106 may aggregate or otherwise determine each instance of a selected term within an eBook that the device 2100 currently renders, within other eBooks stored on the device 2100 , and/or within available eBooks generally. The module 2106 may then output a visual menu that indicates these other instances of the selected term.
  • the user of the device 2100 may select a particular word in an illustrated eBook with a particular amount of force.
  • the device 2100 may output information about the selected term. For instance, the device 2100 may illustrate a pop-up menu showing an entry associated with the selected word from one of the reference works 214 , such as a dictionary entry. Thereafter, the user may provide a greater or lesser amount of force to the selected word on the display 220 and, in response, the touch-input controller 808 may provide a notification of this greater or lesser force to the term aggregation module 2106 . In response to this notification, the term aggregation module 2106 may output other instances of the selected word within the illustrated eBook, as described in detail below.
  • the term aggregation module 2106 may, in response to the greater or lesser force, output other instances of the selected word within other eBooks stored on the device 2100 and/or other eBooks that are not stored on the device 2100 . While a few example configurations have been described, multiple other configurations may also be implemented.
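A toy sketch of this aggregation follows, using an invented in-memory library in place of the device's eBook storage: matching passages are collected per book and then across the whole library.

```python
# Toy aggregation over an invented in-memory "library"; a real device
# would search its eBook storage instead.
LIBRARY = {
    "Hamlet": ["Enter Ophelia.", "No mention here.", "Ophelia! Nymph..."],
    "Another Play": ["Ophelia appears once."],
}

def instances_in_book(term, book):
    """Return (passage index, passage) pairs in one book mentioning the term."""
    return [(i, passage) for i, passage in enumerate(LIBRARY[book])
            if term.lower() in passage.lower()]

def instances_in_library(term):
    """Aggregate matches across every book available to the device."""
    return {book: instances_in_book(term, book) for book in LIBRARY}

print(instances_in_book("Ophelia", "Hamlet"))   # menu 2304-style listing
print(instances_in_library("Ophelia"))          # menu 2404-style listing
```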
  • FIG. 21 further illustrates that the memory 2104 may store a search refinement module 2108 .
  • the search refinement module 2108 is executable on the processing unit(s) 2102 to refine a user's search based on varying magnitudes of force associated with a selection of the user. For instance, a user may request that the device 2100 perform a local or remote search by selecting a particular word of an illustrated eBook to use as a query for the search. This search may comprise a search of eBooks or other content items stored on the device, a search of results available on the Web, or the like.
  • the search refinement module 2108 may refine the search by, for instance, expanding or narrowing the search. For instance, the search refinement module 2108 may expand a search in response to the user providing a lesser amount of force on the touch-sensitive display, and/or may narrow a search in response to the user providing a greater amount of force, each of which are described in more detail below.
  • the eBook reader device 2100 may further be equipped with various input/output (I/O) components 222 , similar to the devices 104 ( 1 ) and 800 described above. Such components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth.
  • a network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth.
  • the network interface 224 may allow a user of the device 800 to download content items from the content item service 106 .
  • the eBook reader device 2100 also includes a battery and power control unit 226 .
  • the battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
  • FIG. 22 illustrates an example user interface rendered by the device 2100 of FIG. 21 .
  • the device includes the touch-sensitive display 902 that renders an eBook 2202 , which in this example comprises the play “Hamlet” by William Shakespeare.
  • a user makes a selection of a word (“Ophelia”) on the touch-sensitive display 902 , as illustrated by the highlight 126 .
  • the touch-input controller 808 of the device 2100 measures an amount of force 2204 of the selection.
  • the device 2100 may output information regarding the selected word.
  • the device 2100 may output a reference work entry associated with the selected word.
  • the device 2100 outputs an entry 2206 describing the role of the selected character, Ophelia, within the illustrated eBook 2202 .
  • the entry 2206 also includes an icon 2208 (entitled “More”) that, when selected, expands or otherwise provides additional information from the entry 2206 .
  • the touch-input controller 808 may provide an indication of the measured amount of force to the reference entry selection module 810 shown in FIG. 8 , which may map the detected amount of force to a particular type of reference work entry to output.
  • the module 810 may output a first type of reference work entry for a touch-input having a measured force within a first range, a second type of reference work entry for a touch-input having a measured force within a second range, and so forth.
  • the controller 808 may measure the amount of force 2204 of the initial touch and may use this amount as a baseline for future touches. For instance, the device 2100 may select and output a first type of information (e.g., entry 2206 , Web search results for “Ophelia,” etc.) in response to detecting a first touch, regardless of the amount of force of the touch. Thereafter, the device 2100 may provide different outputs in response to detecting touches having greater or lesser forces than the initial “baseline” touch, as described immediately below.
  • FIG. 23 illustrates an example user interface rendered by the device 2100 after the user has increased the amount of force on the selected word.
  • the user now selects the word “Ophelia” with an amount of force 2302 that is greater than the amount of force 2204 .
  • the user may or may not have maintained contact with the touch-sensitive display 902 between the applications of these two forces.
  • the touch-input controller 808 has detected the greater amount of force 2302 and has provided this information to the term aggregation module 2106 .
  • the module 2106 may determine whether the increased force is greater than a threshold and, if so, may output information that is different than the entry 2206 of FIG. 22 .
  • the module 2106 may output other instances of the selected term “Ophelia,” either within the eBook 2202 , within other eBooks stored on the device 2100 , or within other content items.
  • the module 2106 has output a menu 2304 of other instances of the term “Ophelia” within the illustrated eBook 2202 , Hamlet.
  • each listing of the instance of the selected term may be selectable by the user, such that a selection of that instance may cause the device 2100 to navigate the user to that location in the eBook 2202 or may cause an overlay of that location in the eBook 2202 .
  • the menu 2304 displays other mentions or instances of the term “Ophelia” starting with the beginning of the play. Conversely, the menu 2304 could depict instances of the selected term that are nearest the selected instance, or may surface these other instances in any other manner. In either instance, the menu 2304 may include the icon 2208 that, when selected, causes the device 2100 to illustrate more instances of “Ophelia” and their corresponding locations.
  • while FIG. 23 illustrates that the device 2100 outputs the menu 2304 in response to detecting the amount of force 2302 on the location of the display 902 associated with the word “Ophelia,” other implementations may output this menu 2304 based on detecting a force at another location on the display 902 .
  • FIG. 24 illustrates yet another example user interface rendered by the device 2100 after the user has yet again provided an input having an increased amount of force 2402 on the selected word “Ophelia.”
  • the touch-input controller 808 measures the increased force 2402 and provides this information to the term aggregation module 2106 .
  • the module 2106 outputs still different information regarding the selected term.
  • the module 2106 outputs a menu 2404 of other mentions of the term “Ophelia” within other eBooks or content items stored on or accessible by the device 2100 of the user. That is, the menu 2404 may include indications of other works that the user of the device 2100 has previously purchased and/or obtained (e.g., downloaded).
  • the menu 2404 indicates that the term “Ophelia” shows up in three eBooks to which the device has access.
  • the user may select different ones of the listings, which in turn may cause the device to display the selected eBook, potentially at the first use of the term “Ophelia” within that book.
  • while FIG. 24 illustrates other mentions or instances of the selected term within items to which the user of the device has access, the menu 2404 may include other items that the user currently does not have access to.
  • the menu could provide icons that, when selected, initiate a request for a sample of an item or a request to obtain (e.g., purchase) the item. As such, the user may be able to obtain other items that are associated with or otherwise reference the term “Ophelia,” if the user so desires.
  • FIG. 25 illustrates an example user interface rendered by the device 2100 after the user has yet again increased an amount of force 2502 on the selected word.
  • the device 2100 outputs search results 2504 associated with a query comprising the selected word (“Ophelia”) based on this even greater amount of force 2502 .
  • the search results may comprise Web results provided by a search engine.
  • the user of the device 2100 is able to toggle through different information associated with a particular term (e.g., word or phrase) by altering the amount of force applied to the touch-sensitive display 902 .
  • the user may request to view other instances of a selected term by providing a greater or lesser amount of force to the display 902 than an initial selection.
  • while FIGS. 22-25 illustrate an example navigation path associated with four different levels of force, other implementations may employ any other combination of information associated with the selected term.
  • both the navigation path (i.e., the surfaced information associated with the selected term) and the force thresholds for navigating this path may be configured by the user in some implementations.
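The four-level path of FIGS. 22-25 can be sketched the same way as the earlier force-band mapping; the thresholds and level names below are assumptions for illustration only.

```python
# The four-level navigation path of FIGS. 22-25 as a force-band lookup.
NAVIGATION_PATH = [
    (0.0, "reference_entry"),     # FIG. 22: entry 2206 for "Ophelia"
    (1.5, "in_book_instances"),   # FIG. 23: menu 2304
    (3.0, "library_instances"),   # FIG. 24: menu 2404
    (4.5, "web_search_results"),  # FIG. 25: results 2504
]

def info_level_for_force(force):
    level = NAVIGATION_PATH[0][1]
    for threshold, name in NAVIGATION_PATH:
        if force >= threshold:
            level = name
    return level

assert info_level_for_force(2.0) == "in_book_instances"
assert info_level_for_force(5.0) == "web_search_results"
```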
  • FIG. 26 is a flow diagram showing a process 2600 of causing display of other instances of a selected term based on a force associated with a selection.
  • This process (as well as the processes described below) is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. While this process illustrates one example order in which the operations of the process may occur, the example order is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
  • process 2600 is described with reference to the device 2100 of FIG. 21 , although other devices and architectures may implement this process.
  • the device 2100 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 2100 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a term (e.g., word or phrase), a location on a map, a portion of an image, or any other portion of any type of content item. In this example, the user selects a particular term.
  • the device 2100 causes display of information associated with the selected term at least partly in response to detecting the initial selection. For instance, the device 2100 may display a reference work entry or other information associated with the term on the touch-sensitive display of the device, on another display of the device, or on another display of a different device. Further, in some instances, the device 2100 selects the information (e.g., the reference work entry) based at least in part on the measured force of the initial selection. In other instances, meanwhile, the device 2100 selects the information to surface regardless of an amount of force. In these latter instances, the measured force of the initial selection may act as a baseline for future touch inputs.
  • the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
  • the device 2100 causes display of one or more other instances of the selected term. Again, the device may display this entry on any display, such as on the touch-sensitive display of the device 2100 . Further, the other instances of the selected term may comprise other instances of the term within the illustrated content item, within other content items associated with the device 2100 , or within other content items generally. In some implementations, the device 2100 determines which of these “other instances” to surface based on an amount of force of the selection and/or based on a difference in force between the initial selection and the greater/lesser force.
  • a first greater amount of force may result in the device 2100 surfacing other instances of the selected term within the currently illustrated eBook, while a second, greater amount of force may result in the device 2100 surfacing other instances of the selected term within the other eBooks stored on or accessible by the device.
  • Other examples are also possible.
  • FIG. 27 illustrates an example user interface rendered by the device 2100 after a user has made a selection of a word within the eBook 2202 being rendered on the touch-sensitive display 902 of the device 2100 .
  • the user provides a touch of a certain threshold amount of force 2702 on the display, which the touch-input controller 808 detects.
  • the device 2100 outputs a menu 2704 requesting that the user select whether to view previous or subsequent instances of the selected word within the illustrated content item.
  • the menu comprises a left-facing arrow 2706 that, when selected, causes the device to render instances of “Ophelia” within Hamlet that occur prior to a current reading location of the user.
  • the menu 2704 also includes a right-facing arrow 2708 that, when selected, causes the device to render instances of “Ophelia” within Hamlet that occur subsequent to a current reading location of the user. While FIG. 27 illustrates arrows, the menu 2704 may include any other icons to allow the user to select whether to surface previous instances of a selected term, subsequent instances, or both.
  • FIG. 28 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 2802 that both selects a word within a content item and requests to view subsequent instances of the selected word within the illustrated content item.
  • the gesture 2802 comprises the user pressing on a particular term of the eBook 2202 (“Ophelia”) with a certain amount of force 2702 , while also swiping her finger to the right.
  • the device 2100 interprets this gesture to mean that the user wishes to view instances of “Ophelia” within Hamlet that occur subsequent to a current reading location of the user within the eBook.
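A sketch of interpreting this press-and-swipe gesture follows. The force threshold, the swipe encoding, and the return labels are invented for the example rather than specified by the patent.

```python
# Sketch of gesture 2802: a press above a force threshold selects the
# word, and the horizontal swipe direction picks which instances to show.
def interpret_gesture(force, swipe_dx, force_threshold=2.0):
    """Return the requested instance view, or None for an ordinary touch."""
    if force < force_threshold:
        return None                      # not a term-navigation gesture
    if swipe_dx > 0:
        return "subsequent_instances"    # rightward swipe (arrow 2708)
    if swipe_dx < 0:
        return "previous_instances"      # leftward swipe (arrow 2706)
    return "show_direction_menu"         # forceful press alone (menu 2704)

print(interpret_gesture(2.5, swipe_dx=40))   # subsequent_instances
print(interpret_gesture(2.5, swipe_dx=0))    # show_direction_menu
```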
  • FIG. 29 illustrates an example user interface rendered by the device 2100 after the user makes the gesture 2802 in FIG. 28 .
  • the device 2100 surfaces a menu 2902 that indicates that the illustrated eBook 2202 does not contain any subsequent instances of the selected term.
  • the menu 2902 includes a link 2904 that, when selected, allows the user to view previous instances of the term “Ophelia” within the illustrated eBook 2202 .
  • FIG. 30 is a flow diagram showing a process 3000 of causing display of information associated with a portion of a content item in response to receiving a touch selection of the portion on a display and, thereafter, outputting other instances of the portion after receiving an additional input.
  • the process 3000 includes, at 3002 , receiving a first user input selecting a portion of a content item being output on a touch-sensitive display. For instance, the user may select a particular term of an eBook, a geographical location on a map, a character in an illustration, or any other defined portion of a content item.
  • the device causes display of information associated with the selected portion. For instance, the device may cause display of a reference work entry associated with a selected word or phrase, a biography of a selected character or person, a location on a map of a selected geographical location, or any information associated with the selected portion.
  • the device receives a second user input requesting to view other instances of the selected portion of the content item within the content item. For instance, the user may provide a greater or lesser amount of force on the selected portion, as discussed. Additionally or alternatively, the user may select a particular button on the keypad of the device, may speak an oral command, may provide a second and coincident touch on the display, or the like.
  • the device may cause display of other instances of the selected portion within the content item. For instance, the device may indicate other textual or graphical locations where the selected portion occurs within the content item. This may include other textual mentions of the portion, other instances of the selected portion within an image, photograph, or map, or any other type of instances of the selected portion of the item.
  • FIG. 31 illustrates an example user interface rendered by the device 2100 after the user selects a word, via a touch input 3102 , within the illustrated eBook 2202 .
  • the device 2100 may form a query comprising the selected word and may perform a local and/or remote search based on the query. For instance, the device 2100 may pass the query to a search engine to run a Web search on the selected term “Ophelia” and may thereafter output search results 3104 . While the device may conduct a Web search with use of the selected term, the device 2100 may alternatively conduct a local search of the device 2100 and/or a search of one or more other defined devices or domains.
  • FIG. 32 illustrates an example user interface rendered by the device 2100 after the user provides a greater amount of force 3202 to the selected word.
  • the touch-input controller 808 detects this greater amount of force 3202 and passes an indication of the increased force to the search refinement module 2108 .
  • the search refinement module 2108 may refine (e.g., narrow) the search results 3104 .
  • the search refinement module 2108 may form another, narrower query in response to receiving this indication and may thereafter display search results 3204 associated with this narrower query.
  • the narrowed query comprises the selected term and at least one other term associated with the content item being output by the touch-sensitive display.
  • the additional term may comprise at least a portion of a title of the content item, an author of the content item, a topic of the content item, a categorization of the content item, or any other information associated with the content item, the exact selection of which may or may not be configurable by the user.
  • the module 2108 conducts a search for a query comprising the selected term (“Ophelia”) and a title of the illustrated eBook 2202 from which the term was selected (“Hamlet”). As such, the device 2100 outputs search results 3204 for the query “Ophelia Hamlet.”
  • the search refinement module 2108 may broaden or narrow displayed search results based on crowd-sourcing—that is, with reference to the navigation of users having previously navigated search results associated with the illustrated search.
  • the module 2108 may reference previous navigation of users that have conducted a search from within a particular content item, such as the illustrated play “Hamlet.” For instance, if a large number of users selected a certain set of items in search results when conducting the search “Ophelia” within Hamlet (or within related works), then the module 2108 may highlight these popular items in response to receiving a user request to narrow a search for “Ophelia” within the eBook 2202 (Hamlet).
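One plausible (and purely hypothetical) reading of this crowd-sourcing behavior is a popularity re-ranking, sketched below with fabricated selection counts and result labels; a real system would log prior users' choices.

```python
# Popularity re-ranking sketch; counts and result labels are fabricated.
PRIOR_SELECTIONS = {
    "Ophelia (Hamlet character)": 980,
    "Ophelia (painting)": 430,
    "Ophelia (band)": 12,
}

def rank_by_popularity(results):
    """Order results so items popular with prior searchers come first."""
    return sorted(results, key=lambda r: PRIOR_SELECTIONS.get(r, 0),
                  reverse=True)

results = ["Ophelia (band)", "Ophelia (painting)",
           "Ophelia (Hamlet character)"]
print(rank_by_popularity(results))
```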
  • the user may broaden a search by lessening the amount of force provided on the touch-sensitive display 902 . For instance, if the user were to lessen the amount of force from the amount 3202 to the amount 3102 , then the search refinement module 2108 may broaden the search by forming a query “Ophelia” rather than “Ophelia Hamlet.” In response, the device 2100 may display the search results 3104 .
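Putting the narrowing and broadening behaviors together, a minimal sketch might derive the query directly from the measured force. The threshold and the choice of the title as the added context term are assumptions; as noted above, the added term could equally be the author, topic, or categorization of the content item.

```python
# Force-driven query refinement sketch; threshold and context term assumed.
def build_query(term, force, book_title, narrow_threshold=2.0):
    """Return a query whose specificity tracks the measured touch force."""
    if force >= narrow_threshold:
        return f"{term} {book_title}"  # narrowed, e.g., "Ophelia Hamlet"
    return term                        # broadened back to just "Ophelia"

print(build_query("Ophelia", force=1.0, book_title="Hamlet"))  # "Ophelia"
print(build_query("Ophelia", force=3.0, book_title="Hamlet"))  # "Ophelia Hamlet"
```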
  • FIG. 33 is a flow diagram showing a process 3300 of refining search results on a touch-sensitive display based on the device detecting a greater or lesser amount of force on a selected term.
  • the process 3300 is described with reference to the device 2100 of FIG. 21 , although other devices and architectures may implement this process.
  • the device 2100 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 2100 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a term (e.g., word or phrase), a location on a map, a portion of an image, or any other portion of any type of content item. In this example, the user selects a particular term.
  • the device 2100 forms a query comprising (e.g., consisting of) the selected term and thereafter causes display of search results associated with this term.
  • the search results are local, remote, web-based, or any combination thereof.
  • the device may submit the formed query to a remote search engine in some examples.
  • the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
  • the device 2100 refines the displayed search results.
  • the search refinement module 2108 may form another query and may surface search results associated with this broader or narrower query.
  • FIG. 34 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 3402 that both selects a word within the illustrated eBook 2202 and requests to narrow illustrated search results associated with the selected word.
  • the user may have previously requested to view search results associated with the word “Ophelia” and, in response, the device 2100 illustrates corresponding search results 3104 .
  • the user performs the gesture 3402 , which in this example comprises the user swiping downwards with reference to controls of the device 2100 or the orientation of the eBook 2202 .
  • FIG. 35 illustrates an example user interface rendered by the device 2100 after the user performs the gesture 3402 .
  • the device has narrowed the search results in response to detecting the gesture.
  • the search refinement module 2108 has formed a query (“Ophelia Hamlet”) that is narrower than the previous query (“Ophelia”).
  • the device 2100 now displays search results 3204 associated with the narrower query.
FIG. 36 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 3602 that both selects a word within the illustrated eBook 2202 and requests to expand illustrated search results associated with the selected word. Here, the device 2100 currently displays the search results 3204 discussed immediately above with reference to FIG. 35. In this example, the gesture 3602 comprises the user performing an upwards swipe.
FIG. 37 illustrates an example user interface rendered by the device 2100 after the user performs the upward swipe gesture of FIG. 36. As illustrated, the device has expanded the search results in response to detecting the gesture. Specifically, the search refinement module 2108 has surfaced the search results 3104 associated with the query "Ophelia," rather than the previously surfaced results associated with the narrower query "Ophelia Hamlet."

In each of these examples, the user has been able to quickly and efficiently narrow and/or broaden search results with use of predefined and intuitive gestures. While the foregoing figures have provided a few predefined gestures and corresponding directionality that may be suitable for refining search results, multiple other gestures and/or directions may be used in other implementations.
FIG. 38 is a flow diagram showing a process 3800 for refining search results based at least in part on detecting a predefined gesture on a touch-sensitive display. First, the device 2100 detects an initial selection of a term in a content item, such as the eBook 2202. In response, the device 2100 may form a query and may display search results associated with the query at 3804. During the display of the search results, the device 2100 may detect a predefined gesture on the touch-sensitive display 902 of the device 2100. This predefined gesture may comprise a user swipe on the touch-sensitive display in a predefined direction, such as upwards to broaden the search or downwards to narrow the search. Additionally or alternatively, the predefined user gesture may comprise additional or less force on the touch-sensitive display, or any other predefined user gesture. Finally, the search refinement module 2108 of the device 2100 may refine the search results on the display based at least in part on detecting the predefined gesture. For instance, the module 2108 may expand the search results or narrow the search results, depending upon the detected gesture made by the user, as the sketch below illustrates.
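The sketch below illustrates one possible form of this dispatch, mapping a downward swipe to a narrower query and an upward swipe to a broader one; the gesture names and the use of a context term are assumptions rather than the patent's API.

```python
# Hypothetical gesture dispatch for process 3800: swipe down narrows,
# swipe up broadens, anything else leaves the query unchanged.

def refine_for_gesture(gesture: str, query_terms: list[str],
                       context_term: str) -> list[str]:
    if gesture == "swipe_down":                          # narrow the search
        return query_terms + [context_term]
    if gesture == "swipe_up" and len(query_terms) > 1:   # broaden the search
        return query_terms[:-1]
    return query_terms

# Narrow "Ophelia" to "Ophelia Hamlet", then broaden it back.
terms = refine_for_gesture("swipe_down", ["Ophelia"], "Hamlet")
terms = refine_for_gesture("swipe_up", terms, "Hamlet")
assert terms == ["Ophelia"]
```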

Abstract

This document describes techniques for outputting different content on a touch-sensitive display of a device based at least in part on an amount of force applied to the touch-sensitive display. For instance, when a user reads an electronic book (eBook) on a device having a touch-sensitive display, the user may select a word and the device may accordingly display information associated with the selected word. Thereafter, the user may apply a greater or lesser amount of force to the selected word and, in response, the device may output other instances or uses of the selected word. This document also describes techniques for refining search results associated with a particular word or phrase based at least in part on a measured amount of force associated with a selection. For instance, the device may refine (e.g., expand or narrow) illustrated search results based on the user providing greater or lesser force on the touch-sensitive display.

Description

BACKGROUND
A large and growing population of users is enjoying entertainment through the consumption of digital content items (or simply “content items”), such as music, movies, images, electronic books, and so on. The users employ various electronic devices to consume such content items. Among these electronic devices are electronic book (eBook) reader devices, cellular telephones, personal digital assistants (PDAs), portable media players, tablet computers, netbooks, and the like. As the quantity of available electronic media content continues to grow, along with increasing proliferation of devices to consume that media content, finding ways to enhance user experience continues to be a priority.
BRIEF DESCRIPTION OF THE DRAWINGS
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
FIG. 1 illustrates an architecture in which a community of users operates respective client devices to consume content items, such as electronic books (eBooks), songs, videos, and the like. In this architecture, the client devices and/or a content item service implement techniques to provide context-sensitive reference works (e.g., dictionaries, thesauruses, atlases, etc.) that provide requested information to the users based on a genre of the content item associated with the request, a characteristic of the user, or the like.
FIG. 2 is a block diagram of selected modules of an eBook reader device capable of receiving a request for information from a user experiencing a content item, determining a type of reference work entry appropriate for the content item or the user, and providing the information to the user from the determined type of reference work entry.
FIG. 3 illustrates an example user interface rendered by the devices of FIGS. 1 and 2. Here, the device or the content item service has determined that the eBook currently being read by the user is associated with a “medical” genre. As such, when the user requests a definition for a word within the eBook, the device displays a “medical” definition of the word rather than a standard or other type of definition.
FIG. 4 illustrates an example user interface rendered by the device of FIG. 3 after the user has selected to view “more” definitions of the illustrated word “prognosis.” As shown, in response the device displays definitions of “prognosis” in the medical sense, the standard sense, and the legal sense.
FIG. 5 illustrates another example user interface rendered by the devices of FIGS. 1 and 2. Here, the device is displaying an article from a periodical relating to “business.” As such, when the user requests to look up in a thesaurus a word within the article, the device displays synonyms and antonyms from a business-related thesaurus entry.
FIG. 6 illustrates another example user interface rendered by the devices of FIGS. 1 and 2. Here, the device is displaying a sports-related article. As such, when the user requests to look up in an encyclopedia a topic within the article, the device displays information about that topic from a sports-related encyclopedia entry.
FIG. 7 is a flow diagram showing a process of classifying a content item according to, for example, a genre and determining, based on the classification, a type of reference work entry to use for the content item when a user requests information associated with a word, phrase, or topic found within the content item.
FIG. 8 is a block diagram of selected modules of an example eBook reader device that may implement a touch-sensitive display and that is capable of outputting different reference work entries based on an amount of force applied to the touch-sensitive display.
FIG. 9 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device. Here, the device measures an amount of force of the selection and outputs a particular type of reference work entry based on this measured amount of force.
FIG. 10 illustrates an example user interface rendered by the device of FIG. 8 after the user has increased the amount of force on the selected word. Here, the device outputs a second, different type of reference work entry based on this greater amount of force.
FIG. 11 illustrates an example user interface rendered by the device of FIG. 8 after the user has yet again increased the amount of force on the selected word. Here, the device outputs a third, different type of reference work entry based on this even greater amount of force.
FIG. 12 illustrates another example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device. Here, the device measures an amount of force of the selection and outputs a particular context-sensitive reference work entry based on this measured amount of force.
FIG. 13 illustrates another example user interface rendered by the device of FIG. 8 after the user has increased the amount of force on the touch-sensitive display. Here, the device outputs a second, different context-sensitive reference work entry based on this greater amount of force.
FIG. 14 is a flow diagram showing a process of selecting which of multiple reference work entries to output based on an amount of measured force associated with a selection.
FIG. 15 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device. Here, the device outputs a reference work entry based on this selection.
FIG. 16 illustrates an example user interface rendered by the device after the user has provided an additional input. In response, the device outputs a second, different reference work entry. This input may include more force on the display, an additional point of contact on the display, activation of a key on a keypad, an oral command spoken by the user, or any other type of input.
FIG. 17 is a flow diagram showing a process of outputting a first reference work entry in response to receiving a touch selection on a display and, thereafter, outputting a second, different reference work entry after receiving an additional input.
FIG. 18 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device. Here, the device outputs a reference work entry after determining that an amount of force of this selection is less than a threshold force.
FIG. 19 illustrates an example user interface rendered by the device of FIG. 8 after a user has made a selection of a word on the touch-sensitive display of the device. Here, the device enables the user to select a reference work entry to output after the device determines that an amount of force of this selection is greater than a threshold force.
FIG. 20 is a flow diagram showing a process of determining whether to output a reference work entry or whether to allow a user to select which reference work entry to output based on an amount of force of a selection.
FIG. 21 is a block diagram of selected modules of an example eBook reader device that may implement a touch-sensitive display and that is capable of outputting different content based on an amount of force applied to the touch-sensitive display.
FIG. 22 illustrates an example user interface rendered by the device of FIG. 21 after a user has made a selection of a word within a content item being rendered on the touch-sensitive display of the device. Here, the device may measure an amount of force of the selection and, in some instances, may output a particular type of reference work entry based on this measured amount of force.
FIG. 23 illustrates an example user interface rendered by the device of FIG. 21 after the user has increased the amount of force on the selected word. Here, the device outputs other instances of this word within the illustrated content item based on this greater amount of force.
FIG. 24 illustrates an example user interface rendered by the device of FIG. 21 after the user has yet again increased the amount of force on the selected word. Here, the device outputs other instances of this word within other content items based on this even greater amount of force.
FIG. 25 illustrates an example user interface rendered by the device of FIG. 21 after the user has yet again increased the amount of force on the selected word. Here, the device outputs search results associated with a query comprising the selected word based on this even greater amount of force.
FIG. 26 is a flow diagram showing a process of causing display of other instances of a selected term based on a force associated with a selection.
FIG. 27 illustrates an example user interface rendered by the device of FIG. 21 after a user has made a selection of a word within a content item being rendered on the touch-sensitive display of the device. Here, the device requests that the user select whether to view previous or subsequent instances of the selected word within the illustrated content item.
FIG. 28 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to view subsequent instances of the selected word within the illustrated content item.
FIG. 29 illustrates an example user interface rendered by the device of FIG. 21 after the gesture made by the user in FIG. 28.
FIG. 30 is a flow diagram showing a process of causing display of information associated with a portion of a content item in response to receiving a touch selection of the portion on a display and, thereafter, outputting other instances of the portion after receiving an additional input.
FIG. 31 illustrates an example user interface rendered by the device of FIG. 21 after the user selects a word within the illustrated content item. Here, the device outputs search results associated with a query comprising the selected word based on the selection by the user.
FIG. 32 illustrates an example user interface rendered by the device of FIG. 21 after the user provides a greater amount of force to the selected word. Here, the device refines (e.g., narrows) the search results of FIG. 31 in response to detecting the greater amount of force.
FIG. 33 is a flow diagram showing a process of refining search results on a touch-sensitive display based on the device detecting a greater or lesser amount of force on a selected term.
FIG. 34 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to narrow illustrated search results associated with the selected word.
FIG. 35 illustrates an example user interface rendered by the device of FIG. 21 after the user performs the gesture of FIG. 34. As illustrated, the device has narrowed search results in response to detecting the gesture.
FIG. 36 illustrates an example user interface rendered by the device of FIG. 21 while a user makes a gesture that both selects a word within a content item and requests to expand illustrated search results associated with the selected word.
FIG. 37 illustrates an example user interface rendered by the device of FIG. 21 after the user performs the gesture of FIG. 36. As illustrated, the device has expanded or broadened search results in response to detecting the gesture.
FIG. 38 is a flow diagram showing a process for refining search results based at least in part on detecting a predefined gesture on a touch-sensitive display.
DETAILED DESCRIPTION
This disclosure describes techniques for outputting different content on a touch-sensitive display of a device based at least in part on an amount of force applied to the touch-sensitive display. For instance, when a user reads an electronic book (eBook) on a device having a touch-sensitive display, the user may make a selection of a word or phrase within the eBook by touching the display at a location of the word or phrase. In response, the techniques may output information associated with the selected word. For instance, the device may output a dictionary definition of the selected word, a picture associated with the selected word, synonyms of the selected word, or the like. Thereafter, the user may apply a greater or lesser amount of force to the selected word and, in response, the device may output other instances or uses of the selected word. For instance, the device may output other instances of the selected word within the illustrated eBook, other eBooks or content items, or the like.
In another example, the techniques described herein may refine search results associated with a particular word or phrase based at least in part on a measured amount of force associated with a selection. For instance, a user may request, via a touch on the touch-sensitive display, that the device perform a search based on a query associated with a selected word or phrase. In response, the device may output search results associated with the search (from a search engine or otherwise). Thereafter, the user may provide an amount of force on the touch-sensitive display that is more or less than the original selection. In response, the device may refine (e.g., expand or narrow) the illustrated search results. In one of many examples, the device may identify a term associated with the rendered eBook and may perform a search based on a query that includes this term in addition to the term selected by the user.
This disclosure also describes techniques for outputting different reference work entries based on an amount of force applied to a touch-sensitive display of a device. For instance, when a user reads an electronic book (eBook) on a device having a touch-sensitive display, the user may make a selection of a word or phrase within the eBook by touching the display at a location of the word or phrase. The techniques may then determine which of multiple different reference work entries to output based on a measured amount of force of the selection. For instance, the device may output a dictionary definition of the selected word in response to measuring a first amount of force. Additionally or alternatively, the device may output a thesaurus entry for the word in response to measuring a second, greater amount of force.
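One way to picture this tiered behavior is a force-to-entry-type table, as in the following sketch; the thresholds and the particular entry types are illustrative assumptions only.

```python
# Hypothetical tiers: harder presses surface different reference work entries.
ENTRY_TYPES_BY_FORCE = [
    (0.0, "dictionary"),     # first amount of force: definition
    (1.5, "thesaurus"),      # second, greater amount: synonyms
    (3.0, "encyclopedia"),   # still greater amount: topic article
]

def entry_type_for(force: float) -> str:
    chosen = ENTRY_TYPES_BY_FORCE[0][1]
    for threshold, entry_type in ENTRY_TYPES_BY_FORCE:
        if force >= threshold:
            chosen = entry_type
    return chosen

assert entry_type_for(0.5) == "dictionary"
assert entry_type_for(2.0) == "thesaurus"
```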
This disclosure also describes an architecture and techniques for outputting requested information from reference works (e.g., dictionaries, thesauruses, almanacs, atlases, encyclopedias, gazetteers) in a context-sensitive manner. For instance, when a user reads an electronic book (eBook) and requests a definition for a word found within the eBook, the techniques may display a definition for the word that has been selected based on the context of the request. In one example, the techniques may display a definition that corresponds to one or more identified genres of the eBook in which the word appears. In another example, the techniques may display a definition that corresponds to known information about the user, such as a preference of the user or the like.
For instance, if a user currently reading a medical-related eBook requests to receive a definition of a word from the eBook, the techniques will display a medical-related definition of the word. If the user reads a science-fiction (sci-fi) eBook, meanwhile, the techniques may display a sci-fi or science-related definition of the word. In each of these instances, the techniques may display more than one definition, with the order of the displayed definitions being based on the classification of the eBook. For instance, the medical definition may be displayed first in instances where the eBook is determined to be medical-related. As such, the techniques display information from a reference work, such as the dictionary, in a manner that is more likely to be relevant and of interest to the user.
While the discussion below describes these techniques in the context of eBooks rendered on eBook reader devices, these techniques may apply to a variety of different types of content items, such as songs, videos, still images, and so on. Furthermore, the techniques may apply to a variety of different electronic devices, such as personal computers, cellular telephones, personal digital assistants (PDAs), portable media players, tablet computers, netbooks, and the like.
In each instance, the techniques may classify a particular content item as being associated with one or more particular genres (e.g., science, science fiction, medicine, business, law, fiction, a particular foreign language, etc.). Before or after the classification, a user experiencing the content item may request some information regarding the content item that may be found within a reference work. For instance, the user may request a definition of a word, synonyms or antonyms for a word, information from an encyclopedia regarding an identified word, phrase, or topic, a map for or directions to an identified location, or the like. In response to receiving this request, the techniques select an entry from the appropriate type of reference work and then output (e.g., visually, audibly, etc.) the reference work entry.
For instance, if a user requests information about a particular topic from within the content item, the techniques may select the corresponding encyclopedia entry based on the genre of the content item. For instance, if the user currently experiences a sports-themed content item and the user requests information regarding the topic “bat,” the techniques may output information regarding “bats” from a sports-themed encyclopedia. This information will likely discuss a round, elongated object for hitting a ball. If, however, the user currently experiences an animal-related content item and the user makes the same request, the techniques may output an encyclopedia entry from an animal-related encyclopedia. This information will likely discuss the nocturnal mammal.
While the above example describes referencing discrete reference works (here, encyclopedias), the techniques may instead reference a single reference work that includes multiple different types of entries (e.g., sports-related, animal-related, medical, etc.). For instance, a single encyclopedia may include an entry for “bat” in the sports sense and an entry for “bat” in the animal sense. Here, the techniques may display one or both of the definitions in a manner based on the identified genre of the content item.
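Such a single reference work with multiple entry types per headword might be modeled as in this sketch, where the nested-dictionary layout and genre labels are assumptions for illustration.

```python
# Hypothetical single reference work storing several entry types per word.
REFERENCE_WORK = {
    "bat": {
        "sports": "A rounded, elongated club used to hit a ball.",
        "animals": "A nocturnal flying mammal.",
    },
}

def lookup(word: str, genre: str, default_genre: str = "sports") -> str:
    entries = REFERENCE_WORK.get(word, {})
    # Prefer the entry matching the content item's genre; otherwise fall back.
    return entries.get(genre) or entries.get(default_genre, "No entry found.")

assert "mammal" in lookup("bat", "animals")
```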
The discussion begins with a section, entitled “Context-Sensitive Reference Works,” that includes numerous sub-sections. A first sub-section is entitled “Example Architecture” and describes one example architecture and several example components that implement the techniques introduced above. Next, a sub-section entitled “Example eBook Reader Device” follows, and describes example components of one type of device that may implement context-sensitive reference works. A sub-section entitled “Example User Interfaces” follows, describing examples of user interfaces (UIs) that may be served to and rendered at the client devices of FIG. 1. The discussion then moves on to illustrate and describe an “Example Process” for implementing the described techniques.
The discussion also includes a second section, entitled “Surfacing Reference Work Entries on Touch-Sensitive Displays,” that also includes numerous sub-sections. A first sub-section is entitled “Example eBook Reader Device” and describes example components of one type of device that may implement the techniques in this section. A sub-section entitled “Example User Interfaces and Processes” follows.
The discussion also includes a third section, entitled “Surfacing Content Based on Touch Gestures” that, like the preceding sections, includes numerous sub-sections. A first sub-section is entitled “Example eBook Reader Device” and describes example components of one type of device that may implement the techniques in this section. A sub-section entitled “Example User Interfaces and Processes for Surfacing Other Instances of a Selected Term” follows. Thereafter, the discussion includes a sub-section entitled “Example User Interfaces and Processes for Refining Search Results,” before the discussion ends with a brief conclusion.
This brief introduction, including section titles and corresponding summaries, is provided for the reader's convenience and is not intended to limit the scope of the claims, nor the following sections. Furthermore, the techniques described above and below may be implemented in a number of ways and in a number of contexts. Several example implementations and contexts are provided with reference to the following figures, as described below in more detail. However, the following implementations and contexts are but a few of many.
Context-Sensitive Reference Works
Architectural Environment
FIG. 1 illustrates an example architecture 100 in which a community of users 102 operates respective client devices 104(1), 104(2), 104(3), . . . , 104(M) to consume content items, such as electronic books (eBooks), songs, videos, still images, and the like. In this architecture, the client devices 104 and/or a content item service 106 implement techniques to provide context-sensitive reference works (e.g., dictionaries, thesauruses, atlases, etc.) that provide requested information to the users based on a genre of the content item associated with the request, a characteristic of the requesting user, or the like.
The client devices 104 are variously configured with different functionality to enable consumption of one or more types of content items of any type or format including, for example, electronic texts (e.g., documents of any format, electronic periodicals, such as digital magazines and newspapers, etc.), digital audio (e.g., music, audible books, etc.), digital video (e.g., movies, television, short clips, etc.), images (e.g., art, photographs, etc.), and multi-media content. The terms "electronic book" and/or "eBook", as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, etc.
FIG. 1 illustrates that the client devices 104 operated by users of the user community 102 may comprise eBook reader devices (e.g., devices 104(1) and 104(2)), laptop computers (e.g., device 104(3)), multifunction communication devices (e.g., device 104(M)), portable digital assistants (PDAs), wireless headsets, entertainment systems, portable media players, tablet computers, cameras, video cameras, netbooks, notebooks, desktop computers, gaming consoles, DVD players, media centers, or any other type of device.
In the architecture 100, the client devices may receive, over a network 108, one or more content items for presentation on the devices from the content item service 106. The network 108 is representative of any one or combination of multiple different types of networks, such as the Internet, cable networks, cellular networks, wireless networks, and wired networks. One example of a wireless technology and its associated protocols is Wireless Fidelity (WiFi), the wireless networking technology defined according to IEEE 802.11 standards, while another example is a cellular network.
As illustrated, the content item service 106 is embodied as one or more servers that collectively have processing and storage capabilities to receive requests for content items from the devices, such as the eBook reader device 104(1). The servers of the content item service 106 may be embodied in any number of ways, including as a single server, a cluster of servers, a server farm or data center, and so forth, although other server architectures (e.g., mainframe) may also be used. Alternatively, the content item service 106 may be embodied as a client device, such as a desktop computer, a laptop computer, an eBook reader device, and so forth. In some implementations, for instance, some or all of the elements of the content item service 106 illustrated in FIG. 1 may reside on the client devices 104.
In the illustrated example, the content item service 106 includes a content item distribution system 110, a content item database 112, and a content item classifier 114. The content item distribution system 110 may support distribution of content items (e.g., online retailing via a website) to the client devices 104. In some implementations, the servers store the content items in the content item database 112, although in other implementations, the servers merely facilitate purchase and delivery of content items stored in other locations.
The content item classifier 114, meanwhile, serves to classify content items by, for example, genre. For instance, the classifier 114 may classify content items as relating to fiction, non-fiction, historical, science, science fiction, medicine, business, law, sports, animals, geography, computer science, engineering, chemistry, mathematics, a particular language or any other type of genre, category, or classification. To classify these content items, the classifier 114 may reference a prior categorization of the content items within, for example, the content item database 112. Or, the classifier may classify these content items in other ways, as discussed in detail below.
Furthermore, the content item classifier 114 may classify content items as relating to multiple different genres. For instance, an eBook that includes multiple sections may be associated with different genres corresponding to the different sections of the book. A textbook, for instance, may include a section classified as relating to mathematics, a section classified as relating to science, and a section classified as relating to medicine. A single section or an entire eBook may also be classified as relating to multiple genres.
As discussed in detail below, these genre classifications may be used to determine which category of reference work entry to use when receiving a request for information from a user. For instance, if a user reading the afore-mentioned textbook requests a definition for a word found within the science section of the book, the device of the user may display a science-related dictionary entry (alone or more prominently than other definitions).
The content item classifier 114 may classify content items in a multitude of ways. As illustrated, the content item classifier 114 includes a contents analysis module 116, a feedback module 118, and a user analysis module 120. The contents analysis module 116 may classify content items with reference to the actual contents of the item. Using an eBook as an example, this module 116 may scan the text of the eBook to identify key words and may then compare these identified key words to known, pre-compiled sets of key words associated with different genres. For example, the module 116 may scan contents of an eBook and may identify that the most-frequently used words of the eBook include “medicine,” “doctor,” “Dr.,” “disease,” and “hospital.” As such, the module 116 may compare these key words to sets of key words associated with different genres before concluding that this book should be classified as being within the medical genre. Similarly, the module 116 may analyze images or sounds within a content item and may compare these images or sounds to known sets of images or sounds associated with identified genres.
In some instances, this module 116 may weight certain words more heavily than others. For instance, the module 116 may weight the words of the title more heavily than the words within the chapters of the book. Similarly, the module 116 may assign a larger weight to the name of the author, the identity of the publisher, and the like.
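A minimal sketch of this keyword-scoring idea follows; the key-word sets and the title weight are assumptions, and an actual classifier would be considerably more involved.

```python
# Hypothetical keyword scoring with heavier weight on title words.
GENRE_KEYWORDS = {
    "medical": {"medicine", "doctor", "disease", "hospital"},
    "science": {"experiment", "theory", "laboratory", "physics"},
}

def classify(title: str, body: str, title_weight: int = 3) -> str:
    body_words = body.lower().split()
    title_words = title.lower().split()
    scores = {}
    for genre, keywords in GENRE_KEYWORDS.items():
        score = sum(1 for w in body_words if w in keywords)
        score += title_weight * sum(1 for w in title_words if w in keywords)
        scores[genre] = score
    return max(scores, key=scores.get)

assert classify("Secrets to Internal Medicine",
                "the doctor examined the patient at the hospital") == "medical"
```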
The feedback module 118, meanwhile, serves to classify content items in whole or in part based upon received feedback. For instance, these techniques may include querying human users as to the genre of a content item and using responses from the users as input in determining the genre(s) of the respective content item. Furthermore, this module 118 may also track user actions in order to receive this feedback. For instance, envision that a user requests a definition for the term "boil" found within a particular eBook. Envision also that the classifier 114 has classified this eBook as relating to science and, as such, the user's device displays a science definition of the term "boil," explaining that "boiling" refers to when a liquid changes state to a gas. However, the feedback module 118 may determine (e.g., via an indication received over the network 108) that the user requested to see a different definition of the term "boil" (e.g., a medical definition). In this instance, the feedback module 118 may deduce that the eBook should have been classified as being of the "medical" genre rather than the "science" genre.
In some instances, the classifier 114 may assign a confidence level to a particular genre associated with a content item and may alter this genre based on feedback received at the feedback module 118. For instance, the classifier may determine that the eBook from the example above is 53% likely to be primarily of a "science" genre and 47% likely to be primarily of a "medical" genre. After receiving feedback similar to the feedback from the user discussed immediately above, these percentages may change such that the classifier 114 now judges that the eBook is more likely to relate to medicine than pure science. As such, the classifier 114 may change the assigned genre to "medical" (or may change the primary genre to "medical" while marking "science" as a secondary genre).
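The confidence-level adjustment might look like the following sketch, in which each piece of feedback nudges the distribution toward the genre the user actually wanted; the step size is an assumed tuning parameter.

```python
# Hypothetical feedback step: boost the user-preferred genre, renormalize.
def apply_feedback(confidence: dict[str, float], preferred_genre: str,
                   step: float = 0.1) -> dict[str, float]:
    adjusted = dict(confidence)
    adjusted[preferred_genre] = adjusted.get(preferred_genre, 0.0) + step
    total = sum(adjusted.values())
    return {genre: level / total for genre, level in adjusted.items()}

# Starting from 53% science / 47% medical, one "medical" request tips the balance.
conf = apply_feedback({"science": 0.53, "medical": 0.47}, "medical")
assert conf["medical"] > conf["science"]
```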
The user analysis module 120, meanwhile, may function to classify content items in whole or in part based upon the identity of the user experiencing the media item. For instance, when the content item distribution system 110 downloads an eBook to the eBook reader device 104(1), the module 120 may analyze known information about the user associated with the device by, for instance, referencing a user profile stored in an account of the user at the content item service 106. The module 120 may then use this known information about the user to help deduce the genre of the eBook.
For instance, envision that the user associated with the eBook reader device 104(1) often purchases eBooks, audio items, and the like that are classified as being of the “medical” genre. Therefore, when attempting to determine the genre of a new eBook purchased at the device 104(1), the module 120 may more heavily weight the chances of this eBook being related to medicine. The user analysis module 120 may similarly use any other known information about the user to help classify content items, including a location of the user, demographic information of the user, an address of the user, and the like.
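This user-history weighting could be sketched as below; the boost factor is an assumption, as is the idea of expressing purchase history as per-genre counts.

```python
# Hypothetical bias: scale genre scores by the user's purchase history.
def bias_by_user_history(scores: dict[str, float],
                         purchase_counts: dict[str, int],
                         boost: float = 0.1) -> dict[str, float]:
    total = sum(purchase_counts.values()) or 1
    return {genre: score * (1 + boost * purchase_counts.get(genre, 0) / total)
            for genre, score in scores.items()}

# A reader with mostly medical purchases nudges a tie toward "medical".
scores = bias_by_user_history({"science": 0.5, "medical": 0.5},
                              {"medical": 8, "science": 2})
assert scores["medical"] > scores["science"]
```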
With information from some or all of the modules 116, 118, and 120, the content item classifier 114 may classify content items as belonging to one or more genres. For instance, individual sections of content items (e.g., chapters, individual songs or tracks, etc.) may each be associated with one or more genres, or an entire content item may be associated with a single or multiple genres. In each instance, the determined genre(s) are helpful in determining the appropriate type or category of reference work entry to use when a user requests information regarding a word, phrase, or topic within the corresponding content item.
FIG. 1, for instance, illustrates that the eBook reader device 104(1) currently displays a fictitious eBook 122 entitled "Secrets to Internal Medicine" by a fictitious author "Dr. Grace Bradley," which the device 104(1) may have downloaded from the content item service 106. FIG. 1 also illustrates that the content item database 112 stores the same eBook 122. In addition, FIG. 1 illustrates that the content item classifier 114 has classified this eBook 122 as relating to a particular genre 124. Here, the classifier 114 has determined that the eBook relates to medicine and has classified this book accordingly. The content item database 112 may similarly store multiple other content items along with a notation of the genre(s) of each of these other items.
In this example, the user of the eBook reader device 104(1) has selected (via a highlight 126) a particular word ("prognosis") from the eBook 122. In response, the eBook reader device 104(1) displays a definition 128 of the selected word. Here, the definition 128 of the word comes from a medical dictionary entry, which corresponds to the classification of the eBook 122 as being related to the "medical" genre. As such, this definition 128 states that a "prognosis" is "a forecast of the probable course and/or outcome of a disease." While this example describes a dictionary, other implementations may employ other types or categories of reference works, a few examples of which are discussed below.
Example eBook Reader Device
FIG. 2 illustrates example components that might be implemented in the eBook reader device 104(1) of FIG. 1 that displays information provided by context-sensitive reference works, such as dictionaries or the like. In this example, the eBook reader device 104(1) is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein.
In a very basic configuration, the eBook reader device 104(1) includes one or more processing units 202 and memory 204. Depending on the configuration of a dedicated eBook reader device 104(1), the memory 204 (and other memories described throughout this document) is an example of computer storage media and may include volatile and nonvolatile memory. Thus, the memory 204 may include, but is not limited to, RAM, ROM, EEPROM, flash memory, or other memory technology, or any other medium which can be used to store media items or applications and data which can be accessed by the eBook reader device 104(1).
The memory 204 may be used to store any number of functional components that are executable on the processing unit(s) 202, as well as data and content items that are rendered by the eBook reader device 104(1). Thus, the memory 204 may store an operating system and an eBook storage database to store one or more content items 206, such as eBooks, audio books, songs, videos, still images, and the like. The memory 204 may further include a memory portion designated as an immediate page memory to temporarily store one or more pages of an electronic book. The pages held by the immediate page memory are placed therein a short period before a next page request is expected.
The term “page,” as used herein, refers to a collection of content that is presented at one time in a display of the eBook reader device 104(1). Thus, a “page” may be understood as a virtual frame of the content, or a visual display window presenting the content to the user. Thus, “pages” as described herein are not fixed permanently, in contrast to the pages of published “hard” books. Instead, pages described herein may be redefined or repaginated when, for example, the user chooses a different font for displaying the content in the first display. In addition to pages, the terms “page views”, “screen views”, and the like are also intended to mean a virtual frame of content.
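The immediate page memory described above amounts to a small pre-render cache. The sketch below is one possible shape for it; the class name, capacity, and eviction policy are illustrative assumptions.

```python
# Hypothetical pre-render cache for the "immediate page memory" idea.
from collections import OrderedDict
from typing import Callable

class ImmediatePageMemory:
    def __init__(self, render: Callable[[int], bytes], capacity: int = 2):
        self.render = render          # renders a page number to display data
        self.cache: OrderedDict[int, bytes] = OrderedDict()
        self.capacity = capacity

    def prefetch(self, page_number: int) -> None:
        """Render a page shortly before the next page request is expected."""
        self.cache[page_number] = self.render(page_number)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the oldest page

    def get(self, page_number: int) -> bytes:
        if page_number not in self.cache:
            self.prefetch(page_number)
        return self.cache[page_number]
```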
An interface module 208 may also be provided in memory 204 and may be executed on the processing unit(s) 202 to provide for user operation of the device 104(1). One feature of the interface module 208 allows a user to request to receive information from a reference work regarding a word, phrase, or topic found within one of the content items 206. For instance, the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, and the like.
The interface module 208 may facilitate textual entry of a request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner. The interface module 208 may provide menus and other navigational tools to facilitate selection and rendering of the content items 206. The interface module 208 may further include a browser or other application that facilitates access to sites over a network, such as websites or online merchants.
A content presentation application 210 renders the content items 206. The content presentation application 210 may be implemented as various applications depending upon the content items. For instance, the application 210 may be an electronic book reader application for rendering electronic books, an audio player for playing audio books or songs, a video player for playing video, and so forth.
The memory 204 may also store user credentials 212. The credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206.
The memory 204 also stores one or more reference works 214, such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like. In some instances, the memory 204 stores multiple categories of a particular kind of reference work. For instance, the memory 204 may store a standard dictionary (e.g., Merriam-Webster® English Dictionary), a medical dictionary, a legal dictionary, a science dictionary, a science-fiction dictionary, an engineering dictionary, a foreign language dictionary, a business dictionary, a chemistry dictionary, a mathematics dictionary, and the like. In other instances, a single kind of reference work may contain multiple reference work entry types. For instance, a single dictionary may store, for one or more of the words therein, a standard dictionary entry, a medical dictionary entry, a legal dictionary entry, a science dictionary entry, and the like.
FIG. 2 further illustrates that the memory 204 stores a feedback module 216 that is executable on the processing unit(s) to receive user feedback regarding an outputted reference work entry or a classified genre of a content item. As discussed above, this feedback may be used to help re-classify the genre associated with the content item.
The eBook reader device 104(1) also stores a reference entry selection module 218 that is executable on the processing unit(s) to select a particular type of reference work entry based on a genre of a content item, a characteristic of a user, or the like. For instance, this module 218 may store or reference a table that maps “content item genres” to “reference work entry types.” Therefore, when the content presentation application 210 outputs a content item of a particular genre and the user requests some reference work information associated with a word, phrase, or topic therein, the module 218 may reference this table to determine the type of entry to output. In some instances, the reference entry selection module 218 may reside on the content item service 106 or in another location, in which case the eBook reader device 104(1) may access the module 218 over the network 108.
In the example of FIG. 1, the module 218 may determine that the application 210 should display a medical definition when receiving a request for a word within an eBook that has been categorized as “medical” in nature. This table may similarly map a “legal” genre to a “legal” reference work entry type, a “sci-fi” genre to a “science” reference work entry type, a “historical fiction,” “British lit” and the like to a “standard” reference work entry type, and so on. In some instances, this table may map combinations of genres to reference work entry types. For instance, the table may map an eBook that is associated with both a “medical” genre and a “mystery” genre to a “standard” reference work entry type rather than a “medical” reference work entry type. It is to be appreciated, however, that FIG. 2 simply illustrates several example mappings, and that any type of content item genre may map to any type of reference work entry type in certain implementations.
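The mapping table might be modeled as in the sketch below, with tuple keys covering the genre-combination rule mentioned above; the exact keys and fallback behavior are assumptions.

```python
# Hypothetical genre-to-entry-type table, including a genre combination.
ENTRY_TYPE_FOR_GENRES = {
    ("medical",): "medical",
    ("legal",): "legal",
    ("sci-fi",): "science",
    ("historical fiction",): "standard",
    ("medical", "mystery"): "standard",   # combination overrides "medical"
}

def entry_type(genres: list[str]) -> str:
    if not genres:
        return "standard"
    key = tuple(sorted(genres))
    return (ENTRY_TYPE_FOR_GENRES.get(key)
            or ENTRY_TYPE_FOR_GENRES.get((genres[0],))
            or "standard")

assert entry_type(["medical"]) == "medical"
assert entry_type(["mystery", "medical"]) == "standard"
```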
FIG. 2 further illustrates that the eBook reader device 104(1) may include a display 220, which may be passive, emissive or any other form of display. In one implementation, the display uses electronic paper (ePaper) display technology, which is bi-stable, meaning that it is capable of holding text or other rendered images even when very little or no power is supplied to the display. Some example ePaper-like displays that may be used with the implementations described herein include bi-stable LCDs, MEMS, cholesteric, pigmented electrophoretic, and others. In other implementations, or for other types of devices, the display may be embodied using other technologies, such as LCDs and OLEDs, and may further include a touch screen interface. In some implementations, a touch sensitive mechanism may be included with the display to form a touch-screen display.
The eBook reader device 104(1) may further be equipped with various input/output (I/O) components 222. Such components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth.
A network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth. The network interface 224 may allow a user of the device 104(1) to download content items from the content item service 106, may allow the feedback module 216 to provide received feedback to the service 106, and the like.
The eBook reader device 104(1) also includes a battery and power control unit 226. The battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
The eBook reader device 104(1) may have additional features or functionality. For example, the eBook reader device 104(1) may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. The additional data storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
Various instructions, methods and techniques described herein may be considered in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., for performing particular tasks or implementing particular abstract data types. These program modules and the like may be executed as native code or may be downloaded and executed, such as in a virtual machine or other just-in-time compilation execution environment. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments. An implementation of these modules and techniques may be stored on or transmitted across some form of computer readable media.
Example User Interfaces
FIGS. 3-6 illustrate example user interfaces that the eBook reader device 104(1) (and the other client devices of the architecture 100) may render in accordance with the techniques described above. While these figures illustrate a few example interfaces, it is to be appreciated that numerous other types of interfaces displaying information from numerous other types of reference works may be implemented using the described techniques.
FIG. 3 illustrates the example user interface described above with reference to FIG. 1. Here, the eBook reader device 104(1) or the content item service 106 has determined that the eBook 122 currently being read by the user is associated with a “medical” genre. As such, when the user requests a definition for a word within the eBook, the device displays a “medical” definition of the word rather than a standard or other type of definition.
Specifically, FIG. 3 illustrates that the user has selected (e.g., via a keyboard, cursor, touch screen, etc.) the word “prognosis,” as illustrated by the highlight 126. While the user selects a word in this example, the user may select a phrase in other embodiments. In response to the selection, the device 104(1) displays the medical definition 128 of this word. As illustrated, this definition 128 includes an indication 302 that this definition is in fact the medical definition, rather than another type of definition (e.g., a standard definition, a science definition, etc.).
In this example, the eBook reader device 104(1) may display a definition from a dictionary when the user selects a word, although in other implementations the device may display synonyms from a thesaurus, information from an encyclopedia, or information from any other reference work type. In still other implementations, the device 104(1) may prompt the user to select the type of the reference work from which the device should display information.
FIG. 3 also illustrates that the definition 128 includes an icon 304 (“More”) that, when selected, allows the user to view additional definitions of the word “prognosis.”
FIG. 4 illustrates an example user interface rendered by the eBook reader device 104(1) after the user has selected to view “more” definitions of the word “prognosis.” As shown, in response the device displays the definition 128 of this word in the medical sense first, followed by a definition 402 of the word in a standard sense, and a definition 404 of the word in a legal sense. Here, the order of the list is also based on the genre of the eBook, with the medical definition appearing first. In some instances, one or both of the feedback modules 118 and 216 may use the user's selection of the icon 304 as an indication that the eBook or the currently displayed portion of the eBook may need to be re-classified. For instance and as discussed above, this selection may alter the confidence level associated with the currently associated genre.
FIG. 5 illustrates another example user interface rendered by the eBook reader device 104(1). Here, the device currently displays an eBook 502 comprising a periodical article that has been determined to relate to the genre "business." As such, when the user requests information from a reference work regarding a word, phrase, or topic from the eBook 502, the device may display a reference work entry associated with the genre "business."
Here, for instance, the user requests (either explicitly or via default settings) to look up the word "bear" in a thesaurus, as indicated by a highlight 504. In response, the eBook reader device 104(1) displays an entry 506 from a thesaurus, the entry comprising synonyms and antonyms. As illustrated, an indication 508 indicates that this entry corresponds to a "business" use of the term "bear," as the synonyms include "pessimist, cynic, defeatist, misanthrope," while the antonyms include "bull, optimist." This is contrasted with the standard use of the term "bear" in the English language, having synonyms such as "stand, stomach, tolerate, abide" and the like. However, by displaying a business-related thesaurus entry when the user reads a business-related eBook, the device 104(1) is more likely to provide the user with the information that she seeks. Furthermore, the device 104(1) also displays the "more" icon 304 to allow the user to view other thesaurus entry types associated with the word "bear" (e.g., the standard use entry, an entry related to animals, etc.).
FIG. 6 illustrates another example user interface rendered by the eBook reader device 104(1). Here, the device 104(1) is displaying an eBook 602 in the form of an article that has been determined to be associated with a "sports" genre. When the user requests to look up the topic "bat" in an encyclopedia, as indicated by a highlight 604, the device 104(1) displays an entry 606 from a sports-related encyclopedia that explains the history and importance of a "baseball bat." The eBook reader device 104(1) also displays an indication 608 that the entry 606 resides in a sports-related encyclopedia, or that the entry is a sports-related entry in a general encyclopedia.
Again, the device also displays the "more" icon that, when selected, causes the device to display other articles associated with the term "bat," such as an article about the nocturnal mammal. In instances where the currently displayed eBook 602 has been classified as related to an "animal" genre, the device 104(1) may instead display the animal-related encyclopedia entry first, rather than the illustrated sports-related entry 606.
Example Process
FIG. 7 illustrates an example process 700 for implementing the techniques described above of providing context-sensitive reference work entries. This process is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
For discussion purposes, the process 700 is described with reference to the architecture 100 of FIG. 1, although other architectures may implement this process.
Process 700 includes an operation 702, which represents classifying a content item as belonging to one or more genres, such as one or more of the genres discussed above. This classifying may include one or a combination of sub-operations 702(1), 702(2), and 702(3). Classifying a content item may include, for instance, analyzing contents of the content item at sub-operation 702(1). This may include analyzing a content item for key words and comparing these key words to sets of key words associated with different respective genres. Sub-operation 702(2), meanwhile, may include referencing a prior categorization of the content item, such as from an electronic catalog of content items. Finally, sub-operation 702(3) may include referencing feedback regarding the content item itself, as discussed above.
After classifying the item, an operation 704 represents determining a reference work entry to use for the content item based at least in part on the classified genre of the item. For instance, if the item has been classified as "legal," operation 704 may determine that a "legal" reference work entry should be used. Conversely, if the content item is classified as "thriller," then operation 704 may determine that a "standard" reference work entry should be used.
Next, an operation 706 represents receiving a request for information found within a reference work regarding a word, phrase, or topic found within the content item. This may include, for example, receiving a request for a definition of a word from a dictionary, synonyms or antonyms for the word from a thesaurus, information regarding a topic from an encyclopedia, a map from an atlas, or the like.
Operation 708, meanwhile, represents selecting a reference work entry from the determined type of reference work entry type. For instance, after the user requests to receive a definition of the word “prognosis” found within a medical-related book, operation 708 may select the medical definition of “prognosis.” Next, an operation 710 represents outputting (visually, audibly, etc.) the selected reference work entry, such as the medical definition of the term “prognosis.” Again, this outputting may comprise outputting multiple definitions of the word in an order based at least in part on the classified genre(s) of the content item. For instance, operation 710 may output multiple definitions of the word “prognosis,” with the medical definition being displayed first or more prominently in the list relative to the other definitions.
Operation 712 represents querying whether feedback (e.g., user feedback) has been received in response to the output of the reference work entry. For instance, operation 712 may query whether the user decided to view additional definitions of the word “prognosis.” If so, then this feedback is fed back to the classification block to potentially alter the classification of the content item. If no feedback is received, then the process 700 ends at operation 714.
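Pulled together, the loop of the process 700 can be sketched as follows; the callable parameters merely model the flow of operations 702 through 714 and are not interfaces from the patent.

```python
# Hypothetical orchestration of process 700 with injected callables.
def process_700(item, classify, entry_type_for, await_request,
                select_entry, output, get_feedback):
    genre = classify(item)                         # 702: classify the item
    entry_type = entry_type_for(genre)             # 704: pick an entry type
    while True:
        request = await_request()                  # 706: word/phrase/topic
        entry = select_entry(request, entry_type)  # 708: choose the entry
        output(entry)                              # 710: display or speak it
        feedback = get_feedback()                  # 712: feedback received?
        if feedback is None:
            break                                  # 714: end of process
        genre = feedback                           # re-classify per feedback
        entry_type = entry_type_for(genre)
```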
Surfacing Reference Work Entries on Touch-Sensitive Displays
Example eBook Reader Device
FIG. 8 is a block diagram of selected modules of an eBook reader device 800 that may implement a touch-sensitive display and that is capable of outputting different reference work entries based on an amount of force applied to the touch-sensitive display. The illustrated eBook reader device 800 may include several similar or identical components as the eBook reader device 104(1) described above.
In this example, the eBook reader device 800 is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein. For instance, mobile telephones, tablet computers, laptop computers, desktop computers, personal media players, portable digital assistants (PDAs), or any other type of electronic device may implement the components and techniques described below.
In a basic configuration, the example eBook reader device 800 includes one or more processing units 802 and memory 804. In addition, the eBook reader device 800 may include a touch sensor 806 that enables a user of the device to operate the device via touch inputs. In some instances, the touch sensor 806 and the display 220 are integral to provide a touch-sensitive display that displays content items (e.g., eBooks) and allows users to navigate the content items via touch inputs on the display.
The memory 804 may be used to store any number of functional components that are executable on the processing unit(s) 802, as well as data and content items that are rendered by the eBook reader device 800. Thus, the memory 804 may store an operating system and an eBook storage database to store the one or more content items 206 described above, such as eBooks, audio books, songs, videos, still images, and the like.
A content presentation application 210 renders the content items 206. The content presentation application 210 may be implemented as various applications depending upon the content items. For instance, the application 210 may be an electronic book reader application for rendering electronic books, an audio player for playing audio books or songs, a video player for playing video, and so forth.
The memory 804 may also store user credentials 212. The credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206.
The memory 804 also stores (persistently or temporarily) one or more reference works 214, such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like. In some instances, the memory 804 stores multiple categories of a particular kind of reference work. For instance, the memory 804 may store a standard dictionary (e.g., Merriam-Webster® English Dictionary), a medical dictionary, a legal dictionary, a science dictionary, a science-fiction dictionary, an engineering dictionary, a foreign language dictionary, a business dictionary, a chemistry dictionary, a mathematics dictionary, and the like. In other instances, a single kind of reference work may contain multiple reference work entry types. For instance, a single dictionary may store, for one or more of the words therein, a standard dictionary entry, a medical dictionary entry, a legal dictionary entry, a science dictionary entry, and the like. In still other instances, the device may store a dictionary that accompanies a particular eBook. For instance, the device may store a dictionary that a publisher of a particular eBook creates for that particular eBook or for a particular series of eBooks.
The memory 804 may also include the interface module 208 that, as described above, provides for user operation of the device 800. One feature of the interface module 208 allows a user to request to receive information from a reference work regarding a word, phrase, or topic found within one of the content items 206. For instance, the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, and the like.
The interface module 208 may facilitate textual entry of the request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner.
The memory 804 also stores a touch-input controller 808 to detect touch inputs received via the touch sensor 806 and, in some instances, to measure a force of the touches. In some instances, the touch-input controller 808 is configured to detect multiple touches on the touch sensor 806 as well as to measure an amount of force of each of the touches.
The eBook reader device 800 also stores a reference entry selection module 810 that is executable on the processing unit(s) to select a particular type of reference work entry in response to receiving an indication of a touch input. For instance, in response to the user selecting a particular portion of a rendered content item via a touch input, the reference entry selection module 810 may select a particular type of reference work entry to output based on a measured force of the touch input. For example, if a user selects a particular word on the touch-sensitive display, the module 810 may map the amount of force of the touch to one of multiple different reference work entries.
In one specific example, the module 810 outputs a dictionary definition of the word in response to the user providing a first amount of force, a thesaurus entry for the word in response to the user providing a greater amount of force, and an encyclopedia entry for the word in response to the user providing an even greater amount of force. Or, the module 810 may select and output multiple different reference work entries within a same type of reference work. For instance, the module 810 may output a medical definition of a word in response to a touch input having a first force, a standard definition of the word in response to a touch input having a greater force, and a legal definition of the word in response to a touch input having an even greater force.
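To make this mapping concrete, the following is a minimal Python sketch (not part of the disclosure); the force ranges, expressed in arbitrary sensor units, and the entry names are illustrative assumptions:

FORCE_RANGES = [
    ((0.0, 1.0), "medical dictionary entry"),
    ((1.0, 2.0), "standard dictionary entry"),
    ((2.0, float("inf")), "legal dictionary entry"),
]

def select_entry_type(force):
    """Map a measured touch force to one of several reference work entry types."""
    for (low, high), entry_type in FORCE_RANGES:
        if low <= force < high:
            return entry_type
    return "standard dictionary entry"  # fallback for out-of-range readings

print(select_entry_type(1.4))  # -> "standard dictionary entry"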
As such, the reference entry selection module 810 allows a user to toggle through multiple reference work entries for a particular word, phrase, or topic by providing additional force to the touch-sensitive display. The order and substance of the outputted reference work entries may be configurable by a user of the device, may be set by a publisher or distributor of a corresponding content item, or the like. Furthermore, while a few example configurations have been described, multiple other configurations may also be implemented.
In some instances, the reference entry selection module 810 selects a reference work entry to output based on factors in addition to a first touch input. For instance, a user may select a word or other portion of a content item on the touch-sensitive display and, in response, the module 810 may select and output a first reference work entry. Thereafter, the user may provide an additional input and, in response, the module 810 may select and output a different reference work entry.
The additional input may comprise an additional touch, the user activating a button on the keypad, the user orally stating a command, or any other type of user input. As such, the user is able to toggle through multiple reference work entries by providing an input to the touch sensor 806 and thereafter providing additional inputs to the touch sensor 806 and/or the interface module 208. For instance, where the touch sensor 806 is capable of detecting and interpreting multiple coincident touches, the user may place a first finger on the touch sensor 806 to cause display of a first reference work entry and then may toggle through other reference work entries by tapping another finger on the touch sensor 806.
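As a rough sketch of this toggling behavior (again in illustrative Python, with the entry order an assumed, configurable list rather than one specified by the disclosure):

from itertools import cycle

ENTRY_ORDER = ["dictionary", "thesaurus", "encyclopedia", "search results"]

class EntryToggler:
    """Cycle through reference work entry types on each additional input."""

    def __init__(self, order=ENTRY_ORDER):
        self._cycle = cycle(order)
        self.current = next(self._cycle)  # the initial touch shows the first entry

    def on_additional_input(self):
        # A further tap, key press, or spoken command advances the cycle.
        self.current = next(self._cycle)
        return self.current

toggler = EntryToggler()
print(toggler.current)                # -> "dictionary"
print(toggler.on_additional_input())  # -> "thesaurus"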
In addition, the reference entry selection module 810 may select a reference entry to output based on other characteristics. For instance, the module 810 may select a particular entry to output based on a current location of the device, as determined by a GPS system resident on the device, signal triangulation, or any other location-sensing method.
FIG. 8 further illustrates that the eBook reader device 800 may include the display 220, which may be passive, emissive or any other form of display as discussed above. Also as discussed above, the display 220 and the touch sensor 806 may couple to form a touch-sensitive display. That is, the touch sensor 806 may reside underneath or above the display 220 in some instances, or may reside adjacent to the display in other instances.
The eBook reader device 800 may further be equipped with various input/output (I/O) components 222. Such components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth. In addition, a network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth. The network interface 224 may allow a user of the device 800 to download content items from the content item service 106.
The eBook reader device 800 also includes a battery and power control unit 226. The battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
Example User Interfaces and Processes
FIG. 9 illustrates an example user interface rendered by the device 800 of FIG. 8. As illustrated, the device includes a touch-sensitive display 902 that renders an eBook 904. During rendering of the eBook 904, a user makes a selection of a word on the touch-sensitive display 902, as illustrated by the highlight 126. In response, the touch-input controller 808 of the device 800 measures an amount of force 906 of the selection and provides this measured amount to the reference entry selection module 810. This module 810 may then map this particular amount of force to a particular type of reference work entry to output. For instance, the module 810 may output a first type of reference work entry for a touch-input having a measured force within a first range, a second type of reference work entry for a touch-input having a measured force within a second range, and so forth.
In other instances, meanwhile, the controller 808 may measure the amount of force 906 of the initial touch and may use this amount as a baseline for future touches. For instance, the device 800 may select and output a first type of reference work entry in response to detecting a first touch, regardless of the amount of force of the touch. Thereafter, the device 800 may output other reference work entries in response to detecting touches having greater or lesser forces than the initial “baseline” touch.
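A sketch of this baseline variant, assuming an invented delta threshold for what counts as a noticeably greater or lesser force:

BASELINE_DELTA = 0.5  # assumed minimum change relative to the baseline force

class BaselineSelector:
    """The first touch sets a baseline; later touches are judged relative to it."""

    def __init__(self):
        self.baseline = None

    def on_touch(self, force):
        if self.baseline is None:
            self.baseline = force  # record the baseline on the first touch
            return "first reference work entry"
        if force >= self.baseline + BASELINE_DELTA:
            return "next reference work entry"      # noticeably harder press
        if force <= self.baseline - BASELINE_DELTA:
            return "previous reference work entry"  # noticeably lighter press
        return "current reference work entry"       # force roughly unchanged

selector = BaselineSelector()
print(selector.on_touch(1.2))  # -> "first reference work entry" (baseline set)
print(selector.on_touch(2.0))  # -> "next reference work entry"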
In either instance, once the module 810 selects a reference work entry type, the module 810 may output the entry to the user. In the illustrated example, the module 810 has overlaid the entry onto the touch-sensitive display 902, although other implementations may output the entry in other ways. In this example, the reference entry selection module 810 has selected, based on the amount of force 906 of the touch input, a dictionary entry 908 to output on the display. As such, FIG. 9 illustrates that the device 800 outputs the dictionary entry 908 for the selected word "Mammal." FIG. 9 also illustrates that the entry 908 provides an indication 910 of the source of the reference work entry.
FIG. 10 illustrates an example user interface rendered by the device 800 after the user has increased the amount of force on the selected word. As illustrated, the user now selects the word “mammal” with an amount of force 1002 that is greater than the amount of force 906. The user may or may not have maintained contact with the touch-sensitive display 902 between the applications of these two forces. In response to the second input of the user, the touch-input controller 808 has detected the greater amount of force 1002 and provided this information to the reference entry selection module 810. In response, the module 810 has mapped this amount of force to a particular reference work entry.
Here, the module 810 has selected a thesaurus entry 1004 for output on the touch-sensitive display 902. As illustrated, the thesaurus entry includes an indication 1006 of the source of entry, along with both synonyms and antonyms of the word “mammal.” While FIG. 10 illustrates that the device 800 outputs the thesaurus entry 1004 in response to detecting the amount of force 1002 on the location of the display 902 associated with the word “mammal,” other implementations may output any other type of reference work entry. In addition, other implementations may output the thesaurus entry in response to detecting a lesser amount of force.
FIG. 11 illustrates yet another example user interface rendered by the device 800 after the user has yet again provided an input having an increased amount of force 1102 on the selected word “mammal.” After the touch-input controller 808 measures the increased force 1102 and provides this information to the reference entry selection module 810, the module 810 selects a third, different type of reference work entry to output on the device. In the illustrated example, the device 800 outputs an encyclopedia entry 1104 for the word “Mammal,” which again includes an indication 1106 of the source of the entry.
With use of the described techniques, the user of the device 800 is able to toggle through multiple different reference work entries associated with a particular word, phrase, image, or other portion of a content item. For instance, the user may toggle through a dictionary entry, a thesaurus entry, an encyclopedia entry, and/or any other type of reference work entry by altering the amount of force applied to the touch-sensitive display 902. In addition, the device 800 may store locally some or all of the underlying reference works, or the device 800 may request and receive the surfaced entries over a network on an on-demand basis.
FIG. 12 illustrates another example user interface rendered by the device 800. Here, the touch-sensitive display 902 of the device 800 renders the eBook 502 discussed with reference to FIG. 5 above. The user selects (via a touch input) the word "bear" on the display 902, as indicated by the highlight 126. In response, the touch-input controller 808 measures an amount of force 1202 of the selection. Based at least partly on the amount of force 1202 of the selection, the reference entry selection module 810 outputs a thesaurus entry 1204 for the word "bear," which includes an indication 1206 of the type of reference work being utilized.
In this instance, the module 810 has additionally determined that the eBook 502 is associated with the genre of "business." The module 810 may make this determination using any of the techniques described above, such as by referencing a prior categorization of the eBook 502, by analyzing key words of the eBook 502, or the like. After making this determination, the reference entry selection module 810 outputs a business-related thesaurus entry 1204, rather than a standard thesaurus entry. As such, the entry 1204 provides business-based synonyms for the term "bear," such as pessimist, cynic, and defeatist. In addition, the entry 1204 includes an indication 1208 that this entry 1204 corresponds to a "business" use of the term.
FIG. 13 illustrates another example user interface rendered by the device 800 after the user has provided additional force on the touch-sensitive display 902. Here, the touch-input controller 808 measures an amount of force 1302 provided by the additional input and provides this information to the reference entry selection module 810. In response, the module 810 selects a reference work entry to output on the device 800. In this instance, the module 810 again outputs a thesaurus entry 1304 (as shown by indication 1306). This thesaurus entry, however, is associated with a standard use of the term “bear” (as shown by indication 1308) and, as such, this entry includes synonyms such as carry, convey, and deliver.
As such, FIGS. 12-13 illustrate that the user is able to toggle through multiple different context-sensitive reference work entries by providing varying amounts of force on the touch-sensitive display. In combination with the techniques described above with reference to context-sensitive reference works, a user is able to view an initial reference work entry that has been determined to most likely relate to the illustrated eBook, while providing the user the ability to toggle through multiple other entries. For instance, the techniques may surface a business-related thesaurus entry for a business-related book, while allowing the user to toggle through other thesaurus entries for a same word by applying an increased or decreased force to the display 902.
FIG. 14 is a flow diagram showing a process 1400 of selecting which of multiple reference work entries to output based on an amount of measured force associated with a selection. This process (as well as the processes described below) is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. While this process illustrates one example order in which the operations of the process may occur, the example order is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
For discussion purposes, the process 1400 is described with reference to the device 800 of FIG. 8, although other devices and architectures may implement this process.
At operation 1402, the device 800 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 800 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a word or phrase, a location on a map, a portion of an image, or any other portion of any type of content item.
At operation 1404, the device 800 causes display of a first reference work entry at least partly in response to detecting the initial selection. For instance, the device 800 may display the first reference work entry on the touch-sensitive display of the device, on another display of the device, or on another display of a different device. Further, in some instances, the device 800 selects the first reference work entry based at least in part on the measured force of the initial selection. In other instances, meanwhile, the device 800 selects the first reference work entry regardless of an amount of force. In these latter instances, the measured force of the initial selection may act as a baseline for future touch inputs.
In each of these instances, the displayed first reference work entry may also be based on a location of the initial selection on the touch sensor. For instance, if a user provides the touch input on a portion of a touch-sensitive display displaying the word “mammal,” the device may output a reference work entry (e.g., a dictionary entry, an encyclopedia entry, etc.) associated with the word “mammal.”
At operation 1406, the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
At operation 1408 and in response to detecting the greater or lesser force, the device 800 causes display of a second, different reference work entry. Again, the device may display this entry on any display, such as on the touch-sensitive display of the device 800. Further, this entry may be selected based on an amount of force of the selection and/or based on a difference in force between the initial selection and the greater/lesser force. This second, displayed entry may also be based on the portion of the content item that the user selects. In the example above, for instance, the device 800 may output a thesaurus entry for the word “mammal” in response to the user providing a greater amount of force at the location of the display displaying this word.
FIG. 15 illustrates yet another example user interface rendered by the device 800 of FIG. 8 after a user has made a selection of a word on the touch-sensitive display 902 of the device 800. As illustrated, the device 800 currently outputs the eBook 904 and the user provides a first input 1502 in the form of a touch on the touch-sensitive display 902. In response, the device 800 outputs a reference work entry associated with the selected word. In the illustrated example, the device 800 again displays a dictionary entry 908 for the selected word “mammal.”
FIG. 16 illustrates an example user interface rendered by the device 800 after the user provides an additional input 1602. This additional input may comprise additional force on the touch-sensitive display, a touch on the display that is coincident with the initial touch input, activation of a key on a keypad of the device 800, an oral command spoken by the user and captured by a microphone of the device 800, or any other type of input.
In response to the additional input 1602, the device 800 outputs a second, different reference work entry 1004. The device 800 may select this entry 1004 based on actual contents of the additional input 1602, or the device 800 may select this entry 1004 based on the entry 1004 being “next in line” for display.
In either instance, FIGS. 15-16 illustrate that a user of the device 800 is able to toggle through multiple reference work entries associated with a portion of a content item by providing a touch input and, thereafter, providing other preconfigured inputs. For instance, a user of the device may view a dictionary entry of a word by providing an initial touch input on the touch-sensitive display of the device. Thereafter, the user may toggle through multiple other reference work entries associated with the word by providing a second, coincident touch on the display. As such, the user could view a thesaurus entry by maintaining contact with her finger on the display and providing a first tap, an encyclopedia entry by providing a second tap, search engine results by providing a third tap, and so forth.
Conversely, the user could toggle through multiple different reference work entries by providing a touch input and then selecting a particular button on a keypad of the device. For instance, after providing a touch input, the user could select the "T" on the keypad to view the thesaurus entry for the word, the "E" on the keypad to view the encyclopedia entry of the word, and so forth. Alternatively, the user may select a single button that toggles through these reference work entries in a preconfigured order.
In yet another example, the user could toggle through multiple different reference work entries by providing a touch input and then providing an oral command. For instance, the user could say aloud "thesaurus," "encyclopedia," "map," and so forth. In response to the microphone of the device capturing the command, the device 800 may process this request and output the requested reference work entry. Alternatively, the user may provide a single command (e.g., "next") that, when received, causes the device to toggle through the reference work entries in a preconfigured order.
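These keypad and oral-command variants reduce to a dispatch table from the second input to an entry type, as in the sketch below; the "T" and "E" bindings come from the example above, while the remaining bindings are hypothetical:

BINDINGS = {
    "T": "thesaurus entry",
    "E": "encyclopedia entry",
    "thesaurus": "thesaurus entry",
    "encyclopedia": "encyclopedia entry",
    "map": "atlas entry",
}

def entry_for_input(user_input):
    """Resolve a key press or a spoken command to a reference work entry type."""
    key = user_input.upper() if len(user_input) == 1 else user_input.lower()
    return BINDINGS.get(key)  # None for unbound inputs

print(entry_for_input("t"))             # -> "thesaurus entry"
print(entry_for_input("encyclopedia"))  # -> "encyclopedia entry"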
FIG. 17 is a flow diagram showing a process 1700 of outputting a first reference work entry in response to receiving a touch selection on a display and, thereafter, outputting a second, different reference work entry after receiving an additional input.
At operation 1702, the device 800 receives a first user input on a touch-sensitive display 902 of the device 800. For instance, the user may select a particular portion of a content item that the device outputs, such as a particular word or phrase of an eBook.
At operation 1704, the device causes display of a first reference work entry associated with the selected portion of the content item. For instance, if the user selects the word “mammal,” the device 800 may display a dictionary entry or other reference work entry for this word. Conversely, if the user selects the word “Los Angeles,” the device 800 may display the location of this city on a map.
At operation 1706, the user provides, and the device receives, a second user input, which may or may not be coincident with the first input provided on the touch-sensitive display. For instance, the user may provide additional force on the display, may provide an additional point of contact or touch on the display, may activate a button on a keypad of the device, may speak a command, and so forth.
At operation 1708, in response to receiving the second input, the device 800 causes display of a second, different reference work entry. In some instances, this reference work entry is also associated with the initially selected portion of the content item. For instance, the device 800 may display a thesaurus entry for a selected word after displaying a dictionary entry for the word at the operation 1704.
FIG. 18 illustrates an example user interface rendered by the device 800 of FIG. 8 after a user has made a selection of a word on the touch-sensitive display 902 of the device 800. In this example, the device 800 measures a force of the selection and determines whether to output a particular reference work entry in response to the selection, or whether to enable the user of the device 800 to select a reference work entry for output. For instance, the device 800 may determine whether a provided force is greater or less than a threshold force and, in response to this determination, may output the particular reference work entry or may enable the user to make the selection.
FIG. 18, for instance, illustrates that the device 800 outputs a reference work entry (e.g., the dictionary entry 908) after determining that an amount of force 1802 of this selection is less than a threshold force. This threshold force may be configurable by a user of the device 800 in some instances.
FIG. 19, meanwhile, illustrates an example user interface rendered by the device 800 after the device determines that an amount of force of this selection is greater than the threshold force. Here, the device 800 enables the user to select a reference work entry to output, each of which may be associated with the word “mammal.” In this example, the device 800 displays a menu 1902 with selectable icons, such as “dictionary entry,” “thesaurus entry,” and the like. With the menu 1902, the user may select one of the icons to cause the device 800 to display the corresponding reference work entry. For instance, the user may select “encyclopedia entry” to view an encyclopedia entry for the word “mammal.”
FIGS. 18-19 thus illustrate that the device 800 may determine whether to output an entry or whether to enable a user to select an entry based on an amount of force of a user's selection. While these figures illustrate that the device enables user selection in response to detecting a force that is greater than a threshold force, in other instances the device may operate in an opposite manner. That is, the device 800 may enable the user to select a reference work entry in response to detecting a force that is less than a threshold force. Similarly, in these instances, the device 800 may output a particular reference work entry in response to detecting a force that is greater than the threshold force.
Furthermore, while FIG. 19 illustrates one example manner that the user may select a reference work entry for output, the device 800 may enable this selection in a number of other ways. For instance, the device 800 may allow the user to orally state which reference work entry to output (e.g., “thesaurus”) or may allow the user to make this selection in any other manner.
These techniques may also operate in conjunction with the techniques for toggling through the multiple reference work entries, described above. For instance, the device may output a first entry in response to a first amount of force, a second entry in response to a second, greater amount of a force, and the selectable menu 1902 in response to a third, even greater amount of force.
FIG. 20 is a flow diagram showing a process 2000 of determining whether to output a reference work entry or whether to allow a user to select which reference work entry to output based on an amount of force of a selection. The device 800 and/or a component in communication with the device 800 may perform this process in some instances.
At operation 2002, the device detects a selection on a touch-sensitive display of the device. For instance, the device may detect the user providing a touch input via her finger, a stylus, or the like. This selection may be associated with a particular portion of a content item being rendered by the device, such as a particular word or phrase of an eBook.
At operation 2004, the device measures an amount of force of the selection. At operation 2006, the device then determines whether the measured amount of force is greater than or less than a preconfigured threshold value. In response to determining that the measured force is less than the threshold value, the device causes display of a reference work entry on the display at operation 2008. Conversely, in response to determining that the measured amount of force is greater than the threshold value, the device 800 enables the user to select an entry at operation 2010. For instance, the device 800 may output the selectable menu 1902 or may otherwise allow the user to select which reference work entry to display on the device.
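A minimal sketch of this branch of process 2000; the threshold value (which, per the description above, may be user-configurable) and the return payloads are illustrative assumptions:

THRESHOLD_FORCE = 2.5  # assumed preconfigured threshold value

def handle_selection(word, force):
    """Operations 2006-2010: show an entry directly, or let the user choose."""
    if force < THRESHOLD_FORCE:
        # Operation 2008: display a particular reference work entry.
        return ("entry", "dictionary entry for '" + word + "'")
    # Operation 2010: enable the user to select which entry to display.
    return ("menu", ["dictionary entry", "thesaurus entry", "encyclopedia entry"])

print(handle_selection("mammal", 1.0))  # -> a direct entry
print(handle_selection("mammal", 3.0))  # -> a selectable menu (like menu 1902)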
Surfacing Content Based on Touch Gestures
Example eBook Reader Device
FIG. 21 is a block diagram of selected modules of an eBook reader device 2100 that may implement a touch-sensitive display and that is capable of outputting different content based on an amount of force applied to the touch-sensitive display. The illustrated eBook reader device 2100 may include several similar or identical components as the eBook reader devices 104(1) and 800 described above.
In this example, the eBook reader device 2100 is a dedicated, handheld eBook reader device, although other electronic devices may implement these techniques and, hence, may include some of the functionality described herein. For instance, mobile telephones, tablet computers, laptop computers, desktop computers, personal media players, portable digital assistants (PDAs), kiosks, or any other type of electronic device may implement the components and techniques described below.
In a basic configuration, the example eBook reader device 2100 includes one or more processing units 2102 and memory 2104. In addition, the eBook reader device 2100 may include the touch sensor 806 that enables a user of the device to operate the device via touch inputs. In some instances, the touch sensor 806 and the display 220 are integral to provide a touch-sensitive display that displays content items (e.g., eBooks) and allows users to navigate the content items via touch inputs on the display.
The memory 2104 may be used to store any number of functional components that are executable on the processing unit(s) 2102, as well as data and content items that are rendered by the eBook reader device 2100. Thus, the memory 2104 may store an operating system and an eBook storage database to store the one or more content items 206 described above, such as eBooks, audio books, songs, videos, still images, and the like.
A content presentation application 210 renders the content items 206. The content presentation application 210 may be implemented as various applications depending upon the content items. For instance, the application 210 may be an electronic book reader application for rendering electronic books, an audio player for playing audio books or songs, a video player for playing video, and so forth.
The memory 2104 may also store user credentials 212. The credentials 212 may be device specific (set during manufacturing) or provided as part of a registration process for a service. The credentials may be used to ensure compliance with DRM aspects of rendering the content items 206.
The memory 2104 also stores (persistently or temporarily) one or more reference works 214, such as one or more dictionaries, thesauruses, encyclopedias, atlases, gazetteers, and the like. In some instances, the memory 2104 stores multiple categories of a particular kind of reference work as described above. The memory 2104 may also include the interface module 208 that, as described above, provides for user operation of the device 2100. One feature of the interface module 208 allows a user to request to receive information (e.g., from a reference work, from the Web, etc.) regarding a word, phrase, or topic found within one of the content items 206. For instance, the interface module 208 may allow the user to request a definition of a word from a dictionary, synonyms from a thesaurus, a map from an atlas, search results from the Web, and the like.
The interface module 208 may facilitate textual entry of the request (e.g., via a cursor, controller, keyboard, etc.), audible entry of the request (e.g., via a microphone), or entry of the request in any other manner.
The memory 2104 also stores the touch-input controller 808 to detect touch inputs received via the touch sensor 806 and, in some instances, to measure a force of the touches. In some instances, the touch-input controller 808 is configured to detect multiple touches on the touch sensor 806 as well as to measure an amount of force of each of the touches.
The eBook reader device 2100 also stores a term aggregation module 2106 and a search refinement module 2108. The term aggregation module 2106 is executable on the processing unit(s) to aggregate instances of a selected term and output one or more of these instances, such as on the display 220. For instance, in response to a predefined user gesture, the module 2106 may aggregate or otherwise determine each instance of a selected term within an eBook that the device 2100 currently renders, within other eBooks stored on the device 2100, and/or within available eBooks generally. The module 2106 may then output a visual menu that indicates these other instances of the selected term.
In one specific example, the user of the device 2100 may select a particular word in an illustrated eBook with a particular amount of force. In response to this selection, the device 2100 may output information about the selected term. For instance, the device 2100 may illustrate a pop-up menu showing an entry associated with the selected word from one of the reference works 214, such as a dictionary entry. Thereafter, the user may provide a greater or lesser amount of force to the selected word on the display 220 and, in response, the touch-input controller 808 may provide a notification of this greater or lesser force to the term aggregation module 2106. In response to this notification, the term aggregation module 2106 may output other instances of the selected word within the illustrated eBook, as described in detail below. Additionally or alternatively, the term aggregation module 2106 may, in response to the greater or lesser force, output other instances of the selected word within other eBooks stored on the device 2100 and/or other eBooks that are not stored on the device 2100. While a few example configurations have been described, multiple other configurations may also be implemented.
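As a rough illustration of the aggregation step itself, the sketch below scans an eBook's text for instances of a selected term and returns each location with a surrounding snippet; the snippet width and case-insensitive matching are simplifying assumptions:

import re

def aggregate_instances(text, term, limit=5):
    """Return (offset, snippet) pairs for instances of term within text."""
    instances = []
    for match in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
        start = max(match.start() - 20, 0)
        snippet = text[start:match.end() + 20].replace("\n", " ")
        instances.append((match.start(), "..." + snippet + "..."))
        if len(instances) == limit:
            break
    return instances

book_text = "Enter Ophelia. ... Later, Ophelia speaks with Hamlet."
for offset, snippet in aggregate_instances(book_text, "Ophelia"):
    print(offset, snippet)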
FIG. 21 further illustrates that the memory 2104 may store a search refinement module 2108. The search refinement module 2108 is executable on the processing unit(s) 2102 to refine a user's search based on varying magnitudes of force associated with a selection of the user. For instance, a user may request that the device 2100 perform a local or remote search by selecting a particular word of an illustrated eBook to use as a query for the search. This search may comprise a search of eBooks or other content items stored on the device, a search of results available on the Web, or the like.
After the device 2100 surfaces the results, the user may thereafter provide a selection having a greater or lesser amount of force than the initial selection. In response, the search refinement module 2108 may refine the search by, for instance, expanding or narrowing it. For instance, the search refinement module 2108 may expand a search in response to the user providing a lesser amount of force on the touch-sensitive display, and/or may narrow a search in response to the user providing a greater amount of force, each of which is described in more detail below.
The eBook reader device 2100 may further be equipped with various input/output (I/O) components 222, similar to the devices 104(1) and 800 described above. Such components may include various user interface controls (e.g., buttons, a joystick, a keyboard, etc.), audio speakers, connection ports, and so forth. In addition, a network interface 224 supports both wired and wireless connection to various networks, such as cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth. The network interface 224 may allow a user of the device 2100 to download content items from the content item service 106.
The eBook reader device 2100 also includes a battery and power control unit 226. The battery and power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery.
Example User Interfaces and Processes for Surfacing Other Instances of a Selected Term
FIG. 22 illustrates an example user interface rendered by the device 2100 of FIG. 21. As illustrated, the device includes the touch-sensitive display 902 that renders an eBook 2202, which in this example comprises the play "Hamlet" by William Shakespeare. During rendering of the eBook 2202, a user makes a selection of a word ("Ophelia") on the touch-sensitive display 902, as illustrated by the highlight 126. In response, the touch-input controller 808 of the device 2100 measures an amount of force 2204 of the selection. In addition, the device 2100 may output information regarding the selected word. In one example, the device 2100 may output a reference work entry associated with the selected word. Here, the device 2100 outputs an entry 2206 describing the role of the selected character, Ophelia, within the illustrated eBook 2202. The entry 2206 also includes an icon 2208 (entitled "More") that, when selected, expands or otherwise provides additional information from the entry 2206.
In some instances, the touch-input controller 808 may provide an indication of the measured amount of force to the reference entry selection module 810 of FIG. 8, which may map the detected amount of force to a particular type of reference work entry to output. For instance, the module 810 may output a first type of reference work entry for a touch-input having a measured force within a first range, a second type of reference work entry for a touch-input having a measured force within a second range, and so forth.
In other instances, meanwhile, the controller 808 may measure the amount of force 2204 of the initial touch and may use this amount as a baseline for future touches. For instance, the device 2100 may select and output a first type of information (e.g., entry 2206, Web search results for “Ophelia,” etc.) in response to detecting a first touch, regardless of the amount of force of the touch. Thereafter, the device 2100 may provide different outputs in response to detecting touches having greater or lesser forces than the initial “baseline” touch, as described immediately below.
FIG. 23 illustrates an example user interface rendered by the device 2100 after the user has increased the amount of force on the selected word. As illustrated, the user now selects the word “Ophelia” with an amount of force 2302 that is greater than the amount of force 2204. The user may or may not have maintained contact with the touch-sensitive display 902 between the applications of these two forces. In response to the second input of the user, the touch-input controller 808 has detected the greater amount of force 2302 and has provided this information to the term aggregation module 2106. In response, the module 2106 may determine whether the increased force is greater than a threshold and, if so, may output information that is different than the entry 2206 of FIG. 22.
For instance, the module 2106 may output other instances of the selected term “Ophelia,” either within the eBook 2202, within other eBooks stored on the device 2100, or within other content items. Here, the module 2106 has output a menu 2304 of other instances of the term “Ophelia” within the illustrated eBook 2202, Hamlet. In some instances, each listing of the instance of the selected term may be selectable by the user, such that a selection of that instance may cause the device 2100 to navigate the user to that location in the eBook 2202 or may cause an overlay of that location in the eBook 2202.
In the illustrated example, the menu 2304 displays other mentions or instances of the term "Ophelia" starting with the beginning of the play. Conversely, the menu 2304 could depict the instances that are nearest the selected instance of the term, or may surface these other instances in any other manner. In either instance, the menu 2304 may include the icon 2208 that, when selected, causes the device 2100 to illustrate more instances of "Ophelia" and their corresponding locations.
While FIG. 23 illustrates that the device 2100 outputs the menu 2304 in response to detecting the amount of force 2302 on the location of the display 902 associated with the word "Ophelia," other implementations may output this menu 2304 based on detecting a force at another location of the display 902.
FIG. 24 illustrates yet another example user interface rendered by the device 2100 after the user has yet again provided an input having an increased amount of force 2402 on the selected word “Ophelia.” The touch-input controller 808 measures the increased force 2402 and provides this information to the term aggregation module 2106. In response, the module 2106 outputs still different information regarding the selected term. In the illustrated example, the module 2106 outputs a menu 2404 of other mentions of the term “Ophelia” within other eBooks or content items stored on or accessible by the device 2100 of the user. That is, the menu 2404 may include indications of other works that the user of the device 2100 has previously purchased and/or obtained (e.g., downloaded).
Here, the menu 2404 indicates that the term "Ophelia" shows up in three eBooks to which the device has access. With use of the menu 2404, the user may select different ones of the listings, which in turn may cause the device to display the selected eBook, potentially at the first use of the term "Ophelia" within that book. While FIG. 24 illustrates other mentions or instances of the selected term within items to which the device user has access, in other implementations the menu 2404 may include other items to which the user currently does not have access. Further, the menu could provide icons that, when selected, initiate a request for a sample of an item or a request to obtain (e.g., purchase) the item. As such, the user may be able to obtain other items that are associated with or otherwise reference the term "Ophelia," if the user so desires.
FIG. 25 illustrates an example user interface rendered by the device 2100 after the user has yet again increased an amount of force 2502 on the selected word. Here, the device 2100 outputs search results 2504 associated with a query comprising the selected word (“Ophelia”) based on this even greater amount of force 2502. As illustrated, the search results may comprise Web results provided by a search engine.
With use of the described techniques, the user of the device 2100 is able to toggle through different information associated with a particular term (e.g., word or phrase) by altering the amount of force applied to the touch-sensitive display 902. Specifically, the user may request to view other instances of a selected term by providing a greater or lesser amount of force to the display 902 than an initial selection. Furthermore, while FIGS. 22-25 illustrate an example navigation path associated with four different levels of force, other implementations may employ any other combination of information associated with the selected term. In addition, both the navigation path (i.e., the surfaced information associated with the selected term) and the force thresholds for navigating this path may be configured by the user in some implementations.
FIG. 26 is a flow diagram showing a process 2600 of causing display of other instances of a selected term based on a force associated with a selection. This process (as well as the processes described below) is illustrated as a logical flow graph, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. While this process illustrates one example order in which the operations of the process may occur, the example order is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the process.
For discussion purposes, the process 2600 is described with reference to the device 2100 of FIG. 21, although other devices and architectures may implement this process.
At operation 2602, the device 2100 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 2100 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a term (e.g., word or phrase), a location on a map, a portion of an image, or any other portion of any type of content item. In this example, the user selects a particular term.
At operation 2604, the device 2100 causes display of information associated with the selected term at least partly in response to detecting the initial selection. For instance, the device 2100 may display a reference work entry or other information associated with the term on the touch-sensitive display of the device, on another display of the device, or on another display of a different device. Further, in some instances, the device 2100 selects the information (e.g., the reference work entry) based at least in part on the measured force of the initial selection. In other instances, meanwhile, the device 2100 selects the information to surface regardless of an amount of force. In these latter instances, the measured force of the initial selection may act as a baseline for future touch inputs.
At operation 2606, the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
At operation 2608, and in response to detecting the greater or lesser force, the device 2100 causes display of one or more other instances of the selected term. Again, the device may display these instances on any display, such as on the touch-sensitive display of the device 2100. Further, the other instances of the selected term may comprise other instances of the term within the illustrated content item, within other content items associated with the device 2100, or within other content items generally. In some implementations, the device 2100 determines which of these "other instances" to surface based on an amount of force of the selection and/or based on a difference in force between the initial selection and the greater/lesser force. For instance, a first greater amount of force may result in the device 2100 surfacing other instances of the selected term within the currently illustrated eBook, while a second, even greater amount of force may result in the device 2100 surfacing other instances of the selected term within the other eBooks stored on or accessible by the device. Other examples are also possible.
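One way to realize these escalating scopes, sketched with assumed force tiers (the description leaves the tier boundaries open):

def instance_scope(initial_force, new_force):
    """Map the change in touch force to the scope of the instance search."""
    delta = new_force - initial_force
    if delta >= 2.0:
        return "other available eBooks generally"  # greatest increase
    if delta >= 1.0:
        return "eBooks stored on or accessible by the device"
    if abs(delta) >= 0.5:
        return "currently illustrated eBook"
    return "no change"  # force essentially unchanged from the baseline

print(instance_scope(1.0, 1.6))  # -> "currently illustrated eBook"
print(instance_scope(1.0, 3.5))  # -> "other available eBooks generally"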
FIG. 27 illustrates an example user interface rendered by the device 2100 after a user has made a selection of a word within the eBook 2202 being rendered on the touch-sensitive display 902 of the device 2100. Here, the user provides a touch of a certain threshold amount of force 2702 on the display, which the touch-input controller 808 detects. In response, the device 2100 outputs a menu 2704 requesting that the user select whether to view previous or subsequent instances of the selected word within the illustrated content item.
In this example, the menu comprises a left-facing arrow 2706 that, when selected, causes the device to render instances of “Ophelia” within Hamlet that occur prior to a current reading location of the user. The menu 2704 also includes a right-facing arrow 2708 that, when selected, causes the device to render instances of “Ophelia” within Hamlet that occur subsequent to a current reading location of the user. While FIG. 27 illustrates arrows, the menu 2704 may include any other icons to allow the user to select whether to surface previous instances of a selected term, subsequent instances, or both.
FIG. 28 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 2802 that both selects a word within a content item and requests to view subsequent instances of the selected word within the illustrated content item. Here, the gesture 2802 comprises the user pressing on a particular term of the eBook 2202 (“Ophelia”) with a certain amount of force 2702, while also swiping her finger to the right. In this example, the device 2100 interprets this gesture to mean that the user wishes to view instances of “Ophelia” within Hamlet that occur subsequent to a current reading location of the user within the eBook.
FIG. 29 illustrates an example user interface rendered by the device 2100 after the user makes the gesture 2802 in FIG. 28. As illustrated, in response to the gesture 2802 the device 2100 surfaces a menu 2902 that indicates that the illustrated eBook 2202 does not contain any subsequent instances of the selected term. In addition, the menu 2902 includes a link 2904 that, when selected, allows the user to view previous instances of the term “Ophelia” within the illustrated eBook 2202.
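A sketch of how the press-and-swipe gesture of FIGS. 28-29 might be classified; the pixel and force thresholds are assumed tuning values:

SWIPE_THRESHOLD_PX = 40  # assumed minimum horizontal travel for a swipe
FORCE_THRESHOLD = 2.0    # assumed press force needed to arm the gesture

def interpret_gesture(force, dx):
    """Classify a press with horizontal travel dx (in pixels, right positive)."""
    if force < FORCE_THRESHOLD:
        return "ignore"
    if dx >= SWIPE_THRESHOLD_PX:
        return "show subsequent instances"  # rightward swipe, as in FIG. 28
    if dx <= -SWIPE_THRESHOLD_PX:
        return "show previous instances"    # leftward swipe
    return "show direction menu"            # press alone, as in FIG. 27

print(interpret_gesture(2.5, 60))  # -> "show subsequent instances"
print(interpret_gesture(2.5, 0))   # -> "show direction menu"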
FIG. 30 is a flow diagram showing a process 3000 of causing display of information associated with a portion of a content item in response to receiving a touch selection of the portion on a display and, thereafter, outputting other instances of the portion after receiving an additional input. The process 3000 includes, at 3002, receiving a first user input selecting a portion of a content item being output on a touch-sensitive display. For instance, the user may select a particular term of an eBook, a geographical location on a map, a character in an illustration, or any other defined portion of a content item. In response, at 3004 the device causes display of information associated with the selected portion. For instance, the device may cause display of a reference work entry associated with a selected word or phrase, a biography of a selected character or person, a location on a map of a selected geographical location, or any information associated with the selected portion.
At 3006, the device receives a second user input requesting to view other instances of the selected portion of the content item within the content item. For instance, the user may provide a greater or lesser amount of force on the selected portion, as discussed. Additionally or alternatively, the user may select a particular button on the keypad of the device, may speak an oral command, may provide a second and coincident touch on the display, or the like. In response, at 3008 the device may cause display of other instances of the selected portion within the content item. For instance, the device may indicate other textual or graphical locations where the selected portion occurs within the content item. This may include other textual mentions of the portion, other instances of the selected portion within an image, photograph, or map, or any other type of instance of the selected portion of the item.
Example User Interfaces and Processes for Refining Search Results
FIG. 31 illustrates an example user interface rendered by the device 2100 after the user selects a word, via a touch input 3102, within the illustrated eBook 2202. In response to the selection, the device 2100 may form a query comprising the selected word and may perform a local and/or remote search based on the query. For instance, the device 2100 may pass the query to a search engine to run a Web search on the selected term "Ophelia" and may thereafter output search results 3104. While the device may conduct a Web search with use of the selected term, the device 2100 may alternatively conduct a local search of the device 2100 and/or a search of one or more other defined devices or domains.
FIG. 32 illustrates an example user interface rendered by the device 2100 after the user provides a greater amount of force 3202 to the selected word. Here, the touch-input controller 808 detects this greater amount of force 3202 and passes an indication of the increased force to the search refinement module 2108. In response, the search refinement module 2108 may refine (e.g., narrow) the search results 3104. For instance, the search refinement module 2108 may form another, narrower query in response to receiving this indication and may thereafter display search results 3204 associated with this narrower query.
In some instances, the narrowed query comprises the selected term and at least one other term associated with the content item being output by the touch-sensitive display. For instance, the additional term may comprise at least a portion of a title of the content item, an author of the content item, a topic of the content item, a categorization of the content item, or any other information associated with the content item, the exact selection of which may or may not be configurable by the user. In the illustrated example, the module 2108 conducts a search for a query comprising the selected term (“Ophelia”) and a title of the illustrated eBook 2202 from which the term was selected (“Hamlet”). As such, the device 2100 outputs search results 3204 for the query “Ophelia Hamlet.”
Additionally or alternatively, the search refinement module 2108 may broaden or narrow displayed search results based on crowd-sourcing—that is, with reference to how other users have previously navigated search results associated with the illustrated search. In some instances, the module 2108 may reference the previous navigation of users who have conducted a search from within a particular content item, such as the illustrated play "Hamlet." For instance, if a large number of users selected a certain set of items in search results when conducting the search "Ophelia" within Hamlet (or within related works), then the module 2108 may highlight these popular items in response to receiving a user request to narrow a search for "Ophelia" within the eBook 2202 (Hamlet).
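A toy sketch of this crowd-sourced highlighting, assuming a hypothetical store of aggregated selections by prior users who ran the same in-book search:

CLICK_COUNTS = {  # hypothetical aggregated selections for "Ophelia" within Hamlet
    "example.com/ophelia-character-study": 940,
    "example.com/hamlet-act-iv-summary": 310,
    "example.com/ophelia-name-origin": 25,
}

def rank_by_popularity(results, click_counts=CLICK_COUNTS):
    """Order results so the items prior readers selected most often come first."""
    return sorted(results, key=lambda r: click_counts.get(r, 0), reverse=True)

print(rank_by_popularity(list(CLICK_COUNTS)))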
In some instances, the user may broaden a search by lessening the amount of force provided on the touch-sensitive display 902. For instance, if the user were to lessen the amount of force from the amount 3202 back to the amount of the initial touch input 3102, then the search refinement module 2108 may broaden the search by forming the query “Ophelia” rather than “Ophelia Hamlet.” In response, the device 2100 may display the search results 3104.
FIG. 33 is a flow diagram showing a process 3300 of refining search results on a touch-sensitive display based on the device detecting a greater or lesser amount of force on a selected term. For discussion purposes, the process 3300 is described with reference to the device 2100 of FIG. 21, although other devices and architectures may implement this process.
At operation 3302, the device 2100 detects an initial selection on a touch sensor and measures a force of the selection. For instance, a user of the device 2100 may provide a touch input onto the touch-sensitive display of the device via a finger, stylus, or the like. This initial selection may select a particular portion of a content item, such as a term (e.g., word or phrase), a location on a map, a portion of an image, or any other portion of any type of content item. In this example, the user selects a particular term.
At operation 3304, the device 2100 forms a query comprising (e.g., consisting of) the selected term and thereafter causes display of search results associated with this term. In some instances, the search results are local, remote, web-based, or any combination thereof. For example, the device may submit the formed query to a remote search engine. At operation 3306, the touch sensor of the device detects a force that is greater or lesser than the measured force of the initial selection. For instance, the user may have applied more force via her finger or a stylus, with or without removing her finger or stylus from the touch sensor after making the initial selection.
At operation 3308 and in response to detecting the greater or lesser force, the device 2100 refines the displayed search results. For instance, the search refinement module 2108 may form another query and may surface search results associated with this broader or narrower query.
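Assuming the touch controller reports a scalar force reading with each event, and that a fixed tolerance separates a comparable force from a greater or lesser one, process 3300 might map force changes to queries as in the sketch below; the tolerance value is illustrative only.

```python
# Illustrative sketch of process 3300 only: map a follow-up press to a
# broader or narrower query. The tolerance value is an assumption.
FORCE_TOLERANCE = 0.15  # arbitrary units

def query_for_force(initial_force: float, new_force: float,
                    selected_term: str, item_title: str) -> str | None:
    """Return the refined query to run, or None if the force is comparable."""
    if new_force > initial_force + FORCE_TOLERANCE:
        return f"{selected_term} {item_title}"   # greater force narrows
    if new_force < initial_force - FORCE_TOLERANCE:
        return selected_term                     # lesser force broadens
    return None                                  # comparable force: no change

print(query_for_force(0.4, 0.8, "Ophelia", "Hamlet"))  # -> "Ophelia Hamlet"
print(query_for_force(0.8, 0.3, "Ophelia", "Hamlet"))  # -> "Ophelia"
```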
FIG. 34 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 3402 that both selects a word within the illustrated eBook 2202 and requests to narrow illustrated search results associated with the selected word. In this example, the user may have previously requested to view search results associated with the word “Ophelia” and, in response, the device 2100 illustrates corresponding search results 3104. Thereafter, the user performs the gesture 3402, which in this example comprises the user swiping downwards relative to the controls of the device 2100 or the orientation of the eBook 2202.
FIG. 35 illustrates an example user interface rendered by the device 2100 after the user performs the gesture 3402. As illustrated, the device has narrowed the search results in response to detecting the gesture. Specifically, the search refinement module 2108 has formed a query (“Ophelia Hamlet”) that is narrower than the previous query (“Ophelia”). In addition, the device 2100 now displays search results 3204 associated with the narrower query.
FIG. 36 illustrates an example user interface rendered by the device 2100 while a user makes a gesture 3602 that both selects a word within the illustrated eBook 2202 and requests to expand illustrated search results associated with the selected word. In this example, the device 2100 currently displays the search results 3204 discussed immediately above with reference to FIG. 35. In addition, in this example the gesture 3602 comprises the user performing an upwards swipe.
FIG. 37 illustrates an example user interface rendered by the device 2100 after the user performs the upward swipe gesture of FIG. 36. As illustrated, the device has expanded search results in response to detecting the gesture. Specifically, the search refinement module 2108 has surfaced search results 3104 associated with the query “Ophelia,” rather than the previously surfaced results associated with the narrower query “Ophelia Hamlet.” As such, the user has been able to quickly and efficiently narrow and/or broaden search results with use of predefined and intuitive gestures. While the foregoing figures have provided a few predefined gestures and corresponding directionality that may be suitable for refining search results, multiple other gestures and/or directions may be used in other implementations.
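The swipe behavior of FIGS. 34-37 could be sketched, again only under stated assumptions, with a small stack of progressively narrower queries, where a downward swipe pushes a narrower query and an upward swipe pops back to a broader one.

```python
# Illustrative sketch only: a downward swipe narrows the displayed results,
# an upward swipe broadens them. The query-stack approach is an assumption.

def on_swipe(direction: str, query_stack: list[str],
             narrower_query: str) -> str:
    """Update the stack for a swipe and return the query now displayed."""
    if direction == "down" and query_stack[-1] != narrower_query:
        query_stack.append(narrower_query)   # narrow, per FIGS. 34-35
    elif direction == "up" and len(query_stack) > 1:
        query_stack.pop()                    # broaden, per FIGS. 36-37
    return query_stack[-1]

stack = ["Ophelia"]
print(on_swipe("down", stack, "Ophelia Hamlet"))  # -> "Ophelia Hamlet"
print(on_swipe("up", stack, "Ophelia Hamlet"))    # -> "Ophelia"
```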
FIG. 38 is a flow diagram showing a process 3800 for refining search results based at least in part on detecting a predefined gesture on a touch-sensitive display. At 3802, the device 2100 detects an initial selection of a term in a content item, such as the eBook 2202. In response, the device 2100 may form a query and may display search results associated with the query at 3804.
At 3806, the device 2100 may detect a predefined gesture on the touch-sensitive display 902 of the device 2100 during the display of the search results. This predefined gesture may comprise a user swipe on the touch-sensitive display in a predefined direction, such as upwards to broaden the search or downwards to narrow the search. Additionally or alternatively, the predefined user gesture may comprise a greater or lesser amount of force on the touch-sensitive display, or any other predefined user gesture.
At 3808, the search refinement module 2108 of the device 2100 may refine the search results on the display based at least in part on detecting the predefined gesture. For instance, the module 2108 may expand the search results or narrow the search results, depending upon the detected gesture made by the user.
CONCLUSION
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.

Claims (24)

What is claimed is:
1. One or more non-transitory computer-readable media storing computer-executable instructions that, when executed, cause one or more processors to perform acts comprising:
detecting an initial selection of a term in a content item being output by a touch-sensitive display and measuring a force of the initial selection on the touch-sensitive display;
forming a query comprising the selected term and causing display of search results associated with the query on the touch-sensitive display at least partly in response to the detecting of the initial selection;
detecting a force on the touch-sensitive display that is greater or lesser than the measured force of the initial selection; and
refining the search results displayed on the touch-sensitive display at least partly in response to the detecting of the greater or lesser force, the refining comprising adding, removing, or modifying a search criterion of the query.
2. One or more computer-readable media as recited in claim 1, wherein the query comprises a first query, and the refining of the search results comprises:
forming a second query that is narrower than the first query; and
causing display of search results associated with the second query on the touch-sensitive display.
3. One or more computer-readable media as recited in claim 2, wherein the second query comprises the selected term and at least one other term associated with the content item being output by the touch-sensitive display.
4. One or more computer-readable media as recited in claim 3, wherein the at least one other term associated with the content item comprises at least a portion of a title of the content item, an author of the content item, a topic of the content item, or a categorization of the content item.
5. One or more computer-readable media as recited in claim 1, wherein the refining of the search results comprises altering the search results displayed on the touch-sensitive display based at least in part on previous activities by other users that have navigated to at least a portion of the search results.
6. One or more computer-readable media as recited in claim 5, wherein the altering of the displayed search results comprises causing display of fewer search results than previously displayed on the touch-sensitive display based at least in part on the previous activities of the other users.
7. One or more computer-readable media as recited in claim 5, wherein the altering of the displayed search results comprises causing display of more search results than previously displayed on the touch-sensitive display based at least in part on the previous activities of the other users.
8. One or more computer-readable media as recited in claim 1, wherein the detecting of the greater or lesser force comprises detecting the greater force, and wherein the refining of the search results comprises narrowing the search results displayed on the touch-sensitive display.
9. One or more computer-readable media as recited in claim 1, wherein the detecting of the greater or lesser force comprises detecting the lesser force, and wherein the refining of the search results comprises expanding the search results displayed on the touch-sensitive display.
10. One or more computer-readable media as recited in claim 1, further storing computer-executable instructions that, when executed, cause the one or more processors to perform an act comprising submitting the query to a search engine and receiving the search results after the forming of the query and prior to the displaying of the search results.
11. One or more computer-readable media as recited in claim 1, wherein the touch-sensitive display forms a portion of an electronic device, and further storing computer-executable instructions that, when executed, cause the one or more processors to perform an act comprising performing a local search of the electronic device with use of the query, and wherein at least a portion of the search results displayed on the touch-sensitive display comprise search results associated with items stored on the electronic device.
12. An electronic device, comprising:
one or more processors;
memory coupled to the one or more processors;
a touch-sensitive display communicatively coupled to the one or more processors to render a content item and to detect user inputs of varying force;
a content item stored in the memory; and
a search refinement module, stored in the memory and executable on the one or more processors to:
receive an indication that the touch-sensitive display has detected a user input selecting a term within the content item with an initial amount of force;
cause display of initial search results associated with the selected term on the touch-sensitive display at least partly in response to the receiving of the indication associated with the user input;
receive an indication that the touch-sensitive display has detected another user input having a greater or lesser amount of force than the initial amount of force; and
refine the initial search results displayed on the touch-sensitive display to generate refined search results, the refining of the initial search results at least partly in response to the receiving of the indication associated with the another user input, the refined search results comprising at least a subset of the initial search results.
13. An electronic device as recited in claim 12, wherein the receiving of the indication associated with the another user input comprises receiving an indication that the touch-sensitive display has detected the greater amount of force, and wherein the refining of the initial search results comprises narrowing the initial search results displayed on the touch-sensitive display.
14. An electronic device as recited in claim 13, wherein the narrowing of the initial search results comprises causing display of the refined search results which include search results associated with the selected term and at least one other term associated with the content item.
15. An electronic device as recited in claim 12, wherein the receiving of the indication associated with the another user input comprises receiving an indication that the touch-sensitive display has detected the lesser amount of force, and wherein the refining of the initial search results comprises expanding the initial search results displayed on the touch-sensitive display.
16. One or more non-transitory computer-readable media storing computer-executable instructions that, when executed, cause one or more processors to perform acts comprising:
detecting an initial selection of a term in a content item being output by a touch-sensitive display;
forming a query comprising the selected term and causing display of search results associated with the query on the touch-sensitive display at least partly in response to the detecting of the initial selection;
detecting a predefined user gesture on the touch-sensitive display during the display of the search results by the touch-sensitive display; and
refining the search results displayed on the touch-sensitive display at least partly in response to the detecting of the predefined gesture, the refined search results overlapping at least in part with the search results.
17. One or more computer-readable media as recited in claim 16, wherein the predefined gesture comprises a user swipe on the touch-sensitive display in a predefined direction, and wherein the refining of the search results comprises expanding or narrowing the search results displayed on the touch-sensitive display.
18. One or more computer-readable media as recited in claim 16, wherein the predefined gesture comprises a downwards user swipe on the touch-sensitive display, and wherein the refining of the search results comprises narrowing the search results displayed on the touch-sensitive display.
19. One or more computer-readable media as recited in claim 16, wherein the predefined gesture comprises an upwards user swipe on the touch-sensitive display, and wherein the refining of the search results comprises expanding the search results displayed on the touch-sensitive display.
20. One or more computer-readable media as recited in claim 16, wherein:
the detecting of the initial selection comprises detecting a user input on the touch-sensitive display, the user input having an initial amount of force;
the predefined gesture comprises an amount of force that is greater than the initial amount of force or is greater than a threshold amount of force; and
the refining of the search results comprises narrowing the search results displayed on the touch-sensitive display.
21. A method comprising:
under control of one or more computing systems configured with specific executable instructions,
detecting an initial selection of a portion of a content item being output by a touch-sensitive display and measuring a force of the initial selection on the touch-sensitive display;
causing display of search results associated with the selected portion of the content item on the touch-sensitive display at least partly in response to the detecting of the initial selection;
detecting a force on the touch-sensitive display that is greater or lesser than the measured force of the initial selection; and
refining the search results displayed on the touch-sensitive display at least partly in response to the detecting of the greater or lesser force, the refining the search results including generating a refined query based at least in part on a query that returned the search results.
22. A method as recited in claim 21, wherein the selected portion of the content item comprises a word, a phrase, or a portion of an image.
23. A method as recited in claim 21, wherein the force on the touch-sensitive display comprises the greater force, and the refining of the search results comprises narrowing the search results displayed on the touch-sensitive display.
24. A method as recited in claim 21, wherein the force on the touch-sensitive display comprises the lesser force, and the refining of the search results comprises broadening the search results displayed on the touch-sensitive display.
US12/823,085 2010-06-24 2010-06-24 Refining search results based on touch gestures Active 2032-03-04 US8542205B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/823,085 US8542205B1 (en) 2010-06-24 2010-06-24 Refining search results based on touch gestures


Publications (1)

Publication Number Publication Date
US8542205B1 true US8542205B1 (en) 2013-09-24

Family

ID=49181483


Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110158605A1 (en) * 2009-12-18 2011-06-30 Bliss John Stuart Method and system for associating an object to a moment in time in a digital video
US20110176788A1 (en) * 2009-12-18 2011-07-21 Bliss John Stuart Method and System for Associating an Object to a Moment in Time in a Digital Video
US20120075208A1 (en) * 2010-09-27 2012-03-29 Nintendo Co., Ltd. Information processing program, information processing apparatus and method thereof
US20130073932A1 (en) * 2011-08-19 2013-03-21 Apple Inc. Interactive Content for Digital Books
US20140047332A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation E-reader systems
US20140046922A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation Search user interface using outward physical expressions
US20140109009A1 (en) * 2012-02-28 2014-04-17 Tencent Technology (Shenzhen) Company Limited Method and apparatus for text searching on a touch terminal
US8724963B2 (en) 2009-12-18 2014-05-13 Captimo, Inc. Method and system for gesture based searching
US20140195961A1 (en) * 2013-01-07 2014-07-10 Apple Inc. Dynamic Index
US20140297632A1 (en) * 2010-12-22 2014-10-02 Avinash Sridhar Realtime search grid updates
US20150052472A1 (en) * 2012-06-25 2015-02-19 Barnesandnoble.Com Llc Creation and Exposure of Embedded Secondary Content Data Relevant to a Primary Content Page of An Electronic Book
US20150067605A1 (en) * 2012-05-09 2015-03-05 Apple Inc. Device, Method, and Graphical User Interface for Scrolling Nested Regions
US20150081653A1 (en) * 2013-09-13 2015-03-19 Yahoo! Inc. Type free search assist
US20150120777A1 (en) * 2013-10-24 2015-04-30 Olivia Ramos System and Method for Mining Data Using Haptic Feedback
US20150149967A1 (en) * 2012-12-29 2015-05-28 Apple Inc. Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies
US20150186525A1 (en) * 2013-12-26 2015-07-02 Thomson Licensing Method and apparatus for gesture-based searching
US9378288B1 (en) * 2011-08-10 2016-06-28 Google Inc. Refining search results
US20160239161A1 (en) * 2015-02-12 2016-08-18 Kobo Incorporated Method and system for term-occurrence-based navigation of apportioned e-book content
US20160259496A1 (en) * 2015-03-08 2016-09-08 Apple Inc. Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
CN105955591A (en) * 2015-03-08 2016-09-21 苹果公司 Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
DK201500575A1 (en) * 2015-03-08 2016-09-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
DK201500581A1 (en) * 2015-03-08 2017-01-16 Apple Inc Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US20170091333A1 (en) * 2015-09-28 2017-03-30 Yahoo!, Inc. Multi-touch gesture search
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9665206B1 (en) 2013-09-18 2017-05-30 Apple Inc. Dynamic user interface adaptable to multiple input tools
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US20170177179A1 (en) * 2015-12-16 2017-06-22 International Business Machines Corporation E-reader summarization and customized dictionary
US20170185654A1 (en) * 2010-09-30 2017-06-29 Huawei Device Co., Ltd. Method and server for pushing information proactively
US20170228451A1 (en) * 2013-12-20 2017-08-10 Microsoft Technology Licensing, Llc Constructing queries for execution over multi-dimensional data structures
US20170249296A1 (en) * 2016-02-29 2017-08-31 International Business Machines Corporation Interest highlight and recommendation based on interaction in long text reading
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US20170262513A1 (en) * 2013-05-29 2017-09-14 Ebay Inc. Methods and systems to refine search results
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
EP3134830A4 (en) * 2014-04-25 2017-11-29 Amazon Technologies Inc. Selective display of comprehension guides
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10108330B2 (en) * 2011-09-12 2018-10-23 Microsoft Technology Licensing, Llc Automatic highlighting of formula parameters for limited display devices
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US20190155955A1 (en) * 2017-11-20 2019-05-23 Rovi Guides, Inc. Systems and methods for filtering supplemental content for an electronic book
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10417933B1 (en) 2014-04-25 2019-09-17 Amazon Technologies, Inc. Selective display of comprehension guides
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US20200050326A1 (en) * 2016-10-27 2020-02-13 Samsung Electronics Co., Ltd. Electronic device and method for providing information in response to pressure input of touch
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10909191B2 (en) 2017-11-20 2021-02-02 Rovi Guides, Inc. Systems and methods for displaying supplemental content for an electronic book
US20220147223A1 (en) * 2020-11-07 2022-05-12 Saad Al Mohizea System and method for correcting typing errors
US11635883B2 (en) * 2020-02-18 2023-04-25 Micah Development LLC Indication of content linked to text
US20230283851A1 (en) * 2014-02-26 2023-09-07 Rovi Guides, Inc. Methods and systems for supplementing media assets during fast-access playback operations

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5416896A (en) 1992-04-30 1995-05-16 Ricoh Company, Ltd. Command definition dictionary handling and context declaration in a document publishing page description language (PDL)
US5483629A (en) 1992-04-30 1996-01-09 Ricoh Company, Ltd. Method and system to handle dictionaries in a document processing language
US5745776A (en) 1995-04-19 1998-04-28 Sheppard, Ii; Charles Bradford Enhanced electronic dictionary
US6331867B1 (en) * 1998-03-20 2001-12-18 Nuvomedia, Inc. Electronic book with automated look-up of terms of within reference titles
US20030160830A1 (en) * 2002-02-22 2003-08-28 Degross Lee M. Pop-up edictionary
US20040248653A1 (en) 2003-06-05 2004-12-09 Mark Barros System and method for providing user interactive experiences according to user's physical location
US20060041538A1 (en) 2004-02-15 2006-02-23 King Martin T Establishing an interactive environment for rendered documents
US20060230340A1 (en) 2005-04-06 2006-10-12 Marcella Betz Parsons System and method for publishing, distributing, and reading electronic interactive books
US20060282778A1 (en) 2001-09-13 2006-12-14 International Business Machines Corporation Handheld electronic book reader with annotation and usage tracking capabilities
US20070011160A1 (en) 2005-07-07 2007-01-11 Denis Ferland Literacy automation software
US20070136231A1 (en) 2005-12-09 2007-06-14 Fatlens Inc. Method for providing access to information in a network
US20070265834A1 (en) 2001-09-06 2007-11-15 Einat Melnick In-context analysis
US20080222552A1 (en) 2007-02-21 2008-09-11 University of Central Florida Reseach Foundation, Inc. Interactive Electronic Book Operating Systems And Methods
US20090153495A1 (en) * 2007-12-18 2009-06-18 Wistron Corp. Input method for use in an electronic device having a touch-sensitive screen
US20100005087A1 (en) 2008-07-01 2010-01-07 Stephen Basco Facilitating collaborative searching using semantic contexts associated with information
US20100128994A1 (en) 2008-11-24 2010-05-27 Jan Scott Zwolinski Personal dictionary and translator device
US20100153440A1 (en) 2001-08-13 2010-06-17 Xerox Corporation System with user directed enrichment
US7849393B1 (en) 1992-12-09 2010-12-07 Discovery Communications, Inc. Electronic book connection to world watch live
US20110018695A1 (en) * 2009-07-24 2011-01-27 Research In Motion Limited Method and apparatus for a touch-sensitive display
US20110161073A1 (en) 2009-12-29 2011-06-30 Dynavox Systems, Llc System and method of disambiguating and selecting dictionary definitions for one or more target words
US20110167350A1 (en) 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
US20110261030A1 (en) 2010-04-26 2011-10-27 Bullock Roddy Mckee Enhanced Ebook and Enhanced Ebook Reader
US8250071B1 (en) 2010-06-30 2012-08-21 Amazon Technologies, Inc. Disambiguation of term meaning
US20120211438A1 (en) 1998-05-29 2012-08-23 Glover John N Filtering medium and method for contacting solids containing feeds for chemical reactors
US20120221972A1 (en) 2011-02-24 2012-08-30 Google Inc. Electronic Book Contextual Menu Systems and Methods
US20120240085A1 (en) 2009-12-01 2012-09-20 Creative Technology Ltd Electronic book reader

Patent Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5483629A (en) 1992-04-30 1996-01-09 Ricoh Company, Ltd. Method and system to handle dictionaries in a document processing language
US5416896A (en) 1992-04-30 1995-05-16 Ricoh Company, Ltd. Command definition dictionary handling and context declaration in a document publishing page description language (PDL)
US7849393B1 (en) 1992-12-09 2010-12-07 Discovery Communications, Inc. Electronic book connection to world watch live
US5745776A (en) 1995-04-19 1998-04-28 Sheppard, Ii; Charles Bradford Enhanced electronic dictionary
US6331867B1 (en) * 1998-03-20 2001-12-18 Nuvomedia, Inc. Electronic book with automated look-up of terms of within reference titles
US20120211438A1 (en) 1998-05-29 2012-08-23 Glover John N Filtering medium and method for contacting solids containing feeds for chemical reactors
US20100153440A1 (en) 2001-08-13 2010-06-17 Xerox Corporation System with user directed enrichment
US20070265834A1 (en) 2001-09-06 2007-11-15 Einat Melnick In-context analysis
US20060282778A1 (en) 2001-09-13 2006-12-14 International Business Machines Corporation Handheld electronic book reader with annotation and usage tracking capabilities
US20030160830A1 (en) * 2002-02-22 2003-08-28 Degross Lee M. Pop-up edictionary
US20040248653A1 (en) 2003-06-05 2004-12-09 Mark Barros System and method for providing user interactive experiences according to user's physical location
US20060041538A1 (en) 2004-02-15 2006-02-23 King Martin T Establishing an interactive environment for rendered documents
US20060230340A1 (en) 2005-04-06 2006-10-12 Marcella Betz Parsons System and method for publishing, distributing, and reading electronic interactive books
US20070011160A1 (en) 2005-07-07 2007-01-11 Denis Ferland Literacy automation software
US20070136231A1 (en) 2005-12-09 2007-06-14 Fatlens Inc. Method for providing access to information in a network
US20080222552A1 (en) 2007-02-21 2008-09-11 University of Central Florida Reseach Foundation, Inc. Interactive Electronic Book Operating Systems And Methods
US20090153495A1 (en) * 2007-12-18 2009-06-18 Wistron Corp. Input method for use in an electronic device having a touch-sensitive screen
US20100005087A1 (en) 2008-07-01 2010-01-07 Stephen Basco Facilitating collaborative searching using semantic contexts associated with information
US20100128994A1 (en) 2008-11-24 2010-05-27 Jan Scott Zwolinski Personal dictionary and translator device
US20110018695A1 (en) * 2009-07-24 2011-01-27 Research In Motion Limited Method and apparatus for a touch-sensitive display
US20120240085A1 (en) 2009-12-01 2012-09-20 Creative Technology Ltd Electronic book reader
US20110161073A1 (en) 2009-12-29 2011-06-30 Dynavox Systems, Llc System and method of disambiguating and selecting dictionary definitions for one or more target words
US20110167350A1 (en) 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
US20110261030A1 (en) 2010-04-26 2011-10-27 Bullock Roddy Mckee Enhanced Ebook and Enhanced Ebook Reader
US8250071B1 (en) 2010-06-30 2012-08-21 Amazon Technologies, Inc. Disambiguation of term meaning
US20120221972A1 (en) 2011-02-24 2012-08-30 Google Inc. Electronic Book Contextual Menu Systems and Methods

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
"Babylon 8 Translation Softwar and Dictionary Tool" retrieved on May 7, 2010 at <<http://www.babylon.com/>>, entire website, Babylon, 2 pages.
"Babylon 8 Translation Softwar and Dictionary Tool" retrieved on May 7, 2010 at >, entire website, Babylon, 2 pages.
Haupt, "Fun and Functional. Interesting new consumer-technology products", Horizon Air Magazine, Mar. 2010, 8 pages.
Non-Final Office Action for U.S. Appl. No. 13/042,185, mailed on Feb. 22, 2013, Sailesh Rachabathuni et al., "Dynamically Selecting Example Passages", 17 pages.
Office action for U.S. Appl. No. 12/749,073, mailed on Apr. 9, 2013, Rachabathuni et al., "Context-Sensitive Reference Works", 21 pages.
Office action for U.S. Appl. No. 12/749,073, mailed on Jan. 20, 2012, Rachabathuni et al., "Context-Sensitive Reference Works", 25 pages.
Office action for U.S. Appl. No. 12/749,073, mailed on Jul. 5, 2012, Rachabathuni et al., "Context-Sensitive Reference Works", 22 pages.
Office action for U.S. Appl. No. 12/823,077, mailed on Oct. 9, 2012, Freed, "Surfacing Reference Work Entries on Touch-Sensitive Displays", 12 pages.

Cited By (168)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8724963B2 (en) 2009-12-18 2014-05-13 Captimo, Inc. Method and system for gesture based searching
US20110176788A1 (en) * 2009-12-18 2011-07-21 Bliss John Stuart Method and System for Associating an Object to a Moment in Time in a Digital Video
US9449107B2 (en) 2009-12-18 2016-09-20 Captimo, Inc. Method and system for gesture based searching
US20110158605A1 (en) * 2009-12-18 2011-06-30 Bliss John Stuart Method and system for associating an object to a moment in time in a digital video
US20120075208A1 (en) * 2010-09-27 2012-03-29 Nintendo Co., Ltd. Information processing program, information processing apparatus and method thereof
US20170185654A1 (en) * 2010-09-30 2017-06-29 Huawei Device Co., Ltd. Method and server for pushing information proactively
US20140297632A1 (en) * 2010-12-22 2014-10-02 Avinash Sridhar Realtime search grid updates
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10540039B1 (en) 2011-08-05 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface
US10656752B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10649571B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10664097B1 (en) 2011-08-05 2020-05-26 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9378288B1 (en) * 2011-08-10 2016-06-28 Google Inc. Refining search results
US10296177B2 (en) 2011-08-19 2019-05-21 Apple Inc. Interactive content for digital books
US9766782B2 (en) * 2011-08-19 2017-09-19 Apple Inc. Interactive content for digital books
US20130073932A1 (en) * 2011-08-19 2013-03-21 Apple Inc. Interactive Content for Digital Books
US10108330B2 (en) * 2011-09-12 2018-10-23 Microsoft Technology Licensing, Llc Automatic highlighting of formula parameters for limited display devices
US20140109009A1 (en) * 2012-02-28 2014-04-17 Tencent Technology (Shenzhen) Company Limited Method and apparatus for text searching on a touch terminal
US10592041B2 (en) * 2012-05-09 2020-03-17 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US11947724B2 (en) 2012-05-09 2024-04-02 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10996788B2 (en) 2012-05-09 2021-05-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10969945B2 (en) 2012-05-09 2021-04-06 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10942570B2 (en) 2012-05-09 2021-03-09 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10908808B2 (en) 2012-05-09 2021-02-02 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10884591B2 (en) 2012-05-09 2021-01-05 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects
US10782871B2 (en) 2012-05-09 2020-09-22 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10775994B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10775999B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US11023116B2 (en) 2012-05-09 2021-06-01 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US11068153B2 (en) 2012-05-09 2021-07-20 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US11221675B2 (en) 2012-05-09 2022-01-11 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10126930B2 (en) * 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10114546B2 (en) 2012-05-09 2018-10-30 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US11314407B2 (en) 2012-05-09 2022-04-26 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US20190018562A1 (en) * 2012-05-09 2019-01-17 Apple Inc. Device, Method, and Graphical User Interface for Scrolling Nested Regions
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US11354033B2 (en) 2012-05-09 2022-06-07 Apple Inc. Device, method, and graphical user interface for managing icons in a user interface region
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US11010027B2 (en) 2012-05-09 2021-05-18 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US20150067605A1 (en) * 2012-05-09 2015-03-05 Apple Inc. Device, Method, and Graphical User Interface for Scrolling Nested Regions
US20190121493A1 (en) * 2012-05-09 2019-04-25 Apple Inc. Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US9971499B2 (en) 2012-05-09 2018-05-15 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US10042519B2 (en) * 2012-06-25 2018-08-07 Nook Digital, Llc Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book
US20150052472A1 (en) * 2012-06-25 2015-02-19 Barnesandnoble.Com Llc Creation and Exposure of Embedded Secondary Content Data Relevant to a Primary Content Page of An Electronic Book
US20140047332A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation E-reader systems
US20140046922A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation Search user interface using outward physical expressions
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9965074B2 (en) 2012-12-29 2018-05-08 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US20150149967A1 (en) * 2012-12-29 2015-05-28 Apple Inc. Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10101887B2 (en) * 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9959025B2 (en) * 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9996233B2 (en) * 2012-12-29 2018-06-12 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10915243B2 (en) 2012-12-29 2021-02-09 Apple Inc. Device, method, and graphical user interface for adjusting content selection
US20160004429A1 (en) * 2012-12-29 2016-01-07 Apple Inc. Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US20160210025A1 (en) * 2012-12-29 2016-07-21 Apple Inc. Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies
US20140195961A1 (en) * 2013-01-07 2014-07-10 Apple Inc. Dynamic Index
US20170262513A1 (en) * 2013-05-29 2017-09-14 Ebay Inc. Methods and systems to refine search results
US10599665B2 (en) * 2013-05-29 2020-03-24 Ebay Inc. Methods and systems to refine search results
US20150081653A1 (en) * 2013-09-13 2015-03-19 Yahoo! Inc. Type free search assist
US9665206B1 (en) 2013-09-18 2017-05-30 Apple Inc. Dynamic user interface adaptable to multiple input tools
US20150120777A1 (en) * 2013-10-24 2015-04-30 Olivia Ramos System and Method for Mining Data Using Haptic Feedback
US20170228451A1 (en) * 2013-12-20 2017-08-10 Microsoft Technology Licensing, Llc Constructing queries for execution over multi-dimensional data structures
US10565232B2 (en) * 2013-12-20 2020-02-18 Microsoft Technology Licensing, Llc Constructing queries for execution over multi-dimensional data structures
US9672287B2 (en) * 2013-12-26 2017-06-06 Thomson Licensing Method and apparatus for gesture-based searching
US20150186525A1 (en) * 2013-12-26 2015-07-02 Thomson Licensing Method and apparatus for gesture-based searching
US10838538B2 (en) * 2013-12-26 2020-11-17 Interdigital Madison Patent Holdings, Sas Method and apparatus for gesture-based searching
US11877032B2 (en) * 2014-02-26 2024-01-16 Rovi Guides, Inc. Methods and systems for supplementing media assets during fast-access playback operations
US20230283851A1 (en) * 2014-02-26 2023-09-07 Rovi Guides, Inc. Methods and systems for supplementing media assets during fast-access playback operations
EP3134830A4 (en) * 2014-04-25 2017-11-29 Amazon Technologies Inc. Selective display of comprehension guides
US10417933B1 (en) 2014-04-25 2019-09-17 Amazon Technologies, Inc. Selective display of comprehension guides
US20160239161A1 (en) * 2015-02-12 2016-08-18 Kobo Incorporated Method and system for term-occurrence-based navigation of apportioned e-book content
US20160259496A1 (en) * 2015-03-08 2016-09-08 Apple Inc. Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
US10860177B2 (en) 2015-03-08 2020-12-08 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9990107B2 (en) * 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9645709B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11112957B2 (en) 2015-03-08 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10613634B2 (en) 2015-03-08 2020-04-07 Apple Inc. Devices and methods for controlling media presentation
CN105955591A (en) * 2015-03-08 2016-09-21 苹果公司 Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
DK201500575A1 (en) * 2015-03-08 2016-09-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
DK201500581A1 (en) * 2015-03-08 2017-01-16 Apple Inc Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) * 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US11550471B2 (en) 2015-03-19 2023-01-10 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US10599331B2 (en) 2015-03-19 2020-03-24 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US11054990B2 (en) 2015-03-19 2021-07-06 Apple Inc. Touch input cursor manipulation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11835985B2 (en) 2015-06-07 2023-12-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10705718B2 (en) 2015-06-07 2020-07-07 Apple Inc. Devices and methods for navigating between user interfaces
US11681429B2 (en) 2015-06-07 2023-06-20 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11240424B2 (en) 2015-06-07 2022-02-01 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en) 2015-06-07 2022-01-25 Apple Inc. Devices and methods for content preview based on touch input intensity
US9706127B2 (en) 2015-06-07 2017-07-11 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10841484B2 (en) 2015-06-07 2020-11-17 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US9916080B2 (en) 2015-06-07 2018-03-13 Apple Inc. Devices and methods for navigating between user interfaces
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10698598B2 (en) 2015-08-10 2020-06-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10963158B2 (en) 2015-08-10 2021-03-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740785B2 (en) 2015-08-10 2023-08-29 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10884608B2 (en) 2015-08-10 2021-01-05 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10754542B2 (en) 2015-08-10 2020-08-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US11182017B2 (en) 2015-08-10 2021-11-23 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US11327648B2 (en) 2015-08-10 2022-05-10 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10083238B2 (en) * 2015-09-28 2018-09-25 Oath Inc. Multi-touch gesture search
US20170091333A1 (en) * 2015-09-28 2017-03-30 Yahoo!, Inc. Multi-touch gesture search
US20170177178A1 (en) * 2015-12-16 2017-06-22 International Business Machines Corporation E-reader summarization and customized dictionary
US20170177179A1 (en) * 2015-12-16 2017-06-22 International Business Machines Corporation E-reader summarization and customized dictionary
US10691893B2 (en) * 2016-02-29 2020-06-23 International Business Machines Corporation Interest highlight and recommendation based on interaction in long text reading
US20170249296A1 (en) * 2016-02-29 2017-08-31 International Business Machines Corporation Interest highlight and recommendation based on interaction in long text reading
US20200050326A1 (en) * 2016-10-27 2020-02-13 Samsung Electronics Co., Ltd. Electronic device and method for providing information in response to pressure input of touch
US10871883B2 (en) * 2016-10-27 2020-12-22 Samsung Electronics Co., Ltd Electronic device and method for providing information in response to pressure input of touch
US20190155955A1 (en) * 2017-11-20 2019-05-23 Rovi Guides, Inc. Systems and methods for filtering supplemental content for an electronic book
US10909191B2 (en) 2017-11-20 2021-02-02 Rovi Guides, Inc. Systems and methods for displaying supplemental content for an electronic book
US10909193B2 (en) * 2017-11-20 2021-02-02 Rovi Guides, Inc. Systems and methods for filtering supplemental content for an electronic book
US11635883B2 (en) * 2020-02-18 2023-04-25 Micah Development LLC Indication of content linked to text
US20220147223A1 (en) * 2020-11-07 2022-05-12 Saad Al Mohizea System and method for correcting typing errors

Similar Documents

Publication Title
US8542205B1 (en) Refining search results based on touch gestures
US8773389B1 (en) Providing reference work entries on touch-sensitive displays
US8698765B1 (en) Associating concepts within content items
JP7345442B2 (en) Apparatus, method, and graphical user interface for operating a user interface based on fingerprint sensor input
US20180239512A1 (en) Context based gesture delineation for user interaction in eyes-free mode
US20220113861A1 (en) Device, Method, and Graphical User Interface for Presenting Representations of Media Containers
US11327649B1 (en) Facilitating selection of keys related to a selected key
US9405391B1 (en) Rendering content around obscuring objects
TWI531916B (en) Computing device, computer-storage memories, and method of registration for system level search user interface
US9342233B1 (en) Dynamic dictionary based on context
KR20190003982A (en) Intelligent automation assistant for media navigation
US20120198380A1 (en) Contextual user interface
KR20120037401A (en) Integrating digital book and zoom interface displays
US20120151413A1 (en) Method and apparatus for providing a mechanism for presentation of relevant content
US20140210729A1 (en) Gesture based user interface for use in an eyes-free mode
US20130339853A1 (en) Systems and Method to Facilitate Media Search Based on Acoustic Attributes
EP2897058B1 (en) User interface device, search method, and program
US9607105B1 (en) Content searching techniques
US20150178323A1 (en) User interface device, search method, and program
US20140215339A1 (en) Content navigation and selection in an eyes-free mode
US20150234926A1 (en) User interface device, search method, and program
WO2012010953A2 (en) Apparatus for e-learning and method therefor
US9679047B1 (en) Context-sensitive reference works
US20140068424A1 (en) Gesture-based navigation using visual page indicators
US20150268805A1 (en) User interface to open a different ebook responsive to a user gesture

Legal Events

Code Title Description

AS Assignment
Owner name: AMAZON TECHNOLOGIES, INC., NEVADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KELLER, KEVIN E.;REEL/FRAME:025021/0968
Effective date: 20100627

STCF Information on status: patent grant
Free format text: PATENTED CASE

FPAY Fee payment
Year of fee payment: 4

MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8