US20090058820A1 - Flick-based in situ search from ink, text, or an empty selection region
- Publication number
- US20090058820A1 (U.S. application Ser. No. 11/849,469)
- Authority
- US
- United States
- Prior art keywords
- search
- data
- flick gesture
- situ
- component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/14—Image acquisition
- G06V30/1444—Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields
- G06V30/1456—Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields based on user interactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
Definitions
- search engines can assist in locating information on the public Web, intranets, personal computers, and the like.
- Typical search engines can retrieve a list of references (e.g., search results) matching inputted criteria provided by the user. For instance, the user can perform a query by providing a word or phrase to the search engine and in response the search engine can return a list of search results matching the entered word, phrase, or a portion thereof.
- search engines support utilization of Boolean terms such as, for instance, AND, OR and NOT as well as provisions related to a distance between keywords.
- Active note taking can be considered to be, for example, the combination of pen-and-ink note taking with searching, linking, collecting, and sense making activities. This is in contrast to simple note taking, which is characterized by moment-to-moment transcription.
- Active note taking, for example, is typically performed by knowledge workers engaged in challenging creative work such as scientific research, product design, or planning complex activities, and the like. The knowledge workers often create informal pre-production work artifacts on paper, in notebooks, or on whiteboards, sketching preliminary plans and manipulating their notes to find solutions to difficult problems.
- the subject innovation relates to systems and/or methods that facilitate querying data based on a flick gesture.
- An in situ search component can receive a flick gesture via an interface, wherein the in situ search component can execute an in situ search with the flick gesture as a trigger.
- the in situ search component can implement at least one of the following upon the detection of a flick gesture: a search on a portion of selected data or a generation of a search query box.
- a portion of data (e.g., handwriting, text, characters, words, phrases, images, etc.) can be selected and queried upon the implementation of a flick gesture.
- the flick gesture can provide a search query box to receive a query when there is an empty selection of data.
- the in situ search component can execute a search based on characteristics of the flick gesture.
- the flick gesture can be evaluated by an evaluation component to identify flick gesture speed, flick gesture direction, and the like.
- the in situ search component can implement various types of searches.
- the in situ search component can utilize a graphic component that can generate an embeddable persistent graphical object with the flick gesture as a trigger.
- the embeddable persistent graphical object can be populated with search results for a query, a search query box for query input, previous searches, historic data, etc.
- methods are provided that facilitate executing a command based on a direction of a received flick gesture.
- FIG. 1 illustrates a block diagram of an exemplary system that facilitates querying data based on a flick gesture.
- FIG. 2 illustrates a block diagram of an exemplary system that facilitates executing a command based on a direction of a received flick gesture.
- FIG. 3 illustrates a block diagram of an exemplary system that facilitates selecting a portion of data and initiating an in situ search based upon a flick gesture.
- FIG. 4 illustrates a block diagram of an exemplary system that facilitates leveraging in situ search triggered by a flick gesture with an application.
- FIG. 5 illustrates a block diagram of an exemplary system that facilitates implementing an in situ search from a user based on a gesture received by an input device.
- FIG. 6 illustrates an exemplary methodology that facilitates initiating an in situ search of data upon detection of a flick gesture.
- FIG. 7 illustrates an exemplary methodology for executing an in situ search on a portion of data or a received user-specified query.
- FIG. 8 illustrates an exemplary methodology that facilitates initiating a graphic overlay for in situ search based on a flick gesture.
- FIG. 9 illustrates an exemplary networking environment, wherein the novel aspects of the claimed subject matter can be employed.
- FIG. 10 illustrates an exemplary operating environment that can be employed in accordance with the claimed subject matter.
- a component can be a process running on a processor, a processor, an object, an executable, a program, a function, a library, a subroutine, and/or a computer or a combination of software and hardware.
- an application running on a server and the server can be a component.
- One or more components can reside within a process and a component can be localized on one computer and/or distributed between two or more computers.
- the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter.
- article of manufacture as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
- computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick, key drive . . . ).
- a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).
- FIG. 1 illustrates a system 100 that facilitates querying data based on a flick gesture.
- the system 100 can include an in situ search component 102 that can receive a flick gesture via an interface component 106 (discussed below), wherein the flick gesture can trigger the execution of a search to return a search result 104 .
- the in situ search component 102 can implement an in situ search or query on any suitable portion of data upon the detection of a flick gesture.
- the flick gesture can be a quick, linear movement associated with a scrolling action and/or command.
- the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness.
- Such linear movements can correspond to an input device (not shown) such as, but not limited to, a tablet, a touch screen, a mouse, a touch pad, a trackball, and/or any other suitable input device capable of inputting a flick gesture.
- the in situ search component 102 can implement particular searches or queries based upon a direction of the flick gesture (discussed in more detail below).
- the in situ search component 102 can provide a dynamic search on a portion of data in a seamless manner without disrupting a user's attention to his or her primary task prior to the search.
- the in situ search component 102 can provide a plurality of search results 104 ; a single search result 104 is illustrated for the sake of brevity.
- a portion of text within a document can be selected by a user.
- the user can input a flick gesture in order to search with the selected text as a “seed” for the query. While the search is performed on the selected text, the user can continue to read the document without interruption of his or her task.
- the flick gesture activated search can be executed in the background so as to enable a user to seamlessly search documents, data, files, etc.
- a user can initiate a flick gesture and then a selection of data on which to perform the search.
- the flick gesture and the selection of data can be in any suitable order or sequence.
- the selection of data to query can be before and/or after the flick gesture.
- the in situ search component 102 can utilize a flick gesture as a trigger to prompt a user with a search query box to enter a user-defined search.
- a user can be examining a web page, perform a flick gesture, and be presented with a search query box in order to input user-defined data (e.g., handwriting, text, numbers, alphanumeric characters, etc.).
- the data inputted in the search query box can be a seed for a query in which to return at least one search result (e.g., search result 104 ).
- the system 100 can include any suitable and/or necessary interface component 106 (herein referred to as “interface 106 ”), which provides various adapters, connectors, channels, communication paths, etc. to integrate the in situ search component 102 into virtually any operating and/or database system(s) and/or with one another.
- interface 106 can provide various adapters, connectors, channels, communication paths, etc., that provide for interaction with the in situ search component 102 , flick gestures, input devices, the search result 104 , and any other device and/or component associated with the system 100 .
- FIG. 2 illustrates a system 200 that facilitates executing a command based on a direction of a received flick gesture.
- the system 200 can include the in situ search component 102 that can execute a search based upon the detection of a flick gesture via the interface 106 .
- the in situ search component 102 can provide at least one search result 104 .
- a flick gesture can be performed with an input device in order to activate an in situ search to produce the search result 104 , wherein such in situ search can be performed on at least one of 1) a portion of selected data; 2) a portion of data entered in a prompted search query box; or 3) any suitable combination thereof.
- a user can select a portion of text on an email, perform a flick gesture, and be prompted with an additional search query box in which user entered text/data and the selected portion of text can be seeds for a query.
- the system 200 may employ the context of other surrounding words, in addition to those explicitly selected by the user, to specialize, personalize, or contextualize the search results (e.g., by re-ranking web search results) to suit the particular user, or the particular document in which the user triggered the search.
- the system 200 can utilize an evaluation component 202 that can detect at least one of a flick gesture, a direction of a flick gesture, a speed of a linear input, a direction of a linear input, a location of a linear input, an area of a linear input, a data selection from an input device, and/or any other data related to an input from an input device.
- the evaluation component 202 can continuously monitor an input device to detect a flick gesture. Once a flick gesture is identified, the in situ search component 102 can initiate a search or query by 1) executing a search on a portion of selected data; or 2) prompting a search query box for the user to fill in.
- the evaluation component 202 can identify portions of data selected on which to search when triggered by a flick gesture.
- the evaluation component 202 can evaluate a location and/or area (e.g., handwriting, a portion of text, a portion of characters, a word, a phrase, a keyword, a sentence, a portion of an image, a graphic, a bitmap, or a portion of an icon, etc.) for which data is selected by the input device.
- the evaluation component 202 can determine if a gesture is within a speed and/or direction threshold in order to be considered a flick gesture.
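- The speed and straightness tests described above can be sketched concretely. The following is a minimal illustration, assuming pointer samples of the form (x, y, t); the threshold values are assumptions, since the text specifies only a "high speed" and a "high degree of straightness" rather than concrete numbers.

```python
import math

# Illustrative thresholds (assumptions, not values from the patent).
MIN_SPEED_PX_PER_S = 1500.0   # hypothetical minimum average speed
MIN_STRAIGHTNESS = 0.9        # net displacement / path length; 1.0 = straight

def is_flick(samples):
    """Return True if a pointer stroke qualifies as a flick gesture.

    `samples` is a sequence of (x, y, t) tuples from the input device,
    ordered by time (t in seconds). A flick is a quick, highly linear
    movement, so both the average speed and the straightness are tested.
    """
    if len(samples) < 2:
        return False
    (x0, y0, t0), (xn, yn, tn) = samples[0], samples[-1]
    net = math.hypot(xn - x0, yn - y0)
    path = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:])
    )
    duration = tn - t0
    if duration <= 0 or path == 0:
        return False
    speed = path / duration          # pixels per second
    straightness = net / path        # close to 1.0 for a linear stroke
    return speed >= MIN_SPEED_PX_PER_S and straightness >= MIN_STRAIGHTNESS
```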
- the evaluation component 202 may be a system component independent of individual applications; in this manner, the flick gesture serves as a system-wide gesture that supports all applications, rather than requiring individual applications to implement suitable search functionality.
- the evaluation component 202 can identify a direction associated with the flick gesture, wherein the direction of the flick gesture can correlate to a particular type of in situ search.
- any suitable search can be implemented by the in situ search component 102 such as, but not limited to, a local search, a remote search, a file type based search (e.g., web site search, email search, document search, audio file search, search within a particular directory, storage volume, or an operating system construct such as a ‘Start Menu’, etc.), application-based search, etc.
- a particular flick gesture direction can correspond to a specific type of in situ search implemented by the in situ search component 102 .
- an upward flick gesture can trigger a local search (e.g., local hard drive, desktop, folders, local networks, etc.), whereas a downward flick gesture can trigger a remote search (e.g., web pages, the Internet, remote networks, etc.).
- more than one flick gesture can be assigned to different types of search functionality such as desktop (e.g., personal information) search, web search, search within the current document (e.g., often exposed as a find feature), etc.
- a single flick gesture can present various types of search results that are grouped together, or interspersed according to a ranking function, e.g. a results list with desktop search results, web search results, image search results, etc.
- the evaluation component 202 can identify a flick gesture, wherein the in situ search component 102 can implement a correlating or mapped search command stored in a data store 204 (discussed in more detail below).
- a collection of mappings can be stored on the data store 204 , in which a flick gesture can correspond with a particular command.
- the mappings can include a flick gesture direction and corresponding search command (e.g., a diagonal upwards and right flick gesture translates to a local search, a diagonal downward left flick gesture translates to an email search, an upward flick gesture translates to a search within the opened file or file in which the gesture occurred, etc.).
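- As an illustration of such a mapping, the following sketch uses a hypothetical lookup table; the direction-to-command assignments shown are only the examples given above, and in practice the mappings are configurable and persisted in the data store 204.

```python
# Hypothetical mapping table mirroring the examples in the text; actual
# assignments are user- or application-configurable.
FLICK_SEARCH_COMMANDS = {
    "up-right":  "local_search",     # e.g., local hard drive, desktop, folders
    "down-left": "email_search",
    "up":        "document_search",  # search within the opened file
}

def command_for_flick(direction, mappings=FLICK_SEARCH_COMMANDS):
    """Resolve a recognized flick direction to its mapped search command."""
    return mappings.get(direction)
```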
- the system 200 can further include the data store 204 that can include any suitable data related to the in situ search component 102 , the search result 104 , an input device, etc.
- the data store 204 can include, but is not limited to, mappings (e.g., flick gesture direction and corresponding command, etc.), thresholds for flick gesture qualification, search types, user settings, in situ search configurations, user preferences, graphical overlays (e.g., the breadcrumb, discussed in more detail below; the search query box; etc.), directional definitions for a flick gesture, historic data related to a search, and/or any other suitable data related to the system 200 .
- nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory can include random access memory (RAM), which acts as external cache memory.
- RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
- FIG. 3 illustrates a system 300 that facilitates selecting a portion of data and initiating an in situ search based upon a flick gesture.
- the system 300 can include the in situ search component 102 that employs a search or query based upon the detection of a flick gesture received via the interface 106 .
- By utilizing a flick gesture to trigger and execute an in situ search, attention can be maintained and focused rather than diverted to perform a search.
- a query can be performed on a selection of data to provide the search result 104 . If a selection of data is absent, a user can be prompted to provide a selection and/or a search query box can be displayed to receive a query.
- the system 300 may present an initial list of desktop search results, sorted by age, if no query term is present so that all recent documents, emails, etc. are available as a default search that the user can then further filter down (e.g., by file type, author, date, etc.) or further restrict by adding search terms to the query. For example, an empty data selection coupled with a flick gesture can return recent search results from a previous search, which a user can further filter down by adding a search term or by other data (e.g., file type, date, author, etc.).
- the system 300 can include a conversion component 302 that can enhance a selected portion of data to perform a search, wherein the enhancement can be a digital conversion or handwriting conversion. For example, a portion of data can be selected and a flick gesture executed in order to search the portion of selected data. It is to be appreciated that the portion of data can be handwritten, typed, extracted from an image via optical character recognition techniques, and/or any suitable combination thereof.
- the conversion component 302 can translate handwritten data, typed data, and/or any other suitable data identified in order to perform an in situ search.
- the conversion component 302 can scan through inked handwritten script (e.g., graphemes, block, and/or cursive) and provide handwriting recognition to provide a digital form of the inked handwritten script. It is to be appreciated that the conversion component 302 can be used in conjunction with an artificial intelligence/machine learning component (not shown), or additionally and/or alternatively the conversion component 302 can itself comprise or include the intelligence/machine learning component. In general, there are several types of learning algorithms that can be utilized with respect to intelligence/machine learning.
- conditional maximum entropy (maxent) models have been widely employed for a variety of tasks, including language modeling, part-of-speech tagging, prepositional phrase attachment, parsing, word selection for machine translation, and finding sentence boundaries. They are also sometimes called logistic regression models, maximum likelihood exponential models, or log-linear models, and can be equivalent to a form of perceptrons, or single layer neural networks. In particular, perceptrons that use the standard sigmoid function and optimize for log-loss can be perceived as being equivalent to maxent.
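- For reference, a conditional maxent model takes the standard log-linear form below, where the f_i(x, y) are feature functions and the λ_i are learned weights; this is textbook background rather than a formula recited by the claimed subject matter.

```latex
P(y \mid x) \;=\; \frac{\exp\bigl(\sum_i \lambda_i f_i(x, y)\bigr)}
                       {\sum_{y'} \exp\bigl(\sum_i \lambda_i f_i(x, y')\bigr)}
```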
- the in situ search component 102 can include a query component 304 that can conduct searches of an individual user's search space (e.g., various persisting means associated with the user, such as hard drives associated with the processing device and/or distributed over Wide Area Networks (WANs), Local Area Networks (LANs), and/or Storage Area Networks (SANs), USB drives/memory sticks, and/or memory devices affiliated with the user and confederated with the processing device) as well as the Internet, based at least in part on the digital form generated by the conversion component 302 .
- the query component 304 can be any suitable search engine that can search remote data, local data, and/or any suitable combination thereof to identify the search result 104 .
- the search effectuated by the query component 304 can be conducted as a background process in order to mitigate the distracting effects such searches can have on an individual's concentration on the task at hand.
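- A minimal sketch of such a background search follows; `search_fn` stands in for the query component 304 (any local or remote search engine) and `on_done` for whatever displays results or attaches them to a breadcrumb. The names are illustrative assumptions, not an API defined by the claimed subject matter.

```python
import threading

def launch_in_situ_search(query, search_fn, on_done):
    """Run a search off the primary thread so the user's task is not
    interrupted, delivering results through the `on_done` callback."""
    def worker():
        results = search_fn(query)
        on_done(results)   # e.g., attach results to the breadcrumb

    # Daemon thread: the search never blocks the user interface.
    threading.Thread(target=worker, daemon=True).start()
```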
- the search results (e.g., search result 104 ) can be associated with a persistent and embeddable graphical object (discussed below) and can be immediately displayed or displayed at a later time depending on individual preference.
- the system 300 can further utilize a graphic component 306 that can generate at least one of a persistent and embeddable graphical object or a search query box.
- the search query box can be generated upon the detection of a flick gesture without any data selected and/or identified for a search or query. Without a selection of data, the search query box can be utilized in which a user can input specific terms, phrases, characters, etc. on which to perform a search.
- a user can perform a flick gesture without any text selected (e.g., highlight, a circle, a lasso, an underline, a color, a box, an ellipse, etc.) which generates a search query box (embedded and persistent within the email) to enable a user to input query terms, characters, etc.
- the persistent and embeddable graphical object can be a breadcrumb, for instance, that is displayed and populated with at least one of a contextual ambit of flagged words and/or phrases, search results (e.g., search result 104 ), previous searches, historic data, preferences in relation to the in situ search, etc.
- the breadcrumb can be a small icon (e.g., graphical object) attached to a selection region, which a user can then tap or stroke on to select what type of search to perform, using a direction stroke, a pull-down menu, or any other suitable technique to choose among different search options.
- Allowing searches to morph between different types in this manner encourages fluidity and curiosity-driven search in different domains (e.g., starting with a desktop search, but then later transitioning to a web search if the desired information is not available from one's personal data store, local data, etc.).
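- A breadcrumb might be modeled as a small object along the following lines; the field names are assumptions, since the text specifies behavior (attachment to a selection, tap or stroke to choose a search type, morphing between search domains) rather than a concrete structure.

```python
from dataclasses import dataclass, field

@dataclass
class Breadcrumb:
    """Sketch of a persistent, embeddable graphical object."""
    query: str                       # seed text for the search
    anchor: tuple                    # (x, y) position near the selection
    search_type: str = "desktop"     # may later morph, e.g., to "web"
    results: list = field(default_factory=list)
    history: list = field(default_factory=list)  # implicit query history

    def morph(self, new_type):
        # Morphing keeps the query but switches domains, supporting the
        # fluid desktop-to-web transitions described above.
        self.history.append(self.search_type)
        self.search_type = new_type
        self.results.clear()
```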
- the graphical object (e.g., the breadcrumb, the search query box, etc.) can persist until it is explicitly deleted by a user.
- a user-selectable option can be offered to allow a user to decide which persistence behavior is preferred.
- FIG. 4 illustrates a system 400 that facilitates leveraging in situ search triggered by a flick gesture with an application.
- the system 400 enables a flick gesture to activate an in situ search to be performed on user-defined or selected data.
- the flick gesture can be a rapid and dependable out-of-band gesture that can offer consistent cross-application functionality for common system commands such as cut, copy, paste, undo, redo, etc.
- the flick gesture can further be utilized to trigger “context sensitive” commands that depend on the current selection (or lack thereof).
- Flick gestures can also be supported with various input devices (as discussed) such as mice, touch pads, trackballs, etc.
- the system 400 can include an application 402 that can utilize and/or leverage the in situ search capabilities triggered by a flick gesture.
- the application 402 can be any suitable portion of software, hardware, device, web site, web service, and/or any other suitable entity that can employ a flick gesture as a trigger for an in situ search or query.
- a user or third-party application can define which flick direction supports which function. Thus, a diagonal flick may be the default offering, but any flick direction can be used to trigger a search.
- a third-party email application can include instructions and/or definitions that can enable a flick gesture to trigger a particular search associated with such email application.
- various settings, preferences, configurations, options, and the like can be further defined as default or personalized by a consumer.
- the flick gesture as a trigger for an in situ search can be seamlessly incorporated into the application 402 .
- FIG. 5 illustrates a system 500 that facilitates implementing an in situ search from a user based on a gesture received by an input device.
- the system 500 enables a flick gesture to trigger and initiate an in situ search related to data in a seamless manner.
- the system 500 can include the in situ search component 102 that can execute a search to provide the search result 104 upon the detection of a flick gesture.
- querying data can be done without disrupting an initial task that instigated the search or desire to search.
- the system 500 includes a user 502 that can interact with an input device (not shown).
- a user can employ a pen or mouse to highlight a keyword or short phrase, and then perform a flick gesture 504 to launch a search based on that phrase or keyword.
- the user 502 can perform a flick gesture 504 in any suitable direction, wherein such flick gesture 504 can trigger an in situ search.
- the subject innovation can implement any suitable number of flick gesture directions in order to activate a search.
- the flick gesture 504 can be recognized along eight cardinal compass directions with primary directions assigned to various system functions and diagonal flick directions for search functions.
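- Recognizing the eight cardinal compass directions amounts to quantizing the flick's displacement vector into 45-degree sectors; a minimal sketch, assuming screen coordinates with y growing downward, follows.

```python
import math

# Eight compass directions, counter-clockwise from east, in 45° sectors.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def flick_direction(dx, dy):
    """Quantize a flick's displacement vector to one of eight directions.

    Screen y grows downward, so dy is negated before computing the angle.
    """
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]
```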
- the flick gesture 504 can be identified and/or received from the input device via the interface 106 , in which the in situ search component 102 can employ a search to yield the search result 104 for the user 502 .
- the user 502 can select a portion of data and perform the flick gesture 504 to initiate a search.
- the following can be employed: 1) an empty search query box can be generated in-place where the user can handwrite or type a query; or 2) a search query box can be generated that can be seeded with a word, if any, that falls under the pointer location at the start of the flick gesture.
- the seeded query is “selected” by default, such that the user 502 can either proceed directly with this query or immediately start typing or writing on top of it to overwrite the seeded query with a new one.
- the user 502 can perform the flick gesture 504 to trigger a search without a prior selection of data.
- a selection can seed a subsequent flick-based query implementing at least one of the following: 1) the system 500 can pre-fetch a search result 104 for the seeded query and display it immediately on the assumption that the results represent the desired query; 2) the system 500 can pre-fetch when the user 502 forms any valid selection (e.g., before the user flicks to ask for search results) such that the search result 104 is available immediately without waiting for a search engine (e.g., the query component discussed earlier) to return a result, as sketched below; 3) a type of result can depend on a selection region (e.g., short words or phrases can trigger keyword searches, selections of longer passages of text can perform a vector-based search for related documents, etc.); 4) a search can be offered as a flick command if there is a previous selection; 5) a selection can proceed from an ink stroke that
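- The pre-fetching behavior of option 2 might look like the following sketch; `search_fn` is a placeholder for the query component, and a real implementation would run it asynchronously (e.g., with the background-thread helper sketched earlier) and cancel stale fetches.

```python
class PrefetchingSearch:
    """Fetch results as soon as a valid selection forms, so a later
    flick can display them immediately."""

    def __init__(self, search_fn):
        self.search_fn = search_fn
        self.cache = {}

    def on_selection(self, seed_text):
        # Pre-fetch before the user flicks to ask for search results.
        if seed_text and seed_text not in self.cache:
            self.cache[seed_text] = self.search_fn(seed_text)

    def on_flick(self, seed_text):
        # Results are available at once if they were pre-fetched.
        return self.cache.get(seed_text) or self.search_fn(seed_text)
```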
- FIGS. 6-8 illustrate methodologies and/or flow diagrams in accordance with the claimed subject matter.
- the methodologies are depicted and described as a series of acts. It is to be understood and appreciated that the subject innovation is not limited by the acts illustrated and/or by the order of acts. For example, acts can occur in various orders and/or concurrently, and with other acts not presented and described herein. Furthermore, not all illustrated acts may be required to implement the methodologies in accordance with the claimed subject matter. In addition, those skilled in the art will understand and appreciate that the methodologies could alternatively be represented as a series of interrelated states via a state diagram or events. Additionally, it should be further appreciated that the methodologies disclosed hereinafter and throughout this specification are capable of being stored on an article of manufacture to facilitate transporting and transferring such methodologies to computers. The term article of manufacture, as used herein, is intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
- FIG. 6 illustrates a method 600 that facilitates initiating an in situ search of data upon detection of a flick gesture.
- the method 600 commences at reference numeral 602 where various processor initialization tasks and background activities are performed. After these tasks have been performed the method 600 can proceed to reference numeral 604 where a flick gesture can be received.
- the flick gesture can be a quick, linear movement associated with a scrolling action and/or command.
- the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness.
- the method proceeds to reference numeral 606 where it is ascertained whether the flick gesture that was received pertains to the instigation of a search.
- Illustrative gestures that can indicate that users wish to instigate a search can include using, for example, a lasso gesture (e.g., encircling inked text), an underlining gesture and/or a sweeping gesture representative of highlighting the inked text, a flick gesture in a specific direction, a flick gesture with or without a portion of data selected, etc.
- An embeddable graphical object can be a visible representation of a query that acts as a handle placed in close proximity to, and/or logically attached to, ink that triggered the query. Nevertheless, it should be noted that embeddable graphical objects can be placed in any location desired by the user and/or heuristically determined by the system.
- In less than a second, and without interrupting the flow of a note taking task, a user can, for example, “lasso” or “highlight” some ink to specify a search, and leave a search breadcrumb to be visited later. When the user returns, he/she can hover over the breadcrumb to see details, or to view the search results.
- Embeddable graphical objects or breadcrumbs serve as persisted reminders to revisit previous queries, and implicitly record a history of queries in the context of the notes that led to the search.
- Breadcrumbs can be cut, copied, pasted, selected, and/or moved around the user's notes.
- Breadcrumbs are furthermore persisted with the content itself (e.g. when saved as part of a digital notebook or note document).
- the method 600 can proceed to reference numeral 610 where the inked text that has been selected (e.g., lassoed, highlighted, underlined, etc.) can be digitized and analyzed (e.g., lexically scanned to determine search terms). Digitizing and analysis of lassoed and/or highlighted ink can take the form of pattern recognition, optical character recognition, character recognition and/or handwriting analysis that can be carried out, for example, by a machine learning and/or artificial intelligence component.
- the resultant digital form can be employed as a parameter to instigate search functionality at reference numeral 612 .
- the search can be run either as a foreground process or a background process. The choice of whether to have the search functionality execute in foreground or background can be a matter of individual preference. Regardless of whether the search is effectuated as a foreground or background process, the search can typically yield results that can be displayed immediately upon completion of the search or display of the results can be deferred to a more conducive time when the user is more receptive to viewing the results.
- the results of the search can be associated with the embeddable graphical object at reference numeral 614 .
- the embeddable object together with the associated search results can be inserted at reference numeral 616 in a location contiguous or abutting the selected data (e.g., circled, highlighted ink, etc.) that instigated the search, at which point the methodology 600 cycles back to 604 to await an additional flick gesture received from an input device.
- the embeddable object itself may be scaled in proportion to the total size of the lasso, e.g.
- the embeddable object furthermore may be tapped or stroked to select it, whereupon the user is free to explicitly move it elsewhere, or resize it larger or smaller, if desired.
- FIG. 7 illustrates a method 700 that facilitates executing an in situ search on a portion of data or a received user-specified query.
- a flick gesture can be received.
- the flick gesture can be received from an input device, wherein the input device can be, but is not limited to being, a tablet, a touch screen, a mouse, a touch pad, a trackball, a stylus and touch screen device, and/or any other suitable input device capable of inputting a flick gesture.
- a determination is made whether a portion of data has been selected in combination with the flick gesture. If a portion of data is selected, the methodology 700 continues at reference numeral 706 .
- a portion of data (e.g., text, characters, images, etc.) can be selected with highlighting, underlining, lassoing, circling, and/or any other suitable technique to identify a portion of data with an input device.
- an in situ search can be executed on the selected data.
- the in situ search can be implemented so as to not distract a user or shift attention. In other words, the search can be seamlessly initiated (e.g., in background, foreground, etc.).
- a search result can be provided.
- a search query box can be generated and displayed.
- the flick gesture can be a trigger to implement a search query box to enhance searching data.
- a portion of data can be collected with the search query box. For example, a user can input text, characters, words, phrases, keywords, images, etc.
- an in situ search can be executed on the collected portion of data. It is to be appreciated that the search can be performed in the background, the foreground, and/or any other suitable combination thereof.
- a search result can be provided based upon the search.
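- The overall dispatch of method 700 can be summarized in a few lines; the three parameters below are placeholders for the components described above, not names defined by the claimed subject matter.

```python
def handle_flick(selection, show_query_box, run_search):
    """Dispatch logic of method 700: search the selection if one exists,
    otherwise collect a query through a generated search query box.

    `selection` is the data selected with the flick (or None),
    `show_query_box` prompts the user and returns the entered query, and
    `run_search` performs the in situ search and returns results.
    """
    seed = selection if selection else show_query_box()
    return run_search(seed)  # may execute in the background or foreground
```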
- FIG. 8 illustrates a method 800 for initiating a graphic overlay for in situ search based on a flick gesture.
- a flick gesture can be received.
- the flick gesture can be a quick, linear movement associated with a scrolling action and/or command.
- the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness.
- the flick gesture can be evaluated. The flick gesture received can be evaluated in order to identify at least one of a speed, a direction, a location, an area, etc.
- a type of search can be executed based at least in part upon the flick gesture or the direction of the flick gesture.
- any suitable search can be implemented such as, but not limited to, a local search, a remote search, a file type based search (e.g., web site search, email search, document search, audio file search, etc.), application-based search, etc.
- for instance, an upward flick gesture can trigger a local search (e.g., local hard drive, desktop, folders, local networks, etc.), whereas a downward flick gesture can trigger a remote search (e.g., web pages, the Internet, remote networks, etc.).
- the flick gesture can be incorporated as a search trigger for a third-party application.
- the third-party application can be any suitable portion of software, hardware, device, web site, web service, and/or any other suitable entity that can employ a flick gesture as a trigger for an in situ search or query.
- a user or third-party application can define which flick direction supports which function.
- a diagonal flick may be the default offering, but any flick direction can be used to trigger a search.
- a third-party email application can include instructions and/or definitions that can enable a flick gesture to trigger a particular search associated with such email application.
- FIGS. 9-10 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the various aspects of the subject innovation may be implemented.
- an in situ search component that can execute an in situ search based upon a flick gesture detection, as described in the previous figures, can be implemented in such suitable computing environment.
- program modules include routines, programs, components, data structures, etc., that perform particular tasks and/or implement particular abstract data types.
- inventive methods may be practiced with other computer system configurations, including single-processor or multi-processor computer systems, minicomputers, mainframe computers, as well as personal computers, hand-held computing devices, microprocessor-based and/or programmable consumer electronics, and the like, each of which may operatively communicate with one or more associated devices.
- the illustrated aspects of the claimed subject matter may also be practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. However, some, if not all, aspects of the subject innovation may be practiced on stand-alone computers.
- program modules may be located in local and/or remote memory storage devices.
- FIG. 9 is a schematic block diagram of a sample-computing environment 900 with which the claimed subject matter can interact.
- the system 900 includes one or more client(s) 910 .
- the client(s) 910 can be hardware and/or software (e.g., threads, processes, computing devices).
- the system 900 also includes one or more server(s) 920 .
- the server(s) 920 can be hardware and/or software (e.g., threads, processes, computing devices).
- the servers 920 can house threads to perform transformations by employing the subject innovation, for example.
- the system 900 includes a communication framework 940 that can be employed to facilitate communications between the client(s) 910 and the server(s) 920 .
- the client(s) 910 are operably connected to one or more client data store(s) 950 that can be employed to store information local to the client(s) 910 .
- the server(s) 920 are operably connected to one or more server data store(s) 930 that can be employed to store information local to the servers 920 .
- an exemplary environment 1000 for implementing various aspects of the claimed subject matter includes a computer 1012 .
- the computer 1012 includes a processing unit 1014 , a system memory 1016 , and a system bus 1018 .
- the system bus 1018 couples system components including, but not limited to, the system memory 1016 to the processing unit 1014 .
- the processing unit 1014 can be any of various available processors. Dual microprocessors and other multiprocessor architectures also can be employed as the processing unit 1014 .
- the system bus 1018 can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), Card Bus, Universal Serial Bus (USB), Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Firewire (IEEE 1394), and Small Computer Systems Interface (SCSI).
- the system memory 1016 includes volatile memory 1020 and nonvolatile memory 1022 .
- the basic input/output system (BIOS) containing the basic routines to transfer information between elements within the computer 1012 , such as during start-up, is stored in nonvolatile memory 1022 .
- nonvolatile memory 1022 can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory 1020 includes random access memory (RAM), which acts as external cache memory.
- RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
- Disk storage 1024 includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-100 drive, flash memory card, or memory stick.
- disk storage 1024 can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), CD recordable drive (CD-R Drive), CD rewritable drive (CD-RW Drive) or a digital versatile disk ROM drive (DVD-ROM).
- a removable or non-removable interface is typically used such as interface 1026 .
- FIG. 10 describes software that acts as an intermediary between users and the basic computer resources described in the suitable operating environment 1000 .
- Such software includes an operating system 1028 .
- Operating system 1028 , which can be stored on disk storage 1024 , acts to control and allocate resources of the computer system 1012 .
- System applications 1030 take advantage of the management of resources by operating system 1028 through program modules 1032 and program data 1034 stored either in system memory 1016 or on disk storage 1024 . It is to be appreciated that the claimed subject matter can be implemented with various operating systems or combinations of operating systems.
- Input devices 1036 include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processing unit 1014 through the system bus 1018 via interface port(s) 1038 .
- Interface port(s) 1038 include, for example, a serial port, a parallel port, a game port, and a universal serial bus (USB).
- Output device(s) 1040 use some of the same types of ports as input device(s) 1036 .
- a USB port may be used to provide input to computer 1012 , and to output information from computer 1012 to an output device 1040 .
- Output adapter 1042 is provided to illustrate that there are some output devices 1040 like monitors, speakers, and printers, among other output devices 1040 , which require special adapters.
- the output adapters 1042 include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device 1040 and the system bus 1018 . It should be noted that other devices and/or systems of devices provide both input and output capabilities such as remote computer(s) 1044 .
- Computer 1012 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 1044 .
- the remote computer(s) 1044 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 1012 .
- only a memory storage device 1046 is illustrated with remote computer(s) 1044 .
- Remote computer(s) 1044 is logically connected to computer 1012 through a network interface 1048 and then physically connected via communication connection 1050 .
- Network interface 1048 encompasses wire and/or wireless communication networks such as local-area networks (LAN) and wide-area networks (WAN).
- LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet, Token Ring and the like.
- WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).
- Communication connection(s) 1050 refers to the hardware/software employed to connect the network interface 1048 to the bus 1018 . While communication connection 1050 is shown for illustrative clarity inside computer 1012 , it can also be external to computer 1012 .
- the hardware/software necessary for connection to the network interface 1048 includes, for exemplary purposes only, internal and external technologies such as modems (including regular telephone grade modems, cable modems, and DSL modems), ISDN adapters, and Ethernet cards.
- the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter.
- the innovation includes a system as well as a computer-readable medium having computer-executable instructions for performing the acts and/or events of the various methods of the claimed subject matter.
- an appropriate API, tool kit, driver code, operating system, control, standalone or downloadable software object, etc. which enables applications and services to use the search techniques of the invention.
- the claimed subject matter contemplates the use from the standpoint of an API (or other software object), as well as from a software or hardware object that operates according to the search techniques in accordance with the invention.
- various implementations of the innovation described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software.
Abstract
The claimed subject matter provides a system and/or a method that facilitates in situ searching of data. An interface can receive a flick gesture from an input device. An in situ search component can employ an in situ search triggered by the flick gesture, wherein the in situ search is executed on at least one of a portion of data selected via the input device or a portion of data entered into a generated search query box.
Description
- This application relates to U.S. application Ser. No. 11/733,113, entitled, “IN SITU SEARCH FOR ACTIVE NOTE TAKING,” filed Apr. 9, 2007.
- Technological advances associated with computers, the Internet and the World Wide Web have enabled users to instantly access a vast and diverse amount of information. As compared to traditional libraries or encyclopedias, information provided by way of the Web is decentralized in nature. To locate information of interest, a user can employ a search engine that facilitates finding content stored on local or remote computers. Search engines can assist in locating information on the public Web, intranets, personal computers, and the like. Typical search engines can retrieve a list of references (e.g., search results) matching inputted criteria provided by the user. For instance, the user can perform a query by providing a word or phrase to the search engine and in response the search engine can return a list of search results matching the entered word, phrase, or a portion thereof. To further specify search queries, many search engines support utilization of Boolean terms such as, for instance, AND, OR and NOT as well as provisions related to a distance between keywords.
- The convergence of direct pen input devices, full text indexing of personal stores, and Internet search engines offers tremendous unexplored opportunities to design fluid user interfaces for searching (as discussed above) or active note taking. Active note taking can be considered to be, for example, the combination of pen-and-ink note taking with searching, linking, collecting, and sense making activities. This is in contrast to simple note taking, which is characterized by moment-to-moment transcription. Active note taking, for example, is typically performed by knowledge workers engaged in challenging creative work such as scientific research, product design, or planning complex activities, and the like. The knowledge workers often create informal pre-production work artifacts on paper, in notebooks, or on whiteboards, sketching preliminary plans and manipulating their notes to find solutions to difficult problems.
- In light of the above, personal information search and web-based search are trends with huge significance. More than ever, there are more and more documents, files, data, notes, etc. on computers and/or mobile devices such as a tablet, pocket PC, or smartphone. In addition, the amount of information available on the Internet continues to grow and is a compounding factor for the amount of stored and accumulated data. Conventional techniques for querying and/or accessing such data are inefficient, as attention is diverted to performing the search rather than to the task that instigated the search.
- The following presents a simplified summary of the innovation in order to provide a basic understanding of some aspects described herein. This summary is not an extensive overview of the claimed subject matter. It is intended neither to identify key or critical elements of the claimed subject matter nor to delineate the scope of the subject innovation. Its sole purpose is to present some concepts of the claimed subject matter in a simplified form as a prelude to the more detailed description that is presented later.
- The subject innovation relates to systems and/or methods that facilitate querying data based on a flick gesture. An in situ search component can receive a flick gesture via an interface, wherein the in situ search component can execute an in situ search with the flick gesture as a trigger. The in situ search component can implement at least one of the following upon the detection of a flick gesture: a search on a portion of selected data or a generation of a search query box. A portion of data (e.g., handwriting, text, characters, words, phrases, images, etc.) can be selected and queried upon the implementation of a flick gesture. Moreover, the flick gesture can provide a search query box to receive a query when there is an empty selection of data.
- In accordance with another aspect of the subject innovation, the in situ search component can execute a search based on characteristics of the flick gesture. For example, the flick gesture can be evaluated by an evaluation component to identify flick gesture speed, flick gesture direction, and the like. Based on the characteristics of the flick gesture, the in situ search component can implement various types of searches. In still another aspect of the claimed subject matter, the in situ search component can utilize a graphic component that can generate an embeddable persistent graphical object with the flick gesture as a trigger. The embeddable persistent graphical object can be populated with search results for a query, a search query box for query input, previous searches, historic data, etc. In other aspects of the claimed subject matter, methods are provided that facilitate executing a command based on a direction of a received flick gesture.
- The following description and the annexed drawings set forth in detail certain illustrative aspects of the claimed subject matter. These aspects are indicative, however, of but a few of the various ways in which the principles of the innovation may be employed and the claimed subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the claimed subject matter will become apparent from the following detailed description of the innovation when considered in conjunction with the drawings.
-
FIG. 1 illustrates a block diagram of an exemplary system that facilitates querying data based on a flick gesture. -
FIG. 2 illustrates a block diagram of an exemplary system that facilitates executing a command based on a direction of a received flick gesture. -
FIG. 3 illustrates a block diagram of an exemplary system that facilitates selecting a portion of data and initiating an in situ search based upon a flick gesture. -
FIG. 4 illustrates a block diagram of an exemplary system that facilitates leveraging in situ search triggered by a flick gesture with an application. -
FIG. 5 illustrates a block diagram of an exemplary system that facilitates implementing an in situ search from a user based on a gesture received by an input device. -
FIG. 6 illustrates an exemplary methodology that facilitates initiating an in situ search of data upon detection of a flick gesture. -
FIG. 7 illustrates an exemplary methodology for executing an in situ search on a portion of data or a received user-specified query. -
FIG. 8 illustrates an exemplary methodology that facilitates initiating a graphic overlay for in situ search based on a flick gesture. -
FIG. 9 illustrates an exemplary networking environment, wherein the novel aspects of the claimed subject matter can be employed. -
FIG. 10 illustrates an exemplary operating environment that can be employed in accordance with the claimed subject matter.
- The claimed subject matter is described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject innovation. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the subject innovation.
- As utilized herein, terms “component,” “system,” “interface,” “input device,” “application,” and the like are intended to refer to a computer-related entity: hardware, software (e.g., in execution), and/or firmware. For example, a component can be a process running on a processor, a processor, an object, an executable, a program, a function, a library, a subroutine, a computer, and/or a combination of software and hardware. By way of illustration, both an application running on a server and the server itself can be a component. One or more components can reside within a process, and a component can be localized on one computer and/or distributed between two or more computers.
- Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. For example, computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick, key drive . . . ). Additionally, it should be appreciated that a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN). Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter. Moreover, the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
- Now turning to the figures,
FIG. 1 illustrates a system 100 that facilitates querying data based on a flick gesture. The system 100 can include an in situ search component 102 that can receive a flick gesture via an interface component 106 (discussed below), wherein the flick gesture can trigger the execution of a search to return a search result 104. The in situ search component 102 can implement an in situ search or query on any suitable portion of data upon the detection of a flick gesture. It is to be appreciated and understood that the flick gesture can be a quick, linear movement associated with a scrolling action and/or command. Moreover, the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness. Such linear movements can correspond to an input device (not shown) such as, but not limited to, a tablet, a touch screen, a mouse, a touch pad, a trackball, and/or any other suitable input device capable of inputting a flick gesture. In addition, the in situ search component 102 can implement particular searches or queries based upon a direction of the flick gesture (discussed in more detail below). Upon receipt of the flick gesture via the interface component 106, the in situ search component 102 can provide a dynamic search on a portion of data in a seamless manner without disrupting a user's attention to his or her primary task prior to the search. It is to be further appreciated that the in situ search component 102 can provide a plurality of search results 104; a single search result 104 is illustrated for the sake of brevity.
- For example, a portion of text within a document can be selected by a user. The user can input a flick gesture in order to search with the selected text as a “seed” for the query. While the search is performed on the selected text, the user can continue to read the document without interruption of his or her task. Thus, the flick-gesture-activated search can be executed in the background so as to enable a user to seamlessly search documents, data, files, etc. In another example, a user can initiate a flick gesture and then a selection of data on which to perform the search. In other words, the flick gesture and the selection of data can be in any suitable order or sequence. In general, it is to be appreciated that the selection of data to query can occur before and/or after the flick gesture.
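- To make the speed-and-straightness criterion concrete, the following is a minimal sketch of how a pointer trace might be classified as a flick gesture. It is an illustration only, not the patented implementation; the two threshold constants are invented for the example, since the specification gives no numeric values.

```python
import math

# Illustrative thresholds; the specification gives no numeric values.
MIN_SPEED_PX_PER_SEC = 1500.0   # "high speed"
MIN_STRAIGHTNESS = 0.95         # "high degree of straightness"

def is_flick(samples):
    """Classify a pointer trace as a flick gesture.

    `samples` is a list of (x, y, t) tuples, with t in seconds.
    A flick is a quick, nearly straight movement: the net
    displacement must be traversed fast, and the path length
    must stay close to the straight-line distance.
    """
    if len(samples) < 2:
        return False
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    net = math.hypot(x1 - x0, y1 - y0)
    elapsed = t1 - t0
    if elapsed <= 0 or net == 0:
        return False
    # Total arc length of the trace.
    path = sum(
        math.hypot(b[0] - a[0], b[1] - a[1])
        for a, b in zip(samples, samples[1:])
    )
    speed = net / elapsed
    straightness = net / path  # 1.0 == perfectly straight
    return speed >= MIN_SPEED_PX_PER_SEC and straightness >= MIN_STRAIGHTNESS
```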
- In another aspect of the subject innovation, the in situ search component 102 can utilize a flick gesture as a trigger to prompt a user with a search query box to enter a user-defined search. For example, a user can be examining a web page, perform a flick gesture, and be presented with a search query box in order to input user-defined data (e.g., handwriting, text, numbers, alphanumeric characters, etc.). The data inputted in the search query box can be a seed for a query in which to return at least one search result (e.g., search result 104). With such an example, a user can quickly and efficiently search data with minimal interruption or distraction by utilizing a flick gesture as a trigger to display a search query box.
- In general, typical search functionality is slow to access and requires switching to a different application or a type-in box that is divorced from a user's focus of attention. As a result, the user often has to re-enter search terms that already existed directly in the context of a web page, a document, ink notes, etc. These barriers tend to deter users from issuing searches in the first place, resulting in lost opportunities for the user to quickly access related information. The system 100 relieves a user of such headaches with the employment of flick-based in situ searches. The flick-based in situ searches implemented by the in situ search component 102 can eliminate these unnecessary steps and make it far simpler and quicker for users to go from having the thought of doing a search to actually getting useful results on their screen/display.
- In addition, the system 100 can include any suitable and/or necessary interface component 106 (herein referred to as “interface 106”), which provides various adapters, connectors, channels, communication paths, etc. to integrate the in situ search component 102 into virtually any operating and/or database system(s) and/or with one another. In addition, the interface 106 can provide various adapters, connectors, channels, communication paths, etc., that provide for interaction with the in situ search component 102, flick gestures, input devices, the search result 104, and any other device and/or component associated with the system 100. -
FIG. 2 illustrates a system 200 that facilitates executing a command based on a direction of a received flick gesture. The system 200 can include the in situ search component 102 that can execute a search based upon the detection of a flick gesture via the interface 106. Upon such in situ search, the in situ search component 102 can provide at least one search result 104. For instance, a flick gesture can be performed with an input device in order to activate an in situ search to produce the search result 104, wherein such in situ search can be performed on at least one of 1) a portion of selected data; 2) a portion of data entered in a prompted search query box; or 3) any suitable combination thereof. Thus, for example, a user can select a portion of text in an email, perform a flick gesture, and be prompted with an additional search query box in which user-entered text/data and the selected portion of text can be seeds for a query. Furthermore, the system 200 may employ the context of other surrounding words, in addition to those explicitly selected by the user, to specialize, personalize, or contextualize the search results (e.g., by re-ranking web search results) to suit the particular user, or the particular document in which the user triggered the search.
- The system 200 can utilize an evaluation component 202 that can detect at least one of a flick gesture, a direction of a flick gesture, a speed of a linear input, a direction of a linear input, a location of a linear input, an area of a linear input, a data selection from an input device, and/or any other data related to an input from an input device. For example, the evaluation component 202 can continuously monitor an input device to detect a flick gesture. Once a flick gesture is identified, the in situ search component 102 can initiate a search or query by 1) executing a search on a portion of selected data; or 2) prompting a search query box for a user to fill. In another example, the evaluation component 202 can identify portions of data selected on which to search when triggered by a flick gesture. For instance, the evaluation component 202 can evaluate a location and/or area (e.g., handwriting, a portion of text, a portion of characters, a word, a phrase, a keyword, a sentence, a portion of an image, a graphic, a bitmap, or a portion of an icon, etc.) for which data is selected by the input device. In still another example, the evaluation component 202 can determine if a gesture is within a speed and/or direction threshold in order to be considered a flick gesture. Note that the evaluation component 202 may be a system component independent of individual applications; in this manner, the flick gesture serves as a system-wide gesture that supports all applications, rather than requiring individual applications to implement suitable search functionality.
- In another example, the evaluation component 202 can identify a direction associated with the flick gesture, wherein the direction of the flick gesture can correlate to a particular type of in situ search. For example, it is to be appreciated that any suitable search can be implemented by the in situ search component 102 such as, but not limited to, a local search, a remote search, a file-type-based search (e.g., web site search, email search, document search, audio file search, search within a particular directory, storage volume, or an operating system construct such as a ‘Start Menu’, etc.), an application-based search, etc. Thus, a particular flick gesture direction can correspond to a specific type of in situ search implemented by the in situ search component 102. For instance, an upward flick gesture can trigger a local search (e.g., local hard drive, desktop, folders, local networks, etc.), whereas a downward flick gesture can trigger a remote search (e.g., web pages, the Internet, remote networks, etc.). In other words, more than one flick gesture can be assigned to different types of search functionality such as desktop (e.g., personal information) search, web search, search within the current document (e.g., often exposed as a find feature), etc. Likewise, a single flick gesture can present various types of search results that are grouped together, or interspersed according to a ranking function, e.g., a results list with desktop search results, web search results, image search results, etc.
- In addition, the evaluation component 202 can identify a flick gesture, wherein the in situ search component 102 can implement a correlating or mapped search command stored in a data store 204 (discussed in more detail below). For example, a collection of mappings can be stored on the data store 204, in which a flick gesture can correspond with a particular command. In one example, the mappings can include a flick gesture direction and corresponding search command (e.g., a diagonal upward-right flick gesture translates to a local search, a diagonal downward-left flick gesture translates to an email search, an upward flick gesture translates to a search within the opened file or the file in which the gesture occurred, etc.).
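- As a sketch of such a mapping, a flick vector can be quantized to the nearest of eight compass directions and looked up in a table. The direction-to-command assignments below merely echo the examples named above; they are illustrative, not prescribed by the specification.

```python
import math

# Example mapping table such as might be kept in the data store 204.
# The direction-to-command assignments are illustrative only.
FLICK_COMMANDS = {
    "N":  "search_local",    # upward flick -> local search
    "S":  "search_web",      # downward flick -> remote/web search
    "NE": "search_desktop",  # diagonal up-right -> desktop search
    "SW": "search_email",    # diagonal down-left -> email search
}

DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def flick_direction(dx, dy):
    """Quantize a flick vector to one of eight compass directions.

    Screen coordinates grow downward, so dy is negated to make
    'N' mean an upward flick.
    """
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]

def command_for_flick(dx, dy):
    return FLICK_COMMANDS.get(flick_direction(dx, dy))
```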
- As discussed, the system 200 can further include the data store 204 that can include any suitable data related to the in situ search component 102, the search result 104, an input device, etc. For example, the data store 204 can include, but is not limited to including, mappings (e.g., flick gesture direction and corresponding command, etc.), thresholds for flick gesture qualification, search types, user settings, in situ search configurations, user preferences, graphical overlays (e.g., a breadcrumb (discussed in more detail below), a search query box, etc.), directional definitions for a flick gesture, historic data related to a search, and/or any other suitable data related to the system 200.
- It is to be appreciated that the data store 204 can be, for example, either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. By way of illustration, and not limitation, nonvolatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). The data store 204 of the subject systems and methods is intended to comprise, without being limited to, these and any other suitable types of memory. In addition, it is to be appreciated that the data store 204 can be a server, a database, a hard drive, a pen drive, an external hard drive, a portable hard drive, and the like. -
FIG. 3 illustrates a system 300 that facilitates selecting a portion of data and initiating an in situ search based upon a flick gesture. The system 300 can include the in situ search component 102 that employs a search or query based upon the detection of a flick gesture received via the interface 106. By using a flick gesture to trigger and execute an in situ search, attention can be maintained and focused rather than diverted to perform a search. Once detected, a query can be performed on a selection of data to provide the search result 104. If a selection of data is absent, a user can be prompted to provide a selection and/or a search query box can be displayed to receive a query. The system 300 may present an initial list of desktop search results, sorted by age, if no query term is present, so that all recent documents, emails, etc. are available as a default search that the user can then further filter down (e.g., by file type, author, date, etc.) or further restrict by adding search terms to the query. For example, an empty data selection coupled with a flick gesture can return recent search results from a previous search, which a user can further filter down by adding a search term or by other data (e.g., file type, date, author, etc.).
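- A sketch of that default behavior follows. The `index` object and its `search`/`all_items` methods, and the `modified`/`kind` attributes on items, are hypothetical stand-ins for whatever desktop search backend the system uses.

```python
from datetime import datetime, timedelta

def default_search(index, query=None, file_type=None, since_days=30):
    """Return recent desktop items when no query term is present.

    `index` is a hypothetical desktop-search backend exposing
    `search(text)` and `all_items()`; items are assumed to carry
    `modified` (datetime) and `kind` (file type) attributes.
    """
    if query:
        results = index.search(query)
    else:
        # Empty selection + flick: fall back to everything recent,
        # sorted by age, for the user to filter down afterwards.
        cutoff = datetime.now() - timedelta(days=since_days)
        results = [it for it in index.all_items() if it.modified >= cutoff]
    if file_type:
        results = [it for it in results if it.kind == file_type]
    return sorted(results, key=lambda it: it.modified, reverse=True)
```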
- The system 300 can include a conversion component 302 that can enhance a selected portion of data to perform a search, wherein the enhancement can be a digital conversion or handwriting conversion. For example, a portion of data can be selected and a flick gesture executed in order to search the portion of selected data. It is to be appreciated that the portion of data can be handwritten, typed, extracted from an image via optical character recognition techniques, and/or any suitable combination thereof. The conversion component 302 can translate handwritten data, typed data, and/or any other suitable data identified in order to perform an in situ search.
- Moreover, the conversion component 302 can scan through inked handwritten script (e.g., graphemes, block, and/or cursive) and provide handwriting recognition to produce a digital form of the inked handwritten script. It is to be appreciated that the conversion component 302 can be used in conjunction with an artificial intelligence/machine learning component (not shown), or additionally and/or alternatively the conversion component 302 can itself comprise or include the intelligence/machine learning component. In general, there are several types of learning algorithms that can be utilized with respect to intelligence/machine learning. In particular, conditional maximum entropy (maxent) models have been widely employed for a variety of tasks, including language modeling, part-of-speech tagging, prepositional phrase attachment, parsing, word selection for machine translation, and finding sentence boundaries. They are also sometimes called logistic regression models, maximum likelihood exponential models, or log-linear models, and can be equivalent to a form of perceptrons, or single-layer neural networks. In particular, perceptrons that use the standard sigmoid function and optimize for log-loss can be perceived as being equivalent to maxent.
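- As a minimal illustration of that equivalence, a conditional maxent (logistic regression) scorer over named features can be written in a few lines. The feature names and weights would come from training a real recognizer; here they are placeholders only.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def maxent_score(weights, features):
    """Probability that a stroke fragment matches a class.

    A conditional maxent (logistic regression) model: a weighted sum
    of features pushed through the sigmoid. Trained with log-loss,
    this is exactly the single-layer perceptron noted in the text.
    `weights` and `features` are dicts keyed by feature name; both
    are placeholders standing in for a trained recognizer.
    """
    z = sum(weights.get(name, 0.0) * value
            for name, value in features.items())
    return sigmoid(z)
```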
- Furthermore, the in situ search component 102 can include a query component 304 that can conduct searches of an individual user's search space (e.g., various persisting means associated with the user, such as hard drives associated with the processing device and/or distributed over Wide Area Networks (WANs), Local Area Networks (LANs), and/or Storage Area Networks (SANs), USB drives/memory sticks, and/or memory devices affiliated with the user and confederated with the processing device) as well as the Internet, based at least in part on a digital form generated by the conversion component 302. In other words, the query component 304 can be any suitable search engine that can search remote data, local data, and/or any suitable combination thereof to identify the search result 104. The search effectuated by the query component 304 can be conducted as a background process in order to mitigate the distracting effects such searches can have on an individual's concentration on the task at hand. Similarly, search results (e.g., search result 104) can be associated with a persistent and embeddable graphical object (discussed below) and can be immediately displayed or displayed at a later time depending on individual preference.
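- One way to keep the query off the user's primary task is a worker thread, as in the sketch below; `query_fn` and `on_done` are hypothetical hooks into the surrounding system, not actual interfaces from the specification.

```python
import threading

def launch_in_situ_search(query_fn, seed, on_done):
    """Run a search off the UI thread so the user's task is not interrupted.

    `query_fn` is any callable that takes the seed text and returns a
    result list (a local index, a web engine, or both); `on_done`
    receives the results, e.g. to populate a breadcrumb for later
    viewing. Both callables are assumptions for this sketch.
    """
    def worker():
        results = query_fn(seed)
        on_done(results)

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t  # caller may join() or let it finish in the background
```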
- The system 300 can further utilize a graphic component 306 that can generate at least one of a persistent and embeddable graphical object or a search query box. As discussed, the search query box can be generated upon the detection of a flick gesture without any data selected and/or identified for a search or query. Without a selection of data, the search query box can be utilized so that a user can input specific terms, phrases, characters, etc. on which to perform a search. For example, if examining an email, a user can perform a flick gesture without any text selected (e.g., via a highlight, a circle, a lasso, an underline, a color, a box, an ellipse, etc.), which generates a search query box (embedded and persistent within the email) to enable a user to input query terms, characters, etc.
- The persistent and embeddable graphical object can be a breadcrumb, for instance, to be displayed and populated with at least one of a contextual ambit of flagged words and/or phrases, search results (e.g., search result 104), previous searches, historic data, preferences in relation to the in situ search, etc. In general, the breadcrumb can be a small icon (e.g., graphical object) attached to a selection region, which a user can then tap or stroke on to select what type of search to perform, using a directional stroke, a pull-down menu, or any other suitable technique to choose among different search options. Allowing searches to morph between different types in this manner encourages fluidity and curiosity-driven search in different domains (e.g., starting with a desktop search, but later transitioning to a web search if the desired information is not available from one's personal data store, local data, etc.).
- In another aspect in accordance with the subject innovation, the graphical object (e.g., the breadcrumb, the search query box, etc.) can persist until it is explicitly deleted by a user. In another aspect, the graphical object (e.g., the breadcrumb, the search query box, etc.) can exist only while a current selection is active. Generally, a user-selectable option can be offered to allow a user to decide which behavior is preferred.
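- One possible shape for such a breadcrumb record is sketched below. The field names are invented for illustration; the specification describes behavior (anchored to a selection, movable, persisted with the note content) rather than a concrete schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Breadcrumb:
    """Persistent, embeddable handle for an in situ search.

    Field names are illustrative assumptions, not part of the
    specification.
    """
    anchor: Tuple[float, float]          # position near the selection
    seed_text: str                       # query that produced it
    results: List[str] = field(default_factory=list)
    history: List[str] = field(default_factory=list)
    persist_until_deleted: bool = True   # user-selectable lifetime option

    def move_to(self, x: float, y: float) -> None:
        """Breadcrumbs can be moved around the user's notes."""
        self.anchor = (x, y)

    def revisit(self) -> List[str]:
        """Hovering/tapping shows the stored results for this query."""
        self.history.append(self.seed_text)
        return self.results
```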
-
FIG. 4 illustrates a system 400 that facilitates leveraging an in situ search triggered by a flick gesture with an application. The system 400 enables a flick gesture to activate an in situ search to be performed on user-defined or selected data. The flick gesture can be a rapid and dependable out-of-band gesture that can offer consistent cross-application functionality for common system commands such as cut, copy, paste, undo, redo, etc. The flick gesture can further be utilized to trigger “context sensitive” commands that depend on the current selection (or lack thereof). Flick gestures can also be supported with various input devices (as discussed) such as mice, touch pads, trackballs, etc.
- The system 400 can include an application 402 that can utilize and/or leverage the in situ search capabilities triggered by a flick gesture. It is to be appreciated that the application 402 can be any suitable portion of software, hardware, device, web site, web service, and/or any other suitable entity that can employ a flick gesture as a trigger for an in situ search or query. For example, a user or third-party application can define which flick direction supports which function. Thus, a diagonal flick may be the default offering, but any flick direction can be used to trigger a search. In another example, a third-party email application can include instructions and/or definitions that can enable a flick gesture to trigger a particular search associated with such email application. In addition, various settings, preferences, configurations, options, and the like (e.g., graphical object preferences, selection options, query defaults, sensitivity, direction of gesture, speed of gesture, etc.) can be further defined as defaults or personalized by a consumer. In general, it is to be appreciated that the flick gesture as a trigger for an in situ search can be seamlessly incorporated into the application 402.
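- For instance, a third-party application might bind flick directions to its own search functions through a small registry; the registry API below is hypothetical, sketched only to show the shape of such an integration.

```python
# Hypothetical registry through which a third-party application
# binds flick directions to its own search functions.
_flick_handlers = {}

def register_flick_search(direction, handler):
    """Bind a compass direction (e.g., 'SW') to a search callable."""
    _flick_handlers[direction] = handler

def dispatch_flick(direction, selection):
    """Route a detected flick to whichever handler claimed it."""
    handler = _flick_handlers.get(direction)
    return handler(selection) if handler else None

# Example: an email client routes a down-left flick to its own search.
register_flick_search("SW", lambda sel: ["email results for: " + sel])
print(dispatch_flick("SW", "quarterly report"))
```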
FIG. 5 illustrates a system 500 that facilitates implementing an in situ search from a user based on a gesture received by an input device. The system 500 enables a flick gesture to trigger and initiate an in situ search related to data in a seamless manner. The system 500 can include the in situ search component 102 that can execute a search to provide the search result 104 upon the detection of a flick gesture. By utilizing the flick gesture to activate a search, querying data can be done without disrupting the initial task that instigated the search or the desire to search.
- The system 500 includes a user 502 that can interact with an input device (not shown). For example, a user can employ a pen or mouse to highlight a keyword or short phrase, and then perform a flick gesture 504 to launch a search based on that phrase or keyword. The user 502 can perform a flick gesture 504 in any suitable direction, wherein such flick gesture 504 can trigger an in situ search. It is to be appreciated that although eight directions are illustrated, the subject innovation can implement any suitable number of flick gesture directions in order to activate a search. In one particular example, the flick gesture 504 can be recognized along eight cardinal compass directions, with primary directions assigned to various system functions and diagonal flick directions assigned to search functions. The flick gesture 504 can be identified and/or received from the input device via the interface 106, whereupon the in situ search component 102 can employ a search to yield the search result 104 for the user 502.
- For instance, the user 502 can select a portion of data and perform the flick gesture 504 to initiate a search. Thus, if there is no selection region at the time of a flick gesture, the following can be employed: 1) an empty search query box can be generated in-place where the user can handwrite or type a query; or 2) a search query box can be generated that can be seeded with a word, if any, that falls under the pointer location at the start of the flick gesture. In the latter, the seeded query is “selected” by default, such that the user 502 can either proceed directly with this query or immediately start typing or writing on top of it to overwrite the seeded query with a new one.
- In another example, the user 502 can perform the flick gesture 504 to trigger a search without a prior selection of data. Thus, for example, if there is a prior selection, a selection can seed a subsequent flick-based query implementing at least one of the following: 1) the system 500 can pre-fetch a search result 104 for the seeded query and display it immediately on the assumption that the results represent the desired query; 2) the system 500 can pre-fetch when the user 502 forms any valid selection (e.g., before the user flicks to ask for search results) such that the search result 104 is available immediately without waiting for a search engine (e.g., the query component discussed earlier) to return a result; 3) a type of result can depend on a selection region (e.g., short words or phrases can trigger keyword searches, selections of longer passages of text can perform a vector-based search for related documents, etc.); 4) a search can be offered as a flick command if there is a previous selection; 5) a selection can proceed from ink strokes that are selected via a lasso selection (e.g., an input that encircles a portion of data) or other technique; 6) a selection may be a pure text string; or 7) a selection can be other context types that can seed valid queries (e.g., a selection of ink or text and a bitmap can trigger an image search by default, a selected object such as an icon representing an entire document can be selected for search, etc.).
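- The seeding and pre-fetch behaviors enumerated above might combine as in the following sketch, where `prefetch` is a hypothetical hook that warms the result cache before the user asks to see results.

```python
def seed_for_flick(selection, word_under_pointer, prefetch):
    """Choose the query seed when a flick arrives.

    A prior selection seeds the query (and may be pre-fetched so
    results are ready immediately); with no selection, the word
    under the pointer, if any, pre-populates an editable query box.
    `prefetch` is a hypothetical callable that warms the cache.
    """
    if selection:
        prefetch(selection)            # results ready before the user asks
        return {"seed": selection, "editable": False}
    if word_under_pointer:
        # Seeded query is selected by default, so typing overwrites it.
        return {"seed": word_under_pointer, "editable": True}
    return {"seed": "", "editable": True}  # empty in-place query box
```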
FIGS. 6-8 illustrate methodologies and/or flow diagrams in accordance with the claimed subject matter. For simplicity of explanation, the methodologies are depicted and described as a series of acts. It is to be understood and appreciated that the subject innovation is not limited by the acts illustrated and/or by the order of acts. For example, acts can occur in various orders and/or concurrently, and with other acts not presented and described herein. Furthermore, not all illustrated acts may be required to implement the methodologies in accordance with the claimed subject matter. In addition, those skilled in the art will understand and appreciate that the methodologies could alternatively be represented as a series of interrelated states via a state diagram or events. Additionally, it should be further appreciated that the methodologies disclosed hereinafter and throughout this specification are capable of being stored on an article of manufacture to facilitate transporting and transferring such methodologies to computers. The term article of manufacture, as used herein, is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. -
FIG. 6 illustrates a method 600 that facilitates initiating an in situ search of data upon detection of a flick gesture. The method 600 commences at reference numeral 602 where various processor initialization tasks and background activities are performed. After these tasks have been performed, the method 600 can proceed to reference numeral 604 where a flick gesture can be received. The flick gesture can be a quick, linear movement associated with a scrolling action and/or command. Moreover, the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness. When a flick gesture is received, the method proceeds to reference numeral 606 where it is ascertained whether the flick gesture that was received pertains to the instigation of a search. Illustrative gestures that can indicate that users wish to instigate a search can include using, for example, a lasso gesture (e.g., encircling inked text), an underlining gesture and/or a sweeping gesture representative of highlighting the inked text, a flick gesture in a specific direction, a flick gesture with or without a portion of data selected, etc. If at reference numeral 606 it is determined that the gesture received does not comport with a pre-specified and/or cognizable gesture (e.g., NO), the method 600 can return to reference numeral 604 to await an appropriate gesture. Otherwise (e.g., YES), the method 600 proceeds to reference numeral 608 whereupon an embeddable graphical object (e.g., a breadcrumb) is generated.
- An embeddable graphical object can be a visible representation of a query that acts as a handle placed in close proximity to, and/or logically attached to, the ink that triggered the query. Nevertheless, it should be noted that embeddable graphical objects can be placed in any location desired by the user and/or heuristically determined by the system. In some aspects of the claimed subject matter, in less than a second and without interrupting the flow of a note-taking task, a user can, for example, “lasso” or “highlight” some ink to specify a search, and leave a search breadcrumb to be visited later. When the user returns, he/she can hover over the breadcrumb to see details, or to view the search results. Embeddable graphical objects or breadcrumbs serve as persisted reminders to revisit previous queries, and implicitly record a history of queries in the context of the notes that led to the search. Breadcrumbs can be cut, copied, pasted, selected, and/or moved around the user's notes. Breadcrumbs are furthermore persisted with the content itself (e.g., when saved as part of a digital notebook or note document).
- The method 600 can proceed to reference numeral 610 where the inked text that has been selected (e.g., lassoed, highlighted, underlined, etc.) can be digitized and analyzed (e.g., lexically scanned to determine search terms). Digitizing and analysis of lassoed and/or highlighted ink can take the form of pattern recognition, optical character recognition, character recognition, and/or handwriting analysis that can be carried out, for example, by a machine learning and/or artificial intelligence component.
- After the lassoed and/or highlighted ink has been digitized and analyzed at reference numeral 610, the resultant digital form can be employed as a parameter to instigate search functionality at reference numeral 612. The search can be run either as a foreground process or a background process. The choice of whether to have the search functionality execute in the foreground or background can be a matter of individual preference. Regardless of whether the search is effectuated as a foreground or background process, the search can typically yield results that can be displayed immediately upon completion of the search, or display of the results can be deferred to a more conducive time when the user is more receptive to viewing the results. Nevertheless, whatever the user preference in this regard (e.g., view the results immediately or alternatively defer viewing to a later time), the results of the search can be associated with the embeddable graphical object at reference numeral 614. Once the search results have been affiliated with the embeddable graphical object, the embeddable object together with the associated search results can be inserted at reference numeral 616 in a location contiguous with or abutting the selected data (e.g., circled or highlighted ink, etc.) that instigated the search, at which point the methodology 600 cycles back to 604 to await an additional flick gesture received from an input device. Furthermore, the embeddable object itself may be scaled in proportion to the total size of the lasso, e.g., so that a small selection has a small embeddable object attached to it, but a large selection would have a full-sized embeddable object attached to it. The embeddable object furthermore may be tapped or stroked to select it, whereupon the user is free to explicitly move it elsewhere, or resize it larger or smaller, if desired.
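- Strung together, the acts of the methodology above might read as follows in code. Every callable here is a hypothetical hook into the surrounding system, named only for this sketch.

```python
def method_600(selection_ink, recognize_ink, run_query, attach_object):
    """End-to-end sketch of the acts in FIG. 6.

    `selection_ink` is the lassoed/highlighted ink; the three
    callables (recognition, search, and UI attachment) are
    hypothetical hooks, named only for this sketch.
    """
    terms = recognize_ink(selection_ink)        # 610: digitize and analyze
    results = run_query(terms)                  # 612: foreground or background
    breadcrumb = {"seed": terms, "results": results}   # 614: associate results
    attach_object(breadcrumb, near=selection_ink)      # 616: embed near the ink
    return breadcrumb
```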
FIG. 7 illustrates a method 700 that facilitates executing an in situ search on a portion of data or a received user-specified query. At reference numeral 702, a flick gesture can be received. For instance, the flick gesture can be received from an input device, wherein the input device can be, but is not limited to being, a tablet, a touch screen, a mouse, a touch pad, a trackball, a stylus and touch screen device, and/or any other suitable input device capable of inputting a flick gesture. At reference numeral 704, a determination is made whether a portion of data has been selected in combination with the flick gesture. If a portion of data is selected, the methodology 700 continues at reference numeral 706. It is to be appreciated that a portion of data (e.g., text, characters, images, etc.) can be selected with highlighting, underlining, lassoing, circling, and/or any other suitable technique to identify a portion of data with an input device. At reference numeral 706, an in situ search can be executed on the selected data. The in situ search can be implemented so as not to distract a user or shift attention. In other words, the search can be seamlessly initiated (e.g., in the background, foreground, etc.). At reference numeral 708, a search result can be provided.
- If a portion of data is not selected at reference numeral 704, the methodology 700 continues at reference numeral 710. At reference numeral 710, a search query box can be generated and displayed. The flick gesture can be a trigger to implement a search query box to enhance searching data. At reference numeral 712, a portion of data can be collected with the search query box. For example, a user can input text, characters, words, phrases, keywords, images, etc. At reference numeral 714, an in situ search can be executed on the collected portion of data. It is to be appreciated that the search can be performed in the background, the foreground, and/or any suitable combination thereof. At reference numeral 716, a search result can be provided based upon the search.
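- Condensed to code, the branch at reference numeral 704 looks roughly like this; `show_query_box` and `run_search` are assumed hooks into the surrounding system.

```python
def handle_flick(selection, show_query_box, run_search):
    """Branching logic of the methodology above (FIG. 7), as a sketch.

    `show_query_box` blocks until the user submits text, and
    `run_search` performs the in situ query; both are assumed
    hooks, not interfaces from the specification.
    """
    if selection:                        # data selected with the flick
        return run_search(selection)     # search runs without shifting focus
    query = show_query_box()             # otherwise prompt for a query
    return run_search(query) if query else None
```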
FIG. 8 illustrates a method 800 for initiating a graphic overlay for in situ search based on a flick gesture. At reference numeral 802, a flick gesture can be received. For instance, the flick gesture can be a quick, linear movement associated with a scrolling action and/or command. Moreover, the flick gesture can be a linear movement that requires a user to provide a quick flicking motion with characteristics such as a high speed and a high degree of straightness. At reference numeral 804, the flick gesture can be evaluated. The flick gesture received can be evaluated in order to identify at least one of a speed, a direction, a location, an area, etc.
- Continuing at reference numeral 806, a type of search can be executed based at least in part upon the flick gesture or the direction of the flick gesture. For example, it is to be appreciated that any suitable search can be implemented such as, but not limited to, a local search, a remote search, a file-type-based search (e.g., web site search, email search, document search, audio file search, etc.), an application-based search, etc. For instance, an upward flick gesture can trigger a local search (e.g., local hard drive, desktop, folders, local networks, etc.), whereas a downward flick gesture can trigger a remote search (e.g., web pages, the Internet, remote networks, etc.).
- At reference numeral 808, the flick gesture can be incorporated as a search trigger for a third-party application. It is to be appreciated that the third-party application can be any suitable portion of software, hardware, device, web site, web service, and/or any other suitable entity that can employ a flick gesture as a trigger for an in situ search or query. For example, a user or third-party application can define which flick direction supports which function. Thus, a diagonal flick may be the default offering, but any flick direction can be used to trigger a search. For instance, a third-party email application can include instructions and/or definitions that can enable a flick gesture to trigger a particular search associated with such email application.
- In order to provide additional context for implementing various aspects of the claimed subject matter, FIGS. 9-10 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the various aspects of the subject innovation may be implemented. For example, an in situ search component that can execute an in situ search based upon flick gesture detection, as described in the previous figures, can be implemented in such a suitable computing environment. While the claimed subject matter has been described above in the general context of computer-executable instructions of a computer program that runs on a local computer and/or remote computer, those skilled in the art will recognize that the subject innovation also may be implemented in combination with other program modules. Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks and/or implement particular abstract data types.
-
FIG. 9 is a schematic block diagram of a sample computing environment 900 with which the claimed subject matter can interact. The system 900 includes one or more client(s) 910. The client(s) 910 can be hardware and/or software (e.g., threads, processes, computing devices). The system 900 also includes one or more server(s) 920. The server(s) 920 can be hardware and/or software (e.g., threads, processes, computing devices). The servers 920 can house threads to perform transformations by employing the subject innovation, for example.
- One possible communication between a client 910 and a server 920 can be in the form of a data packet adapted to be transmitted between two or more computer processes. The system 900 includes a communication framework 940 that can be employed to facilitate communications between the client(s) 910 and the server(s) 920. The client(s) 910 are operably connected to one or more client data store(s) 950 that can be employed to store information local to the client(s) 910. Similarly, the server(s) 920 are operably connected to one or more server data store(s) 930 that can be employed to store information local to the servers 920.
- With reference to FIG. 10, an exemplary environment 1000 for implementing various aspects of the claimed subject matter includes a computer 1012. The computer 1012 includes a processing unit 1014, a system memory 1016, and a system bus 1018. The system bus 1018 couples system components including, but not limited to, the system memory 1016 to the processing unit 1014. The processing unit 1014 can be any of various available processors. Dual microprocessors and other multiprocessor architectures also can be employed as the processing unit 1014.
- The system bus 1018 can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), Card Bus, Universal Serial Bus (USB), Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Firewire (IEEE 1394), and Small Computer Systems Interface (SCSI).
- The system memory 1016 includes volatile memory 1020 and nonvolatile memory 1022. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer 1012, such as during start-up, is stored in nonvolatile memory 1022. By way of illustration, and not limitation, nonvolatile memory 1022 can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory 1020 includes random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). -
Computer 1012 also includes removable/non-removable, volatile/non-volatile computer storage media. FIG. 10 illustrates, for example, a disk storage 1024. Disk storage 1024 includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-100 drive, flash memory card, or memory stick. In addition, disk storage 1024 can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), CD recordable drive (CD-R Drive), CD rewritable drive (CD-RW Drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices 1024 to the system bus 1018, a removable or non-removable interface is typically used, such as interface 1026.
- It is to be appreciated that FIG. 10 describes software that acts as an intermediary between users and the basic computer resources described in the suitable operating environment 1000. Such software includes an operating system 1028. Operating system 1028, which can be stored on disk storage 1024, acts to control and allocate resources of the computer system 1012. System applications 1030 take advantage of the management of resources by operating system 1028 through program modules 1032 and program data 1034 stored either in system memory 1016 or on disk storage 1024. It is to be appreciated that the claimed subject matter can be implemented with various operating systems or combinations of operating systems.
- A user enters commands or information into the computer 1012 through input device(s) 1036. Input devices 1036 include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processing unit 1014 through the system bus 1018 via interface port(s) 1038. Interface port(s) 1038 include, for example, a serial port, a parallel port, a game port, and a universal serial bus (USB). Output device(s) 1040 use some of the same types of ports as input device(s) 1036. Thus, for example, a USB port may be used to provide input to computer 1012, and to output information from computer 1012 to an output device 1040. Output adapter 1042 is provided to illustrate that there are some output devices 1040, like monitors, speakers, and printers, among other output devices 1040, which require special adapters. The output adapters 1042 include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device 1040 and the system bus 1018. It should be noted that other devices and/or systems of devices provide both input and output capabilities, such as remote computer(s) 1044. -
Computer 1012 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 1044. The remote computer(s) 1044 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor-based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 1012. For purposes of brevity, only a memory storage device 1046 is illustrated with remote computer(s) 1044. Remote computer(s) 1044 is logically connected to computer 1012 through a network interface 1048 and then physically connected via communication connection 1050. Network interface 1048 encompasses wire and/or wireless communication networks such as local-area networks (LAN) and wide-area networks (WAN). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet, Token Ring and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).
- Communication connection(s) 1050 refers to the hardware/software employed to connect the network interface 1048 to the bus 1018. While communication connection 1050 is shown for illustrative clarity inside computer 1012, it can also be external to computer 1012. The hardware/software necessary for connection to the network interface 1048 includes, for exemplary purposes only, internal and external technologies such as modems (including regular telephone grade modems, cable modems, and DSL modems), ISDN adapters, and Ethernet cards.
- What has been described above includes examples of the subject innovation. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the claimed subject matter, but one of ordinary skill in the art may recognize that many further combinations and permutations of the subject innovation are possible. Accordingly, the claimed subject matter is intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims.
- In particular and in regard to the various functions performed by the above described components, devices, circuits, systems and the like, the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter. In this regard, it will also be recognized that the innovation includes a system as well as a computer-readable medium having computer-executable instructions for performing the acts and/or events of the various methods of the claimed subject matter.
- There are multiple ways of implementing the present innovation, e.g., an appropriate API, tool kit, driver code, operating system, control, standalone or downloadable software object, etc., which enable applications and services to use the techniques of the invention. The claimed subject matter contemplates such use from the standpoint of an API (or other software object), as well as from a software or hardware object that operates according to the techniques of the invention. Thus, various implementations of the innovation described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software.
- The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, and according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components (hierarchical). Additionally, it should be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
- In addition, while a particular feature of the subject innovation may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “including,” “has,” “contains,” variants thereof, and other similar words are used in either the detailed description or the claims, these terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
Claims (20)
1. A computer-implemented system that facilitates in situ searching of data, comprising:
an interface component that receives a flick gesture from an input device; and
an in situ search component that employs an in situ search triggered by the flick gesture, wherein the in situ search is executed on a portion of data related to the input device.
2. The system of claim 1 , the flick gesture is a quick, linear movement associated with a scrolling action and includes at least one of a high speed characteristic or a high degree of straightness.
3. The system of claim 1 , the input device is at least one of a tablet, a touch screen, a mouse, a touch pad, a trackball, or a stylus and input screen.
4. The system of claim 1 , further comprising an evaluation component that monitors the flick gesture to ascertain at least one of a direction of the flick gesture, a speed of the flick gesture, a location of the flick gesture, an area of the flick gesture, or a data selection from the input device.
5. The system of claim 4 , the in situ search component performs a type of search based upon the direction of the flick gesture identified by the evaluation component.
6. The system of claim 5 , the type of search is at least one of a local search, a remote search, a file type based search, a web site search, an email search, a document search, an audio file search, an application-based search, a search within a directory, a search within a storage volume, or a search within an operating system construct.
7. The system of claim 1 , further comprising a conversion component that employs at least one of a digital conversion of selected data or a handwriting conversion of selected data.
8. The system of claim 1 , further comprising a graphic component that generates, upon detection of the flick gesture, at least one of a persistent and embeddable graphical object or a search query box.
9. The system of claim 8 , the persistent and embeddable graphical object is displayed and populated with at least one of a word, a phrase, a search result, a previous search, a portion of historic data, or a preference in relation to the in situ search.
10. The system of claim 8 , the search query box is implemented upon detection of a flick gesture and an empty selection of data to receive at least one of a user-defined query, handwriting data, typed data, or an image.
11. The system of claim 1 , the in situ search component is seeded with a portion of data that falls under a pointer location at a start of a flick gesture, wherein the portion of data can be changed by a user prior to search.
12. The system of claim 1 , the in situ search component performs the search as at least one of a background process or a foreground process.
13. The system of claim 1 , the input device selects a portion of data to perform the in situ search with at least one of a highlight, a circle, a lasso, an underline, a color, a box, or an ellipse.
14. The system of claim 13 , the in situ search component provides at least one of the following: a pre-fetch and display of a search result for the selected portion of data; or a pre-fetch upon detection of the selected portion of data.
15. The system of claim 13 , the portion of data is at least one of a portion of handwriting, a portion of text, a portion of characters, a word, a phrase, a keyword, a sentence, a portion of an image, a graphic, a bitmap, or a portion of an icon.
16. The system of claim 1 , further comprising an application that incorporates the flick gesture as a trigger to execute an in situ search on a portion of data.
17. A computer-implemented method that facilitates querying data in a seamless manner, comprising:
identifying a flick gesture from an input device, wherein the flick gesture is performed while a portion of data is displayed; and
utilizing the flick gesture to initiate an in situ search within the display of the portion of data.
18. The method of claim 17, further comprising:
determining if a portion of data is selected by a user;
executing an in situ search on the portion of selected data; and
providing a search result based on the in situ search for the portion of selected data.
19. The method of claim 17, further comprising:
performing the flick gesture without a portion of data selected;
displaying a search query box;
collecting a query from the search query box;
executing an in situ search on the query; and
providing a search result.
20. A computer-implemented system that facilitates in situ searching of data, comprising:
means for receiving a flick gesture from an input device;
means for employing an in situ search triggered by the flick gesture; and
means for executing the in situ search on a portion of data related to the input device.
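Because the claims are compact, a few non-normative sketches follow; none is part of the disclosure. Claim 2 characterizes a flick as a quick, linear movement with a high speed and a high degree of straightness. The sketch below classifies a pointer stroke on those two measures; the threshold values, the `is_flick` helper, and the `(x, y, t)` sample format are assumptions made for illustration.

```python
import math

# Hypothetical thresholds: claim 2 only says "high speed" and "a high
# degree of straightness", so these numbers are illustrative assumptions.
MIN_SPEED_PX_PER_S = 800.0
MIN_STRAIGHTNESS = 0.95

def is_flick(samples):
    """Classify a pointer stroke as a flick.

    samples: list of (x, y, t) tuples from the input device, with t in
    seconds. Straightness is net displacement over total path length,
    so a perfectly straight stroke scores 1.0.
    """
    if len(samples) < 2:
        return False
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    net = math.hypot(x1 - x0, y1 - y0)
    path = sum(
        math.hypot(b[0] - a[0], b[1] - a[1])
        for a, b in zip(samples, samples[1:])
    )
    duration = max(t1 - t0, 1e-6)    # avoid division by zero
    speed = net / duration           # pixels per second
    straightness = net / path if path else 0.0
    return speed >= MIN_SPEED_PX_PER_S and straightness >= MIN_STRAIGHTNESS
```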
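Claims 4 through 6 have an evaluation component ascertain the flick direction and the search component choose a search type from it. A minimal dispatch sketch follows; the particular direction-to-search-type mapping is invented for illustration, since claim 6 lists possible search types without binding any of them to a direction.

```python
# Hypothetical direction-to-search-type table; an assumption, not part
# of the disclosure.
SEARCH_TYPE_BY_DIRECTION = {
    "up": "web site search",
    "down": "local search",
    "left": "document search",
    "right": "email search",
}

def flick_direction(samples):
    """Reduce a stroke to one of four cardinal directions."""
    (x0, y0, _), (x1, y1, _) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"   # screen y grows downward

def dispatch_search(samples, seed):
    """Choose a search type from the flick direction and run the query."""
    search_type = SEARCH_TYPE_BY_DIRECTION[flick_direction(samples)]
    print(f"executing {search_type} for {seed!r}")
```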
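Claim 11 seeds the search with whatever data falls under the pointer when the flick starts, and lets the user edit that seed before the search runs. A rough sketch of the seeding step, assuming a plain-text surface and a character offset obtained from hit-testing (both simplifications of a real ink or text surface):

```python
import re

def seed_from_pointer(text, offset):
    """Return the word under the pointer at the start of the flick.

    text is the content of the display region and offset the character
    index under the pointer when the flick began; both are illustrative
    stand-ins for hit-testing a real ink or text surface.
    """
    for match in re.finditer(r"\w+", text):
        if match.start() <= offset < match.end():
            return match.group()
    return ""

# The seeded term stays editable until the user commits the search,
# e.g. seed_from_pointer("flick-based in situ search", 12) -> "in"
```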
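Claim 14 allows results to be pre-fetched as soon as a selection is detected, so a subsequent flick can display them immediately. One way that might look, using a background worker; the helper names and the single-worker pool are assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor(max_workers=1)   # background worker (assumption)
_pending = {}

def on_selection(selected_text, run_search):
    """Start fetching results as soon as a selection is detected."""
    _pending[selected_text] = _pool.submit(run_search, selected_text)

def on_flick(selected_text, run_search):
    """Use the pre-fetched result when one exists; otherwise search now."""
    future = _pending.pop(selected_text, None)
    return future.result() if future is not None else run_search(selected_text)
```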
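Claims 17 through 19 describe the overall method: a flick over selected data searches that selection in place, while a flick over an empty selection region raises a search query box first. A compact sketch of that branch, with the UI and search machinery abstracted behind caller-supplied functions (stand-ins, not the disclosed implementation):

```python
def in_situ_search(selection, prompt_for_query, run_search, show_results):
    """Branching described by claims 18 and 19.

    selection is the selected portion of data, or None when the flick
    was performed over an empty selection region. The three callables
    stand in for the UI and search machinery.
    """
    if selection:                    # claim 18: a portion of data is selected
        results = run_search(selection)
    else:                            # claim 19: empty selection region
        query = prompt_for_query()   # display a search query box
        results = run_search(query)
    show_results(results)            # provide the result in situ


# Example: a flick over an empty region prompts for a query first.
in_situ_search(
    selection=None,
    prompt_for_query=lambda: "in situ search",
    run_search=lambda q: [f"result for {q!r}"],
    show_results=print,
)
```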
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/849,469 US20090058820A1 (en) | 2007-09-04 | 2007-09-04 | Flick-based in situ search from ink, text, or an empty selection region |
US14/572,527 US10191940B2 (en) | 2007-09-04 | 2014-12-16 | Gesture-based searching |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/849,469 US20090058820A1 (en) | 2007-09-04 | 2007-09-04 | Flick-based in situ search from ink, text, or an empty selection region |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/572,527 Continuation US10191940B2 (en) | 2007-09-04 | 2014-12-16 | Gesture-based searching |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090058820A1 true US20090058820A1 (en) | 2009-03-05 |
Family
ID=40406696
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/849,469 Abandoned US20090058820A1 (en) | 2007-09-04 | 2007-09-04 | Flick-based in situ search from ink, text, or an empty selection region |
US14/572,527 Active US10191940B2 (en) | 2007-09-04 | 2014-12-16 | Gesture-based searching |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/572,527 Active US10191940B2 (en) | 2007-09-04 | 2014-12-16 | Gesture-based searching |
Country Status (1)
Country | Link |
---|---|
US (2) | US20090058820A1 (en) |
Cited By (102)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080168396A1 (en) * | 2007-01-07 | 2008-07-10 | Michael Matas | Portable Multifunction Device, Method, and Graphical User Interface for Providing Maps and Directions |
US20090132960A1 (en) * | 2007-11-21 | 2009-05-21 | Lg Electronics Inc. | Terminal, method of controlling the same and recording medium for the method |
US20090178007A1 (en) * | 2008-01-06 | 2009-07-09 | Michael Matas | Touch Screen Device, Method, and Graphical User Interface for Displaying and Selecting Application Options |
US20090174680A1 (en) * | 2008-01-06 | 2009-07-09 | Freddy Allen Anzures | Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars |
US20090307631A1 (en) * | 2008-02-01 | 2009-12-10 | Kim Joo Min | User interface method for mobile device and mobile communication system |
US20100107114A1 (en) * | 2008-10-28 | 2010-04-29 | Zachcial Slawomir | In context web page localization |
US20100169766A1 (en) * | 2008-12-31 | 2010-07-01 | Matias Duarte | Computing Device and Method for Selecting Display Regions Responsive to Non-Discrete Directional Input Actions and Intelligent Content Analysis |
US20100262591A1 (en) * | 2009-04-08 | 2010-10-14 | Lee Sang Hyuck | Method for inputting command in mobile terminal and mobile terminal using the same |
WO2011053442A1 (en) * | 2009-10-30 | 2011-05-05 | Symbol Technologies, Inc. | System and method for operating an rfid system with head tracking |
US20110115702A1 (en) * | 2008-07-08 | 2011-05-19 | David Seaberg | Process for Providing and Editing Instructions, Data, Data Structures, and Algorithms in a Computer System |
US20110154268A1 (en) * | 2009-12-18 | 2011-06-23 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
US20110148786A1 (en) * | 2009-12-18 | 2011-06-23 | Synaptics Incorporated | Method and apparatus for changing operating modes |
US20110167058A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Mapping Directions Between Search Results |
US20110163874A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Tracking Movement on a Map |
US20110310039A1 (en) * | 2010-06-16 | 2011-12-22 | Samsung Electronics Co., Ltd. | Method and apparatus for user-adaptive data arrangement/classification in portable terminal |
US20120005583A1 (en) * | 2010-06-30 | 2012-01-05 | Yahoo! Inc. | Method and system for performing a web search |
US20120023447A1 (en) * | 2010-07-23 | 2012-01-26 | Masaaki Hoshino | Information processing device, information processing method, and information processing program |
EP2423798A1 (en) * | 2010-08-31 | 2012-02-29 | Samsung Electronics Co., Ltd. | Method of providing search service by extracting keywords in specified region and display device applying the same |
CN102375588A (en) * | 2010-08-19 | 2012-03-14 | 上海博泰悦臻电子设备制造有限公司 | Method and device for controlling equipment operation through gesture on screen of electronic equipment |
US20120096354A1 (en) * | 2010-10-14 | 2012-04-19 | Park Seungyong | Mobile terminal and control method thereof |
US20120095997A1 (en) * | 2010-10-18 | 2012-04-19 | Microsoft Corporation | Providing contextual hints associated with a user session |
US20120105345A1 (en) * | 2010-09-24 | 2012-05-03 | Qnx Software Systems Limited | Portable Electronic Device and Method of Controlling Same |
US20120154304A1 (en) * | 2010-12-16 | 2012-06-21 | Samsung Electronics Co., Ltd. | Portable terminal with optical touch pad and method for controlling data in the same |
US20120272176A1 (en) * | 2003-11-05 | 2012-10-25 | Google Inc. | Persistent User Interface for Providing Navigational Functionality |
US8302033B2 (en) | 2007-06-22 | 2012-10-30 | Apple Inc. | Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information |
GB2493510A (en) * | 2011-07-28 | 2013-02-13 | Daniel Rajkumar | Methods of controlling a search engine |
CN102968269A (en) * | 2012-10-25 | 2013-03-13 | 东莞宇龙通信科技有限公司 | Terminal and terminal management method |
US20130132361A1 (en) * | 2011-11-22 | 2013-05-23 | Liang-Pu CHEN | Input method for querying by using a region formed by an enclosed track and system using the same |
US8464182B2 (en) | 2009-06-07 | 2013-06-11 | Apple Inc. | Device, method, and graphical user interface for providing maps, directions, and location-based information |
US8478777B2 (en) * | 2011-10-25 | 2013-07-02 | Google Inc. | Gesture-based search |
CN103365578A (en) * | 2012-03-29 | 2013-10-23 | 百度在线网络技术(北京)有限公司 | Mobile terminal unlocking method and mobile terminal |
US8583622B2 (en) | 2012-03-05 | 2013-11-12 | Microsoft Corporation | Application of breadcrumbs in ranking and search experiences |
CN103455590A (en) * | 2013-08-29 | 2013-12-18 | 百度在线网络技术(北京)有限公司 | Method and device for retrieving in touch-screen device |
US20140046922A1 (en) * | 2012-08-08 | 2014-02-13 | Microsoft Corporation | Search user interface using outward physical expressions |
US20140078093A1 (en) * | 2011-05-23 | 2014-03-20 | Sony Corporation | Information processing apparatus, information processing method and computer program |
US20140096080A1 (en) * | 2012-10-01 | 2014-04-03 | Fuji Xerox Co., Ltd. | Information display apparatus, information display method, and computer readable medium |
EP2717177A1 (en) * | 2011-05-31 | 2014-04-09 | Rakuten, Inc. | Information provision system, information provision system control method, information provision device, program, and information recording medium |
US8782513B2 (en) | 2011-01-24 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for navigating through an electronic document |
WO2014137626A1 (en) * | 2013-03-04 | 2014-09-12 | Microsoft Corporation | Digital ink based contextual search |
US8842082B2 (en) | 2011-01-24 | 2014-09-23 | Apple Inc. | Device, method, and graphical user interface for navigating and annotating an electronic document |
US20140331187A1 (en) * | 2013-05-03 | 2014-11-06 | Barnesandnoble.Com Llc | Grouping objects on a computing device |
CN104252492A (en) * | 2013-06-28 | 2014-12-31 | 宏碁股份有限公司 | Method for searching data and electronic device thereof |
US8953570B2 (en) | 2010-11-23 | 2015-02-10 | Symbol Technologies, Inc. | Radio frequency identification system and related operating methods |
CN104462437A (en) * | 2014-12-15 | 2015-03-25 | 北京奇虎科技有限公司 | Recognizing and searching method and recognizing and searching system based on repeated touch operations of interface of terminal |
EP2413230A3 (en) * | 2010-07-30 | 2015-04-01 | Samsung Electronics Co., Ltd. | Method for providing user interface and display apparatus applying the same |
US9009191B2 (en) | 2012-03-23 | 2015-04-14 | Blackberry Limited | Systems and methods for presenting content relevant to text |
US9021402B1 (en) * | 2010-09-24 | 2015-04-28 | Google Inc. | Operation of mobile device interface using gestures |
CN104731798A (en) * | 2013-12-19 | 2015-06-24 | 鸿合科技有限公司 | Text retrieval method and text retrieval device |
US9092132B2 (en) | 2011-01-24 | 2015-07-28 | Apple Inc. | Device, method, and graphical user interface with a dynamic gesture disambiguation threshold |
US9128614B2 (en) | 2010-11-05 | 2015-09-08 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US9141256B2 (en) | 2010-09-24 | 2015-09-22 | 2236008 Ontario Inc. | Portable electronic device and method therefor |
US9141285B2 (en) | 2010-11-05 | 2015-09-22 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US9164670B2 (en) | 2010-09-15 | 2015-10-20 | Microsoft Technology Licensing, Llc | Flexible touch-based scrolling |
US20150347363A1 (en) * | 2014-05-30 | 2015-12-03 | Paul Manganaro | System for Communicating with a Reader |
US20160055201A1 (en) * | 2014-08-22 | 2016-02-25 | Google Inc. | Radar Recognition-Aided Searches |
US20160154555A1 (en) * | 2014-12-02 | 2016-06-02 | Lenovo (Singapore) Pte. Ltd. | Initiating application and performing function based on input |
CN105653711A (en) * | 2015-12-30 | 2016-06-08 | 广东欧珀移动通信有限公司 | Terminal application searching method and device |
US9442654B2 (en) | 2010-01-06 | 2016-09-13 | Apple Inc. | Apparatus and method for conditionally enabling or disabling soft buttons |
US20160334948A1 (en) * | 2015-05-15 | 2016-11-17 | Casio Computer Co., Ltd. | Image display apparatus equipped with a touch panel |
US9507512B1 (en) * | 2012-04-25 | 2016-11-29 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US9563341B2 (en) | 2013-03-16 | 2017-02-07 | Jerry Alan Crandall | Data sharing |
US20170046063A1 (en) * | 2009-03-16 | 2017-02-16 | Apple Inc. | Event Recognition |
US9600080B2 (en) | 2014-10-02 | 2017-03-21 | Google Inc. | Non-line-of-sight radar-based gesture recognition |
US9684444B2 (en) | 2010-09-24 | 2017-06-20 | Blackberry Limited | Portable electronic device and method therefor |
US20170286552A1 (en) * | 2016-03-30 | 2017-10-05 | Microsoft Technology Licensing, Llc | Using Gesture Selection to Obtain Contextually Relevant Information |
US9811164B2 (en) | 2014-08-07 | 2017-11-07 | Google Inc. | Radar-based gesture sensing and data transmission |
US20170357699A1 (en) * | 2016-06-10 | 2017-12-14 | Apple Inc. | System and method of highlighting terms |
US9848780B1 (en) | 2015-04-08 | 2017-12-26 | Google Inc. | Assessing cardiovascular function using an optical sensor |
US20180011940A1 (en) * | 2016-07-06 | 2018-01-11 | Vimio Co. Ltd | App name search method and system |
US9921660B2 (en) | 2014-08-07 | 2018-03-20 | Google Llc | Radar-based gesture recognition |
US9933908B2 (en) | 2014-08-15 | 2018-04-03 | Google Llc | Interactive textiles |
US9971415B2 (en) | 2014-06-03 | 2018-05-15 | Google Llc | Radar-based gesture-recognition through a wearable device |
US20180188921A1 (en) * | 2014-06-05 | 2018-07-05 | OpemPeak LLC | Method and system for enabling the sharing of information between applications on a computing device |
US10016162B1 (en) | 2015-03-23 | 2018-07-10 | Google Llc | In-ear health monitoring |
US10031608B2 (en) * | 2009-05-21 | 2018-07-24 | Microsoft Technology Licensing, Llc | Organizational tools on a multi-touch display device |
US20180253427A1 (en) * | 2011-10-24 | 2018-09-06 | Imagescan, Inc. | Apparatus and Method for Displaying Multiple Display Panels With a Progressive Relationship Using Cognitive Pattern Recognition |
US10080528B2 (en) | 2015-05-19 | 2018-09-25 | Google Llc | Optical central venous pressure measurement |
US10139916B2 (en) | 2015-04-30 | 2018-11-27 | Google Llc | Wide-field radar-based gesture recognition |
US10155274B2 (en) | 2015-05-27 | 2018-12-18 | Google Llc | Attaching electronic components to interactive textiles |
US10175781B2 (en) | 2016-05-16 | 2019-01-08 | Google Llc | Interactive object with multiple electronics modules |
US20190012353A1 (en) * | 2009-03-16 | 2019-01-10 | Apple Inc. | Multifunction device with integrated search and application selection |
US20190026377A1 (en) * | 2015-09-28 | 2019-01-24 | Oath Inc. | Multi-touch gesture search |
US10222469B1 (en) | 2015-10-06 | 2019-03-05 | Google Llc | Radar-based contextual sensing |
US10241581B2 (en) | 2015-04-30 | 2019-03-26 | Google Llc | RF-based micro-motion tracking for gesture tracking and recognition |
US10310620B2 (en) | 2015-04-30 | 2019-06-04 | Google Llc | Type-agnostic RF signal representations |
US10376195B1 (en) | 2015-06-04 | 2019-08-13 | Google Llc | Automated nursing assessment |
US10409385B2 (en) | 2014-08-22 | 2019-09-10 | Google Llc | Occluded gesture recognition |
US10492302B2 (en) | 2016-05-03 | 2019-11-26 | Google Llc | Connecting an electronic component to an interactive textile |
US10572027B2 (en) | 2015-05-27 | 2020-02-25 | Google Llc | Gesture detection and interactions |
US10579150B2 (en) | 2016-12-05 | 2020-03-03 | Google Llc | Concurrent detection of absolute distance and relative movement for sensing action gestures |
US10613741B2 (en) | 2007-01-07 | 2020-04-07 | Apple Inc. | Application programming interface for gesture operations |
US10712918B2 (en) | 2014-02-13 | 2020-07-14 | Samsung Electronics Co., Ltd. | User terminal device and displaying method thereof |
US10732997B2 (en) | 2010-01-26 | 2020-08-04 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
US10747416B2 (en) | 2014-02-13 | 2020-08-18 | Samsung Electronics Co., Ltd. | User terminal device and method for displaying thereof |
US10790046B2 (en) * | 2012-02-24 | 2020-09-29 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for drawing and editing chemical structures on a user interface via user gestures |
US10831763B2 (en) | 2016-06-10 | 2020-11-10 | Apple Inc. | System and method of generating a key list from multiple search domains |
US10866714B2 (en) * | 2014-02-13 | 2020-12-15 | Samsung Electronics Co., Ltd. | User terminal device and method for displaying thereof |
US10936190B2 (en) | 2008-03-04 | 2021-03-02 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
US11079903B2 * | 2016-11-16 | 2021-08-03 | Huizhou Tcl Mobile Communication Co., Ltd | Method and system for quick selection by intelligent terminal, and intelligent terminal |
US11262897B2 (en) | 2015-06-12 | 2022-03-01 | Nureva Inc. | Method and apparatus for managing and organizing objects in a virtual repository |
US11429190B2 (en) | 2013-06-09 | 2022-08-30 | Apple Inc. | Proxy gesture recognizer |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10824329B2 (en) | 2017-09-25 | 2020-11-03 | Motorola Solutions, Inc. | Methods and systems for displaying query status information on a graphical user interface |
TWI786493B (en) * | 2020-12-18 | 2022-12-11 | 開酷科技股份有限公司 | Gesture collection and recognition system with machine learning accelerator |
US11543890B2 (en) | 2021-03-22 | 2023-01-03 | KaiKuTek Inc. | Custom gesture collection and recognition system having machine learning accelerator |
US11503361B1 (en) * | 2021-07-26 | 2022-11-15 | Sony Group Corporation | Using signing for input to search fields |
Citations (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5007085A (en) * | 1988-10-28 | 1991-04-09 | International Business Machines Corporation | Remotely sensed personal stylus |
US5523775A (en) * | 1992-05-26 | 1996-06-04 | Apple Computer, Inc. | Method for selecting objects on a computer display |
US5724985A (en) * | 1995-08-02 | 1998-03-10 | Pacesetter, Inc. | User interface for an implantable medical device using an integrated digitizer display screen |
US5838326A (en) * | 1996-09-26 | 1998-11-17 | Xerox Corporation | System for moving document objects in a 3-D workspace |
US5864848A (en) * | 1997-01-31 | 1999-01-26 | Microsoft Corporation | Goal-driven information interpretation and extraction system |
US5970455A (en) * | 1997-03-20 | 1999-10-19 | Xerox Corporation | System for capturing and retrieving audio data and corresponding hand-written notes |
US6088032A (en) * | 1996-10-04 | 2000-07-11 | Xerox Corporation | Computer controlled display system for displaying a three-dimensional document workspace having a means for prefetching linked documents |
US6286104B1 (en) * | 1999-08-04 | 2001-09-04 | Oracle Corporation | Authentication and authorization in a multi-tier relational database management system |
US6344861B1 (en) * | 1993-05-24 | 2002-02-05 | Sun Microsystems, Inc. | Graphical user interface for displaying and manipulating objects |
US6397213B1 (en) * | 1999-05-12 | 2002-05-28 | Ricoh Company Ltd. | Search and retrieval using document decomposition |
US20020099685A1 (en) * | 2001-01-25 | 2002-07-25 | Hitachi, Ltd. | Document retrieval system; method of document retrieval; and search server |
US6457026B1 (en) * | 1997-12-22 | 2002-09-24 | Ricoh Company, Ltd. | System to facilitate reading a document |
US20020151327A1 (en) * | 2000-12-22 | 2002-10-17 | David Levitt | Program selector and guide system and method |
US20020169950A1 (en) * | 1998-12-23 | 2002-11-14 | Esfahani Cameron J. | Computer operating system using compressed ROM image in RAM |
US6509912B1 (en) * | 1998-01-12 | 2003-01-21 | Xerox Corporation | Domain objects for use in a freeform graphics system |
US20030018546A1 (en) * | 2001-07-20 | 2003-01-23 | International Business Machines Corporation | Network-based supply chain management method |
US20030061219A1 (en) * | 2002-10-11 | 2003-03-27 | Emergency 24, Inc. | Method for providing and exchanging search terms between internet site promoters |
US20030063136A1 (en) * | 2001-10-02 | 2003-04-03 | J'maev Jack Ivan | Method and software for hybrid electronic note taking |
US20030200306A1 (en) * | 2002-04-17 | 2003-10-23 | Chan-Won Park | Apparatus for generating time slot in home network system and method thereof |
US20030214553A1 (en) * | 2002-05-14 | 2003-11-20 | Microsoft Corporation | Ink regions in an overlay control |
US6681045B1 (en) * | 1999-05-25 | 2004-01-20 | Silverbrook Research Pty Ltd | Method and system for note taking |
US20040030741A1 (en) * | 2001-04-02 | 2004-02-12 | Wolton Richard Ernest | Method and apparatus for search, visual navigation, analysis and retrieval of information from networks with remote notification and content delivery |
US20040143569A1 (en) * | 2002-09-03 | 2004-07-22 | William Gross | Apparatus and methods for locating data |
US6778979B2 (en) * | 2001-08-13 | 2004-08-17 | Xerox Corporation | System for automatically generating queries |
US20040189720A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
US20050055628A1 (en) * | 2003-09-10 | 2005-03-10 | Zheng Chen | Annotation management in a pen-based computing system |
US6867786B2 (en) * | 2002-07-29 | 2005-03-15 | Microsoft Corp. | In-situ digital inking for applications |
US6868525B1 (en) * | 2000-02-01 | 2005-03-15 | Alberti Anemometer Llc | Computer graphic display visualization system and method |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
US20050177567A1 (en) * | 2003-03-19 | 2005-08-11 | International Business Machines Corporation | Search for specific files from the run menu |
US20050182760A1 (en) * | 2004-02-14 | 2005-08-18 | Samsung Electronics Co., Ltd. | Apparatus and method for searching for digital ink query |
US20050193014A1 (en) * | 2000-11-21 | 2005-09-01 | John Prince | Fuzzy database retrieval |
US6941321B2 (en) * | 1999-01-26 | 2005-09-06 | Xerox Corporation | System and method for identifying similarities among objects in a collection |
US20050198593A1 (en) * | 1998-11-20 | 2005-09-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US20050229118A1 (en) * | 2004-03-31 | 2005-10-13 | Fuji Xerox Co., Ltd. | Systems and methods for browsing multimedia content on small mobile devices |
US20050246324A1 (en) * | 2004-04-30 | 2005-11-03 | Nokia Inc. | System and associated device, method, and computer program product for performing metadata-based searches |
US20050264541A1 (en) * | 2001-03-26 | 2005-12-01 | Mitsuru Satoh | Information input/output apparatus, information input/output control method, and computer product |
US20060001656A1 (en) * | 2004-07-02 | 2006-01-05 | Laviola Joseph J Jr | Electronic ink system |
US20060010373A1 (en) * | 1996-02-27 | 2006-01-12 | Datamize Llc | Portal information delivery system for personal computers and SOHO computer systems |
US20060018546A1 (en) * | 2004-07-21 | 2006-01-26 | Hewlett-Packard Development Company, L.P. | Gesture recognition |
US20060023945A1 (en) * | 2004-02-15 | 2006-02-02 | King Martin T | Search engines and systems with handheld document data capture devices |
US20060048070A1 (en) * | 2004-09-01 | 2006-03-02 | Kip Systems | Operator interface system for a touch screen device |
US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
US20060081714A1 (en) * | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US20060085767A1 (en) * | 2004-10-20 | 2006-04-20 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
US20060089928A1 (en) * | 2004-10-20 | 2006-04-27 | Oracle International Corporation | Computer-implemented methods and systems for entering and searching for non-Roman-alphabet characters and related search systems |
US7075512B1 (en) * | 2002-02-07 | 2006-07-11 | Palmsource, Inc. | Method and system for navigating a display screen for locating a desired item of information |
US20060155581A1 (en) * | 2005-01-10 | 2006-07-13 | George Eisenberger | Systems with user selectable data attributes for automated electronic search, identification and publication of relevant data from electronic data records at multiple data sources |
US7092935B2 (en) * | 2000-02-25 | 2006-08-15 | Canon Kabushiki Kaisha | Customizable filter interface |
US7091959B1 (en) * | 1999-03-31 | 2006-08-15 | Advanced Digital Systems, Inc. | System, computer program product, computing device, and associated methods for form identification and information manipulation |
US7107261B2 (en) * | 2002-05-22 | 2006-09-12 | International Business Machines Corporation | Search engine providing match and alternative answer |
US20060282790A1 (en) * | 2005-03-22 | 2006-12-14 | Microsoft Corporation | Operating system program launch menu search |
US20070005573A1 (en) * | 2005-06-30 | 2007-01-04 | Microsoft Corporation | Automatic filtering and scoping of search results |
US20070067277A1 (en) * | 2005-09-08 | 2007-03-22 | Samsung Electronics Co., Ltd. | Method for searching for user data in mobile communication terminal |
US20070125860A1 (en) * | 1999-05-25 | 2007-06-07 | Silverbrook Research Pty Ltd | System for enabling access to information |
US20070146347A1 (en) * | 2005-04-22 | 2007-06-28 | Outland Research, Llc | Flick-gesture interface for handheld computing devices |
US20070176898A1 (en) * | 2006-02-01 | 2007-08-02 | Memsic, Inc. | Air-writing and motion sensing input for portable devices |
US20070203906A1 (en) * | 2003-09-22 | 2007-08-30 | Cone Julian M | Enhanced Search Engine |
US20070219986A1 (en) * | 2006-03-20 | 2007-09-20 | Babylon Ltd. | Method and apparatus for extracting terms based on a displayed text |
US20070233692A1 (en) * | 2006-04-03 | 2007-10-04 | Lisa Steven G | System, methods and applications for embedded internet searching and result display |
US20080033931A1 (en) * | 2006-08-01 | 2008-02-07 | Bryn Dole | Cap-sensitive text search for documents |
US7353246B1 (en) * | 1999-07-30 | 2008-04-01 | Miva Direct, Inc. | System and method for enabling information associations |
US20080178126A1 (en) * | 2007-01-24 | 2008-07-24 | Microsoft Corporation | Gesture recognition interactive feedback |
US20080250012A1 (en) * | 2007-04-09 | 2008-10-09 | Microsoft Corporation | In situ search for active note taking |
US20090198674A1 (en) * | 2006-12-29 | 2009-08-06 | Tonya Custis | Information-retrieval systems, methods, and software with concept-based searching and ranking |
US20100321345A1 (en) * | 2006-10-10 | 2010-12-23 | Promethean Limited | Dual pen system |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5953735A (en) * | 1991-03-20 | 1999-09-14 | Forcier; Mitchell D. | Script character processing method and system with bit-mapped document editing |
JPH10207901A (en) | 1997-01-22 | 1998-08-07 | Nippon Telegr & Teleph Corp <Ntt> | Method and system for providing information |
US6747632B2 (en) * | 1997-03-06 | 2004-06-08 | Harmonic Research, Inc. | Wireless control device |
JPH11161682A (en) | 1997-09-29 | 1999-06-18 | Toshiba Corp | Device and method for retrieving information and recording medium |
US20010040551A1 (en) * | 1999-07-29 | 2001-11-15 | Interlink Electronics, Inc. | Hand-held remote computer input peripheral with touch pad used for cursor control and text entry on a separate display |
JP2001092832A (en) | 1999-09-21 | 2001-04-06 | Matsushita Electric Ind Co Ltd | Information recommending method |
JP2001167124A (en) | 1999-12-13 | 2001-06-22 | Sharp Corp | Document classification device and recording medium recording document classifiction program |
JP3838014B2 (en) | 2000-09-27 | 2006-10-25 | 日本電気株式会社 | Preference learning device, preference learning system, preference learning method, and recording medium |
US7080317B2 (en) * | 2001-05-31 | 2006-07-18 | Lebow David G | Text highlighting comparison method |
JP2003173352A (en) | 2001-12-05 | 2003-06-20 | Nippon Telegr & Teleph Corp <Ntt> | Retrieval log analysis method and device, document information retrieval method and device, retrieval log analysis program, document information retrieval program and storage medium |
AU2003297193A1 (en) * | 2002-12-13 | 2004-07-09 | Applied Minds, Inc. | Meta-web |
CN1290036C (en) * | 2002-12-30 | 2006-12-13 | 国际商业机器公司 | Computer system and method for establishing concept knowledge according to machine readable dictionary |
WO2004088534A1 (en) | 2003-02-24 | 2004-10-14 | Jin Il Kim | System for managing hand-wrighted document for tablet pc, method for managing and serching hand-wrighted document using thereof |
US9489853B2 (en) * | 2004-09-27 | 2016-11-08 | Kenneth Nathaniel Sherman | Reading and information enhancement system and method |
US8024335B2 (en) * | 2004-05-03 | 2011-09-20 | Microsoft Corporation | System and method for dynamically generating a selectable search extension |
US7343552B2 (en) * | 2004-02-12 | 2008-03-11 | Fuji Xerox Co., Ltd. | Systems and methods for freeform annotations |
US20060197756A1 (en) * | 2004-05-24 | 2006-09-07 | Keytec, Inc. | Multi-mode optical pointer for interactive display system |
US20060031755A1 (en) * | 2004-06-24 | 2006-02-09 | Avaya Technology Corp. | Sharing inking during multi-modal communication |
JP2008517356A (en) | 2004-09-21 | 2008-05-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Content organization |
US20090132969A1 (en) | 2005-06-16 | 2009-05-21 | Ken Mayer | Method and system for automated initiation of search queries from computer displayed content |
US7725477B2 (en) * | 2005-12-19 | 2010-05-25 | Microsoft Corporation | Power filter for online listing service |
IL173663A0 (en) * | 2006-02-12 | 2006-08-01 | Celltick Technologies Ltd | System and method for displaying personalized content on personal cellular telecommunication devices |
US20070244866A1 (en) * | 2006-04-18 | 2007-10-18 | Mainstream Advertising, Inc. | System and method for responding to a search request |
US8330773B2 (en) * | 2006-11-21 | 2012-12-11 | Microsoft Corporation | Mobile data and handwriting screen capture and forwarding |
US7739304B2 (en) * | 2007-02-08 | 2010-06-15 | Yahoo! Inc. | Context-based community-driven suggestions for media annotation |
- 2007-09-04: US application US11/849,469 filed; published as US20090058820A1 (status: Abandoned)
- 2014-12-16: US application US14/572,527 filed; granted as US10191940B2 (status: Active)
Patent Citations (72)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5007085A (en) * | 1988-10-28 | 1991-04-09 | International Business Machines Corporation | Remotely sensed personal stylus |
US5523775A (en) * | 1992-05-26 | 1996-06-04 | Apple Computer, Inc. | Method for selecting objects on a computer display |
US6344861B1 (en) * | 1993-05-24 | 2002-02-05 | Sun Microsystems, Inc. | Graphical user interface for displaying and manipulating objects |
US5724985A (en) * | 1995-08-02 | 1998-03-10 | Pacesetter, Inc. | User interface for an implantable medical device using an integrated digitizer display screen |
US20060010373A1 (en) * | 1996-02-27 | 2006-01-12 | Datamize Llc | Portal information delivery system for personal computers and SOHO computer systems |
US5838326A (en) * | 1996-09-26 | 1998-11-17 | Xerox Corporation | System for moving document objects in a 3-D workspace |
US6088032A (en) * | 1996-10-04 | 2000-07-11 | Xerox Corporation | Computer controlled display system for displaying a three-dimensional document workspace having a means for prefetching linked documents |
US5864848A (en) * | 1997-01-31 | 1999-01-26 | Microsoft Corporation | Goal-driven information interpretation and extraction system |
US5970455A (en) * | 1997-03-20 | 1999-10-19 | Xerox Corporation | System for capturing and retrieving audio data and corresponding hand-written notes |
US6457026B1 (en) * | 1997-12-22 | 2002-09-24 | Ricoh Company, Ltd. | System to facilitate reading a document |
US6509912B1 (en) * | 1998-01-12 | 2003-01-21 | Xerox Corporation | Domain objects for use in a freeform graphics system |
US20050198593A1 (en) * | 1998-11-20 | 2005-09-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US20020169950A1 (en) * | 1998-12-23 | 2002-11-14 | Esfahani Cameron J. | Computer operating system using compressed ROM image in RAM |
US6941321B2 (en) * | 1999-01-26 | 2005-09-06 | Xerox Corporation | System and method for identifying similarities among objects in a collection |
US7091959B1 (en) * | 1999-03-31 | 2006-08-15 | Advanced Digital Systems, Inc. | System, computer program product, computing device, and associated methods for form identification and information manipulation |
US6397213B1 (en) * | 1999-05-12 | 2002-05-28 | Ricoh Company Ltd. | Search and retrieval using document decomposition |
US20090010542A1 (en) * | 1999-05-25 | 2009-01-08 | Silverbrook Research Pty Ltd | System for interactive note-taking |
US20070125860A1 (en) * | 1999-05-25 | 2007-06-07 | Silverbrook Research Pty Ltd | System for enabling access to information |
US7162088B2 (en) * | 1999-05-25 | 2007-01-09 | Silverbrook Research Pty Ltd | Notetaking method incorporating coded data sensor |
US6681045B1 (en) * | 1999-05-25 | 2004-01-20 | Silverbrook Research Pty Ltd | Method and system for note taking |
US6829387B2 (en) * | 1999-05-25 | 2004-12-07 | Silverbrook Research Pty Ltd | Method and system for note taking using processing sensor |
US7400769B2 (en) * | 1999-05-25 | 2008-07-15 | Silverbrook Research Pty Ltd | Position-code bearing notepad employing activation icons |
US7353246B1 (en) * | 1999-07-30 | 2008-04-01 | Miva Direct, Inc. | System and method for enabling information associations |
US6286104B1 (en) * | 1999-08-04 | 2001-09-04 | Oracle Corporation | Authentication and authorization in a multi-tier relational database management system |
US6868525B1 (en) * | 2000-02-01 | 2005-03-15 | Alberti Anemometer Llc | Computer graphic display visualization system and method |
US7092935B2 (en) * | 2000-02-25 | 2006-08-15 | Canon Kabushiki Kaisha | Customizable filter interface |
US20050193014A1 (en) * | 2000-11-21 | 2005-09-01 | John Prince | Fuzzy database retrieval |
US20020151327A1 (en) * | 2000-12-22 | 2002-10-17 | David Levitt | Program selector and guide system and method |
US20020099685A1 (en) * | 2001-01-25 | 2002-07-25 | Hitachi, Ltd. | Document retrieval system; method of document retrieval; and search server |
US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
US20060125803A1 (en) * | 2001-02-10 | 2006-06-15 | Wayne Westerman | System and method for packing multitouch gestures onto a hand |
US20050264541A1 (en) * | 2001-03-26 | 2005-12-01 | Mitsuru Satoh | Information input/output apparatus, information input/output control method, and computer product |
US20040030741A1 (en) * | 2001-04-02 | 2004-02-12 | Wolton Richard Ernest | Method and apparatus for search, visual navigation, analysis and retrieval of information from networks with remote notification and content delivery |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
US20030018546A1 (en) * | 2001-07-20 | 2003-01-23 | International Business Machines Corporation | Network-based supply chain management method |
US6778979B2 (en) * | 2001-08-13 | 2004-08-17 | Xerox Corporation | System for automatically generating queries |
US20030063136A1 (en) * | 2001-10-02 | 2003-04-03 | J'maev Jack Ivan | Method and software for hybrid electronic note taking |
US7075512B1 (en) * | 2002-02-07 | 2006-07-11 | Palmsource, Inc. | Method and system for navigating a display screen for locating a desired item of information |
US20030200306A1 (en) * | 2002-04-17 | 2003-10-23 | Chan-Won Park | Apparatus for generating time slot in home network system and method thereof |
US20030214553A1 (en) * | 2002-05-14 | 2003-11-20 | Microsoft Corporation | Ink regions in an overlay control |
US7107261B2 (en) * | 2002-05-22 | 2006-09-12 | International Business Machines Corporation | Search engine providing match and alternative answer |
US6867786B2 (en) * | 2002-07-29 | 2005-03-15 | Microsoft Corp. | In-situ digital inking for applications |
US20040143569A1 (en) * | 2002-09-03 | 2004-07-22 | William Gross | Apparatus and methods for locating data |
US20030061219A1 (en) * | 2002-10-11 | 2003-03-27 | Emergency 24, Inc. | Method for providing and exchanging search terms between internet site promoters |
US20050177567A1 (en) * | 2003-03-19 | 2005-08-11 | International Business Machines Corporation | Search for specific files from the run menu |
US20040189720A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
US20050055628A1 (en) * | 2003-09-10 | 2005-03-10 | Zheng Chen | Annotation management in a pen-based computing system |
US20070203906A1 (en) * | 2003-09-22 | 2007-08-30 | Cone Julian M | Enhanced Search Engine |
US20050182760A1 (en) * | 2004-02-14 | 2005-08-18 | Samsung Electronics Co., Ltd. | Apparatus and method for searching for digital ink query |
US20070011140A1 (en) * | 2004-02-15 | 2007-01-11 | King Martin T | Processing techniques for visual capture data from a rendered document |
US20060023945A1 (en) * | 2004-02-15 | 2006-02-02 | King Martin T | Search engines and systems with handheld document data capture devices |
US20050229118A1 (en) * | 2004-03-31 | 2005-10-13 | Fuji Xerox Co., Ltd. | Systems and methods for browsing multimedia content on small mobile devices |
US20050246324A1 (en) * | 2004-04-30 | 2005-11-03 | Nokia Inc. | System and associated device, method, and computer program product for performing metadata-based searches |
US20060001656A1 (en) * | 2004-07-02 | 2006-01-05 | Laviola Joseph J Jr | Electronic ink system |
US20060018546A1 (en) * | 2004-07-21 | 2006-01-26 | Hewlett-Packard Development Company, L.P. | Gesture recognition |
US20060081714A1 (en) * | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US20060048070A1 (en) * | 2004-09-01 | 2006-03-02 | Kip Systems | Operator interface system for a touch screen device |
US20060085767A1 (en) * | 2004-10-20 | 2006-04-20 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
US20060089928A1 (en) * | 2004-10-20 | 2006-04-27 | Oracle International Corporation | Computer-implemented methods and systems for entering and searching for non-Roman-alphabet characters and related search systems |
US20060155581A1 (en) * | 2005-01-10 | 2006-07-13 | George Eisenberger | Systems with user selectable data attributes for automated electronic search, identification and publication of relevant data from electronic data records at multiple data sources |
US20060282790A1 (en) * | 2005-03-22 | 2006-12-14 | Microsoft Corporation | Operating system program launch menu search |
US20070146347A1 (en) * | 2005-04-22 | 2007-06-28 | Outland Research, Llc | Flick-gesture interface for handheld computing devices |
US20070005573A1 (en) * | 2005-06-30 | 2007-01-04 | Microsoft Corporation | Automatic filtering and scoping of search results |
US20070067277A1 (en) * | 2005-09-08 | 2007-03-22 | Samsung Electronics Co., Ltd. | Method for searching for user data in mobile communication terminal |
US20070176898A1 (en) * | 2006-02-01 | 2007-08-02 | Memsic, Inc. | Air-writing and motion sensing input for portable devices |
US20070219986A1 (en) * | 2006-03-20 | 2007-09-20 | Babylon Ltd. | Method and apparatus for extracting terms based on a displayed text |
US20070233692A1 (en) * | 2006-04-03 | 2007-10-04 | Lisa Steven G | System, methods and applications for embedded internet searching and result display |
US20080033931A1 (en) * | 2006-08-01 | 2008-02-07 | Bryn Dole | Cap-sensitive text search for documents |
US20100321345A1 (en) * | 2006-10-10 | 2010-12-23 | Promethean Limited | Dual pen system |
US20090198674A1 (en) * | 2006-12-29 | 2009-08-06 | Tonya Custis | Information-retrieval systems, methods, and software with concept-based searching and ranking |
US20080178126A1 (en) * | 2007-01-24 | 2008-07-24 | Microsoft Corporation | Gesture recognition interactive feedback |
US20080250012A1 (en) * | 2007-04-09 | 2008-10-09 | Microsoft Corporation | In situ search for active note taking |
Cited By (197)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120272176A1 (en) * | 2003-11-05 | 2012-10-25 | Google Inc. | Persistent User Interface for Providing Navigational Functionality |
US10613741B2 (en) | 2007-01-07 | 2020-04-07 | Apple Inc. | Application programming interface for gesture operations |
US11449217B2 (en) | 2007-01-07 | 2022-09-20 | Apple Inc. | Application programming interfaces for gesture operations |
US8607167B2 (en) * | 2007-01-07 | 2013-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for providing maps and directions |
US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
US20080168396A1 (en) * | 2007-01-07 | 2008-07-10 | Michael Matas | Portable Multifunction Device, Method, and Graphical User Interface for Providing Maps and Directions |
US10686930B2 (en) | 2007-06-22 | 2020-06-16 | Apple Inc. | Touch screen device, method, and graphical user interface for providing maps, directions, and location based information |
US11849063B2 (en) | 2007-06-22 | 2023-12-19 | Apple Inc. | Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information |
US8302033B2 (en) | 2007-06-22 | 2012-10-30 | Apple Inc. | Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information |
US20090132960A1 (en) * | 2007-11-21 | 2009-05-21 | Lg Electronics Inc. | Terminal, method of controlling the same and recording medium for the method |
US10521084B2 (en) | 2008-01-06 | 2019-12-31 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US9330381B2 (en) | 2008-01-06 | 2016-05-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US10503366B2 (en) | 2008-01-06 | 2019-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US20090174680A1 (en) * | 2008-01-06 | 2009-07-09 | Freddy Allen Anzures | Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars |
US11126326B2 (en) | 2008-01-06 | 2021-09-21 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US20090178007A1 (en) * | 2008-01-06 | 2009-07-09 | Michael Matas | Touch Screen Device, Method, and Graphical User Interface for Displaying and Selecting Application Options |
US8327272B2 (en) | 2008-01-06 | 2012-12-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US9792001B2 (en) | 2008-01-06 | 2017-10-17 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US8171432B2 (en) | 2008-01-06 | 2012-05-01 | Apple Inc. | Touch screen device, method, and graphical user interface for displaying and selecting application options |
US8271907B2 (en) * | 2008-02-01 | 2012-09-18 | Lg Electronics Inc. | User interface method for mobile device and mobile communication system |
US20090307631A1 (en) * | 2008-02-01 | 2009-12-10 | Kim Joo Min | User interface method for mobile device and mobile communication system |
US10936190B2 (en) | 2008-03-04 | 2021-03-02 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US11740725B2 (en) | 2008-03-04 | 2023-08-29 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US20110115702A1 (en) * | 2008-07-08 | 2011-05-19 | David Seaberg | Process for Providing and Editing Instructions, Data, Data Structures, and Algorithms in a Computer System |
US20100107114A1 (en) * | 2008-10-28 | 2010-04-29 | Zachcial Slawomir | In context web page localization |
US8291348B2 (en) * | 2008-12-31 | 2012-10-16 | Hewlett-Packard Development Company, L.P. | Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis |
US20100169766A1 (en) * | 2008-12-31 | 2010-07-01 | Matias Duarte | Computing Device and Method for Selecting Display Regions Responsive to Non-Discrete Directional Input Actions and Intelligent Content Analysis |
US11755196B2 (en) | 2009-03-16 | 2023-09-12 | Apple Inc. | Event recognition |
US11720584B2 (en) * | 2009-03-16 | 2023-08-08 | Apple Inc. | Multifunction device with integrated search and application selection |
US20190012353A1 (en) * | 2009-03-16 | 2019-01-10 | Apple Inc. | Multifunction device with integrated search and application selection |
US20170046063A1 (en) * | 2009-03-16 | 2017-02-16 | Apple Inc. | Event Recognition |
US10719225B2 (en) * | 2009-03-16 | 2020-07-21 | Apple Inc. | Event recognition |
US11163440B2 (en) | 2009-03-16 | 2021-11-02 | Apple Inc. | Event recognition |
EP2239653A3 (en) * | 2009-04-08 | 2013-05-29 | Lg Electronics Inc. | Method for inputting command in mobile terminal and mobile terminal using the same |
US9182905B2 (en) | 2009-04-08 | 2015-11-10 | Lg Electronics Inc. | Method for inputting command in mobile terminal using drawing pattern and mobile terminal using the same |
US20100262591A1 (en) * | 2009-04-08 | 2010-10-14 | Lee Sang Hyuck | Method for inputting command in mobile terminal and mobile terminal using the same |
US10031608B2 (en) * | 2009-05-21 | 2018-07-24 | Microsoft Technology Licensing, Llc | Organizational tools on a multi-touch display device |
US8464182B2 (en) | 2009-06-07 | 2013-06-11 | Apple Inc. | Device, method, and graphical user interface for providing maps, directions, and location-based information |
US8890657B2 (en) | 2009-10-30 | 2014-11-18 | Symbol Technologies, Inc. | System and method for operating an RFID system with head tracking |
WO2011053442A1 (en) * | 2009-10-30 | 2011-05-05 | Symbol Technologies, Inc. | System and method for operating an rfid system with head tracking |
US20110102149A1 (en) * | 2009-10-30 | 2011-05-05 | Symbol Technologies, Inc. | System and method for operating an rfid system with head tracking |
US20110148786A1 (en) * | 2009-12-18 | 2011-06-23 | Synaptics Incorporated | Method and apparatus for changing operating modes |
US9465532B2 (en) | 2009-12-18 | 2016-10-11 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
US20110154268A1 (en) * | 2009-12-18 | 2011-06-23 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
US10169431B2 (en) | 2010-01-06 | 2019-01-01 | Apple Inc. | Device, method, and graphical user interface for mapping directions between search results |
US9442654B2 (en) | 2010-01-06 | 2016-09-13 | Apple Inc. | Apparatus and method for conditionally enabling or disabling soft buttons |
US8456297B2 (en) | 2010-01-06 | 2013-06-04 | Apple Inc. | Device, method, and graphical user interface for tracking movement on a map |
US20110167058A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Mapping Directions Between Search Results |
US20110163874A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Tracking Movement on a Map |
US8862576B2 (en) | 2010-01-06 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for mapping directions between search results |
US10732997B2 (en) | 2010-01-26 | 2020-08-04 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
US20110310039A1 (en) * | 2010-06-16 | 2011-12-22 | Samsung Electronics Co., Ltd. | Method and apparatus for user-adaptive data arrangement/classification in portable terminal |
US20120005583A1 (en) * | 2010-06-30 | 2012-01-05 | Yahoo! Inc. | Method and system for performing a web search |
US9619562B2 (en) * | 2010-06-30 | 2017-04-11 | Excalibur Ip, Llc | Method and system for performing a web search |
US20120023447A1 (en) * | 2010-07-23 | 2012-01-26 | Masaaki Hoshino | Information processing device, information processing method, and information processing program |
EP2413230A3 (en) * | 2010-07-30 | 2015-04-01 | Samsung Electronics Co., Ltd. | Method for providing user interface and display apparatus applying the same |
CN102375588A (en) * | 2010-08-19 | 2012-03-14 | 上海博泰悦臻电子设备制造有限公司 | Method and device for controlling equipment operation through gesture on screen of electronic equipment |
CN105159574A (en) * | 2010-08-19 | 2015-12-16 | 上海博泰悦臻电子设备制造有限公司 | Method and apparatus for controlling device operation through gesture on screen of electronic device |
EP2423798A1 (en) * | 2010-08-31 | 2012-02-29 | Samsung Electronics Co., Ltd. | Method of providing search service by extracting keywords in specified region and display device applying the same |
US9164670B2 (en) | 2010-09-15 | 2015-10-20 | Microsoft Technology Licensing, Llc | Flexible touch-based scrolling |
US9898180B2 (en) | 2010-09-15 | 2018-02-20 | Microsoft Technology Licensing, Llc | Flexible touch-based scrolling |
US9383918B2 (en) | 2010-09-24 | 2016-07-05 | Blackberry Limited | Portable electronic device and method of controlling same |
US9021402B1 (en) * | 2010-09-24 | 2015-04-28 | Google Inc. | Operation of mobile device interface using gestures |
US8976129B2 (en) * | 2010-09-24 | 2015-03-10 | Blackberry Limited | Portable electronic device and method of controlling same |
US9684444B2 (en) | 2010-09-24 | 2017-06-20 | Blackberry Limited | Portable electronic device and method therefor |
US9141256B2 (en) | 2010-09-24 | 2015-09-22 | 2236008 Ontario Inc. | Portable electronic device and method therefor |
US20120105345A1 (en) * | 2010-09-24 | 2012-05-03 | Qnx Software Systems Limited | Portable Electronic Device and Method of Controlling Same |
US20120096354A1 (en) * | 2010-10-14 | 2012-04-19 | Park Seungyong | Mobile terminal and control method thereof |
CN102567441A (en) * | 2010-10-18 | 2012-07-11 | 微软公司 | Providing contextual hints associated with a user session |
US20120095997A1 (en) * | 2010-10-18 | 2012-04-19 | Microsoft Corporation | Providing contextual hints associated with a user session |
US9146673B2 (en) | 2010-11-05 | 2015-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US9141285B2 (en) | 2010-11-05 | 2015-09-22 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US9128614B2 (en) | 2010-11-05 | 2015-09-08 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US8953570B2 (en) | 2010-11-23 | 2015-02-10 | Symbol Technologies, Inc. | Radio frequency identification system and related operating methods |
US20120154304A1 (en) * | 2010-12-16 | 2012-06-21 | Samsung Electronics Co., Ltd. | Portable terminal with optical touch pad and method for controlling data in the same |
US9134768B2 (en) * | 2010-12-16 | 2015-09-15 | Samsung Electronics Co., Ltd. | Portable terminal with optical touch pad and method for controlling data in the same |
US8842082B2 (en) | 2011-01-24 | 2014-09-23 | Apple Inc. | Device, method, and graphical user interface for navigating and annotating an electronic document |
US8782513B2 (en) | 2011-01-24 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for navigating through an electronic document |
US9436381B2 (en) | 2011-01-24 | 2016-09-06 | Apple Inc. | Device, method, and graphical user interface for navigating and annotating an electronic document |
US9442516B2 (en) | 2011-01-24 | 2016-09-13 | Apple Inc. | Device, method, and graphical user interface for navigating through an electronic document |
US10042549B2 (en) | 2011-01-24 | 2018-08-07 | Apple Inc. | Device, method, and graphical user interface with a dynamic gesture disambiguation threshold |
US10365819B2 (en) | 2011-01-24 | 2019-07-30 | Apple Inc. | Device, method, and graphical user interface for displaying a character input user interface |
US9552015B2 (en) | 2011-01-24 | 2017-01-24 | Apple Inc. | Device, method, and graphical user interface for navigating through an electronic document |
US9092132B2 (en) | 2011-01-24 | 2015-07-28 | Apple Inc. | Device, method, and graphical user interface with a dynamic gesture disambiguation threshold |
US9250798B2 (en) | 2011-01-24 | 2016-02-02 | Apple Inc. | Device, method, and graphical user interface with a dynamic gesture disambiguation threshold |
US9671825B2 (en) | 2011-01-24 | 2017-06-06 | Apple Inc. | Device, method, and graphical user interface for navigating through an electronic document |
US20140078093A1 (en) * | 2011-05-23 | 2014-03-20 | Sony Corporation | Information processing apparatus, information processing method and computer program |
EP2717177A1 (en) * | 2011-05-31 | 2014-04-09 | Rakuten, Inc. | Information provision system, information provision system control method, information provision device, program, and information recording medium |
EP2717177B1 (en) * | 2011-05-31 | 2018-03-14 | Rakuten, Inc. | Information provision system, information provision system control method, information provision device, program, and information recording medium |
GB2493510A (en) * | 2011-07-28 | 2013-02-13 | Daniel Rajkumar | Methods of controlling a search engine |
US20180253427A1 (en) * | 2011-10-24 | 2018-09-06 | Imagescan, Inc. | Apparatus and Method for Displaying Multiple Display Panels With a Progressive Relationship Using Cognitive Pattern Recognition |
US11010432B2 (en) * | 2011-10-24 | 2021-05-18 | Imagescan, Inc. | Apparatus and method for displaying multiple display panels with a progressive relationship using cognitive pattern recognition |
US11669575B2 (en) * | 2011-10-24 | 2023-06-06 | Imagescan, Inc. | Apparatus and method for displaying multiple display panels with a progressive relationship using cognitive pattern recognition |
US8478777B2 (en) * | 2011-10-25 | 2013-07-02 | Google Inc. | Gesture-based search |
US20130132361A1 (en) * | 2011-11-22 | 2013-05-23 | Liang-Pu CHEN | Input method for querying by using a region formed by an enclosed track and system using the same |
CN103135884A (en) * | 2011-11-22 | 2013-06-05 | 财团法人资讯工业策进会 | Input method, system and device thereof for querying by using a region formed by an enclosed track |
US10790046B2 (en) * | 2012-02-24 | 2020-09-29 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for drawing and editing chemical structures on a user interface via user gestures |
US8583622B2 (en) | 2012-03-05 | 2013-11-12 | Microsoft Corporation | Application of breadcrumbs in ranking and search experiences |
US9009191B2 (en) | 2012-03-23 | 2015-04-14 | Blackberry Limited | Systems and methods for presenting content relevant to text |
CN103365578A (en) * | 2012-03-29 | 2013-10-23 | 百度在线网络技术(北京)有限公司 | Mobile terminal unlocking method and mobile terminal |
US9507512B1 (en) * | 2012-04-25 | 2016-11-29 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US10871893B2 (en) | 2012-04-25 | 2020-12-22 | Amazon Technologies, Inc. | Using gestures to deliver content to predefined destinations |
US20140046922A1 (en) * | 2012-08-08 | 2014-02-13 | Microsoft Corporation | Search user interface using outward physical expressions |
US9483163B2 (en) * | 2012-10-01 | 2016-11-01 | Fuji Xerox Co., Ltd. | Information display apparatus, information display method, and computer readable medium |
US20140096080A1 (en) * | 2012-10-01 | 2014-04-03 | Fuji Xerox Co., Ltd. | Information display apparatus, information display method, and computer readable medium |
CN102968269A (en) * | 2012-10-25 | 2013-03-13 | 东莞宇龙通信科技有限公司 | Terminal and terminal management method |
WO2014137626A1 (en) * | 2013-03-04 | 2014-09-12 | Microsoft Corporation | Digital ink based contextual search |
US8943092B2 (en) | 2013-03-04 | 2015-01-27 | Microsoft Corporation | Digital ink based contextual search |
US9645720B2 (en) | 2013-03-16 | 2017-05-09 | Jerry Alan Crandall | Data sharing |
US9563341B2 (en) | 2013-03-16 | 2017-02-07 | Jerry Alan Crandall | Data sharing |
US20140331187A1 (en) * | 2013-05-03 | 2014-11-06 | Barnesandnoble.Com Llc | Grouping objects on a computing device |
US11429190B2 (en) | 2013-06-09 | 2022-08-30 | Apple Inc. | Proxy gesture recognizer |
CN104252492A (en) * | 2013-06-28 | 2014-12-31 | 宏碁股份有限公司 | Method for searching data and electronic device thereof |
CN103455590A (en) * | 2013-08-29 | 2013-12-18 | 百度在线网络技术(北京)有限公司 | Method and device for retrieval in a touch-screen device |
US10685417B2 (en) | 2013-08-29 | 2020-06-16 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and device for searching in a touch-screen apparatus based on gesture inputs |
CN104731798A (en) * | 2013-12-19 | 2015-06-24 | 鸿合科技有限公司 | Text retrieval method and text retrieval device |
US10747416B2 (en) | 2014-02-13 | 2020-08-18 | Samsung Electronics Co., Ltd. | User terminal device and method for displaying thereof |
US10866714B2 (en) * | 2014-02-13 | 2020-12-15 | Samsung Electronics Co., Ltd. | User terminal device and method for displaying thereof |
US10712918B2 (en) | 2014-02-13 | 2020-07-14 | Samsung Electronics Co., Ltd. | User terminal device and displaying method thereof |
US20150347363A1 (en) * | 2014-05-30 | 2015-12-03 | Paul Manganaro | System for Communicating with a Reader |
US9971415B2 (en) | 2014-06-03 | 2018-05-15 | Google Llc | Radar-based gesture-recognition through a wearable device |
US10948996B2 (en) | 2014-06-03 | 2021-03-16 | Google Llc | Radar-based gesture-recognition at a surface of an object |
US10509478B2 (en) | 2014-06-03 | 2019-12-17 | Google Llc | Radar-based gesture-recognition from a surface radar field on which an interaction is sensed |
US10635293B2 (en) * | 2014-06-05 | 2020-04-28 | Openpeak Llc | Method and system for enabling the sharing of information between applications on a computing device |
US20180188921A1 (en) * | 2014-06-05 | 2018-07-05 | OpenPeak LLC | Method and system for enabling the sharing of information between applications on a computing device |
US9921660B2 (en) | 2014-08-07 | 2018-03-20 | Google Llc | Radar-based gesture recognition |
US10642367B2 (en) | 2014-08-07 | 2020-05-05 | Google Llc | Radar-based gesture sensing and data transmission |
US9811164B2 (en) | 2014-08-07 | 2017-11-07 | Google Inc. | Radar-based gesture sensing and data transmission |
US9933908B2 (en) | 2014-08-15 | 2018-04-03 | Google Llc | Interactive textiles |
US11816101B2 (en) | 2014-08-22 | 2023-11-14 | Google Llc | Radar recognition-aided search |
US10409385B2 (en) | 2014-08-22 | 2019-09-10 | Google Llc | Occluded gesture recognition |
US11221682B2 (en) | 2014-08-22 | 2022-01-11 | Google Llc | Occluded gesture recognition |
US10936081B2 (en) | 2014-08-22 | 2021-03-02 | Google Llc | Occluded gesture recognition |
US20160055201A1 (en) * | 2014-08-22 | 2016-02-25 | Google Inc. | Radar Recognition-Aided Searches |
US11169988B2 (en) * | 2014-08-22 | 2021-11-09 | Google Llc | Radar recognition-aided search |
US9600080B2 (en) | 2014-10-02 | 2017-03-21 | Google Inc. | Non-line-of-sight radar-based gesture recognition |
US11163371B2 (en) | 2014-10-02 | 2021-11-02 | Google Llc | Non-line-of-sight radar-based gesture recognition |
US10664059B2 (en) | 2014-10-02 | 2020-05-26 | Google Llc | Non-line-of-sight radar-based gesture recognition |
US20160154555A1 (en) * | 2014-12-02 | 2016-06-02 | Lenovo (Singapore) Pte. Ltd. | Initiating application and performing function based on input |
CN104462437A (en) * | 2014-12-15 | 2015-03-25 | 北京奇虎科技有限公司 | Recognition and search method and system based on repeated touch operations on a terminal interface |
US11219412B2 (en) | 2015-03-23 | 2022-01-11 | Google Llc | In-ear health monitoring |
US10016162B1 (en) | 2015-03-23 | 2018-07-10 | Google Llc | In-ear health monitoring |
US9848780B1 (en) | 2015-04-08 | 2017-12-26 | Google Inc. | Assessing cardiovascular function using an optical sensor |
US11709552B2 (en) | 2015-04-30 | 2023-07-25 | Google Llc | RF-based micro-motion tracking for gesture tracking and recognition |
US10310620B2 (en) | 2015-04-30 | 2019-06-04 | Google Llc | Type-agnostic RF signal representations |
US10496182B2 (en) | 2015-04-30 | 2019-12-03 | Google Llc | Type-agnostic RF signal representations |
US10817070B2 (en) | 2015-04-30 | 2020-10-27 | Google Llc | RF-based micro-motion tracking for gesture tracking and recognition |
US10139916B2 (en) | 2015-04-30 | 2018-11-27 | Google Llc | Wide-field radar-based gesture recognition |
US10241581B2 (en) | 2015-04-30 | 2019-03-26 | Google Llc | RF-based micro-motion tracking for gesture tracking and recognition |
US10664061B2 (en) | 2015-04-30 | 2020-05-26 | Google Llc | Wide-field radar-based gesture recognition |
US20160334948A1 (en) * | 2015-05-15 | 2016-11-17 | Casio Computer Co., Ltd. | Image display apparatus equipped with a touch panel |
US10080528B2 (en) | 2015-05-19 | 2018-09-25 | Google Llc | Optical central venous pressure measurement |
US10155274B2 (en) | 2015-05-27 | 2018-12-18 | Google Llc | Attaching electronic components to interactive textiles |
US10936085B2 (en) | 2015-05-27 | 2021-03-02 | Google Llc | Gesture detection and interactions |
US10572027B2 (en) | 2015-05-27 | 2020-02-25 | Google Llc | Gesture detection and interactions |
US10376195B1 (en) | 2015-06-04 | 2019-08-13 | Google Llc | Automated nursing assessment |
US11262897B2 (en) | 2015-06-12 | 2022-03-01 | Nureva Inc. | Method and apparatus for managing and organizing objects in a virtual repository |
US11403352B2 (en) * | 2015-09-28 | 2022-08-02 | Yahoo Assets Llc | Multi-touch gesture search |
US20190026377A1 (en) * | 2015-09-28 | 2019-01-24 | Oath Inc. | Multi-touch gesture search |
US11080556B1 (en) | 2015-10-06 | 2021-08-03 | Google Llc | User-customizable machine-learning in radar-based gesture detection |
US10459080B1 (en) | 2015-10-06 | 2019-10-29 | Google Llc | Radar-based object detection for vehicles |
US10222469B1 (en) | 2015-10-06 | 2019-03-05 | Google Llc | Radar-based contextual sensing |
US10823841B1 (en) | 2015-10-06 | 2020-11-03 | Google Llc | Radar imaging on a mobile computing device |
US10817065B1 (en) | 2015-10-06 | 2020-10-27 | Google Llc | Gesture recognition using multiple antenna |
US10768712B2 (en) | 2015-10-06 | 2020-09-08 | Google Llc | Gesture component with gesture library |
US11698438B2 (en) | 2015-10-06 | 2023-07-11 | Google Llc | Gesture recognition using multiple antenna |
US11698439B2 (en) | 2015-10-06 | 2023-07-11 | Google Llc | Gesture recognition using multiple antenna |
US11693092B2 (en) | 2015-10-06 | 2023-07-04 | Google Llc | Gesture recognition using multiple antenna |
US10705185B1 (en) | 2015-10-06 | 2020-07-07 | Google Llc | Application-based signal processing parameters in radar-based detection |
US11132065B2 (en) | 2015-10-06 | 2021-09-28 | Google Llc | Radar-enabled sensor fusion |
US10300370B1 (en) | 2015-10-06 | 2019-05-28 | Google Llc | Advanced gaming and virtual reality control using radar |
US11656336B2 (en) | 2015-10-06 | 2023-05-23 | Google Llc | Advanced gaming and virtual reality control using radar |
US11592909B2 (en) | 2015-10-06 | 2023-02-28 | Google Llc | Fine-motion virtual-reality or augmented-reality control using radar |
US10540001B1 (en) | 2015-10-06 | 2020-01-21 | Google Llc | Fine-motion virtual-reality or augmented-reality control using radar |
US11175743B2 (en) | 2015-10-06 | 2021-11-16 | Google Llc | Gesture recognition using multiple antenna |
US10503883B1 (en) | 2015-10-06 | 2019-12-10 | Google Llc | Radar-based authentication |
US11481040B2 (en) | 2015-10-06 | 2022-10-25 | Google Llc | User-customizable machine-learning in radar-based gesture detection |
US11256335B2 (en) | 2015-10-06 | 2022-02-22 | Google Llc | Fine-motion virtual-reality or augmented-reality control using radar |
US10908696B2 (en) | 2015-10-06 | 2021-02-02 | Google Llc | Advanced gaming and virtual reality control using radar |
US11385721B2 (en) | 2015-10-06 | 2022-07-12 | Google Llc | Application-based signal processing parameters in radar-based detection |
US10401490B2 (en) | 2015-10-06 | 2019-09-03 | Google Llc | Radar-enabled sensor fusion |
US10379621B2 (en) | 2015-10-06 | 2019-08-13 | Google Llc | Gesture component with gesture library |
US10310621B1 (en) | 2015-10-06 | 2019-06-04 | Google Llc | Radar gesture sensing using existing data protocols |
CN105653711A (en) * | 2015-12-30 | 2016-06-08 | 广东欧珀移动通信有限公司 | Terminal application searching method and device |
US20170286552A1 (en) * | 2016-03-30 | 2017-10-05 | Microsoft Technology Licensing, Llc | Using Gesture Selection to Obtain Contextually Relevant Information |
US10628505B2 (en) * | 2016-03-30 | 2020-04-21 | Microsoft Technology Licensing, Llc | Using gesture selection to obtain contextually relevant information |
CN109074374A (en) * | 2016-03-30 | 2018-12-21 | 微软技术许可有限责任公司 | Using gesture selection to obtain contextually relevant information |
US11140787B2 (en) | 2016-05-03 | 2021-10-05 | Google Llc | Connecting an electronic component to an interactive textile |
US10492302B2 (en) | 2016-05-03 | 2019-11-26 | Google Llc | Connecting an electronic component to an interactive textile |
US10175781B2 (en) | 2016-05-16 | 2019-01-08 | Google Llc | Interactive object with multiple electronics modules |
US10769182B2 (en) * | 2016-06-10 | 2020-09-08 | Apple Inc. | System and method of highlighting terms |
US10831763B2 (en) | 2016-06-10 | 2020-11-10 | Apple Inc. | System and method of generating a key list from multiple search domains |
US20170357699A1 (en) * | 2016-06-10 | 2017-12-14 | Apple Inc. | System and method of highlighting terms |
US20180011940A1 (en) * | 2016-07-06 | 2018-01-11 | Vimio Co. Ltd | App name search method and system |
US11829428B2 (en) * | 2016-07-06 | 2023-11-28 | Vimio Co. Ltd | App name search method and system |
US11079903B2 (en) * | 2016-11-16 | 2021-08-03 | Huizhou TCL Mobile Communication Co., Ltd. | Method and system for quick selection by intelligent terminal, and intelligent terminal |
US10579150B2 (en) | 2016-12-05 | 2020-03-03 | Google Llc | Concurrent detection of absolute distance and relative movement for sensing action gestures |
Also Published As
Publication number | Publication date
---|---
US20150106399A1 (en) | 2015-04-16
US10191940B2 (en) | 2019-01-29
Similar Documents
Publication | Title
---|---
US10191940B2 (en) | Gesture-based searching
US11086515B2 (en) | Modifying captured stroke information into an actionable form
US7693842B2 (en) | In situ search for active note taking
KR101122869B1 (en) | Annotation management in a pen-based computing system
RU2449357C2 (en) | Ranking diagram
Hinckley et al. | InkSeine: In Situ search for active note taking
US7970763B2 (en) | Searching and indexing of photos based on ink annotations
KR102473543B1 (en) | Systems and methods for digital ink interaction
US20130212463A1 (en) | Smart document processing with associated online data and action streams
KR20080087142A (en) | Handwriting style data input via keys
TWI603214B (en) | System and method for online handwriting recognition in web queries
CN105917334A (en) | Coherent question answering in search results
US9195662B2 (en) | Online analysis and display of correlated information
US20210350122A1 (en) | Stroke based control of handwriting input
EP4004811A1 (en) | Technologies for content analysis
Zanibbi et al. | Math search for the masses: Multimodal search interfaces and appearance-based retrieval
US8612882B1 (en) | Method and apparatus for creating collections using automatic suggestions
Klein et al. | Reading Thomas Jefferson with TopicViz: towards a thematic method for exploring large cultural archives
Edhlund et al. | NVivo for Mac essentials
WO2010032900A1 (en) | System and method of automatic complete searching using entity type for database and storage media having program source thereof
US20190317959A1 (en) | Sketch-based image retrieval using feedback and hierarchies
Lettieri et al. | The affordance of law. Sliding treemaps browsing hierarchically structured data on touch devices
US11922712B2 (en) | Technologies for content analysis
US7418442B1 (en) | Ink alternates and plain text search
WO2024041745A1 (en) | Artificial intelligence-based system and method for improving speed and quality of work on literature reviews
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HINCKLEY, KENNETH P.; REEL/FRAME: 019777/0388. Effective date: 20070830
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 034542/0001. Effective date: 20141014
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION