US5880743A - Apparatus and method for implementing visual animation illustrating results of interactive editing operations
- Publication number: US5880743A (Application No. US08/976,907)
- Authority: US (United States)
- Prior art keywords: objects, animating, changes, gesture, editing
- Legal status: Expired - Lifetime (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
Definitions
- the present invention relates to the use of animation to clearly illustrate the effects of an editing change on surrounding objects to a user and any other viewers of a display device.
- In gesture-based graphical editing systems, the basic function is to allow the user to make arbitrarily-shaped marks by simply drawing ("inking") them in a freehand manner. Such a mark is entered as a stroke-based data object in the system.
- Other kinds of graphical objects can also be created in graphical editing systems, such as text characters (which can be entered by a keyboard or by handwriting recognition), geometric shapes, and images.
- What makes gesture-based systems unique is that control operations can be performed on the data.
- control functions may be instituted by a command gesture.
- the LiveBoard is described in a paper published in Proceedings of CHI'92, the ACM Conference on Human Factors in Computing Systems, May 3-7, 1992, Monterey, Calif.
- a command gesture is a hand-drawn stroke that is created with a stylus input device and interpreted by the system as designating an action for the system to perform. The action is executed by the system upon completion of the gesture stroke by the user.
- a great deal of such scribbling activity involves generic structures--lists, outlines, text, tables, and diagrams.
- scribbling systems must support the creation and editing of these structures. Without such support, for example, the simple task of moving an item in a list can be tedious (move a segment of the list to make space for the item at the new location, move the item, close up the old space). This almost always takes too much time for users to actually perform.
- edit operations are normally shown as occurring instantaneously. That is, upon execution of an edit command, the items are instantaneously shown as having moved to their new locations. While a single user, working by himself, usually anticipates the result of performing an edit operation on data because he himself has performed the operation, any observers who are also viewing the display device often fail to clearly comprehend the changes that have occurred on the display device because the changes occur so suddenly.
- The use of animation techniques in user interfaces is not uncommon.
- a pseudo-animation technique is used on an Apple Macintosh computer to illustrate a window opening out of an icon when one clicks on the icon and to illustrate a window shrinking and disappearing into the icon when the window is closed.
- the Hypercard system on the Macintosh also provides for several different visual transitions between cards, such as sliding and fading, but no data is changed during these transitions. The only thing changed is at which card the user is looking.
- In the SGI windowing system from Silicon Graphics, when a user clicks a button icon on the screen, an animation shows the button rotating and growing in size to fill up the screen before presenting a series of sub-buttons. In all these cases, no user-created data is being edited; rather, the systems' user interface objects are being transformed.
- Animation is also used in program animation systems to help users visualize the internal workings of computer programs by displaying each change carried out by the program. For example, if one wanted to understand how multiple processors coordinate and carry out their activities, animation can show which processors carried out which activities.
- animation techniques have not been applied to graphical editing systems to illustrate the results of user-requested changes, both to the graphical objects being changed and to the contexts of the changed objects.
- an editing apparatus and method wherein an editing change to data displayed on a display device is portrayed at a visually apparent rate, in order to enable an observer to visually appreciate the effect of the change to the selected portion of the data as well as the changes to the surrounding unselected portion of the data.
- a display-oriented graphical editing system which comprises a display device, means for selecting at least a portion of displayed data, means for selecting an editing operation to be performed on selected displayed data, means for performing said editing operation on said selected displayed data, and means for animating one or more changes to said displayed data resulting from said editing operation at a visually apparent rate.
- the animating means comprises means for animating a prime change to said selected displayed data and means for animating contextual changes to said displayed data not selected by said selecting means, said contextual changes resulting from said prime change to said selected displayed data.
- a method for illustrating editing changes to data on a display device comprises selecting at least a portion of displayed data, selecting an editing operation to be performed on selected displayed data, performing said editing operation on said selected displayed data, and animating one or more changes to said displayed data resulting from said editing operation at a visually apparent rate.
- the step of animating comprises animating a prime change to said selected displayed data and any contextual changes to said displayed data not selected by said selecting means but which result from said prime change to said selected displayed data.
- a display-oriented graphical editing system comprises a display device, a control device for selecting at least a portion of displayed data on said display device and for selecting an editing operation to be performed on selected displayed data, and computer processing circuitry coupled to and responsive to said control device which is programmed to perform said editing operation on said selected displayed data and to animate said editing operation on said displayed data and contextual changes resulting from said editing operation at a visually apparent rate.
- a display-oriented graphical editing system comprises a data interface surface, an interacting device for entering gestures on said data interface surface to select at least a portion of displayed data and to select an editing operation to be performed on selected displayed data, and computer processing circuitry coupled to and responsive to said interacting device which is programmed to perform said editing operation on said selected displayed data and to animate said editing operation on said displayed data and contextual changes resulting from said editing operation at a visually apparent rate.
- a method for animating editing changes to a plurality of objects displayed on a display device of a computer comprises the steps of determining which of said plurality of objects are to be manipulated on said display device in response to an operation command by a user of said computer, determining one or more parameters for each of said changes to be animated, determining in how many iterations N an animation should occur, and performing said animation in iterations comprising, for each of said iterations, erasing each of said objects which must be manipulated and displaying each of said objects which are to be manipulated an additional 1/N of a total change set by said one or more parameters over a preceding of said iterations.
- a method for animating editing changes to a plurality of objects on a display device of a computer comprises the steps of determining which of said plurality of objects are to be manipulated on said display device in response to an operation command by a user of said computer, determining one or more parameters for each of said changes to be animated, determining in how many iterations N an animation should occur, and performing said animation in iterations comprising, for each of said iterations, erasing each of said plurality of objects which must be manipulated and displaying each of said plurality of objects which are to be manipulated a distance further along said total change, said distance varying for a first of said iterations from said distance for a second of said iterations.
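The two claimed iteration schemes lend themselves to a short sketch. The following Python is a minimal illustration only, not the patented implementation: the Display stand-in, the dict-based objects, and the parameter values are assumptions introduced here. Each iteration erases every manipulated object and redraws it a further 1/N of its total change along its path.

```python
import time

class Display:
    """Stand-in for the drawing surface; a real system repaints pixels."""
    def erase(self, obj):
        pass  # remove the object's image at its current position
    def draw(self, obj):
        print(f"draw {obj['name']!r} at ({obj['x']:.1f}, {obj['y']:.1f})")

def animate_changes(display, changes, n_iters=10, frame_delay=0.03):
    """Animate each (object, source, destination) change in n_iters steps.

    Per the claimed method: each iteration erases every manipulated object
    and redraws it an additional 1/n_iters of its total change further along.
    """
    for i in range(1, n_iters + 1):
        t = i / n_iters  # fraction of the total change completed so far
        for obj, (x0, y0), (x1, y1) in changes:
            display.erase(obj)
            obj['x'] = x0 + (x1 - x0) * t
            obj['y'] = y0 + (y1 - y0) * t
            display.draw(obj)
        time.sleep(frame_delay)  # timing control would tune this and n_iters

# The prime move and one contextual move, animated simultaneously:
word = {'name': 'windows,', 'x': 0, 'y': 40}
rest = {'name': 'icons', 'x': 120, 'y': 20}
animate_changes(Display(), [(word, (0, 40), (60, 20)),
                            (rest, (120, 20), (160, 20))])
```

The second claimed variant, in which the distance covered varies from one iteration to the next, amounts to replacing the uniform fraction t = i/N with a non-uniform schedule, such as the ease-out function sketched later in this document.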
- FIG. 1 is a block diagram illustrating one form of touch-based input apparatus in which the system according to the present invention may be employed;
- FIG. 2 is a view of a display screen with a freeform loop gesture enclosing several objects;
- FIG. 3 illustrates a simple flowchart of the basic operations of the system shown in FIG. 1 according to the present invention;
- FIG. 4 illustrates an expanded flowchart of the operation assignment function shown in FIG. 3;
- FIG. 5 illustrates the gestures employed in a system incorporating the present invention;
- FIG. 6 illustrates a flowchart of the basic control operations of the system which supports implicit structures according to the present invention;
- FIG. 7 illustrates an expanded flowchart of the operations shown in FIG. 6;
- FIGS. 8(a)-8(m) illustrate operations and gestures on handwritten lists;
- FIGS. 9(a)-9(f) illustrate operations and gestures on handwritten outlines;
- FIGS. 10(a)-10(j) illustrate operations and gestures on handwritten text;
- FIGS. 11(a)-11(h) illustrate operations and gestures on handwritten tables;
- FIGS. 12(a)-12(i) illustrate operations and gestures with border lines;
- FIGS. 13(a)-13(h) illustrate operations and gestures on node-link structures for handwritten diagrams;
- FIG. 14 illustrates operations on completely structured data;
- FIGS. 15 and 16 illustrate operations on mixed structures and freeform commands;
- FIGS. 17 and 18 illustrate alignment operations on a table;
- FIGS. 19(a)-19(h) illustrate operations on and the behavior of freeform text;
- FIGS. 20(a)-20(k) illustrate an animation of a move of a word (the primary objects) from one line to another, as well as the contextual moves of other objects on those lines to accommodate the change;
- FIGS. 21(a)-21(k) illustrate a linear movement animation of a word within a line, as well as the contextual moves;
- FIGS. 22(a)-22(k) illustrate a curved movement animation of a word within a line, which distinguishes the primary move from the contextual moves;
- FIGS. 23(a)-23(d) illustrate an animation of a move in a sequential manner: first opening up a space for a word, the word moving into it, and then closing up the space where the word was;
- FIGS. 24(a)-24(j) illustrate an animation of a move after the user initially drags a word to near its destination;
- FIG. 25 is a flow diagram of an animation routine in accordance with the invention;
- FIG. 26 is a flowchart of a timing control algorithm used to control the time of the animation provided for by this invention.
- An editing system on which one may preferably practice this invention can be defined by a set of principles.
- Such design principles include the following:
- the drawing area is always regarded as a plain two-dimensional surface on which the user can enter any object, in freeform, anytime. There are no restrictions on its use.
- Gestures, i.e., strokes that are interpreted by the system as commands, operate on the data; for example, a loop gesture selects some strokes and a pigtail gesture deletes them.
- a left bracket gesture selects a group of strokes and regards them as an item in a list.
- any operation on that structure is performed according to the structural assumptions of that structure. For example, when a plain selection is moved, it is simply positioned in its new location; but when a list item is moved, space is created for it and the space remaining is closed.
- the user also needs spatial flexibility, i.e., the ability to limit and delimit the spatial scope of what is regarded as structured.
- the simplest technique is to simply regard very long strokes (relative to the kind of structures being evoked) as border lines.
- the user can limit list operations to the left side of a page by drawing a long line down the middle of the page, which will act as a border.
- the extent of the item will stop at the border line.
- In FIG. 1 there is shown a block diagram of the gesture-based input system 10 including a CRT display 12.
- a transparent pressure sensitive type drawing surface 14, i.e. touch panel, is attached onto the surface of CRT display 12. Drawing surface 14 is touched by a user and the touch is detected by touch detection circuit 18.
- the detected signal from touch detection circuit 18 is input to an X-Y detection circuit 20 which processes the input signal.
- the X-Y detection circuit 20 detects the (x, y) coordinates of the input point touched on the surface of drawing surface 14 and transmits such information to CPU 22.
- Touch panel detection circuit 18, X-Y detection circuit 20 and the CPU 22 combine to form controller 16.
- Controller 16 performs the control in accordance with the control program stored in program ROM 26 in memory 24.
- ROM section in memory 24 includes program ROM 26 in which the control program of CPU 22 is stored and a pattern data area for storing various kinds of character patterns or the like.
- RAM section 28 of memory 24 includes a RAM portion 29 which is used as a work area of CPU 22 and a character position data area 30 to store display positions of character patterns and the like.
- drawing surface 14 is an electronic input device such as an electronic sketch pad, LiveBoard or whiteboard which employs a working surface and may employ a plurality of accessible functions 40 as is shown.
- the working surface is the upper area 36 of drawing surface 14 and the accessible functions 40 are positioned at the bottom area 38 of drawing surface 14.
- These functions 40 may include new operation (draw mode), delete, move, shrink and so on. Alternatively, these functions can be accessed by a pop-up menu.
- These functions are optional in designation, their principal objects being to define operations which are inherent in the operation of the system. These functions may share some of the same functions that are represented by many designated command gestures.
- a suitable user manipulable control or interacting device 42 such as a stylus or light pen is employed to draw input symbols, select functions or draw gestures representative of specified functions. Obviously, other variants within the skill of the art may be employed.
- In FIG. 2, the user has created objects 34 and has selected these objects by drawing a freeform loop 32 around them.
- The system can be broadly characterized to include the following three basic features, as shown in FIG. 3.
- Gesture/Data Entry 43 receives data or command gestures that are drawn as a stroke on drawing surface 14, and interprets the individual stroke as a command gesture in response to some action taken by the user. Such an action may be exerting pressure on a button located near the grasping portion of the stylus 42.
- Operation Implementation 46 provides the means for executing or performing that operation or operations with respect to the desired data.
- Feature (2) consists of the following three sub-features which are shown by blocks 50, 52, and 53 in FIG. 4.
- a structural model is declared and a particular model type is chosen according to the combination of gestures used to make a selection and is used to act upon the selected information.
- a model structure is first called by a gesture used to define an operation. That is, a specified gesture for selecting desired information on the display alerts the system that such information shall be treated within a basic structural model context. These specified selection gestures create the structured, i.e., rectangular enclosures 60 shown in FIG. 5. The structured enclosures 60 also enclose and thereby identify the data which will be affected by the operation 52.
- the structured enclosures are created by drawing bracket gestures (up and down brackets 61 and 62 and left and right brackets 63 and 64), an underline gesture 75 or "L"-shaped gestures 65.
- These selection gestures are collectively known as structured selection gestures.
- the recognition techniques for these gestures are quite simple, because of the easily distinguishable features of such gestures, such as corners and orientation.
- the system creates the structured enclosure by a concept known as projection, the subject matter of which is disclosed in U.S. patent application Ser. No. 08/175,841, filed Dec. 30, 1993, entitled Apparatus and Method for Altering Enclosure Selections in a Gesture Based Input System.
- the segments (legs portion) of the gesture are partially projected to some predetermined border. These projected segments are called extensions. All of the data falling within the area defined by the projected boundaries are selected. For example, the ends of a left bracket gesture "[" are projected horizontally to the right until some predetermined border is reached to terminate the projection. All of the data falling within the extension boundaries of the projected bracket gesture are selected.
- the L-shaped gesture works in a similar manner.
- the system identifies the "line” of text above or touching the underline, and selects the objects on the line that lies between the endpoints of the gesture.
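The projection concept can be sketched as follows. The bounding-box stroke representation and the full-containment rule are assumptions for illustration, not the cited application's algorithm:

```python
def select_by_projection(strokes, bracket_x, y_top, y_bottom, border_x):
    """Select strokes inside the region swept by projecting a left
    bracket's legs horizontally to the right until a border is reached.

    strokes: list of dicts with a 'bbox' = (x0, y0, x1, y1).
    """
    selected = []
    for s in strokes:
        x0, y0, x1, y1 = s['bbox']
        # A stroke is selected if its bounding box lies within the
        # extension boundaries (assumed rule: full containment).
        if bracket_x <= x0 and x1 <= border_x and y_top <= y0 and y1 <= y_bottom:
            selected.append(s)
    return selected

# Example: a left bracket at x=10 spanning y=100..140, border at x=400.
strokes = [{'bbox': (40, 105, 90, 135)},    # inside -> selected
           {'bbox': (40, 30, 90, 60)}]      # above the bracket -> not selected
print(len(select_by_projection(strokes, 10, 100, 140, 400)))  # 1
```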
- a selection made by freeform loop 56 is merely a selection of items for future operation having no regard for specifying and/or organizing such information according to any structural model.
- the user may use the (unstructured) move gesture 68.
- the model type i.e., the precise structure of the information is not determined until the user executes a second command on that selection of items, i.e., draws a second gesture.
- Examples of these gestures are also shown in FIG. 5.
- Such gestures may include delete 73, move 74, shrink 77, copy 69, and drag (dynamically, viz., where cursor "sticks" to and moves with the stylus or other control or interacting device) 76.
- This list is not limited.
- Some command gestures, in particular the wedge 78 and caret 79, can also be employed without a preceding selection gesture.
- the first gesture indicates the model and selects some set of strokes within the model, then a second gesture specifies the action to be performed on the selected strokes.
- the system identifies the local objects that will be affected by the operation with respect to the structural model.
- the system operates by grouping strokes into structures (e.g., line items, which are rows of objects) and partitioning a set of strokes relative to those structures. For example, to move a line item, the system must identify what strokes belong to the line item, identify the line items at the destination to which the line item will be moved (to determine the "inter-line gap"), and partition the remaining strokes on the screen according to whether they are above or below the source line (i.e., where the line item was originally) and above or below the "destination gap," explained below. From this partitioning, line spaces are easily derived.
- the line grouping technique is further described below.
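As a rough sketch of the bookkeeping behind such a move (the index convention and return shape are assumptions for illustration; the actual system manipulates the partitioned strokes directly):

```python
def plan_line_move(num_lines, src, dst_gap):
    """Map each line index to its new index when line `src` is moved into
    the gap above line `dst_gap` (dst_gap == num_lines inserts at the end).

    Lines between the source and destination shift by one line space to
    open the destination gap and close the vacated one.
    """
    order = [i for i in range(num_lines) if i != src]
    insert_at = dst_gap if dst_gap < src else dst_gap - 1
    order.insert(insert_at, src)
    return {old: new for new, old in enumerate(order)}

# Moving line 3 of a 4-line list into the gap above line 1:
print(plan_line_move(4, src=3, dst_gap=1))  # {0: 0, 3: 1, 1: 2, 2: 3}
```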
- Ambient strokes are strokes which are left undisturbed by structural operations.
- border strokes are identified by looking for objects that are very large relative to the selected structures.
- the system computes the structural model parameters for the operation as a function of the local objects identified above.
- the required procedures include computing baselines, line and column spacing, and word spacing.
- the system has determined the structural model, the affected objects, and the structural model parameters.
- the final function is to implement the operation 46 (FIG. 3), i.e., execute the desired operation which involves straightforward manipulation of the objects on drawing surface 14. For example, to move a line to a location higher on the page, the lines between the source and destination are moved downward, and the line is moved up into the resulting space, and the vacated line space is removed.
- a typical procedure is the one for grouping lines.
- a related procedure determines baselines. (As used herein, a baseline is a line on which the bottom of all characters except for descenders, such as the left leg of the letter "p", lie.) Fundamental to both of these procedures is determining the density of the "ink” in the horizontal cross-sections of the screen. A top-down description of these procedures is provided below and pertinent portions of these procedures are set forth in Appendix A.
- the Line Grouping procedure finds the strokes that belong to the line of objects containing the y-coordinate of the given point.
- the procedure computes the top and bottom boundaries of the line and then tests all the strokes on the screen, collecting those that "reside" on the line.
- the criterion for a stroke residing on a line is that more than half of the vertical extent of the stroke must lie between the top and bottom line boundaries.
- the procedure FindLineBoundaries in Appendix A is based on the ink density function.
- This density is represented discretely by an array.
- the screen is partitioned into horizontal strips and the "density of the ink" is computed for each strip.
- This array of values is then converted to binary values (0 or 1) by applying an ink density threshold: a strip whose density meets the threshold receives the value 1.
- the threshold is expressed as a fraction of the average density.
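Appendix A itself is not reproduced here, but the described procedures can be reconstructed along the following lines. This Python sketch makes several assumptions — stroke points as the measure of ink, and the strip height and threshold fraction as free parameters — so it illustrates the idea rather than the patented code:

```python
def line_group(strokes, y, strip_h=4.0, thresh_frac=0.5):
    """Find the line containing y and the strokes residing on it.

    strokes: list of point lists [(x, y), ...].
    Returns ((top, bottom), strokes_on_line), or None if y is in a line space.
    """
    all_y = [py for s in strokes for _, py in s]
    top, bot = min(all_y), max(all_y)
    n = int((bot - top) / strip_h) + 1
    # 1. Ink density per horizontal strip (here: count of stroke points).
    density = [0] * n
    for s in strokes:
        for _, py in s:
            density[min(int((py - top) / strip_h), n - 1)] += 1
    # 2. Binarize against a fraction of the average density.
    cutoff = thresh_frac * (sum(density) / n)
    binary = [1 if d >= cutoff else 0 for d in density]
    # 3. Expand from the strip containing y to the line's boundaries.
    i = min(max(int((y - top) / strip_h), 0), n - 1)
    if not binary[i]:
        return None  # y falls in a line space
    lo, hi = i, i
    while lo > 0 and binary[lo - 1]:
        lo -= 1
    while hi < n - 1 and binary[hi + 1]:
        hi += 1
    y0, y1 = top + lo * strip_h, top + (hi + 1) * strip_h
    # 4. A stroke "resides" on the line if more than half of its vertical
    #    extent lies between the boundaries.
    def resides(s):
        sy = [py for _, py in s]
        s0, s1 = min(sy), max(sy)
        overlap = max(0.0, min(s1, y1) - max(s0, y0))
        extent = s1 - s0
        return overlap > 0.5 * extent if extent else y0 <= s0 <= y1
    return (y0, y1), [s for s in strokes if resides(s)]
```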
- FIG. 6 is a flowchart of the basic control operations which implement the system procedures described above.
- the system is inactive until the user enters data 82 by any of the means described above (stylus, scanning, typed entry, etc.). Once data is entered into the system, the user may draw a gesture on the drawing surface 14 at user action 83.
- the system detects that a gesture is entered and performs an analysis on the shape of the gesture to categorize it as one of the gestures known to the system.
- At decision diamond 85, if the gesture is not recognized by the system, an error message is displayed at block 86 and the system returns to user action 83 for a new gesture entry.
- the system determines whether the gesture is one for selecting desired data. At this point it is noted that the system also determines the type of selection gesture because the type of selection gesture determines whether the system performs a standard data operation or data operation within a structural context. The details of the type of selection gestures and their attributes however are discussed below with respect to the flowchart in FIG. 7.
- the system will identify the data defined by the selection at block 88, implement the operation of selection at block 89 and display the result of the implemented operation at block 90. The system then returns to user action 83, where the user may enter another gesture.
- the structural selection remains for further operations.
- the user can revert to freeform mode by simply beginning to draw strokes, and the objects are no longer selected.
- a structural model may be a list, table, diagram, or outline.
- a list is characterized as a column of line items (each item consisting of a row of graphic objects), and when an item is moved then the column is rearranged to create space for the item and close the space remaining after the move.
- the system implements the operation associated with the gesture identified (move, shrink, delete, etc.). The system then displays the process of the change of information in accordance with the operation at block 95 (viz., animation, described in more detail below). The system also displays the result of the operation at block 96. The system then returns to block 81, where the system waits for input.
- the system determines if the information is a gesture. If the information is not a gesture, the system moves to block 124 and performs other operations. Otherwise the system advances to decision diamond 104.
- At decision diamond 112, the system determines whether the input is a freeform selection gesture. If it is, the system advances to block 128, creates a new freeform loop and returns to block 100. If the input is not a freeform selection gesture, the system moves to decision diamond 114.
- At decision diamond 114, the system determines whether the input is a structured selection gesture, i.e., one that encloses information to be regarded according to one of the structural models. (These gestures include brackets and L-shaped figures; details of these gestures are described below.) If the input is not a structured selection gesture, the system moves to decision diamond 116.
- the existing structured selection enclosure is reshaped to include the alteration gesture.
- a list is assumed to be a vertical column of line items (or just items for short) separated by line spaces.
- An item is a row of objects and usually is a line of text. It is therefore assumed to be wide and not very tall. Thus since a left-sided bracket at the left of an item is an efficient way to select text, this gesture is the preferred form of selection in this model.
- the vertical space between items is minimized. When an item is deleted, the remaining space is removed, (i.e., the items below it are moved up). Items however cannot lie on top of one another. Consequently, when a line item is moved between two other items, space is created for the new item. List operations attempt to keep the current line spacing locally intact. But line spacing can be explicitly opened up or closed up. Also, an item may be shifted horizontally to achieve appropriate alignment with other items.
- FIGS. 8(a) and 8(b) illustrate a list of newspapers.
- a left bracket is positioned to the left of "USA Today” to select that entire item.
- FIG. 8(b) shows the rectangular selection enclosure 150 automatically constructed around the entire item.
- an "L" shape gesture 152 is drawn which indicates how the selection figure should be reshaped.
- the rectangular selection is reshaped to include the L-shaped gesture and therefore include "USA Today” and "Los Angeles Times.”
- This gesture 152 is shown in FIG. 8(c) and the resulting expanded selection enclosure 153 is shown in FIG. 8(d).
- FIG. 8(e) shows this gesture 154. Once the delete gesture is issued, the item is deleted and the remaining two items are moved closer to one another to remove the empty space between such items.
- FIG. 8(f) shows the remaining two items in the list.
- a user may draw a line gesture 156 from the selection enclosure 155 to a target position, e.g., the location just beneath the item "Los Angeles Times" in FIG. 8(g). The resulting move is shown in FIG. 8(h).
- the user may move a selection enclosure to the target position by drawing a ">" gesture at the target position (not shown).
- a user may draw a horizontal line gesture.
- the user may draw a " ⁇ " gesture at the target position.
- a user may draw an inward/outward spiral gesture 77 as shown in FIG. 5.
- a wedge ">” gesture may be drawn in the space between lines.
- An example of this gesture is shown in FIG. 8(i).
- a wedge 158 is positioned between "USA Today” and "Los Angeles Times.”
- FIG. 8(j) shows a space inserted between these items.
- the user touches the very center of the selection region. Once the user touches the center, the selection "sticks" to the pen and the pen slides the image of the selection around until the pen is lifted, at which time space is opened and closed as appropriate and the selected item moves into its proper position.
- An example of the drag operation is shown in FIGS. 8(k), 8(l), and 8(m).
- In FIG. 8(k), "Wall Street Journal" is selected.
- FIG. 8(l) shows that after dragging this item, it is positioned between “USA Today” and “Los Angeles Times” and positioned to overlap these two items.
- FIG. 8(m) shows the list as it has been straightened out by the computer in the order "USA Today", “Wall Street Journal” and "Los Angeles Times”. It is important to note that the arrow 76 in FIG. 5 represents the motion of the drag operation. However no arrow is actually drawn on the screen for this operation. It is only shown in FIG. 5 as one operation available to the user.
- An outline structure is an extension of the simple list structure.
- Line items have indentation levels, which determine the hierarchical structure of the items. There are no fixed “tab” positions; the indentation level of an item is determined by whether the horizontal position of the leftmost stroke is "close to” or “significantly” different from the horizontal position of the leftmost stroke of the item above it.
- There is an alignment operation that shifts items, i.e., aligns the leftmost stroke edge of each item to be either exactly the same as, or significantly different from, that of its preceding item. This operation shows the system's interpretation of the indentation structure.
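The "close to" versus "significantly different" comparison can be sketched as a tolerance match against indentation positions already seen. The tolerance value and the snap-to-first-seen rule are assumptions; the patent does not fix these parameters:

```python
def align_indentation(items, tol=15.0):
    """Snap each item's left edge to the nearest established indentation
    position; an edge more than `tol` away starts a new indentation level.

    items: top-to-bottom list of dicts with 'left' (x of leftmost stroke).
    Each item gains a 'level' index and its 'left' is snapped exactly.
    """
    positions = []  # representative x for each level, in order first seen
    for item in items:
        x = item['left']
        match = None
        for idx, px in enumerate(positions):
            if abs(x - px) <= tol:  # "close to" an existing level
                match = idx
                break
        if match is None:           # "significantly different": new level
            positions.append(x)
            match = len(positions) - 1
        item['level'] = match
        item['left'] = positions[match]  # the shift applied by alignment
    return items

rows = [{'left': 10}, {'left': 42}, {'left': 45}, {'left': 12}]
print([r['level'] for r in align_indentation(rows)])  # [0, 1, 1, 0]
```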
- An outline subtree is the combination of an item and all items below it that have greater indentation levels.
- Subtrees can be collapsed to show the higher level structure of an outline. Collapsed subtrees are physically shrunk (in the vertical dimension only) so that they appear as thin physical lines, termed container objects, underneath the topmost items in the subtrees. Container objects can be expanded back to their original size. To indicate the presence of the container object, an icon is employed, such as a thickened line, or the like. Only a proper subtree can be collapsed. If a selection is not a proper subtree, it will be extended before collapsing.
- the container object is treated like a list item, i.e., it can be moved as a block, and otherwise manipulated as desired.
- the user may draw a vertical line at the left edge of the selection.
- An example of this is shown in FIG. 9(a). Notice the first level heading is "Newspapers" and “New York” is the heading of a subtree.
- a vertical line gesture 160 is drawn at the left edge of the selected items beginning at "New York”.
- the system compares the indentation of all the line items in the selection, and adjusts the horizontal position of the line items so that lines that were indented by roughly equal amounts are in fact indented exactly equally. The resulting outline is shown in FIG. 9(b).
- FIG. 9(c) illustrates the selected items in the outline before the lowest level in the subtree is collapsed and FIG. 9(d) illustrates the outline after that level has been collapsed. Notice that once the lowest level of the subtree has collapsed, the other item not selected, i.e., "Radio Stations," has moved upwards to close the remaining space left from the subtree.
- the user may choose the collapse-all-levels menu item.
- the user may choose the expand-one-level menu item.
- Handwritten "text" is a limited model of text within the context of lists and outlines. Strokes of handwriting are first assumed to be grouped as line items and then, within line items, as words separated by word spaces.
- "Word" here refers not to the word of ordinary language, but merely to a horizontal cluster of objects separated by empty space. A sequence of typewritten characters delimited by spaces is a special case of such a "word." Words and word spaces on a line are treated in the horizontal dimension in a way analogous to that of line items and line spaces in the vertical dimension. Word spacing is closed up when words are deleted and opened up to make room for inserted words. Words are contained on a single line and have to be explicitly moved between lines by the user.
- Contiguous words on the same line can be selected with left and right bracket gestures, or by an underline gesture.
- the system distinguishes whether a word space or a line space has to be created, i.e., opened up to accommodate them.
- a line item is also treated this way. That is, a line item just happens to contain all the words on a line.
- a user may underline the words or draw "[" and "]" bracket gestures. (Only one bracket is needed if the words to be selected extend to either the left end or the right end of the line.)
- An example of this is shown in FIGS. 10(a) and 10(b).
- the word "Angeles" is bracketed for selection.
- a user may draw a pigtail gesture 162 from the enclosure 163 to delete the item selected. The result is shown in FIG. 10(d).
- a user may draw a line gesture to a word space, or make a caret gesture in a word space.
- An example of this is shown in FIGS. 10(e) and 10(f), where the selected word is "Angeles" and the line gesture 164 extends from the selection enclosure 165 to the desired location between "USA" and "Today." The system creates space at the desired location to fit the selected information.
- the user may draw a line gesture to a line space or make a ">" gesture in a line space. This is shown in FIG. 10(g).
- the line gesture 166 extends from the selected information, i.e., "Angeles" to a location between the first two lines of text. The system creates space between these lines of text and moves the selected information.
- FIG. 10(h) shows the result of these operations.
- a " ⁇ " gesture may be drawn in a line space.
- a caret gesture may be inserted.
- the caret gesture 168 is shown in FIG. 10(i). Notice in FIG. 10(j) that space is created between "USA” and "Today.” The system creates a fixed amount of space determined as a function of size of the neighboring text.
- a table is a two-dimensional structure, consisting of rows and columns.
- a row is exactly a line item from the list model.
- a column is like a row, but with gestures and operations transposed from the horizontal to the vertical dimension. For example, a top bracket selects a column just as a left bracket selects a row; columns can be moved horizontally just as rows can be moved vertically.
- a column space is the vertical space between columns.
- vertical text like Japanese or Chinese writing
- a user is permitted to select a block of material--a rectangular selection of material from more than one row or column, again using left and right or top and bottom brackets. This provides a convenient way to apply simple commands (e.g. make red) to larger portions of a table. However, no special tabular interpretation is placed on a block selection.
- a top bracket gesture 172 can be drawn as shown in FIG. 11(a).
- FIG. 11(b) shows the result of the action taken in response to this gesture.
- a pigtail gesture can be drawn (not shown).
- a line gesture can be drawn to the new location as shown in FIG. 11(c) (or make a " " gesture).
- In FIG. 11(d), notice the entire center column is moved to the right after the column starting with "3."
- a " ⁇ " gesture can be drawn at the new location.
- top and bottom bracket gestures can be drawn to enclose desired information. This is shown in FIGS. 11(e) and 11(f).
- a block can be selected with L-shaped gestures 65 (FIG. 5).
- a pigtail gesture can be drawn from a selection enclosure. This will automatically close up the space (vertically or horizontally).
- a line gesture may be drawn from a selection enclosure to the target location.
- An example showing a move of vertical block 173 with a line gesture is shown in FIG. 11(g), and the corresponding vertical block 173 is shown after the move in FIG. 11(h).
- a " ⁇ " gesture can be drawn at the target location.
- the normal operations can be applied to the selected row, column, or block in the normal manner. For example, to change the strokes in a block to red, touch the make-red buttons.
- The system treats as border lines any long strokes that it "bumps into" while interpreting a gesture.
- the extent of a selected structure will automatically be stopped at a border. For example, when selecting an item in a list by a bracket as shown in FIG. 12(a), only those strokes up to the border 175 are included in the selection in FIG. 12(b). Any structural operation that would cause a violation of a border is aborted. For example, the open-up space operation (using a wedge) shown between "USA Today" and "LA Times" in FIG. 12(c) would be aborted because the space would cause the last line of the list, i.e., "LA Times" to go through the bottom border.
- an operation may take place across a border.
- an item (USA Today) may be moved from one column to another in a multi-column list (with borders between the columns). Notice that in FIG. 12(e) space was created in the column on the other side of the boundary between "Paper” and "Radio” for the item "USA Today.”
- a corner-shaped gesture or a bracket gesture may be drawn to show where to extend the selection. This is shown in FIGS. 12(f) and 12(g). Notice that the original selection enclosure is extended to include information through two borders.
- a border line When a border line is penetrated in this way, it is no longer regarded as a border, but as an ambient object. Ambient objects are ignored during any operations. For example, ambient lines between columns of a table are unaffected by moving a row. In FIGS. 12(h) and 12(i), a line gesture 176 instructs the system to move a row without affecting the two borders. Ambientness holds only for the life of the current selection.
- a diagram is meant as a node-link structure.
- a node is a group of graphic objects that is to be regarded as a unit, and a link is a line that goes between (joins, connects) two nodes.
- the links and nodes form a mathematical (topological) structure, which is more abstract than the geometric structure. That is, the topology is not concerned with where nodes are located or the shape of the links, but only with what the pattern of the links between nodes is.
- the user may want to change the topology by editing the links, but often the user wants to change the geometry of the diagram (to make it clearer or neater) while preserving the topology.
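The split between geometry and topology can be illustrated with a toy data model (hypothetical; the actual system stores strokes and selections, not this structure). Links record only which nodes they join, so a purely geometric move leaves the link set intact:

```python
# Nodes carry geometry; links carry only topology (which nodes they join).
nodes = {'A': (0, 0), 'B': (100, 0), 'C': (100, 60)}
links = {('A', 'B'), ('A', 'C')}

def move_node(name, dx, dy):
    """A geometric edit: reposition one node; the link set is untouched."""
    x, y = nodes[name]
    nodes[name] = (x + dx, y + dy)

def render_link(a, b):
    """Re-derive a link's drawn endpoints from current node positions.
    (The real system also preserves the link's characteristic shape.)"""
    return nodes[a], nodes[b]

move_node('B', -20, 30)                   # rearrange the diagram...
assert links == {('A', 'B'), ('A', 'C')}  # ...topology is preserved
print(render_link('A', 'B'))              # ((0, 0), (80, 30))
```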
- FIG. 13(a) shows a simple diagram with three nodes (A, B, and C) and two links (A-B and A-C).
- FIG. 13(b) shows the diagram changed geometrically--the positions of nodes B and C are reversed and node B is lowered a bit--but with its topology preserved: A is still linked to B and to C, and the characteristic shapes of links A-B and A-C are preserved. However, link A-C emanates from the right side of node A in FIG. 13(a) and from the left side of node A in FIG. 13(b).
- the user can select objects on the display as a node by making a rectangular-shaped gesture.
- the system recognizes this shape and replaces the drawn gesture with a perfectly geometric rectangle, just as it does for all the other structured selections.
- a node selection has one added feature that other structured selections do not have.
- a node selection selects all the objects it encloses as the node, and all lines that extend out of the selection are selected as links.
- a selected link is indicated by a little circle 179 drawn by the user on the selection rectangle 178 (FIG. 13(c)) where the link crosses it. The user can change whether a crossing line is to be regarded as a link or not. For example, FIG. 13(c) shows node A selected and line A-B selected as a link, but line A-C not selected as a link.
- FIG. 13(d) shows gestures for changing the links. The slash gesture 180 through the little circle removes the link status of the line A-B, and the little-circle gesture 182 where line A-C crosses the selection rectangle changes line A-C to be regarded as a link. The result of these two gestures is shown in FIG. 13(e). Because of the above operations shown in FIG. 13(d), when node A is moved to the right, link A-C is distorted to preserve the connection, but line A-B is not changed (because it is not considered a link), and the connection between A and B is lost.
- Grouping is another operation (available from a menu) which can be performed on diagrams.
- a set of objects When a set of objects is selected and grouped, all the objects in the selection are regarded as one single object, a group object.
- the advantage for a user of a group object is that it can be selected simply by touching it, i.e., by a dot gesture. This makes it very easy for the user to manipulate group objects. If the nodes in a diagram are grouped (but not the links), then the diagram can be rearranged easily while preserving the topology. Further, because the system knows what objects constitute the links, it can make sure that the links emanate from the nodes in the proper orientation.
- FIG. 13(g) shows three character objects A, B and C and a stroke object around them.
- the character object C is selected as a node.
- the line gesture indicates that the node (the C) is to be moved to the right. But when it is moved, the stroke object is distorted to follow the C, as is shown in FIG. 13(h).
- This extension illustrates, again, the principles of implicit structuring--that the structure is not "inherent” in the graphic data, but rather structure is what the user sees in the data and "tells" the system, by means of gestures, to find.
- FIG. 14 shows a screen with two lists separated by a vertical line that the system interprets as a border. Thus the list on each side may be edited independently (e.g., Gail could be moved to the top of the list). An operation can also affect both lists.
- FIG. 14 shows that Jane is selected and that a ">" gesture is telling the system to move Jane to between Bob and Tom--Gail will move up and Tom and Bill will move down to accommodate this move. Note that there is a structure on each side of the border, and the move is completely structured.
- Consider FIG. 15, where there is a list on the left and a partial office floor plan on the right.
- the floor plan is a freeform diagram, not a structure.
- the system would not know how to alter the diagram properly. It would try to interpret it as a list--in the best case it would move Tom into Bill's office and Bill out of an office; in the worst case, it would move walls in the diagram and ruin the entire floor plan; or the system would refuse to do the operation because the diagram cannot be grouped as a list.
- the problem is how to treat the diagram as a freeform and at the same time treat the list as a list.
- each gesture can indicate how the material is to be treated at each end of the move.
- the selection can be structured (selecting it with brackets or L-shaped gestures) or freeform (using a loop), and the move can be structured (a ">" or "^" gesture) or freeform (V). These can be mixed in all possible combinations.
- an alignment operation "neatens up" the indentation levels of a list (it can also be used to clean up simple lists).
- a similar operation is employed for tables.
- the problem is that tables often are poorly aligned. People often create tables column-wise. They write items going down one column then down another. The result is that items in the rows are poorly aligned, and rows cannot be selected and moved.
- the solution is to create a horizontal alignment operation. To invoke this operation, a selection box is created around a table or part of a table and a horizontal alignment gesture, which consists of a horizontal line, is drawn along the top edge of the selection box; i.e., one selects the whole table and makes a horizontal alignment gesture at the top of the selection.
- FIG. 17 shows a table created by a user and FIG. 18 shows it after the system has aligned it.
- the alignment operation may also adjust spacing between rows and columns, and may provide vertical alignment of columns or both vertical and horizontal alignment of rows and columns.
- Each character is treated as a freeform graphic object, similar to a stroke, i.e., it just occupies an x-y position on the display. Just like strokes, they are treated as freeform or as being structured depending on what gestures are used. Characters and strokes can be mixed; e.g., the user can handwrite a Greek letter in a line of typed text.
- Character objects can be treated as freeform (unstructured) by selecting them with a loop selection gesture. When character objects selected this way are deleted with the standard pigtail gesture, the character (and other kind of) objects surrounding the deleted character objects are not moved. (i.e., exactly the same behavior as for stroke objects.) Similarly, the character objects can be moved anywhere, even to sit on top of other character or stroke objects. See FIGS. 19(a) and 19(b).
- Character objects can be treated as structured by selecting them with one of the structure selection gestures (brackets or L-gestures). When character objects are selected this way, they can be operated upon with the same structured (i.e., list, outline, text, and table) operations as stroke objects. For example, a set of horizontally related character objects can be selected with bracket gestures and then moved with a line gesture to another place; the character objects will close up and open up appropriately. See FIGS. 19(c) and 19(d). However, when character objects are selected, they receive special treatment in that they are "snapped" into alignment with nearby character objects.
- ABC is snapped to be aligned with DEF in the most appropriate way, such as the baselines being aligned and the DEF either occurring right after ABC or with exactly one normal character space between C and D.
- Additional operations are provided to allow entry and editing of text with a keyboard.
- the user needs a way to tell the system where typed character objects should be placed on the display.
- the "type-in point” is a small carat-shaped object created by the system.
- the user can position or reposition the type-in point either by making a dot gesture with a pen or by pointing and pressing a mouse button. If the dot is near characters, the system replaces the dot with a carat which is snapped into alignment with the nearby characters. If the dot is not near any characters, then it remains a dot. But if the user immediately begins to type at the keyboard, then the system replaces the dot with a carat.
- a Tab key causes the type-in point to be moved to the right to align with any character objects above with space to their left; if no alignment can be found, then the type-in point is moved to a default table position. If the type-in point is pushed into a border or to the right edge of the display, then it, along with the character objects to its immediate left, is moved down one row and to the left, i.e., the normal "line wrapping" behavior for text. Similarly, other special keys (e.g., Backspace, Linefeed) have appropriately-defined behaviors.
- Animation of changes to data in a display-oriented graphical editing system described above clearly illustrates the effects of a structural editing change on a display device, such as a LiveBoard or other display screen, so that the user, as well as other persons in a room watching the user's actions, can clearly see and get a better understanding of the changes to the various elements.
- the invention also enables a person employing a personal computer, such as a personal tablet computer, to more clearly understand the manner by which the system reacts to his or her input.
- the animation feature in accordance with the invention can be employed as an educational tool to enable students to appreciate the effects of the different operational commands, as well as to enhance practice sessions on the display device, such as a LiveBoard, touch screen or other data interface surface.
- a user selects at least a portion of data displayed on the display device.
- the data is an object or a group of objects.
- the selection itself can be either freeform or structured and is made by drawing an appropriate gesture, as described above, with a control or interacting device 42 (FIG. 1), such as a stylus or light pen, or with a mouse.
- the operation command may be invoked using the control or interacting device 42 to draw an appropriate gesture, by selecting one of the accessible functions 40 (FIG. 2) or by choosing a selection from a pop-up menu.
- the animation effect may be controlled in the operation 130 of FIG. 7 for freeform selections, and in the operation 136 of FIG. 7 for structured selections.
- the computer system then performs the specified editing operation on the selected displayed object or group of objects using computer processing circuitry coupled to and responsive to the control or interacting device 42.
- the system animates both the change to the selected object or group of objects, such as movement to a new location or expansion/shrinking, and the changing of the characteristics such as position or size of the other object or group of objects on the screen which accompany such a change.
- the objects are shown as changing gradually at a visually apparent rate, rather than changing instantaneously.
- the main change which is the change to the object or group of objects selected by the user, is referred to as the "prime change" and the changes to the other objects on the display are “contextual changes.” These contextual changes maintain the structure of the text.
- Prime and contextual changes are individually animated as occurring in a series of steps (or iterations), the exact number of steps being determined by a timing control function, described below, which is programmed into the system.
- the prime and contextual changes performed in response to an operation command may be animated simultaneously.
- changes may be animated sequentially so that, for example, in response to a move command, a space may be initially opened up at the destination (viz., the final location to which the selected object or group is to be moved), following which the selected text is moved to the space thus opened up, and followed by the closing up of the space from whence the text originated, all of which occurs at a visually appreciable rate.
- the word “windows,” has been selected as a structured selection 184 with an appropriate gesture or gestures such as by drawing a left bracket 63 (FIG. 5).
- a caret 185 has also been drawn between the words “dialogs,” and “icons” indicating to the system that the word “windows,” should be moved from its current position in the list and inserted between and on the same line as the words “dialogs,” and “icons”. This is the prime change in this example. Two contextual changes are necessary to accommodate this prime change.
- FIGS. 20(b)-20(k) illustrate the prime and contextual changes being animated simultaneously in ten steps. Each consecutive figure shows the objects as having moved an additional 1/10th of the total distance along their paths towards their destinations. Each moved object (or "word") travels in a linear path from its original source location to its final location.
- Each moving object passes through any other objects in its path, so that one object might overlap another object. This can be seen in FIG. 20(h), where the word "windows," passes through the word "display".
- FIGS. 21(a)-(k) illustrate another example of animation of changes resulting in the movement of a word within a line to a new location on that line.
- the word "not”, shown surrounded by box 187, is selected and a caret 189 is placed before the word "all".
- the simultaneous contextual change is the movement of the words “all that glitters is” to the right so that a space is opened before the word “all” and the space between "is” and “gold” is closed. All changes are animated simultaneously in ten steps in FIGS. 21(b)-21(k), with the word “not” moving from its original location to its destination.
- Since the word "not" passes through other words in the line in FIGS. 21(b)-21(k), it is not especially clear to the user or others which is the prime change and which is the contextual change. To make the distinction between prime and contextual changes clearer to the user, the prime change can be highlighted. One way to do this is to show the prime change as moving in a non-linear movement such as an arc (either circular or parabolic) from its original location to its destination location, while the contextual changes are shown as moving in a straight line.
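A parabolic arc of this kind can be computed by adding a lift term to the linear interpolation. The sketch below assumes screen coordinates with y increasing downward (so the lift is subtracted) and an arbitrary peak height:

```python
def arc_point(p0, p1, t, height=30.0):
    """Point at fraction t (0..1) along a parabolic arc from p0 to p1.

    The lift term 4*height*t*(1-t) is zero at both endpoints and peaks
    at `height` when t = 0.5, visually separating the prime move from
    the straight-line contextual moves.
    """
    x = p0[0] + (p1[0] - p0[0]) * t
    y = p0[1] + (p1[1] - p0[1]) * t - 4.0 * height * t * (1.0 - t)
    return x, y

print(arc_point((0, 100), (200, 100), 0.5))  # (100.0, 70.0)
```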
- FIGS. 22(a)-22(k) are identical to FIGS. 21(a)-21(k), except that the prime move of the word "not" follows such an arc while the contextual moves remain linear.
- FIGS. 23(a)-23(d) illustrate such an animation.
- the word "not” is again selected and a carat indicates to the system to move the word "not” before the word "all”.
- the system animates the opening of a space before the word "all” which is large enough for the selected word "not”.
- As FIG. 23(c) shows, the move of the word "not" into the empty space that was created is then animated.
- In FIG. 23(d), the vacant space between the words "is" and "gold" is closed.
- An object could also be animated as changing at a rate that varies. For example, the object could move quickly from a source location to the approximate destination and then move slowly into its exact destination.
- The animation architecture also allows the specification of the type of move for each part of a change, the default being a linear, constant-rate move.
- Animation may also be provided for other types of changes, such as morphing, copying, scrolling, opening and closing space, shrinking, expanding, rotation, and distortion.
- Color changes could also be animated by showing the color changing gradually to the newly selected color, rather than changing immediately. To animate a color change, a path through color space must be calculated so that the system knows what color should be displayed at each step during the animation of the color change. Animation of certain of these changes obviously requires more powerful computer hardware than animation of movement changes.
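- A minimal sketch of such a path, assuming a straight line through RGB space (other color spaces could equally be used; the names are hypothetical):

```cpp
struct RGB { double r, g, b; };   // components in [0, 1]

// Color to display at step i (1..N) when animating a change from 'from'
// to 'to' along a straight-line path through RGB color space.
static RGB colorAtStep(RGB from, RGB to, int i, int N) {
    double t = static_cast<double>(i) / N;
    return { from.r + (to.r - from.r) * t,
             from.g + (to.g - from.g) * t,
             from.b + (to.b - from.b) * t };
}
```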
- the animation is integrated with the standard interactive graphic technique of dragging. After the user selects an object or group of objects, he can drag them dynamically to the desired location. During dragging, the objects "stick" to the pen or control device. Once the user terminates the drag operation (e.g., by lifting the pen), the objects stay in their new location. If the selection is freeform, then nothing further needs to be done. But if the selection is structured, then more has to be done to properly position the objects within the structure. This is where the animation comes in. Suppose the user intends to move the objects to location y. First, he drags the selection to a location y", which is near y (because it is very difficult to drag to an exact location).
- FIG. 24 shows the same word move as in FIG. 21.
- The user drags the word to near the intended destination (FIG. 24(b)).
- The system then animates the rest of the move operation by moving the word into its exact final location and simultaneously animating the contextual changes to the surrounding objects (FIGS. 24(c)-24(j)).
- A flow diagram illustrating one procedure for effecting animation of movement is shown in FIG. 25, wherein, at block 200, a determination is made of whether a valid structured selection has been made and whether a valid operation command has been received. Once both conditions are true, which would be at block 130 of FIG. 7 for freeform selections and at block 136 (FIG. 7) for structured selections, a determination is next made of the particular operation command received. If the operation command is a command to move objects, block 201, then, as illustrated at block 202, the system opens up a space at the intended destination of the selected objects, moves the objects into the space thus opened, and then closes the space from which the objects originated. These movements are made at a visual rate, which makes the changes visually apparent to the user.
- If the operation command is a command to close a space in the display, as determined at block 203, then the space is closed at a visual rate at block 204.
- If the operation command is a command to delete objects, as determined at block 205, then the objects are deleted and the space is closed, again at a visual rate, at block 206.
- The deletion step may be effected by a gradual dimming of the selected text. Similar operations may be effected for the other available operation commands, as discussed above.
- The timing of the animation is particularly important.
- The movement of the relevant portions of the information on the display screen must be slow enough for the changes to be visually apparent. Yet, if the movement is too slow, the animation becomes more annoying than helpful.
- The actual physical rate of movement that constitutes a visually apparent rate depends, of course, upon the relative positions of the observer and the display panel, and should be adjusted so that the time for the total movement of information elements in a given operation is preferably about one second or slightly less.
- A timing control function is therefore incorporated in the computer system.
- A four-step timing control algorithm is illustrated by the flowchart in FIG. 26.
- In the first step, the system determines which object or group of objects on the display must be manipulated in response to the operation command, such as a move, delete, or expanding/shrinking operation (in other words, the system determines on which object or group an operation must be performed), and what the final location, appearance, or other characteristic of each object or group should be. For example, if the user asks to move the fourth item in a list after the first item and before the second item, then the second and third items must be moved down to make space, and the fourth item must be moved up to where the second item started. The first item and every item below the fourth item are unaffected.
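- For the list example just given, the affected items can be determined with a sketch along these lines (hypothetical names; 0-based indices):

```cpp
#include <vector>

// Items affected when the item at index 'from' is moved so that it lands
// at index 'to' in a list: the moved item itself (the prime change) plus
// the items between the two positions, which shift by one (the contextual
// changes). All other items are unaffected.
static std::vector<int> itemsAffectedByMove(int from, int to) {
    std::vector<int> affected;
    affected.push_back(from);                          // the prime change
    if (from > to)                                     // moving up: [to, from) shift down
        for (int i = to; i < from; ++i) affected.push_back(i);
    else                                               // moving down: (from, to] shift up
        for (int i = from + 1; i <= to; ++i) affected.push_back(i);
    return affected;
}
// E.g., moving the fourth item (index 3) to just after the first (index 1)
// yields {3, 1, 2}: items 1 and 2 move down, item 3 moves up.
```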
- The second step of the algorithm, block 222, comprises determining the parameters for each change to be animated. Where the object or group of objects is to be moved, this comprises computing a trajectory for each object or group of objects to be moved from the object's or group's current location to its final location.
- The simplest trajectory is a straight line. However, one might choose to have the selection (or perhaps another interesting group) follow an arc to make the group's motion more conspicuous.
- The current location of each object or group is the significant location for this computation. The current location will be the original location in both the freeform and structured modes where the user has chosen to move the object or group of objects by making a selection and designating a destination y (freeform mode) or y" (structured mode).
- Where the user has instead dragged the selection, the group of objects comprising the selection has already been moved close to its final destination, so the current location, from which the trajectory is computed, is not the original location but rather wherever the user left the objects after dragging.
- The object will then move into place from its current location y" to y.
- For an expanding/shrinking operation, the parameters may be determined to show the selected object simply expanding or shrinking in size while the surrounding objects correspondingly decrease or increase in size.
- For a color change, the parameters selected include which colors should be displayed during the change and at what rate to pass through color space.
- Other editing changes would require the determination of their relevant parameters at this second step of the timing control function.
- The third step, illustrated at block 224, is to determine the number of iterations (or animation steps) N in which to perform the animation.
- N is a function of the user's preferred animation speed, which may be user-selectable, and the complexity of the animation. The slower the desired animation speed, the more iterations are required.
- A sufficiently complex animation (complex, for example, because of the number or sizes of the groups of objects) may be animated in fewer iterations, or may require that the quality of the displayed image be somewhat degraded, in order for the animation to be quick enough to achieve the desired speed.
- A preferred choice for the default animation speed results in an animation that takes approximately one second.
- The number of steps required to complete an operation varies in dependence upon the number of changes which must be made and the operation to be performed.
- The animation speed should preferably be selected so that N is at least 10.
- In some cases, however, the animation may be performed in fewer than 10 iterations (i.e., N < 10). If a very small change is made, the animation might require only a single step.
- To select an appropriate value for N, one must conduct empirical tests to determine the amount of time it takes a particular computer processor and program to redisplay changes in the displayed material.
- The redisplay time is usually a function of the amount of material being redisplayed, as measured, for example, by the area being redisplayed.
- Let D be the empirically determined time (in seconds) to redisplay a unit of the display, let A be the area to be redisplayed, and let T be the total animation time (in seconds) desired. (Normally, T is one second, but the user can be provided with a user-interface control to vary this time.)
- The number of animation steps is then N = T/(A·D).
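- As a sketch, assuming the quantities above have already been measured (the clamp to a minimum of one step is an added assumption):

```cpp
#include <algorithm>
#include <cmath>

// Number of animation steps N = T / (A * D), where T is the desired total
// animation time (seconds), A the area to be redisplayed, and D the
// empirically determined redisplay time per unit area (seconds).
static int computeAnimationSteps(double T, double A, double D) {
    int n = static_cast<int>(std::floor(T / (A * D)));
    return std::max(n, 1);   // at least one step for very small changes
}
```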
- An alternative, more sophisticated method of determining an appropriate value for N is to explicitly time the steps of the animation so that it takes exactly T seconds.
- Set the number of animation steps N to be at least 10; this method works when the processor is fast enough that each redisplay takes less than T/N seconds.
- An internal clock is used to pause the program so that the steps occur at exactly T/N-second intervals.
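- A sketch of such a clock-paced loop, using the standard C++ clock (the per-step redisplay is abstracted as a callable; the names are assumptions):

```cpp
#include <chrono>
#include <thread>

// Run an N-step animation so that the steps land at T/N-second intervals:
// perform each step's redisplay, then sleep until the next deadline.
template <typename DrawStep>
void runTimedAnimation(int N, double T, DrawStep drawStep) {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    const std::chrono::duration<double> interval(T / N);
    for (int i = 1; i <= N; ++i) {
        drawStep(i, N);   // erase and redisplay objects 1/N further along
        std::this_thread::sleep_until(start + i * interval);   // pace the step
    }
}
```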
- In the fourth step, during each iteration, each object or group of objects on the display is changed by 1/N of the total change set by the parameters determined in the second step of the timing control function.
- The system pauses between iterations as needed so that each iteration takes 1/N of the desired animation time.
- Each iteration results in erasing the manipulated objects and redisplaying them at their new positions, with a new appearance and/or other new characteristics, advanced an additional 1/N of the specified total change.
- All changes, including prime and contextual changes, may be animated simultaneously, with each object to be manipulated being changed in N iterations and each step being animated as 1/N of the total change.
- In a move operation, each object or group moves 1/N of the distance along the trajectory computed in the second step on each of the N iterations until the object or group reaches its destination. Double buffering is used during the move to avoid flicker on the display device.
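- A minimal double-buffering sketch (the pixel buffer and the final copy to the device are stand-ins for whatever the real display interface provides):

```cpp
#include <cstdint>
#include <vector>

// Each frame is composed in an off-screen buffer and then copied to the
// visible buffer in a single pass, so the viewer never sees a frame that
// is only partially erased or redrawn.
struct FrameBuffer {
    int width, height;
    std::vector<std::uint32_t> pixels;
    FrameBuffer(int w, int h) : width(w), height(h), pixels(w * h, 0u) {}
};

static void renderStep(FrameBuffer& back, int step, int N) {
    // ... erase and redraw each moved object 1/N further along its path ...
    (void)back; (void)step; (void)N;
}

static void animateWithDoubleBuffering(FrameBuffer& front, int N) {
    FrameBuffer back(front.width, front.height);   // off-screen buffer
    for (int step = 1; step <= N; ++step) {
        renderStep(back, step, N);    // compose the whole frame off-screen
        front.pixels = back.pixels;   // one copy to the "screen": no flicker
        // presentToDisplay(front);   // hypothetical device update
    }
}
```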
- The system can also be programmed to animate changes at a variable rate, as described above, in order to make the animation appear more natural.
- An algorithm for variable-rate animation of movement of objects is provided in Appendix B. Using this algorithm, the objects to be moved are animated as moving to their destinations in N steps of not necessarily equal distance over a given length of time. During each successive step until the objects approach their destinations, the objects move a distance which is progressively larger than the distance moved in the previous step, so that the objects are animated as constantly accelerating until reaching a top speed as they approach their destinations.
- Thereafter, the objects are animated as constantly decelerating, moving a distance which is progressively smaller in each successive step until the objects finally reach their destinations.
- The acceleration and deceleration need not be constant, although constant rates are presently preferred and are embodied in the algorithm provided.
- The programming for carrying out the animation may be stored in a computer hardware memory storage device, such as a read-only memory device like program ROM 26 (FIG. 1), or may be included in a software product having the requisite programming and stored in a data storage device such as a hard drive. Instructions provided in the read-only memory or software product are performed by a processing unit in the system.
- Sample source code, in the C++ programming language, for programming a computer to carry out the timing control function described above is attached as Appendix C.
- The animation feature of the invention is not limited to the specific techniques discussed above, and may be employed, in accordance with the invention, in combination with any operation on the display screen in which such animation will enhance the observer's appreciation of the techniques of controlling a display-oriented graphical editing system.
APPENDIX A

```
LineGrouping: PROCEDURE [Coord y] RETURNS [ObjectList]
  ObjectList strokesOnLine = EMPTYLIST;
  Coord lineTop, lineBottom;
  [lineTop, lineBottom] = FindLineBoundaries(y);
  FOR object IN GetStrokesOnScreen() DO
    IF ResidesOnLine(object, lineTop, lineBottom) THEN
      AddToList(strokesOnLine, object);
  RETURN strokesOnLine;

FindLineBoundaries: PROCEDURE [Coord y] RETURNS [Coord, Coord]
  InkDensity density = YDensity(LineBoundariesThreshold);
  Coord lineTop = GoUpUntilDensityIsZero(y, density);
  Coord lineBottom = GoDownUntilDensityIsZero(y, density);
  RETURN [lineTop, lineBottom];

FindBaseline: PROCEDURE [ObjectList strokesOnLine] RETURNS [Coord]
  InkDensity density = YDensity(BaselineThreshold);
  Coord yMid = YMidpoint(BoundingBox(strokesOnLine));
  RETURN GoDownUntilDensityIsZero(yMid, density);

YDensity: PROCEDURE [float threshold] RETURNS [InkDensity]
  InkDensity density;
  FillWithZeros(density);
  float dy = ScreenHeight / NStrips;
  -- Compute raw stroke density
  FOR object IN GetStrokesOnScreen() DO
    FOR i IN [0..NStrips] DO
      IF ObjectContainsInterval(object, i*dy, (i+1)*dy) THEN
        density[i] = density[i] + 1;
  -- Now normalize
  float cutOff = Average(density) * threshold;
  FOR i IN [0..NStrips] DO
    density[i] = IF density[i] > cutOff THEN 1 ELSE 0;
  RETURN density;
```
APPENDIX B

```cpp
// moveCoefficient: s-curve pacing for a variable-rate move.
// iMax is the number of steps in a move.
// returnVal is the percentage along the total movement that should be
// reached at the i-th step; i in [0, iMax-1], returnVal in [0, 1].
float Animator::moveCoefficient(int i, int iMax)
{
    // if i is iMax - 1, return 1.0 to prevent roundoff errors
    if (i >= (iMax - 1)) return (float) 1;

    // fit on s-shaped curve; convert to floats
    float t = (float) i, tMax = (float) iMax;
    float returnVal;
    float tMaxHalf = tMax / 2;
    if (t < tMaxHalf) {
        float u = t / tMaxHalf;                     // u at most 1
        returnVal = u * u / 2;                      // returnVal at most 0.5
    } else {  // (t >= tMaxHalf) && (t < tMax)
        float u = (t - tMaxHalf) / tMaxHalf;
        float w = (float) 1.0 - u;                  // between 0 and 1
        returnVal = (1 - w * w) / 2 + (float) 0.5;  // returnVal at most 1.0
    }
    // just in case, clamp to [0, 1]
    returnVal = Max(returnVal, (float) 0);
    returnVal = Min(returnVal, (float) 1);
    return returnVal;
}
```
Appendix C

```cpp
class MoveSpec {
public:
    MoveSpec(ObjectList *objs = NULL, Point *off = NULL, Boolean isc = False)
        : objects(objs), offset(off), isCopy(isc) { }
    ObjectList *objects;
    Point *offset;
    Boolean isCopy;
};

class CompoundMoveSpec {
public:
    CompoundMoveSpec(MoveSpec *mvs, int n) : specs(mvs), nspecs(n) { }
    MoveSpec *specs;
    int nspecs;
};

void DisplayManager::compoundMove(Device *device, Slide *newSlide,
                                  CompoundMoveSpec *mspec,
                                  ObjectList *newObjectVec,
                                  ObjectList *newSelectors)
{
    int nmoves = mspec->nspecs;
    Sheet **flts = new Sheet*[nmoves];
    int i;
    SlideMap *newMap = newSlide->getSlideMap();
    SelSheet *selection = findSelection(device);
    int firstFlt = 0;
    Boolean doArcs = False, arcX = False;
    Coord *arcSteps = NULL;

    for (i = 0; i < nmoves; i++) {
        MoveSpec &spec = mspec->specs[i];
        ObjectList *objects = spec.objects;
        Boolean nextIsOriginal = (i < nmoves-1 &&
                                  mspec->specs[i+1].objects == objects);
        Sheet *flt;
        Sheet *fromSheet;
        if (objects == NULL) {  // Designates device's selection
            if (!selection)
                fatal_error("DM: compoundMove selection spec with no selection");
            fromSheet = selection->parent;
            if (spec.isCopy && !nextIsOriginal)
                fromSheet->drop(selection, True);  // Leave behind the original
            flt = selection;
        } else if (objects->size() > 0 && newObjectVec[i].size() > 0) {
            // Need to make a floater
            if (selection && selection->objects.intersects(objects)) {
                // lift it off selection
                if (!selection->objects.includes(objects))
                    ErrorPrint("DM: attempt to move objects overlapping the selection\n");
                fromSheet = selection;
            } else
                fromSheet = (*objects)[0]->getSlide()->getSlideMap();
            flt = fromSheet->makeFloater(objects);
            fromSheet->lift(flt, (spec.isCopy && !nextIsOriginal));
        } else {
            /* Null object list, or null replacement list. The former happens
               if the partitioning of objects came up empty for part of a
               compound move; the latter if it turns out that a zero move was
               computed for some object set. */
            flts[i] = NULL;
            if (i == firstFlt) firstFlt++;
            continue;
        }
        flts[i] = flt;
        if (nextIsOriginal) {  // Original wants to move, too
            Sheet *cop = flt->copy();
            flt->parent->addFloater(cop);
            flts[++i] = cop;
        } else if (fromSheet->slide != newSlide) {
            // Changing slides. Note that this is not allowed
            // if we're both copying and moving the same set of objects.
            flt->detachFromSlide(spec.isCopy);
            newMap->addFloater(flt);
            // Note: newSlide is temporarily in an illegal state, with a floater
            // with old objects, none of which are in its object list
            spec.isCopy = True;  // Fakery: contentsMoved will add new objects to new parent
        }
    }
    if (firstFlt == nmoves) {
        ErrorPrint("compoundMove: %d moves, but no objects!\n", nmoves);
        return;
    }
    if (newSlide == currentSlide) {
        // Okay, there's something to watch, so move 'em around
        Coord *xSteps = new Coord[nmoves], *ySteps = new Coord[nmoves];
        // Compute how much to move each floater per animation step.
        // Approximate the cost of the move by the bounding box of all floaters
        Rectangle bounds;
        for (i = firstFlt; i < nmoves; i++)
            if (flts[i])
                bounds.extendToInclude(flts[i]->boundingBox());
        int nsteps = computeAnimationSpeed(bounds.area());
        for (i = firstFlt; i < nmoves; i++)
            if (flts[i]) {
                Point *off = mspec->specs[i].offset;
                Point &done = flts[i]->displaced;
                Coord dispX = off->getX() - done.getX(),
                      dispY = off->getY() - done.getY();
                xSteps[i] = dispX / nsteps;
                ySteps[i] = dispY / nsteps;
                if (useArcedAnimation && flts[i] == selection && nsteps > 1) {
                    // maybe arc this trajectory
                    if (AbsCoord(dispX) < arcEpsilon) {
                        if (AbsCoord(dispY) > 10*AbsCoord(dispX))
                            arcX = doArcs = True;
                    } else if (AbsCoord(dispY) < arcEpsilon &&
                               AbsCoord(dispX) > 10*AbsCoord(dispY))
                        doArcs = True;
                    if (doArcs) {
                        Coord len = AbsCoord(arcX ? dispY : dispX);
                        Coord step = len / nsteps;
                        Coord height = len*arcHeightWidthFactor;
                        // Radius r is such that (len/2)^2 + (r-height)^2 = r^2
                        //   => L^2/4 + R^2 - 2HR + H^2 = R^2
                        //   => L^2/4 + H^2 = 2HR
                        Coord radius = (len*len/4 + height*height)/(2*height);
                        Coord rSquared = radius*radius;
                        arcSteps = new Coord[nsteps];
                        // X arcs go right (positive). Y arcs go up (negative)
                        float sign = arcX ? 1 : -1;
                        int midIndex = nsteps/2 - 1;  // note nsteps is even, so there is a midpoint
                        int j;
                        for (j = 0; j <= midIndex; j++) {
                            // x^2 + y^2 = r^2, so x = sqrt(r^2 - y^2).
                            // The arc is symmetric, so we can fill in both sides at once
                            Coord y = (midIndex - j) * step;
                            arcSteps[j] = arcSteps[nsteps - j - 2] =
                                sqrt(rSquared - y*y) * sign;
                        }
                        // Those were absolute. For the iteration, we need relative moves
                        Coord initial = (radius - height) * sign;
                        Coord prev = initial;
                        for (j = 0; j < nsteps-1; j++) {
                            Coord y = arcSteps[j];
                            arcSteps[j] = y - prev;
                            prev = y;
                        }
                        // Final step takes us back down, and adjusts to final location
                        arcSteps[nsteps-1] = initial - prev + (arcX ? dispX : dispY);
                    }
                }
            }
        // Now move them
        for (int s = 0; s < nsteps; s++) {
            Rectangle rect;
            for (i = firstFlt; i < nmoves; i++)
                if (flts[i]) {
                    rect.extendToInclude(flts[i]->boundingBox());  // before . . .
                    if (doArcs && flts[i] == selection) {
                        if (arcX)
                            flts[i]->displace(arcSteps[s], ySteps[i]);
                        else
                            flts[i]->displace(xSteps[i], arcSteps[s]);
                    } else
                        flts[i]->displace(xSteps[i], ySteps[i]);
                    rect.extendToInclude(flts[i]->boundingBox());  // . . . and after
                }
            updateRectangleOnDisplay(newMap, &rect);
            animationSyncStep();
        }
        delete [] xSteps;
        delete [] ySteps;
        if (doArcs) delete [] arcSteps;
    }
    // Finally, adjust the objects
    for (i = firstFlt; i < nmoves; i++) {
        Sheet *flt = flts[i];
        if (flt) {
            MoveSpec &spec = mspec->specs[i];
            if (flt == selection)
                selection->contentsMoved(spec.offset, &newObjectVec[i],
                                         newSelectors, spec.isCopy);
            else {
                flt->contentsMoved(spec.offset, &newObjectVec[i], spec.isCopy);
                flt->parent->drop(flt, False);
                delete flts[i];
            }
        }
    }
    delete [] flts;
}

Boolean DisplayManager::moveFloater(Sheet *flt, Slide *newSlide,
                                    const Point *offset, Boolean isCopy)
{
    // Caller must call contentsMoved after this, since we'll only
    // move the floater if it's on the current slide.
    Boolean changedSlides = newSlide != flt->slide;
    if (changedSlides) {
        flt->detachFromSlide(isCopy);
        newSlide->getSlideMap()->addFloater(flt);
        // Note: newSlide is temporarily in an illegal state, with a floater
        // with old objects, none of which are in its object list.
        // Caller's contentsMoved will add new objects to new parent
    }
    if (newSlide == currentSlide) {  // There's some visible action
        Coord dx = offset->getX() - flt->displaced.getX();
        Coord dy = offset->getY() - flt->displaced.getY();
        Coord maxCoord = Max(AbsCoord(dx), AbsCoord(dy));
        int nsteps = Min(computeAnimationSpeed(flt->boundingBox()->area()),
                         Round(maxCoord/minAnimationIncrement));
        Coord stepX = dx / nsteps;
        Coord stepY = dy / nsteps;
        while (nsteps > 0) {
            dragFloater(flt, stepX, stepY);
            animationSyncStep();
            nsteps--;
        }
    }
    return changedSlides;
}
```
Claims (34)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/976,907 US5880743A (en) | 1995-01-24 | 1997-11-24 | Apparatus and method for implementing visual animation illustrating results of interactive editing operations |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US37755095A | 1995-01-24 | 1995-01-24 | |
US08/976,907 US5880743A (en) | 1995-01-24 | 1997-11-24 | Apparatus and method for implementing visual animation illustrating results of interactive editing operations |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US37755095A Continuation | 1995-01-24 | 1995-01-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
US5880743A true US5880743A (en) | 1999-03-09 |
Family
ID=23489569
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/976,907 Expired - Lifetime US5880743A (en) | 1995-01-24 | 1997-11-24 | Apparatus and method for implementing visual animation illustrating results of interactive editing operations |
Country Status (1)
Country | Link |
---|---|
US (1) | US5880743A (en) |
Cited By (162)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5969721A (en) * | 1997-06-03 | 1999-10-19 | At&T Corp. | System and apparatus for customizing a computer animation wireframe |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US6340981B1 (en) * | 1997-06-30 | 2002-01-22 | Sun Microsystems, Inc. | Method and apparatus for stroke substitution |
US6377259B2 (en) * | 1998-07-29 | 2002-04-23 | Inxight Software, Inc. | Presenting node-link structures with modification |
US6377288B1 (en) * | 1998-01-12 | 2002-04-23 | Xerox Corporation | Domain objects having computed attribute values for use in a freeform graphics system |
US6459442B1 (en) | 1999-09-10 | 2002-10-01 | Xerox Corporation | System for applying application behaviors to freeform data |
US6536033B1 (en) * | 1999-01-28 | 2003-03-18 | International Business Machines Corp. | Uniform mechanism for building containment hierarchies |
US20030185444A1 (en) * | 2002-01-10 | 2003-10-02 | Tadashi Honda | Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing |
WO2003090095A1 (en) * | 2002-04-15 | 2003-10-30 | Parascript Llc | Insertion of space in a geometric document |
US6681372B2 (en) * | 1995-11-10 | 2004-01-20 | Mantaro Yajima | Information processing apparatus and method for making document |
US6704907B1 (en) * | 1997-03-24 | 2004-03-09 | Nippon Telegraph And Telephone Corporation | Digital contents editing method and apparatus thereof |
US6720979B1 (en) | 1999-07-15 | 2004-04-13 | International Business Machines Corporation | Dynamic manipulation of animated graphics in a web browser |
US20040119762A1 (en) * | 2002-12-24 | 2004-06-24 | Fuji Xerox Co., Ltd. | Systems and methods for freeform pasting |
US20040240739A1 (en) * | 2003-05-30 | 2004-12-02 | Lu Chang | Pen gesture-based user interface |
KR20050007739A (en) * | 2003-07-11 | 2005-01-21 | 엘지전자 주식회사 | Method for setting time interval between frames of animated gif |
US20050111736A1 (en) * | 2002-02-08 | 2005-05-26 | Microsoft Corporation | Ink gestures |
US20050179648A1 (en) * | 2004-02-18 | 2005-08-18 | Microsoft Corporation | Tapping to create writing |
US6934707B1 (en) * | 1999-07-22 | 2005-08-23 | International Business Machines Corporation | Network transmission of pages in linkable markup language to receiving display stations with currently displayed pages controlled by tags in succeeding pages to produce aesthetically pleasing transitions between pages |
US20050206627A1 (en) * | 2004-03-19 | 2005-09-22 | Microsoft Corporation | Automatic height adjustment for electronic highlighter pens and mousing devices |
US6952803B1 (en) * | 1998-12-29 | 2005-10-04 | Xerox Corporation | Method and system for transcribing and editing using a structured freeform editor |
US6958749B1 (en) * | 1999-11-04 | 2005-10-25 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
US20050275622A1 (en) * | 2004-06-14 | 2005-12-15 | Patel Himesh G | Computer-implemented system and method for defining graphics primitives |
US20060101354A1 (en) * | 2004-10-20 | 2006-05-11 | Nintendo Co., Ltd. | Gesture inputs for a portable display device |
US20060233464A1 (en) * | 2002-06-28 | 2006-10-19 | Microsoft Corporation | Method and system for displaying and linking ink objects with recognized text and objects |
US20060253793A1 (en) * | 2005-05-04 | 2006-11-09 | International Business Machines Corporation | System and method for issuing commands based on pen motions on a graphical keyboard |
US7164423B1 (en) * | 2003-04-30 | 2007-01-16 | Apple Computer, Inc. | Method and apparatus for providing an animated representation of a reorder operation |
US20070057930A1 (en) * | 2002-07-30 | 2007-03-15 | Microsoft Corporation | Freeform Encounter Selection Tool |
US20070065013A1 (en) * | 2002-01-25 | 2007-03-22 | Xerox Corporation | Method and apparatus to convert digital ink images for use in a structured text/graphics editor |
US20070109281A1 (en) * | 2005-11-14 | 2007-05-17 | Microsoft Corporation | Free form wiper |
US20070115264A1 (en) * | 2005-11-21 | 2007-05-24 | Kun Yu | Gesture based document editor |
US20070124503A1 (en) * | 2005-10-31 | 2007-05-31 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US20070191028A1 (en) * | 2006-02-14 | 2007-08-16 | Microsoft Corporation | Dynamic interconnection of mobile devices |
US7259752B1 (en) * | 2002-06-28 | 2007-08-21 | Microsoft Corporation | Method and system for editing electronic ink |
US20070257914A1 (en) * | 2004-03-31 | 2007-11-08 | Hidenori Komatsumoto | Image Processing Device, Image Processing Method, And Information Storage Medium |
US7302641B1 (en) * | 1999-12-03 | 2007-11-27 | Mantaro Yajima | Information processing method and apparatus for making document |
US20070273666A1 (en) * | 2006-05-24 | 2007-11-29 | Sang Hyun Shin | Touch screen device and operating method thereof |
US20070273673A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and operating method thereof |
US20070277126A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and method of selecting files thereon |
US20070273663A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and operating method thereof |
US20070277125A1 (en) * | 2006-05-24 | 2007-11-29 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070300142A1 (en) * | 2005-04-01 | 2007-12-27 | King Martin T | Contextual dynamic advertising based upon captured rendered text |
US20080165147A1 (en) * | 2007-01-07 | 2008-07-10 | Greg Christie | Portable Multifunction Device, Method, and Graphical User Interface for Displaying User Interface Objects Adaptively |
US7500190B1 (en) * | 2005-04-13 | 2009-03-03 | Apple Inc. | Visual feedback to illustrate effects of editing operations |
US20090085933A1 (en) * | 2007-09-30 | 2009-04-02 | Htc Corporation | Image processing method |
US20090124282A1 (en) * | 2007-11-08 | 2009-05-14 | Ki-Uk Kim | Apparatus and method for human body communication in a mobile communication system |
US20090144644A1 (en) * | 2004-06-25 | 2009-06-04 | Chaudhri Imran A | Web View Layer For Accessing User Interface Elements |
US20090164889A1 (en) * | 2007-12-21 | 2009-06-25 | Kurt Piersol | Persistent selection marks |
US20090216722A1 (en) * | 2008-02-22 | 2009-08-27 | Samsung Electronics Co., Ltd. | Method and apparatus for querying digital records |
US20090213086A1 (en) * | 2006-04-19 | 2009-08-27 | Ji Suk Chae | Touch screen device and operating method thereof |
US20100017872A1 (en) * | 2002-12-10 | 2010-01-21 | Neonode Technologies | User interface for mobile computer unit |
US7702624B2 (en) | 2004-02-15 | 2010-04-20 | Exbiblio, B.V. | Processing techniques for visual capture data from a rendered document |
US7721226B2 (en) | 2004-02-18 | 2010-05-18 | Microsoft Corporation | Glom widget |
US20100125787A1 (en) * | 2008-11-20 | 2010-05-20 | Canon Kabushiki Kaisha | Information processing apparatus, processing method thereof, and computer-readable storage medium |
US7751623B1 (en) | 2002-06-28 | 2010-07-06 | Microsoft Corporation | Writing guide for a free-form document editor |
US7812860B2 (en) | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US20110043652A1 (en) * | 2009-03-12 | 2011-02-24 | King Martin T | Automatically providing content associated with captured information, such as information captured in real-time |
US20110066978A1 (en) * | 2009-09-11 | 2011-03-17 | Compal Electronics, Inc. | Electronic apparatus and touch menu control method thereof |
US20110069016A1 (en) * | 2009-09-22 | 2011-03-24 | Victor B Michael | Device, Method, and Graphical User Interface for Manipulating User Interface Objects |
US20110074710A1 (en) * | 2009-09-25 | 2011-03-31 | Christopher Douglas Weeldreyer | Device, Method, and Graphical User Interface for Manipulating User Interface Objects |
US20110078622A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Moving a Calendar Entry in a Calendar Application |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
US20110081083A1 (en) * | 2009-10-07 | 2011-04-07 | Google Inc. | Gesture-based selective text recognition |
US7930648B1 (en) | 2006-10-10 | 2011-04-19 | Adobe Systems Incorporated | Expanded stack view |
US20110099476A1 (en) * | 2009-10-23 | 2011-04-28 | Microsoft Corporation | Decorating a display environment |
US7945863B1 (en) * | 2005-07-05 | 2011-05-17 | Adobe Systems Incorporated | Localized exploded view |
US20110123115A1 (en) * | 2009-11-25 | 2011-05-26 | Google Inc. | On-Screen Guideline-Based Selective Text Recognition |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US20110202860A1 (en) * | 2010-02-12 | 2011-08-18 | Esobi Inc. | Method for displaying displacement of object on display of electronic device |
US20110273477A1 (en) * | 2007-08-21 | 2011-11-10 | Volkswagen Ag | Method for displaying information in a motor vehicle with a variable scale and display device |
US8081849B2 (en) | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US20120023426A1 (en) * | 2010-07-22 | 2012-01-26 | Mediatek Inc. | Apparatuses and Methods for Position Adjustment of Widget Presentations |
US20120081404A1 (en) * | 2010-10-01 | 2012-04-05 | International Business Machines Corporation | Simulating animation during slideshow |
US20120092269A1 (en) * | 2010-10-15 | 2012-04-19 | Hon Hai Precision Industry Co., Ltd. | Computer-implemented method for manipulating onscreen data |
US20120092268A1 (en) * | 2010-10-15 | 2012-04-19 | Hon Hai Precision Industry Co., Ltd. | Computer-implemented method for manipulating onscreen data |
US8179563B2 (en) | 2004-08-23 | 2012-05-15 | Google Inc. | Portable scanning device |
US20120167017A1 (en) * | 2010-12-27 | 2012-06-28 | Sling Media Inc. | Systems and methods for adaptive gesture recognition |
US20120216150A1 (en) * | 2011-02-18 | 2012-08-23 | Business Objects Software Ltd. | System and method for manipulating objects in a graphical user interface |
US20120216152A1 (en) * | 2011-02-23 | 2012-08-23 | Google Inc. | Touch gestures for remote control operations |
US8261094B2 (en) | 2004-04-19 | 2012-09-04 | Google Inc. | Secure data gathering from rendered documents |
US8302020B2 (en) | 2004-06-25 | 2012-10-30 | Apple Inc. | Widget authoring and editing environment |
US20120299926A1 (en) * | 2011-05-23 | 2012-11-29 | Microsoft Corporation | Adaptive timeline views of data |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US8418055B2 (en) | 2009-02-18 | 2013-04-09 | Google Inc. | Identifying a document by performing spectral analysis on the contents of the document |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
US8489624B2 (en) | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8505090B2 (en) | 2004-04-01 | 2013-08-06 | Google Inc. | Archive of text captures from rendered documents |
US8600196B2 (en) | 2006-09-08 | 2013-12-03 | Google Inc. | Optical scanners, such as hand-held optical scanners |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US20130339850A1 (en) * | 2012-06-15 | 2013-12-19 | Muzik LLC | Interactive input device |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US20140059496A1 (en) * | 2012-08-23 | 2014-02-27 | Oracle International Corporation | Unified mobile approvals application including card display |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US8677271B2 (en) | 2007-08-21 | 2014-03-18 | Volkswagen Ag | Method for displaying information in a motor vehicle and display device for a motor vehicle |
US8713418B2 (en) | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US8781228B2 (en) | 2004-04-01 | 2014-07-15 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20140223382A1 (en) * | 2013-02-01 | 2014-08-07 | Barnesandnoble.Com Llc | Z-shaped gesture for touch sensitive ui undo, delete, and clear functions |
US20140300609A1 (en) * | 2013-04-04 | 2014-10-09 | Diotek Co., Ltd. | Device and method for editing ink text data |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
US20140325410A1 (en) * | 2013-04-26 | 2014-10-30 | Samsung Electronics Co., Ltd. | User terminal device and controlling method thereof |
US8892495B2 (en) | 1991-12-23 | 2014-11-18 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US9008447B2 (en) | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
US9021402B1 (en) | 2010-09-24 | 2015-04-28 | Google Inc. | Operation of mobile device interface using gestures |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
EP2930605A1 (en) * | 2014-04-08 | 2015-10-14 | Fujitsu Limited | Information processing apparatus and information processing program |
US9244984B2 (en) | 2011-03-31 | 2016-01-26 | Microsoft Technology Licensing, Llc | Location based conversational understanding |
US20160030134A1 (en) * | 2014-07-31 | 2016-02-04 | Restoration Robotics, Inc. | Robotic Hair Transplantation System with Touchscreen Interface for Controlling Movement of Tool |
US9268852B2 (en) | 2004-02-15 | 2016-02-23 | Google Inc. | Search engines and systems with handheld document data capture devices |
US9298287B2 (en) | 2011-03-31 | 2016-03-29 | Microsoft Technology Licensing, Llc | Combined activation for natural user interface systems |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
US9323447B2 (en) | 2013-10-15 | 2016-04-26 | Sharp Laboratories Of America, Inc. | Electronic whiteboard and touch screen method for configuring and applying metadata tags thereon |
US20160147723A1 (en) * | 2014-11-25 | 2016-05-26 | Samsung Electronics Co., Ltd. | Method and device for amending handwritten characters |
US20160196247A1 (en) * | 2011-12-20 | 2016-07-07 | Apple Inc. | Collaborative document editing |
US9417888B2 (en) | 2005-11-18 | 2016-08-16 | Apple Inc. | Management of user interface elements in a display environment |
US20160247040A1 (en) * | 2014-12-02 | 2016-08-25 | Myscript | System and method for recognizing geometric shapes |
US9454962B2 (en) | 2011-05-12 | 2016-09-27 | Microsoft Technology Licensing, Llc | Sentence simplification for spoken language understanding |
US9483164B2 (en) | 2007-07-18 | 2016-11-01 | Apple Inc. | User-centric widgets and dashboards |
US9513930B2 (en) | 2005-10-27 | 2016-12-06 | Apple Inc. | Workflow widgets |
US9535563B2 (en) | 1999-02-01 | 2017-01-03 | Blanding Hovenweep, Llc | Internet appliance system and method |
US20170039809A1 (en) * | 2005-04-27 | 2017-02-09 | Universal Entertainment Corporation (nee Aruze Corporation) | Gaming Machine |
US9665259B2 (en) | 2013-07-12 | 2017-05-30 | Microsoft Technology Licensing, Llc | Interactive digital displays |
US9760566B2 (en) | 2011-03-31 | 2017-09-12 | Microsoft Technology Licensing, Llc | Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof |
US9778794B2 (en) | 2001-11-02 | 2017-10-03 | Neonode Inc. | Light-based touch screen |
US20170300221A1 (en) * | 2010-06-10 | 2017-10-19 | Microsoft Technology Licensing, Llc | Erase, Circle, Prioritize and Application Tray Gestures |
US9842168B2 (en) | 2011-03-31 | 2017-12-12 | Microsoft Technology Licensing, Llc | Task driven user intents |
US9858343B2 (en) | 2011-03-31 | 2018-01-02 | Microsoft Technology Licensing Llc | Personalization of queries, conversations, and searches |
US10061843B2 (en) | 2011-05-12 | 2018-08-28 | Microsoft Technology Licensing, Llc | Translating natural language utterances to keyword search queries |
US10088921B2 (en) | 2014-10-10 | 2018-10-02 | Muzik Inc. | Devices for sharing user interactions |
EP3644171A1 (en) * | 2009-03-16 | 2020-04-29 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US10642934B2 (en) | 2011-03-31 | 2020-05-05 | Microsoft Technology Licensing, Llc | Augmented conversational understanding architecture |
US10713304B2 (en) * | 2016-01-26 | 2020-07-14 | International Business Machines Corporation | Entity arrangement by shape input |
US10809865B2 (en) | 2013-01-15 | 2020-10-20 | Microsoft Technology Licensing, Llc | Engaging presentation through freeform sketching |
US11061489B2 (en) | 2019-05-10 | 2021-07-13 | Topoleg, Inc. | Automating and reducing user input required for user session on writing and/or drawing system |
US11074408B2 (en) | 2019-06-01 | 2021-07-27 | Apple Inc. | Mail application features |
US20210271374A1 (en) * | 2007-09-04 | 2021-09-02 | Apple Inc. | Editing interface |
USD942470S1 (en) * | 2020-06-21 | 2022-02-01 | Apple Inc. | Display or portion thereof with animated graphical user interface |
US11257396B2 (en) * | 2020-03-18 | 2022-02-22 | Sas Institute Inc. | User interfaces for converting geospatial data into audio outputs |
US11379113B2 (en) | 2019-06-01 | 2022-07-05 | Apple Inc. | Techniques for selecting text |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
US11429259B2 (en) * | 2019-05-10 | 2022-08-30 | Myscript | System and method for selecting and editing handwriting input elements |
US11460973B1 | 2022-04-11 | 2022-10-04 | Sas Institute Inc. | User interfaces for converting node-link data into audio outputs |
US11650713B2 (en) | 2005-12-30 | 2023-05-16 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US11669210B2 (en) | 2020-09-30 | 2023-06-06 | Neonode Inc. | Optical touch sensor |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11733656B2 (en) | 2016-06-11 | 2023-08-22 | Apple Inc. | Configuring context-specific user interfaces |
US11736602B2 (en) | 2006-09-06 | 2023-08-22 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
WO2023170315A1 (en) * | 2022-03-11 | 2023-09-14 | Myscript | Merging text blocks |
WO2023170314A1 (en) * | 2022-03-11 | 2023-09-14 | Myscript | Creating text block sections |
US20230315271A1 (en) * | 2022-03-18 | 2023-10-05 | Sony Group Corporation | Collaborative whiteboard for meetings |
US11809700B2 (en) | 2010-04-07 | 2023-11-07 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
US11972104B2 (en) | 2009-09-22 | 2024-04-30 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5442742A (en) * | 1990-12-21 | 1995-08-15 | Apple Computer, Inc. | Method and apparatus for the manipulation of text on a computer display screen |
US5416900A (en) * | 1991-04-25 | 1995-05-16 | Lotus Development Corporation | Presentation manager |
EP0550244A2 (en) * | 1991-12-30 | 1993-07-07 | Xerox Corporation | Avoiding oscillation in interactive animation |
US5544295A (en) * | 1992-05-27 | 1996-08-06 | Apple Computer, Inc. | Method and apparatus for indicating a change in status of an object and its disposition using animation |
US5404442A (en) * | 1992-11-30 | 1995-04-04 | Apple Computer, Inc. | Visible clipboard for graphical computer environments |
Non-Patent Citations (8)
Title |
---|
"Issues in Combining Marking and Direct Manipulation Techniques" by Gordon Kurtenbach and William Buxton, Nov. 11-13, 1991, UIST '91, pp. 137-144. * |
After Effects User Manual, Adobe Systems Incorporated, 1994, pp. 126, 230-244. * |
aha! InkWriter at a Glance--Quick Reference Guide, aha! software corporation (1993). * |
"Animation: From Cartoons to the User Interface", Bay-Wei Chang and David Ungar, UIST '93: User Interface Software and Technology, Atlanta, GA, Nov. 3-5, 1993, pp. 45-55. * |
Cowart, Mastering Windows 3.1, SYBEX, 1993, pp. 421-428. * |
"Liveboard: A Large Interactive Display Supporting Group Meetings, Presentations and Remote Collaboration", Scott Elrod, et al., Proceedings of CHI '92, the ACM Conference on Human Factors in Computing Systems, May 3-7, 1992, pp. 599-607. * |
Computer Graphics: Principles and Practice, Second Edition, James D. Foley, Andries van Dam, Steven K. Feiner and John F. Hughes, Chapter 21 (Animation), 1990. * |
Smith et al., "Designing the Star User Interface", BYTE, vol. 7, No. 4, Apr. 1982, pp. 242-282. * |
Cited By (305)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8892495B2 (en) | 1991-12-23 | 2014-11-18 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore |
US6681372B2 (en) * | 1995-11-10 | 2004-01-20 | Mantaro Yajima | Information processing apparatus and method for making document |
US20060271502A1 (en) * | 1995-11-10 | 2006-11-30 | Mantaro Yajima | Information processing apparatus and method for making document |
US6704907B1 (en) * | 1997-03-24 | 2004-03-09 | Nippon Telegraph And Telephone Corporation | Digital contents editing method and apparatus thereof |
US7956863B2 (en) | 1997-06-03 | 2011-06-07 | At&T Intellectual Property Ii, L.P. | Computer readable medium for modifying an animation wire frame |
US7760204B2 (en) | 1997-06-03 | 2010-07-20 | At&T Intellectual Property Ii, L.P. | Computer readable medium for modifying an animation wire |
US7148889B1 (en) | 1997-06-03 | 2006-12-12 | At&T Corp. | System and apparatus for customizing a computer animation wireframe |
US20110234588A1 (en) * | 1997-06-03 | 2011-09-29 | At&T Intellectual Property Ii, L.P. | Computer readable medium for modifying an animation wire frame |
US20080180435A1 (en) * | 1997-06-03 | 2008-07-31 | At&T Corp. | Computer readable medium for modifying an animation wire |
US7365749B1 (en) | 1997-06-03 | 2008-04-29 | At&T Corp. | Computer readable medium for modifying an animation wire frame |
US5969721A (en) * | 1997-06-03 | 1999-10-19 | At&T Corp. | System and apparatus for customizing a computer animation wireframe |
US6304264B1 (en) * | 1997-06-03 | 2001-10-16 | At&T Corp. | System and apparatus for customizing a computer animation wireframe |
US8654130B2 (en) | 1997-06-03 | 2014-02-18 | Rakuten, Inc. | Computer readable medium for modifying an animation wire frame |
US20100253703A1 (en) * | 1997-06-03 | 2010-10-07 | At&T Intellectual Property Ii, L.P. Via Transfer From At&T Corp. | Computer Readable Medium for Modifying an Animation Wire Frame |
US6340981B1 (en) * | 1997-06-30 | 2002-01-22 | Sun Microsystems, Inc. | Method and apparatus for stroke substitution |
US6377288B1 (en) * | 1998-01-12 | 2002-04-23 | Xerox Corporation | Domain objects having computed attribute values for use in a freeform graphics system |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US6377259B2 (en) * | 1998-07-29 | 2002-04-23 | Inxight Software, Inc. | Presenting node-link structures with modification |
US6952803B1 (en) * | 1998-12-29 | 2005-10-04 | Xerox Corporation | Method and system for transcribing and editing using a structured freeform editor |
US6536033B1 (en) * | 1999-01-28 | 2003-03-18 | International Business Machines Corp. | Uniform mechanism for building containment hierarchies |
US9535563B2 (en) | 1999-02-01 | 2017-01-03 | Blanding Hovenweep, Llc | Internet appliance system and method |
US6720979B1 (en) | 1999-07-15 | 2004-04-13 | International Business Machines Corporation | Dynamic manipulation of animated graphics in a web browser |
US6934707B1 (en) * | 1999-07-22 | 2005-08-23 | International Business Machines Corporation | Network transmission of pages in linkable markup language to receiving display stations with currently displayed pages controlled by tags in succeeding pages to produce aesthetically pleasing transitions between pages |
US6459442B1 (en) | 1999-09-10 | 2002-10-01 | Xerox Corporation | System for applying application behaviors to freeform data |
US6958749B1 (en) * | 1999-11-04 | 2005-10-25 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
USRE44258E1 (en) * | 1999-11-04 | 2013-06-04 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
US7302641B1 (en) * | 1999-12-03 | 2007-11-27 | Mantaro Yajima | Information processing method and apparatus for making document |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US9035917B2 (en) | 2001-11-02 | 2015-05-19 | Neonode Inc. | ASIC controller for light-based sensor |
US9778794B2 (en) | 2001-11-02 | 2017-10-03 | Neonode Inc. | Light-based touch screen |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US7215815B2 (en) * | 2002-01-10 | 2007-05-08 | Ricoh Company, Ltd. | Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing |
US20030185444A1 (en) * | 2002-01-10 | 2003-10-02 | Tadashi Honda | Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing |
US20070065013A1 (en) * | 2002-01-25 | 2007-03-22 | Xerox Corporation | Method and apparatus to convert digital ink images for use in a structured text/graphics editor |
US8875016B2 (en) * | 2002-01-25 | 2014-10-28 | Xerox Corporation | Method and apparatus to convert digital ink images for use in a structured text/graphics editor |
US20050111736A1 (en) * | 2002-02-08 | 2005-05-26 | Microsoft Corporation | Ink gestures |
US7536656B2 (en) * | 2002-02-08 | 2009-05-19 | Microsoft Corporation | Ink gestures |
WO2003090095A1 (en) * | 2002-04-15 | 2003-10-30 | Parascript Llc | Insertion of space in a geometric document |
US20060233464A1 (en) * | 2002-06-28 | 2006-10-19 | Microsoft Corporation | Method and system for displaying and linking ink objects with recognized text and objects |
US7751623B1 (en) | 2002-06-28 | 2010-07-06 | Microsoft Corporation | Writing guide for a free-form document editor |
US7259752B1 (en) * | 2002-06-28 | 2007-08-21 | Microsoft Corporation | Method and system for editing electronic ink |
US7916979B2 (en) | 2002-06-28 | 2011-03-29 | Microsoft Corporation | Method and system for displaying and linking ink objects with recognized text and objects |
US8132125B2 (en) * | 2002-07-30 | 2012-03-06 | Microsoft Corporation | Freeform encounter selection tool |
US20070057930A1 (en) * | 2002-07-30 | 2007-03-15 | Microsoft Corporation | Freeform Encounter Selection Tool |
US8884926B1 (en) | 2002-11-04 | 2014-11-11 | Neonode Inc. | Light-based finger gesture user interface |
US9262074B2 (en) | 2002-11-04 | 2016-02-16 | Neonode, Inc. | Finger gesture user interface |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US8810551B2 (en) | 2002-11-04 | 2014-08-19 | Neonode Inc. | Finger gesture user interface |
US9164654B2 (en) * | 2002-12-10 | 2015-10-20 | Neonode Inc. | User interface for mobile computer unit |
US20100017872A1 (en) * | 2002-12-10 | 2010-01-21 | Neonode Technologies | User interface for mobile computer unit |
US20040119762A1 (en) * | 2002-12-24 | 2004-06-24 | Fuji Xerox Co., Ltd. | Systems and methods for freeform pasting |
US7800618B1 (en) | 2003-04-30 | 2010-09-21 | Apple Inc. | Method and apparatus for providing an animated representation of a reorder operation |
US7164423B1 (en) * | 2003-04-30 | 2007-01-16 | Apple Computer, Inc. | Method and apparatus for providing an animated representation of a reorder operation |
US20040240739A1 (en) * | 2003-05-30 | 2004-12-02 | Lu Chang | Pen gesture-based user interface |
KR20050007739A (en) * | 2003-07-11 | 2005-01-21 | 엘지전자 주식회사 | Method for setting time interval between frames of animated gif |
US8019648B2 (en) | 2004-02-15 | 2011-09-13 | Google Inc. | Search engines and systems with handheld document data capture devices |
US9268852B2 (en) | 2004-02-15 | 2016-02-23 | Google Inc. | Search engines and systems with handheld document data capture devices |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US8831365B2 (en) | 2004-02-15 | 2014-09-09 | Google Inc. | Capturing text from rendered documents using supplement information |
US8005720B2 (en) | 2004-02-15 | 2011-08-23 | Google Inc. | Applying scanned information to identify content |
US8214387B2 (en) | 2004-02-15 | 2012-07-03 | Google Inc. | Document enhancement system and method |
US7742953B2 (en) | 2004-02-15 | 2010-06-22 | Exbiblio B.V. | Adding information or functionality to a rendered document via association with an electronic counterpart |
US7818215B2 (en) | 2004-02-15 | 2010-10-19 | Exbiblio, B.V. | Processing techniques for text capture from a rendered document |
US8515816B2 (en) | 2004-02-15 | 2013-08-20 | Google Inc. | Aggregate analysis of text captures performed by multiple users from rendered documents |
US7707039B2 (en) | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Automatic modification of web pages |
US7831912B2 (en) | 2004-02-15 | 2010-11-09 | Exbiblio B. V. | Publishing techniques for adding value to a rendered document |
US7706611B2 (en) | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Method and system for character recognition |
US7702624B2 (en) | 2004-02-15 | 2010-04-20 | Exbiblio, B.V. | Processing techniques for visual capture data from a rendered document |
US7358965B2 (en) | 2004-02-18 | 2008-04-15 | Microsoft Corporation | Tapping to create writing |
US7721226B2 (en) | 2004-02-18 | 2010-05-18 | Microsoft Corporation | Glom widget |
US20050179648A1 (en) * | 2004-02-18 | 2005-08-18 | Microsoft Corporation | Tapping to create writing |
US7659890B2 (en) | 2004-03-19 | 2010-02-09 | Microsoft Corporation | Automatic height adjustment for electronic highlighter pens and mousing devices |
US20050206627A1 (en) * | 2004-03-19 | 2005-09-22 | Microsoft Corporation | Automatic height adjustment for electronic highlighter pens and mousing devices |
US20070257914A1 (en) * | 2004-03-31 | 2007-11-08 | Hidenori Komatsumoto | Image Processing Device, Image Processing Method, And Information Storage Medium |
US7812860B2 (en) | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US9008447B2 (en) | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8781228B2 (en) | 2004-04-01 | 2014-07-15 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US9633013B2 (en) | 2004-04-01 | 2017-04-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8505090B2 (en) | 2004-04-01 | 2013-08-06 | Google Inc. | Archive of text captures from rendered documents |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
US9514134B2 (en) | 2004-04-01 | 2016-12-06 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8713418B2 (en) | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US8261094B2 (en) | 2004-04-19 | 2012-09-04 | Google Inc. | Secure data gathering from rendered documents |
US9030699B2 (en) | 2004-04-19 | 2015-05-12 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US8489624B2 (en) | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8799099B2 (en) | 2004-05-17 | 2014-08-05 | Google Inc. | Processing techniques for text capture from a rendered document |
US7788606B2 (en) * | 2004-06-14 | 2010-08-31 | Sas Institute Inc. | Computer-implemented system and method for defining graphics primitives |
US20050275622A1 (en) * | 2004-06-14 | 2005-12-15 | Patel Himesh G | Computer-implemented system and method for defining graphics primitives |
US8464172B2 (en) | 2004-06-25 | 2013-06-11 | Apple Inc. | Configuration bar for launching layer for accessing user interface elements |
US7984384B2 (en) | 2004-06-25 | 2011-07-19 | Apple Inc. | Web view layer for accessing user interface elements |
US9507503B2 (en) | 2004-06-25 | 2016-11-29 | Apple Inc. | Remote access to layer and user interface elements |
US20110078616A1 (en) * | 2004-06-25 | 2011-03-31 | Chaudhri Imran A | Configuration bar for launching layer for accessing user interface elements |
US8266538B2 (en) | 2004-06-25 | 2012-09-11 | Apple Inc. | Remote access to layer and user interface elements |
US20090271724A1 (en) * | 2004-06-25 | 2009-10-29 | Chaudhri Imran A | Visual characteristics of user interface elements in a unified interest layer |
US8291332B2 (en) | 2004-06-25 | 2012-10-16 | Apple Inc. | Layer for accessing user interface elements |
US10489040B2 (en) | 2004-06-25 | 2019-11-26 | Apple Inc. | Visual characteristics of user interface elements in a unified interest layer |
US8302020B2 (en) | 2004-06-25 | 2012-10-30 | Apple Inc. | Widget authoring and editing environment |
US9753627B2 (en) | 2004-06-25 | 2017-09-05 | Apple Inc. | Visual characteristics of user interface elements in a unified interest layer |
US20090144644A1 (en) * | 2004-06-25 | 2009-06-04 | Chaudhri Imran A | Web View Layer For Accessing User Interface Elements |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
US9275051B2 (en) | 2004-07-19 | 2016-03-01 | Google Inc. | Automatic modification of web pages |
US8179563B2 (en) | 2004-08-23 | 2012-05-15 | Google Inc. | Portable scanning device |
US10996842B2 (en) * | 2004-10-20 | 2021-05-04 | Nintendo Co., Ltd. | Computing device and browser for same |
US20060101354A1 (en) * | 2004-10-20 | 2006-05-11 | Nintendo Co., Ltd. | Gesture inputs for a portable display device |
US11763068B2 (en) * | 2004-10-20 | 2023-09-19 | Nintendo Co., Ltd. | Computing device and browser for same |
US10324615B2 (en) | 2004-10-20 | 2019-06-18 | Nintendo Co., Ltd. | Computing device and browser for same |
US20190258378A1 (en) * | 2004-10-20 | 2019-08-22 | Nintendo Co., Ltd. | Computing device and browser for same |
US9052816B2 (en) | 2004-10-20 | 2015-06-09 | Nintendo Co., Ltd. | Computing device and browser for same |
US20210248306A1 (en) * | 2004-10-20 | 2021-08-12 | Nintendo Co., Ltd. | Computing device and browser for same |
US8169410B2 (en) | 2004-10-20 | 2012-05-01 | Nintendo Co., Ltd. | Gesture inputs for a portable display device |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
US8081849B2 (en) | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US8953886B2 (en) | 2004-12-03 | 2015-02-10 | Google Inc. | Method and system for character recognition |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US20070300142A1 (en) * | 2005-04-01 | 2007-12-27 | King Martin T | Contextual dynamic advertising based upon captured rendered text |
US7500190B1 (en) * | 2005-04-13 | 2009-03-03 | Apple Inc. | Visual feedback to illustrate effects of editing operations |
US10839648B2 (en) | 2005-04-27 | 2020-11-17 | Universal Entertainment Corporation (nee Aruze Corporation) | Gaming machine |
US20170039809A1 (en) * | 2005-04-27 | 2017-02-09 | Universal Entertainment Corporation (nee Aruze Corporation) | Gaming Machine |
US10242533B2 (en) * | 2005-04-27 | 2019-03-26 | Universal Entertainment Corporation | Gaming machine |
US20060253793A1 (en) * | 2005-05-04 | 2006-11-09 | International Business Machines Corporation | System and method for issuing commands based on pen motions on a graphical keyboard |
US7487461B2 (en) * | 2005-05-04 | 2009-02-03 | International Business Machines Corporation | System and method for issuing commands based on pen motions on a graphical keyboard |
US7945863B1 (en) * | 2005-07-05 | 2011-05-17 | Adobe Systems Incorporated | Localized exploded view |
US8739063B2 (en) | 2005-07-05 | 2014-05-27 | Adobe Systems Incorporated | Localized exploded view |
US9513930B2 (en) | 2005-10-27 | 2016-12-06 | Apple Inc. | Workflow widgets |
US11150781B2 (en) | 2005-10-27 | 2021-10-19 | Apple Inc. | Workflow widgets |
US20070124503A1 (en) * | 2005-10-31 | 2007-05-31 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US7636794B2 (en) * | 2005-10-31 | 2009-12-22 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US20070109281A1 (en) * | 2005-11-14 | 2007-05-17 | Microsoft Corporation | Free form wiper |
US7526737B2 (en) | 2005-11-14 | 2009-04-28 | Microsoft Corporation | Free form wiper |
US9417888B2 (en) | 2005-11-18 | 2016-08-16 | Apple Inc. | Management of user interface elements in a display environment |
US8643605B2 (en) * | 2005-11-21 | 2014-02-04 | Core Wireless Licensing S.A.R.L. | Gesture based document editor |
CN102945132A (en) * | 2005-11-21 | 2013-02-27 | 核心无线许可有限公司 | Gesture based document editor |
US9703474B2 (en) * | 2005-11-21 | 2017-07-11 | Core Wireless Licensing S.A.R.L. | Gesture based document editor |
US20140191993A1 (en) * | 2005-11-21 | 2014-07-10 | Core Wireless Licensing S.A.R.L. | Gesture based document editor |
US20070115264A1 (en) * | 2005-11-21 | 2007-05-24 | Kun Yu | Gesture based document editor |
US11650713B2 (en) | 2005-12-30 | 2023-05-16 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US20070191028A1 (en) * | 2006-02-14 | 2007-08-16 | Microsoft Corporation | Dynamic interconnection of mobile devices |
US7817991B2 (en) | 2006-02-14 | 2010-10-19 | Microsoft Corporation | Dynamic interconnection of mobile devices |
US20090213086A1 (en) * | 2006-04-19 | 2009-08-27 | Ji Suk Chae | Touch screen device and operating method thereof |
US8312391B2 (en) | 2006-05-24 | 2012-11-13 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070273668A1 (en) * | 2006-05-24 | 2007-11-29 | Lg Electronics Inc. | Touch screen device and method of selecting files thereon |
US20070273669A1 (en) * | 2006-05-24 | 2007-11-29 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US8169411B2 (en) | 2006-05-24 | 2012-05-01 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US8115739B2 (en) | 2006-05-24 | 2012-02-14 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070273663A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and operating method thereof |
US20070277123A1 (en) * | 2006-05-24 | 2007-11-29 | Sang Hyun Shin | Touch screen device and operating method thereof |
US8302032B2 (en) | 2006-05-24 | 2012-10-30 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070277125A1 (en) * | 2006-05-24 | 2007-11-29 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US8136052B2 (en) | 2006-05-24 | 2012-03-13 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US9058099B2 (en) | 2006-05-24 | 2015-06-16 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070277126A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and method of selecting files thereon |
US20070273673A1 (en) * | 2006-05-24 | 2007-11-29 | Ho Joo Park | Touch screen device and operating method thereof |
US9041658B2 (en) | 2006-05-24 | 2015-05-26 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US20070273666A1 (en) * | 2006-05-24 | 2007-11-29 | Sang Hyun Shin | Touch screen device and operating method thereof |
US20070273665A1 (en) * | 2006-05-24 | 2007-11-29 | Lg Electronics Inc. | Touch screen device and operating method thereof |
US11736602B2 (en) | 2006-09-06 | 2023-08-22 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US8600196B2 (en) | 2006-09-08 | 2013-12-03 | Google Inc. | Optical scanners, such as hand-held optical scanners |
US7930648B1 (en) | 2006-10-10 | 2011-04-19 | Adobe Systems Incorporated | Expanded stack view |
US9817436B2 (en) * | 2007-01-07 | 2017-11-14 | Apple Inc. | Portable multifunction device, method, and graphical user interface for displaying user interface objects adaptively |
US20080165147A1 (en) * | 2007-01-07 | 2008-07-10 | Greg Christie | Portable Multifunction Device, Method, and Graphical User Interface for Displaying User Interface Objects Adaptively |
US9483164B2 (en) | 2007-07-18 | 2016-11-01 | Apple Inc. | User-centric widgets and dashboards |
US9836208B2 (en) * | 2007-08-21 | 2017-12-05 | Volkswagen Ag | Method for displaying information in a motor vehicle with a variable scale and display device |
US20110273477A1 (en) * | 2007-08-21 | 2011-11-10 | Volkswagen Ag | Method for displaying information in a motor vehicle with a variable scale and display device |
US8677271B2 (en) | 2007-08-21 | 2014-03-18 | Volkswagen Ag | Method for displaying information in a motor vehicle and display device for a motor vehicle |
US11604559B2 (en) * | 2007-09-04 | 2023-03-14 | Apple Inc. | Editing interface |
US20210271374A1 (en) * | 2007-09-04 | 2021-09-02 | Apple Inc. | Editing interface |
US8325206B2 (en) * | 2007-09-30 | 2012-12-04 | Htc Corporation | Image processing method |
US20090085933A1 (en) * | 2007-09-30 | 2009-04-02 | Htc Corporation | Image processing method |
US8867995B2 (en) * | 2007-11-08 | 2014-10-21 | Samsung Electronics Co., Ltd. | Apparatus and method for human body communication in a mobile communication system |
US20090124282A1 (en) * | 2007-11-08 | 2009-05-14 | Ki-Uk Kim | Apparatus and method for human body communication in a mobile communication system |
US20090164889A1 (en) * | 2007-12-21 | 2009-06-25 | Kurt Piersol | Persistent selection marks |
US8566752B2 (en) * | 2007-12-21 | 2013-10-22 | Ricoh Co., Ltd. | Persistent selection marks |
US20090216722A1 (en) * | 2008-02-22 | 2009-08-27 | Samsung Electronics Co., Ltd. | Method and apparatus for querying digital records |
US20100125787A1 (en) * | 2008-11-20 | 2010-05-20 | Canon Kabushiki Kaisha | Information processing apparatus, processing method thereof, and computer-readable storage medium |
US8423916B2 (en) * | 2008-11-20 | 2013-04-16 | Canon Kabushiki Kaisha | Information processing apparatus, processing method thereof, and computer-readable storage medium |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US8918252B2 (en) | 2009-02-15 | 2014-12-23 | Neonode Inc. | Light-based touch controls on a steering wheel |
US9389710B2 (en) | 2009-02-15 | 2016-07-12 | Neonode Inc. | Light-based controls on a toroidal steering wheel |
US10007422B2 (en) | 2009-02-15 | 2018-06-26 | Neonode Inc. | Light-based controls in a toroidal steering wheel |
US8418055B2 (en) | 2009-02-18 | 2013-04-09 | Google Inc. | Identifying a document by performing spectral analysis on the contents of the document |
US8638363B2 (en) | 2009-02-18 | 2014-01-28 | Google Inc. | Automatically capturing information, such as capturing information using a document-aware device |
US9075779B2 (en) | 2009-03-12 | 2015-07-07 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
US20110043652A1 (en) * | 2009-03-12 | 2011-02-24 | King Martin T | Automatically providing content associated with captured information, such as information captured in real-time |
US8990235B2 (en) | 2009-03-12 | 2015-03-24 | Google Inc. | Automatically providing content associated with captured information, such as information captured in real-time |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
US10761716B2 (en) | 2009-03-16 | 2020-09-01 | Apple, Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
EP3644171A1 (en) * | 2009-03-16 | 2020-04-29 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20110066978A1 (en) * | 2009-09-11 | 2011-03-17 | Compal Electronics, Inc. | Electronic apparatus and touch menu control method thereof |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11972104B2 (en) | 2009-09-22 | 2024-04-30 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20110069016A1 (en) * | 2009-09-22 | 2011-03-24 | Victor B Michael | Device, Method, and Graphical User Interface for Manipulating User Interface Objects |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10282070B2 (en) | 2009-09-22 | 2019-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10788965B2 (en) | 2009-09-22 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20110074710A1 (en) * | 2009-09-25 | 2011-03-31 | Christopher Douglas Weeldreyer | Device, Method, and Graphical User Interface for Manipulating User Interface Objects |
US20110078622A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Moving a Calendar Entry in a Calendar Application |
US11947782B2 (en) | 2009-09-25 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US10254927B2 (en) | 2009-09-25 | 2019-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11366576B2 (en) | 2009-09-25 | 2022-06-21 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10928993B2 (en) | 2009-09-25 | 2021-02-23 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US8520983B2 (en) * | 2009-10-07 | 2013-08-27 | Google Inc. | Gesture-based selective text recognition |
US8666199B2 (en) * | 2009-10-07 | 2014-03-04 | Google Inc. | Gesture-based selection text recognition |
US20110081083A1 (en) * | 2009-10-07 | 2011-04-07 | Google Inc. | Gesture-based selective text recognition |
US20110099476A1 (en) * | 2009-10-23 | 2011-04-28 | Microsoft Corporation | Decorating a display environment |
US20110123115A1 (en) * | 2009-11-25 | 2011-05-26 | Google Inc. | On-Screen Guideline-Based Selective Text Recognition |
US8515185B2 (en) | 2009-11-25 | 2013-08-20 | Google Inc. | On-screen guideline-based selective text recognition |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
US8677268B2 (en) | 2010-01-26 | 2014-03-18 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8749558B2 (en) * | 2010-02-12 | 2014-06-10 | Esobi Inc. | Method for displaying displacement of object on display of electronic device |
TWI447638B (en) * | 2010-02-12 | 2014-08-01 | Esobi Inc. | Method for displaying the displacement of an object on the screen of an electronic device |
US20110202860A1 (en) * | 2010-02-12 | 2011-08-18 | Esobi Inc. | Method for displaying displacement of object on display of electronic device |
US11809700B2 (en) | 2010-04-07 | 2023-11-07 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US20170300221A1 (en) * | 2010-06-10 | 2017-10-19 | Microsoft Technology Licensing, Llc | Erase, Circle, Prioritize and Application Tray Gestures |
CN102346632A (en) * | 2010-07-22 | 2012-02-08 | 联发科技股份有限公司 | Electronic interaction apparatus and method for position adjustment of widget presentations |
US20120023426A1 (en) * | 2010-07-22 | 2012-01-26 | Mediatek Inc. | Apparatuses and Methods for Position Adjustment of Widget Presentations |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9626098B2 (en) | 2010-07-30 | 2017-04-18 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9021402B1 (en) | 2010-09-24 | 2015-04-28 | Google Inc. | Operation of mobile device interface using gestures |
US20120081404A1 (en) * | 2010-10-01 | 2012-04-05 | International Business Machines Corporation | Simulating animation during slideshow |
US8576234B2 (en) * | 2010-10-01 | 2013-11-05 | International Business Machines Corporation | Simulating animation during slideshow |
US20120092268A1 (en) * | 2010-10-15 | 2012-04-19 | Hon Hai Precision Industry Co., Ltd. | Computer-implemented method for manipulating onscreen data |
CN102455863A (en) * | 2010-10-15 | 2012-05-16 | 鸿富锦精密工业(深圳)有限公司 | Computer-implemented method for manipulating onscreen data |
US20120092269A1 (en) * | 2010-10-15 | 2012-04-19 | Hon Hai Precision Industry Co., Ltd. | Computer-implemented method for manipulating onscreen data |
US20120167017A1 (en) * | 2010-12-27 | 2012-06-28 | Sling Media Inc. | Systems and methods for adaptive gesture recognition |
US9785335B2 (en) * | 2010-12-27 | 2017-10-10 | Sling Media Inc. | Systems and methods for adaptive gesture recognition |
US10338672B2 (en) * | 2011-02-18 | 2019-07-02 | Business Objects Software Ltd. | System and method for manipulating objects in a graphical user interface |
US20120216150A1 (en) * | 2011-02-18 | 2012-08-23 | Business Objects Software Ltd. | System and method for manipulating objects in a graphical user interface |
US8271908B2 (en) * | 2011-02-23 | 2012-09-18 | Google Inc. | Touch gestures for remote control operations |
US20120216154A1 (en) * | 2011-02-23 | 2012-08-23 | Google Inc. | Touch gestures for remote control operations |
US20120216152A1 (en) * | 2011-02-23 | 2012-08-23 | Google Inc. | Touch gestures for remote control operations |
US9858343B2 (en) | 2011-03-31 | 2018-01-02 | Microsoft Technology Licensing, Llc | Personalization of queries, conversations, and searches |
US10049667B2 (en) | 2011-03-31 | 2018-08-14 | Microsoft Technology Licensing, Llc | Location-based conversational understanding |
US9842168B2 (en) | 2011-03-31 | 2017-12-12 | Microsoft Technology Licensing, Llc | Task driven user intents |
US9298287B2 (en) | 2011-03-31 | 2016-03-29 | Microsoft Technology Licensing, Llc | Combined activation for natural user interface systems |
US10585957B2 (en) | 2011-03-31 | 2020-03-10 | Microsoft Technology Licensing, Llc | Task driven user intents |
US9244984B2 (en) | 2011-03-31 | 2016-01-26 | Microsoft Technology Licensing, Llc | Location based conversational understanding |
US9760566B2 (en) | 2011-03-31 | 2017-09-12 | Microsoft Technology Licensing, Llc | Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof |
US10642934B2 (en) | 2011-03-31 | 2020-05-05 | Microsoft Technology Licensing, Llc | Augmented conversational understanding architecture |
US10296587B2 (en) | 2011-03-31 | 2019-05-21 | Microsoft Technology Licensing, Llc | Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof |
US10061843B2 (en) | 2011-05-12 | 2018-08-28 | Microsoft Technology Licensing, Llc | Translating natural language utterances to keyword search queries |
US9454962B2 (en) | 2011-05-12 | 2016-09-27 | Microsoft Technology Licensing, Llc | Sentence simplification for spoken language understanding |
US9161085B2 (en) * | 2011-05-23 | 2015-10-13 | Microsoft Technology Licensing, Llc | Adaptive timeline views of data |
US20120299926A1 (en) * | 2011-05-23 | 2012-11-29 | Microsoft Corporation | Adaptive timeline views of data |
US10055394B2 (en) * | 2011-12-20 | 2018-08-21 | Apple Inc. | Collaborative document editing |
US11627001B2 (en) | 2011-12-20 | 2023-04-11 | Apple Inc. | Collaborative document editing |
US20160196247A1 (en) * | 2011-12-20 | 2016-07-07 | Apple Inc. | Collaborative document editing |
US10880098B2 (en) | 2011-12-20 | 2020-12-29 | Apple Inc. | Collaborative document editing |
US9992316B2 (en) | 2012-06-15 | 2018-06-05 | Muzik Inc. | Interactive networked headphones |
US10567564B2 (en) | 2012-06-15 | 2020-02-18 | Muzik, Inc. | Interactive networked apparatus |
US20130339850A1 (en) * | 2012-06-15 | 2013-12-19 | Muzik LLC | Interactive input device |
US11924364B2 (en) | 2012-06-15 | 2024-03-05 | Muzik Inc. | Interactive networked apparatus |
US20140059496A1 (en) * | 2012-08-23 | 2014-02-27 | Oracle International Corporation | Unified mobile approvals application including card display |
US10719218B2 (en) | 2012-11-27 | 2020-07-21 | Neonode Inc. | Vehicle user interface |
US10254943B2 (en) | 2012-11-27 | 2019-04-09 | Neonode Inc. | Autonomous drive user interface |
US11650727B2 (en) | 2012-11-27 | 2023-05-16 | Neonode Inc. | Vehicle user interface |
US9710144B2 (en) | 2012-11-27 | 2017-07-18 | Neonode Inc. | User interface for curved input device |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US10809865B2 (en) | 2013-01-15 | 2020-10-20 | Microsoft Technology Licensing, Llc | Engaging presentation through freeform sketching |
US20140223382A1 (en) * | 2013-02-01 | 2014-08-07 | Barnesandnoble.com Llc | Z-shaped gesture for touch sensitive UI undo, delete, and clear functions |
US20140300609A1 (en) * | 2013-04-04 | 2014-10-09 | Diotek Co., Ltd. | Device and method for editing ink text data |
US9478055B2 (en) * | 2013-04-04 | 2016-10-25 | Diotek Co., Ltd. | Device and method for editing ink text data |
US9891809B2 (en) * | 2013-04-26 | 2018-02-13 | Samsung Electronics Co., Ltd. | User terminal device and controlling method thereof |
US20140325410A1 (en) * | 2013-04-26 | 2014-10-30 | Samsung Electronics Co., Ltd. | User terminal device and controlling method thereof |
US9665259B2 (en) | 2013-07-12 | 2017-05-30 | Microsoft Technology Licensing, Llc | Interactive digital displays |
US9323447B2 (en) | 2013-10-15 | 2016-04-26 | Sharp Laboratories Of America, Inc. | Electronic whiteboard and touch screen method for configuring and applying metadata tags thereon |
EP2930605A1 (en) * | 2014-04-08 | 2015-10-14 | Fujitsu Limited | Information processing apparatus and information processing program |
US9921742B2 (en) | 2014-04-08 | 2018-03-20 | Fujitsu Limited | Information processing apparatus and recording medium recording information processing program |
US10387021B2 (en) * | 2014-07-31 | 2019-08-20 | Restoration Robotics, Inc. | Robotic hair transplantation system with touchscreen interface for controlling movement of tool |
US20160030134A1 (en) * | 2014-07-31 | 2016-02-04 | Restoration Robotics, Inc. | Robotic Hair Transplantation System with Touchscreen Interface for Controlling Movement of Tool |
US10824251B2 (en) | 2014-10-10 | 2020-11-03 | Muzik Inc. | Devices and methods for sharing user interaction |
US10088921B2 (en) | 2014-10-10 | 2018-10-02 | Muzik Inc. | Devices for sharing user interactions |
US20160147723A1 (en) * | 2014-11-25 | 2016-05-26 | Samsung Electronics Co., Ltd. | Method and device for amending handwritten characters |
US20160247040A1 (en) * | 2014-12-02 | 2016-08-25 | Myscript | System and method for recognizing geometric shapes |
US9489572B2 (en) * | 2014-12-02 | 2016-11-08 | Myscript | System and method for recognizing geometric shapes |
US10181076B2 (en) * | 2014-12-02 | 2019-01-15 | Myscript | System and method for recognizing geometric shapes |
US10713304B2 (en) * | 2016-01-26 | 2020-07-14 | International Business Machines Corporation | Entity arrangement by shape input |
US11733656B2 (en) | 2016-06-11 | 2023-08-22 | Apple Inc. | Configuring context-specific user interfaces |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc. | Motorist user interface sensor |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11061489B2 (en) | 2019-05-10 | 2021-07-13 | Topoleg, Inc. | Automating and reducing user input required for user session on writing and/or drawing system |
US11429259B2 (en) * | 2019-05-10 | 2022-08-30 | Myscript | System and method for selecting and editing handwriting input elements |
US11061488B2 (en) | 2019-05-10 | 2021-07-13 | Topoleg, Inc. | Automating and reducing user input required for user session on writing and/or drawing system |
US11347943B2 (en) | 2019-06-01 | 2022-05-31 | Apple Inc. | Mail application features |
US11379113B2 (en) | 2019-06-01 | 2022-07-05 | Apple Inc. | Techniques for selecting text |
US11074408B2 (en) | 2019-06-01 | 2021-07-27 | Apple Inc. | Mail application features |
US11257396B2 (en) * | 2020-03-18 | 2022-02-22 | Sas Institute Inc. | User interfaces for converting geospatial data into audio outputs |
USD942470S1 (en) * | 2020-06-21 | 2022-02-01 | Apple Inc. | Display or portion thereof with animated graphical user interface |
US11669210B2 (en) | 2020-09-30 | 2023-06-06 | Neonode Inc. | Optical touch sensor |
WO2023170314A1 (en) * | 2022-03-11 | 2023-09-14 | Myscript | Creating text block sections |
WO2023170315A1 (en) * | 2022-03-11 | 2023-09-14 | Myscript | Merging text blocks |
US20230315271A1 (en) * | 2022-03-18 | 2023-10-05 | Sony Group Corporation | Collaborative whiteboard for meetings |
US11460973B1 (en) | 2022-04-11 | 2022-10-04 | Sas Institute Inc. | User interfaces for converting node-link data into audio outputs |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5880743A (en) | Apparatus and method for implementing visual animation illustrating results of interactive editing operations | |
CA2139256C (en) | Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system | |
US5889523A (en) | Method and apparatus for dynamically grouping a plurality of graphic objects | |
US5471578A (en) | Apparatus and method for altering enclosure selections in a gesture based input system | |
US10528236B2 (en) | Creating a display pattern for multiple data-bound graphic objects | |
US8448083B1 (en) | Gesture control of multimedia editing applications | |
US5572651A (en) | Table-based user interface for retrieving and manipulating indices between data structures | |
US5592608A (en) | Interactively producing indices into image and gesture-based data using unrecognized graphical objects | |
US6493736B1 (en) | Script character processing method for opening space within text and ink strokes of a document | |
US6952803B1 (en) | Method and system for transcribing and editing using a structured freeform editor | |
EP0727730B1 (en) | Method for improving visibility and selectability of icons | |
US6097392A (en) | Method and system of altering an attribute of a graphic object in a pen environment | |
US5500935A (en) | Apparatus and method for translating graphic objects and commands with direct touch input in a touch based input system | |
Moran et al. | Implicit structure for pen-based systems within a freeform interaction paradigm | |
US8539381B2 (en) | Intuitive tools for manipulating objects in a display | |
JP3866302B2 (en) | How to operate a processor-based device | |
US20020059350A1 (en) | Insertion point bungee space tool | |
JP2003303047A (en) | Image input and display system, usage of user interface as well as product including computer usable medium | |
EP0194442B1 (en) | Method for manipulation of graphic sub-objects in an interactive draw graphic system | |
JPH0756840A (en) | Operating method of processor-based apparatus | |
JPH06325211A (en) | Method for controlling handwritten character entry frame | |
US20130014041A1 (en) | Using gesture objects to replace menus for computer control | |
JPH0756841A (en) | Operating method of processor-based apparatus | |
KR20060041817A (en) | Systems and methods that utilize a dynamic digital zooming interface in connection with digital inking | |
JPH029385B2 (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction |
AS | Assignment |
Owner name: BANK ONE, NA, AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:013153/0001 Effective date: 20020621 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, AS COLLATERAL AGENT, TEXAS Free format text: SECURITY AGREEMENT;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:015134/0476 Effective date: 20030625
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |
|
AS | Assignment |
Owner name: XEROX CORPORATION, CONNECTICUT Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A. AS SUCCESSOR-IN-INTEREST ADMINISTRATIVE AGENT AND COLLATERAL AGENT TO JPMORGAN CHASE BANK;REEL/FRAME:066728/0193 Effective date: 20220822 |