US20050273761A1 - Freehand system and method for creating, editing, and manipulating block diagrams - Google Patents


Publication number
US20050273761A1
Authority
United States
Prior art keywords
diagram
block diagram
user input
component
freehand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/863,378
Inventor
Jay Torgerson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MathWorks Inc
Original Assignee
MathWorks Inc
Application filed by The MathWorks, Inc.
Priority to US10/863,378
Assigned to THE MATHWORKS, INC. Assignors: TORGERSON, JAY RYAN
Priority to PCT/US2005/019528 (WO2005121941A2)
Priority to EP05756532A (EP1769327A2)
Publication of US20050273761A1
Priority to US11/825,611 (US8627278B2)
Priority to US14/136,215 (US20140177962A1)
Legal status: Abandoned


Classifications

    • G06F3/04883 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06V30/387 — Character recognition; recognising digital ink; matching or classification using human interaction, e.g. selection of the best displayed recognition candidate
    • G06F2218/12 — Aspects of pattern recognition specially adapted for signal processing; classification; matching


Abstract

A method of integrating freehand user input into a block diagram environment is disclosed. The freehand user input is a user's approximation of a diagram component or feature of a component which is received by the block diagram environment and compared to multiple patterns stored in a storage location. The storage location holds patterns of block diagram components and block diagram component features. The freehand user input may be displayed superimposed on a block diagram being shown to the user. Upon the freehand user input being matched to one of the patterns representing a block diagram component or feature of a component, the freehand user input is replaced on the displayed block diagram with an electronic device drawn rendering of the matched diagram component or feature of a component. Partial matches of the user drawn input may result in a menu of choices being presented to the user for selection.

Description

    FIELD OF THE INVENTION
  • The illustrative embodiment of the present invention relates generally to block diagrams and more particularly to the use of freehand input data in the creation, editing and manipulation of block diagrams.
  • BACKGROUND
  • Freehand user input such as input from a mouse or optical pen operated by a user of a computer system or other electronic device is accepted by many different types of applications. For example, freehand input from a mouse is accepted by Microsoft's Paint program and enables the user to construct freehand drawings on a display surface connected to a computer system or other electronic device. One type of freehand user input is Graffiti®, a text recognition system which is used in conjunction with the Palm OS® from PalmSource of Sunnyvale, Calif. in handheld PDAs (Personal Digital Assistants). The acceptance of the freehand input enables an application to interact with the user in an easy and well understood manner and enables the user to customize the displayed output of the particular application.
  • Block diagram environments such as Simulink® from The MathWorks of Natick, Mass., enable a user to construct models of dynamic systems. These models include graphical references to system components which may be used to simulate the system operations. Current block diagram environment tools enable a user to drag items from pre-defined templates into a block diagram. The components being dragged into the block diagram include attributes enabling the component to be modeled in the system during the simulation of the diagram. Unfortunately, conventional block diagram environments do not allow a user to sketch in system components using freehand input. The result is a lengthy series of drag and drop operations that the user is required to perform when constructing the block diagram. Additionally, the user must first search to find the desired component in the proper template.
  • SUMMARY OF THE INVENTION
  • The illustrative embodiment of the present invention provides a method of integrating freehand user input into a block diagram environment. The freehand user input is a user's approximation of a diagram component or feature of a component which is received by the block diagram environment and compared to multiple patterns stored in a storage location. The storage location holds patterns of block diagram components and block diagram component features. The freehand user input may be displayed superimposed on a block diagram being shown to the user. Upon the freehand user input being matched to one of the stored patterns representing a block diagram component or feature of a component, the freehand user input is replaced on the displayed block diagram with an electronic device drawn rendering of the matched diagram component or feature of a component. The component is also added to the block diagram model data. In one implementation, the user input displayed on the block diagram and the electronic device drawn rendering of the feature or component may be simultaneously displayed pending user confirmation of the selection. Partial matches of the user drawn input may result in a menu of choices being presented to the user for selection.
  • In one embodiment in an electronic device holding a block diagram environment with at least one block diagram, the electronic device also interfaced with a display surface displaying a block diagram, a method receives freehand user input entered with a pointing device into the block diagram. The method also compares the user input to at least one of multiple stored patterns of block diagram components and features of block diagram components. In the event of a match, the method adds the block diagram component or block diagram component feature to the block diagram model data and updates the displayed block diagram with a program-drawn block diagram component/feature representing the input data.
  • In another embodiment, in an electronic device interfaced with a display surface, a system includes a block diagram environment with at least one block diagram displayed on the display surface. The system also includes a pointing device interfaced with the electronic device, the pointing device being used to transmit freehand user input to the block diagram displayed on the display surface. The system additionally includes a storage location holding multiple block diagram component patterns and block diagram component feature patterns to which the freehand user input is compared. Matching block diagram components and component features from the storage location are rendered on the electronic device drawn diagram in the event of a match.
  • In a different embodiment in an electronic device holding a block diagram environment with at least one block diagram, the electronic device also interfaced with a display surface displaying a block diagram, a method receives freehand user input entered with a pointing device into the block diagram. The method also analyzes the user input to identify the user input as a block diagram component or block diagram component feature. In the event of an identification, the method adds the block diagram component or block diagram component feature to the block diagram model data and updates the displayed block diagram with a program-drawn block diagram component/feature representing the input data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an environment suitable for practicing the illustrative embodiment of the present invention;
  • FIG. 2A is a flow chart of the sequence of steps followed by the illustrative embodiment of the present invention to compare freehand user input to stored diagram components and features;
  • FIG. 2B is a flowchart examining the comparison and normalization process in greater detail;
  • FIG. 3A is a view of the illustrative embodiment of the present invention showing received freehand user input displayed in a block diagram environment;
  • FIG. 3B is a view of the illustrative embodiment of the present invention showing an electronic device drawn diagram component based on the freehand user input;
  • FIG. 4A is a view of the illustrative embodiment of the present invention showing freehand user input being appended to a previously drawn electronic device generated diagram component;
  • FIG. 4B is a view of the illustrative embodiment of the present invention showing an electronic device drawn replacement of the freehand user input of FIG. 4A;
  • FIG. 5A is a view of the illustrative embodiment of the present invention showing an initial display of the received freehand user input appended to a feedback loop in a block diagram;
  • FIG. 5B shows an electronic device rendering of the partial component depicted in the freehand user input of FIG. 5A;
  • FIG. 5C is a view of the illustrative embodiment of the present invention showing a displayed menu of choices of finished components possibly completing the previously drawn component of FIG. 5A and FIG. 5B;
  • FIG. 5D is a view of the illustrative embodiment of the present invention showing the view of FIG. 5C wherein the user has selected one of the possibilities displayed in the menu by entering additional freehand input corresponding to a menu choice;
  • FIG. 5E shows the final electronic device drawn rendering of the additional component of FIG. 5D;
  • FIG. 6A shows a user drawn output being added to the diagram of FIG. 5E;
  • FIG. 6B shows an electronic device drawn replacement for the user input displayed in FIG. 6A;
  • FIG. 7A shows freehand user input indicating a request for an additional view of the diagram being displayed; and
  • FIG. 7B shows a zoomed-out viewer superimposed on a portion of the block diagram in response to the received freehand user input displayed in FIG. 7A.
  • DETAILED DESCRIPTION
  • The illustrative embodiment of the present invention allows a user to construct, modify, and manipulate block diagram elements using a pointing device. A block diagram environment displaying a block diagram receives freehand user input from a pointing device such as an optical pen or a mouse. The freehand user input is entered as patterns and gestures and is not limited to displayed elements on a template as is found in conventional drag and drop systems. The entered patterns and symbols are compared to stored patterns of block diagram components and features of components and used as the basis for electronic device drawn diagram components or component features which are added to the block diagram. By allowing freehand entry of input data in such a manner, the illustrative embodiment of the present invention greatly expands and simplifies the creation and modification of block diagram features and components.
  • FIG. 1 depicts an environment suitable for practicing the illustrative embodiment of the present invention. An electronic device 2 holds a block diagram environment 4. The electronic device 2 may be a personal computer, workstation, laptop, server, mainframe, PDA or some other type of electronic device equipped with a processor and able to support the block diagram environment 4. The block diagram environment 4 may be a block diagram environment such as that found in Simulink® from The MathWorks, Inc. of Natick, Mass., or Labview® from National Instruments of Austin, Tex. The block diagram environment 4 includes at least one set of block diagram model data 6 for a block diagram. The block diagram model data 6 may be a block diagram model of a system. Also included on the electronic device 2 is a storage location 8. The storage location 8 includes a stored collection of block diagram component and block diagram component feature patterns 10. Those skilled in the art will recognize that the storage location 8 may also be located at other locations accessible to the block diagram environment 4. For example, the storage location 8 may be located on a remote server in a distributed environment. A user 12 manipulates a pointing device 18 which is interfaced with the electronic device 2. The pointing device 18 may be a mouse, trackball, joystick, touchpad, or light pen. A display 14 holds a rendering of a block diagram 16. Input received by the block diagram environment 4 from the user manipulating the pointing device 18 is compared to the block diagram component/feature patterns 10, which are used to render an electronic device drawn version of the user input on the display 14.
  • FIG. 2A depicts the sequence of steps followed by the illustrative embodiment of the present invention to match freehand user input to stored block diagram components/features of components. The sequence begins when the user manipulates the pointing device 18 to send input to the displayed block diagram 16 via the block diagram environment 4 (step 30). The technique used for the user input varies with the type of input device being utilized. For example, if a mouse is being used, the user may drag a cursor around the displayed block diagram 16 in the outline of the shape of the component/feature of a component that the user is attempting to add to the block diagram 16. Similarly, if an optical pointing device is being used, the user may gesture at the diagram in the shape of the component/feature of a component. The input data is received at the electronic device 2 (step 32) and a graphical representation of the input is displayed in the rendering of the block diagram 16 (step 34). For example, a user may draw a shape in the displayed block diagram using a mouse connected to the electronic device 2. The user 12 may position the mouse cursor at the appropriate spot on the display 14. By depressing a mouse button and moving the mouse, a line is rendered in the displayed block diagram by the block diagram environment 4. The rendering is the result of the block diagram environment 4 intercepting and handling the mouse messages generated by the depression of the button on the mouse. The handling of mouse messages (e.g.: WM_RBUTTONDOWN, WM_LBUTTONDOWN) is well known to those with experience in the Windows® programming environment. The message handler cross-references the coordinates associated with the intercepted message with the coordinates of the model data being displayed on the display in order to orient the input data with the block diagram data.
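The stroke-capture step described above (steps 30-34) can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the handler names `on_button_down`, `on_move`, and `on_button_up` are hypothetical stand-ins for the Windows mouse messages the text mentions (WM_LBUTTONDOWN, mouse-move, WM_LBUTTONUP).

```python
# Hypothetical sketch of freehand stroke capture: pointer-down starts a
# stroke, pointer-move extends it, pointer-up completes it. Handler names
# are illustrative stand-ins for the intercepted mouse messages.

class StrokeRecorder:
    """Accumulates freehand input points between button-down and button-up."""

    def __init__(self):
        self.current = None   # points of the stroke in progress
        self.strokes = []     # completed strokes, ready for pattern matching

    def on_button_down(self, x, y):
        self.current = [(x, y)]

    def on_move(self, x, y):
        if self.current is not None:   # ignore moves with the button up
            self.current.append((x, y))

    def on_button_up(self, x, y):
        if self.current is not None:
            self.current.append((x, y))
            self.strokes.append(self.current)
            self.current = None        # stroke complete


recorder = StrokeRecorder()
recorder.on_button_down(0, 0)
recorder.on_move(5, 5)
recorder.on_button_up(10, 0)
print(recorder.strokes)   # [[(0, 0), (5, 5), (10, 0)]]
```

A real implementation would also translate the intercepted screen coordinates into model coordinates, as the paragraph above notes.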
  • The shape of the user input as captured is then compared to stored patterns of block diagram components and features of block diagram components (step 36). The comparison may be made when the input data ceases, for example, upon a WM_LBUTTONUP message or the failure to extend a line being rendered in response to the movements of an optical pen. Alternatively, the illustrative embodiment of the present invention may attempt comparisons based on a time parameter. A determination is made for each comparison as to whether a definitive match has been found (step 37). The determination as to what percentage of the user input must correspond to the stored pattern to constitute a definitive match is an implementation choice. The illustrative embodiment of the present invention checks for points of correspondence between the user input and the stored diagram component/feature patterns 10. For some shapes, the borders of enclosed input shapes may be compared to the stored diagram component/feature patterns 10. For other shapes, the borders, interior and/or exterior shapes may all be checked for correspondence. The size of the input shapes may be normalized against the stored features/patterns, with a percentage of deviation being allowed from the stored component/feature size. The stored diagram component/feature patterns 10 may be stored in a database including a size attribute associated with each component/feature. The comparison may then consider only stored components/features within the deviation range. Distances and angles of the input data are compared to the stored components/features, accounting for a pre-defined amount of size deviation.
  • The process of determining a definitive match while normalizing for the size differential between the input data and the stored patterns is illustrated in greater detail in the flowchart of FIG. 2B. After receiving the freehand user input (step 39), the illustrative embodiment of the present invention analyzes the received input to identify attributes of the input data (e.g.: length of line, direction of line, angle between lines, etc.) (step 41). The stored components/features of components are retrieved and similarly analyzed (step 43). Alternatively, the stored patterns may have already been analyzed, in which case the attributes for the stored patterns are retrieved. The input attributes are then compared to the stored attribute patterns while normalizing for the size differential (step 45). The normalization process is necessary since in most cases the user's freehand input will not be to exactly the same scale as the stored patterns. Any matches from the comparison are identified (step 47).
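The size-normalized comparison of steps 39-47 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's algorithm: strokes are resampled to a fixed point count, scaled into a unit bounding box to normalize away the size differential, and scored by mean point-to-point distance. The resampling scheme and the dissimilarity measure are illustrative simplifications.

```python
import math

def resample(points, n=32):
    """Evenly pick n points by index interpolation (a crude stand-in
    for arc-length resampling)."""
    out = []
    for i in range(n):
        t = i * (len(points) - 1) / (n - 1)
        lo, frac = int(t), t - int(t)
        hi = min(lo + 1, len(points) - 1)
        out.append((points[lo][0] + frac * (points[hi][0] - points[lo][0]),
                    points[lo][1] + frac * (points[hi][1] - points[lo][1])))
    return out

def normalize(points, n=32):
    """Resample, then scale and translate into the unit bounding box,
    removing the size differential between input and stored pattern."""
    pts = resample(points, n)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in pts]

def dissimilarity(stroke, pattern, n=32):
    """Mean point-to-point distance after normalization; lower is closer."""
    a, b = normalize(stroke, n), normalize(pattern, n)
    return sum(math.dist(p, q) for p, q in zip(a, b)) / n

# A small triangular stroke matches a stored triangle ten times larger,
# and matches it better than it matches a stored square.
small_triangle = [(0, 0), (1, 0), (0.5, 1), (0, 0)]
big_triangle = [(0, 0), (10, 0), (5, 10), (0, 0)]
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]

print(dissimilarity(small_triangle, big_triangle) < 0.01)   # True
print(dissimilarity(small_triangle, square) >
      dissimilarity(small_triangle, big_triangle))          # True
```

A production matcher would compare the attribute set the text names (line lengths, directions, angles between lines) rather than raw points, but the normalization idea is the same.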
  • In the event of a definitive match, an electronic device drawn component or feature of a component corresponding to the user input data is rendered in the block diagram 16 (step 38). Alternatively, if there is not a definitive match, but rather several possibilities, the user may be presented with a menu of possible block diagram components or features of components which may correspond to the component/feature the user was attempting to add to the block diagram (step 40). Following a user selection of the appropriate component or feature (step 42), the electronic device adds the input data to the block diagram model data and draws the component or feature of the component in the rendering of the block diagram 16 (step 44).
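The decision between a definitive match, a menu of possibilities, and no match (steps 37-44) can be sketched as follows. The two thresholds are illustrative implementation choices, as the text notes the definitive-match criterion is left to the implementer; the score values and pattern names are hypothetical.

```python
# Sketch of the match-decision step: each stored pattern gets a
# dissimilarity score (lower is better). A score under the "definitive"
# threshold yields a direct match; otherwise, scores under a looser
# "candidate" threshold yield a menu of choices for the user.

def decide(scores, definitive=0.05, candidate=0.25):
    """scores: dict mapping pattern name -> dissimilarity."""
    best = min(scores, key=scores.get)
    if scores[best] <= definitive:
        return ("match", best)
    menu = sorted(name for name, s in scores.items() if s <= candidate)
    if menu:
        return ("menu", menu)
    return ("no_match", None)

print(decide({"gain": 0.02, "sum": 0.40}))   # ('match', 'gain')
print(decide({"gain": 0.15, "sum": 0.20}))   # ('menu', ['gain', 'sum'])
print(decide({"gain": 0.60, "sum": 0.70}))   # ('no_match', None)
```

In the "menu" case the environment would present the candidate list (as in FIG. 5C) and wait for the user's selection before rendering.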
  • The visual effect of the matching of freehand user input to the stored patterns is depicted in FIGS. 3A and 3B. FIG. 3A depicts a view 50 of a block diagram environment in the illustrative embodiment of the present invention in which a user has entered freehand input 52 representing a triangular shape. The illustrative embodiment of the present invention matches the freehand user input to a stored pattern in the collection of patterns of block diagram components and component features. The block diagram environment 4 generates the view 50 of FIG. 3B showing an electronic device rendered gain block 54.
  • The illustrative embodiment of the present invention may also be used to add additional components to an already existing block diagram, as depicted in FIGS. 4A and 4B. FIG. 4A depicts the view 50 and computer generated gain block 54 of FIGS. 3A and 3B to which freehand user input of a rough approximation of a feedback signal 56 has been added to the electronic device drawn components previously appearing on the display. As before, the freehand user input for the feedback signal approximation 56 is compared to the diagram component and feature patterns 10 that are stored at a location 8 accessible to the block diagram environment 4. In FIG. 4B, a computer rendered feedback signal 58 is then generated to replace the user drawn input 56. Depending upon the implementation, the electronic device drawn feedback signal may be superimposed over the user freehand input signal 56 until such time as the user indicates a confirmation of the substituted element. Those skilled in the art will recognize that such confirmation may be given in a number of different manners, such as by clicking on the electronic device drawn component or component feature or clicking a control button in a popup dialog box which requests verification.
  • The sequence of views generated by the illustrative embodiment of the present invention to choose a component from among many components is illustrated in FIGS. 5A through 5E. FIG. 5A shows a view 50 with the electronic device generated gain block 54 and feedback signal 58 as well as additional user freehand input 60 of a circular shape near the end of the feedback signal. FIG. 5B shows the view 50 including the components previously discussed in FIG. 5A with the user input 60 replaced by an electronic device generated circle 62. The electronic device generated circle 62 represents a partially completed block diagram component. FIG. 5C depicts the view 50 with the gain block 54, feedback signal 58 and circle 62 as well as additional choices displayed on a menu 64. The menu 64 displays a list of potential matches for the partially completed element 62. FIG. 5D depicts the previous view 50 as well as an additional freehand feature 66 added by the user to the previously displayed computer rendered circle 62. The additional freehand feature 66 matches one of the components 67 displayed in the menu 64. FIG. 5E shows the electronic device generated replacement of the user drawn feature 66. The result is a sum block 68 added on to the end of the feedback signal 58.
  • A user may also add additional components to previously stored diagrams using the illustrative embodiment of the present invention. For example, FIG. 6A depicts the view 50 including the gain block 54, feedback signal 58, and sum block 68 to which the user indicates a desire to add an output port through the creation of a freehand user input. Those skilled in the art will recognize that the previously stored block diagram may be retrieved from any location accessible to the block diagram environment 4. FIG. 6B shows the result after matching the freehand user input to a stored pattern: the generation of a computer rendered output port 72.
  • Those skilled in the art will recognize that alternate forms of analysis rather than an explicit comparison may also be used by the illustrative embodiment of the present invention to identify the freehand user input. For example, a neural net may be used to determine the type of block diagram component/feature the user is attempting to add via freehand user input. The neural net may attempt to analyze small pieces of user input in parallel to identify the desired component/feature the user is attempting to add to the block diagram.
  • The illustrative embodiment of the present invention may also be used to call up block diagram tools through the use of freehand gestures. FIG. 7A depicts a view 80 of a portion of a block diagram 82. A user sketch enclosing the lower left hand corner of the displayed view 80 creates a rough box 84. The gesture is evaluated by comparing it to stored patterns and results in the display of a zoomed-out viewer showing the entire block diagram or subsystem 86 (of which the main view 80 displays only a portion) superimposed in the lower left corner of the view.
  • Similar to the procedure used to call up block diagram tools, a pre-defined input may also be used to cause the deletion of a component. For example, an “x” written over a diagram component may be interpreted as a delete signal. Similarly, pre-defined input such as a symbol may result in the insertion of a complete block or subsystem into the block diagram.
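The pre-defined gesture handling described above can be sketched as a simple dispatch: a recognized gesture symbol, combined with the component (if any) found under the gesture, maps to an editing action. The symbol and action names here are illustrative, not from the patent.

```python
# Hypothetical dispatch of recognized gestures to editing actions: an "x"
# drawn over a component deletes it; a pre-defined insertion symbol adds
# a complete block or subsystem. Names are illustrative.

def interpret_gesture(symbol, component_under_gesture):
    if symbol == "x" and component_under_gesture is not None:
        return ("delete", component_under_gesture)
    if symbol == "insert_subsystem":
        return ("insert", "subsystem")
    return ("ignore", None)

print(interpret_gesture("x", "gain_block"))        # ('delete', 'gain_block')
print(interpret_gesture("x", None))                # ('ignore', None)
print(interpret_gesture("insert_subsystem", None)) # ('insert', 'subsystem')
```

An "x" drawn over empty canvas is ignored here, since the delete gesture is only meaningful over an existing component.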
  • In one aspect of the illustrative embodiment of the present invention, a user may switch between an editing mode and an interpreted mode in the block diagram environment 4. The editing mode allows the user to make changes to the block diagram without the pattern comparison taking place. Once the user activates the interpreted mode, pattern comparison is performed in response to user input. This ability to toggle the interpreted mode on and off prevents unwanted components and component features from being inadvertently added to a block diagram.
  • In another aspect of the illustrative embodiment of the present invention, the user may teach the block diagram environment to interpret specific freehand user input gestures. User input may be entered and the environment subsequently instructed to interpret the entered input in a certain manner and perform an associated action. For example, the user may enter a learning mode and enter freehand input. After entering the input in the learning mode, the user indicates to the block diagram environment a specific action (e.g.: perform function, add component) that is to be associated with the input. The entered pattern and its association are saved, and subsequent user entries of that pattern in interpreted mode result in the specified action being performed.
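The learning-mode association above can be sketched as a small store: in learning mode the environment records the entered pattern with its action; in interpreted mode a later entry of a matching pattern triggers that action. The exact-key lookup and the pattern/action names are simplifications of the fuzzy pattern matching the text describes.

```python
# Hypothetical pattern-to-action association for the learning mode:
# learn() records a taught gesture, interpret() looks one up later.
# An exact-match key stands in for the real fuzzy pattern comparison.

class GestureTrainer:
    def __init__(self):
        self.associations = {}   # pattern key -> associated action

    def learn(self, pattern_key, action):
        """Called in learning mode after the user enters a gesture."""
        self.associations[pattern_key] = action

    def interpret(self, pattern_key):
        """Called in interpreted mode; None means the gesture is untaught."""
        return self.associations.get(pattern_key)


trainer = GestureTrainer()
trainer.learn("zigzag", "add component: saturation block")
print(trainer.interpret("zigzag"))   # add component: saturation block
print(trainer.interpret("spiral"))   # None
```

Persisting `associations` to the storage location 8 would make taught gestures survive between sessions, consistent with the stored-pattern collection the patent describes.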
  • Although the examples contained herein have been discussed with reference to a block diagram environment, it should be appreciated that the present invention may also be implemented in other design environments. For example, the method of accepting and handling freehand user input that has been discussed above may be implemented in a user interface design system where various controls are added to a user interface in response to the received freehand user input. The illustrative embodiment of the present invention may also be practiced in a software diagram environment such as that found in Stateflow® from The MathWorks, Inc. of Natick, Mass. or in Unified Modeling Language (UML) environments.
  • Since certain changes may be made without departing from the scope of the present invention, it is intended that all matter contained in the above description or shown in the accompanying drawings be interpreted as illustrative and not in a limiting sense. Practitioners of the art will realize that the system configurations depicted and described herein are examples of multiple possible system configurations that fall within the scope of the current invention. Likewise, the sequence of steps utilized in the illustrative flowcharts is an example and not the exclusive sequence of steps possible within the scope of the present invention.

Claims (52)

1. In an electronic device holding a block diagram environment with at least one block diagram, said block diagram being a model of a system, said electronic device interfaced with a display surface displaying said block diagram, a method, comprising:
receiving freehand user input entered with a pointing device into said block diagram;
comparing said user input to at least one of a stored plurality of patterns of at least one of block diagram components and block diagram component features;
adding said at least one of said block diagram components and block diagram component features to said model, said adding occurring as a result of a matching comparison between said freehand user input and at least one of said plurality of patterns; and
rendering programmatically with the block diagram environment at least one of said block diagram components and said block diagram component features in said diagram.
2. The method of claim 1, further comprising:
displaying said freehand user input in said diagram prior to said rendering.
3. The method of claim 2, further comprising:
replacing said displayed freehand user input with a programmatic rendering of said matching pattern.
4. The method of claim 1 wherein said freehand user input is added to an existing programmatically rendered diagram component in said diagram.
5. The method of claim 4, further comprising:
modifying said existing programmatically rendered pattern by programmatically rendering at least one additional feature based on matching said user input to a stored pattern.
6. The method of claim 1 wherein said freehand user input is one of a complete diagram component and complete diagram component feature.
7. The method of claim 1 wherein said freehand user input is one of a partial diagram component and partial diagram component feature.
8. The method of claim 1, further comprising:
providing a choice of at least two of diagram components and diagram component features to said user prior to said rendering, said choice prompting a user for a selection of one of said at least two diagram components and diagram component features.
9. The method of claim 8, further comprising:
rendering programmatically at least one of said diagram components and diagram component features based on said selection.
10. The method of claim 8 wherein the identity of the at least two diagram components and diagram component features appearing in said choice is based on the context of said diagram.
11. The method of claim 1 wherein said freehand user input is a symbol.
12. The method of claim 1 wherein said diagram is a block diagram.
13. The method of claim 1, further comprising:
manipulating at least one of a diagram component and diagram component feature displayed in said diagram as a result of said user input.
14. The method of claim 1, further comprising:
deleting at least one of a diagram component and diagram component feature displayed in said diagram as a result of said user input.
15. The method of claim 1, further comprising:
providing diagramming tools to a user in said block diagram environment in response to said freehand user input.
16. The method of claim 15 wherein said tool is a zoomed-out viewer superimposed over at least a portion of said displayed diagram.
17. The method of claim 1, further comprising:
providing at least one visually displayed symbol associated with an attribute type;
selecting said symbol associated with an attribute type with said pointing device;
subsequently selecting with said pointing device at least one of a diagram component and diagram component feature displayed in said diagram; and
assigning programmatically said attribute type associated with said symbol to the selected at least one of a diagram component and diagram component feature.
18. The method of claim 1, further comprising:
displaying said freehand user input on said display surface;
superimposing the programmatically rendered pattern over the displayed freehand user input; and
removing said freehand user input from said display surface following a user indication of the correctness of the programmatically rendered at least one of said diagram component and diagram component feature.
19. The method of claim 1 wherein said comparing occurs only after said user selects an interpreted mode option for said block diagram environment.
20. In an electronic device interfaced with a display surface, a system comprising:
a block diagram environment with at least one block diagram displayed on said display surface;
a pointing device interfaced with said electronic device, said pointing device used to transmit freehand input from a user to said block diagram displayed on said display surface; and
a storage location holding a plurality of block diagram component patterns and diagram component feature patterns, said user input being compared to said plurality of block diagram component patterns and diagram component feature patterns, said block diagram environment programmatically rendering one of a diagram component and diagram component feature in said block diagram based on a matching comparison between said freehand user input and at least one of said plurality of patterns.
21. The system of claim 20 wherein said rendering superimposes said programmatically rendered one of a diagram component and diagram component feature over said user input in said diagram displayed on said display surface.
22. The system of claim 20 wherein said user input represents one of a partial diagram component and partial diagram component feature.
23. The system of claim 20 wherein said user input represents one of a complete diagram component and complete diagram component feature.
24. The system of claim 20, further comprising:
a menu of choices displayed to said user prior to the rendering of said one of a diagram component and diagram component feature, said choices representing possible diagram components and diagram component features matching said user input.
25. The system of claim 24 wherein a user selects one of the choices, the selected choice then being rendered as a matching comparison.
26. The system of claim 20 wherein said user input is of an additional feature to a previously rendered diagram component.
27. The system of claim 20 wherein said pointing device is one of a mouse, trackball, joystick, mousepad, and light pen.
28. The system of claim 20 wherein said diagram environment is a block diagram environment and said diagram is a block diagram.
29. In an electronic device holding a block diagram environment with at least one block diagram, said electronic device interfaced with a display surface displaying said diagram, a medium holding executable steps for a method, said method comprising:
comparing freehand user input entered with a pointing device into said diagram to at least one of a stored plurality of patterns; and
rendering programmatically with the block diagram environment one of a diagram component and diagram component feature in said diagram, said diagram component based on a matching comparison between said freehand user input and at least one of said plurality of patterns.
30. The medium of claim 29 wherein said method further comprises:
displaying said freehand user input in said diagram prior to said rendering.
31. The medium of claim 30 wherein said method further comprises:
replacing said displayed freehand user input with said programmatically rendered one of a diagram component and diagram component feature.
32. The medium of claim 29 wherein said freehand user input is added to an existing programmatically rendered diagram component in said diagram.
33. The medium of claim 32 wherein said method further comprises:
modifying said existing programmatically rendered component by programmatically rendering at least one additional feature based on matching said user input to a stored pattern.
34. The medium of claim 29 wherein said freehand user input is one of a complete diagram component and complete diagram component feature.
35. The medium of claim 29 wherein said freehand user input is one of a partial diagram component and partial diagram component feature.
36. The medium of claim 29 wherein said method further comprises:
providing a choice of at least two diagram components and diagram component features to said user prior to said rendering, said choice prompting a user for a selection of one of said at least two diagram components and diagram component features.
37. The medium of claim 36, wherein said method further comprises:
rendering one of said diagram components and diagram component features based on said selection.
38. The medium of claim 36 wherein the identity of the at least two of said diagram components and diagram component features appearing in said choice is based on the context of said diagram.
39. The medium of claim 29 wherein said freehand user input is a symbol.
40. The medium of claim 29 wherein said diagram is a block diagram.
41. The medium of claim 29 wherein said method further comprises:
manipulating at least one of a diagram component and diagram component feature displayed in said diagram as a result of said user input.
42. The medium of claim 29 wherein said method further comprises:
deleting at least one of a diagram component and diagram component feature displayed in said diagram as a result of said user input.
43. The medium of claim 29 wherein said method further comprises:
providing diagramming tools to a user in said block diagram environment in response to said freehand user input.
44. The medium of claim 43 wherein said tool is a zoomed-out viewer superimposed over at least a portion of said displayed diagram.
45. The medium of claim 29 wherein said method further comprises:
providing at least one visually displayed symbol associated with an attribute type;
selecting said symbol associated with an attribute type with said pointing device;
subsequently selecting with said pointing device at least one of a diagram component and diagram component feature displayed in said diagram; and
assigning programmatically said attribute type associated with said symbol to the selected at least one of a diagram component and diagram component feature.
46. In an electronic device holding a block diagram environment with at least one block diagram, said block diagram being a model of a system, said electronic device interfaced with a display surface displaying said block diagram, a method, comprising:
receiving freehand user input entered with a pointing device into said block diagram;
analyzing said user input to identify said user input as at least one of a block diagram component and block diagram component feature;
adding said at least one of said block diagram components and block diagram component features to said model, said adding occurring as a result of said analysis; and
rendering programmatically at least one of said block diagram components and said block diagram component features in said diagram.
47. The method of claim 46 wherein said block diagram component is a block diagram subsystem.
48. The method of claim 46 wherein said analysis is performed using a neural net.
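Claim 48 names a neural net as one way to perform the analysis of claim 46 but does not specify its form. A deliberately tiny, hypothetical stand-in is a single-layer perceptron that separates closed sketches (candidate blocks) from open strokes (candidate connector lines). The two features and the labels are invented for illustration only:

```python
import math

def features(stroke):
    # Invented toy features: bounding-box aspect ratio, "closedness"
    # (gap between first and last point relative to the box diagonal),
    # plus a constant 1.0 serving as the bias input.
    xs, ys = [p[0] for p in stroke], [p[1] for p in stroke]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    closed = math.dist(stroke[0], stroke[-1]) / math.hypot(w, h)
    return [w / h, closed, 1.0]

def train_perceptron(samples, epochs=100, lr=0.1):
    # samples: (feature_vector, label) pairs with label in {-1, +1}.
    # Classic perceptron rule: nudge weights on each misclassification.
    w = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for x, y in samples:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
    return w

def classify(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
```

In a learning-capable environment, the labeled samples could come from the user's own corrections, with a recognized "block" sketch then added to the model per claim 46.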
49. In an electronic device holding a block diagram environment with at least one block diagram, said block diagram being a model of a system, said electronic device interfaced with a display surface displaying said block diagram, a method, comprising:
receiving freehand user input entered with a pointing device into said block diagram environment executing in a learning mode;
receiving a user indication associating said freehand user input with an action;
storing said user input and its associated action as one of a plurality of patterns;
receiving freehand user input entered with a pointing device into said block diagram environment executing in an interpretive mode;
comparing said user input to at least one of a stored plurality of patterns of at least one of block diagram components and block diagram component features; and
performing the associated action, said performing occurring as a result of a matching comparison between said freehand user input and at least one of said plurality of patterns.
50. In an electronic device holding a block diagram environment with at least one block diagram, said block diagram being a model of a system, said electronic device interfaced with a display surface displaying said block diagram, a medium holding executable steps for a method, said method comprising:
receiving freehand user input entered with a pointing device into said block diagram environment executing in a learning mode;
receiving a user indication associating said freehand user input with an action;
storing said user input and its associated action as one of a plurality of patterns;
receiving freehand user input entered with a pointing device into said block diagram environment executing in an interpretive mode;
comparing said user input to at least one of a stored plurality of patterns of at least one of block diagram components and block diagram component features; and
performing the associated action, said performing occurring as a result of a matching comparison between said freehand user input and at least one of said plurality of patterns.
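Claims 49–50 describe a learn-then-interpret cycle without fixing how a stored pattern is represented. One hypothetical sketch: reduce each stroke to a coarse direction signature, store it against a user-supplied action in learning mode, and look it up and fire the action in interpretive mode. The signature scheme and class name are invented for illustration:

```python
class FreehandGestureRegistry:
    """Two-mode flow: learn() stores a stroke with a user-chosen action;
    interpret() matches a new stroke against the stored patterns and
    performs the matched pattern's action. Matching here is a simple
    placeholder; the claims do not mandate any particular scheme."""

    def __init__(self):
        self.patterns = {}  # coarse stroke signature -> action callable

    @staticmethod
    def _signature(stroke):
        # Reduce the stroke to a run of coarse move directions (U/D/L/R),
        # assuming screen coordinates with y increasing downward. This is
        # deliberately crude; it makes matching position/scale invariant.
        sig = []
        for (ax, ay), (bx, by) in zip(stroke, stroke[1:]):
            dx, dy = bx - ax, by - ay
            if dx == 0 and dy == 0:
                continue  # skip duplicate points
            if abs(dx) >= abs(dy):
                d = 'R' if dx > 0 else 'L'
            else:
                d = 'D' if dy > 0 else 'U'
            if not sig or sig[-1] != d:
                sig.append(d)
        return ''.join(sig)

    def learn(self, stroke, action):
        # Learning mode: store the pattern with its associated action.
        self.patterns[self._signature(stroke)] = action

    def interpret(self, stroke):
        # Interpretive mode: perform the action on a matching comparison;
        # return None when no stored pattern matches.
        action = self.patterns.get(self._signature(stroke))
        return action() if action else None
```

A down-then-right "L" gesture taught once would then trigger its action wherever and at whatever size it is redrawn.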
51. In an electronic device interfaced with a display surface, a system comprising:
a software diagram environment with at least one software diagram displayed on said display surface;
a pointing device interfaced with said electronic device, said pointing device used to transmit freehand input from a user to said software diagram displayed on said display surface; and
a storage location holding a plurality of software diagram component patterns and diagram component feature patterns, said user input being compared to said plurality of software diagram component patterns and diagram component feature patterns, said software diagram environment programmatically rendering one of a diagram component and diagram component feature in said software diagram based on a matching comparison between said freehand user input and at least one of said plurality of patterns.
52. The system of claim 51 wherein said software diagram is one of a Stateflow® diagram and a Unified Modeling Language (UML) diagram.
US10/863,378 2004-06-07 2004-06-07 Freehand system and method for creating, editing, and manipulating block diagrams Abandoned US20050273761A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10/863,378 US20050273761A1 (en) 2004-06-07 2004-06-07 Freehand system and method for creating, editing, and manipulating block diagrams
PCT/US2005/019528 WO2005121941A2 (en) 2004-06-07 2005-06-02 Freehand system and method for creating, editing, and manipulating block diagrams
EP05756532A EP1769327A2 (en) 2004-06-07 2005-06-02 Freehand system and method for creating, editing, and manipulating block diagrams
US11/825,611 US8627278B2 (en) 2004-06-07 2007-07-06 Freehand system and method for creating, editing, and manipulating block diagrams
US14/136,215 US20140177962A1 (en) 2004-06-07 2013-12-20 Freehand system and method for creating, editing, and manipulating block diagrams

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/863,378 US20050273761A1 (en) 2004-06-07 2004-06-07 Freehand system and method for creating, editing, and manipulating block diagrams

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/825,611 Continuation US8627278B2 (en) 2004-06-07 2007-07-06 Freehand system and method for creating, editing, and manipulating block diagrams

Publications (1)

Publication Number Publication Date
US20050273761A1 true US20050273761A1 (en) 2005-12-08

Family

ID=35241027

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/863,378 Abandoned US20050273761A1 (en) 2004-06-07 2004-06-07 Freehand system and method for creating, editing, and manipulating block diagrams
US11/825,611 Active 2028-04-20 US8627278B2 (en) 2004-06-07 2007-07-06 Freehand system and method for creating, editing, and manipulating block diagrams
US14/136,215 Abandoned US20140177962A1 (en) 2004-06-07 2013-12-20 Freehand system and method for creating, editing, and manipulating block diagrams

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/825,611 Active 2028-04-20 US8627278B2 (en) 2004-06-07 2007-07-06 Freehand system and method for creating, editing, and manipulating block diagrams
US14/136,215 Abandoned US20140177962A1 (en) 2004-06-07 2013-12-20 Freehand system and method for creating, editing, and manipulating block diagrams

Country Status (3)

Country Link
US (3) US20050273761A1 (en)
EP (1) EP1769327A2 (en)
WO (1) WO2005121941A2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008112650A2 (en) * 2007-03-15 2008-09-18 Yazaki Corporation Capacitor electrodes comprising carbon nanotubes filled with one or more non- carbon materials
US8826174B2 (en) 2008-06-27 2014-09-02 Microsoft Corporation Using visual landmarks to organize diagrams
JP5767106B2 (en) * 2009-05-18 2015-08-19 レノボ・イノベーションズ・リミテッド(香港) Mobile terminal device, control method and program for mobile terminal device
US10078411B2 (en) 2014-04-02 2018-09-18 Microsoft Technology Licensing, Llc Organization mode support mechanisms
US10643067B2 (en) 2015-10-19 2020-05-05 Myscript System and method of handwriting recognition in diagrams
US10976918B2 (en) 2015-10-19 2021-04-13 Myscript System and method of guiding handwriting diagram input
CN109558855B (en) * 2018-12-06 2019-10-15 哈尔滨拓博科技有限公司 A kind of space gesture recognition methods combined based on palm contour feature with stencil matching method
EP3736677A1 (en) 2019-05-10 2020-11-11 MyScript A method and corresponding device for selecting and editing handwriting input elements
EP3754537A1 (en) 2019-06-20 2020-12-23 MyScript Processing text handwriting input in a free handwriting mode
EP3772015B1 (en) 2019-07-31 2023-11-08 MyScript Text line extraction
EP3796145A1 (en) 2019-09-19 2021-03-24 MyScript A method and correspond device for selecting graphical objects

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2354099B (en) * 1999-09-09 2003-09-10 Sony Uk Ltd Image identification apparatus and method of identifying images
DE10361511A1 (en) 2003-12-23 2005-07-28 Siemens Ag Context-dependent operation of engineering systems via graphical input

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5748769A (en) * 1993-06-22 1998-05-05 Kabushiki Kaisha Toshiba Pattern recognition apparatus
US5742280A (en) * 1993-12-28 1998-04-21 Nec Corporation Hand-written graphic form inputting apparatus
US5926566A (en) * 1996-11-15 1999-07-20 Synaptics, Inc. Incremental ideographic character input method
US20040090439A1 (en) * 2002-11-07 2004-05-13 Holger Dillner Recognition and interpretation of graphical and diagrammatic representations
US20040150667A1 (en) * 2003-01-30 2004-08-05 Dove Andrew Philip Performing wireless communication in a graphical program
US20060227140A1 (en) * 2005-03-21 2006-10-12 Karthik Ramani Sketch beautification

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070192759A1 (en) * 2006-02-14 2007-08-16 Mitsubishi Electric Corporation Diagram editing apparatus
US20100053215A1 (en) * 2008-08-27 2010-03-04 International Business Machines Corporation Creation and application of patterns to diagram elements
US20100058161A1 (en) * 2008-08-27 2010-03-04 International Business Machines Corporation Automatic management of diagram elements
US20100058162A1 (en) * 2008-08-27 2010-03-04 International Business Machines Corporation Automatic customization of diagram elements
US8717383B2 (en) 2008-08-27 2014-05-06 International Business Machines Corporation Automatic management of diagram elements
US20100100866A1 (en) * 2008-10-21 2010-04-22 International Business Machines Corporation Intelligent Shared Virtual Whiteboard For Use With Representational Modeling Languages
US20100153890A1 (en) * 2008-12-11 2010-06-17 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Predictive Model for Drawing Using Touch Screen Devices
WO2010067194A1 (en) * 2008-12-11 2010-06-17 Nokia Corporation Method and apparatus for providing a predictive model for drawing using touch screen devices
US20100281359A1 (en) * 2009-04-30 2010-11-04 International Business Machines Corporation Method, apparatus and system for processing graphic objects
US9098940B2 (en) * 2009-04-30 2015-08-04 International Business Machines Corporation Method, apparatus and system for processing graphic objects
US8935638B2 (en) 2012-10-11 2015-01-13 Google Inc. Non-textual user input
CN104704462A (en) * 2012-10-11 2015-06-10 谷歌公司 Non-textual user input
WO2014058597A1 (en) * 2012-10-11 2014-04-17 Google Inc. Non-textual user input
US20150378712A1 (en) * 2014-06-28 2015-12-31 Vmware,Inc. Selection of relevant software bundles
US9389847B2 (en) * 2014-06-28 2016-07-12 Vmware, Inc. Selection of relevant software bundles
US9389848B2 (en) 2014-06-28 2016-07-12 Vmware, Inc. Scheduling a plan of operations in a datacenter
US9442714B2 (en) 2014-06-28 2016-09-13 Vmware, Inc. Unified visualization of a plan of operations in a datacenter
US9529980B2 (en) 2014-06-28 2016-12-27 Vmware, Inc. Deduplication of end user license agreements
US20160259464A1 (en) * 2015-03-06 2016-09-08 Alibaba Group Holding Limited Method and apparatus for interacting with content through overlays
US11797172B2 (en) * 2015-03-06 2023-10-24 Alibaba Group Holding Limited Method and apparatus for interacting with content through overlays
US20180121075A1 (en) * 2016-10-28 2018-05-03 Microsoft Technology Licensing, Llc Freehand object manipulation
CN111062996A (en) * 2019-11-29 2020-04-24 广东优世联合控股集团股份有限公司 Rendering method of construction drawing

Also Published As

Publication number Publication date
US8627278B2 (en) 2014-01-07
US20070260332A1 (en) 2007-11-08
WO2005121941A2 (en) 2005-12-22
EP1769327A2 (en) 2007-04-04
US20140177962A1 (en) 2014-06-26
WO2005121941A3 (en) 2006-02-16

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE MATHWORKS, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TORGERSON, JAY RYAN;REEL/FRAME:015447/0338

Effective date: 20040528

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION